Updates from: 02/16/2022 02:09:44
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Deploy Custom Policies Devops https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/deploy-custom-policies-devops.md
try {
$FileExists = Test-Path -Path $filePath -PathType Leaf
if ($FileExists) {
- $policycontent = Get-Content $filePath
+ $policycontent = Get-Content $filePath -Encoding UTF8
# Optional: Change the content of the policy. For example, replace the tenant-name with your tenant name.
# $policycontent = $policycontent.Replace("your-tenant.onmicrosoft.com", "contoso.onmicrosoft.com")
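Assembled as one runnable block, the snippet above amounts to the following hedged sketch (the policy file path and tenant names are placeholders, not values from the source):

```powershell
try {
    # Placeholder path to a custom policy file.
    $filePath = "TrustFrameworkBase.xml"
    $FileExists = Test-Path -Path $filePath -PathType Leaf
    if ($FileExists) {
        # Read with explicit UTF-8 so non-ASCII characters in the policy survive.
        $policycontent = Get-Content $filePath -Encoding UTF8
        # Optional: replace the placeholder tenant name with your own.
        $policycontent = $policycontent.Replace("your-tenant.onmicrosoft.com", "contoso.onmicrosoft.com")
    }
}
catch {
    Write-Error $_
}
```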
active-directory Concept Certificate Based Authentication Technical Deep Dive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/concept-certificate-based-authentication-technical-deep-dive.md
Previously updated : 02/09/2022 Last updated : 02/15/2022 -+
Let's cover each step:
:::image type="content" border="true" source="./media/concept-certificate-based-authentication-technical-deep-dive/sign-in-alt.png" alt-text="Screenshot of the Sign-in if FIDO2 is also enabled.":::
-1. After the user clicks the link, the client is redirected to the certauth endpoint [http://certauth.login.microsoftonline.com](http://certauth.login.microsoftonline.com). The endpoint performs mutual authentication and requests the client certificate as part of the TLS handshake. You will see an entry for this request in the Sign-in logs. There is a [known issue](#known-issues) where User ID is displayed instead of Username.
+1. After the user clicks the link, the client is redirected to the certauth endpoint, which is [https://certauth.login.microsoftonline.com](https://certauth.login.microsoftonline.com) for Azure Global. For [Azure Government](/azure-government/compare-azure-government-global-azure.md#guidance-for-developers), the certauth endpoint is [https://certauth.login.microsoftonline.us](https://certauth.login.microsoftonline.us). For the correct endpoint for other environments, see the specific Microsoft cloud docs.
+
+ The endpoint performs mutual authentication and requests the client certificate as part of the TLS handshake. You will see an entry for this request in the Sign-in logs. There is a [known issue](#known-issues) where User ID is displayed instead of Username.
:::image type="content" border="true" source="./media/concept-certificate-based-authentication-technical-deep-dive/sign-in-log.png" alt-text="Screenshot of the Sign-in log in Azure AD." lightbox="./media/concept-certificate-based-authentication-technical-deep-dive/sign-in-log.png":::
active-directory How To Certificate Based Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/how-to-certificate-based-authentication.md
Make sure that the following prerequisites are in place.
>Each CA should have a certificate revocation list (CRL) that can be referenced from internet-facing URLs. If the trusted CA does not have a CRL configured, Azure AD will not perform any CRL checking, revocation of user certificates will not work, and authentication will not be blocked.
>[!IMPORTANT]
->Make sure the PKI is secure and cannot be easily compromised. In the event of a compromise, the attacker can create and sign client certificates and compromise any user in the tenant, both synced and cloud-only users. However, a strong key protection strategy, along with other physical and logical controls such as HSM activation cards or tokens for the secure storage of artifacts, can provide defense-in-depth to prevent external attackers or insider threats from compromising the integrity of the PKI. For more information, see [Securing PKI](https://docs.microsoft.com/previous-versions/windows/it-pro/windows-server-2012-r2-and-2012/dn786443(v=ws.11)).
+>Make sure the PKI is secure and cannot be easily compromised. In the event of a compromise, the attacker can create and sign client certificates and compromise any user in the tenant, both synced and cloud-only users. However, a strong key protection strategy, along with other physical and logical controls such as HSM activation cards or tokens for the secure storage of artifacts, can provide defense-in-depth to prevent external attackers or insider threats from compromising the integrity of the PKI. For more information, see [Securing PKI](/previous-versions/windows/it-pro/windows-server-2012-r2-and-2012/dn786443(v=ws.11)).
## Steps to configure and test Azure AD CBA
To enable the certificate-based authentication and configure username bindings u
- [Technical deep dive for Azure AD CBA](concept-certificate-based-authentication-technical-deep-dive.md)
- [Limitations with Azure AD CBA](concept-certificate-based-authentication-limitations.md)
- [FAQ](certificate-based-authentication-faq.yml)
-- [Troubleshoot Azure AD CBA](troubleshoot-certificate-based-authentication.md)-
+- [Troubleshoot Azure AD CBA](troubleshoot-certificate-based-authentication.md)
active-directory Scenario Desktop Acquire Token Wam https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/scenario-desktop-acquire-token-wam.md
MSAL is able to call Web Account Manager, a Windows 10 component that ships with
## Availability
-MSAL 4.25+ supports WAM on UWP, .NET Classic, .NET Core 3.x, and .NET 5.
+MSAL 4.25+ supports WAM on UWP, .NET Classic, .NET Core 3.1, and .NET 5.
-For .NET Classic and .NET Core 3.x, WAM functionality is fully supported but you have to add a reference to [Microsoft.Identity.Client.Desktop](https://www.nuget.org/packages/Microsoft.Identity.Client.Desktop/) package, alongside MSAL, and instead of `WithBroker()`, call `.WithWindowsBroker()`.
+For .NET Classic and .NET Core 3.1, WAM functionality is fully supported but you have to add a reference to [Microsoft.Identity.Client.Desktop](https://www.nuget.org/packages/Microsoft.Identity.Client.Desktop/) package, alongside MSAL, and instead of `WithBroker()`, call `.WithWindowsBroker()`.
For .NET 5, target `net5.0-windows10.0.17763.0` (or higher) and not just `net5.0`. Your app will still run on older versions of Windows if you add `<SupportedOSPlatformVersion>7</SupportedOSPlatformVersion>` in the csproj. MSAL will use a browser when WAM is not available.
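The .NET 5 targeting guidance above can be expressed in the project file roughly as follows (a sketch assembled from the values in the text, not a complete csproj):

```xml
<PropertyGroup>
  <TargetFramework>net5.0-windows10.0.17763.0</TargetFramework>
  <!-- Lets the app still run on older Windows versions; MSAL falls back
       to a browser when WAM is not available. -->
  <SupportedOSPlatformVersion>7</SupportedOSPlatformVersion>
</PropertyGroup>
```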
Applications cannot remove accounts from Windows!
## Troubleshooting
+### "Either the user cancelled the authentication or the WAM Account Picker crashed because the app is running in an elevated process" error message
+ When an app that uses MSAL is run as an elevated process, some of these calls within WAM may fail due to different process security levels. Internally MSAL.NET uses native Windows methods ([COM](/windows/win32/com/the-component-object-model)) to integrate with WAM. Starting with version 4.32.0, MSAL will display a descriptive error message when it detects that the app process is elevated and WAM returned no accounts.
-One solution is to not run the app as elevated, if possible. Another potential workaround is to call `WindowsNativeUtils.InitializeProcessSecurity` method when the app starts up. This will set the security of the processes used by WAM to the same levels. See [this sample app](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/blob/master/tests/devapps/WAM/NetCoreWinFormsWam/Program.cs#L18-L21) for an example. However, note, that this workaround is not guaranteed to succeed to due external factors like the underlying CLR behavior. In that case, an `MsalClientException` will be thrown. See issue [#2560](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/issues/2560) for additional information.
+One solution is to not run the app as elevated, if possible. Another solution is for the app developer to call the `WindowsNativeUtils.InitializeProcessSecurity` method when the app starts up. This will set the security of the processes used by WAM to the same levels. See [this sample app](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/blob/master/tests/devapps/WAM/NetCoreWinFormsWam/Program.cs#L18-L21) for an example. However, note that this solution is not guaranteed to succeed due to external factors like the underlying CLR behavior. In that case, an `MsalClientException` will be thrown. See issue [#2560](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/issues/2560) for additional information.
+
+### "WAM Account Picker did not return an account" error message
+
+This message indicates that either the application user closed the dialog that displays accounts, or the dialog itself crashed. A crash might occur if AccountsControl, a Windows control, is registered incorrectly in Windows. To resolve this issue:
+
+1. In the taskbar, right-click **Start**, and then select **Windows PowerShell (Admin)**.
+1. If you're prompted by a User Account Control (UAC) dialog, select **Yes** to start PowerShell.
+1. Copy and then run the following script:
+
+ ```powershell
+    if (-not (Get-AppxPackage Microsoft.AccountsControl)) {
+        Add-AppxPackage -Register "$env:windir\SystemApps\Microsoft.AccountsControl_cw5n1h2txyewy\AppxManifest.xml" -DisableDevelopmentMode -ForceApplicationShutdown
+    }
+    Get-AppxPackage Microsoft.AccountsControl
+ ```
+
+### Connection issues
+
+The application user sees an error message similar to "Please check your connection and try again". If this issue occurs regularly, see the [troubleshooting guide for Office](/office365/troubleshoot/authentication/connection-issue-when-sign-in-office-2016), which also uses WAM.
## Sample
active-directory Workload Identity Federation Create Trust Gcp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/workload-identity-federation-create-trust-gcp.md
Take note of the *object ID* of the app (not the application (client) ID) which
## Grant your app permissions to resources
-Grant your app the permissions necessary to access the Azure AD protected resources targeted by your software workload running in Google Cloud. For example, [assign the Storage Blob Data Contributor role](/azure/storage/blobs/assign-azure-role-data-access) to your app if your application needs to read, write, and delete blob data in [Azure Storage](/azure/storage/blobs/storage-blobs-introduction).
+Grant your app the permissions necessary to access the Azure AD protected resources targeted by your software workload running in Google Cloud. For example, [assign the Storage Blob Data Contributor role](../../storage/blobs/assign-azure-role-data-access.md) to your app if your application needs to read, write, and delete blob data in [Azure Storage](../../storage/blobs/storage-blobs-introduction.md).
## Set up an identity in Google Cloud
active-directory Workload Identity Federation Create Trust Github https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/workload-identity-federation-create-trust-github.md
az rest -m DELETE -u 'https://graph.microsoft.com/beta/applications/f6475511-fd
Before configuring your GitHub Actions workflow, get the *tenant-id* and *client-id* values of your app registration. You can find these values in the Azure portal. Go to the list of [registered applications](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/RegisteredApps) and select your app registration. In **Overview**->**Essentials**, find the **Application (client) ID** and **Directory (tenant) ID**. Set these values in your GitHub environment to use in the Azure login action for your workflow.
## Next steps
-For an end-to-end example, read [Deploy to App Service using GitHub Actions](/azure/app-service/deploy-github-actions?tabs=openid).
+For an end-to-end example, read [Deploy to App Service using GitHub Actions](../../app-service/deploy-github-actions.md?tabs=openid).
Read the [GitHub Actions documentation](https://docs.github.com/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-azure) to learn more about configuring your GitHub Actions workflow to get an access token from Microsoft identity provider and access Azure resources.
active-directory Assign Local Admin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/assign-local-admin.md
Previously updated : 02/08/2022 Last updated : 02/15/2022
Currently, there's no UI in Intune to manage these policies and they need to be
- Adding Azure AD groups through the policy requires the group's SID that can be obtained by executing the [Microsoft Graph API for Groups](/graph/api/resources/group). The SID is defined by the property `securityIdentifier` in the API response.
-- Administrator privileges using this policy are evaluated only for the following well-known groups on a Windows 10 device - Administrators, Users, Guests, Power Users, Remote Desktop Users and Remote Management Users.
+- Administrator privileges using this policy are evaluated only for the following well-known groups on a Windows 10 or newer device - Administrators, Users, Guests, Power Users, Remote Desktop Users and Remote Management Users.
- Managing local administrators using Azure AD groups isn't applicable to Hybrid Azure AD joined or Azure AD Registered devices.
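As a hedged sketch of the `securityIdentifier` lookup mentioned above (the group object ID is a placeholder, and the exact property set returned can vary by Graph version):

```azurecli
# Fetch a group's SID from Microsoft Graph; replace {group-object-id} with your group's object ID.
az rest -m GET -u "https://graph.microsoft.com/v1.0/groups/{group-object-id}?\$select=displayName,securityIdentifier"
```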
active-directory Azuread Join Sso https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/azuread-join-sso.md
If you have a hybrid environment, with both Azure AD and on-premises AD, it's li
> [!NOTE]
> Windows Hello for Business requires additional configuration to enable on-premises SSO from an Azure AD joined device. For more information, see [Configure Azure AD joined devices for On-premises Single-Sign On using Windows Hello for Business](/windows/security/identity-protection/hello-for-business/hello-hybrid-aadj-sso-base).
>
-> FIDO2 security key based passwordless authentication with Windows 10 requires additional configuration to enable on-premises SSO from an Azure AD joined device. For more information, see [Enable passwordless security key sign-in to on-premises resources with Azure Active Directory](../authentication/howto-authentication-passwordless-security-key-on-premises.md).
+> FIDO2 security key based passwordless authentication with Windows 10 or newer requires additional configuration to enable on-premises SSO from an Azure AD joined device. For more information, see [Enable passwordless security key sign-in to on-premises resources with Azure Active Directory](../authentication/howto-authentication-passwordless-security-key-on-premises.md).
During an access attempt to a resource requesting Kerberos or NTLM in the user's on-premises environment, the device:
With SSO, on an Azure AD joined device you can:
- Access a UNC path on an AD member server
- Access an AD member web server configured for Windows-integrated security
-If you want to manage your on-premises AD from a Windows device, install the [Remote Server Administration Tools for Windows 10](https://www.microsoft.com/download/details.aspx?id=45520).
+If you want to manage your on-premises AD from a Windows device, install the [Remote Server Administration Tools](https://www.microsoft.com/download/details.aspx?id=45520).
You can use:
active-directory Azureadjoin Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/azureadjoin-plan.md
Previously updated : 01/20/2022 Last updated : 02/15/2022
If your identity provider doesn't support these protocols, Azure AD join doesn't
You can't use smartcards or certificate-based authentication to join devices to Azure AD. However, smartcards can be used to sign in to Azure AD joined devices if you have AD FS configured.
-**Recommendation:** Implement Windows Hello for Business for strong, password-less authentication to Windows 10 and above devices.
+**Recommendation:** Implement Windows Hello for Business for strong, password-less authentication to Windows 10 or newer.
### User configuration
Azure AD join:
- Isn't supported on previous versions of Windows or other operating systems. If you have Windows 7/8.1 devices, you must upgrade at least to Windows 10 to deploy Azure AD join.
- Is supported for FIPS-compliant TPM 2.0 but not supported for TPM 1.2. If your devices have FIPS-compliant TPM 1.2, you must disable them before proceeding with Azure AD join. Microsoft doesn't provide any tools for disabling FIPS mode for TPMs as it is dependent on the TPM manufacturer. Contact your hardware OEM for support.
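One way to check the TPM spec version before attempting Azure AD join is the `Win32_Tpm` WMI class (a sketch; run in an elevated PowerShell session on the Windows device):

```powershell
# Query the TPM through WMI; requires elevation.
$tpm = Get-CimInstance -Namespace "root\cimv2\security\microsofttpm" -ClassName Win32_Tpm
# SpecVersion is a string such as "2.0, 0, 1.38"; a 1.2 TPM in FIPS mode
# must be disabled first (contact your hardware OEM).
$tpm.SpecVersion
```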
-**Recommendation:** Always use the latest Windows 10 release to take advantage of updated features.
+**Recommendation:** Always use the latest Windows release to take advantage of updated features.
### Management platform
-Device management for Azure AD joined devices is based on an MDM platform such as Intune, and MDM CSPs. Windows 10 has a built-in MDM agent that works with all compatible MDM solutions.
+Device management for Azure AD joined devices is based on an MDM platform such as Intune, and MDM CSPs. Starting in Windows 10, there is a built-in MDM agent that works with all compatible MDM solutions.
> [!NOTE]
> Group policies are not supported in Azure AD joined devices as they are not connected to on-premises Active Directory. Management of Azure AD joined devices is only possible through MDM
Review supported and unsupported policies to determine whether you can use an MD
If your MDM solution isn't available through the Azure AD app gallery, you can add it following the process outlined in [Azure Active Directory integration with MDM](/windows/client-management/mdm/azure-active-directory-integration-with-mdm).
-Through co-management, you can use SCCM to manage certain aspects of your devices while policies are delivered through your MDM platform. Microsoft Intune enables co-management with SCCM. For more information on co-management for Windows 10 devices, see [What is co-management?](/configmgr/core/clients/manage/co-management-overview). If you use an MDM product other than Intune, check with your MDM provider on applicable co-management scenarios.
+Through co-management, you can use SCCM to manage certain aspects of your devices while policies are delivered through your MDM platform. Microsoft Intune enables co-management with SCCM. For more information on co-management for Windows 10 or newer devices, see [What is co-management?](/configmgr/core/clients/manage/co-management-overview). If you use an MDM product other than Intune, check with your MDM provider on applicable co-management scenarios.
**Recommendation:** Consider MDM only management for Azure AD joined devices.
Azure AD joined devices don't support on-premises applications relying on machin
Remote desktop connection to an Azure AD joined device requires the host machine to be either Azure AD joined or hybrid Azure AD joined. Remote desktop from an unjoined or non-Windows device isn't supported. For more information, see [Connect to remote Azure AD joined pc](/windows/client-management/connect-to-remote-aadj-pc)
-Starting Windows 10 2004 update, users can also use remote desktop from an Azure AD registered Windows 10 device to an Azure AD joined device.
+Starting with the Windows 10 2004 update, users can also use remote desktop from an Azure AD registered Windows 10 or newer device to another Azure AD joined device.
### RADIUS and Wi-Fi authentication
active-directory Concept Azure Ad Join Hybrid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/concept-azure-ad-join-hybrid.md
Previously updated : 01/26/2022 Last updated : 02/15/2022
Hybrid Azure AD joined devices require network line of sight to your on-premises
| **Primary audience** | Suitable for hybrid organizations with existing on-premises AD infrastructure |
| | Applicable to all users in an organization |
| **Device ownership** | Organization |
-| **Operating Systems** | Windows 10 and above, 8.1 and 7 |
+| **Operating Systems** | Windows 10 or newer, 8.1 and 7 |
| | Windows Server 2008/R2, 2012/R2, 2016 and 2019 |
-| **Provisioning** | Windows 10, Windows Server 2016/2019 |
+| **Provisioning** | Windows 10 or newer, Windows Server 2016/2019 |
| | Domain join by IT and autojoin via Azure AD Connect or ADFS config |
| | Domain join by Windows Autopilot and autojoin via Azure AD Connect or ADFS config |
| | Windows 8.1, Windows 7, Windows Server 2012 R2, Windows Server 2012, and Windows Server 2008 R2 - Require MSI |
active-directory Concept Azure Ad Register https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/concept-azure-ad-register.md
Previously updated : 01/26/2022 Last updated : 02/15/2022
The goal of Azure AD registered devices is to provide your users with support fo
| | Bring your own device |
| | Mobile devices |
| **Device ownership** | User or Organization |
-| **Operating Systems** | Windows 10 and above, iOS, Android, and macOS |
-| **Provisioning** | Windows 10 and above – Settings |
+| **Operating Systems** | Windows 10 or newer, iOS, Android, and macOS |
+| **Provisioning** | Windows 10 or newer – Settings |
| | iOS/Android – Company Portal or Microsoft Authenticator app |
| | macOS – Company Portal |
| **Device sign in options** | End-user local credentials |
The goal of Azure AD registered devices is to provide your users with support fo
![Azure AD registered devices](./media/concept-azure-ad-register/azure-ad-registered-device.png)
-Azure AD registered devices are signed in to using a local account like a Microsoft account on a Windows 10 and above device. These devices have an Azure AD account for access to organizational resources. Access to resources in the organization can be limited based on that Azure AD account and Conditional Access policies applied to the device identity.
+Azure AD registered devices are signed in to using a local account like a Microsoft account on a Windows 10 or newer device. These devices have an Azure AD account for access to organizational resources. Access to resources in the organization can be limited based on that Azure AD account and Conditional Access policies applied to the device identity.
Administrators can secure and further control these Azure AD registered devices using Mobile Device Management (MDM) tools like Microsoft Intune. MDM provides a means to enforce organization-required configurations like requiring storage to be encrypted, password complexity, and security software kept updated.
active-directory Concept Primary Refresh Token https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/concept-primary-refresh-token.md
Previously updated : 09/13/2021 Last updated : 02/15/2022
# What is a Primary Refresh Token?
-A Primary Refresh Token (PRT) is a key artifact of Azure AD authentication on Windows 10, Windows Server 2016 and later versions, iOS, and Android devices. It is a JSON Web Token (JWT) specially issued to Microsoft first party token brokers to enable single sign-on (SSO) across the applications used on those devices. In this article, we will provide details on how a PRT is issued, used, and protected on Windows 10 devices.
+A Primary Refresh Token (PRT) is a key artifact of Azure AD authentication on Windows 10 or newer, Windows Server 2016 and later versions, iOS, and Android devices. It is a JSON Web Token (JWT) specially issued to Microsoft first party token brokers to enable single sign-on (SSO) across the applications used on those devices. In this article, we will provide details on how a PRT is issued, used, and protected on Windows 10 or newer devices.
-This article assumes that you already understand the different device states available in Azure AD and how single sign-on works in Windows 10. For more information about devices in Azure AD, see the article [What is device management in Azure Active Directory?](overview.md)
+This article assumes that you already understand the different device states available in Azure AD and how single sign-on works in Windows 10 or newer. For more information about devices in Azure AD, see the article [What is device management in Azure Active Directory?](overview.md)
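On a Windows device, the device state and PRT status discussed in this article can be inspected with the built-in `dsregcmd` tool (field names below reflect typical output and may vary by Windows build):

```powershell
# Run in the user's session on the Windows device.
dsregcmd /status
# Look for AzureAdJoined / DomainJoined under "Device State",
# and AzureAdPrt under "SSO State" to confirm a PRT was acquired.
```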
## Key terminology and components
The following Windows components play a key role in requesting and using a PRT:
-* **Cloud Authentication Provider** (CloudAP): CloudAP is the modern authentication provider for Windows sign in, that verifies users logging to a Windows 10 device. CloudAP provides a plugin framework that identity providers can build on to enable authentication to Windows using that identity provider’s credentials.
-* **Web Account Manager** (WAM): WAM is the default token broker on Windows 10 devices. WAM also provides a plugin framework that identity providers can build on and enable SSO to their applications relying on that identity provider. (Not included in Windows Server 2016 LTSC builds)
+* **Cloud Authentication Provider** (CloudAP): CloudAP is the modern authentication provider for Windows sign-in that verifies users logging in to a Windows 10 or newer device. CloudAP provides a plugin framework that identity providers can build on to enable authentication to Windows using that identity provider’s credentials.
+* **Web Account Manager** (WAM): WAM is the default token broker on Windows 10 or newer devices. WAM also provides a plugin framework that identity providers can build on and enable SSO to their applications relying on that identity provider. (Not included in Windows Server 2016 LTSC builds)
* **Azure AD CloudAP plugin**: An Azure AD specific plugin built on the CloudAP framework, that verifies user credentials with Azure AD during Windows sign in.
* **Azure AD WAM plugin**: An Azure AD specific plugin built on the WAM framework, that enables SSO to applications that rely on Azure AD for authentication.
-* **Dsreg**: An Azure AD specific component on Windows 10, that handles the device registration process for all device states.
+* **Dsreg**: An Azure AD specific component on Windows 10 or newer, that handles the device registration process for all device states.
* **Trusted Platform Module** (TPM): A TPM is a hardware component built into a device, that provides hardware-based security functions for user and device secrets. More details can be found in the article [Trusted Platform Module Technology Overview](/windows/security/information-protection/tpm/trusted-platform-module-overview).
## What does the PRT contain?
Device registration is a prerequisite for device based authentication in Azure A
The private keys are bound to the device’s TPM if the device has a valid and functioning TPM, while the public keys are sent to Azure AD during the device registration process. These keys are used to validate the device state during PRT requests.
-The PRT is issued during user authentication on a Windows 10 device in two scenarios:
+The PRT is issued during user authentication on a Windows 10 or newer device in two scenarios:
-* **Azure AD joined** or **Hybrid Azure AD joined**: A PRT is issued during Windows logon when a user signs in with their organization credentials. A PRT is issued with all Windows 10 supported credentials, for example, password and Windows Hello for Business. In this scenario, Azure AD CloudAP plugin is the primary authority for the PRT.
-* **Azure AD registered device**: A PRT is issued when a user adds a secondary work account to their Windows 10 device. Users can add an account to Windows 10 in two different ways -
+* **Azure AD joined** or **Hybrid Azure AD joined**: A PRT is issued during Windows logon when a user signs in with their organization credentials. A PRT is issued with all Windows 10 or newer supported credentials, for example, password and Windows Hello for Business. In this scenario, Azure AD CloudAP plugin is the primary authority for the PRT.
+* **Azure AD registered device**: A PRT is issued when a user adds a secondary work account to their Windows 10 or newer device. Users can add an account to Windows 10 or newer in two different ways -
  * Adding an account via the **Allow my organization to manage my device** prompt after signing in to an app (for example, Outlook)
  * Adding an account from **Settings** > **Accounts** > **Access Work or School** > **Connect**

In Azure AD registered device scenarios, the Azure AD WAM plugin is the primary authority for the PRT since Windows logon is not happening with this Azure AD account.

> [!NOTE]
-> 3rd party identity providers need to support the WS-Trust protocol to enable PRT issuance on Windows 10 devices. Without WS-Trust, PRT cannot be issued to users on Hybrid Azure AD joined or Azure AD joined devices. On ADFS only usernamemixed endpoints are required. Both adfs/services/trust/2005/windowstransport and adfs/services/trust/13/windowstransport should be enabled as intranet facing endpoints only and **must NOT be exposed** as extranet facing endpoints through the Web Application Proxy
+> 3rd party identity providers need to support the WS-Trust protocol to enable PRT issuance on Windows 10 or newer devices. Without WS-Trust, PRT cannot be issued to users on Hybrid Azure AD joined or Azure AD joined devices. On ADFS only usernamemixed endpoints are required. Both adfs/services/trust/2005/windowstransport and adfs/services/trust/13/windowstransport should be enabled as intranet facing endpoints only and **must NOT be exposed** as extranet facing endpoints through the Web Application Proxy
> [!NOTE]
> Azure AD Conditional Access policies are not evaluated when PRTs are issued
Once issued, a PRT is valid for 14 days and is continuously renewed as long as t
A PRT is used by two key components in Windows:
* **Azure AD CloudAP plugin**: During Windows sign in, the Azure AD CloudAP plugin requests a PRT from Azure AD using the credentials provided by the user. It also caches the PRT to enable cached sign in when the user does not have access to an internet connection.
-* **Azure AD WAM plugin**: When users try to access applications, the Azure AD WAM plugin uses the PRT to enable SSO on Windows 10. Azure AD WAM plugin uses the PRT to request refresh and access tokens for applications that rely on WAM for token requests. It also enables SSO on browsers by injecting the PRT into browser requests. Browser SSO in Windows 10 is supported on Microsoft Edge (natively), Chrome (via the [Windows 10 Accounts](https://chrome.google.com/webstore/detail/windows-10-accounts/ppnbnpeolgkicgegkbkbjmhlideopiji?hl=en) or [Office Online](https://chrome.google.com/webstore/detail/office/ndjpnladcallmjemlbaebfadecfhkepb?hl=en) extensions) or Mozilla Firefox v91+ (Firefox [Windows SSO setting](https://support.mozilla.org/kb/windows-sso))
+* **Azure AD WAM plugin**: When users try to access applications, the Azure AD WAM plugin uses the PRT to enable SSO on Windows 10 or newer. Azure AD WAM plugin uses the PRT to request refresh and access tokens for applications that rely on WAM for token requests. It also enables SSO on browsers by injecting the PRT into browser requests. Browser SSO in Windows 10 or newer is supported on Microsoft Edge (natively), Chrome (via the [Windows 10 Accounts](https://chrome.google.com/webstore/detail/windows-10-accounts/ppnbnpeolgkicgegkbkbjmhlideopiji?hl=en) or [Office Online](https://chrome.google.com/webstore/detail/office/ndjpnladcallmjemlbaebfadecfhkepb?hl=en) extensions) or Mozilla Firefox v91+ (Firefox [Windows SSO setting](https://support.mozilla.org/kb/windows-sso))
> [!NOTE] > In instances where a user has two accounts from the same Azure AD tenant signed in to a browser application, the device authentication provided by the PRT of the primary account is automatically applied to the second account as well. As a result, the second account also satisfies any device-based Conditional Access policy on the tenant.
A PRT is used by two key components in Windows:
A PRT is renewed in two different methods:
* **Azure AD CloudAP plugin every 4 hours**: The CloudAP plugin renews the PRT every 4 hours during Windows sign in. If the user does not have internet connection during that time, CloudAP plugin will renew the PRT after the device is connected to the internet.
-* **Azure AD WAM plugin during app token requests**: The WAM plugin enables SSO on Windows 10 devices by enabling silent token requests for applications. The WAM plugin can renew the PRT during these token requests in two different ways:
+* **Azure AD WAM plugin during app token requests**: The WAM plugin enables SSO on Windows 10 or newer devices by enabling silent token requests for applications. The WAM plugin can renew the PRT during these token requests in two different ways:
  * An app requests WAM for an access token silently, but there's no refresh token available for that app. In this case, WAM uses the PRT to request a token for the app and gets back a new PRT in the response.
  * An app requests WAM for an access token, but the PRT is invalid or Azure AD requires additional authorization (for example, Azure AD Multi-Factor Authentication). In this scenario, WAM initiates an interactive logon that requires the user to reauthenticate or provide additional verification, and a new PRT is issued on successful authentication.
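The two renewal paths above can be sketched as a decision function. This is an illustrative simulation only; the names `TokenCache` and `request_token` and the token strings are hypothetical stand-ins, not the actual WAM API:

```python
from dataclasses import dataclass, field

@dataclass
class TokenCache:
    """Illustrative stand-in for WAM's per-app token cache."""
    refresh_tokens: dict = field(default_factory=dict)  # app_id -> refresh token
    prt_valid: bool = True

def request_token(cache: TokenCache, app_id: str) -> str:
    """Sketch of the two WAM renewal paths described above."""
    if not cache.prt_valid:
        # Path 2: PRT invalid (or extra authorization required), so an
        # interactive logon runs; a new PRT is issued on success.
        cache.prt_valid = True  # user reauthenticated interactively
        cache.refresh_tokens[app_id] = f"rt-{app_id}"
        return f"at-{app_id}-after-interactive"
    if app_id not in cache.refresh_tokens:
        # Path 1: no refresh token for this app, so the PRT is used to
        # request an app token; the response also carries a renewed PRT.
        cache.refresh_tokens[app_id] = f"rt-{app_id}"
        return f"at-{app_id}-via-prt"
    # Normal silent path: redeem the cached refresh token.
    return f"at-{app_id}-via-rt"
```

The first silent request for an app goes through the PRT; subsequent requests use the cached refresh token until the PRT is invalidated.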
Windows transport endpoints are required for password authentication only when a
## How is the PRT protected?
-A PRT is protected by binding it to the device the user has signed in to. Azure AD and Windows 10 enable PRT protection through the following methods:
+A PRT is protected by binding it to the device the user has signed in to. Azure AD and Windows 10 or newer enable PRT protection through the following methods:
* **During first sign in**: During first sign in, a PRT is issued by signing requests using the device key cryptographically generated during device registration. On a device with a valid and functioning TPM, the device key is secured by the TPM, preventing any malicious access. A PRT is not issued if the corresponding device key signature cannot be validated.
* **During token requests and renewal**: When a PRT is issued, Azure AD also issues an encrypted session key to the device. It is encrypted with the public transport key (tkpub) generated and sent to Azure AD as part of device registration. This session key can only be decrypted by the private transport key (tkpriv) secured by the TPM. The session key is the Proof-of-Possession (POP) key for any requests sent to Azure AD. The session key is also protected by the TPM, and no other OS component can access it. Token requests or PRT renewal requests are securely signed by this session key through the TPM and hence cannot be tampered with. Azure AD will invalidate any requests from the device that are not signed by the corresponding session key.
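The proof-of-possession pattern described above can be illustrated with a symmetric signature over the request body. This is a simplified sketch under stated assumptions: real requests are signed by the TPM-held session key using JWS, not this ad-hoc HMAC, and the function names are hypothetical:

```python
import hashlib
import hmac

def sign_request(session_key: bytes, request_body: bytes) -> str:
    """Client side: sign a token request with the session key (POP-style)."""
    return hmac.new(session_key, request_body, hashlib.sha256).hexdigest()

def validate_request(session_key: bytes, request_body: bytes, signature: str) -> bool:
    """Server side: reject any request not signed by the matching session key."""
    expected = sign_request(session_key, request_body)
    return hmac.compare_digest(expected, signature)
```

Any tampering with the signed body, or signing with a different key, fails validation; this mirrors why Azure AD can invalidate requests not signed by the corresponding session key.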
-By securing these keys with the TPM, we enhance the security for PRT from malicious actors trying to steal the keys or replay the PRT. So, using a TPM greatly enhances the security of Azure AD Joined, Hybrid Azure AD joined, and Azure AD registered devices against credential theft. For performance and reliability, TPM 2.0 is the recommended version for all Azure AD device registration scenarios on Windows 10. Starting Windows 10, 1903 update, Azure AD does not use TPM 1.2 for any of the above keys due to reliability issues.
+By securing these keys with the TPM, we enhance the security for PRT from malicious actors trying to steal the keys or replay the PRT. So, using a TPM greatly enhances the security of Azure AD Joined, Hybrid Azure AD joined, and Azure AD registered devices against credential theft. For performance and reliability, TPM 2.0 is the recommended version for all Azure AD device registration scenarios on Windows 10 or newer. Starting with the Windows 10, 1903 update, Azure AD does not use TPM 1.2 for any of the above keys due to reliability issues.
### How are app tokens and browser cookies protected?

**App tokens**: When an app requests a token through WAM, Azure AD issues a refresh token and an access token. However, WAM only returns the access token to the app and secures the refresh token in its cache by encrypting it with the user's data protection application programming interface (DPAPI) key. WAM securely uses the refresh token by signing requests with the session key to issue further access tokens. The DPAPI key is secured by an Azure AD based symmetric key in Azure AD itself. When the device needs to decrypt the user profile with the DPAPI key, Azure AD provides the DPAPI key encrypted by the session key, which the CloudAP plugin requests the TPM to decrypt. This functionality ensures consistency in securing refresh tokens and avoids applications implementing their own protection mechanisms.
-**Browser cookies**: In Windows 10, Azure AD supports browser SSO in Internet Explorer and Microsoft Edge natively, in Google Chrome via the Windows 10 accounts extension and in Mozilla Firefox v91+ via a browser setting. The security is built not only to protect the cookies but also the endpoints to which the cookies are sent. Browser cookies are protected the same way a PRT is, by utilizing the session key to sign and protect the cookies.
+**Browser cookies**: In Windows 10 or newer, Azure AD supports browser SSO in Internet Explorer and Microsoft Edge natively, in Google Chrome via the Windows 10 accounts extension and in Mozilla Firefox v91+ via a browser setting. The security is built not only to protect the cookies but also the endpoints to which the cookies are sent. Browser cookies are protected the same way a PRT is, by utilizing the session key to sign and protect the cookies.
When a user initiates a browser interaction, the browser (or extension) invokes a COM native client host. The native client host ensures that the page is from one of the allowed domains. The browser could send other parameters to the native client host, including a nonce, however the native client host guarantees validation of the hostname. The native client host requests a PRT-cookie from the CloudAP plugin, which creates and signs it with the TPM-protected session key. As the PRT-cookie is signed by the session key, it is very difficult to tamper with. This PRT-cookie is included in the request header for Azure AD to validate the device it is originating from. If using the Chrome browser, only the extension explicitly defined in the native client host's manifest can invoke it, preventing arbitrary extensions from making these requests. Once Azure AD validates the PRT cookie, it issues a session cookie to the browser. This session cookie also contains the same session key issued with a PRT. During subsequent requests, the session key is validated, effectively binding the cookie to the device and preventing replays from elsewhere.
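The native client host's allow-list check described above amounts to validating the requesting page's hostname before a PRT cookie is issued. A simplified, illustrative sketch; the real allow-list and host are internal to Windows, and the hostnames below are assumptions for the example:

```python
from urllib.parse import urlparse

# Illustrative allow-list; the real one ships inside the native client host.
ALLOWED_HOSTS = {"login.microsoftonline.com", "login.microsoft.com"}

def may_issue_prt_cookie(page_url: str) -> bool:
    """Only pages from allowed Azure AD sign-in domains may request a PRT cookie."""
    host = urlparse(page_url).hostname or ""
    return host.lower() in ALLOWED_HOSTS
```

Note that exact-hostname matching defeats lookalike tricks such as embedding the allowed name in a path or in a longer attacker-controlled domain.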
A PRT can get a multi-factor authentication (MFA) claim in specific scenarios. W
  * In this case, the MFA claim is not updated continuously, so the MFA duration is based on the lifetime set on the directory.
  * When a previous existing PRT and RT are used for access to an app, the PRT and RT will be regarded as the first proof of authentication. A new AT will be required with a second proof and an imprinted MFA claim. This will also issue a new PRT and RT.
-Windows 10 maintains a partitioned list of PRTs for each credential. So, there's a PRT for each of Windows Hello for Business, password, or smartcard. This partitioning ensures that MFA claims are isolated based on the credential used, and not mixed up during token requests.
+Windows 10 or newer maintains a partitioned list of PRTs for each credential. So, there's a PRT for each of Windows Hello for Business, password, or smartcard. This partitioning ensures that MFA claims are isolated based on the credential used, and not mixed up during token requests.
## How is a PRT invalidated?
The following diagrams illustrate the underlying details in issuing, renewing, a
## Next steps
-For more information on troubleshooting PRT-related issues, see the article [Troubleshooting hybrid Azure Active Directory joined Windows 10 and Windows Server 2016 devices](troubleshoot-hybrid-join-windows-current.md#troubleshoot-post-join-authentication-issues).
+For more information on troubleshooting PRT-related issues, see the article [Troubleshooting hybrid Azure Active Directory joined Windows 10 or newer and Windows Server 2016 devices](troubleshoot-hybrid-join-windows-current.md#troubleshoot-post-join-authentication-issues).
active-directory Device Management Azure Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/device-management-azure-portal.md
Previously updated : 01/25/2022 Last updated : 02/15/2022
From there, you can go to **All devices** to:
[![Screenshot that shows the All devices view in the Azure portal.](./media/device-management-azure-portal/all-devices-azure-portal.png)](./media/device-management-azure-portal/all-devices-azure-portal.png#lightbox)

> [!TIP]
-> - Hybrid Azure AD joined Windows 10 devices don't have an owner. If you're looking for a device by owner and don't find it, search by the device ID.
+> - Hybrid Azure AD joined Windows 10 or newer devices don't have an owner. If you're looking for a device by owner and don't find it, search by the device ID.
>
> - If you see a device that's **Hybrid Azure AD joined** with a state of **Pending** in the **Registered** column, the device has been synchronized from Azure AD Connect and is waiting to complete registration from the client. See [How to plan your Hybrid Azure AD join implementation](hybrid-azuread-join-plan.md). For more information, see [Device management frequently asked questions](faq.yml).
You must be assigned one of the following roles to view or manage device setting
- **Users may join devices to Azure AD**: This setting enables you to select the users who can register their devices as Azure AD joined devices. The default is **All**.

> [!NOTE]
- > The **Users may join devices to Azure AD** setting is applicable only to Azure AD join on Windows 10. This setting doesn't apply to hybrid Azure AD joined devices, [Azure AD joined VMs in Azure](./howto-vm-sign-in-azure-ad-windows.md#enabling-azure-ad-login-in-for-windows-vm-in-azure), or Azure AD joined devices that use [Windows Autopilot self-deployment mode](/mem/autopilot/self-deploying) because these methods work in a userless context.
+ > The **Users may join devices to Azure AD** setting is applicable only to Azure AD join on Windows 10 or newer. This setting doesn't apply to hybrid Azure AD joined devices, [Azure AD joined VMs in Azure](./howto-vm-sign-in-azure-ad-windows.md#enabling-azure-ad-login-in-for-windows-vm-in-azure), or Azure AD joined devices that use [Windows Autopilot self-deployment mode](/mem/autopilot/self-deploying) because these methods work in a userless context.
- **Additional local administrators on Azure AD joined devices**: This setting allows you to select the users who are granted local administrator rights on a device. These users are added to the Device Administrators role in Azure AD. Global Administrators in Azure AD and device owners are granted local administrator rights by default. This option is a premium edition capability available through products like Azure AD Premium and Enterprise Mobility + Security.
-- **Users may register their devices with Azure AD**: You need to configure this setting to allow users to register Windows 10 personal, iOS, Android, and macOS devices with Azure AD. If you select **None**, devices aren't allowed to register with Azure AD. Enrollment with Microsoft Intune or mobile device management for Microsoft 365 requires registration. If you've configured either of these services, **ALL** is selected and **NONE** is unavailable.
+- **Users may register their devices with Azure AD**: You need to configure this setting to allow users to register Windows 10 or newer personal, iOS, Android, and macOS devices with Azure AD. If you select **None**, devices aren't allowed to register with Azure AD. Enrollment with Microsoft Intune or mobile device management for Microsoft 365 requires registration. If you've configured either of these services, **ALL** is selected and **NONE** is unavailable.
- **Require Multi-Factor Authentication to register or join devices with Azure AD**: This setting allows you to specify whether users are required to provide another authentication factor to join or register their devices to Azure AD. The default is **No**. We recommend that you require multifactor authentication when a device is registered or joined. Before you enable multifactor authentication for this service, you must ensure that multifactor authentication is configured for users that register their devices. For more information on Azure AD Multi-Factor Authentication services, see [getting started with Azure AD Multi-Factor Authentication](../authentication/concept-mfa-howitworks.md).

> [!NOTE]
active-directory Device Registration How It Works https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/device-registration-how-it-works.md
Previously updated : 08/16/2021 Last updated : 02/15/2022
Device Registration is a prerequisite to cloud-based authentication. Commonly, d
| Phase | Description |
| :-: | -- |
-| A | The user signs in to a domain joined Windows 10 computer using domain credentials. This credential can be user name and password or smart card authentication. The user sign-in triggers the Automatic Device Join task. The Automatic Device Join tasks is triggered on domain join and retried every hour. It doesn't solely depend on the user sign-in. |
+| A | The user signs in to a domain joined Windows 10 or newer computer using domain credentials. This credential can be user name and password or smart card authentication. The user sign-in triggers the Automatic Device Join task. The Automatic Device Join task is triggered on domain join and retried every hour. It doesn't solely depend on the user sign-in. |
| B | The task queries Active Directory using the LDAP protocol for the keywords attribute on the service connection point stored in the configuration partition in Active Directory (`CN=62a0ff2e-97b9-4513-943f-0d221bd30080,CN=Device Registration Configuration,CN=Services,CN=Configuration,DC=corp,DC=contoso,DC=com`). The value returned in the keywords attribute determines if device registration is directed to Azure Device Registration Service (ADRS) or the enterprise device registration service hosted on-premises. |
| C | For the managed environment, the task creates an initial authentication credential in the form of a self-signed certificate. The task writes the certificate to the userCertificate attribute on the computer object in Active Directory using LDAP. |
| D | The computer can't authenticate to Azure DRS until a device object representing the computer that includes the certificate on the userCertificate attribute is created in Azure AD. Azure AD Connect detects an attribute change. On the next synchronization cycle, Azure AD Connect sends the userCertificate, object GUID, and computer SID to Azure DRS. Azure DRS uses the attribute information to create a device object in Azure AD. |
Device Registration is a prerequisite to cloud-based authentication. Commonly, d
| Phase | Description |
| :-: | :-- |
-| A | The user signs in to a domain joined Windows 10 computer using domain credentials. This credential can be user name and password or smart card authentication. The user sign-in triggers the Automatic Device Join task. The Automatic Device Join tasks is triggered on domain join and retried every hour. It doesn't solely depend on the user sign-in. |
+| A | The user signs in to a domain joined Windows 10 or newer computer using domain credentials. This credential can be user name and password or smart card authentication. The user sign-in triggers the Automatic Device Join task. The Automatic Device Join task is triggered on domain join and retried every hour. It doesn't solely depend on the user sign-in. |
| B | The task queries Active Directory using the LDAP protocol for the keywords attribute on the service connection point stored in the configuration partition in Active Directory (`CN=62a0ff2e-97b9-4513-943f-0d221bd30080,CN=Device Registration Configuration,CN=Services,CN=Configuration,DC=corp,DC=contoso,DC=com`). The value returned in the keywords attribute determines if device registration is directed to Azure Device Registration Service (ADRS) or the enterprise device registration service hosted on-premises. |
| C | For federated environments, the computer authenticates the enterprise device registration endpoint using Windows Integrated Authentication. The enterprise device registration service creates and returns a token that includes claims for the object GUID, computer SID, and domain joined state. The task submits the token and claims to Azure AD where they're validated. Azure AD returns an ID token to the running task. |
| D | The application creates a TPM-bound (preferred) RSA 2048-bit key pair known as the device key (dkpub/dkpriv). The application creates a certificate request using dkpub and the public key and signs the certificate request using dkpriv. Next, the application derives a second key pair from the TPM's storage root key. This key is the transport key (tkpub/tkpriv). |
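Phase B above reads the `keywords` attribute on the service connection point and routes registration accordingly. A minimal sketch of that routing decision; the `azureADName`, `azureADId`, and `enterpriseDrsName` key names are shown as commonly documented, but treat this parser as illustrative rather than the actual task's logic:

```python
def route_registration(keywords: list[str]) -> dict:
    """Decide where the Automatic Device Join task directs registration,
    based on the SCP keywords values read over LDAP."""
    # Keywords are "name:value" strings; split on the first colon only.
    parsed = dict(kw.split(":", 1) for kw in keywords if ":" in kw)
    if "enterpriseDrsName" in parsed:
        # On-premises enterprise device registration service.
        return {"target": "enterprise-drs", "endpoint": parsed["enterpriseDrsName"]}
    # Default: Azure Device Registration Service (ADRS).
    return {
        "target": "azure-drs",
        "tenant_name": parsed.get("azureADName"),
        "tenant_id": parsed.get("azureADId"),
    }
```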
active-directory Enterprise State Roaming Enable https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/enterprise-state-roaming-enable.md
Previously updated : 02/12/2020 Last updated : 02/15/2022
When you enable Enterprise State Roaming, your organization is automatically gra
![image of device setting labeled Users may sync settings and app data across devices](./media/enterprise-state-roaming-enable/device-settings.png)
-For a Windows 10 device to use the Enterprise State Roaming service, the device must authenticate using an Azure AD identity. For devices that are joined to Azure AD, the userΓÇÖs primary sign-in identity is their Azure AD identity, so no additional configuration is required. For devices that use on-premises Active Directory, the IT admin must [Configure hybrid Azure Active Directory joined devices](./hybrid-azuread-join-plan.md).
+For a Windows 10 or newer device to use the Enterprise State Roaming service, the device must authenticate using an Azure AD identity. For devices that are joined to Azure AD, the userΓÇÖs primary sign-in identity is their Azure AD identity, so no additional configuration is required. For devices that use on-premises Active Directory, the IT admin must [Configure hybrid Azure Active Directory joined devices](./hybrid-azuread-join-plan.md).
## Data storage
active-directory Enterprise State Roaming Group Policy Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/enterprise-state-roaming-group-policy-settings.md
Previously updated : 02/12/2020 Last updated : 02/15/2022
The following tables describe the policy settings available.
## MDM settings
-The MDM policy settings apply to both Windows 10 and Windows 10 Mobile. Windows 10 Mobile support exists only for Microsoft account based roaming via user's OneDrive account. Refer to [Devices and endpoints](enterprise-state-roaming-windows-settings-reference.md) for details on what devices are supported for Azure AD-based syncing.
+The MDM policy settings apply to Windows 10 or newer. Refer to [Devices and endpoints](enterprise-state-roaming-windows-settings-reference.md) for details on what devices are supported for Azure AD-based syncing.
| Name | Description | | | |
The MDM policy settings apply to both Windows 10 and Windows 10 Mobile. Windows
## Group policy settings
-The group policy settings apply to Windows 10 devices that are joined to an Active Directory domain. The table also includes legacy settings that would appear to manage sync settings, but that do not work for Enterprise State Roaming for Windows 10, which are noted with 'Do not use' in the description.
+The group policy settings apply to Windows 10 or newer devices that are joined to an Active Directory domain. The table also includes legacy settings that would appear to manage sync settings, but that do not work for Enterprise State Roaming for Windows 10 or newer, which are noted with 'Do not use' in the description.
These settings are located at: `Computer Configuration > Administrative Templates > Windows Components > Sync your settings`
active-directory Enterprise State Roaming Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/enterprise-state-roaming-overview.md
Previously updated : 02/12/2020 Last updated : 02/15/2022
# What is enterprise state roaming?
-With Windows 10, [Azure Active Directory (Azure AD)](../fundamentals/active-directory-whatis.md) users gain the ability to securely synchronize their user settings and application settings data to the cloud. Enterprise State Roaming provides users with a unified experience across their Windows devices and reduces the time needed for configuring a new device. Enterprise State Roaming operates similar to the standard [consumer settings sync](https://go.microsoft.com/fwlink/?linkid=2015135) that was first introduced in Windows 8. Additionally, Enterprise State Roaming offers:
+With Windows 10 or newer, [Azure Active Directory (Azure AD)](../fundamentals/active-directory-whatis.md) users gain the ability to securely synchronize their user settings and application settings data to the cloud. Enterprise State Roaming provides users with a unified experience across their Windows devices and reduces the time needed for configuring a new device. Enterprise State Roaming operates similarly to the standard [consumer settings sync](https://go.microsoft.com/fwlink/?linkid=2015135) that was first introduced in Windows 8. Additionally, Enterprise State Roaming offers:
* **Separation of corporate and consumer data** – Organizations are in control of their data, and there is no mixing of corporate data in a consumer cloud account or consumer data in an enterprise cloud account.
-* **Enhanced security** – Data is automatically encrypted before leaving the user's Windows 10 device by using Azure Rights Management (Azure RMS), and data stays encrypted at rest in the cloud. All content stays encrypted at rest in the cloud, except for the namespaces, like settings names and Windows app names.
+* **Enhanced security** – Data is automatically encrypted before leaving the user's Windows 10 or newer device by using Azure Rights Management (Azure RMS), and data stays encrypted at rest in the cloud. All content stays encrypted at rest in the cloud, except for the namespaces, like settings names and Windows app names.
* **Better management and monitoring** – Provides control and visibility over who syncs settings in your organization and on which devices through the Azure AD portal integration.

Enterprise State Roaming is available in multiple Azure regions. You can find the updated list of available regions on the [Azure Services by Regions](https://azure.microsoft.com/regions/#services) page under Azure Active Directory.
Enterprise State Roaming is available in multiple Azure regions. You can find th
| | |
| [Enable Enterprise State Roaming in Azure Active Directory](enterprise-state-roaming-enable.md) |Enterprise State Roaming is available to any organization with a Premium Azure Active Directory (Azure AD) subscription. For more information on how to get an Azure AD subscription, see the [Azure AD product](https://azure.microsoft.com/services/active-directory) page. |
| [Settings and data roaming FAQ](enterprise-state-roaming-faqs.yml) |This article answers some questions IT administrators might have about settings and app data sync. |
-| [Group policy and MDM settings for settings sync](enterprise-state-roaming-group-policy-settings.md) |Windows 10 provides Group Policy and mobile device management (MDM) policy settings to limit settings sync. |
-| [Windows 10 roaming settings reference](enterprise-state-roaming-windows-settings-reference.md) |A list of settings that will be roamed and/or backed-up in Windows 10. |
+| [Group policy and MDM settings for settings sync](enterprise-state-roaming-group-policy-settings.md) |Windows 10 or newer provides Group Policy and mobile device management (MDM) policy settings to limit settings sync. |
+| [Windows 10 roaming settings reference](enterprise-state-roaming-windows-settings-reference.md) |A list of settings that will be roamed and/or backed-up in Windows 10 or newer. |
| [Troubleshooting](enterprise-state-roaming-troubleshooting.md) |This article goes through some basic steps for troubleshooting, and contains a list of known issues. |

## Next steps
active-directory Enterprise State Roaming Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/enterprise-state-roaming-troubleshooting.md
Previously updated : 02/12/2020 Last updated : 02/15/2022
This topic provides information on how to troubleshoot and diagnose issues with
Before you start troubleshooting, verify that the user and device have been configured properly, and that all the requirements of Enterprise State Roaming are met by the device and the user.
-1. Windows 10, with the latest updates, and a minimum Version 1511 (OS Build 10586 or later) is installed on the device.
+1. Windows 10 or newer, with the latest updates, and a minimum Version 1511 (OS Build 10586 or later) is installed on the device.
1. The device is Azure AD joined or hybrid Azure AD joined. For more information, see [how to get a device under the control of Azure AD](overview.md).
1. Ensure that **Enterprise State Roaming** is enabled for the tenant in Azure AD as described in [To enable Enterprise State Roaming](enterprise-state-roaming-enable.md). You can enable roaming for all users or for only a selected group of users.
1. The user is assigned an Azure Active Directory Premium license.
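The first prerequisite (minimum Version 1511, OS Build 10586) lends itself to a quick programmatic check. A hedged sketch; on a real device you would read the build number from the registry or `platform.version()`, while here it is passed in:

```python
MIN_ESR_BUILD = 10586  # Windows 10 Version 1511

def meets_esr_build_requirement(os_build: int) -> bool:
    """Return True if the OS build satisfies the Enterprise State Roaming
    minimum (Version 1511, OS Build 10586 or later)."""
    return os_build >= MIN_ESR_BUILD
```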
This section gives suggestions on how to troubleshoot and diagnose problems rela
## Verify sync, and the "Sync your settings" settings page
-1. After joining your Windows 10 PC to a domain that is configured to allow Enterprise State Roaming, sign on with your work account. Go to **Settings** > **Accounts** > **Sync Your Settings** and confirm that sync and the individual settings are on, and that the top of the settings page indicates that you are syncing with your work account. Confirm the same account is also used as your login account in **Settings** > **Accounts** > **Your Info**.
+1. After joining your Windows 10 or newer PC to a domain that is configured to allow Enterprise State Roaming, sign on with your work account. Go to **Settings** > **Accounts** > **Sync Your Settings** and confirm that sync and the individual settings are on, and that the top of the settings page indicates that you are syncing with your work account. Confirm the same account is also used as your login account in **Settings** > **Accounts** > **Your Info**.
1. Verify that sync works across multiple machines by making some changes on the original machine, such as moving the taskbar to the right or top side of the screen. Watch the change propagate to the second machine within five minutes.
   * Locking and unlocking the screen (Win + L) can help trigger a sync.
This section gives suggestions on how to troubleshoot and diagnose problems rela
### Verify the device registration status
-Enterprise State Roaming requires the device to be registered with Azure AD. Although not specific to Enterprise State Roaming, following the instructions below can help confirm that the Windows 10 Client is registered, and confirm thumbprint, Azure AD settings URL, NGC status, and other information.
+Enterprise State Roaming requires the device to be registered with Azure AD. Although not specific to Enterprise State Roaming, following the instructions below can help confirm that the Windows 10 or newer Client is registered, and confirm thumbprint, Azure AD settings URL, NGC status, and other information.
1. Open the command prompt unelevated. To do this in Windows, open the Run launcher (Win + R) and type "cmd" to open it.
1. Once the command prompt is open, type "*dsregcmd.exe /status*".
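The `dsregcmd /status` output is plain `Key : Value` text, so the fields called out above (join state, thumbprint, NGC status) can be pulled out with a few lines of Python. The sample output below is abbreviated and illustrative, not a complete real capture:

```python
# Abbreviated, illustrative dsregcmd /status output.
SAMPLE = """\
AzureAdJoined : YES
DomainJoined : YES
Thumbprint : AB12CD34EF56
NgcSet : NO
"""

def parse_dsregcmd(text: str) -> dict:
    """Parse 'Key : Value' lines from dsregcmd /status output into a dict."""
    result = {}
    for line in text.splitlines():
        if " : " in line:
            key, value = line.split(" : ", 1)
            result[key.strip()] = value.strip()
    return result
```

In practice you would feed this the captured output of `dsregcmd.exe /status` and inspect fields such as `AzureAdJoined` or `NgcSet`.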
Enterprise State Roaming requires the device to be registered with Azure AD. Alt
Under certain conditions, Enterprise State Roaming can fail to sync data if Azure AD Multi-Factor Authentication is configured. For more information on these symptoms, see the support document [KB3193683](https://support.microsoft.com/kb/3193683).
-**Potential issue**: If your device is configured to require Multi-Factor Authentication on the Azure Active Directory portal, you may fail to sync settings while signing in to a Windows 10 device using a password. This type of Multi-Factor Authentication configuration is intended to protect an Azure administrator account. Admin users may still be able to sync by signing in to their Windows 10 devices with their Microsoft Passport for Work PIN or by completing Multi-Factor Authentication while accessing other Azure services like Microsoft 365.
+**Potential issue**: If your device is configured to require Multi-Factor Authentication on the Azure Active Directory portal, you may fail to sync settings while signing in to a Windows 10 or newer device using a password. This type of Multi-Factor Authentication configuration is intended to protect an Azure administrator account. Admin users may still be able to sync by signing in to their Windows 10 or newer devices with their Microsoft Passport for Work PIN or by completing Multi-Factor Authentication while accessing other Azure services like Microsoft 365.
**Potential issue**: Sync can fail if the admin configures the Active Directory Federation Services Multi-Factor Authentication Conditional Access policy and the access token on the device expires. Ensure that you sign in and sign out using the Microsoft Passport for Work PIN or complete Multi-Factor Authentication while accessing other Azure services like Microsoft 365.
active-directory Enterprise State Roaming Windows Settings Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/enterprise-state-roaming-windows-settings-reference.md
Previously updated : 02/12/2020 Last updated : 02/15/2022
# Windows 10 roaming settings reference
-The following is a list of the settings that will be roamed or backed up in Windows 10.
+The following is a list of the settings that will be roamed or backed up in Windows 10 or newer.
## Devices and endpoints
-See the following table for a summary of the devices and account types that are supported by the sync, backup, and restore framework in Windows 10.
+See the following table for a summary of the devices and account types that are supported by the sync, backup, and restore framework in Windows 10 or newer.
| Account type and operation | Desktop | Mobile |
| | | |
Windows settings generally sync by default, but some settings are only backed up
## Windows Settings overview
-The following settings groups are available for end users to enable/disable settings sync on Windows 10 devices.
+The following settings groups are available for end users to enable/disable settings sync on Windows 10 or newer devices.
* Theme: desktop background, user tile, taskbar position, etc.
* Internet Explorer Settings: browsing history, typed URLs, favorites, etc.
active-directory Howto Device Identity Virtual Desktop Infrastructure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/howto-device-identity-virtual-desktop-infrastructure.md
Previously updated : 01/12/2022 Last updated : 02/15/2022
Before configuring device identities in Azure AD for your VDI environment, famil
| | | | Non-Persistent | No |
| Azure AD registered | Federated/Managed | Windows current/Windows down-level | Persistent/Non-Persistent | Not Applicable |
-<sup>1</sup> **Windows current** devices represent Windows 10, Windows Server 2016 v1803 or higher, and Windows Server 2019.
+<sup>1</sup> **Windows current** devices represent Windows 10 or newer, Windows Server 2016 v1803 or higher, and Windows Server 2019.
<sup>2</sup> **Windows down-level** devices represent Windows 7, Windows 8.1, Windows Server 2008 R2, Windows Server 2012, and Windows Server 2012 R2. For support information on Windows 7, see [Support for Windows 7 is ending](https://www.microsoft.com/microsoft-365/windows/end-of-windows-7-support). For support information on Windows Server 2008 R2, see [Prepare for Windows Server 2008 end of support](https://www.microsoft.com/cloud-platform/windows-server-2008).
Before configuring device identities in Azure AD for your VDI environment, famil
<sup>4</sup> A **Managed** identity infrastructure environment represents an environment with Azure AD as the identity provider deployed with either [password hash sync (PHS)](../hybrid/whatis-phs.md) or [pass-through authentication (PTA)](../hybrid/how-to-connect-pta.md) with [seamless single sign-on](../hybrid/how-to-connect-sso.md).
-<sup>5</sup> **Non-Persistence support for Windows current** requires other consideration as documented below in guidance section. This scenario requires Windows 10 1803, Windows Server 2019, or Windows Server (Semi-annual channel) starting version 1803
+<sup>5</sup> **Non-Persistence support for Windows current** requires other consideration as documented below in guidance section. This scenario requires Windows 10 1803 or newer, Windows Server 2019, or Windows Server (Semi-annual channel) starting version 1803
<sup>6</sup> **Non-Persistence support for Windows down-level** requires other consideration as documented below in guidance section.
active-directory Howto Hybrid Azure Ad Join https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/howto-hybrid-azure-ad-join.md
Previously updated : 01/20/2022 Last updated : 02/15/2022
Hybrid Azure AD join requires devices to have access to the following Microsoft
> [!WARNING]
> If your organization uses proxy servers that intercept SSL traffic for scenarios like data loss prevention or Azure AD tenant restrictions, ensure that traffic to these URLs is excluded from TLS break-and-inspect. Failure to exclude these URLs may cause interference with client certificate authentication, cause issues with device registration, and device-based Conditional Access.
-If your organization requires access to the internet via an outbound proxy, you can use [Web Proxy Auto-Discovery (WPAD)](/previous-versions/tn-archive/cc995261(v=technet.10)) to enable Windows 10 computers for device registration with Azure AD. To address issues configuring and managing WPAD, see [Troubleshooting Automatic Detection](/previous-versions/tn-archive/cc302643(v=technet.10)).
+If your organization requires access to the internet via an outbound proxy, you can use [Web Proxy Auto-Discovery (WPAD)](/previous-versions/tn-archive/cc995261(v=technet.10)) to enable Windows 10 or newer computers for device registration with Azure AD. To address issues configuring and managing WPAD, see [Troubleshooting Automatic Detection](/previous-versions/tn-archive/cc302643(v=technet.10)).
If you don't use WPAD, you can configure WinHTTP proxy settings on your computer with a Group Policy Object (GPO) beginning with Windows 10 1709. For more information, see [WinHTTP Proxy Settings deployed by GPO](/archive/blogs/netgeeks/winhttp-proxy-settings-deployed-by-gpo).

> [!NOTE]
> If you configure proxy settings on your computer by using WinHTTP settings, any computers that can't connect to the configured proxy will fail to connect to the internet.
-If your organization requires access to the internet via an authenticated outbound proxy, make sure that your Windows 10 computers can successfully authenticate to the outbound proxy. Because Windows 10 computers run device registration by using machine context, configure outbound proxy authentication by using machine context. Follow up with your outbound proxy provider on the configuration requirements.
+If your organization requires access to the internet via an authenticated outbound proxy, make sure that your Windows 10 or newer computers can successfully authenticate to the outbound proxy. Because Windows 10 or newer computers run device registration by using machine context, configure outbound proxy authentication by using machine context. Follow up with your outbound proxy provider on the configuration requirements.
Verify devices can access the required Microsoft resources under the system account by using the [Test Device Registration Connectivity](/samples/azure-samples/testdeviceregconnectivity/testdeviceregconnectivity/) script.
Configure hybrid Azure AD join by using Azure AD Connect for a federated environ
1. On the **Device options** page, select **Configure Hybrid Azure AD join**, and then select **Next**.
1. On the **SCP** page, complete the following steps, and then select **Next**:
   1. Select the forest.
- 1. Select the authentication service. You must select **AD FS server** unless your organization has exclusively Windows 10 clients and you have configured computer/device sync, or your organization uses seamless SSO.
+ 1. Select the authentication service. You must select **AD FS server** unless your organization has exclusively Windows 10 or newer clients and you have configured computer/device sync, or your organization uses seamless SSO.
1. Select **Add** to enter the enterprise administrator credentials.

   ![Azure AD Connect SCP configuration federated domain](./media/howto-hybrid-azure-ad-join/azure-ad-connect-scp-configuration-federated.png)
active-directory Howto Vm Sign In Azure Ad Windows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/howto-vm-sign-in-azure-ad-windows.md
Previously updated : 08/19/2021 Last updated : 02/15/2022
The following Windows distributions are currently supported for this feature:
- Windows 10 1809 and later

> [!IMPORTANT]
-> Remote connection to VMs joined to Azure AD is only allowed from Windows 10 PCs that are either Azure AD registered (starting Windows 10 20H1), Azure AD joined or hybrid Azure AD joined to the **same** directory as the VM.
+> Remote connection to VMs joined to Azure AD is only allowed from Windows 10 or newer PCs that are either Azure AD registered (starting Windows 10 20H1), Azure AD joined or hybrid Azure AD joined to the **same** directory as the VM.
This feature is now available in the following Azure clouds:
You can enforce Conditional Access policies such as multi-factor authentication
require multi-factor authentication as a grant access control.

> [!NOTE]
-> If you use "Require multi-factor authentication" as a grant access control for requesting access to the "Azure Windows VM Sign-In" app, then you must supply multi-factor authentication claim as part of the client that initiates the RDP session to the target Windows VM in Azure. The only way to achieve this on a Windows 10 client is to use Windows Hello for Business PIN or biometric authentication with the RDP client. Support for biometric authentication was added to the RDP client in Windows 10 version 1809. Remote desktop using Windows Hello for Business authentication is only available for deployments that use cert trust model and currently not available for key trust model.
+> If you use "Require multi-factor authentication" as a grant access control for requesting access to the "Azure Windows VM Sign-In" app, then you must supply multi-factor authentication claim as part of the client that initiates the RDP session to the target Windows VM in Azure. The only way to achieve this on a Windows 10 or newer client is to use Windows Hello for Business PIN or biometric authentication with the RDP client. Support for biometric authentication was added to the RDP client in Windows 10 version 1809. Remote desktop using Windows Hello for Business authentication is only available for deployments that use cert trust model and currently not available for key trust model.
> [!WARNING]
> Per-user Enabled/Enforced Azure AD Multi-Factor Authentication is not supported for VM Sign-In.
require multi-factor authentication as a grant access control.
## Log in using Azure AD credentials to a Windows VM

> [!IMPORTANT]
-> Remote connection to VMs joined to Azure AD is only allowed from Windows 10 PCs that are either Azure AD registered (minimum required build is 20H1) or Azure AD joined or hybrid Azure AD joined to the **same** directory as the VM. Additionally, to RDP using Azure AD credentials, the user must belong to one of the two Azure roles, Virtual Machine Administrator Login or Virtual Machine User Login. If using an Azure AD registered Windows 10 PC, you must enter credentials in the `AzureAD\UPN` format (for example, `AzureAD\john@contoso.com`). At this time, Azure Bastion can be used to log in with Azure AD authentication [using Azure CLI and the native RDP client **mstsc**](../../bastion/connect-native-client-windows.md).
+> Remote connection to VMs joined to Azure AD is only allowed from Windows 10 or newer PCs that are either Azure AD registered (minimum required build is 20H1) or Azure AD joined or hybrid Azure AD joined to the **same** directory as the VM. Additionally, to RDP using Azure AD credentials, the user must belong to one of the two Azure roles, Virtual Machine Administrator Login or Virtual Machine User Login. If using an Azure AD registered Windows 10 or newer PC, you must enter credentials in the `AzureAD\UPN` format (for example, `AzureAD\john@contoso.com`). At this time, Azure Bastion can be used to log in with Azure AD authentication [using Azure CLI and the native RDP client **mstsc**](../../bastion/connect-native-client-windows.md).
To log in to your Windows Server 2019 virtual machine using Azure AD:
If you see the following error message when you initiate a remote desktop connec
![Your credentials did not work](./media/howto-vm-sign-in-azure-ad-windows/your-credentials-did-not-work.png)
-Verify that the Windows 10 PC you are using to initiate the remote desktop connection is one that is either Azure AD joined, or hybrid Azure AD joined to the same Azure AD directory where your VM is joined to. For more information about device identity, see the article [What is a device identity](./overview.md).
+Verify that the Windows 10 or newer PC you are using to initiate the remote desktop connection is one that is either Azure AD joined, or hybrid Azure AD joined to the same Azure AD directory where your VM is joined to. For more information about device identity, see the article [What is a device identity](./overview.md).
> [!NOTE]
> Windows 10 Build 20H1 added support for an Azure AD registered PC to initiate RDP connection to your VM. When using an Azure AD registered (not Azure AD joined or hybrid Azure AD joined) PC as the RDP client to initiate connections to your VM, you must enter credentials in the format `AzureAD\UPN` (for example, `AzureAD\john@contoso.com`).
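The credential-format rule in the note above can be sketched as a tiny helper. This is illustrative Python only, not part of any Microsoft tooling, and the function name is hypothetical:

```python
def rdp_username(upn: str, azure_ad_registered_client: bool) -> str:
    """Return the user name to enter in the RDP credential prompt.

    An Azure AD *registered* (not joined) client must prefix the UPN
    with 'AzureAD\\'; joined clients can use the bare UPN.
    """
    return f"AzureAD\\{upn}" if azure_ad_registered_client else upn

print(rdp_username("john@contoso.com", True))   # AzureAD\john@contoso.com
print(rdp_username("john@contoso.com", False))  # john@contoso.com
```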
If you see the following error message when you initiate a remote desktop connec
![The sign-in method you're trying to use isn't allowed.](./media/howto-vm-sign-in-azure-ad-windows/mfa-sign-in-method-required.png)
-If you have configured a Conditional Access policy that requires multi-factor authentication (MFA) before you can access the resource, then you need to ensure that the Windows 10 PC initiating the remote desktop connection to your VM signs in using a strong authentication method such as Windows Hello. If you do not use a strong authentication method for your remote desktop connection, you will see the previous error.
+If you have configured a Conditional Access policy that requires multi-factor authentication (MFA) before you can access the resource, then you need to ensure that the Windows 10 or newer PC initiating the remote desktop connection to your VM signs in using a strong authentication method such as Windows Hello. If you do not use a strong authentication method for your remote desktop connection, you will see the previous error.
- Your credentials did not work.
active-directory Hybrid Azuread Join Manual https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/hybrid-azuread-join-manual.md
Previously updated : 01/20/2022 Last updated : 02/15/2022
Hybrid Azure AD join requires devices to have access to the following Microsoft
> [!WARNING]
> If your organization uses proxy servers that intercept SSL traffic for scenarios like data loss prevention or Azure AD tenant restrictions, ensure that traffic to these URLs is excluded from TLS break-and-inspect. Failure to exclude these URLs may cause interference with client certificate authentication, cause issues with device registration, and device-based Conditional Access.
-If your organization requires access to the internet via an outbound proxy, you can use [Web Proxy Auto-Discovery (WPAD)](/previous-versions/tn-archive/cc995261(v=technet.10)) to enable Windows 10 computers for device registration with Azure AD. To address issues configuring and managing WPAD, see [Troubleshooting Automatic Detection](/previous-versions/tn-archive/cc302643(v=technet.10)).
+If your organization requires access to the internet via an outbound proxy, you can use [Web Proxy Auto-Discovery (WPAD)](/previous-versions/tn-archive/cc995261(v=technet.10)) to enable Windows 10 or newer computers for device registration with Azure AD. To address issues configuring and managing WPAD, see [Troubleshooting Automatic Detection](/previous-versions/tn-archive/cc302643(v=technet.10)).
If you don't use WPAD, you can configure WinHTTP proxy settings on your computer beginning with Windows 10 1709. For more information, see [WinHTTP Proxy Settings deployed by GPO](/archive/blogs/netgeeks/winhttp-proxy-settings-deployed-by-gpo).

> [!NOTE]
> If you configure proxy settings on your computer by using WinHTTP settings, any computers that can't connect to the configured proxy will fail to connect to the internet.
-If your organization requires access to the internet via an authenticated outbound proxy, make sure that your Windows 10 computers can successfully authenticate to the outbound proxy. Because Windows 10 computers run device registration by using machine context, configure outbound proxy authentication by using machine context. Follow up with your outbound proxy provider on the configuration requirements.
+If your organization requires access to the internet via an authenticated outbound proxy, make sure that your Windows 10 or newer computers can successfully authenticate to the outbound proxy. Because Windows 10 or newer computers run device registration by using machine context, configure outbound proxy authentication by using machine context. Follow up with your outbound proxy provider on the configuration requirements.
Verify devices can access the required Microsoft resources under the system account by using the [Test Device Registration Connectivity](/samples/azure-samples/testdeviceregconnectivity/testdeviceregconnectivity/) script.
active-directory Hybrid Azuread Join Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/hybrid-azuread-join-plan.md
Previously updated : 01/20/2022 Last updated : 02/15/2022
If you have an on-premises Active Directory Domain Services (AD DS) environment
This article assumes that you're familiar with the [Introduction to device identity management in Azure Active Directory](./overview.md).

> [!NOTE]
-> The minimum required domain controller version for Windows 10 hybrid Azure AD join is Windows Server 2008 R2.
+> The minimum required domain controller version for Windows 10 or newer hybrid Azure AD join is Windows Server 2008 R2.
Hybrid Azure AD joined devices require network line of sight to your domain controllers periodically. Without this connection, devices become unusable.
Hybrid Azure AD join supports a broad range of Windows devices. Because the conf
### Windows current devices

-- Windows 10
- Windows 11
+- Windows 10
- Windows Server 2016
  - **Note**: Azure National cloud customers require version 1803
- Windows Server 2019
-For devices running the Windows desktop operating system, supported versions are listed in this article [Windows 10 release information](/windows/release-information/). As a best practice, Microsoft recommends you upgrade to the latest version of Windows 10.
+For devices running the Windows desktop operating system, supported versions are listed in this article [Windows 10 release information](/windows/release-information/). As a best practice, Microsoft recommends you upgrade to the latest version of Windows.
### Windows down-level devices
As a first planning step, you should review your environment and determine wheth
### Handling devices with Azure AD registered state
-If your Windows 10 domain joined devices are [Azure AD registered](concept-azure-ad-register.md) to your tenant, it could lead to a dual state of hybrid Azure AD joined and Azure AD registered device. We recommend upgrading to Windows 10 1803 (with KB4489894 applied) or newer to automatically address this scenario. In pre-1803 releases, you'll need to remove the Azure AD registered state manually before enabling hybrid Azure AD join. In 1803 and above releases, the following changes have been made to avoid this dual state:
+If your Windows 10 or newer domain joined devices are [Azure AD registered](concept-azure-ad-register.md) to your tenant, it could lead to a dual state of hybrid Azure AD joined and Azure AD registered device. We recommend upgrading to Windows 10 1803 (with KB4489894 applied) or newer to automatically address this scenario. In pre-1803 releases, you'll need to remove the Azure AD registered state manually before enabling hybrid Azure AD join. In 1803 and above releases, the following changes have been made to avoid this dual state:
- Any existing Azure AD registered state for a user would be automatically removed <i>after the device is hybrid Azure AD joined and the same user logs in</i>. For example, if User A had an Azure AD registered state on the device, the dual state for User A is cleaned up only when User A logs in to the device. If there are multiple users on the same device, the dual state is cleaned up individually when those users log in. After removing the Azure AD registered state, Windows 10 will unenroll the device from Intune or other MDM, if the enrollment happened as part of the Azure AD registration via auto-enrollment.
- Azure AD registered state on any local accounts on the device isn't impacted by this change. Only applicable to domain accounts. Azure AD registered state on local accounts isn't removed automatically even after user logon, since the user isn't a domain user.
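The per-user cleanup behavior described above can be simulated in a few lines. This is an illustrative Python sketch of the stated rules, not Microsoft code; all names are hypothetical:

```python
def clean_dual_state(registered_users: set, login_user: str, is_domain_user: dict) -> set:
    """Return the Azure AD registered states remaining after login_user signs in.

    Cleanup happens per user, at sign-in time, and only for domain accounts;
    local accounts keep their registered state.
    """
    if login_user in registered_users and is_domain_user.get(login_user, False):
        return registered_users - {login_user}
    return registered_users

users = {"userA", "userB", "localAdmin"}
domain = {"userA": True, "userB": True, "localAdmin": False}
clean_dual_state(users, "userA", domain)       # userA's state removed; others untouched
clean_dual_state(users, "localAdmin", domain)  # unchanged: local account is never cleaned
```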
If your Windows 10 domain joined devices are [Azure AD registered](concept-azure
- In Windows 10 1803, if you have Windows Hello for Business configured, the user needs to reconfigure Windows Hello for Business after the dual state cleanup. This issue has been addressed with KB4512509.

> [!NOTE]
-> Even though Windows 10 automatically removes the Azure AD registered state locally, the device object in Azure AD is not immediately deleted if it is managed by Intune. You can validate the removal of Azure AD registered state by running dsregcmd /status and consider the device not to be Azure AD registered based on that.
+> Even though Windows 10 and Windows 11 automatically remove the Azure AD registered state locally, the device object in Azure AD is not immediately deleted if it is managed by Intune. You can validate the removal of Azure AD registered state by running dsregcmd /status and consider the device not to be Azure AD registered based on that.
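Validating state with `dsregcmd /status` means reading its `Key : Value` output. A minimal parser for that layout might look like the following; this is an illustrative Python sketch, the sample output is hypothetical, and real field names vary by Windows build:

```python
def parse_dsregcmd(output: str) -> dict:
    """Parse the 'Key : Value' lines that dsregcmd /status prints."""
    state = {}
    for line in output.splitlines():
        key, sep, value = line.partition(":")
        if sep and key.strip():
            state[key.strip()] = value.strip()
    return state

# Hypothetical sample output for illustration only.
sample = "AzureAdJoined : YES\nEnterpriseJoined : NO\nDomainJoined : YES"
print(parse_dsregcmd(sample)["AzureAdJoined"])  # YES
```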
### Hybrid Azure AD join for single forest, multiple Azure AD tenants
Beginning with version 1.1.819.0, Azure AD Connect provides you with a wizard to
## Review on-premises AD users UPN support for hybrid Azure AD join
-Sometimes, on-premises AD users UPNs are different from your Azure AD UPNs. In these cases, Windows 10 hybrid Azure AD join provides limited support for on-premises AD UPNs based on the [authentication method](../hybrid/choose-ad-authn.md), domain type, and Windows 10 version. There are two types of on-premises AD UPNs that can exist in your environment:
+Sometimes, on-premises AD users UPNs are different from your Azure AD UPNs. In these cases, Windows 10 or newer hybrid Azure AD join provides limited support for on-premises AD UPNs based on the [authentication method](../hybrid/choose-ad-authn.md), domain type, and Windows version. There are two types of on-premises AD UPNs that can exist in your environment:
- Routable users UPN: A routable UPN has a valid verified domain, that is registered with a domain registrar. For example, if contoso.com is the primary domain in Azure AD, contoso.org is the primary domain in on-premises AD owned by Contoso and [verified in Azure AD](../fundamentals/add-custom-domain.md).
- Non-routable users UPN: A non-routable UPN doesn't have a verified domain and is applicable only within your organization's private network. For example, if contoso.com is the primary domain in Azure AD and contoso.local is the primary domain in on-premises AD but isn't a verifiable domain in the internet and only used within Contoso's network.
active-directory Manage Stale Devices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/manage-stale-devices.md
Previously updated : 02/07/2022 Last updated : 02/15/2022
Because a stale device is defined as a registered device that hasn't been used t
The evaluation of the activity timestamp is triggered by an authentication attempt of a device. Azure AD evaluates the activity timestamp when:

- A Conditional Access policy requiring [managed devices](../conditional-access/require-managed-devices.md) or [approved client apps](../conditional-access/app-based-conditional-access.md) has been triggered.
-- Windows 10 devices that are either Azure AD joined or hybrid Azure AD joined are active on the network.
+- Windows 10 or newer devices that are either Azure AD joined or hybrid Azure AD joined are active on the network.
- Intune managed devices have checked in to the service.

If the delta between the existing value of the activity timestamp and the current value is more than 14 days (+/-5 day variance), the existing value is replaced with the new value.
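The 14-day update rule can be sketched as a small function. This is illustrative Python only (not Azure AD code), with the variance simplified to a fixed threshold:

```python
from datetime import datetime, timedelta

# Simplified threshold; the service applies ~14 days with a +/-5 day variance.
UPDATE_THRESHOLD = timedelta(days=14)

def next_timestamp(existing: datetime, now: datetime) -> datetime:
    """Return the stored activity timestamp after an authentication attempt.

    The stored value is rewritten only when it is more than ~14 days old;
    otherwise the existing value is kept, which limits directory writes.
    """
    return now if now - existing > UPDATE_THRESHOLD else existing

old = datetime(2022, 1, 1)
print(next_timestamp(old, datetime(2022, 2, 15)))  # 2022-02-15 00:00:00 (rewritten)
print(next_timestamp(old, datetime(2022, 1, 10)))  # 2022-01-01 00:00:00 (kept)
```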
Your hybrid Azure AD joined devices should follow your policies for on-premises
To clean up Azure AD:

-- **Windows 10 devices** - Disable or delete Windows 10 devices in your on-premises AD, and let Azure AD Connect synchronize the changed device status to Azure AD.
+- **Windows 10 or newer devices** - Disable or delete Windows 10 or newer devices in your on-premises AD, and let Azure AD Connect synchronize the changed device status to Azure AD.
- **Windows 7/8** - Disable or delete Windows 7/8 devices in your on-premises AD first. You can't use Azure AD Connect to disable or delete Windows 7/8 devices in Azure AD. Instead, when you make the change in your on-premises, you must disable/delete in Azure AD.

> [!NOTE]
> - Deleting devices in your on-premises AD or Azure AD does not remove registration on the client. It will only prevent access to resources using device as an identity (e.g. Conditional Access). Read additional information on how to [remove registration on the client](faq.yml).
-> - Deleting a Windows 10 device only in Azure AD will re-synchronize the device from your on-premises using Azure AD connect but as a new object in "Pending" state. A re-registration is required on the device.
-> - Removing the device from sync scope for Windows 10/Server 2016 devices will delete the Azure AD device. Adding it back to sync scope will place a new object in "Pending" state. A re-registration of the device is required.
-> - If you are not using Azure AD Connect for Windows 10 devices to synchronize (e.g. ONLY using AD FS for registration), you must manage lifecycle similar to Windows 7/8 devices.
+> - Deleting a Windows 10 or newer device only in Azure AD will re-synchronize the device from your on-premises using Azure AD connect but as a new object in "Pending" state. A re-registration is required on the device.
+> - Removing the device from sync scope for Windows 10 or newer/Server 2016 devices will delete the Azure AD device. Adding it back to sync scope will place a new object in "Pending" state. A re-registration of the device is required.
+> - If you are not using Azure AD Connect for Windows 10 or newer devices to synchronize (e.g. ONLY using AD FS for registration), you must manage lifecycle similar to Windows 7/8 devices.
### Azure AD joined devices
The timestamp is updated to support device lifecycle scenarios. This attribute i
### Why should I worry about my BitLocker keys?
-When configured, BitLocker keys for Windows 10 devices are stored on the device object in Azure AD. If you delete a stale device, you also delete the BitLocker keys that are stored on the device. Confirm that your cleanup policy aligns with the actual lifecycle of your device before deleting a stale device.
+When configured, BitLocker keys for Windows 10 or newer devices are stored on the device object in Azure AD. If you delete a stale device, you also delete the BitLocker keys that are stored on the device. Confirm that your cleanup policy aligns with the actual lifecycle of your device before deleting a stale device.
### Why should I worry about Windows Autopilot devices?
active-directory Plan Device Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/plan-device-deployment.md
Previously updated : 01/20/2022 Last updated : 02/15/2022
iOS and Android devices may only be Azure AD registered. The following table pre
| Consideration | Azure AD registered | Azure AD joined | Hybrid Azure AD joined |
| | :: | :: | :: |
| **Client operating systems** | | | |
-| Windows 10 devices | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
+| Windows 11 or Windows 10 devices | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
| Windows down-level devices (Windows 8.1 or Windows 7) | | | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
| **Sign in options** | | | |
| End-user local credentials | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | | |
BYOD and corporate owned mobile device are registered by users installing the Co
* [iOS](/mem/intune/user-help/install-and-sign-in-to-the-intune-company-portal-app-ios) * [Android](/mem/intune/user-help/enroll-device-android-company-portal)
-* [Windows 10](/mem/intune/user-help/enroll-windows-10-device)
+* [Windows 10 or newer](/mem/intune/user-help/enroll-windows-10-device)
* [macOS](/mem/intune/user-help/enroll-your-device-in-intune-macos-cp)

If registering your devices is the best option for your organization, see the following resources:
If registering your devices is the best option for your organization, see the fo
## Azure AD join
-Azure AD join enables you to transition towards a cloud-first model with Windows. It provides a great foundation if you're planning to modernize your device management and reduce device-related IT costs. Azure AD join works with Windows 10 devices only. Consider it as the first choice for new devices.
+Azure AD join enables you to transition towards a cloud-first model with Windows. It provides a great foundation if you're planning to modernize your device management and reduce device-related IT costs. Azure AD join works with Windows 10 or newer devices only. Consider it as the first choice for new devices.
[Azure AD joined devices can SSO to on-premises resources](azuread-join-sso.md) when they are on the organization's network, and can authenticate to on-premises servers like file, print, and other applications.
If hybrid Azure AD join is the best option for your organization, see the follow
If installing the required version of Azure AD Connect isn't an option for you, see [how to manually configure hybrid Azure AD join](hybrid-azuread-join-manual.md).

> [!NOTE]
-> The on-premises domain-joined Windows 10 device attempts to auto-join to Azure AD to become hybrid Azure AD joined by default. This will only succeed if you have set up the right environment.
+> The on-premises domain-joined Windows 10 or newer device attempts to auto-join to Azure AD to become hybrid Azure AD joined by default. This will only succeed if you have set up the right environment.
You may determine that hybrid Azure AD join is the best solution for a device in a different state. The following table shows how to change the state of a device.
Review supported and unsupported platforms for integrated devices:
| Device management tools | Azure AD registered | Azure AD joined | Hybrid Azure AD joined |
| | :: | :: | :: |
| [Mobile Device Management (MDM)](/windows/client-management/mdm/azure-active-directory-integration-with-mdm) <br>Example: Microsoft Intune | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
-| [Co-management with Microsoft Intune and Microsoft Endpoint Configuration Manager](/mem/configmgr/comanage/overview) <br>(Windows 10 and later) | | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
+| [Co-management with Microsoft Intune and Microsoft Endpoint Configuration Manager](/mem/configmgr/comanage/overview) <br>(Windows 10 or newer) | | ![Checkmark for these values.](./media/plan-device-deployment/check.png) | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |
| [Group policy](/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/hh831791(v=ws.11))<br>(Windows only) | | | ![Checkmark for these values.](./media/plan-device-deployment/check.png) |

We recommend that you consider [Microsoft Intune Mobile Application management (MAM)](/mem/intune/apps/app-management) with or without device management for registered iOS or Android devices.
active-directory Troubleshoot Hybrid Join Windows Current https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/troubleshoot-hybrid-join-windows-current.md
Previously updated : 01/20/2022 Last updated : 02/15/2022
# Troubleshoot hybrid Azure AD-joined devices
-This article provides troubleshooting guidance to help you resolve potential issues with devices that are running Windows 10 or Windows Server 2016 or newer.
+This article provides troubleshooting guidance to help you resolve potential issues with devices that are running Windows 10 or newer and Windows Server 2016 or newer.
Hybrid Azure Active Directory (Azure AD) join supports the Windows 10 November 2015 update and later.
This content applies only to federated domain accounts.
Reasons for failure:

- Unable to get an access token silently for the DRS resource.
- - Windows 10 devices acquire the authentication token from the Federation Service by using integrated Windows authentication to an active WS-Trust endpoint. For more information, see [Federation Service configuration](hybrid-azuread-join-manual.md#set-up-issuance-of-claims).
+ - Windows 10 and Windows 11 devices acquire the authentication token from the Federation Service by using integrated Windows authentication to an active WS-Trust endpoint. For more information, see [Federation Service configuration](hybrid-azuread-join-manual.md#set-up-issuance-of-claims).
**Common error codes**:
active-directory Troubleshoot Hybrid Join Windows Legacy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/troubleshoot-hybrid-join-windows-legacy.md
Previously updated : 11/21/2019 Last updated : 02/15/2022
This article is applicable only to the following devices:
- Windows Server 2012
- Windows Server 2012 R2
-For Windows 10 or Windows Server 2016, see [Troubleshooting hybrid Azure Active Directory joined Windows 10 and Windows Server 2016 devices](troubleshoot-hybrid-join-windows-current.md).
+For Windows 10 or newer and Windows Server 2016 or newer, see [Troubleshooting hybrid Azure Active Directory joined Windows 10 and Windows Server 2016 devices](troubleshoot-hybrid-join-windows-current.md).
This article assumes that you have [configured hybrid Azure Active Directory joined devices](hybrid-azuread-join-plan.md) to support the following scenarios:
This article provides you with troubleshooting guidance on how to resolve potent
**What you should know:**
-- Hybrid Azure AD join for downlevel Windows devices works slightly differently than it does in Windows 10. Many customers do not realize that they need AD FS (for federated domains) or Seamless SSO configured (for managed domains).
+- Hybrid Azure AD join for downlevel Windows devices works slightly differently than it does in Windows 10 or newer. Many customers do not realize that they need AD FS (for federated domains) or Seamless SSO configured (for managed domains).
- Seamless SSO doesn't work in private browsing mode on Firefox and Microsoft Edge browsers. It also doesn't work on Internet Explorer if the browser is running in Enhanced Protected mode or if Enhanced Security Configuration is enabled.
- For customers with federated domains, if the Service Connection Point (SCP) was configured such that it points to the managed domain name (for example, contoso.onmicrosoft.com, instead of contoso.com), then Hybrid Azure AD Join for downlevel Windows devices will not work.
- The same physical device appears multiple times in Azure AD when multiple domain users sign in to the downlevel hybrid Azure AD joined devices. For example, if *jdoe* and *jharnett* sign in to a device, a separate registration (DeviceID) is created for each of them in the **USER** info tab.
active-directory B2b Fundamentals https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/b2b-fundamentals.md
This article contains recommendations and best practices for business-to-busines
| Recommendation | Comments |
| | |
-| Carefully consider how you want to collaborate with external users and organizations | Azure AD gives you a flexible set of controls for managing collaboration with external users and organizations. You can allow or block all collaboration, or configure collaboration only for specific organizations, users, and apps. Before configuring settings for cross-tenant access and external collaboration, take a careful inventory of the organizations you work and partner with. Then determine if you want to enable [B2B collaboration](what-is-b2b.md) with other Azure AD tenants, and how you want to manage [B2B collaboration invitations](external-collaboration-settings-configure.md). |
+| Consult Azure AD guidance for securing your collaboration with external partners | Learn how to take a holistic governance approach to your organization's collaboration with external partners by following the recommendations in [Securing external collaboration in Azure Active Directory and Microsoft 365](../fundamentals/secure-external-access-resources.md). |
+| Carefully plan your cross-tenant access and external collaboration settings | Azure AD gives you a flexible set of controls for managing collaboration with external users and organizations. You can allow or block all collaboration, or configure collaboration only for specific organizations, users, and apps. Before configuring settings for cross-tenant access and external collaboration, take a careful inventory of the organizations you work and partner with. Then determine if you want to enable [B2B collaboration](what-is-b2b.md) with other Azure AD tenants, and how you want to manage [B2B collaboration invitations](external-collaboration-settings-configure.md). |
| For an optimal sign-in experience, federate with identity providers | Whenever possible, federate directly with identity providers to allow invited users to sign in to your shared apps and resources without having to create Microsoft Accounts (MSAs) or Azure AD accounts. You can use the [Google federation feature](google-federation.md) to allow B2B guest users to sign in with their Google accounts. Or, you can use the [SAML/WS-Fed identity provider (preview) feature](direct-federation.md) to set up federation with any organization whose identity provider (IdP) supports the SAML 2.0 or WS-Fed protocol. |
| Use the Email one-time passcode feature for B2B guests who can't authenticate by other means | The [Email one-time passcode](one-time-passcode.md) feature authenticates B2B guest users when they can't be authenticated through other means like Azure AD, a Microsoft account (MSA), or Google federation. When the guest user redeems an invitation or accesses a shared resource, they can request a temporary code, which is sent to their email address. Then they enter this code to continue signing in. |
| Add company branding to your sign-in page | You can customize your sign-in page so it's more intuitive for your B2B guest users. See how to [add company branding to sign in and Access Panel pages](../fundamentals/customize-branding.md). |
active-directory Cross Tenant Access Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/cross-tenant-access-overview.md
The output is a summary of all available sign-in events for inbound and outbound
### Sign-in logs PowerShell script
-To determine your users' access to external Azure AD organizations, you can use the [Get-MgAuditLogSignIn](https://aka.ms/cross-tenant-log-ps) cmdlet in the Microsoft Graph PowerShell SDK to view data from your sign-in logs for the last 30 days. For example, run the following command:
+To determine your users' access to external Azure AD organizations, you can use the [Get-MgAuditLogSignIn](/powershell/module/microsoft.graph.reports/get-mgauditlogsignin) cmdlet in the Microsoft Graph PowerShell SDK to view data from your sign-in logs for the last 30 days. For example, run the following command:
```powershell Get-MgAuditLogSignIn `
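As a hedged illustration of how the results might be narrowed (the filter expression and tenant ID below are assumptions for the sketch, not values taken from this article), the cmdlet's `-Filter` and `-Top` parameters can restrict output to sign-ins whose resource tenant differs from your own:

```powershell
# Sketch only: requires the Microsoft Graph PowerShell SDK and the
# AuditLog.Read.All permission. Replace the placeholder tenant ID.
Connect-MgGraph -Scopes "AuditLog.Read.All"
Get-MgAuditLogSignIn -Top 50 -Filter "resourceTenantId ne '00000000-0000-0000-0000-000000000000'" |
    Select-Object UserPrincipalName, ResourceTenantId, AppDisplayName
```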
If your organization exports sign-in logs to a Security Information and Event Ma
## Next steps
-[Configure cross-tenant access settings for B2B collaboration](cross-tenant-access-settings-b2b-collaboration.md)
+[Configure cross-tenant access settings for B2B collaboration](cross-tenant-access-settings-b2b-collaboration.md)
active-directory Identity Governance Automation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/governance/identity-governance-automation.md
# Automate Azure AD Identity Governance tasks via Azure Automation and Microsoft Graph
-[Azure Automation](/azure/automation/overview) is an Azure cloud service that allows you to automate common or repetitive systems management and processes. Microsoft Graph is the Microsoft unified API endpoint for Azure AD features that manage users, groups, access packages, access reviews, and other resources in the directory. You can manage Azure AD at scale from the PowerShell command line, using the [Microsoft Graph PowerShell SDK](/graph/powershell/get-started). You can also include the Microsoft Graph PowerShell cmdlets from a [PowerShell-based runbook in Azure Automation](/azure/automation/automation-intro), so that you can automate Azure AD tasks from a simple script.
+[Azure Automation](../../automation/overview.md) is an Azure cloud service that allows you to automate common or repetitive systems management and processes. Microsoft Graph is the Microsoft unified API endpoint for Azure AD features that manage users, groups, access packages, access reviews, and other resources in the directory. You can manage Azure AD at scale from the PowerShell command line, using the [Microsoft Graph PowerShell SDK](/graph/powershell/get-started). You can also include the Microsoft Graph PowerShell cmdlets from a [PowerShell-based runbook in Azure Automation](/azure/automation/automation-intro), so that you can automate Azure AD tasks from a simple script.
Azure Automation and the PowerShell Graph SDK support certificate-based authentication and application permissions, so you can have Azure Automation runbooks authenticate to Azure AD without needing a user context.
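A minimal sketch of what such app-only, certificate-based sign-in could look like at the top of a runbook (all identifiers are placeholders; it assumes the certificate is available in the Automation account and registered on the app):

```powershell
# Sketch: app-only authentication from an Azure Automation runbook.
# The app registration must hold the application permissions that the
# runbook's Graph calls require.
Connect-MgGraph -ClientId "00000000-0000-0000-0000-000000000000" `
                -TenantId "contoso.onmicrosoft.com" `
                -CertificateThumbprint "<your-certificate-thumbprint>"
```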
This article will show you how to get started using Azure Automation for Azure A
## Create an Azure Automation account
-Azure Automation provides a cloud-hosted environment for [runbook execution](/azure/automation/automation-runbook-execution). Those runbooks can start automatically based on a schedule, or be triggered by webhooks or by Logic Apps.
+Azure Automation provides a cloud-hosted environment for [runbook execution](../../automation/automation-runbook-execution.md). Those runbooks can start automatically based on a schedule, or be triggered by webhooks or by Logic Apps.
Using Azure Automation requires you to have an Azure subscription.
$ap | Select-Object -Property Id,DisplayName | ConvertTo-Json
Once your runbook is published, you can create a schedule in Azure Automation, and link your runbook to that schedule to run automatically. Scheduling runbooks from Azure Automation is suitable for runbooks that do not need to interact with other Azure or Office 365 services.
-If you wish to send the output of your runbook to another service, then you may wish to consider using [Azure Logic Apps](/azure/logic-apps/logic-apps-overview) to start your Azure Automation runbook, as Logic Apps can also parse the results.
+If you wish to send the output of your runbook to another service, then you may wish to consider using [Azure Logic Apps](../../logic-apps/logic-apps-overview.md) to start your Azure Automation runbook, as Logic Apps can also parse the results.
1. In Azure Logic Apps, create a Logic App in the Logic Apps Designer starting with **Recurrence**.
If you wish to send the output of your runbook to another service, then you may
1. Select **New step** and add the operation **Get job output**. Select the same Subscription, Resource Group, Automation Account as the previous step, and select the Dynamic value of the **Job ID** from the previous step.
-1. You can then add more operations to the Logic App, such as the [**Parse JSON** action](/azure/logic-apps/logic-apps-perform-data-operations#parse-json-action), that use the **Content** returned when the runbook completes.
+1. You can then add more operations to the Logic App, such as the [**Parse JSON** action](../../logic-apps/logic-apps-perform-data-operations.md#parse-json-action), that use the **Content** returned when the runbook completes.
Note that in Azure Automation, a PowerShell runbook can fail to complete if it tries to write a large amount of data to the output stream at once. You can typically work around this issue by having the runbook output just the information needed by the Logic App, such as by using the `Select-Object -Property` cmdlet to exclude unneeded properties.
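As a hedged sketch of that workaround (the access package query is just one example of a call that can return a large result set):

```powershell
# Sketch: emit only the two properties the Logic App consumes, instead of
# writing the full objects to the output stream.
$ap = Get-MgEntitlementManagementAccessPackage -All
$ap | Select-Object -Property Id, DisplayName | ConvertTo-Json
```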
There are two places where you can see the expiration date in the Azure portal.
## Next steps
-- [Create an Automation account using the Azure portal](/azure/automation/quickstarts/create-account-portal)
-- [Manage access to resources in Active Directory entitlement management using Microsoft Graph PowerShell](/powershell/microsoftgraph/tutorial-entitlement-management?view=graph-powershell-beta)
+- [Create an Automation account using the Azure portal](../../automation/quickstarts/create-account-portal.md)
+- [Manage access to resources in Active Directory entitlement management using Microsoft Graph PowerShell](/powershell/microsoftgraph/tutorial-entitlement-management?view=graph-powershell-beta)
active-directory How To Connect Install Custom https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/how-to-connect-install-custom.md
After installing the required components, select your users' single sign-on meth
On the **Connect to Azure AD** page, enter a global admin account and password. If you selected **Federation with AD FS** on the previous page, don't sign in with an account that's in a domain you plan to enable for federation. You might want to use an account in the default *onmicrosoft.com* domain, which comes with your Azure AD tenant. This account is used only to create a service account in Azure AD. It's not used after the installation finishes.
-
+
+>[!NOTE]
+>A best practice is to avoid using on-premises synced accounts for Azure AD role assignments. If the on-premises account is compromised, it can be used to compromise your Azure AD resources as well. For a complete list of best practices, refer to [Best practices for Azure AD roles](../roles/best-practices.md).
+
![Screenshot showing the "Connect to Azure AD" page.](./media/how-to-connect-install-custom/connectaad.png)

If your global admin account has multifactor authentication enabled, you provide the password again in the sign-in window, and you must complete the multifactor authentication challenge. The challenge could be a verification code or a phone call.
active-directory How To Connect Sync Feature Directory Extensions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/how-to-connect-sync-feature-directory-extensions.md
An object in Azure AD can have up to 100 attributes for directory extensions. Th
> [!NOTE]
> It is not supported to sync constructed attributes, such as msDS-UserPasswordExpiryTimeComputed. If you upgrade from an old version of AADConnect, you may still see these attributes show up in the installation wizard, but you should not enable them. Their value will not sync to Azure AD if you do.
-> You can read more about constructed attributes in [this artice](https://docs.microsoft.com/openspecs/windows_protocols/ms-adts/a3aff238-5f0e-4eec-8598-0a59c30ecd56).
-> You should also not attempt to sync [Non-replicated attributes](https://docs.microsoft.com/windows/win32/ad/attributes), such as badPwdCount, Last-Logon, and Last-Logoff, as their values will not be synced to Azure AD.
+> You can read more about constructed attributes in [this article](/openspecs/windows_protocols/ms-adts/a3aff238-5f0e-4eec-8598-0a59c30ecd56).
+> You should also not attempt to sync [Non-replicated attributes](/windows/win32/ad/attributes), such as badPwdCount, Last-Logon, and Last-Logoff, as their values will not be synced to Azure AD.
## Configuration changes in Azure AD made by the wizard
One of the more useful scenarios is to use these attributes in dynamic security
## Next steps Learn more about the [Azure AD Connect sync](how-to-connect-sync-whatis.md) configuration.
-Learn more about [Integrating your on-premises identities with Azure Active Directory](whatis-hybrid-identity.md).
+Learn more about [Integrating your on-premises identities with Azure Active Directory](whatis-hybrid-identity.md).
active-directory Tshoot Connect Sync Errors https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/tshoot-connect-sync-errors.md
Errors can occur when identity data is synced from Windows Server Active Directo
This article assumes you're familiar with the underlying [design concepts of Azure AD and Azure AD Connect](plan-connect-design-concepts.md).

>[!IMPORTANT]
->This article attempts to address the most common synchronization errors. Unfortunately, covering every scenario in one document is not possible. For more information including in-depth troubleshooting steps, see [End-to-end troubleshooting of Azure AD Connect objects and attributes](https://docs.microsoft.com/troubleshoot/azure/active-directory/troubleshoot-aad-connect-objects-attributes) and the [User Provisioning and Synchronization](https://docs.microsoft.com/troubleshoot/azure/active-directory/welcome-azure-ad) section under the Azure AD troubleshooting documentation.
+>This article attempts to address the most common synchronization errors. Unfortunately, covering every scenario in one document is not possible. For more information including in-depth troubleshooting steps, see [End-to-end troubleshooting of Azure AD Connect objects and attributes](/troubleshoot/azure/active-directory/troubleshoot-aad-connect-objects-attributes) and the [User Provisioning and Synchronization](/troubleshoot/azure/active-directory/welcome-azure-ad) section under the Azure AD troubleshooting documentation.
With the latest version of Azure AD Connect \(August 2016 or higher\), a Synchronization Errors Report is available in the [Azure portal](https://aka.ms/aadconnecthealth) as part of Azure AD Connect Health for sync.
To resolve this issue:
* [Locate Active Directory objects in Active Directory Administrative Center](/previous-versions/windows/it-pro/windows-server-2008-R2-and-2008/dd560661(v=ws.10))
* [Query Azure AD for an object by using Azure AD PowerShell](/previous-versions/azure/jj151815(v=azure.100))
-* [End-to-end troubleshooting of Azure AD Connect objects and attributes](https://docs.microsoft.com/troubleshoot/azure/active-directory/troubleshoot-aad-connect-objects-attributes)
-* [Azure AD Troubleshooting](https://docs.microsoft.com/troubleshoot/azure/active-directory/welcome-azure-ad)
+* [End-to-end troubleshooting of Azure AD Connect objects and attributes](/troubleshoot/azure/active-directory/troubleshoot-aad-connect-objects-attributes)
+* [Azure AD Troubleshooting](/troubleshoot/azure/active-directory/welcome-azure-ad)
active-directory Concept Workload Identity Risk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/identity-protection/concept-workload-identity-risk.md
Title: Securing workload identities with Azure AD Identity Protection Preview
+ Title: Securing workload identities with Azure AD Identity Protection (Preview)
description: Workload identity risk in Azure Active Directory Identity Protection
Azure AD Identity Protection has historically protected users in detecting, investigating, and remediating identity-based risks. We're now extending these capabilities to workload identities to protect applications, service principals, and Managed Identities.
-A workload identity is an identity that allows an application or service principal access to resources, sometimes in the context of a user. These workload identities differ from traditional user accounts as they:
+A [workload identity](../develop/workload-identities-overview.md) is an identity that allows an application or service principal access to resources, sometimes in the context of a user. These workload identities differ from traditional user accounts as they:
- Can't perform multi-factor authentication.
- Often have no formal lifecycle process.
We detect risk on workload identities across sign-in behavior and offline indica
| | | |
| Azure AD threat intelligence | Offline | This risk detection indicates some activity that is consistent with known attack patterns based on Microsoft's internal and external threat intelligence sources. |
| Suspicious Sign-ins | Offline | This risk detection indicates sign-in properties or patterns that are unusual for this service principal. <br><br> The detection learns the baseline sign-in behavior for workload identities in your tenant in between 2 and 60 days, and fires if one or more of the following unfamiliar properties appear during a later sign-in: IP address / ASN, target resource, user agent, hosting/non-hosting IP change, IP country, credential type. <br><br> Because of the programmatic nature of workload identity sign-ins, we provide a timestamp for the suspicious activity instead of flagging a specific sign-in event. <br><br> Sign-ins that are initiated after an authorized configuration change may trigger this detection. |
+| Unusual addition of credentials to an OAuth app | Offline | This detection is discovered by [Microsoft Defender for Cloud Apps](/defender-cloud-apps/investigate-anomaly-alerts#unusual-addition-of-credentials-to-an-oauth-app). This detection identifies the suspicious addition of privileged credentials to an OAuth app. This can indicate that an attacker has compromised the app, and is using it for malicious activity. |
| Admin confirmed account compromised | Offline | This detection indicates an admin has selected 'Confirm compromised' in the Risky Workload Identities UI or using riskyServicePrincipals API. To see which admin has confirmed this account compromised, check the account's risk history (via UI or API). |

## Identify risky workload identities
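As a hedged sketch (the beta endpoint path is an assumption based on the riskyServicePrincipals API named above and may change while the feature is in preview), risky workload identities can be listed with a direct Graph call:

```powershell
# Sketch: list risky service principals via the preview API.
# Requires Identity Protection read permissions; scope name is assumed.
Connect-MgGraph -Scopes "IdentityRiskyServicePrincipal.Read.All"
Invoke-MgGraphRequest -Method GET `
    -Uri "https://graph.microsoft.com/beta/identityProtection/riskyServicePrincipals"
```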
active-directory Configure Authentication For Federated Users Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/configure-authentication-for-federated-users-portal.md
Use the previous example to get the **ObjectID** of the policy, and that of the
## Configuring policy through Graph Explorer
-Set the HRD policy using Microsoft Graph. See [homeRealmDiscoveryPolicy](https://docs.microsoft.com/graph/api/resources/homeRealmDiscoveryPolicy?view=graph-rest-1.0) resource type for information on how to create the policy.
+Set the HRD policy using Microsoft Graph. See [homeRealmDiscoveryPolicy](/graph/api/resources/homeRealmDiscoveryPolicy?view=graph-rest-1.0) resource type for information on how to create the policy.
From the Microsoft Graph explorer window:
## Next steps
-[Prevent sign-in auto-acceleration](prevent-domain-hints-with-home-realm-discovery.md).
+[Prevent sign-in auto-acceleration](prevent-domain-hints-with-home-realm-discovery.md).
active-directory F5 Big Ip Headers Easy Button https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/f5-big-ip-headers-easy-button.md
This scenario looks at the classic legacy application using HTTP authorization h
Being legacy, the application lacks any form of modern protocols to support a direct integration with Azure AD. Modernizing the app is also costly, requires careful planning, and introduces risk of potential downtime.
-One option would be to consider [Azure AD Application Proxy](/azure/active-directory/app-proxy/application-proxy), to gate remote access to the application.
+One option would be to consider [Azure AD Application Proxy](../app-proxy/application-proxy.md), to gate remote access to the application.
Another approach is to use an F5 BIG-IP Application Delivery Controller (ADC), as it too provides the protocol transitioning required to bridge legacy applications to the modern ID control plane.
If you don't see a BIG-IP error page, then the issue is probably more related
2. The **View Variables** link in this location may also help root cause SSO issues, particularly if the BIG-IP APM fails to obtain the right attributes
-For more information, visit this F5 knowledge article [Configuring LDAP remote authentication for Active Directory](https://support.f5.com/csp/article/K11072). There's also a great BIG-IP reference table to help diagnose LDAP-related issues in this F5 knowledge article on [LDAP Query](https://techdocs.f5.com/kb/en-us/products/big-ip_apm/manuals/product/apm-authentication-single-sign-on-11-5-0/5.html).
+For more information, visit this F5 knowledge article [Configuring LDAP remote authentication for Active Directory](https://support.f5.com/csp/article/K11072). There's also a great BIG-IP reference table to help diagnose LDAP-related issues in this F5 knowledge article on [LDAP Query](https://techdocs.f5.com/kb/en-us/products/big-ip_apm/manuals/product/apm-authentication-single-sign-on-11-5-0/5.html).
active-directory F5 Big Ip Kerberos Easy Button https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/f5-big-ip-kerberos-easy-button.md
For this scenario, we have an application using **Kerberos authentication**, als
Being legacy, the application lacks modern protocols to support a direct integration with Azure AD. Modernizing the app would be ideal, but is costly, requires careful planning, and introduces risk of potential downtime.
-One option would be to consider using [Azure AD Application Proxy](/azure/active-directory/app-proxy/application-proxy), as it provides the protocol transitioning required to bridge the legacy application to the modern identity control plane. Or for our scenario, we'll achieve this using F5's BIG-IP Application Delivery Controller (ADC).
+One option would be to consider using [Azure AD Application Proxy](../app-proxy/application-proxy.md), as it provides the protocol transitioning required to bridge the legacy application to the modern identity control plane. Or for our scenario, we'll achieve this using F5's BIG-IP Application Delivery Controller (ADC).
Having a BIG-IP in front of the application enables us to overlay the service with Azure AD pre-authentication and header-based SSO, significantly improving the overall security posture of the application for remote and local access.
If you don't see a BIG-IP error page, then the issue is probably more related
2. Select the link for your active session. The **View Variables** link in this location may also help determine root cause KCD issues, particularly if the BIG-IP APM fails to obtain the right user and domain identifiers.
-See [BIG-IP APM variable assign examples]( https://devcentral.f5.com/s/articles/apm-variable-assign-examples-1107) and [F5 BIG-IP session variables reference]( https://techdocs.f5.com/en-us/bigip-15-0-0/big-ip-access-policy-manager-visual-policy-editor/session-variables.html) for more info.
+See [BIG-IP APM variable assign examples]( https://devcentral.f5.com/s/articles/apm-variable-assign-examples-1107) and [F5 BIG-IP session variables reference]( https://techdocs.f5.com/en-us/bigip-15-0-0/big-ip-access-policy-manager-visual-policy-editor/session-variables.html) for more info.
active-directory F5 Big Ip Ldap Header Easybutton https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/f5-big-ip-ldap-header-easybutton.md
This scenario looks at the classic legacy application using HTTP authorization h
Being legacy, the application lacks any form of modern protocols to support a direct integration with Azure AD. Modernizing the app is also costly, requires careful planning, and introduces risk of potential downtime.
-One option would be to consider [Azure AD Application Proxy](/azure/active-directory/app-proxy/application-proxy), to gate remote access to the application.
+One option would be to consider [Azure AD Application Proxy](../app-proxy/application-proxy.md), to gate remote access to the application.
Another approach is to use an F5 BIG-IP Application Delivery Controller (ADC), as it too provides the protocol transitioning required to bridge legacy applications to the modern ID control plane.
If you don't see a BIG-IP error page, then the issue is probably more related
```
ldapsearch -xLLL -H 'ldap://192.168.0.58' -b "CN=partners,dc=contoso,dc=lds" -s sub -D "CN=f5-apm,CN=partners,DC=contoso,DC=lds" -w 'P@55w0rd!' "(cn=testuser)"
```
-For more information, visit this F5 knowledge article [Configuring LDAP remote authentication for Active Directory](https://support.f5.com/csp/article/K11072). There's also a great BIG-IP reference table to help diagnose LDAP-related issues in this F5 knowledge article on [LDAP Query](https://techdocs.f5.com/kb/en-us/products/big-ip_apm/manuals/product/apm-authentication-single-sign-on-11-5-0/5.html).
+For more information, visit this F5 knowledge article [Configuring LDAP remote authentication for Active Directory](https://support.f5.com/csp/article/K11072). There's also a great BIG-IP reference table to help diagnose LDAP-related issues in this F5 knowledge article on [LDAP Query](https://techdocs.f5.com/kb/en-us/products/big-ip_apm/manuals/product/apm-authentication-single-sign-on-11-5-0/5.html).
active-directory Workbook Cross Tenant Access Activity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/reports-monitoring/workbook-cross-tenant-access-activity.md
Previously updated : 02/04/2022 Last updated : 02/14/2022
This workbook has four sections:
- Individual users for inbound and outbound collaboration by tenant ID
-![Screenshot showing list of external tenants with sign-in data](./media/workbook-cross-tenant-access-activity/external-tenants-list.png)
+The total number of external tenants that have had cross-tenant access activity with your tenant is shown at the top of the workbook.
+
+Under **Step 1**, the external tenant list shows all the tenants that have had inbound or outbound activity with your tenant. When you select an external tenant in the table, the remaining sections update with information about outbound and inbound activity for that tenant.
+
+[ ![Screenshot showing list of external tenants with sign-in data.](./media/workbook-cross-tenant-access-activity/cross-tenant-workbook-step-1.png) ](./media/workbook-cross-tenant-access-activity/cross-tenant-workbook-step-1.png#lightbox)
+
+The table under **Step 2** summarizes all outbound and inbound sign-in activity for the selected tenant, including the number of successful sign-ins and the reasons for failed sign-ins. You can select **Outbound activity** or **Inbound activity** to update the remaining sections of the workbook with the type of activity you want to view.
+
+![Screenshot showing activity for the selected tenant.](./media/workbook-cross-tenant-access-activity/cross-tenant-workbook-step-2.png)
+
+Under **Step 3**, the table lists the applications that are being accessed across tenants. If you selected **Outbound activity** in the previous section, the table shows the applications in external tenants that are being accessed by your users. If you selected **Inbound activity**, you'll see the list of your applications that are being accessed by external users. You can select a row to find out which users are accessing that application.
+
+![Screenshot showing application activity for the selected tenant.](./media/workbook-cross-tenant-access-activity/cross-tenant-workbook-step-3.png)
+
+The table in **Step 4** displays the list of users who are accessing the application you selected.
+
+![Screenshot showing users accessing an app.](./media/workbook-cross-tenant-access-activity/cross-tenant-workbook-step-4.png)
## Filters
aks Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/faq.md
For Windows Server nodes, Windows Update does not automatically run and apply th
### Are there additional security threats relevant to AKS that customers should be aware of?
-Microsoft provides guidance on additional actions you can take to secure your workloads through services like [Microsoft Defender for Containers](https://docs.microsoft.com/azure/defender-for-cloud/defender-for-containers-introduction?tabs=defender-for-container-arch-aks). The following is a list of additional security threats related to AKS and Kubernetes that customers should be aware of:
+Microsoft provides guidance on additional actions you can take to secure your workloads through services like [Microsoft Defender for Containers](../defender-for-cloud/defender-for-containers-introduction.md?tabs=defender-for-container-arch-aks). The following is a list of additional security threats related to AKS and Kubernetes that customers should be aware of:
* [New large-scale campaign targets Kubeflow](https://techcommunity.microsoft.com/t5/azure-security-center/new-large-scale-campaign-targets-kubeflow/ba-p/2425750) - June 8, 2021
AKS doesn't apply Network Security Groups (NSGs) to its subnet and will not modi
[admission-controllers]: https://kubernetes.io/docs/reference/access-authn-authz/admission-controllers/
[private-clusters-github-issue]: https://github.com/Azure/AKS/issues/948
[csi-driver]: https://github.com/Azure/secrets-store-csi-driver-provider-azure
-[vm-sla]: https://azure.microsoft.com/support/legal/sla/virtual-machines/
+[vm-sla]: https://azure.microsoft.com/support/legal/sla/virtual-machines/
aks Operator Best Practices Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/operator-best-practices-identity.md
Azure Active Directory Pod Identity supports two modes of operation:
* [Managed Identity Controller (MIC)](https://azure.github.io/aad-pod-identity/docs/concepts/mic/): A Kubernetes controller that watches for changes to pods, [AzureIdentity](https://azure.github.io/aad-pod-identity/docs/concepts/azureidentity/), and [AzureIdentityBinding](https://azure.github.io/aad-pod-identity/docs/concepts/azureidentitybinding/) through the Kubernetes API Server. When it detects a relevant change, the MIC adds or deletes [AzureAssignedIdentity](https://azure.github.io/aad-pod-identity/docs/concepts/azureassignedidentity/) as needed. Specifically, when a pod is scheduled, the MIC assigns the managed identity on Azure to the underlying VMSS used by the node pool during the creation phase. When all pods using the identity are deleted, it removes the identity from the VMSS of the node pool, unless the same managed identity is used by other pods. The MIC takes similar actions when AzureIdentity or AzureIdentityBinding are created or deleted.
* [Node Managed Identity (NMI)](https://azure.github.io/aad-pod-identity/docs/concepts/nmi/): A pod that runs as a DaemonSet on each node in the AKS cluster. NMI intercepts security token requests to the [Azure Instance Metadata Service](../virtual-machines/linux/instance-metadata-service.md?tabs=linux) on each node, redirects them to itself, validates whether the pod has access to the identity it's requesting a token for, and fetches the token from the Azure Active Directory tenant on behalf of the application.
-2. Managed Mode: In this mode, there is only NMI. The identity needs to be manually assigned and managed by the user. For more information, see [Pod Identity in Managed Mode](https://azure.github.io/aad-pod-identity/docs/configure/pod_identity_in_managed_mode/). In this mode, when you use the [az aks pod-identity add](/cli/azure/aks/pod-identity#az_aks_pod_identity_add) command to add a pod identity to an Azure Kubernetes Service (AKS) cluster, it creates the [AzureIdentity](https://azure.github.io/aad-pod-identity/docs/concepts/azureidentity/) and [AzureIdentityBinding](https://azure.github.io/aad-pod-identity/docs/concepts/azureidentitybinding/) in the namespace specified by the `--namespace` parameter, while the AKS resource provider assigns the managed identity specified by the `--identity-resource-id` parameter to virtual machine scale set (VMSS) of each node pool in the AKS cluster.
+2. Managed Mode: In this mode, there is only NMI. The identity needs to be manually assigned and managed by the user. For more information, see [Pod Identity in Managed Mode](https://azure.github.io/aad-pod-identity/docs/configure/pod_identity_in_managed_mode/). In this mode, when you use the [az aks pod-identity add](/cli/azure/aks/pod-identity#az-aks-pod-identity-add) command to add a pod identity to an Azure Kubernetes Service (AKS) cluster, it creates the [AzureIdentity](https://azure.github.io/aad-pod-identity/docs/concepts/azureidentity/) and [AzureIdentityBinding](https://azure.github.io/aad-pod-identity/docs/concepts/azureidentitybinding/) in the namespace specified by the `--namespace` parameter, while the AKS resource provider assigns the managed identity specified by the `--identity-resource-id` parameter to virtual machine scale set (VMSS) of each node pool in the AKS cluster.
> [!NOTE]
> If you instead decide to install the Azure Active Directory Pod Identity using the [AKS cluster add-on](./use-azure-ad-pod-identity.md), the setup will use the `managed` mode. The `managed` mode provides the following advantages over the `standard` mode:
-1. Identity assignment on the VMSS of a node pool can take up 40-60s. In case of cronjobs or applications that require access to the identity and can't tolerate the assignment delay, it's best to use `managed` mode as the identity is pre-assigned to the VMSS of the node pool, manually or via the [az aks pod-identity add](/cli/azure/aks/pod-identity#az_aks_pod_identity_add) command.
+1. Identity assignment on the VMSS of a node pool can take up to 40-60 seconds. For cron jobs or applications that require access to the identity and can't tolerate the assignment delay, it's best to use `managed` mode, as the identity is pre-assigned to the VMSS of the node pool, either manually or via the [az aks pod-identity add](/cli/azure/aks/pod-identity#az-aks-pod-identity-add) command.
2. In `standard` mode, MIC requires write permissions on the VMSS used by the AKS cluster and the `Managed Identity Operator` permission on the user-assigned managed identities. While running in `managed` mode, since there is no MIC, these role assignments are not required.

Instead of manually defining credentials for pods, pod-managed identities request an access token in real time, using it to access only their assigned services. In AKS, there are two components that handle the operations to allow pods to use managed identities:
aks Use Multiple Node Pools https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-multiple-node-pools.md
A workload may require splitting a cluster's nodes into separate pools for logic
* All subnets assigned to nodepools must belong to the same virtual network.
* System pods must have access to all nodes/pods in the cluster to provide critical functionality such as DNS resolution and tunneling kubectl logs/exec/port-forward proxy.
-* If you expand your VNET after creating the cluster you must update your cluster (perform any managed cluster operation but node pool operations don't count) before adding a subnet outside the original cidr. AKS will error out on the agent pool add now though we originally allowed it. If you don't know how to reconcile your cluster file a support ticket.
-* Calico Network Policy is not supported.
+* If you expand your VNet after creating the cluster, you must update your cluster (perform any managed cluster operation; node pool operations don't count) before adding a subnet outside the original CIDR range. AKS now errors out on the agent pool add, though it was originally allowed. If you don't know how to reconcile your cluster, file a support ticket.
* Azure Network Policy is not supported.
-* Kube-proxy expects a single contiguous cidr and uses it this for three optmizations. See this [K.E.P.](https://github.com/kubernetes/enhancements/tree/master/keps/sig-network/2450-Remove-knowledge-of-pod-cluster-CIDR-from-iptables-rules) and --cluster-cidr [here](https://kubernetes.io/docs/reference/command-line-tools-reference/kube-proxy/) for details. In Azure cni your first node pool's subnet will be given to kube-proxy.
+* Kube-proxy is designed for a single contiguous CIDR and optimizes rules based on that value. When using multiple non-contiguous ranges, these optimizations cannot occur. See this [K.E.P.](https://github.com/kubernetes/enhancements/tree/master/keps/sig-network/2450-Remove-knowledge-of-pod-cluster-CIDR-from-iptables-rules) and the documentation for the [`--cluster-cidr` `kube-proxy` argument](https://kubernetes.io/docs/reference/command-line-tools-reference/kube-proxy/) for more details. In clusters configured with Azure CNI, `kube-proxy` will be configured with the subnet of the first node pool at cluster creation.
To create a node pool with a dedicated subnet, pass the subnet resource ID as an additional parameter when creating a node pool.
analysis-services Analysis Services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/analysis-services/analysis-services-overview.md
description: Learn about Azure Analysis Services, a fully managed platform as a
Previously updated : 10/12/2021 Last updated : 02/15/2022 recommendations: false
Azure Analysis Services is a fully managed platform as a service (PaaS) that pro
In Azure portal, you can [create a server](analysis-services-create-server.md) within minutes. And with Azure Resource Manager [templates](../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md) and PowerShell, you can create servers using a declarative template. With a single template, you can deploy server resources along with other Azure components such as storage accounts and Azure Functions.
-**Video:** Check out Automating deployment to learn more about how you can use Azure Automation to speed server creation.
- Azure Analysis Services integrates with many Azure services enabling you to build sophisticated analytics solutions. Integration with [Azure Active Directory](../active-directory/fundamentals/active-directory-whatis.md) provides secure, role-based access to your critical data. Integrate with [Azure Data Factory](../data-factory/introduction.md) pipelines by including an activity that loads data into the model. [Azure Automation](../automation/automation-intro.md) and [Azure Functions](../azure-functions/functions-overview.md) can be used for lightweight orchestration of models using custom code. ## The right tier when you need it
Azure Analysis Services documentation also uses [GitHub Issues](/teamblog/a-new-
Things are changing rapidly. Get the latest information on the [Power BI blog](https://powerbi.microsoft.com/blog/category/analysis-services/) and [Azure blog](https://azure.microsoft.com/blog/).
-## Community
+## Q&A
-Analysis Services has a vibrant community of users. Join the conversation on [Azure Analysis Services forum](https://aka.ms/azureanalysisservicesforum).
+Microsoft [Q&A](/answers/products/) is a technical community platform, part of Microsoft Docs, that provides a rich online experience for answering your technical questions. Join the conversation on the [Q&A - Azure Analysis Services forum](/answers/topics/azure-analysis-services.html).
## Next steps
app-service App Service Web Nodejs Best Practices And Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/app-service-web-nodejs-best-practices-and-troubleshoot-guide.md
This setting controls the directory where iisnode logs stdout/stderr. The defaul
### debuggerExtensionDll
-This setting controls what version of node-inspector iisnode uses when debugging your node application. Currently, iisnode-inspector-0.7.3.dll and iisnode-inspector.dll are the only two valid values for this setting. The default value is iisnode-inspector-0.7.3.dll. The iisnode-inspector-0.7.3.dll version uses node-inspector-0.7.3 and uses web sockets. Enable web sockets on your Azure webapp to use this version. See <https://ranjithblogs.azurewebsites.net/?p=98> for more details on how to configure iisnode to use the new node-inspector.
+This setting controls what version of node-inspector iisnode uses when debugging your node application. Currently, iisnode-inspector-0.7.3.dll and iisnode-inspector.dll are the only two valid values for this setting. The default value is iisnode-inspector-0.7.3.dll. The iisnode-inspector-0.7.3.dll version uses node-inspector-0.7.3 and uses web sockets. Enable web sockets on your Azure webapp to use this version.
### flushResponse
app-service Quickstart Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/quickstart-python.md
Title: 'Quickstart: Deploy a Python web app to Azure App Service'
+ Title: 'Quickstart: Deploy a Python (Django or Flask) web app to Azure'
description: Get started with Azure App Service by deploying your first Python app to Azure App Service. Last updated 01/28/2022
# Quickstart: Deploy a Python (Django or Flask) web app to Azure App Service
-In this quickstart, you will deploy a Python web app (Django or Flask) to [Azure App Service](/azure/app-service/overview#app-service-on-linux). Azure App Service is a fully managed web hosting service that supports Python 3.7 and higher apps hosted in a Linux server environment.
+In this quickstart, you will deploy a Python web app (Django or Flask) to [Azure App Service](./overview.md#app-service-on-linux). Azure App Service is a fully managed web hosting service that supports Python 3.7 and higher apps hosted in a Linux server environment.
To complete this quickstart, you need: 1. An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
Having issues? [Let us know](https://aka.ms/PythonAppServiceQuickstartFeedback).
## 2 - Create a web app in Azure
-To host your application in Azure, you need to create Azure App Service web app in Azure. You can create a web app using the [Azure portal](https://portal.azure.com/), VS Code using the [Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack), or the Azure CLI.
+To host your application in Azure, you need to create an Azure App Service web app. You can create a web app using the [Azure portal](https://portal.azure.com/), [VS Code](https://code.visualstudio.com/) using the [Azure Tools extension pack](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-node-azure-pack), or the Azure CLI.
### [Azure portal](#tab/azure-portal)
To deploy a web app from VS Code, you must have the [Azure Tools extension pack]
-Having issues? Refer first to the [Troubleshooting guide](/azure/app-service/configure-language-python.md#troubleshooting), otherwise, [let us know](https://aka.ms/PythonAppServiceQuickstartFeedback).
+Having issues? Refer first to the [Troubleshooting guide](./configure-language-python.md#troubleshooting), otherwise, [let us know](https://aka.ms/PythonAppServiceQuickstartFeedback).
## 4 - Browse to the app
The Python sample code is running a Linux container in App Service using a built
**Congratulations!** You have deployed your Python app to App Service.
-Having issues? Refer first to the [Troubleshooting guide](/azure/app-service/configure-language-python.md#troubleshooting), otherwise, [let us know](https://aka.ms/PythonAppServiceQuickstartFeedback).
+Having issues? Refer first to the [Troubleshooting guide](./configure-language-python.md#troubleshooting), otherwise, [let us know](https://aka.ms/PythonAppServiceQuickstartFeedback).
## 5 - Stream logs
Starting Live Log Stream
-Having issues? Refer first to the [Troubleshooting guide](/azure/app-service/configure-language-python.md#troubleshooting), otherwise, [let us know](https://aka.ms/PythonAppServiceQuickstartFeedback).
+Having issues? Refer first to the [Troubleshooting guide](./configure-language-python.md#troubleshooting), otherwise, [let us know](https://aka.ms/PythonAppServiceQuickstartFeedback).
## Clean up resources
Having issues? [Let us know](https://aka.ms/PythonAppServiceQuickstartFeedback).
## Next steps > [!div class="nextstepaction"]
-> [Tutorial: Python (Django) web app with PostgreSQL](/azure/app-service/tutorial-python-postgresql-app)
+> [Tutorial: Python (Django) web app with PostgreSQL](./tutorial-python-postgresql-app.md)
> [!div class="nextstepaction"]
-> [Configure Python app](/azure/app-service/configure-language-python)
+> [Configure Python app](./configure-language-python.md)
> [!div class="nextstepaction"]
-> [Add user sign-in to a Python web app](/azure/active-directory/develop/quickstart-v2-python-webapp)
+> [Add user sign-in to a Python web app](../active-directory/develop/quickstart-v2-python-webapp.md)
> [!div class="nextstepaction"]
-> [Tutorial: Run Python app in custom container](/azure/app-service/tutorial-custom-container)
+> [Tutorial: Run Python app in custom container](./tutorial-custom-container.md)
app-service Reference App Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/reference-app-settings.md
Title: Environment variables and app settings reference description: Describes the commonly used environment variables, and which ones can be modified with app settings. Previously updated : 06/14/2021 Last updated : 02/15/2022 # Environment variables and app settings in Azure App Service
For more information on custom containers, see [Run a custom container in Azure]
| `DOCKER_REGISTRY_SERVER_URL` | URL of the registry server, when running a custom container in App Service. For security, this variable is not passed on to the container. | `https://<server-name>.azurecr.io` | | `DOCKER_REGISTRY_SERVER_USERNAME` | Username to authenticate with the registry server at `DOCKER_REGISTRY_SERVER_URL`. For security, this variable is not passed on to the container. || | `DOCKER_REGISTRY_SERVER_PASSWORD` | Password to authenticate with the registry server at `DOCKER_REGISTRY_SERVER_URL`. For security, this variable is not passed on to the container. ||
+| `WEBSITE_PULL_IMAGE_OVER_VNET` | Connect to and pull from a registry inside a virtual network or on-premises. Your app will need to be connected to a virtual network using the VNet integration feature. This setting is also needed for Azure Container Registry with a private endpoint. ||
| `WEBSITES_WEB_CONTAINER_NAME` | In a Docker Compose app, only one of the containers can be internet accessible. Set to the name of the container defined in the configuration file to override the default container selection. By default, the internet accessible container is the first container to define port 80 or 8080, or, when no such container is found, the first container defined in the configuration file. | |
| `WEBSITES_PORT` | For a custom container, the custom port number on the container for App Service to route requests to. By default, App Service attempts automatic port detection of ports 80 and 8080. This setting is *not* injected into the container as an environment variable. ||
| `WEBSITE_CPU_CORES_LIMIT` | By default, a Windows container runs with all available cores for your chosen pricing tier. To reduce the number of cores, set to the number of desired cores limit. For more information, see [Customize the number of compute cores](configure-custom-container.md?pivots=container-windows#customize-the-number-of-compute-cores).||
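As an illustration of the precedence described for `WEBSITES_PORT` (an explicit setting wins over the default 80/8080 detection), here is a minimal sketch; the helper function is hypothetical and only mimics the platform-side behavior:

```python
DEFAULT_PORTS = (80, 8080)

def resolve_container_port(app_settings: dict) -> int:
    """Return the port to route requests to: an explicit WEBSITES_PORT
    if set, otherwise the first conventional default App Service probes."""
    value = app_settings.get("WEBSITES_PORT")
    if value:
        return int(value)
    return DEFAULT_PORTS[0]

print(resolve_container_port({"WEBSITES_PORT": "5000"}))  # 5000
print(resolve_container_port({}))  # 80
```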
HTTPSCALE_FORWARD_REQUEST
IS_VALID_STAMP_TOKEN NEEDS_SITE_RESTRICTED_TOKEN HTTP_X_MS_PRIVATELINK_ID
- -->
+ -->
app-service Tutorial Connect Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/tutorial-connect-overview.md
Your app service may need to connect to other Azure services such as a database,
|Connection method|When to use| |--|--|
-|[Direct connection from App Service managed identity](#connect-to-azure-services-with-managed-identity)|Dependent service [supports managed identity](/azure/active-directory/managed-identities-azure-resources/managed-identities-status)<br><br>* Best for enterprise-level security<br>* Connection to dependent service is secured with managed identity<br>* Large team or automated connection string and secret management<br>* Don't manage credentials manually.<br>* Credentials arenΓÇÖt accessible to you.|
+|[Direct connection from App Service managed identity](#connect-to-azure-services-with-managed-identity)|Dependent service [supports managed identity](../active-directory/managed-identities-azure-resources/managed-identities-status.md)<br><br>* Best for enterprise-level security<br>* Connection to dependent service is secured with managed identity<br>* Large team or automated connection string and secret management<br>* Don't manage credentials manually.<br>* Credentials aren't accessible to you.|
|[Connect using Key Vault secrets from App Service managed identity](#connect-to-key-vault-with-managed-identity)|Dependent service doesn't support managed identity<br><br>* Best for enterprise-level security<br>* Connection includes non-Azure services such as GitHub, Twitter, Facebook, Google<br>* Large team or automated connection string and secret management<br>* Don't manage credentials manually.<br>* Credentials aren't accessible to you.<br>* Manage connection information with environment variables.|
|[Connect with app settings](#connect-with-app-settings)|* Best for small team or individual owner of Azure resources.<br>* Stage 1 of multi-stage migration to Azure<br>* Temporary or proof-of-concept applications<br>* Manually manage connection information with environment variables|

## Connect to Azure services with managed identity
-Use [managed identity](/azure/active-directory/managed-identities-azure-resources/overview) to authenticate from one Azure resource, such as Azure app service, to another Azure resource whenever possible. This level of authentication lets Azure manage the authentication process, after the required setup is complete. Once the connection is set up, you won't need to manage the connection.
+Use [managed identity](../active-directory/managed-identities-azure-resources/overview.md) to authenticate from one Azure resource, such as Azure app service, to another Azure resource whenever possible. This level of authentication lets Azure manage the authentication process, after the required setup is complete. Once the connection is set up, you won't need to manage the connection.
Benefits of managed identity:
:::image type="content" source="media/tutorial-connect-overview/when-use-managed-identities.png" alt-text="Image showing source and target resources for managed identity.":::
-Learn which [services](/azure/active-directory/managed-identities-azure-resources/managed-identities-status) are supported with managed identity and what [operations you can perform](/azure/active-directory/managed-identities-azure-resources/overview).
+Learn which [services](../active-directory/managed-identities-azure-resources/managed-identities-status.md) are supported with managed identity and what [operations you can perform](../active-directory/managed-identities-azure-resources/overview.md).
### Example managed identity scenario
app-service Tutorial Nodejs Mongodb App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/tutorial-nodejs-mongodb-app.md
# Deploy a Node.js + MongoDB web app to Azure
-In this tutorial, you'll deploy a sample **Express.js** app using a **MongoDB** database to Azure. The Express.js app will be hosted in Azure App Service which supports hosting Node.js apps in both Linux (Node versions 12, 14, and 16) and Windows (versions 12 and 14) server environments. The MongoDB database will be hosted in Azure Cosmos DB, a cloud native database offering a [100% MongoDB compatible API](/azure/cosmos-db/mongodb/mongodb-introduction).
+In this tutorial, you'll deploy a sample **Express.js** app using a **MongoDB** database to Azure. The Express.js app will be hosted in Azure App Service which supports hosting Node.js apps in both Linux (Node versions 12, 14, and 16) and Windows (versions 12 and 14) server environments. The MongoDB database will be hosted in Azure Cosmos DB, a cloud native database offering a [100% MongoDB compatible API](../cosmos-db/mongodb/mongodb-introduction.md).
:::image type="content" source="./media/tutorial-nodejs-mongodb-app/app-diagram.png" alt-text="A diagram showing how the Express.js app will be deployed to Azure App Service and the MongoDB data will be hosted inside of Azure Cosmos DB." lightbox="./media/tutorial-nodejs-mongodb-app/app-diagram-large.png":::
The contents of the App Service diagnostic logs can be reviewed in the Azure por
## 7 - Inspect deployed files using Kudu
-Azure App Service provides a web-based diagnostics console named [Kudu](/azure/app-service/resources-kudu) that allows you to examine the server hosting environment for your web app. Using Kudu, you can view the files deployed to Azure, review the deployment history of the application, and even open an SSH session into the hosting environment.
+Azure App Service provides a web-based diagnostics console named [Kudu](./resources-kudu.md) that allows you to examine the server hosting environment for your web app. Using Kudu, you can view the files deployed to Azure, review the deployment history of the application, and even open an SSH session into the hosting environment.
To access Kudu, navigate to one of the following URLs. You will need to sign into the Kudu site with your Azure credentials.
Follow these steps while signed-in to the Azure portal to delete a resource grou
> [JavaScript on Azure developer center](/azure/developer/javascript) > [!div class="nextstepaction"]
-> [Configure Node.js app in App Service](/azure/app-service/configure-language-nodejs)
+> [Configure Node.js app in App Service](./configure-language-nodejs.md)
applied-ai-services Build Training Data Set https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/build-training-data-set.md
Follow these additional tips to further optimize your data set for training.
## Upload your training data
-When you've put together the set of form documents that you'll use for training, you need to upload it to an Azure blob storage container. If you don't know how to create an Azure storage account with a container, following the [Azure Storage quickstart for Azure portal](../../storage/blobs/storage-quickstart-blobs-portal.md). Use the standard performance tier.
+When you've put together the set of form documents that you'll use for training, you need to upload it to an Azure blob storage container. If you don't know how to create an Azure storage account with a container, follow the [Azure Storage quickstart for Azure portal](../../storage/blobs/storage-quickstart-blobs-portal.md). Use the standard performance tier.
If you want to use manually labeled data, you'll also have to upload the *.labels.json* and *.ocr.json* files that correspond to your training documents. You can use the [Sample Labeling tool](label-tool.md) (or your own UI) to generate these files.
applied-ai-services Compose Custom Models Preview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/compose-custom-models-preview.md
+
+ Title: "How to guide: create and compose custom models with Form Recognizer v3.0"
+
+description: Learn how to create, use, and manage Form Recognizer v3.0 custom and composed models
+++++ Last updated : 02/15/2022+
+recommendations: false
++
+# Compose custom models v3.0 | Preview
+
+> [!NOTE]
+> This how-to guide references Form Recognizer v3.0 (preview). To use Form Recognizer v2.1 (GA), see [Compose custom models v2.1](compose-custom-models.md).
+
+A composed model is created by taking a collection of custom models and assigning them to a single model that covers your form types. You can assign up to 100 trained custom models to a single composed model. When you call Analyze with the composed model ID, Form Recognizer first classifies the form you submitted, chooses the best-matching assigned model, and then returns results for that model.
+
+To learn more, see [Composed custom models](concept-composed-models.md).
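Conceptually, the dispatch step of a composed model looks like the following sketch. The model IDs and classification scores are hypothetical; the service performs this classification internally when you call Analyze:

```python
def route_to_best_model(classification_scores: dict) -> str:
    """Pick the assigned custom model with the highest classifier score,
    mirroring how a composed model chooses which model's results to return."""
    if not classification_scores:
        raise ValueError("a composed model needs at least one assigned model")
    return max(classification_scores, key=classification_scores.get)

scores = {"invoice-model": 0.92, "receipt-model": 0.31}
print(route_to_best_model(scores))  # invoice-model
```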
+
+In this article, you'll learn how to create and use composed custom models to analyze your forms and documents.
+
+## Prerequisites
+
+To get started, you'll need the following:
+
+* **An Azure subscription**. You can [create a free Azure subscription](https://azure.microsoft.com/free/cognitive-services/)
+
+* **A Form Recognizer resource**. Once you have your Azure subscription, [create a Form Recognizer resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal to get your key and endpoint. If you have an existing Form Recognizer resource, navigate directly to your resource page. You can use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production.
+
+ 1. After the resource deploys, select **Go to resource**.
+
+ 1. Copy the **Keys and Endpoint** values from the resource you created and paste them in a convenient location, such as *Microsoft Notepad*. You'll need the key and endpoint values to connect your application to the Form Recognizer API.
+
    :::image border="true" type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot showing how to access the resource key and endpoint URL.":::
+
+ > [!TIP]
+ > For further guidance, *see* [**create a Form Recognizer resource**](create-a-form-recognizer-resource.md).
+
+* **An Azure storage account.** If you don't know how to create an Azure storage account, follow the [Azure Storage quickstart for Azure portal](../../storage/blobs/storage-quickstart-blobs-portal.md). You can use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production.
+
+## Create your custom models
+
+First, you'll need a set of custom models to compose. Using Form Recognizer Studio, the REST API, or the client-library SDKs, the steps are as follows:
+
+* [**Assemble your training dataset**](#assemble-your-training-dataset)
+* [**Upload your training set to Azure blob storage**](#upload-your-training-dataset)
+* [**Train your custom models**](#train-your-custom-model)
+
+## Assemble your training dataset
+
+Building a custom model begins with establishing your training dataset. You'll need a minimum of five completed forms of the same type for your sample dataset. They can be of different file types (jpg, png, pdf, tiff) and contain both text and handwriting. Your forms must follow the [input requirements](build-training-data-set.md#custom-model-input-requirements) for Form Recognizer.
+
+>[!TIP]
+> Follow these tips to optimize your data set for training:
+>
+> * If possible, use text-based PDF documents instead of image-based documents. Scanned PDFs are handled as images.
+> * For filled-in forms, use examples that have all of their fields filled in.
+> * Use forms with different values in each field.
+> * If your form images are of lower quality, use a larger data set (10-15 images, for example).
+
+See [Build a training data set](./build-training-data-set.md) for tips on how to collect your training documents.
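The input requirements above (a minimum of five forms of the same type, in a supported file format) can be expressed as a simple pre-flight check. The helper below is an illustration, not part of the service; only the extension list and minimum count come from this article:

```python
SUPPORTED_EXTENSIONS = {".jpg", ".png", ".pdf", ".tiff"}
MIN_TRAINING_FORMS = 5

def validate_training_set(filenames: list) -> list:
    """Return a list of problems; an empty list means the set looks trainable."""
    problems = []
    unsupported = [name for name in filenames
                   if not any(name.lower().endswith(ext)
                              for ext in SUPPORTED_EXTENSIONS)]
    if unsupported:
        problems.append("unsupported file types: " + ", ".join(unsupported))
    if len(filenames) < MIN_TRAINING_FORMS:
        problems.append(f"need at least {MIN_TRAINING_FORMS} forms, "
                        f"got {len(filenames)}")
    return problems

print(validate_training_set(["a.pdf", "b.pdf", "c.png"]))
# ['need at least 5 forms, got 3']
```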
+
+## Upload your training dataset
+
+When you've gathered the set of form documents that you'll use for training, you'll need to [upload your training data](build-training-data-set.md#upload-your-training-data)
+to an Azure blob storage container.
+
+If you want to use manually labeled data, you'll also have to upload the *.labels.json* and *.ocr.json* files that correspond to your training documents.
++
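The companion-file convention above can be sketched as follows. The helper names are hypothetical; only the *.labels.json* and *.ocr.json* suffixes come from this article:

```python
def companion_files(document_name: str) -> tuple:
    """Sidecar files expected alongside a labeled training document."""
    return (document_name + ".labels.json", document_name + ".ocr.json")

def missing_companions(container_blobs: set, document_name: str) -> list:
    """List the sidecar files absent for a given training document."""
    return [name for name in companion_files(document_name)
            if name not in container_blobs]

blobs = {"form1.pdf", "form1.pdf.labels.json", "form1.pdf.ocr.json"}
print(missing_companions(blobs, "form1.pdf"))  # []
```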
+## Train your custom model
+
+You [train your model](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects). Labeled datasets rely on the prebuilt-layout API, but supplementary human input is included, such as your specific labels and field locations. To use both labeled and unlabeled data, start with at least five completed forms of the same type for the labeled training data, and then add unlabeled data to the required data set.
+
+When you train with labeled data, the model uses supervised learning to extract values of interest, using the labeled forms you provide. Labeled data results in better-performing models and can produce models that work with complex forms or forms containing values without keys.
+
+Form Recognizer uses the [prebuilt-layout model](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) API to learn the expected sizes and positions of printed and handwritten text elements and extract tables. Then it uses user-specified labels to learn the key/value associations and tables in the documents. We recommend that you use five manually labeled forms of the same type (same structure) to get started when training a new model and add more labeled data as needed to improve the model accuracy. Form Recognizer enables training a model to extract key-value pairs and tables using supervised learning capabilities.
+
+### [Form Recognizer Studio](#tab/studio)
+
+To create custom models, you start with configuring your project:
+
+1. From the Studio home, select the [Custom form project](https://formrecognizer.appliedai.azure.com/studio/customform/projects) to open the Custom form home page.
+
+1. Use the ➕ **Create a project** command to start the new project configuration wizard.
+
+1. Enter project details, select the Azure subscription and resource, and the Azure Blob storage container that contains your data.
+
+1. Review and submit your settings to create the project.
++
+While creating your custom models, you may need to extract data collections from your documents. Data collections may appear in a couple of formats, with tables as the visual pattern:
+
+* Dynamic or variable count of values (rows) for a given set of fields (columns)
+
+* Specific collection of values for a given set of fields (columns and/or rows)
+
+See [Form Recognizer Studio: labeling as tables](quickstarts/try-v3-form-recognizer-studio.md#labeling-as-tables).
+
+### [REST API](#tab/rest)
+
+Training with labels leads to better performance in some scenarios. To train with labels, you need to have special label information files (*\<filename\>.pdf.labels.json*) in your blob storage container alongside the training documents.
+
+Label files contain key-value associations that a user has entered manually. They are needed for labeled data training, but not every source file needs to have a corresponding label file. Source files without labels will be treated as ordinary training documents. We recommend five or more labeled files for reliable training. You can use a UI tool like [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects) to generate these files.
+
+Once you have your label files, you can include them by calling the training method with the *useLabelFile* parameter set to `true`.
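As an illustration, a train-with-labels request body can be assembled with a small helper like the following sketch. The `source`, `sourceFilter`, and `useLabelFile` field names follow the v2.1 Train Custom Model REST API; verify them against the API version you target:

```python
def build_train_request(source_sas_url, use_label_file=True, prefix=""):
    """Assemble the JSON body for a train-with-labels request.

    Field names follow the v2.1 Train Custom Model API; check them against
    the API version you are calling before sending the request.
    """
    return {
        "source": source_sas_url,          # SAS URL of the blob container with training data
        "sourceFilter": {"prefix": prefix, "includeSubFolders": False},
        "useLabelFile": use_label_file,    # True => use the .labels.json files
    }
```

You would POST this body to the training endpoint with your resource key in the request headers.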
++
+### [Client-libraries](#tab/sdks)
+
+Training with labels leads to better performance in some scenarios. To train with labels, you need to have special label information files (*\<filename\>.pdf.labels.json*) in your blob storage container alongside the training documents. Once you have them, you can call the training method with the *useTrainingLabels* parameter set to `true`.
+
+|Language |Method|
+|--|--|
+|**C#**|[**StartBuildModel**](/dotnet/api/azure.ai.formrecognizer.documentanalysis.documentmodeladministrationclient.startbuildmodel?view=azure-dotnet-preview#azure-ai-formrecognizer-documentanalysis-documentmodeladministrationclient-startbuildmodel&preserve-view=true)|
+|**Java**| [**beginBuildModel**](/java/api/com.azure.ai.formrecognizer.administration.documentmodeladministrationclient.beginbuildmodel?view=azure-java-preview&preserve-view=true)|
+|**JavaScript** | [**beginBuildModel**](/javascript/api/@azure/ai-form-recognizer/documentmodeladministrationclient?view=azure-node-preview#@azure-ai-form-recognizer-documentmodeladministrationclient-beginbuildmodel&preserve-view=true)|
+| **Python** | [**begin_build_model**](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.aio.documentmodeladministrationclient?view=azure-python-preview#azure-ai-formrecognizer-aio-documentmodeladministrationclient-begin-build-model&preserve-view=true)
+++
+## Create a composed model
+
+> [!NOTE]
+> **The `create compose model` operation is only available for custom models trained _with_ labels.** Attempting to compose unlabeled models will produce an error.
+
+With the [**create compose model**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/ComposeDocumentModel) operation, you can assign up to 100 trained custom models to a single model ID. When you call Analyze with the composed model ID, Form Recognizer will first classify the form you submitted, choose the best matching assigned model, and then return results for that model. This operation is useful when incoming forms may belong to one of several templates.
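Because the service rejects compose requests that exceed the limit, it can be worth validating the candidate model list client-side first. A minimal sketch (the helper name is illustrative, not an SDK API):

```python
MAX_COMPOSED_MODELS = 100  # service limit on models assigned to one composed model

def validate_component_models(model_ids):
    """Check a candidate list of custom model IDs before composing them."""
    ids = list(model_ids)
    if not ids:
        raise ValueError("at least one trained custom model ID is required")
    if len(ids) > MAX_COMPOSED_MODELS:
        raise ValueError(f"a composed model supports at most {MAX_COMPOSED_MODELS} models")
    if len(set(ids)) != len(ids):
        raise ValueError("duplicate model IDs in the compose request")
    return ids
```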
+
+### [Form Recognizer Studio](#tab/studio)
+
+Once the training process has successfully completed, you can begin to build your composed model. Here are the steps for creating and using composed models:
+
+* [**Gather your custom model IDs**](#gather-your-model-ids)
+* [**Compose your custom models**](#compose-your-custom-models)
+* [**Analyze documents**](#analyze-documents)
+* [**Manage your composed models**](#manage-your-composed-models)
+
+#### Gather your model IDs
+
+When you train models using the [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/), the model ID is located in the models menu under a project:
++
+#### Compose your custom models
+
+1. Select a custom models project.
+
+1. In the project, select the ```Models``` menu item.
+
+1. From the resulting list of models, select the models you wish to compose.
+
+1. Choose the **Compose** button from the upper-left corner.
+
+1. In the pop-up window, name your newly composed model and select **Compose**.
+
+1. When the operation completes, your newly composed model will appear in the list.
+
+1. Once the model is ready, use the **Test** command to validate it with your test documents and observe the results.
+++
+#### Analyze documents
+
+The custom model **Analyze** operation requires you to provide the `modelID` in the call to Form Recognizer. You should provide the composed model ID for the `modelID` parameter in your applications.
++
+#### Manage your composed models
+
+You can manage your custom models throughout their life cycles:
+
+* Test and validate new documents.
+* Download your model to use in your applications.
+* Delete your model when its life cycle is complete.
++
+### [REST API](#tab/rest)
+
+Once the training process has successfully completed, you can begin to build your composed model. Here are the steps for creating and using composed models:
+
+* [**Gather your custom model IDs**](#gather-your-model-ids)
+* [**Compose your custom models**](#compose-your-custom-models)
+* [**Analyze documents**](#analyze-documents)
+* [**Manage your composed models**](#manage-your-composed-models)
+
+#### Gather your model IDs
+
+The [**REST API**](./quickstarts/try-v3-rest-api.md#manage-custom-models) will return a `201 (Success)` response with a **Location** header. The value of the last parameter in this header is the model ID for the newly trained model.
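Parsing that header amounts to taking the last path segment of the URL. A small sketch (the sample URL below is illustrative, not a real endpoint):

```python
def model_id_from_location(location_header):
    """Extract the model ID from the Location header of a training response.

    The model ID is the last path segment of the returned URL; any query
    string (for example, an api-version parameter) is stripped first.
    """
    path = location_header.split("?", 1)[0].rstrip("/")
    return path.rsplit("/", 1)[-1]
```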
+
+#### Compose your custom models
+
+The [compose model API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/ComposeDocumentModel) accepts a list of models to be composed.
++
+#### Analyze documents
+
+You can make an [**Analyze document**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) request using a unique model name in the request parameters.
++
+#### Manage your composed models
+
+You can manage custom models throughout your development needs, including [**copying**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/CopyDocumentModelTo), [**listing**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/GetModels), and [**deleting**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/DeleteModel) your models.
+
+### [Client-libraries](#tab/sdks)
+
+Once the training process has successfully completed, you can begin to build your composed model. Here are the steps for creating and using composed models:
+
+* [**Create a composed model**](#create-a-composed-model)
+* [**Analyze documents**](#analyze-documents)
+* [**Manage your composed models**](#manage-your-composed-models)
+
+#### Create a composed model
+
+You can use the programming language of your choice to create a composed model:
+
+| Programming language| Code sample |
+|--|--|
+|**C#** | [Model compose](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_ModelCompose.md#create-a-composed-model)
+|**Java** | [Model compose](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_ModelCompose.md#create-a-composed-model)
+|**JavaScript** | [Compose model](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-beta/javascript/composeModel.js)
+|**Python** | [Create composed model](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_create_composed_model.py)
+
+#### Analyze documents
+
+Once you have built your composed model, you can use it to analyze forms and documents. Use your composed `model ID` and let the service decide which of your aggregated custom models best fits the document provided.
+
+|Programming language| Code sample |
+|--|--|
+|**C#** | [Analyze a document with a custom/composed model](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_AnalyzeWithCustomModel.md)
+|**Java** | [Analyze forms with your custom/composed model ](https://github.com/Azure/azure-sdk-for-javocumentFromUrl.java)
+|**JavaScript** | [Analyze documents by model ID](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-beta/javascript/analyzeReceiptByModelId.js)
+|**Python** | [Analyze custom documents](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_analyze_custom_documents.py)
+
+## Manage your composed models
+
+You can manage your custom models throughout their lifecycle by viewing a list of all custom models under your subscription, retrieving information about a specific custom model, and deleting custom models from your account.
+
+|Programming language| Code sample |
+|--|--|
+|**C#** | [Analyze a document with a custom/composed model](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/Sample_AnalyzeWithCustomModel.md)|
+|**Java** | [Custom model management operations](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/formrecognizer/azure-ai-formrecognizer/src/samples/java/com/azure/ai/formrecognizer/administration/ManageCustomModels.java)|
+|**JavaScript** | [Get model types and schema](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-beta/javascript/getModel.js)|
+|**Python** | [Manage models](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/v3.2-beta/sample_manage_models.py)|
+++
+## Next steps
+
+Try one of our quickstarts to get started using the Form Recognizer preview:
+
+> [!div class="nextstepaction"]
+> [Form Recognizer Studio](quickstarts/try-v3-form-recognizer-studio.md)
+
+> [!div class="nextstepaction"]
+> [REST API](quickstarts/try-v3-rest-api.md)
+
+> [!div class="nextstepaction"]
+> [C#](quickstarts/try-v3-csharp-sdk.md)
+
+> [!div class="nextstepaction"]
+> [Java](quickstarts/try-v3-java-sdk.md)
+
+> [!div class="nextstepaction"]
+> [JavaScript](quickstarts/try-v3-javascript-sdk.md)
+
+> [!div class="nextstepaction"]
+> [Python](quickstarts/try-v3-python-sdk.md)
applied-ai-services Compose Custom Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/compose-custom-models.md
Title: "How to guide: use custom and composed models"
+ Title: "How to guide: create and compose custom models with Form Recognizer v2.1"
-description: Learn how to create, use, and manage Form Recognizer custom and composed models
+description: Learn how to create, compose, use, and manage custom models with Form Recognizer v2.1
Previously updated : 11/02/2021 Last updated : 02/15/2022 recommendations: false-
-# Use custom and composed models
+# Compose custom models v2.1
+
+> [!NOTE]
+> This how-to guide references Form Recognizer v2.1 (GA). To try Form Recognizer v3.0 (preview), see [Compose custom models v3.0 (preview)](compose-custom-models-preview.md).
Form Recognizer uses advanced machine-learning technology to detect and extract information from document images and return the extracted data in a structured JSON output. With Form Recognizer, you can train standalone custom models or combine custom models to create composed models.
Form Recognizer uses advanced machine-learning technology to detect and extract
* **Composed models**. A composed model is created by taking a collection of custom models and assigning them to a single model that encompasses your form types. When a document is submitted to a composed model, the service performs a classification step to decide which custom model accurately represents the form presented for analysis.
-***Model configuration window in Form Recognizer Studio***
-- In this article, you'll learn how to create Form Recognizer custom and composed models using our [Form Recognizer Sample Labeling tool](label-tool.md), [REST APIs](quickstarts/client-library.md?branch=main&pivots=programming-language-rest-api#train-a-custom-model), or [client-library SDKs](quickstarts/client-library.md?branch=main&pivots=programming-language-csharp#train-a-custom-model). ## Sample Labeling tool
-You can see how data is extracted from custom forms by trying our Sample Labeling tool. You'll need the following:
+You can see how data is extracted from custom forms by trying our Sample Labeling tool. You'll need the following resources:
* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
to an Azure blob storage container. If you don't know how to create an Azure sto
## Train your custom model
-You can [train your model](./quickstarts/try-sdk-rest-api.md#train-a-custom-model) with or without labeled data sets. Unlabeled datasets rely solely on the [Layout API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeLayoutAsync) to detect and identify key information without added human input. Labeled datasets also rely on the Layout API, but supplementary human input is included such as your specific labels and field locations. To use both labeled and unlabeled data, start with at least five completed forms of the same type for the labeled training data and then add unlabeled data to the required data set.
-
-### Train without labels
-
-Form Recognizer uses unsupervised learning to understand the layout and relationships between fields and entries in your forms. When you submit your input forms, the algorithm clusters the forms by type, discovers what keys and tables are present, and associates values to keys and entries to tables. Training without labels doesn't require manual data labeling or intensive coding and maintenance, and we recommend you try this method first.
-
-See [Build a training data set](./build-training-data-set.md) for tips on how to collect your training documents.
-
-### Train with labels
+You [train your model](./quickstarts/try-sdk-rest-api.md#train-a-custom-model) with labeled data sets. Labeled datasets rely on the prebuilt-layout API, but supplementary human input is included such as your specific labels and field locations. Start with at least five completed forms of the same type for your labeled training data.
When you train with labeled data, the model uses supervised learning to extract values of interest, using the labeled forms you provide. Labeled data results in better-performing models and can produce models that work with complex forms or forms containing values without keys.
Learn more about the Form Recognizer client library by exploring our API referen
> [!div class="nextstepaction"] > [Form Recognizer API reference](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm)
->
+>
applied-ai-services Concept Accuracy Confidence https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-accuracy-confidence.md
+
+ Title: Interpret and improve model accuracy and analysis confidence scores
+
+description: Best practices to interpret the accuracy score from the train model operation and the confidence score from analysis operations.
+++++ Last updated : 02/15/2022+++
+# Interpret and improve accuracy and confidence for custom models
+
+> [!NOTE]
+>
+> * **Custom models do not provide accuracy scores during training**.
+> * Confidence scores for structured fields such as tables are currently unavailable.
+
+Custom models generate an estimated accuracy score when trained. Documents analyzed with a custom model produce a confidence score for extracted fields. In this article, you'll learn to interpret accuracy and confidence scores and best practices for using those scores to improve accuracy and confidence results.
+
+## Accuracy scores
+
+The output of a `build` (v3.0) or `train` (v2.1) custom model operation includes the estimated accuracy score. This score represents the model's ability to accurately predict the labeled value on a visually similar document.
+The accuracy value range is a percentage between 0% (low) and 100% (high). The estimated accuracy is calculated by running a few different combinations of the training data to predict the labeled values.
+
+**Form Recognizer Studio** </br>
+**Trained custom model (invoice)**
++
+## Confidence scores
+
+Form Recognizer analysis results return an estimated confidence for predicted words, key-value pairs, selection marks, regions, and signatures. Currently, not all document fields return a confidence score.
+
+Confidence indicates an estimated probability between 0 and 1 that the prediction is correct. For example, a confidence value of 0.95 (95%) indicates that the prediction is likely correct 19 out of 20 times. For scenarios where accuracy is critical, confidence may be used to determine whether to automatically accept the prediction or flag it for human review.
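The accept-or-review decision described above can be expressed as a small routing helper. This is a sketch of the pattern, not an SDK API; the 0.80 threshold is an illustrative choice, not a service recommendation:

```python
def route_field(field_name, value, confidence, threshold=0.80):
    """Accept a predicted field automatically or flag it for human review."""
    decision = "auto-accept" if confidence >= threshold else "human-review"
    return {
        "field": field_name,
        "value": value,
        "confidence": confidence,
        "decision": decision,
    }
```

Tune the threshold per field: a critical field such as an invoice total might warrant a higher cut-off than a free-text note.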
+
+**Form Recognizer Studio** </br>
+**Analyzed invoice prebuilt-invoice model**
++
+## Interpret accuracy and confidence scores
+
+The following table demonstrates how to interpret both the accuracy and confidence scores to measure your custom model's performance.
+
+| Accuracy | Confidence | Result |
+|--|--|--|
+| High| High | <ul><li>The model is performing well with the labeled keys and document formats. </li><li>You have a balanced training dataset</li></ul> |
+| High | Low | <ul><li>The analyzed document appears different from the training dataset.</li><li>The model would benefit from retraining with at least five more labeled documents. </li><li>These results could also indicate a format variation between the training dataset and the analyzed document. </br>Consider adding a new model.</li></ul> |
+| Low | High | <ul><li>This result is unlikely.</li><li>For low accuracy scores, add more labeled data or split visually distinct documents into multiple models.</li></ul> |
+| Low | Low| <ul><li>Add more labeled data.</li><li>Split visually distinct documents into multiple models.</li></ul>|
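The table above can be captured as a simple decision helper. In this sketch, accuracy may be a percentage (0-100) or a fraction (0-1), and the 0.80 cut-off separating "high" from "low" is an illustrative assumption, not a service-defined value:

```python
def interpret_scores(accuracy, confidence, threshold=0.80):
    """Map an (accuracy, confidence) pair onto the guidance table above."""
    if accuracy > 1:          # accept a percentage and normalize to 0-1
        accuracy /= 100.0
    high_acc = accuracy >= threshold
    high_conf = confidence >= threshold
    if high_acc and high_conf:
        return "model is performing well; training dataset is balanced"
    if high_acc and not high_conf:
        return ("document differs from the training dataset; "
                "retrain with at least five more labeled documents or add a new model")
    if not high_acc and high_conf:
        return "unlikely result; add labeled data or split distinct documents into multiple models"
    return "add more labeled data or split visually distinct documents into multiple models"
```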
+
+## Ensure high model accuracy
+
+The accuracy of your model is affected by variances in the visual structure of your documents. Reported accuracy scores can be inconsistent when the analyzed documents differ from the documents used in training. Keep in mind that a document set can look similar when viewed by humans but appear dissimilar to an AI model. The following list covers best practices for training models with the highest accuracy. Following these guidelines should produce a model with higher accuracy and confidence scores during analysis and reduce the number of documents flagged for human review.
+
+* Ensure that all variations of a document are included in the training dataset. Variations include different formats, for example, digital versus scanned PDFs.
+
+* If you expect the model to analyze both types of PDF documents, add at least five samples of each type to the training dataset.
+
+* Separate visually distinct document types to train different models.
+ * As a general rule, if you remove all user-entered values and the documents look similar, you need to add more training data to the existing model.
+ * If the documents are dissimilar, split your training data into different folders and train a model for each variation. You can then [compose](compose-custom-models.md#create-a-composed-model) the different variations into a single model.
+
+* Make sure that you don't have any extraneous labels.
+
+* For signature and region labeling, don't include the surrounding text.
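The folder-splitting guidance above (one folder of training data per visually distinct variation, later combined via compose) can be sketched as a small script. The naming convention here, where the text before the first underscore identifies the variation, is purely a hypothetical example:

```python
from pathlib import Path
import shutil

def split_by_variation(source_dir, target_dir):
    """Copy training files into one folder per document variation.

    Assumes a hypothetical naming convention: the text before the first
    underscore (for example, 'invoiceA_001.pdf') names the variation.
    """
    source, target = Path(source_dir), Path(target_dir)
    for doc in source.iterdir():
        if not doc.is_file():
            continue
        variation = doc.name.split("_", 1)[0]
        dest = target / variation
        dest.mkdir(parents=True, exist_ok=True)
        shutil.copy2(doc, dest / doc.name)
    return sorted(p.name for p in target.iterdir())
```

Each resulting folder can then be trained as its own model and the models composed into one.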
+
+## Next step
+
+> [!div class="nextstepaction"]
+> [Learn to create custom models](quickstarts/try-v3-form-recognizer-studio.md#custom-models)
applied-ai-services Concept Business Card https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-business-card.md
The business card model combines powerful Optical Character Recognition (OCR) ca
## Development options
-The following resources are supported by Form Recognizer v2.1:
+The following tools are supported by Form Recognizer v2.1:
| Feature | Resources | |-|-| |**Business card model**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/prebuilts-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-business-cards)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=business-card#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following resources are supported by Form Recognizer v3.0:
+The following tools are supported by Form Recognizer v3.0:
| Feature | Resources | Model ID | |-|-|--|
The following resources are supported by Form Recognizer v3.0:
### Try Form Recognizer
-See how data, including name, job title, address, email, and company name, is extracted from business cards using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following:
+See how data, including name, job title, address, email, and company name, is extracted from business cards using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
You will need a business card document. You can use our [sample business card do
* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
## Next steps
applied-ai-services Concept Composed Models https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-composed-models.md
+
+ Title: Form Recognizer composed models
+
+description: Learn about composed custom models
+++++ Last updated : 02/15/2022+
+recommendations: false
++
+# Composed custom models
+
+**Composed models**. A composed model is created by taking a collection of custom models and assigning them to a single model that encompasses your form types. When a document is submitted for analysis to a composed model, the service performs a classification to decide which custom model accurately represents the form presented for analysis.
+
+With composed models, you can assign multiple custom models to a composed model called with a single model ID. It's useful when you've trained several models and want to group them to analyze similar form types. For example, your composed model might include custom models trained to analyze your supply, equipment, and furniture purchase orders. Instead of manually trying to select the appropriate model, you can use a composed model to determine the appropriate custom model for each analysis and extraction.
+
+* ```Custom form``` and ```Custom document``` models can be composed together into a single composed model when they're trained with the same API version or an API version later than ```2021-01-30-preview```. For more information on composing custom template and custom neural models, see [compose model limits](#compose-model-limits).
+* With the model compose operation, you can assign up to 100 trained custom models to a single composed model. When you call Analyze with the composed model ID, Form Recognizer will first classify the form you submitted, choose the best matching assigned model, and then return results for that model.
+* For **_custom template models_**, the composed model can be created using variations of a custom template or different form types. This operation is useful when incoming forms may belong to one of several templates.
+* The response will include a ```docType``` property to indicate which of the composed models was used to analyze the document.
+
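Because the response carries that `docType` property, downstream code can route each analyzed document to type-specific handling. A minimal sketch, assuming a dict shaped like the service response (the `docType` values and handler names below are illustrative):

```python
def dispatch_by_doc_type(analyze_result, handlers, default=None):
    """Route composed-model analysis results to a handler keyed by docType.

    `analyze_result` is assumed to be a dict shaped like the service
    response, where each entry in `documents` carries a `docType` string.
    """
    for document in analyze_result.get("documents", []):
        doc_type = document.get("docType", "")
        handler = handlers.get(doc_type, default)
        if handler:
            yield handler(document)
```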
+## Compose model limits
+
+> [!NOTE]
+> With the addition of **_custom neural model_** , there are a few limits to the compatibility of models that can be composed together.
+
+### Composed model compatibility
+
|Custom model type | API version |Custom form 2021-01-30-preview (v3.0)| Custom document 2021-01-30-preview (v3.0) | Custom form GA version (v2.1) or earlier|
+|--|--|--|--|--|
+|**Custom template** (updated custom form)| 2021-01-30-preview | &#10033; | ✓ | X |
+|**Custom neural**| trained with current API version (2021-01-30-preview) | ✓ | ✓ | X |
+|**Custom form**| Custom form GA version (v2.1) or earlier | X | X | ✓ |
+
+**Table symbols**: ✓ = supported; **X** = not supported; &#10033; = unsupported for this API version, but will be supported in a future API version.
+
+* To compose a model trained with a prior version of the API (2.1 or earlier), train a model with the 3.0 API using the same labeled dataset to ensure that it can be composed with other models.
+
+* Models composed with v2.1 of the API will continue to be supported, requiring no updates.
+
+* The limit for maximum number of custom models that can be composed is 100.
+
+## Development options
+
+The following resources are supported by Form Recognizer **v3.0** (preview):
+
+| Feature | Resources |
+|-|-|
+|_**Custom model**_| <ul><li>[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/BuildDocumentModel)</li><li>[C# SDK](quickstarts/try-v3-csharp-sdk.md)</li><li>[Java SDK](quickstarts/try-v3-java-sdk.md)</li><li>[JavaScript SDK](quickstarts/try-v3-javascript-sdk.md)</li><li>[Python SDK](quickstarts/try-v3-python-sdk.md)</li></ul>|
+| _**Composed model**_| <ul><li>[Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/ComposeDocumentModel)</li><li>[C# SDK](/dotnet/api/azure.ai.formrecognizer.documentanalysis.documentmodeladministrationclient.startcreatecomposedmodel?view=azure-dotnet-preview&preserve-view=true)</li><li>[Java SDK](/java/api/com.azure.ai.formrecognizer.administration.documentmodeladministrationclient.begincreatecomposedmodel?view=azure-java-preview&preserve-view=true)</li><li>[JavaScript SDK](/javascript/api/@azure/ai-form-recognizer/documentmodeladministrationclient?view=azure-node-preview#@azure-ai-form-recognizer-documentmodeladministrationclient-begincomposemodel&preserve-view=true)</li><li>[Python SDK](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formtrainingclient?view=azure-python-preview#azure-ai-formrecognizer-formtrainingclient-begin-create-composed-model&preserve-view=true)</li></ul>|
+
+The following resources are supported by Form Recognizer v2.1:
+
+| Feature | Resources |
+|-|-|
+|_**Custom model**_| <ul><li>[Form Recognizer labeling tool](https://fott-2-1.azurewebsites.net)</li><li>[REST API](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-forms-with-a-custom-model)</li><li>[Client library SDK](quickstarts/try-sdk-rest-api.md)</li><li>[Form Recognizer Docker container](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)</li></ul>|
+| _**Composed model**_ |<ul><li>[Form Recognizer labeling tool](https://fott-2-1.azurewebsites.net/)</li><li>[REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/Compose)</li><li>[C# SDK](/dotnet/api/azure.ai.formrecognizer.training.createcomposedmodeloperation?view=azure-dotnet&preserve-view=true)</li><li>[Java SDK](/java/api/com.azure.ai.formrecognizer.models.createcomposedmodeloptions?view=azure-java-stable&preserve-view=true)</li><li>[JavaScript SDK](/javascript/api/@azure/ai-form-recognizer/begincreatecomposedmodeloptions?view=azure-node-latest&preserve-view=true)</li><li>[Python SDK](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formtrainingclient?view=azure-python#azure-ai-formrecognizer-formtrainingclient-begin-create-composed-model&preserve-view=true)</li></ul>|
++
+## Next steps
+
+Learn to create and compose custom models:
+
+> [!div class="nextstepaction"]
+> [**Form Recognizer v2.1 (GA)**](compose-custom-models.md)
applied-ai-services Concept Custom Neural https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom-neural.md
+
+ Title: Form Recognizer custom neural model
+
+description: Learn about custom neural (neural) model type, its features and how you train a model with high accuracy to extract data from structured and unstructured documents
+++++ Last updated : 02/15/2022++
+recommendations: false
++
+# Form Recognizer custom neural model
+
+Custom neural models, or neural models, are deep-learned models that combine layout and language features to accurately extract labeled fields from documents. The base custom neural model is trained on various document types, which makes it suitable for training to extract fields from structured, semi-structured, and unstructured documents. The table below lists common document types for each category:
+
+|Documents | Examples |
+||--|
+|structured| surveys, questionnaires|
+|semi-structured | invoices, purchase orders |
+|unstructured | contracts, letters|
+
+Custom neural models share the same labeling format and strategy as custom template models. Currently custom neural models only support a subset of the field types supported by custom template models.
+
+## Model capabilities
+
+Custom neural models currently support only key-value pairs and selection marks; future releases will include support for structured fields (tables) and signatures.
+
+| Form fields | Selection marks | Tables | Signature | Region |
+|--|--|--|--|--|
+| Supported| Supported | Unsupported | Unsupported | Unsupported |
+
+## Supported regions
+
+In public preview, custom neural models can be trained only in select Azure regions.
+
+* AustraliaEast
+* BrazilSouth
+* CanadaCentral
+* CentralIndia
+* CentralUS
+* EastUS
+* EastUS2
+* FranceCentral
+* JapanEast
+* JioIndiaWest
+* KoreaCentral
+* NorthEurope
+* SouthCentralUS
+* SoutheastAsia
+* UKSouth
+* WestEurope
+* WestUS
+* WestUS2
+* WestUS3
+
+You can copy a model trained in one of the regions listed above to any other region for use.
+
+## Best practices
+
+Custom neural models differ from custom template models in a few ways.
+
+### Dealing with variations
+
+Custom neural models can generalize across different formats of a single document type. As a best practice, create a single model for all variations of a document type. Add at least five labeled samples for each of the different variations to the training dataset.
+
+### Field naming
+
+When you label your data, choosing a field name that's relevant to the value improves the accuracy of the extracted key-value pairs. For example, for a field value containing the supplier ID, consider naming the field "supplier_id". Field names should be in the language of the document.
+
+### Labeling contiguous values
+
+The value tokens/words of a single field must be either:
+
+* A consecutive sequence in natural reading order, without interleaving with other fields
+* In a region that doesn't cover any other fields
+
+### Representative data
+
+Values in training cases should be diverse and representative. For example, if a field is named "date", the values for this field should be dates. Synthetic values, like random strings, can affect model performance.
++
+## Current Limitations
+
+* The model doesn't recognize values split across page boundaries.
+* Custom neural models are only trained in English and model performance will be lower for documents in other languages.
+* If a dataset labeled for custom template models is used to train a custom neural model, the unsupported field types are ignored.
+* Custom neural models are limited to 10 build operations per month. Open a support request if you need the limit increased.
+
+## Training a model
+
+Custom neural models are only available in the [v3 API](v3-migration-guide.md).
+
+| Document Type | REST API | SDK | Label and Test Models|
+|--|--|--|--|
+| Custom document | [Form Recognizer 3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument)| [Form Recognizer Preview SDK](quickstarts/try-v3-python-sdk.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
+
+The build operation used to train a model supports a new ```buildMode``` property. To train a custom neural model, set the ```buildMode``` to ```neural```.
+
+```REST
+https://{endpoint}/formrecognizer/documentModels:build?api-version=2022-01-30-preview
+
+{
+ "modelId": "string",
+ "description": "string",
+ "buildMode": "neural",
+ "azureBlobSource":
+ {
+ "containerUrl": "string",
+ "prefix": "string"
+ }
+}
+```
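As a sketch, the request body above can be assembled programmatically before it's posted to the build endpoint. The helper name, model ID, and storage URL below are hypothetical placeholders; only the JSON shape comes from the example above:

```python
import json

def build_request_body(model_id, container_url, build_mode="neural", prefix=""):
    """Assemble the JSON body for the documentModels:build request shown above."""
    return json.dumps({
        "modelId": model_id,
        "description": "hypothetical example model",  # placeholder description
        "buildMode": build_mode,  # "neural" or "template"
        "azureBlobSource": {"containerUrl": container_url, "prefix": prefix},
    })

# Placeholder storage container URL with a SAS token
payload = build_request_body(
    "sample-neural-model",
    "https://<account>.blob.core.windows.net/training-data?<sas-token>")
```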
+## Next steps
+
+* Train a custom model:
+
+ > [!div class="nextstepaction"]
+ > [Form Recognizer quickstart](quickstarts/try-v3-form-recognizer-studio.md#custom-models)
+
+* View the REST API:
+
+ > [!div class="nextstepaction"]
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)
applied-ai-services Concept Custom Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom-template.md
+
+ Title: Form Recognizer custom template model
+
+description: Learn about the custom template model type, its features and how you train a model with high accuracy to extract data from structured or templated forms
+ Last updated : 02/15/2022
+recommendations: false
++
+# Form Recognizer custom template model
+
+Custom template models, formerly custom form models, are easy-to-train models that accurately extract labeled key-value pairs, selection marks, tables, regions, and signatures from documents. Template models use layout cues to extract values from documents and are suitable for extracting fields from highly structured documents with defined visual templates.
+
+Custom template models share the same labeling format and strategy as custom neural models, with support for more field types and languages.
+
+## Model capabilities
+
+Custom template models support key-value pairs, selection marks, tables, signature fields, and selected regions.
+
+| Form fields | Selection marks | Structured fields (Tables) | Signature | Selected regions |
+|--|--|--|--|--|
+| Supported| Supported | Supported | Preview | Supported |
+
+## Dealing with variations
+
+Template models rely on a defined visual template; changes to the template will result in lower accuracy. In those instances, split your training dataset to include at least five samples of each template and train a model for each of the variations. You can then [compose](concept-composed-models.md) the models into a single endpoint. When dealing with subtle variations, like digital PDF documents and images, it's best to include at least five examples of each type in the same training dataset.
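A hedged sketch of the compose step described above: under the assumption that the v3 preview compose operation (```documentModels:compose```) takes a ```componentModels``` list of trained model IDs, the request body could be assembled like this (all model IDs are hypothetical):

```python
import json

def compose_request_body(composed_id, component_ids):
    """Build a documentModels:compose request body from per-template model IDs."""
    return json.dumps({
        "modelId": composed_id,
        "description": "hypothetical composed model",
        # One entry per previously trained template-variation model
        "componentModels": [{"modelId": mid} for mid in component_ids],
    })

body = compose_request_body("purchase-orders", ["po-variant-a", "po-variant-b"])
```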
+
+## Training a model
+
+Template models are generally available with the [v2.1 API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm) and in preview with the [v3 API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/BuildDocumentModel). If you're starting with a new project or have an existing labeled dataset, work with the v3 API and Form Recognizer Studio to train a custom template model.
+
+| Model | REST API | SDK | Label and Test Models|
+|--|--|--|--|
+| Custom template (preview) | [Form Recognizer 3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument)| [Form Recognizer Preview SDK](quickstarts/try-v3-python-sdk.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)|
+| Custom template | [Form Recognizer 2.1 (GA)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm)| [Form Recognizer SDK](quickstarts/get-started-sdk-rest-api.md?pivots=programming-language-python)| [Form Recognizer Sample labeling tool](https://fott-2-1.azurewebsites.net/)|
+
+In the v3 API, the build operation used to train a model supports a new ```buildMode``` property. To train a custom template model, set the ```buildMode``` to ```template```.
+
+```REST
+https://{endpoint}/formrecognizer/documentModels:build?api-version=2022-01-30-preview
+
+{
+ "modelId": "string",
+ "description": "string",
+ "buildMode": "template",
+ "azureBlobSource":
+ {
+ "containerUrl": "string",
+ "prefix": "string"
+ }
+}
+```
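Build is asynchronous: the service replies with an operation URL (typically an `Operation-Location` header) to poll for completion. A minimal polling sketch, with a stub standing in for the HTTP calls; the status payload shape and helper names are assumptions, not the documented API:

```python
import time

def wait_for_operation(fetch_status, interval=0.0, max_polls=50):
    """Poll a status callable until the build operation reaches a terminal state."""
    for _ in range(max_polls):
        status = fetch_status()  # would be a GET against the operation URL
        if status.get("status") in ("succeeded", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError("build operation did not finish in time")

# Stub sequence standing in for repeated GETs against the operation URL
responses = iter([{"status": "notStarted"}, {"status": "running"}, {"status": "succeeded"}])
outcome = wait_for_operation(lambda: next(responses))
```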
++
+## Next steps
+
+* Train a custom template model:
+
+ > [!div class="nextstepaction"]
+ > [Form Recognizer quickstart](quickstarts/try-sdk-rest-api.md)
+
+* Learn more about custom neural models:
+
+ > [!div class="nextstepaction"]
+ > [Custom neural models](concept-custom-neural.md)
+
+* View the REST API:
+
+ > [!div class="nextstepaction"]
+ > [Form Recognizer API v2.1](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm)
applied-ai-services Concept Custom https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-custom.md
Title: Form Recognizer custom and composed models
-description: Learn how to create, use, and manage Azure Form Recognizer custom and composed models.
+description: Learn to create, use, and manage Form Recognizer custom and composed models.
- Previously updated : 11/02/2021
+ Last updated : 02/15/2022
recommendations: false
+# Form Recognizer custom models
-# Form Recognizer custom and composed models
+Form Recognizer uses advanced machine learning technology to detect and extract information from forms and documents and returns the extracted data in a structured JSON output. With Form Recognizer, you can use pre-built or pre-trained models or you can train standalone custom models. Standalone custom models can be combined to create composed models.
+To create a custom model, you label a dataset of documents with the values you want extracted and train the model on the labeled dataset. You only need five examples of the same form or document type to get started.
-Form Recognizer uses advanced machine-learning technology to detect and extract information from document images and return the extracted data in a structured JSON output. With Form Recognizer, you can train standalone custom models or combine custom models to create composed models.
+## Custom model types
+Custom models can be one of two types, [**custom template**](concept-custom-template.md) or [**custom neural**](concept-custom-neural.md) models. The labeling and training processes for both model types are identical, but the models differ as follows:
-* **Custom models**: By using custom models, you can analyze and extract data from forms and documents specific to your business. Custom models are trained for your distinct data and use cases.
-* **Composed models**: A composed model is created by taking a collection of custom models and assigning them to a single model that encompasses your form types. When a document is submitted to a composed model, the service performs a classification step to decide which custom model accurately represents the form presented for analysis.
+### Custom template model
- :::image type="content" source="media/studio/analyze-custom.png" alt-text="Screenshot that shows the Form Recognizer tool analyze-a-custom-form window.":::
+ The custom template model relies on a consistent visual template to extract the labeled data. The accuracy of your model is affected by variances in the visual structure of your documents. Questionnaires and application forms are examples of consistent visual templates. Your training set will consist of structured documents where the formatting and layout are static and constant from one document instance to the next. Custom template models support key-value pairs, selection marks, tables, signature fields, and regions, and can be trained on documents in any of the [supported languages](language-support.md). For more information, *see* [custom template models](concept-custom-template.md).
-## What is a custom model?
+> [!TIP]
+>
+>To confirm that your training documents present a consistent visual template, remove all the user-entered data from each form in the set. If the blank forms are identical in appearance, they represent a consistent visual template.
+>
+> For more information, *see* [Interpret and improve accuracy and confidence for custom models](concept-accuracy-confidence.md).
+### Custom neural model
-A custom model is a machine-learning program trained to recognize form fields within your distinct content and extract key-value pairs and table data. You only need five examples of the same form type to get started and your custom model can be trained with or without labeled datasets.
+The custom neural model is a deep learning model that relies on a base model trained on a large collection of documents labeled with key-value pairs. This model is then fine-tuned, or adapted, to your data when you train the model with a labeled dataset. Custom neural models support field extraction from structured, semi-structured, and unstructured documents. Custom neural models currently support English-language documents. When choosing between the two model types, start with a neural model if it meets your functional needs. See [neural models](concept-custom-neural.md) to learn more about custom document models.
-## What is a composed model?
+## Model features
-With composed models, you can assign multiple custom models to a composed model called with a single model ID. It's useful when you've trained several models and want to group them to analyze similar form types. For example, your composed model might include custom models trained to analyze your supply, equipment, and furniture purchase orders. Instead of manually trying to select the appropriate model, you can use a composed model to determine the appropriate custom model for each analysis and extraction.
+The table below compares custom template and custom neural features:
-## Development options
+## Custom model tools
-The following resources are supported by Form Recognizer v2.1:
+The following tools are supported by Form Recognizer v2.1:
| Feature | Resources | |-|-| |Custom model| <ul><li>[Form Recognizer labeling tool](https://fott-2-1.azurewebsites.net)</li><li>[REST API](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-forms-with-a-custom-model)</li><li>[Client library SDK](quickstarts/try-sdk-rest-api.md)</li><li>[Form Recognizer Docker container](containers/form-recognizer-container-install-run.md?tabs=custom#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following resources are supported by Form Recognizer v3.0:
+The following tools are supported by Form Recognizer v3.0:
| Feature | Resources | |-|-|
See how data is extracted from your specific or unique documents by using custom
#### Sample Labeling tool
-You need a set of at least six forms of the same type. You use this data to train the model and test a form. You can use the [sample dataset](https://go.microsoft.com/fwlink/?linkid=2090451). Download and extract the *sample_data.zip* file. Then upload the contents to your Azure Blob Storage container.
-In the Form Recognizer UI:
+|Feature |Custom Template | Custom Neural |
+|--|--|--|
+|Document structure |Template, fixed form, and structured documents.| Structured, semi-structured, and unstructured documents.|
+|Training time | 1 - 5 minutes | 20 - 60 minutes |
+|Data extraction| Key-value pairs, tables, selection marks, signatures, and regions| Key-value pairs and selection marks.|
+|Models per Document type | Requires one model for each document-type variation| Supports a single model for all document-type variations.|
+|Language support| See [custom template model language support](language-support.md)| The custom neural model currently supports English-language documents only.|
-1. On the **Sample Labeling tool** home page, select **Use Custom to train a model with labels and get key value pairs**.
+## Model capabilities
- :::image type="content" source="media/label-tool/fott-use-custom.png" alt-text="Screenshot that shows selecting the custom option.":::
+This table compares the supported data extraction areas:
-1. In the next window, select **New Project**.
+|Model| Form fields | Selection marks | Structured fields (Tables) | Signature | Region labeling |
+|--|:--:|:--:|:--:|:--:|:--:|
+|Custom template| ✔ | ✔ | ✔ | &#10033; | ✔ |
+|Custom neural| ✔ | ✔ |**n/a**| **n/a** | **n/a** |
- :::image type="content" source="media/label-tool/fott-new-project.png" alt-text="Screenshot that shows selecting New Project.":::
+**Table symbols**: ✔ = supported; &#10033; = preview; **n/a** = currently unavailable
-For more detailed instructions, see the [Sample Labeling tool](quickstarts/try-sample-label-tool.md) quickstart.
+> [!TIP]
+> When choosing between the two model types, start with a custom neural model if it meets your functional needs. See [custom neural](concept-custom-neural.md) to learn more about custom neural models.
-> [!div class="nextstepaction"]
-> [Try Sample Labeling tool](https://fott-2-1.azurewebsites.net/projects/create)
+## Custom model development options
-## Input requirements
+The following table describes the features available with the associated tools and SDKs. As a best practice, ensure that you use the compatible tools listed here.
-Meet the following requirements:
+| Document type | REST API | SDK | Label and Test Models|
+|--|--|--|--|
+| Custom form 2.1 | [Form Recognizer 2.1 GA API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm) | [Form Recognizer SDK](quickstarts/get-started-sdk-rest-api.md?pivots=programming-language-python)| [Sample labeling tool](https://fott-2-1.azurewebsites.net/)|
+| Custom template 3.0 | [Form Recognizer 3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument)| [Form Recognizer Preview SDK](quickstarts/try-v3-python-sdk.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)|
+| Custom neural | [Form Recognizer 3.0 (preview)](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument)| [Form Recognizer Preview SDK](quickstarts/try-v3-python-sdk.md)| [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
++
+> [!NOTE]
+> Custom template models trained with the 3.0 API will have a few improvements over the 2.1 API stemming from improvements to the OCR engine. Datasets used to train a custom template model using the 2.1 API can still be used to train a new model using the 3.0 API.
* For best results, provide one clear photo or high-quality scan per document. * Supported file formats are JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location.
The [Sample Labeling tool](https://fott-2-1.azurewebsites.net/) doesn't support
## Supported languages and locales
- The Form Recognizer preview version introduces more language support for custom models. For a list of supported handwritten and printed text, see [Language support](language-support.md#layout-and-custom-model).
+ The Form Recognizer preview version introduces more language support for custom models. For a list of supported handwritten and printed text, see [Language support](language-support.md).
## Form Recognizer v3.0 (preview)
The [Sample Labeling tool](https://fott-2-1.azurewebsites.net/) doesn't support
After your training set is labeled, you can train your custom model and use it to analyze documents. The signature fields specify whether a signature was detected or not.
-## Next steps
-* Complete a Form Recognizer quickstart:
-
- > [!div class="nextstepaction"]
- > [Form Recognizer quickstart](quickstarts/try-sdk-rest-api.md)
+## Next steps
-* Explore the REST API:
+Explore Form Recognizer quickstarts and REST APIs:
- > [!div class="nextstepaction"]
- > [Form Recognizer API v2.1](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm)
+| Quickstart | REST API|
+|--|--|
+|[v3.0 Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) |[Form Recognizer v3.0 API 2022-01-30-preview](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)|
+| [v2.1 quickstart](quickstarts/get-started-sdk-rest-api.md) | [Form Recognizer API v2.1](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/BuildDocumentModel) |
applied-ai-services Concept Form Recognizer Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-form-recognizer-studio.md
>[!NOTE] > Form Recognizer Studio is currently in public preview. Some features may not be supported or have limited capabilities.
-[Form Recognizer Studio preview](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service into your applications. Use the [Form Recognizer Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) to get started analyzing documents with pre-trained models. Build custom form models and reference the models in your applications using the [Python SDK preview](quickstarts/try-v3-python-sdk.md) and other quickstarts.
+[Form Recognizer Studio preview](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service into your applications. Use the [Form Recognizer Studio quickstart](quickstarts/try-v3-form-recognizer-studio.md) to get started analyzing documents with pre-trained models. Build custom template models and reference the models in your applications using the [Python SDK preview](quickstarts/try-v3-python-sdk.md) and other quickstarts.
The following image shows the Invoice prebuilt model feature at work.
The following image shows the Invoice prebuilt model feature at work.
The following Form Recognizer service features are available in the Studio.
-* **Layout**: Try out Form Recognizer's Layout feature to extract text, tables, selection marks, and structure information from documents (PDF, TIFF) and images (JPG, PNG, BMP). Start with the [Studio Layout quickstart](quickstarts/try-v3-form-recognizer-studio.md#layout). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [Layout overview](concept-layout.md) to learn more and get started with the [Python SDK quickstart for Layout](quickstarts/try-v3-python-sdk.md#layout-model).
+* **Read**: Try out Form Recognizer's Read feature to extract text lines, words, detected languages, and handwriting style, if detected. Start with the [Studio Read feature](https://formrecognizer.appliedai.azure.com/studio/read). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [Read overview](concept-read.md) to learn more and get started with the [Python SDK quickstart](quickstarts/try-v3-python-sdk.md).
-* **Prebuilt models**: Form Recognizer's pre-built models enable you to add intelligent form processing to your apps and flows without having to train and build your own models. Start with the [Studio Prebuilts quickstart](quickstarts/try-v3-form-recognizer-studio.md#prebuilt-models). Explore with sample documents and your documents. Use the interactive visualization, extracted fields list, and JSON output to understand how the feature works. See the [Models overview](concept-model-overview.md) to learn more and get started with the [Python SDK quickstart for Prebuilt Invoice](quickstarts/try-v3-python-sdk.md#prebuilt-model).
+* **Layout**: Try out Form Recognizer's Layout feature to extract text, tables, selection marks, and structure information. Start with the [Studio Layout feature](https://formrecognizer.appliedai.azure.com/studio/layout). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [Layout overview](concept-layout.md) to learn more and get started with the [Python SDK quickstart for Layout](quickstarts/try-v3-python-sdk.md#layout-model).
-* **Custom models**: Form Recognizer's custom models enable you to extract fields and values from models trained with your data, tailored to your forms and documents. Create standalone custom models or combine two or more custom models to create a composed model to extract data from multiple form types. Start with the [Studio Custom models quickstart](quickstarts/try-v3-form-recognizer-studio.md#custom-projects). Use the online wizard, labeling interface, training step, and visualizations to understand how the feature works. Test the custom model with your sample documents and iterate to improve the model. See the [Custom models overview](concept-custom.md) to learn more and use the [Form Recognizer v3.0 preview migration guide](v3-migration-guide.md) to start integrating the new models with your applications.
+* **General Documents**: Try out Form Recognizer's General Documents feature to extract key-value pairs and entities. Start with the [Studio General Documents feature](https://formrecognizer.appliedai.azure.com/studio/document). Explore with sample documents and your documents. Use the interactive visualization and JSON output to understand how the feature works. See the [General Documents overview](concept-general-document.md) to learn more and get started with the [Python SDK quickstart for the general document model](quickstarts/try-v3-python-sdk.md#general-document-model).
-* **Custom models: Labeling features**: Form Recognizer Custom model creation requires identifying the fields to be extracted and labeling those fields before training the custom models. Labeling text, selection marks, tabular data, and other content types are typically assisted with a user interface to ease the training workflow. For example, use the [Label as tables](quickstarts/try-v3-form-recognizer-studio.md#labeling-as-tables) and [Labeling for signature detection](quickstarts/try-v3-form-recognizer-studio.md#labeling-for-signature-detection) quickstarts to understand the labeling experience in Form Recognizer Studio.
+* **Prebuilt models**: Form Recognizer's pre-built models enable you to add intelligent document processing to your apps and flows without having to train and build your own models. As an example, start with the [Studio Invoice feature](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice). Explore with sample documents and your documents. Use the interactive visualization, extracted fields list, and JSON output to understand how the feature works. See the [Models overview](concept-model-overview.md) to learn more and get started with the [Python SDK quickstart for Prebuilt Invoice](quickstarts/try-v3-python-sdk.md#prebuilt-model).
+
+* **Custom models**: Form Recognizer's custom models enable you to extract fields and values from models trained with your data, tailored to your forms and documents. Create standalone custom models or combine two or more custom models to create a composed model to extract data from multiple form types. Start with the [Studio Custom models feature](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects). Use the online wizard, labeling interface, training step, and visualizations to understand how the feature works. Test the custom model with your sample documents and iterate to improve the model. See the [Custom models overview](concept-custom.md) to learn more and use the [Form Recognizer v3.0 preview migration guide](v3-migration-guide.md) to start integrating the new models with your applications.
## Next steps
applied-ai-services Concept General Document https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-general-document.md
Title: Form Recognizer general document model | Preview
description: Concepts encompassing data extraction and analysis using the prebuilt general document preview model
- Previously updated : 10/07/2021
+ Last updated : 02/15/2022
recommendations: false
-<!-- markdownlint-disable MD033 -->
+<!-- markdownlint-disable MD033 -->
# Form Recognizer general document model (preview)
-The General document preview model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to extract key-value pairs and entities from documents. General document is only available with the preview (v3.0) API. For more information on using the preview (v3.0) API, see our [migration guide](v3-migration-guide.md).
+The General document preview model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to extract key-value pairs, selection marks, and entities from documents. General document is only available with the preview (v3.0) API. For more information on using the preview (v3.0) API, see our [migration guide](v3-migration-guide.md).
-The general document API supports most form types and will analyze your documents and associate values to keys and entries to tables that it discovers. It is ideal for extracting common key-value pairs from documents. You can use the general document model as an alternative to [training a custom model without labels](compose-custom-models.md#train-without-labels).
+
+The general document API supports most form types and will analyze your documents and extract keys and associated values. It is ideal for extracting common key-value pairs from documents. You can use the general document model as an alternative to training a custom model without labels.
+
+> [!NOTE]
+> The ```2022-01-30-preview``` update to the general document model adds support for selection marks.
## General document features
-* There is no need to train a custom model to extract key-value pairs.
+* The general document model is a pre-trained model; it doesn't require labels or training.
+
+* A single API extracts key-value pairs, selection marks, entities, text, tables, and structure from documents.
+
+* The general document model supports structured, semi-structured, and unstructured documents.
-* A single API is used to extract key value pairs, entities, text, tables, and structure from documents.
+* Key names are spans of text within the document that are associated with a value.
-* It is a pre-trained model that will be periodically trained on new data to improve coverage and accuracy.
-* The general document model supports structured, semi-structured, and unstructured data.
+* Selection marks are identified as fields with a value of ```:selected:``` or ```:unselected:```.
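The bullets above can be sketched in code. Assuming the preview analyze response contains a `keyValuePairs` list with `key`/`value` content (the excerpt below is a hypothetical sample, not actual service output), extracted fields and selection marks can be split apart like this:

```python
# Hypothetical excerpt of an analyze response from the general document model
analyze_result = {
    "keyValuePairs": [
        {"key": {"content": "Full name"}, "value": {"content": "Ada Lovelace"}},
        {"key": {"content": "Subscribe to newsletter"}, "value": {"content": ":selected:"}},
        {"key": {"content": "Middle name"}},  # key detected with no associated value
    ]
}

def summarize_pairs(result):
    """Split extracted pairs into plain key-values and selection-mark fields."""
    fields, selections = {}, {}
    for pair in result.get("keyValuePairs", []):
        key = pair["key"]["content"]
        value = pair.get("value", {}).get("content")  # None for isolated keys
        if value in (":selected:", ":unselected:"):
            selections[key] = value == ":selected:"
        elif value is not None:
            fields[key] = value
    return fields, selections

fields, selections = summarize_pairs(analyze_result)
```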
***Sample document processed in the Form Recognizer Studio***
The general document API supports most form types and will analyze your document
## Development options
-The following resources are supported by Form Recognizer v3.0:
+The following tools are supported by Form Recognizer v3.0:
| Feature | Resources | |-|-|
The following resources are supported by Form Recognizer v3.0:
### Try Form Recognizer
-See how data, including tables, values, and entities, is extracted from forms and documents using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following:
+See how data is extracted from forms and documents using the Form Recognizer Studio or our Sample Labeling tool.
+
+You'll need the following resources:
* An Azure subscriptionΓÇöyou can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
-* A [Form Recognizer instance](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your API key and endpoint.
+* A [Form Recognizer instance](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your API key and endpoint.
:::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
See how data, including tables, values, and entities, is extracted from forms an
## Key-value pairs
-Key value pairs are specific spans within the document that identify a label or key and its associated response or value. In a structured form, this could be the label and the value the user entered for that field or in an unstructured document it could be the date a contract was executed on based on the text in a paragraph. The AI model is trained to extract identifiable keys and values based on a wide variety of document types, formats, and structures.
-Keys can also exist in isolation when the model detects that a key exists, with no associated value or when processing optional fields. For example, a middle name field may be left blank on a form in some instances. Key value pairs are always spans of text contained in the document and if you have documents where same value is described in different ways, for example a customer or a user, the associated key will be either customer or user based on what the document contained.
+Key-value pairs are specific spans within the document that identify a label, or key, and its associated response, or value. In a structured form, this could be the label and the value the user entered for that field; in an unstructured document, it could be the date a contract was executed, based on the text in a paragraph. The AI model is trained to extract identifiable keys and values based on a wide variety of document types, formats, and structures.
+
+Keys can also exist in isolation when the model detects a key with no associated value, or when processing optional fields. For example, a middle name field may be left blank on a form in some instances. Key-value pairs are always spans of text contained in the document. If your documents describe the same value in different ways, for example as a customer or a user, the associated key will be either customer or user, based on what the document contains.
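A minimal sketch of flattening these pairs into a dictionary, assuming the v3.0 preview `analyzeResult` shape (`keyValuePairs` entries with `key.content` and an optional `value.content`); the sample payload is invented for illustration:

```python
def extract_key_value_pairs(analyze_result):
    """Flatten keyValuePairs into a dict; keys with no detected value map to None."""
    pairs = {}
    for kvp in analyze_result.get("keyValuePairs", []):
        key = kvp["key"]["content"]
        value = kvp.get("value")  # absent when the model found a key in isolation
        pairs[key] = value["content"] if value else None
    return pairs

# Invented sample fragment: a filled-in first name and a blank middle name field.
sample_kvp = {
    "keyValuePairs": [
        {"key": {"content": "First name:"}, "value": {"content": "Avery"}},
        {"key": {"content": "Middle name:"}},
    ]
}
print(extract_key_value_pairs(sample_kvp))
# {'First name:': 'Avery', 'Middle name:': None}
```

Note how the blank middle name field surfaces as a key mapped to `None`, matching the behavior described above.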
+## Entities

Natural language processing models can identify parts of speech and classify each token or word. The named entity recognition model can identify entities such as people, locations, and dates to provide a richer experience. Identifying entities enables you to distinguish between customer types, for example, an individual or an organization.
-The key value pair extraction model and entity identification model are run in parallel on the entire document and not just on the values of the extracted key value pairs. This ensures that complex structures where a key cannot be identified is still enriched by identifying the entities referenced. You can still match keys or values to entities based on the offsets of the identified spans.
-* The general document is a pre-trained model and can be directly invoked via the REST API.
+The key-value pair extraction model and the entity identification model run in parallel on the entire document, not just on the values of the extracted key-value pairs. This ensures that complex structures where a key can't be identified are still enriched by the entities they reference. You can still match keys or values to entities based on the offsets of the identified spans.
+
+* The general document is a pre-trained model and can be directly invoked via the REST API.
* The general document model supports named entity recognition (NER) for several entity categories. NER is the ability to identify different entities in text and categorize them into pre-defined classes or types such as: person, location, event, product, and organization. Extracting entities can be useful in scenarios where you want to validate extracted values. The entities are extracted from the entire content and not just the extracted values.
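Since entities are extracted from the entire content, matching an entity back to a key or value reduces to comparing span offsets. A minimal sketch, assuming spans carry `offset` and `length` as in the preview response; the sample entities are invented:

```python
def spans_overlap(a, b):
    """True if two spans {offset, length} share any character range."""
    return a["offset"] < b["offset"] + b["length"] and b["offset"] < a["offset"] + a["length"]

def entities_for_value(value_spans, entities):
    """Return the entities whose spans overlap any span of an extracted value."""
    return [e for e in entities
            if any(spans_overlap(vs, es) for vs in value_spans for es in e["spans"])]

# Invented sample entities from a document's entity list.
entities = [
    {"category": "Person", "content": "Avery Smith", "spans": [{"offset": 12, "length": 11}]},
    {"category": "Location", "content": "Redmond", "spans": [{"offset": 40, "length": 7}]},
]
matches = entities_for_value([{"offset": 12, "length": 11}], entities)
# matches[0]["category"] == "Person"
```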
The key value pair extraction model and entity identification model are run in p
* Supported file formats: JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location.
* For PDF and TIFF, up to 2,000 pages can be processed (with a free tier subscription, only the first two pages are processed).
* The file size must be less than 50 MB.
-* Image dimensions must be between 50 x 50 pixels and 10000 x 10000 pixels.
+* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels.
* PDF dimensions are up to 17 x 17 inches, corresponding to Legal or A3 paper size, or smaller.
* The total size of the training data is 500 pages or less.
* If your PDFs are password-locked, you must remove the lock before submission.
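Checking these requirements client-side before submission avoids a wasted round trip. A hypothetical pre-flight helper; the extension set and limits simply mirror the bullets above and are not part of the service API:

```python
import os

MAX_BYTES = 50 * 1024 * 1024          # file size must be under 50 MB
MIN_DIM, MAX_DIM = 50, 10_000         # image dimensions in pixels
SUPPORTED = {".jpeg", ".jpg", ".png", ".bmp", ".tiff", ".tif", ".pdf"}

def validate_input(path, size_bytes, width=None, height=None):
    """Return a list of requirement violations (an empty list means acceptable)."""
    problems = []
    if os.path.splitext(path)[1].lower() not in SUPPORTED:
        problems.append("unsupported file format")
    if size_bytes >= MAX_BYTES:
        problems.append("file is 50 MB or larger")
    if width is not None and height is not None:
        if not (MIN_DIM <= width <= MAX_DIM and MIN_DIM <= height <= MAX_DIM):
            problems.append("image dimensions outside 50 x 50 to 10,000 x 10,000 pixels")
    return problems
```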
The key value pair extraction model and entity identification model are run in p
* Keys are spans of text extracted from the document. For semi-structured documents, keys may need to be mapped to an existing dictionary of keys.
-* Expect to see key value pairs with a key, but no value. For example if a user chose to not provide an email address on the form.
+* Expect to see key-value pairs with a key but no value. For example, if a user chose not to provide an email address on the form.
## Next steps

* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
> [!div class="nextstepaction"]
-> [Try the Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
+> [Try the Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio)
applied-ai-services Concept Id Document https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-id-document.md
# Form Recognizer ID document model
-The ID document model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extracts key information from U.S. Driver's Licenses (all 50 states and District of Columbia) and international passport biographical pages (excluding visa and other travel documents). The API analyzes identity documents; extracts key information such as first name, last name, address, and date of birth; and returns a structured JSON data representation.
+The ID document model combines Optical Character Recognition (OCR) with deep learning models to analyze and extract key information from U.S. driver's licenses (all 50 states and District of Columbia) and international passport biographical pages (excluding visas and other travel documents). The API analyzes identity documents, extracts key information, and returns a structured JSON data representation.
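A sketch of pulling a few common fields out of that JSON, assuming the v2.1 response shape (`analyzeResult.documentResults[].fields`, each field exposing a `text` property); the sample payload is invented:

```python
def id_document_summary(analyze_result):
    """Pull a few common prebuilt ID document fields, skipping any that are absent."""
    fields = analyze_result["analyzeResult"]["documentResults"][0]["fields"]
    return {name: fields[name]["text"]
            for name in ("FirstName", "LastName", "DateOfBirth")
            if name in fields}

# Invented sample fragment of an analyze result for a driver's license.
sample_id = {"analyzeResult": {"documentResults": [{
    "fields": {
        "FirstName": {"type": "string", "text": "LIAM"},
        "LastName": {"type": "string", "text": "TALBOT"},
        "DateOfBirth": {"type": "date", "text": "01/06/1958"},
    }}]}}
print(id_document_summary(sample_id))
# {'FirstName': 'LIAM', 'LastName': 'TALBOT', 'DateOfBirth': '01/06/1958'}
```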
***Sample U.S. Driver's License processed with Form Recognizer Studio***

## Development options
-The following resources are supported by Form Recognizer v2.1:
+The following tools are supported by Form Recognizer v2.1:
| Feature | Resources |
|-|-|
|**ID document model**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/prebuilts-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-identity-id-documents)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=id-document#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following resources are supported by Form Recognizer v3.0:
+The following tools are supported by Form Recognizer v3.0:
| Feature | Resources | Model ID |
|-|-|--|
The following resources are supported by Form Recognizer v3.0:
### Try Form Recognizer
-See how data, including name, birth date, machine-readable zone, and expiration date, is extracted from ID documents using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following:
+See how to extract data, including name, birth date, machine-readable zone, and expiration date, from ID documents using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
* An Azure subscription - you can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
See how data, including name, birth date, machine-readable zone, and expiration
#### Sample Labeling tool
-You will need an ID document. You can use our [sample ID document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/DriverLicense.png).
+You'll need an ID document. You can use our [sample ID document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/DriverLicense.png).
1. On the Sample Labeling tool home page, select **Use prebuilt model to get data**.
You will need an ID document. You can use our [sample ID document](https://raw.g
* Supported file formats: JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location.
* For PDF and TIFF, up to 2,000 pages can be processed (with a free tier subscription, only the first two pages are processed).
* The file size must be less than 50 MB.
-* Image dimensions must be between 50 x 50 pixels and 10000 x 10000 pixels.
+* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels.
* PDF dimensions are up to 17 x 17 inches, corresponding to Legal or A3 paper size, or smaller.
* The total size of the training data is 500 pages or less.
* If your PDFs are password-locked, you must remove the lock before submission.
You will need an ID document. You can use our [sample ID document](https://raw.g
* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
## Next steps
applied-ai-services Concept Invoice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-invoice.md
Previously updated : 11/02/2021 Last updated : 02/15/2022 recommendations: false <!-- markdownlint-disable MD033 -->

# Form Recognizer invoice model
- The invoice model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extract key fields and line items from sales invoices. Invoices can be of various formats and quality including phone-captured images, scanned documents, and digital PDFs. The API analyzes invoice text; extracts key information such as customer name, billing address, due date, and amount due; and returns a structured JSON data representation.
+ The invoice model combines powerful Optical Character Recognition (OCR) capabilities with deep learning models to analyze and extract key fields and line items from sales invoices. Invoices can be of various formats and quality including phone-captured images, scanned documents, and digital PDFs. The API analyzes invoice text; extracts key information such as customer name, billing address, due date, and amount due; and returns a structured JSON data representation. The model currently supports both English and Spanish invoices.
**Sample invoice processed with [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)**:
## Development options
-The following resources are supported by Form Recognizer v2.1:
+The following tools are supported by Form Recognizer v2.1:
| Feature | Resources |
|-|-|
|**Invoice model**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/prebuilts-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-invoices)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=invoice#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following resources are supported by Form Recognizer v3.0:
+The following tools are supported by Form Recognizer v3.0:
| Feature | Resources | Model ID |
|-|-|--|
The following resources are supported by Form Recognizer v3.0:
### Try Form Recognizer
-See how data, including customer information, vendor details, and line items, is extracted from invoices using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following:
+See how data, including customer information, vendor details, and line items, is extracted from invoices using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
* An Azure subscription - you can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
See how data, including customer information, vendor details, and line items, is
> [!div class="nextstepaction"]
> [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)
-#### Sample Labeling tool
+#### Sample Labeling tool (API v2.1)
+> [!NOTE]
+> Unless you must use API v2.1, it is strongly suggested that you use the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com) for testing purposes instead of the sample labeling tool.
-You will need an invoice document. You can use our [sample invoice document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/sample-invoice.pdf).
+You'll need an invoice document. You can use our [sample invoice document](https://raw.githubusercontent.com/Azure-Samples/cognitive-services-REST-api-samples/master/curl/form-recognizer/sample-invoice.pdf).
1. On the Sample Labeling tool home page, select **Use prebuilt model to get data**.
You will need an invoice document. You can use our [sample invoice document](htt
* Supported file formats: JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location.
* For PDF and TIFF, up to 2,000 pages can be processed (with a free tier subscription, only the first two pages are processed).
* The file size must be less than 50 MB.
-* Image dimensions must be between 50 x 50 pixels and 10000 x 10000 pixels.
+* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels.
* PDF dimensions are up to 17 x 17 inches, corresponding to Legal or A3 paper size, or smaller.
* The total size of the training data is 500 pages or less.
* If your PDFs are password-locked, you must remove the lock before submission.
You will need an invoice document. You can use our [sample invoice document](htt
| Model | Language - Locale code | Default |
|--|:-|:-|
|Invoice| <ul><li>English (United States) - en-US</li></ul>| English (United States) - en-US|
+|Invoice| <ul><li>Spanish - es</li></ul>| Spanish (United States) - es|
## Field extraction
You will need an invoice document. You can use our [sample invoice document](htt
| InvoiceDate | Date | Date the invoice was issued | yyyy-mm-dd|
| DueDate | Date | Date payment for this invoice is due | yyyy-mm-dd|
| VendorName | String | Vendor name | |
+| VendorTaxId | String | The taxpayer number associated with the vendor | |
| VendorAddress | String | Vendor mailing address| |
| VendorAddressRecipient | String | Name associated with the VendorAddress | |
| CustomerAddress | String | Mailing address for the Customer | |
+| CustomerTaxId | String | The taxpayer number associated with the customer | |
| CustomerAddressRecipient | String | Name associated with the CustomerAddress | |
| BillingAddress | String | Explicit billing address for the customer | |
| BillingAddressRecipient | String | Name associated with the BillingAddress | |
| ShippingAddress | String | Explicit shipping address for the customer | |
| ShippingAddressRecipient | String | Name associated with the ShippingAddress | |
+| PaymentTerm | String | The terms of payment for the invoice | |
| SubTotal | Number | Subtotal field identified on this invoice | Integer |
| TotalTax | Number | Total tax field identified on this invoice | Integer |
+| TotalVAT | Number | Total VAT field identified on this invoice | Integer |
| InvoiceTotal | Number (USD) | Total new charges associated with this invoice | Integer |
| AmountDue | Number (USD) | Total Amount Due to the vendor | Integer |
| ServiceAddress | String | Explicit service address or property address for the customer | |
Following are the line items extracted from an invoice in the JSON output respon
| UnitPrice | Number | The net or gross price (depending on the gross invoice setting of the invoice) of one unit of this item | $30.00 | 30 |
| ProductCode | String| Product code, product number, or SKU associated with the specific line item | A123 | |
| Unit | String| The unit of the line item, e.g., kg, lb etc. | Hours | |
-| Date | Date| Date corresponding to each line item. Often it is a date the line item was shipped | 3/4/2021| 2021-03-04 |
+| Date | Date| Date corresponding to each line item. Often it's a date the line item was shipped | 3/4/2021| 2021-03-04 |
| Tax | Number | Tax associated with each line item. Possible values include tax amount, tax %, and tax Y/N | 10% | |
+| VAT | Number | Value-added tax, a flat tax levied on an item; common in European countries | &euro;20.00 | |
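The extracted fields land in the `documentResults` section of the v2.1 JSON response. A sketch of reading a few header fields, assuming each field exposes a `text` property and tolerating fields that a given invoice doesn't contain; the sample payload is invented:

```python
def invoice_summary(analyze_result):
    """Read a few invoice header fields; fields missing from a given invoice become None."""
    fields = analyze_result["analyzeResult"]["documentResults"][0]["fields"]

    def text(name):
        field = fields.get(name)
        return field.get("text") if field else None

    return {"vendor": text("VendorName"),
            "total": text("InvoiceTotal"),
            "vat": text("TotalVAT")}

# Invented sample: an invoice with a VAT total but no vendor name detected.
sample_invoice = {"analyzeResult": {"documentResults": [{
    "fields": {
        "InvoiceTotal": {"type": "number", "text": "\u20ac110.00"},
        "TotalVAT": {"type": "number", "text": "\u20ac20.00"},
    }}]}}
print(invoice_summary(sample_invoice))
# {'vendor': None, 'total': '€110.00', 'vat': '€20.00'}
```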
The invoice key-value pairs and line items extracted are in the `documentResults` section of the JSON output.

## Form Recognizer preview v3.0
- The Form Recognizer preview introduces several new features and capabilities.
+ The Form Recognizer preview introduces several new features, capabilities, and AI quality improvements to underlying technologies.
* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API (preview)**](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
## Next steps
The invoice key-value pairs and line items extracted are in the `documentResults
> [Form Recognizer quickstart](quickstarts/try-sdk-rest-api.md)

* Explore our REST API:
+ > [!div class="nextstepaction"]
+ > [Form Recognizer API v3.0 (Preview)](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)
+
> [!div class="nextstepaction"]
> [Form Recognizer API v2.1](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/5ed8c9843c2794cbb1a96291)
applied-ai-services Concept Layout https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-layout.md
# Form Recognizer layout model
-Azure the Form Recognizer Layout API extracts text, tables, selection marks, and structure information from documents (PDF, TIFF) and images (JPG, PNG, BMP). The layout model combines an enhanced version of our powerful [Optical Character Recognition (OCR)](../../cognitive-services/computer-vision/overview-ocr.md) capabilities with deep learning models to extract text, tables, selection marks, and document structure.
+The Form Recognizer Layout API extracts text, tables, selection marks, and structure information from documents (PDF, TIFF) and images (JPG, PNG, BMP).
***Sample form processed with [Form Recognizer Sample Labeling tool](https://fott-2-1.azurewebsites.net/) layout feature***
Azure the Form Recognizer Layout API extracts text, tables, selection marks, and
## Development options
-The following resources are supported by Form Recognizer v2.1:
+The following tools are supported by Form Recognizer v2.1:
| Feature | Resources |
|-|-|
|**Layout API**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/layout-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-layout)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?branch=main&tabs=layout#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following resources are supported by Form Recognizer v3.0:
+The following tools are supported by Form Recognizer v3.0:
| Feature | Resources | Model ID |
|-|-|-|
The following resources are supported by Form Recognizer v3.0:
### Try Form Recognizer
-See how data, including tables, check boxes, and text, is extracted from forms and documents using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following:
+See how data, including tables, check boxes, and text, is extracted from forms and documents using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
* An Azure subscription - you can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
See how data, including tables, check boxes, and text, is extracted from forms a
***Sample form processed with [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/layout)***

1. On the Form Recognizer Studio home page, select **Layout**
You'll need a form document. You can use our [sample form document](https://raw.
* For PDF and TIFF, up to 2,000 pages can be processed (with a free tier subscription, only the first two pages are processed).
* The file size must be less than 50 MB.
* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels.
-* PDF dimensions are up to 17 x 17 inches, corresponding to Legal or A3 paper size, or smaller.
-* The total size of the training data is 500 pages or less.
-* If your PDFs are password-locked, you must remove the lock before submission.
-* For unsupervised learning (without labeled data):
- * Data must contain keys and values.
- * Keys must appear above or to the left of the values; they can't appear below or to the right.
> [!NOTE]
> The [Sample Labeling tool](https://fott-2-1.azurewebsites.net/) does not support the BMP file format. This is a limitation of the tool, not the Form Recognizer service.

## Supported languages and locales
- Form Recognizer preview version introduces additional language support for the layout model. *See* our [Language Support](language-support.md#layout-and-custom-model) for a complete list of supported handwritten and printed text.
+ Form Recognizer preview version introduces additional language support for the layout model. *See* our [Language Support](language-support.md) for a complete list of supported handwritten and printed languages.
## Features
Layout API extracts text from documents and images with multiple text angles and
### Natural reading order for text lines (Latin only)
-You can specify the order in which the text lines are output with the `readingOrder` query parameter. Use `natural` for a more human-friendly reading order output as shown in the following example. This feature is only supported for Latin languages.
+In Form Recognizer v2.1, you can specify the order in which the text lines are output with the `readingOrder` query parameter. Use `natural` for a more human-friendly reading order output as shown in the following example. This feature is only supported for Latin languages.
+In Form Recognizer v3.0, the natural reading order output is used by the service in all cases. Therefore, there is no `readingOrder` parameter provided in this version.
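For v2.1, the parameter travels as a query string on the analyze request. A sketch of composing the URL, assuming the v2.1 layout path (`/formrecognizer/v2.1/layout/analyze`) and a placeholder endpoint:

```python
from urllib.parse import urlencode

def layout_analyze_url(endpoint, reading_order="natural"):
    """Compose a v2.1 layout analyze URL carrying the readingOrder parameter."""
    query = urlencode({"readingOrder": reading_order})
    return f"{endpoint}/formrecognizer/v2.1/layout/analyze?{query}"

url = layout_analyze_url("https://contoso.cognitiveservices.azure.com")
# .../formrecognizer/v2.1/layout/analyze?readingOrder=natural
```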
### Handwritten classification for text lines (Latin only)
-The response includes classifying whether each text line is of handwriting style or not, along with a confidence score. This feature is only supported for Latin languages. The following example shows the handwritten classification for the text in the image.
-
+The response includes classifying whether each text line is of handwriting style or not, along with a confidence score. This feature is only supported for Latin languages.
### Select page numbers or ranges for text extraction
-For large multi-page documents, use the `pages` query parameter to indicate specific page numbers or page ranges for text extraction. The following example shows a document with 10 pages, with text extracted for both cases - all pages (1-10) and selected pages (3-6).
-
+For large multi-page documents, use the `pages` query parameter to indicate specific page numbers or page ranges for text extraction.
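The `pages` value accepts individual numbers and ranges, for example `1,3-6`. A small helper, shown here only to illustrate the syntax the parameter accepts:

```python
def expand_pages(pages):
    """Expand a pages query value like '1,3-6' into the individual page numbers."""
    result = []
    for part in pages.split(","):
        if "-" in part:
            start, end = map(int, part.split("-"))
            result.extend(range(start, end + 1))
        else:
            result.append(int(part))
    return result

expand_pages("1,3-6")  # [1, 3, 4, 5, 6]
```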
## Form Recognizer preview v3.0
For large multi-page documents, use the `pages` query parameter to indicate spec
* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
## Next steps
applied-ai-services Concept Model Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-model-overview.md
Title: Form Recognizer models
-description: Concepts encompassing data extraction and analysis using prebuilt models
+description: Concepts encompassing data extraction and analysis using prebuilt models.
Previously updated : 10/07/2021 Last updated : 02/15/2022 recommendations: false
# Form Recognizer models
- Azure Form Recognizer prebuilt models enable you to add intelligent form processing to your apps and flows without having to train and build your own models. Prebuilt models use optical character recognition (OCR) combined with deep learning models to identify and extract predefined text and data fields common to specific form and document types. Form Recognizer extracts analyzes form and document data then returns an organized, structured JSON response. Form Recognizer v2.1 supports invoice, receipt, ID document, and business card models.
+Azure Form Recognizer prebuilt models enable you to add intelligent document processing to your apps and flows without having to train and build your own models. Prebuilt models use optical character recognition (OCR) combined with deep learning models to identify and extract predefined text and data fields common to specific form and document types. Form Recognizer analyzes form and document data, then returns an organized, structured JSON response. Form Recognizer v2.1 supports invoice, receipt, ID document, and business card models.
## Model overview

| **Model** | **Description** |
|--|--|
+| 🆕[Read (preview)](#read-preview) | Extract text lines, words, their locations, detected languages, and handwritten style if detected. |
| 🆕[General document (preview)](#general-document-preview) | Extract text, tables, structure, key-value pairs, and named entities. |
| [Layout](#layout) | Extracts text and layout information from documents. |
-| [Invoice](#invoice) | Extract key information from English invoices. |
+| [Invoice](#invoice) | Extract key information from English and Spanish invoices. |
| [Receipt](#receipt) | Extract key information from English receipts. |
| [ID document](#id-document) | Extract key information from US driver licenses and international passports. |
| [Business card](#business-card) | Extract key information from English business cards. |
| [Custom](#custom) | Extract data from forms and documents specific to your business. Custom models are trained for your distinct data and use cases. |
+### Read (preview)
++
+The Read API analyzes and extracts text lines, words, their locations, detected languages, and handwritten style if detected.
+
+***Sample document processed using the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/read)***:
++
+> [!div class="nextstepaction"]
+> [Learn more: read model](concept-read.md)
+### General document (preview)

:::image type="content" source="media/studio/general-document.png" alt-text="Screenshot: Studio general document icon.":::
-* The general document API supports most form types and will analyze your documents and associate values to keys and entries to tables that it discovers. It is ideal for extracting common key-value pairs from documents. You can use the general document model as an alternative to [training a custom model without labels](compose-custom-models.md#train-without-labels).
+* The general document API supports most form types and will analyze your documents and associate values to keys and entries to tables that it discovers. It's ideal for extracting common key-value pairs from documents. You can use the general document model as an alternative to training a custom model without labels.
* The general document is a pre-trained model and can be directly invoked via the REST API. * The general document model supports named entity recognition (NER) for several entity categories. NER is the ability to identify different entities in text and categorize them into pre-defined classes or types such as: person, location, event, product, and organization. Extracting entities can be useful in scenarios where you want to validate extracted values. The entities are extracted from the entire content.
-***Sample document processed in the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=document)***:
+***Sample document processed using the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/document)***:
:::image type="content" source="media/studio/general-document-analyze.png" alt-text="Screenshot: general document analysis in the Form Recognizer Studio.":::
The Layout API analyzes and extracts text, tables and headers, selection marks, and structure information from forms and documents.
-***Sample form processed with [Form Recognizer Sample Labeling tool](https://fott-2-1.azurewebsites.net/) layout feature***:
+***Sample form processed using the [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/layout)***:
> [!div class="nextstepaction"]
> [Learn more: layout model](concept-layout.md)
The Layout API analyzes and extracts text, tables and headers, selection marks,
:::image type="content" source="media/studio/invoice.png" alt-text="Screenshot: Studio invoice icon.":::
-The invoice model analyzes and extracts key information from sales invoices. The API analyzes invoices in various formats and extracts key information such as customer name, billing address, due date, and amount due.
+The invoice model analyzes and extracts key information from sales invoices. The API analyzes invoices in various formats and extracts key information such as customer name, billing address, due date, and amount due. Currently, the model supports both English and Spanish invoices.
-***Sample invoice processed with [Form Recognizer Sample Labeling tool](https://fott-2-1.azurewebsites.net/)***:
+***Sample invoice processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)***:
> [!div class="nextstepaction"]
> [Learn more: invoice model](concept-invoice.md)
The invoice model analyzes and extracts key information from sales invoices. The
The receipt model analyzes and extracts key information from sales receipts. The API analyzes printed and handwritten receipts and extracts key information such as merchant name, merchant phone number, transaction date, tax, and transaction total.
-***Sample receipt processed with [Form Recognizer Sample Labeling tool](https://fott-2-1.azurewebsites.net/)***:
+***Sample receipt processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)***:
> [!div class="nextstepaction"]
> [Learn more: receipt model](concept-receipt.md)
The receipt model analyzes and extracts key information from sales receipts. The
:::image type="content" source="media/studio/id-document.png" alt-text="Screenshot: Studio identity document icon.":::
-The ID document model analyzes and extracts key information from U.S. Driver's Licenses (all 50 states and District of Columbia) and international passport biographical pages (excluding visa and other travel documents). The API analyzes identity documents and extracts key information such as first name, last name, address, and date of birth.
+The ID document model analyzes and extracts key information from U.S. Driver's Licenses (all 50 states and District of Columbia) and biographical pages from international passports (excluding visa and other travel documents). The API analyzes identity documents and extracts key information such as first name, last name, address, and date of birth.
-***Sample U.S. Driver's License processed with [Form Recognizer Sample Labeling tool](https://fott-2-1.azurewebsites.net/)***:
+***Sample U.S. Driver's License processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)***:
> [!div class="nextstepaction"]
> [Learn more: identity document model](concept-id-document.md)
The ID document model analyzes and extracts key information from U.S. Driver's L
:::image type="content" source="media/studio/business-card.png" alt-text="Screenshot: Studio business card icon.":::
-The business card model analyzes and extracts key information from business card images. The API analyzes printed business card images and extracts key information such as first name, last name, company name, email address, and phone number.
+The business card model analyzes and extracts key information from business card images. The API analyzes printed business card images and extracts key information such as first name, last name, company name, email address, and phone number.
-***Sample business card processed with [Form Recognizer Sample Labeling tool](https://fott-2-1.azurewebsites.net/)***:
+***Sample business card processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)***:
> [!div class="nextstepaction"]
> [Learn more: business card model](concept-business-card.md)
The business card model analyzes and extracts key information from business card
The custom model analyzes and extracts data from forms and documents specific to your business. The API is a machine-learning program trained to recognize form fields within your distinct content and extract key-value pairs and table data. You only need five examples of the same form type to get started and your custom model can be trained with or without labeled datasets.
-***Sample custom form processed with [Form Recognizer Sample Labeling tool](https://fott-2-1.azurewebsites.net/)***:
+***Sample custom template processed using [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/customform/projects)***:
> [!div class="nextstepaction"]
> [Learn more: custom model](concept-custom.md)
The custom model analyzes and extracts data from forms and documents specific to
| **Model** | **Text extraction** |**Key-Value pairs** |**Fields**|**Selection Marks** | **Tables** |**Entities** |
|---|:-:|:-:|:-:|:-:|:-:|:-:|
- |🆕General document | ✓ | ✓ || ✓ | ✓ | ✓ |
+ |🆕Read (preview) | ✓ | || | | |
+ |🆕General document (preview) | ✓ | ✓ || ✓ | ✓ | ✓ |
| Layout | ✓ | || ✓ | ✓ | |
| Invoice | ✓ | ✓ |✓| ✓ | ✓ ||
|Receipt | ✓ | ✓ |✓| | ||
The custom model analyzes and extracts data from forms and documents specific to
| Business card | ✓ | ✓ | ✓| | ||
| Custom |✓ | ✓ || ✓ | ✓ | ✓ |
+
## Input requirements

* For best results, provide one clear photo or high-quality scan per document.
* Supported file formats: JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location.
* For PDF and TIFF, up to 2000 pages can be processed (with a free tier subscription, only the first two pages are processed).
* The file size must be less than 50 MB.
-* Image dimensions must be between 50 x 50 pixels and 10000 x 10000 pixels.
+* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels.
* PDF dimensions are up to 17 x 17 inches, corresponding to Legal or A3 paper size, or smaller.
* The total size of the training data is 500 pages or less.
* If your PDFs are password-locked, you must remove the lock before submission.
The custom model analyzes and extracts data from forms and documents specific to
Form Recognizer v3.0 (preview) introduces several new features and capabilities:
+* [**Read (preview)**](concept-read.md) model is a new API that extracts text lines, words, their locations, detected languages, and handwriting style, if detected.
* [**General document (preview)**](concept-general-document.md) model is a new API that uses a pre-trained model to extract text, tables, structure, key-value pairs, and named entities from forms and documents.
* [**Receipt (preview)**](concept-receipt.md) model supports single-page hotel receipt processing.
* [**ID document (preview)**](concept-id-document.md) model supports endorsements, restrictions, and vehicle classification extraction from US driver's licenses.
Learn how to use Form Recognizer v3.0 in your applications by following our [**F
* [Learn how to process your own forms and documents](quickstarts/try-sample-label-tool.md) with our [Form Recognizer sample tool](https://fott-2-1.azurewebsites.net/)
-* Complete a [Form Recognizer quickstart](quickstarts/try-sdk-rest-api.md) and get started creating a form processing app in the development language of your choice.
+* Complete a [Form Recognizer quickstart](quickstarts/try-sdk-rest-api.md) and get started creating a document processing app in the development language of your choice.
applied-ai-services Concept Read https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-read.md
+
+ Title: Read - Form Recognizer
+
+description: Learn concepts related to Read API analysis with Form Recognizer, including usage and limits.
+++++ Last updated : 02/15/2022+
+recommendations: false
+++
+# Form Recognizer read model
+
+The Form Recognizer v3.0 preview includes the new Read API. Read extracts text lines, words, their locations, detected languages, and, where detected, handwriting style from documents (PDF, TIFF) and images (JPG, PNG, BMP).
+
+**Data extraction features**
+
+| **Read model** | **Text Extraction** | **Language detection** |
+| --- | --- | --- |
+| Read | ✓ |✓ |
+
+## Development options
+
+The following resources are supported by Form Recognizer v3.0:
+
+| Feature | Resources | Model ID |
+|-|-|--|
+|**Read model**| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript SDK**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|**prebuilt-read**|
+
+### Try Form Recognizer
+
+See how text is extracted from forms and documents using the Form Recognizer Studio. You'll need the following resources:
+
+* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
+
+* A [Form Recognizer instance](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your API key and endpoint.
+
+ :::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot: keys and endpoint location in the Azure portal.":::
+
+#### Form Recognizer Studio (preview)
+
+> [!NOTE]
+> Form Recognizer Studio is available with the preview (v3.0) API.
+
+***Sample form processed with [Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/read)***
++
+1. On the Form Recognizer Studio home page, select **Read**.
+
+1. You can analyze the sample document or select the **+ Add** button to upload your own sample.
+
+1. Select the **Analyze** button:
+
+ :::image type="content" source="media/studio/form-recognizer-studio-read-analyze-v3p2-updated.png" alt-text="Screenshot: analyze read menu.":::
+
+ > [!div class="nextstepaction"]
+ > [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/layout)
+
+## Input requirements
+
+* For best results, provide one clear photo or high-quality scan per document.
+* Supported file formats: JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location.
+* For PDF and TIFF, up to 2000 pages can be processed (with a free tier subscription, only the first two pages are processed).
+* The file size must be less than 50 MB.
+* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels.
+
+## Supported languages and locales
+
+The Form Recognizer preview supports several languages for the read model. *See* our [Language Support](language-support.md) page for a complete list of supported handwritten and printed languages.
+
+## Features
+
+### Text lines and words
+
+Read API extracts text from documents and images with multiple text angles and colors. It accepts photos of documents, faxes, printed and/or handwritten (English only) text, and mixed modes. Text is extracted with information provided on lines, words, bounding boxes, confidence scores, and style (handwritten or other).
+
+### Language detection (v3.0 preview)
+
+Read API in v3.0 preview 2 adds language detection as a new feature for text lines. Read will try to detect the languages at the text line level and output the language code with the highest confidence score for one or more text lines.
+
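To make the per-line output concrete, here is a minimal sketch of grouping extracted lines by their detected language. The flat `content`/`locale` dict shape used below is a simplified, hypothetical stand-in for illustration only, not the exact Form Recognizer response schema:

```python
from collections import defaultdict

def group_lines_by_language(lines):
    """Group text lines by detected locale.

    `lines` is a simplified list of dicts with 'content' and 'locale'
    keys -- an illustrative shape, not the real analyze-result schema.
    """
    grouped = defaultdict(list)
    for line in lines:
        grouped[line["locale"]].append(line["content"])
    return dict(grouped)

# Hypothetical per-line language-detection output
sample_lines = [
    {"content": "Invoice total: $120.00", "locale": "en"},
    {"content": "Gracias por su compra", "locale": "es"},
    {"content": "Due on receipt", "locale": "en"},
]

print(group_lines_by_language(sample_lines))
```

A downstream step could then route each language group to a locale-specific parser or translator.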
+### Handwritten classification for text lines (Latin only)
+
+The response classifies whether each text line is written in a handwriting style, along with a confidence score. This feature is only supported for Latin languages.
+
+### Select page(s) for text extraction
+
+For large multi-page documents, use the `pages` query parameter to indicate specific page numbers or page ranges for text extraction.
+
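As a sketch, a request URL carrying the `pages` parameter might be assembled like this. The `documentModels/{modelId}:analyze` path and the `api-version` value are assumptions based on the v3.0 preview REST API, so check the REST reference for the exact values:

```python
from urllib.parse import urlencode

def build_analyze_url(endpoint, model_id="prebuilt-read",
                      api_version="2022-01-30-preview", pages=None):
    """Build an analyze request URL, optionally restricting pages.

    Path shape and api-version are assumptions for illustration;
    verify them against the published REST reference.
    """
    params = {"api-version": api_version}
    if pages:  # e.g. "1,3" for specific pages or "2-5" for a range
        params["pages"] = pages
    return (f"{endpoint.rstrip('/')}/formrecognizer/documentModels/"
            f"{model_id}:analyze?{urlencode(params)}")

url = build_analyze_url("https://contoso.cognitiveservices.azure.com", pages="1-3")
print(url)
```

The resulting URL would then be the target of the POST request that submits the document for analysis.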
+## Next steps
+
+* Complete a Form Recognizer quickstart:
+
+ > [!div class="nextstepaction"]
+ > [Form Recognizer quickstart](quickstarts/try-sdk-rest-api.md)
+
+* Explore our REST API:
+
+ > [!div class="nextstepaction"]
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)
applied-ai-services Concept Receipt https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-receipt.md
The receipt model combines powerful Optical Character Recognition (OCR) capabili
## Development options
-The following resources are supported by Form Recognizer v2.1:
+The following tools are supported by Form Recognizer v2.1:
| Feature | Resources |
|-|-|
|**Receipt model**| <ul><li>[**Form Recognizer labeling tool**](https://fott-2-1.azurewebsites.net/prebuilts-analyze)</li><li>[**REST API**](quickstarts/try-sdk-rest-api.md?pivots=programming-language-rest-api#analyze-receipts)</li><li>[**Client-library SDK**](quickstarts/try-sdk-rest-api.md)</li><li>[**Form Recognizer Docker container**](containers/form-recognizer-container-install-run.md?tabs=receipt#run-the-container-with-the-docker-compose-up-command)</li></ul>|
-The following resources are supported by Form Recognizer v3.0:
+The following tools are supported by Form Recognizer v3.0:
| Feature | Resources | Model ID |
|-|-|--|
The following resources are supported by Form Recognizer v3.0:
### Try Form Recognizer
-See how data, including time and date of transactions, merchant information, and amount totals, is extracted from receipts using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following:
+See how data, including time and date of transactions, merchant information, and amount totals, is extracted from receipts using the Form Recognizer Studio or our Sample Labeling tool. You'll need the following resources:
* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
You will need a receipt document. You can use our [sample receipt document](http
* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
-* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
## Next steps
applied-ai-services Concept W2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-w2.md
+
+ Title: Form Recognizer Form W-2 prebuilt-tax model
+
+description: Data extraction and analysis extraction using the prebuilt-tax Form W-2 model
+++++ Last updated : 02/15/2022+
+recommendations: false
++
+# Form Recognizer Form W-2 prebuilt-tax model | Preview
+
+The Form W-2, Wage and Tax Statement, is a US Internal Revenue Service (IRS) tax form completed by employers to report employees' salary, wages, compensation, and taxes withheld. Employers send a W-2 form to each employee on or before January 31 each year and employees use the form to prepare their tax returns.
+
+A W-2 is a multipart form divided into state and federal sections:
+
+* Copy A is sent to the Social Security Administration.
+* Copy 1 is for the city, state, or locality tax assessment.
+* Copy B is for filing with the employee's federal tax return.
+* Copy C is for the employee's records.
+* Copy 2 is another copy for a city, state, or locality tax assessment.
+* Copy D is for the employer's records.
+
+Each W-2 form consists of more than 14 boxes, both numbered and lettered, that detail the employee's income from the previous year. The Form Recognizer **prebuilt-tax** Form W-2 model combines Optical Character Recognition (OCR) with deep learning models to analyze and extract the information reported in each box on a W-2 form. The model supports standard and customized forms from 2018 to the present, including both single and multiple forms (copies A, B, C, D, 1, and 2) on one page.
+
+***Sample W-2 form processed using Form Recognizer Studio***
++
+## Development options
+
+The prebuilt-tax Form W-2 model is supported by Form Recognizer v3.0 with the following tools:
+
+| Feature | Resources | Model ID |
+|-|-|--|
+|**W-2 model**|<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript SDK**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|**prebuilt-tax.us.w2**|
+
+### Try Form Recognizer
+
+See how data, including employee, employer, wage, and tax information is extracted from W-2 forms using the Form Recognizer Studio. You'll need the following resources:
+
+* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/cognitive-services/)
+
+* A [Form Recognizer instance](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) in the Azure portal. You can use the free pricing tier (`F0`) to try the service. After your resource deploys, select **Go to resource** to get your API key and endpoint.
+
+ :::image type="content" source="media/containers/keys-and-endpoint.png" alt-text="Screenshot of keys and endpoint location in the Azure portal.":::
+
+#### Form Recognizer Studio
+
+> [!NOTE]
+> Form Recognizer Studio is available with the v3.0 preview API.
+
+1. On the [Form Recognizer Studio home page](https://formrecognizer.appliedai.azure.com/studio), select **W-2 form**.
+
+1. You can analyze the sample W-2 form or select the **+ Add** button to upload your own sample.
+
+1. Select the **Analyze** button:
+
+ :::image type="content" source="media/studio/w2-analyze.png" alt-text="Screenshot: analyze W-2 window in the Form Recognizer Studio.":::
+
+ > [!div class="nextstepaction"]
+ > [Try Form Recognizer Studio](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2)
+
+## Input requirements
+
+* For best results, provide one clear photo or high-quality scan per document.
+* Supported file formats: JPEG, PNG, BMP, TIFF, and PDF (text-embedded or scanned). Text-embedded PDFs are best to eliminate the possibility of error in character extraction and location.
+* For PDF and TIFF, up to 2000 pages can be processed (with a free tier subscription, only the first two pages are processed).
+* The file size must be less than 50 MB.
+* Image dimensions must be between 50 x 50 pixels and 10,000 x 10,000 pixels.
+* PDF dimensions are up to 17 x 17 inches, corresponding to Legal or A3 paper size, or smaller.
+* The total size of the training data is 500 pages or less.
+* If your PDFs are password-locked, you must remove the lock before submission.
+* For unsupervised learning (without labeled data):
+ * Data must contain keys and values.
+ * Keys must appear above or to the left of the values; they can't appear below or to the right.
+
+## Supported languages and locales
+
+| Model | Language (locale code) | Default |
+|--|:-|:-|
+|prebuilt-tax.us.w2| <ul><li>English (United States)</li></ul> |English (United States), en-US|
+
+## Field extraction
+
+|Name| Box | Type | Description | Standardized output|
+|:--|:-|:-|:-|:-|
+| Employee.SocialSecurityNumber | a | String | Employee's Social Security number (SSN). | 123-45-6789 |
+| Employer.IdNumber | b | String | Employer's ID number (EIN), the business equivalent of a social security number.| 12-1234567 |
+| Employer.Name | c | String | Employer's name. | Contoso |
+| Employer.Address | c | String | Employer's address (with city). | 123 Example Street Sample City, CA |
+| Employer.ZipCode | c | String | Employer's zip code. | 12345 |
+| ControlNumber | d | String | A code identifying the unique W-2 in the employer's records. | R3D1 |
+| Employee.Name | e | String | Full name of the employee. | Henry Ross|
+| Employee.Address | f | String | Employee's address (with city). | 123 Example Street Sample City, CA |
+| Employee.ZipCode | f | String | Employee's zip code. | 12345 |
+| WagesTipsAndOtherCompensation | 1 | Number | A summary of your pay, including wages, tips and other compensation. | 50000 |
+| FederalIncomeTaxWithheld | 2 | Number | Federal income tax withheld. | 1111 |
+| SocialSecurityWages | 3 | Number | Social security wages. | 35000 |
+| SocialSecurityTaxWithheld | 4 | Number | Social security tax withheld. | 1111 |
+| MedicareWagesAndTips | 5 | Number | Medicare wages and tips. | 45000 |
+| MedicareTaxWithheld | 6 | Number | Medicare tax withheld. | 1111 |
+| SocialSecurityTips | 7 | Number | Social security tips. | 1111 |
+| AllocatedTips | 8 | Number | Allocated tips. | 1111 |
+| VerificationCode | 9 | String | Verification Code on Form W-2 | A123-B456-C789-DXYZ |
+| DependentCareBenefits | 10 | Number | Dependent care benefits. | 1111 |
+| NonqualifiedPlans | 11 | Number | Nonqualified plans, a type of employer-sponsored, tax-deferred retirement savings plan. | 1111 |
+| AdditionalInfo | | Array of objects | An array of LetterCode and Amount. | |
+| LetterCode | 12a, 12b, 12c, 12d | String | Letter code. Refer to [IRS/W-2](https://www.irs.gov/pub/irs-prior/fw2--2014.pdf) for the semantics of the code values. | D |
+| Amount | 12a, 12b, 12c, 12d | Number | Amount | 1234 |
+| IsStatutoryEmployee | 13 | String | Whether the StatutoryEmployee box is checked or not. | true |
+| IsRetirementPlan | 13 | String | Whether the RetirementPlan box is checked or not. | true |
+| IsThirdPartySickPay | 13 | String | Whether the ThirdPartySickPay box is checked or not. | false |
+| Other | 14 | String | Other info employers may use this field to report. | |
+| StateTaxInfos | | Array of objects | An array of state tax info including State, EmployerStateIdNumber, StateIncomeTax, StateWagesTipsEtc. | |
+| State | 15 | String | State | CA |
+| EmployerStateIdNumber | 15 | String | Employer state number. | 123-123-1234 |
+| StateWagesTipsEtc | 16 | Number | State wages, tips, etc. | 50000 |
+| StateIncomeTax | 17 | Number | State income tax. | 1535 |
+| LocalTaxInfos | | Array of objects | An array of local income tax info including LocalWagesTipsEtc, LocalIncomeTax, LocalityName. | |
+| LocalWagesTipsEtc | 18 | Number | Local wages, tips, etc. | 50000 |
+| LocalIncomeTax | 19 | Number | Local income tax. | 750 |
+| LocalityName | 20 | String | Locality name. | CLEVELAND |
+| W2Copy | | String | Copy of W-2 forms A, B, C, D, 1, or 2. | Copy A For Social Security Administration |
+| TaxYear | | Number | Tax year. | 2020 |
+| W2FormVariant | | String | The variants of W-2 forms, including "W-2", "W-2AS", "W-2CM", "W-2GU", "W-2VI". | W-2 |
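As an illustration of consuming these fields, the sketch below summarizes a few headline values from an analyzed W-2. The flat `fields` dict is a simplified, hypothetical shape using the names from the table above; the actual analyze result nests values and confidence scores differently:

```python
def summarize_w2(fields):
    """Pull a few headline values out of an analyzed W-2.

    `fields` is a simplified dict of field name -> value, an
    illustrative stand-in for the nested result the prebuilt-tax
    model actually returns.
    """
    # Sum state withholding across every entry in the StateTaxInfos array
    state_tax = sum(s["StateIncomeTax"] for s in fields.get("StateTaxInfos", []))
    return {
        "employee": fields["Employee.Name"],
        "wages": fields["WagesTipsAndOtherCompensation"],
        "federal_withheld": fields["FederalIncomeTaxWithheld"],
        "state_withheld": state_tax,
    }

# Values mirror the examples in the field extraction table
sample_fields = {
    "Employee.Name": "Henry Ross",
    "WagesTipsAndOtherCompensation": 50000,
    "FederalIncomeTaxWithheld": 1111,
    "StateTaxInfos": [{"State": "CA", "StateIncomeTax": 1535}],
}

print(summarize_w2(sample_fields))
```

A real integration would read these values from the SDK's analyze result object rather than a hand-built dict.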
++
+### Migration guide and REST API v3.0
+
+* Follow our [**Form Recognizer v3.0 migration guide**](v3-migration-guide.md) to learn how to use the preview version in your applications and workflows.
+
+* Explore our [**REST API (preview)**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) to learn more about the preview version and new capabilities.
+
+## Next steps
+
+* Complete a Form Recognizer quickstart:
+
+ > [!div class="nextstepaction"]
+ > [Form Recognizer quickstart](quickstarts/try-sdk-rest-api.md)
+
+* Explore our REST API:
+
+ > [!div class="nextstepaction"]
+ > [Form Recognizer API v3.0](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument)
applied-ai-services Form Recognizer Container Install Run https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/containers/form-recognizer-container-install-run.md
http {
* On the left pane of the tool, select the connections tab.
* Select to create a new project and give it a name and description.
* For the provider, choose the local file system option. For the local folder, make sure you enter the path to the folder where you stored the sample data files.
-* Navigate back to the home tab and select the "Use custom to train a model with labels and key value pairs option".
+* Navigate back to the home tab and select the "Use custom to train a model with labels and key-value pairs" option.
* Select the train button on the left pane to train the labeled model.
* Save this connection and use it to label your requests.
* You can choose to analyze the file of your choice against the trained model.
$docker-compose up
* On the left pane of the tool, select the **connections** tab.
* Select **create a new project** and give it a name and description.
* For the provider, choose the **local file system** option. For the local folder, make sure you enter the path to the folder where you stored the **sample data** files.
-* Navigate back to the home tab and select **Use custom to train a model with labels and key value pairs**.
+* Navigate back to the home tab and select **Use custom to train a model with labels and key-value pairs**.
* Select the **train button** on the left pane to train the labeled model.
* **Save** this connection and use it to label your requests.
* You can choose to analyze the file of your choice against the trained model.
applied-ai-services Create A Form Recognizer Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/create-a-form-recognizer-resource.md
recommendations: false
# Create a Form Recognizer resource
-Azure Form Recognizer is a cloud-based [Azure Applied AI Service](../../applied-ai-services/index.yml) that uses machine-learning models to extract and analyze form fields, text, and tables from your documents. Here, you'll learn how to create a Form Recognizer resource in the Azure portal.
+Azure Form Recognizer is a cloud-based [Azure Applied AI Service](../../applied-ai-services/index.yml) that uses machine-learning models to extract key-value pairs, text, and tables from your documents. Here, you'll learn how to create a Form Recognizer resource in the Azure portal.
## Visit the Azure portal
That's it! You're now ready to start automating data extraction using Azure Form
* Try the [Form Recognizer Studio](concept-form-recognizer-studio.md), an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service into your applications.
-* Complete a Form Recognizer [C#](quickstarts/try-v3-csharp-sdk.md),[Python](quickstarts/try-v3-python-sdk.md), [Java](quickstarts/try-v3-java-sdk.md), or [JavaScript](quickstarts/try-v3-javascript-sdk.md) quickstart and get started creating a form processing app in the development language of your choice.
+* Complete a Form Recognizer [C#](quickstarts/try-v3-csharp-sdk.md),[Python](quickstarts/try-v3-python-sdk.md), [Java](quickstarts/try-v3-java-sdk.md), or [JavaScript](quickstarts/try-v3-javascript-sdk.md) quickstart and get started creating a document processing app in the development language of your choice.
applied-ai-services Deploy Label Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/deploy-label-tool.md
Previously updated : 07/02/2021 Last updated : 02/15/2022 - # Deploy the Sample Labeling tool
+> [!NOTE]
+> The cloud-hosted Sample Labeling tool is available at [https://fott-2-1.azurewebsites.net/](https://fott-2-1.azurewebsites.net/). Follow the steps in this document only if you want to deploy the Sample Labeling tool for yourself.
+ The Form Recognizer Sample Labeling tool is an application that provides a simple user interface (UI), which you can use to manually label forms (documents) for supervised learning. In this article, we'll provide links and instructions that teach you how to: * [Run the Sample Labeling tool locally](#run-the-sample-labeling-tool-locally)
applied-ai-services Try Sdk Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/how-to-guides/try-sdk-rest-api.md
Title: "Use Form Recognizer client library SDKs or REST API"
-description: Use a Form Recognizer client library SDK or REST API to create a forms processing app that extracts key/value pairs and table data from your custom documents.
+description: How to use the Form Recognizer client libraries or REST API to create apps that extract key-value pairs and table data from your custom documents.
Previously updated : 11/02/2021 Last updated : 02/01/2022 zone_pivot_groups: programming-languages-set-formre recommendations: false
# Use Form Recognizer SDKs or REST API
- In this how-to guide, you'll learn how to add Form Recognizer to your applications and workflows using an SDK, in a programming language of your choice, or the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract and analyze form fields, text, and tables from your documents. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+ In this how-to guide, you'll learn how to add Form Recognizer to your applications and workflows using an SDK, in a programming language of your choice, or the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
You'll use the following APIs to extract structured data from forms and documents:
applied-ai-services Label Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/label-tool.md
Title: "How-to: Analyze documents, Label forms, train a model, and analyze forms with Form Recognizer"
-description: In this how-to, you'll use the Form Recognizer sample tool to analyze documents, invoices, receipts etc. Label and create a custom model to extract text, tables, selection marks, structure and key value pairs from documents.
+description: In this how-to, you'll use the Form Recognizer sample tool to analyze documents such as invoices and receipts, then label the data and create a custom model to extract text, tables, selection marks, structure, and key-value pairs from documents.
keywords: document processing
<!-- markdownlint-disable MD034 --> # Train a custom model using the Sample Labeling tool
-In this article, you'll use the Form Recognizer REST API with the Sample Labeling tool to train a custom document processing model with manually labeled data.
+In this article, you'll use the Form Recognizer REST API with the Sample Labeling tool to train a custom model with manually labeled data.
> [!VIDEO https://docs.microsoft.com/Shows/Docs-Azure/Azure-Form-Recognizer/player]
applied-ai-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/language-support.md
<!-- markdownlint-disable MD001 --> <!-- markdownlint-disable MD024 -->
-## Layout and custom model
+## Read, Layout, and Custom form (template) model
-The following lists cover the currently GA languages in the the 2.1 version and new previews in the 3.0 preview version of Form Recognizer. These languages are supported by Layout and Custom models. The preview release may include enhancements to the currently GA languages.
+
+The following lists include the currently GA languages in the 2.1 version and the new ones in the most recent 3.0 preview. These languages are supported by the Read, Layout, and Custom form (template) model features.
> [!NOTE]
> **Language code optional**
>
> Form Recognizer's deep learning based universal models extract all multi-lingual text in your documents, including text lines with mixed languages, and do not require specifying a language code. Do not provide the language code as the parameter unless you are sure about the language and want to force the service to apply only the relevant model. Otherwise, the service may return incomplete and incorrect text.
-To use the preview languages in Layout and custom models, refer to the [v3.0 REST API migration guide](/rest/api/medi).
+To use the preview languages, refer to the [v3.0 REST API migration guide](/rest/api/medi).
### Handwritten languages
-The following table lists the handwritten languages supported by Form Recognizer's Layout and Custom model features.
-
-|Language| Language code (optional) | Preview? |
-|:--|:-:|:-:|
-|English|`en`||
-|Chinese Simplified |`zh-Hans`| preview
-|French |`fr`| preview
-|German |`de`| preview
-|Italian|`it`| preview
-|Portuguese |`pt`| preview
-|Spanish |`es`| preview
-
-### Print languages
-The following table lists the print languages supported by Form Recognizer's Layout and Custom model features.
-
-|Language| Language code (optional) | Preview? |
-|:--|:-:|:-:|
-|Afrikaans|`af`||
-|Albanian |`sq`||
-|Asturian |`ast`| |
-|Azerbaijani (Latin) | `az` | preview |
-|Basque |`eu`| |
-|Belarusian (Cyrillic) | `be` | preview |
-|Belarusian (Latin) | `be` | preview |
-|Bislama |`bi`| |
-|Bosnian (Latin) |`bs`| preview |
-|Breton |`br`| |
-|Bulgarian |`bg`| preview |
-|Buryat (Cyrillic)|`bua`| preview |
-|Catalan |`ca`| |
-|Cebuano |`ceb`| |
-|Chamorro |`ch`| |
-|Chinese Simplified | `zh-Hans`| |
-|Chinese Traditional | `zh-Hant`| |
-|Cornish |`kw`| |
-|Corsican |`co`| |
-|Crimean Tatar (Latin)|`crh`| |
-|Croatian |`hr`| preview |
-|Czech | `cs` | |
-|Danish | `da` | |
-|Dutch | `nl` | |
-|English | `en` | |
-|Erzya (Cyrillic) |`myv`| preview |
-|Estonian |`et`| |
-|Faroese |`fo`| preview |
-|Fijian |`fj`| |
-|Filipino |`fil`| |
-|Finnish | `fi` | |
-|French | `fr` | |
-|Friulian | `fur` | |
-|Gagauz (Latin) |`gag`| preview |
-|Galician | `gl` | |
-|German | `de` | |
-|Gilbertese | `gil` | |
-|Greenlandic | `kl` | |
-|Haitian Creole | `ht` | |
-|Hani | `hni` | |
-|Hawaiian |`haw`| preview |
-|Hmong Daw (Latin)| `mww` | |
-|Hungarian | `hu` | |
-|Icelandic |`is`| preview |
-|Inari Sami |`smn`| preview |
-|Indonesian | `id` | |
-|Interlingua | `ia` | |
-|Inuktitut (Latin) | `iu` | |
-|Irish | `ga` | |
-|Italian | `it` | |
-|Japanese | `ja` | |
-|Javanese | `jv` | |
-|K'iche' | `quc` | |
-|Kabuverdianu | `kea` | |
-|Kachin (Latin) | `kac` | |
-|Karachay-Balkar |`krc`| preview |
-|Kara-Kalpak (Latin) | `kaa` | |
-|Kara-Kalpak (Cyrillic) | `kaa-cyrl` | preview |
-|Kashubian | `csb` | |
-|Kazakh (Cyrillic) |`kk-cyrl`| preview |
-|Kazakh (Latin) |`kk-latn`| preview |
-|Khasi | `kha` | |
-|Korean | `ko` | |
-|Koryak |`kpy`| preview |
-|Kosraean |`kos`| preview |
-|Kumyk (Cyrillic) |`kum`| preview |
-|Kurdish (Latin)| `ku` | |
-|Kyrgyz (Cyrillic) |`ky`| preview |
-|Lakota |`lkt`| preview |
-|Latin|`la`| preview |
-|Lithuanian|`lt`| preview |
-|Lower Sorbian|`dsb`| preview |
-|Lule Sami|`smj`| preview |
-|Luxembourgish | `lb` | |
-|Malay (Latin) | `ms` | |
-|Maltese|`mt`| preview |
-|Manx | `gv` | |
-|Maori|`mi`| preview |
-|Mongolian (Cyrillic)|`mn`| preview |
-|Montenegrin (Cyrillic)|`cnr-cyrl`| preview |
-|Montenegrin (Latin)|`cnr-latn`| preview |
-|Neapolitan | `nap` | |
-|Niuean|`niu`| preview |
-|Nogay|`nog`| preview |
-|Northern Sami (Latin)|`sme`| preview |
-|Norwegian | `no` | |
-|Occitan | `oc` | |
-|Ossetic|`os`| preview |
-|Polish | `pl` | |
-|Portuguese | `pt` | |
-|Ripuarian|`ksh`| preview |
-|Romanian | `ro` | preview |
-|Romansh | `rm` | |
-|Russian | `ru` | preview |
-|Samoan (Latin)|`sm`| preview |
-|Scots | `sco` | |
-|Scottish Gaelic | `gd` | |
-|Serbian (Latin) | `sr-latn` | preview |
-|Skolt Sami|`sms`| preview |
-|Slovak | `sk` | preview |
-|Slovenian | `sl` | |
-|Southern Sami|`sma`| preview |
-|Spanish | `es` | |
-|Swahili (Latin) | `sw` | |
-|Swedish | `sv` | |
-|Tajik (Cyrillic)|`tg`| preview |
-|Tatar (Latin) | `tt` | |
-|Tetum | `tet` | |
-|Tongan|`to`|(preview) |
-|Turkish | `tr` | |
-|Turkmen (Latin)|`tk`| preview |
-|Tuvan|`tyv`| preview |
-|Upper Sorbian | `hsb` | |
-|Uzbek (Cyrillic) | `uz-cyrl` | |
-|Uzbek (Latin) | `uz` | |
-|Volapük | `vo` | |
-|Walser | `wae` | |
-|Welsh | `cy` | preview |
-|Western Frisian | `fy` | |
-|Yucatec Maya | `yua` | |
-|Zhuang | `za` | |
-|Zulu | `zu` | |
+
+The following table lists the handwritten languages.
+
+|Language| Language code (optional) | Language| Language code (optional) |
+|:--|:-:|:--|:-:|
+|English|`en`|Japanese (preview) |`ja`|
+|Chinese Simplified (preview) |`zh-Hans`|Korean (preview)|`ko`|
+|French (preview) |`fr`|Portuguese (preview)|`pt`|
+|German (preview) |`de`|Spanish (preview) |`es`|
+|Italian (preview) |`it`|
+
+### Print languages (preview)
+
+This section lists the supported languages in the latest preview.
+
+|Language| Code (optional) |Language| Code (optional) |
+|:--|:-:|:--|:-:|
+|Angika (Devanagari) | `anp`|Lakota | `lkt`
+|Arabic | `ar`|Latin | `la`
+|Awadhi-Hindi (Devanagari) | `awa`|Lithuanian | `lt`
+|Azerbaijani (Latin) | `az`|Lower Sorbian | `dsb`
+|Bagheli | `bfy`|Lule Sami | `smj`
+|Belarusian (Cyrillic) | `be`, `be-cyrl`|Mahasu Pahari (Devanagari) | `bfz`
+|Belarusian (Latin) | `be`, `be-latn`|Maltese | `mt`
+|Bhojpuri-Hindi (Devanagari) | `bho`|Malto (Devanagari) | `kmj`
+|Bodo (Devanagari) | `brx`|Maori | `mi`
+|Bosnian (Latin) | `bs`|Marathi | `mr`
+|Brajbha | `bra`|Mongolian (Cyrillic) | `mn`
+|Bulgarian | `bg`|Montenegrin (Cyrillic) | `cnr-cyrl`
+|Bundeli | `bns`|Montenegrin (Latin) | `cnr-latn`
+|Buryat (Cyrillic) | `bua`|Nepali | `ne`
+|Chamling | `rab`|Niuean | `niu`
+|Chhattisgarhi (Devanagari)| `hne`|Nogay | `nog`
+|Croatian | `hr`|Northern Sami (Latin) | `sme`
+|Dari | `prs`|Ossetic | `os`
+|Dhimal (Devanagari) | `dhi`|Pashto | `ps`
+|Dogri (Devanagari) | `doi`|Persian | `fa`
+|Erzya (Cyrillic) | `myv`|Punjabi (Arabic) | `pa`
+|Faroese | `fo`|Ripuarian | `ksh`
+|Gagauz (Latin) | `gag`|Romanian | `ro`
+|Gondi (Devanagari) | `gon`|Russian | `ru`
+|Gurung (Devanagari) | `gvr`|Sadri (Devanagari) | `sck`
+|Halbi (Devanagari) | `hlb`|Samoan (Latin) | `sm`
+|Haryanvi | `bgc`|Sanskrit (Devanagari) | `sa`
+|Hawaiian | `haw`|Santali (Devanagari) | `sat`
+|Hindi | `hi`|Serbian (Latin) | `sr`, `sr-latn`
+|Ho (Devanagari) | `hoc`|Sherpa (Devanagari) | `xsr`
+|Icelandic | `is`|Sirmauri (Devanagari) | `srx`
+|Inari Sami | `smn`|Skolt Sami | `sms`
+|Jaunsari (Devanagari) | `Jns`|Slovak | `sk`
+|Kangri (Devanagari) | `xnr`|Somali (Arabic) | `so`
+|Karachay-Balkar | `krc`|Southern Sami | `sma`
+|Kara-Kalpak (Cyrillic) | `kaa-cyrl`|Tajik (Cyrillic) | `tg`
+|Kazakh (Cyrillic) | `kk-cyrl`|Thangmi | `thf`
+|Kazakh (Latin) | `kk-latn`|Tongan | `to`
+|Khaling | `klr`|Turkmen (Latin) | `tk`
+|Korku | `kfq`|Tuvan | `tyv`
+|Koryak | `kpy`|Urdu | `ur`
+|Kosraean | `kos`|Uyghur (Arabic) | `ug`
+|Kumyk (Cyrillic) | `kum`|Uzbek (Arabic) | `uz-arab`
+|Kurdish (Arabic) | `ku-arab`|Uzbek (Cyrillic) | `uz-cyrl`
+|Kurukh (Devanagari) | `kru`|Welsh | `cy`
+|Kyrgyz (Cyrillic) | `ky`
+
+### Print languages (GA)
+
+This section lists the supported languages in the latest GA version.
+
+|Language| Code (optional) |Language| Code (optional) |
+|:--|:-:|:--|:-:|
+|Afrikaans|`af`|Japanese | `ja` |
+|Albanian |`sq`|Javanese | `jv` |
+|Asturian |`ast`|K'iche' | `quc` |
+|Basque |`eu`|Kabuverdianu | `kea` |
+|Bislama |`bi`|Kachin (Latin) | `kac` |
+|Breton |`br`|Kara-Kalpak (Latin) | `kaa` |
+|Catalan |`ca`|Kashubian | `csb` |
+|Cebuano |`ceb`|Khasi | `kha` |
+|Chamorro |`ch`|Korean | `ko` |
+|Chinese Simplified | `zh-Hans`|Kurdish (Latin) | `ku-latn`
+|Chinese Traditional | `zh-Hant`|Luxembourgish | `lb` |
+|Cornish |`kw`|Malay (Latin) | `ms` |
+|Corsican |`co`|Manx | `gv` |
+|Crimean Tatar (Latin)|`crh`|Neapolitan | `nap` |
+|Czech | `cs` |Norwegian | `no` |
+|Danish | `da` |Occitan | `oc` |
+|Dutch | `nl` |Polish | `pl` |
+|English | `en` |Portuguese | `pt` |
+|Estonian |`et`|Romansh | `rm` |
+|Fijian |`fj`|Scots | `sco` |
+|Filipino |`fil`|Scottish Gaelic | `gd` |
+|Finnish | `fi` |Slovenian | `sl` |
+|French | `fr` |Spanish | `es` |
+|Friulian | `fur` |Swahili (Latin) | `sw` |
+|Galician | `gl` |Swedish | `sv` |
+|German | `de` |Tatar (Latin) | `tt` |
+|Gilbertese | `gil` |Tetum | `tet` |
+|Greenlandic | `kl` |Turkish | `tr` |
+|Haitian Creole | `ht` |Upper Sorbian | `hsb` |
+|Hani | `hni` |Uzbek (Latin) | `uz` |
+|Hmong Daw (Latin)| `mww` |Volapük | `vo` |
+|Hungarian | `hu` |Walser | `wae` |
+|Indonesian | `id` |Western Frisian | `fy` |
+|Interlingua | `ia` |Yucatec Maya | `yua` |
+|Inuktitut (Latin) | `iu` |Zhuang | `za` |
+|Irish | `ga` |Zulu | `zu` |
+|Italian | `it` |
+
+## Custom neural model
+
|Language| Locale code |
+|:--|:-:|
+|English (United States)|en-us|
## Receipt and business card models
Pre-Built Receipt and Business Card models support all English receipts and business cards with the following locale codes:

|Language| Locale code |
|:--|:-:|
|English (United States)|en-us|
+|Spanish (preview) | es |
## ID documents
applied-ai-services Managed Identities https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/managed-identities.md
+
+ Title: Create and use managed identities with Form Recognizer
+
+description: Understand how to create and use managed identity with Form Recognizer
+ Last updated : 01/26/2022
+# Create and use managed identities with Form Recognizer
+
+> [!IMPORTANT]
+> Azure RBAC (Azure role-based access control) assignment is currently in preview and not recommended for production workloads. Certain features may not be supported or have constrained capabilities. Azure RBAC assignments are used to grant permissions for managed identity.
+
+## What is managed identity?
+
+Azure managed identity is a service principal. It creates an Azure Active Directory (Azure AD) identity and specific permissions for Azure managed resources. You can use a managed identity to grant access to any resource that supports Azure AD authentication. To grant access, assign a role to a managed identity using [Azure RBAC](../../role-based-access-control/overview.md) (Azure role-based access control). There's no added cost to use managed identity in Azure.
+
+Managed identity supports both privately and publicly accessible Azure blob storage accounts. For storage accounts with public access, you can opt to use a shared access signature (SAS) to grant limited access. In this article, you'll learn to enable a system-assigned managed identity for your Form Recognizer instance.
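The two access options differ only in the URL you hand to the service. A minimal sketch (the account name and SAS token value are illustrative):

```python
def training_source_url(container_url, sas_token=None):
    """With a system-assigned managed identity, pass the bare container
    URL; for publicly accessible accounts you can append a SAS token
    instead to grant limited access."""
    if sas_token:
        return container_url + "?" + sas_token.lstrip("?")
    return container_url

# Managed identity: no secret embedded in the URL.
managed = training_source_url("https://account.blob.core.windows.net/training")
# SAS access: the (illustrative) token rides along as the query string.
sas = training_source_url("https://account.blob.core.windows.net/training",
                          "sv=2021-06-08&sig=EXAMPLE")
```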
+
+## Private storage account access
+> [!NOTE]
+>
+> Form Recognizer only supports system-assigned managed identities today. User-assigned managed identities are on the roadmap and will be enabled in the near future.
++
+ Private Azure storage account access and authentication are supported by [managed identities for Azure resources](../../active-directory/managed-identities-azure-resources/overview.md). If you have an Azure storage account, protected by a Virtual Network (VNet) or firewall, Form Recognizer can't directly access your storage account data. However, once a managed identity is enabled, Form Recognizer can access your storage account using an assigned managed identity credential.
+
+> [!NOTE]
+>
+> * If you intend to analyze your storage data with the [**Form Recognizer Sample Labeling tool (FOTT)**](https://fott-2-1.azurewebsites.net/), you must deploy the tool behind your VNet or firewall.
+>
+> * The Analyze [**Receipt**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeReceiptAsync), [**Business Card**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeBusinessCardAsync), [**Invoice**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/5ed8c9843c2794cbb1a96291), [**ID document**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/5f74a7738978e467c5fb8707), and [**Custom Form**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1/operations/AnalyzeWithCustomForm) APIs can extract data from a single document by posting requests as raw binary content. In these scenarios, there is no requirement for a managed identity credential.
+
+## Prerequisites
+
+To get started, you'll need:
+
+* An active [**Azure account**](https://azure.microsoft.com/free/cognitive-services/). If you don't have one, you can [**create a free account**](https://azure.microsoft.com/free/).
+
+* A [**Form Recognizer**](https://portal.azure.com/#create/Microsoft.CognitiveServicesTextTranslation) or [**Cognitive Services**](https://portal.azure.com/#create/Microsoft.CognitiveServicesAllInOne) resource in the Azure portal. For detailed steps, _see_ [Create a Cognitive Services resource using the Azure portal](../../cognitive-services/cognitive-services-apis-create-account.md?tabs=multiservice%2cwindows).
+
+* An [**Azure blob storage account**](https://portal.azure.com/#create/Microsoft.StorageAccount-ARM) in the same region as your Form Recognizer resource. You'll create containers to store and organize your blob data within your storage account.
+
+ * If your storage account is behind a firewall, **you must enable the following configuration**: </br></br>
+
+ * On your storage account page, select **Security + networking** → **Networking** from the left menu.
+ :::image type="content" source="media/managed-identities/security-and-networking-node.png" alt-text="Screenshot: security + networking tab.":::
+
+ * In the main window, select **Allow access from selected networks**.
+ :::image type="content" source="media/managed-identities/firewalls-and-virtual-networks.png" alt-text="Screenshot: Selected networks radio button selected.":::
+
+ * On the selected networks page, navigate to the **Exceptions** category and make certain that the [**Allow Azure services on the trusted services list to access this storage account**](../../storage/common/storage-network-security.md?tabs=azure-portal#manage-exceptions) checkbox is enabled.
+
+ :::image type="content" source="media/managed-identities/allow-trusted-services-checkbox-portal-view.png" alt-text="Screenshot: allow trusted services checkbox, portal view":::
+* A brief understanding of [**Azure role-based access control (Azure RBAC)**](../../role-based-access-control/role-assignments-portal.md) using the Azure portal.
+
+## Managed identity assignments
+
+There are two types of managed identity: **system-assigned** and **user-assigned**. Currently, Form Recognizer is supported by system-assigned managed identity. A system-assigned managed identity is **enabled** directly on a service instance. It isn't enabled by default; you have to go to your resource and update the identity setting. The system-assigned managed identity is tied to your resource throughout its lifecycle. If you delete your resource, the managed identity will be deleted as well.
+
+In the following steps, we'll enable a system-assigned managed identity and grant Form Recognizer limited access to your Azure blob storage account.
+
+## Enable a system-assigned managed identity
+
+>[!IMPORTANT]
+>
+> To enable a system-assigned managed identity, you need **Microsoft.Authorization/roleAssignments/write** permissions, such as [**Owner**](../../role-based-access-control/built-in-roles.md#owner) or [**User Access Administrator**](../../role-based-access-control/built-in-roles.md#user-access-administrator). You can specify a scope at four levels: management group, subscription, resource group, or resource.
+
+1. Sign in to the [Azure portal](https://portal.azure.com) using an account associated with your Azure subscription.
+
+1. Navigate to your **Form Recognizer** resource page in the Azure portal.
+
+1. In the left rail, select **Identity** from the **Resource Management** list:
+
+ :::image type="content" source="media/managed-identities/resource-management-identity-tab.png" alt-text="Screenshot: resource management identity tab in the Azure portal.":::
+
+1. In the main window, toggle the **System assigned Status** tab to **On**.
+
+1. Under **Permissions** select **Azure role assignments**:
+
+ :::image type="content" source="media/managed-identities/enable-system-assigned-managed-identity-portal.png" alt-text="Screenshot: enable system-assigned managed identity in Azure portal.":::
+
+1. An Azure role assignments page will open. Choose your subscription from the drop-down menu, then select **&plus; Add role assignment**.
+
+ :::image type="content" source="media/managed-identities/azure-role-assignments-page-portal.png" alt-text="Screenshot: Azure role assignments page in the Azure portal.":::
+
+ > [!NOTE]
+ >
+ > If you're unable to assign a role in the Azure portal because the Add > Add role assignment option is disabled or you get the permissions error, "you do not have permissions to add role assignment at this scope", check that you're currently signed in as a user with an assigned role that has Microsoft.Authorization/roleAssignments/write permissions, such as Owner or User Access Administrator, at the Storage scope for the storage resource.
+
 1. Next, assign a **Storage Blob Data Reader** role to your Form Recognizer service resource. In the **Add role assignment** pop-up window, complete the fields as follows and select **Save**:
+
+ | Field | Value|
+ ||--|
+ |**Scope**| ***Storage***|
+ |**Subscription**| ***The subscription associated with your storage resource***.|
+ |**Resource**| ***The name of your storage resource***|
+ |**Role** | ***Storage Blob Data Reader***. Allows read access to Azure Storage blob containers and data.|
+
+ :::image type="content" source="media/managed-identities/add-role-assignment-window.png" alt-text="Screenshot: add role assignments page in the Azure portal.":::
+
+1. After you've received the _Added Role assignment_ confirmation message, refresh the page to see the added role assignment.
+
+ :::image type="content" source="media/managed-identities/add-role-assignment-confirmation.png" alt-text="Screenshot: Added role assignment confirmation pop-up message.":::
+
+1. If you don't see the change right away, wait and try refreshing the page once more. When you assign or remove role assignments, it can take up to 30 minutes for changes to take effect.
+
+ :::image type="content" source="media/managed-identities/assigned-roles-window.png" alt-text="Screenshot: Azure role assignments window.":::
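If you automate this verification, poll rather than fail on the first check, since RBAC changes can take up to 30 minutes to propagate. A generic sketch (the check callable is whatever query you use to list role assignments):

```python
import time

def wait_until(check, timeout_s=1800, interval_s=30):
    """Poll `check()` until it returns truthy or `timeout_s` elapses.
    Defaults reflect the up-to-30-minute RBAC propagation window."""
    deadline = time.monotonic() + timeout_s
    while True:
        if check():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval_s)
```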
+
+ That's it! You've completed the steps to enable a system-assigned managed identity. With this identity credential, you can grant Form Recognizer-specific access rights to documents and files stored in your bring-your-own-storage (BYOS) account.
+
+## Learn more about managed identity
+
+> [!div class="nextstepaction"]
+> [Managed identities for Azure resources: frequently asked questions - Azure AD](../../active-directory/managed-identities-azure-resources/managed-identities-faq.md)
applied-ai-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/overview.md
Previously updated : 12/10/2021 Last updated : 02/15/2022 recommendations: false keywords: automated data processing, document processing, automated data entry, forms processing
:::image type="content" source="media/form-recognizer-icon.png" alt-text="Form Recognizer icon from the Azure portal.":::
-Azure Form Recognizer is a cloud-based [Azure Applied AI Service](../../applied-ai-services/index.yml) that uses machine-learning models to extract and analyze form fields, text, and tables from your documents. Form Recognizer analyzes your forms and documents, extracts text and data, maps field relationships as key-value pairs, and returns a structured JSON output. You quickly get accurate results that are tailored to your specific content without excessive manual intervention or extensive data science expertise. Use Form Recognizer to automate your data processing in applications and workflows, enhance data-driven strategies, and enrich document search capabilities.
+Azure Form Recognizer is a cloud-based [Azure Applied AI Service](../../applied-ai-services/index.yml) that uses machine-learning models to extract key-value pairs, text, and tables from your documents. Form Recognizer analyzes your forms and documents, extracts text and data, maps field relationships as key-value pairs, and returns a structured JSON output. You quickly get accurate results that are tailored to your specific content without excessive manual intervention or extensive data science expertise. Use Form Recognizer to automate your data processing in applications and workflows, enhance data-driven strategies, and enrich document search capabilities.
Form Recognizer easily identifies, extracts, and analyzes the following document data:
The following features and development options are supported by the Form Recogn
| Feature | Description | Development options |
|-|--|-|
-|[🆕 **General document model**](concept-general-document.md)|Extract text, tables, structure, key-value pairs and, named entities.|<ul ><li>[**Form Recognizer Studio**](quickstarts/try-v3-form-recognizer-studio.md#prebuilt-models)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#try-it-general-document-model)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#general-document-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#general-document-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#general-document-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#general-document-model)</li></ul> |
-|[**Layout model**](concept-layout.md) | Extract text, selection marks, and tables structures, along with their bounding box coordinates, from forms and documents.</br></br> Layout API has been updated to a prebuilt model. | <ul><li>[**Form Recognizer Studio**](quickstarts/try-v3-form-recognizer-studio.md#layout)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#try-it-layout-model)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#layout-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#layout-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#layout-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#layout-model)</li></ul>|
-|[**Custom model (updated)**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.</br></br>Custom model API v3.0 supports **signature detection for custom forms**.</li></ul>| <ul><li>[**Form Recognizer Studio**](quickstarts/try-v3-form-recognizer-studio.md#custom-projects)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|
-|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. | <ul><li>[**Form Recognizer Studio**](quickstarts/try-v3-form-recognizer-studio.md#prebuilt-models)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li></ul>|
-|[**Receipt model (updated)**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.</br></br>Receipt model v3.0 supports processing of **single-page hotel receipts**.| <ul><li>[**Form Recognizer Studio**](quickstarts/try-v3-form-recognizer-studio.md#prebuilt-models)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
-|[**ID document model (updated)**](concept-id-document.md) |Automated data processing and extraction of key information from US driver's licenses and international passports.</br></br>Prebuilt ID document API supports the **extraction of endorsements, restrictions, and vehicle classifications from US driver's licenses**. |<ul><li> [**Form Recognizer Studio**](quickstarts/try-v3-form-recognizer-studio.md#prebuilt-models)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
-|[**Business card model**](concept-business-card.md) |Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer Studio**](quickstarts/try-v3-form-recognizer-studio.md#prebuilt-models)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
+|[🆕 **Read**](concept-read.md)|Extract text lines, words, detected languages, and handwritten style if detected.|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/read)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#try-it-general-document-model)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#general-document-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#general-document-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#general-document-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#general-document-model)</li></ul> |
+|[🆕 **General document model**](concept-general-document.md)|Extract text, tables, structure, key-value pairs, and named entities.|<ul ><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/document)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#try-it-general-document-model)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#general-document-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#general-document-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#general-document-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#general-document-model)</li></ul> |
+|[**Layout model**](concept-layout.md) | Extract text, selection marks, and tables structures, along with their bounding box coordinates, from forms and documents.</br></br> Layout API has been updated to a prebuilt model. | <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/layout)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md#try-it-layout-model)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#layout-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#layout-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#layout-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#layout-model)</li></ul>|
+|[**Custom model (updated)**](concept-custom.md) | Extraction and analysis of data from forms and documents specific to distinct business data and use cases.</br></br>Custom model API v3.0 supports **signature detection for custom forms**.</li></ul>| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/custommodel/projects)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md)</li></ul>|
+|[**Invoice model**](concept-invoice.md) | Automated data processing and extraction of key information from sales invoices. | <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice)</li><li>[**REST API**](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li></ul>|
+|[**Receipt model (updated)**](concept-receipt.md) | Automated data processing and extraction of key information from sales receipts.</br></br>Receipt model v3.0 supports processing of **single-page hotel receipts**.| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
+|[**ID document model (updated)**](concept-id-document.md) |Automated data processing and extraction of key information from US driver's licenses and international passports.</br></br>Prebuilt ID document API supports the **extraction of endorsements, restrictions, and vehicle classifications from US driver's licenses**. |<ul><li> [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
+|[**Business card model**](concept-business-card.md) |Automated data processing and extraction of key information from business cards.| <ul><li>[**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard)</li><li>[**REST API**](quickstarts/try-v3-rest-api.md)</li><li>[**C# SDK**](quickstarts/try-v3-csharp-sdk.md#prebuilt-model)</li><li>[**Python SDK**](quickstarts/try-v3-python-sdk.md#prebuilt-model)</li><li>[**Java SDK**](quickstarts/try-v3-java-sdk.md#prebuilt-model)</li><li>[**JavaScript**](quickstarts/try-v3-javascript-sdk.md#prebuilt-model)</li></ul>|
applied-ai-services Get Started Sdk Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/get-started-sdk-rest-api.md
# Get started with Form Recognizer client library SDKs or REST API
-Get started with Azure Form Recognizer using the programming language of your choice. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract and analyze form fields, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDks into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+Get started with Azure Form Recognizer using the programming language of your choice. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
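If you'd rather see the shape of a raw REST call before picking an SDK, here is a hedged sketch that composes (but does not send) a v3.0 analyze request. The `api-version` string and endpoint are assumptions; confirm them against the REST reference for your region:

```python
import json
from urllib.request import Request

def build_analyze_request(endpoint, key, model_id, document_url,
                          api_version="2022-01-30-preview"):
    """Compose a v3.0 analyze request without sending it. `api_version`
    is an assumption; `urlSource` points the service at a document URL."""
    url = (f"{endpoint.rstrip('/')}/formrecognizer/documentModels/"
           f"{model_id}:analyze?api-version={api_version}")
    body = json.dumps({"urlSource": document_url}).encode("utf-8")
    return Request(url, data=body, method="POST", headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    })

req = build_analyze_request("https://contoso.cognitiveservices.azure.com",
                            "<your-key>", "prebuilt-layout",
                            "https://example.com/sample.pdf")
```

Sending the request (e.g. with `urllib.request.urlopen`) returns an `Operation-Location` header you poll for results.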
applied-ai-services Try Sample Label Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-sample-label-tool.md
keywords: document processing
<!-- markdownlint-disable MD029 -->
# Get started with the Form Recognizer Sample Labeling tool
-Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine-learning models to extract and analyze form fields, text, and tables from your documents. You can use Form Recognizer to automate your data processing in applications and workflows, enhance data-driven strategies, and enrich document search capabilities.
+Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine-learning models to extract key-value pairs, text, and tables from your documents. You can use Form Recognizer to automate your data processing in applications and workflows, enhance data-driven strategies, and enrich document search capabilities.
The Form Recognizer Sample Labeling tool is an open source tool that enables you to test the latest features of Azure Form Recognizer and Optical Character Recognition (OCR)
Form Recognizer offers several prebuilt models to choose from. Each model has it
1. Select **Run analysis**. The Form Recognizer Sample Labeling tool will call the Analyze Prebuilt API and analyze the document.
-1. View the results - see the key value pairs extracted, line items, highlighted text extracted and tables detected.
+1. View the results - see the key-value pairs extracted, line items, highlighted text extracted and tables detected.
:::image type="content" source="../media/label-tool/prebuilt-2.jpg" alt-text="Analyze Results of Form Recognizer invoice model":::
Train a custom model to analyze and extract data from forms and documents specif
1. Start by creating a new CORS entry in the Blob service.
- 1. Set the **Allowed origins** to **https://formrecognizer.appliedai.azure.com**.
+ 1. Set the **Allowed origins** to **https://fott-2-1.azurewebsites.net**.
 1. Select all 8 available options for **Allowed methods**.
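The two CORS settings above amount to an origin-and-method check performed by the storage service before it serves a cross-origin request. The sketch below is illustrative only (the function and rule variables are ours, not Azure's implementation); the rule values mirror the steps above:

```python
# Illustrative sketch of the kind of check a CORS-enabled storage endpoint
# performs against the configured rule. Not the Azure implementation.

def cors_allows(origin: str, method: str,
                allowed_origins: list[str], allowed_methods: list[str]) -> bool:
    """Return True if a cross-origin request would pass the CORS rule."""
    origin_ok = "*" in allowed_origins or origin in allowed_origins
    method_ok = method.upper() in {m.upper() for m in allowed_methods}
    return origin_ok and method_ok

# The rule configured in this quickstart: one allowed origin, all 8 methods.
rule_origins = ["https://fott-2-1.azurewebsites.net"]
rule_methods = ["DELETE", "GET", "HEAD", "MERGE", "POST", "OPTIONS", "PUT", "PATCH"]

print(cors_allows("https://fott-2-1.azurewebsites.net", "GET", rule_origins, rule_methods))
print(cors_allows("https://evil.example.com", "GET", rule_origins, rule_methods))
```

A request from any origin other than the labeling tool's fails the rule, which is why the **Allowed origins** value must match the tool's URL exactly.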
Train a custom model to analyze and extract data from forms and documents specif
1. Navigate to the [Form Recognizer Sample Tool](https://fott-2-1.azurewebsites.net/).
-1. On the sample tool home page select **Use custom form to train a model with labels and get key value pairs**.
+1. On the sample tool home page select **Use custom form to train a model with labels and get key-value pairs**.
:::image type="content" source="../media/label-tool/custom-1.jpg" alt-text="Train a custom model.":::
Choose the Train icon on the left pane to open the Training page. Then select th
* **Model ID** - The ID of the model that was created and trained. Each training call creates a new model with its own ID. Copy this string to a secure location; you'll need it if you want to do prediction calls through the [REST API](./try-sdk-rest-api.md?pivots=programming-language-rest-api) or [client library](./try-sdk-rest-api.md).
* **Average Accuracy** - The model's average accuracy. You can improve model accuracy by labeling additional forms and retraining to create a new model. We recommend starting by labeling five forms, analyzing and testing the results, and then adding more forms as needed.
-* The list of tags, and the estimated accuracy per tag.
+* The list of tags, and the estimated accuracy per tag. For more information, _see_ [Interpret and improve accuracy and confidence](../concept-accuracy-confidence.md).
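As a hypothetical illustration of how per-tag estimates relate to the average figure (the tag names and numbers below are made up, and the service's actual aggregation may differ):

```python
# Hypothetical per-tag accuracy estimates, as reported after training.
tag_accuracy = {"InvoiceId": 0.95, "Date": 0.90, "Total": 0.80}

# A simple mean over the tags gives a single roll-up figure.
average_accuracy = sum(tag_accuracy.values()) / len(tag_accuracy)
print(round(average_accuracy, 3))  # 0.883
```

A low average usually points at one or two weak tags; labeling more examples of just those fields and retraining is the cheapest way to raise it.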
:::image type="content" source="../media/label-tool/custom-3.jpg" alt-text="Training view tool.":::
applied-ai-services Try V3 Csharp Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-csharp-sdk.md
Previously updated : 01/28/2022 Last updated : 02/15/2022 recommendations: false
>[!NOTE] > Form Recognizer v3.0 is currently in public preview. Some features may not be supported or have limited capabilities.
-[Reference documentation](/dotnet/api/overview/azure/ai.formrecognizer-readme?view=azure-dotnet&preserve-view=true ) | [Library source code](https://github.com/Azure/azure-sdk-for-net/tree/Azure.AI.FormRecognizer_4.0.0-beta.1/sdk/formrecognizer/Azure.AI.FormRecognizer/src) | [Package (NuGet)](https://www.nuget.org/packages/Azure.AI.FormRecognizer) | [Samples](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0-beta.1/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md)
+[Reference documentation](/dotnet/api/azure.ai.formrecognizer.documentanalysis?view=azure-dotnet-preview&preserve-view=true) | [Library source code](https://github.com/Azure/azure-sdk-for-net/tree/Azure.AI.FormRecognizer_4.0.0-beta.3/sdk/formrecognizer/Azure.AI.FormRecognizer/) | [Package (NuGet)](https://www.nuget.org/packages/Azure.AI.FormRecognizer/4.0.0-beta.3) | [Samples](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md)
-Get started with Azure Form Recognizer using the C# programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract and analyze form fields, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDks into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+Get started with Azure Form Recognizer using the C# programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
applied-ai-services Try V3 Form Recognizer Studio https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-form-recognizer-studio.md
Previously updated : 09/14/2021 Last updated : 02/15/2022
>[!NOTE] > Form Recognizer Studio is currently in public preview. Some features may not be supported or have limited capabilities.
-[Form Recognizer Studio preview](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service in your applications. Get started with exploring the pre-trained models with sample documents or your own. Create projects to build custom form models and reference the models in your applications using the [Python SDK preview](try-v3-python-sdk.md) and other quickstarts.
+[Form Recognizer Studio preview](https://formrecognizer.appliedai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Form Recognizer service in your applications. Get started with exploring the pre-trained models with sample documents or your own. Create projects to build custom template models and reference the models in your applications using the [Python SDK preview](try-v3-python-sdk.md) and other quickstarts.
-
-## Migrating from the sample labeling tool
-
-If you are a previous user of the [sample labeling tool](try-sample-label-tool.md), skip the prerequisites to [**sign into the Studio preview**](try-v3-form-recognizer-studio.md#sign-into-the-form-recognizer-studio-preview) to use your existing Azure account and Form Recognizer or Cognitive Services resources with the Studio.
-
-To migrate your existing custom projects to the Studio, jump ahead to the [**Custom model getting started**](try-v3-form-recognizer-studio.md#custom-projects) section to create a new project and point it to the same Azure Blob storage location assuming you have access to it in Azure. Once you configure a new project, the Studio will load all documents and interim files for labeling and training.
## Prerequisites for new users

* An active [**Azure account**](https://azure.microsoft.com/free/cognitive-services/). If you don't have one, you can [**create a free account**](https://azure.microsoft.com/free/).
* A [**Form Recognizer**](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) or [**Cognitive Services multi-service**](https://portal.azure.com/#create/Microsoft.CognitiveServicesAllInOne) resource.
-## Sign into the Form Recognizer Studio preview
+## Pretrained models
-> [!NOTE]
-> **Virtual networks (VNETs)**
->
-> If you are using the Studio with service endpoints and blob storage configured within a virtual network (VNET), ensure that your computer is in the same VNET as the endpoint and the storage container.
-
-After you have completed the prerequisites, navigate to the [Form Recognizer Studio preview](https://formrecognizer.appliedai.azure.com).
+After you have completed the prerequisites, navigate to the [Form Recognizer Studio General Documents preview](https://formrecognizer.appliedai.azure.com). In the following example, we use the General Documents feature. The steps for other pre-trained models, such as [Read](https://formrecognizer.appliedai.azure.com/studio/read), [Layout](https://formrecognizer.appliedai.azure.com/studio/layout), [Invoice](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=invoice), [Receipt](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt), [Business card](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard), [ID documents](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument), and [W2 tax form](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2), are similar.
1. Select a Form Recognizer service feature from the Studio home page.
-1. Select your Azure subscription, resource group, and resource. (You can change the resources anytime in "Settings" in the top menu.)
-
-1. Review and confirm your selections.
+1. Select your Azure subscription, resource group, and resource, then review and confirm your selections. This is a one-time step unless you've already selected the resource in a prior session. (You can change the resources anytime in "Settings" in the top menu.)
+1. Select the Analyze command to run analysis on the sample document or try your document by using the Add command.
-## Layout
-
-In the Layout view:
-
-1. Select the Analyze command to run Layout analysis on the sample document or try your document by using the Add command.
-
-1. Observe the highlighted extracted text, the table icons showing the extracted table locations, and highlighted selection marks.
+1. Observe the highlighted extracted content in the document view. Hover your mouse over the keys and values to see details.
1. Use the controls at the bottom of the screen to zoom in and out and rotate the document view.
There are several prebuilt models to choose from, each of which has its own set
* [**Receipt**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=receipt): extracts text and key information from receipts.
* [**ID document**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=idDocument): extracts text and key information from driver licenses and international passports.
* [**Business card**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=businessCard): extracts text and key information from business cards.
+* [**W-2**](https://formrecognizer.appliedai.azure.com/studio/prebuilt?formType=tax.us.w2): extracts text and key information from W-2 tax forms.
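Each of these prebuilt features is addressed by a model ID in the v3.0 REST API (the IDs follow the `/documentModels/{modelId}:analyze` pattern shown later on this page). A small lookup table is a convenient way to keep that mapping in one place; this is a convenience sketch, not part of any SDK:

```python
# Prebuilt feature -> v3.0 model ID, as used in /documentModels/{modelId}:analyze.
PREBUILT_MODELS = {
    "General document": "prebuilt-document",
    "Layout": "prebuilt-layout",
    "Invoice": "prebuilt-invoice",
    "Receipt": "prebuilt-receipt",
    "ID document": "prebuilt-idDocument",
    "Business card": "prebuilt-businessCard",
    "W-2 tax form": "prebuilt-tax.us.w2",
}

print(PREBUILT_MODELS["Receipt"])  # prebuilt-receipt
```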
-In the Prebuilt view:
+1. In the output section's Content tab, browse the list of extracted key-value pairs and entities. For other Form Recognizer features, the Content tab will show the corresponding insights extracted.
-1. From the Studio home, select one of the prebuilt models. In this example, we are using the Invoice model.
+1. From the Results tab, review the formatted JSON response from the service. Search and browse the JSON response to understand the service results.
-1. Select the Analyze command to run analysis on the sample document or try your invoice by using the Add command.
+1. From the Code tab, copy the code sample to get started on integrating the feature with your application.
-1. In the visualization section, observe the highlighted fields and values and invoice line items. All extracted text and tables are also shown.
-
-1. In the output section's Fields tab, note the listed fields and values, and select the line items to view in a table-like format.
-
-1. In the output section's Result tab, browse the JSON output to understand the service response format. Copy and download to jumpstart integration.
- ## Additional prerequisites for custom projects
In addition to the Azure account and a Form Recognizer or Cognitive Services res
### Azure Blob Storage container
-A **standard performance** [**Azure Blob Storage account**](https://portal.azure.com/#create/Microsoft.StorageAccount-ARM). You'll create containers to store and organize your blob data within your storage account. If you don't know how to create an Azure storage account with a container, following these quickstarts:
+A **standard performance** [**Azure Blob Storage account**](https://portal.azure.com/#create/Microsoft.StorageAccount-ARM). You'll create containers to store and organize your training documents within your storage account. If you don't know how to create an Azure storage account with a container, follow these quickstarts:
- * [**Create a storage account**](../../../storage/common/storage-account-create.md). When creating your storage account, make sure to select **Standard** performance in the **Instance details → Performance** field.
- * [**Create a container**](../../../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container). When creating your container, set the **Public access level** field to **Container** (anonymous read access for containers and blobs) in the **New Container** window.
+* [**Create a storage account**](../../../storage/common/storage-account-create.md). When creating your storage account, make sure to select **Standard** performance in the **Instance details → Performance** field.
+* [**Create a container**](../../../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container). When creating your container, set the **Public access level** field to **Container** (anonymous read access for containers and blobs) in the **New Container** window.
### Configure CORS
CORS should now be configured to use the storage account from Form Recognizer St
:::image border="true" type="content" source="../media/sas-tokens/upload-blob-window.png" alt-text="Screenshot: upload blob window in the Azure portal.":::

> [!NOTE]
-> By default, the Studio will use form documents that are located at the root of your container. However, you can use data organized in folders if specified in the Custom form project creation steps. *See* [**Organize your data in subfolders**](../build-training-data-set.md#organize-your-data-in-subfolders-optional)
-
-## Custom projects
+> By default, the Studio will use form documents that are located at the root of your container. However, you can use data organized in folders by specifying the folder path in the Custom form project creation steps. *See* [**Organize your data in subfolders**](../build-training-data-set.md#organize-your-data-in-subfolders-optional)
-### Getting started
+## Custom models
To create custom models, you start with configuring your project:
-1. From the Studio home, select the [Custom form project](https://formrecognizer.appliedai.azure.com/studio/customform/projects) to open the Custom form home page.
+1. From the Studio home, select the Custom model card to open the Custom models page.
1. Use the "Create a project" command to start the new project configuration wizard.
To create custom models, you start with configuring your project:
1. Review and submit your settings to create the project.
-### Basic flow
-
-After the project creation step, in the custom model phase:
1. From the labeling view, define the labels and their types that you are interested in extracting.
1. Select the text in the document and select the label from the drop-down list or the labels pane.
1. Label four more documents to get at least five documents labeled.
-1. Select the Train command and enter model name and description to start training your custom model.
+1. Select the Train command, enter a model name, and select whether you want a custom template (form) or custom neural (document) model to start training your custom model.
1. Once the model is ready, use the Test command to validate it with your test documents and observe the results.
-### Other features
-In addition, view all your models using the Models tab on the left. From the list view, select model(s) to perform the following actions:
+### Labeling as tables
-1. Test the model from the list view.
+> [!NOTE]
+> Tables are currently only supported for custom template models. When training a custom neural model, labeled tables are ignored.
1. Use the Delete command to delete models that are not required.
In addition, view all your models using the Models tab on the left. From the lis
1. Select multiple models and compose them into a new model to be used in your applications.
-## Labeling as tables
-While creating your custom models, you may need to extract data collections from your documents. These may appear in a couple of formats. Using tables as the visual pattern:
+For custom form models, you may need to extract data collections from your documents. These may appear in a couple of formats. Using tables as the visual pattern:
* Dynamic or variable count of values (rows) for a given set of fields (columns)
* Specific collection of values for a given set of fields (columns and/or rows)
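One way to picture the two collection shapes in plain data structures (the field names and values below are invented for illustration):

```python
# Dynamic table: a variable number of rows for a fixed set of columns.
# Tomorrow's document may carry more or fewer rows with the same fields.
dynamic_table = [
    {"Item": "Widget", "Qty": 2, "Price": 9.99},
    {"Item": "Gadget", "Qty": 1, "Price": 24.50},
]

# Fixed table: a specific grid of values for known rows and columns.
fixed_table = {
    "Q1": {"Revenue": 1000, "Cost": 400},
    "Q2": {"Revenue": 1200, "Cost": 450},
}

print(len(dynamic_table))        # row count varies per document
print(fixed_table["Q2"]["Cost"]) # every cell has a known address
```

Choosing between the two label types comes down to whether the row set varies per document (dynamic) or is fixed in advance (fixed).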
-### Label as dynamic table
+**Label as dynamic table**
Use dynamic tables to extract variable count of values (rows) for a given set of fields (columns):
Use dynamic tables to extract variable count of values (rows) for a given set of
:::image border="true" type="content" source="../media/quickstarts/custom-tables-dynamic.gif" alt-text="Form Recognizer labeling as dynamic table example":::
-### Label as fixed table
+**Label as fixed table**
Use fixed tables to extract specific collection of values for a given set of fields (columns and/or rows):
Use fixed tables to extract specific collection of values for a given set of fie
:::image border="true" type="content" source="../media/quickstarts/custom-tables-fixed.gif" alt-text="Form Recognizer Labeling as fixed table example":::
-## Labeling for signature detection
+### Signature detection
+
+>[!NOTE]
+> Signature fields are currently only supported for custom template models. When training a custom neural model, labeled signature fields are ignored.
-To label for signature detection:
+To label for signature detection (custom form models only):
1. In the labeling view, create a new "Signature" type label and name it.
applied-ai-services Try V3 Java Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-java-sdk.md
Previously updated : 01/28/2022 Last updated : 02/15/2022 recommendations: false
>[!NOTE] > Form Recognizer v3.0 is currently in public preview. Some features may not be supported or have limited capabilities.
-[Reference documentation](https://azuresdkdocs.blob.core.windows.net/$web/jav)
+[Reference documentation](https://azuresdkdocs.blob.core.windows.net/$web/jav)
-Get started with Azure Form Recognizer using the Java programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract and analyze form fields, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDks into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+Get started with Azure Form Recognizer using the Java programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
applied-ai-services Try V3 Javascript Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-javascript-sdk.md
Previously updated : 01/28/2022 Last updated : 02/15/2022 recommendations: false
>[!NOTE] > Form Recognizer v3.0 is currently in public preview. Some features may not be supported or have limited capabilities.
-[Reference documentation](https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-ai-form-recognizer/4.0.0-beta.1/https://docsupdatetracker.net/index.html) | [Library source code](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/formrecognizer/ai-form-recognizer/src) | [Package (NuGet)](https://www.nuget.org/packages/Azure.AI.FormRecognizer) | [Samples](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0-beta.1/sdk/formrecognizer/ai-form-recognizer/README.md)
+[Reference documentation](https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-ai-form-recognizer/4.0.0-beta.3/https://docsupdatetracker.net/index.html) | [Library source code](https://github.com/Azure/azure-sdk-for-js/tree/@azure/ai-form-recognizer_4.0.0-beta.3/sdk/formrecognizer/ai-form-recognizer/) | [Package (npm)](https://www.npmjs.com/package/@azure/ai-form-recognizer/v/4.0.0-beta.3) | [Samples](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/formrecognizer/ai-form-recognizer/samples/v4-bet)
-Get started with Azure Form Recognizer using the JavaScript programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract and analyze form fields, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDks into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+Get started with Azure Form Recognizer using the JavaScript programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
In this quickstart you'll use following features to analyze and extract data and
1. Specify your project's attributes using the prompts presented in the terminal.
   * The most important attributes are name, version number, and entry point.
- * We recommend keeping `index.js` for the entry point name. Description, test command, github repository, keywords, author, and license information are optional attributes that you can choose to skip for this project.
+ * We recommend keeping `index.js` for the entry point name. Description, test command, GitHub repository, keywords, author, and license information are optional attributes that you can choose to skip for this project.
   * Accept the suggestions in parentheses by selecting **Return** or **Enter**.
   * After completing the prompts, a `package.json` file will be created in your form-recognizer-app directory.
applied-ai-services Try V3 Python Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-python-sdk.md
Previously updated : 01/28/2022 Last updated : 02/15/2022 recommendations: false
>[!NOTE] > Form Recognizer v3.0 is currently in public preview. Some features may not be supported or have limited capabilities.
-[Reference documentation](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-ai-formrecognizer/latest/azure.ai.formrecognizer.html) | [Library source code](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/formrecognizer/azure-ai-formrecognizer/azure/ai/formrecognizer) | [Package (PyPi)](https://pypi.org/project/azure-ai-formrecognizer/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/formrecognizer/azure-ai-formrecognizer/samples)
+[Reference documentation](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-ai-formrecognizer/3.2.0b3/https://docsupdatetracker.net/index.html) | [Library source code](https://github.com/Azure/azure-sdk-for-python/tree/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/) | [Package (PyPi)](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b3/) | [Samples](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/README.md)
- Get started with Azure Form Recognizer using the Python programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract and analyze form fields, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDks into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+ Get started with Azure Form Recognizer using the Python programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
applied-ai-services Try V3 Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-rest-api.md
Previously updated : 01/28/2022 Last updated : 02/15/2022 -

# Get started: Form Recognizer REST API v3.0 | Preview

>[!NOTE]
-> Form Recognizer v3.0 is currently in public preview. Some features may not be supported or have limited capabilities.
+> Form Recognizer v3.0 is currently in public preview. Some features may not be supported or have limited capabilities.
+The current API version is `2022-01-30-preview`.
-| [Form Recognizer REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument) | [Azure REST API reference](/rest/api/azure/) |
+| [Form Recognizer REST API](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument) | [Azure REST API reference](/rest/api/azure/) |
-Get started with Azure Form Recognizer using the C# programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract and analyze form fields, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDks into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+Get started with Azure Form Recognizer using the REST API. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models using the REST API or by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page. ## Form Recognizer models
To learn more about Form Recognizer features and development options, visit our
The REST API supports the following models and capabilities:

* 🆕 General document—Analyze and extract text, tables, structure, key-value pairs, and named entities.
+* 🆕 W-2 Tax Forms—Analyze and extract fields from W-2 tax documents, using a pre-trained W-2 model.
* Layout—Analyze and extract tables, lines, words, and selection marks like radio buttons and check boxes in forms documents, without the need to train a model.
* Custom—Analyze and extract form fields and other content from your custom forms, using models you trained with your own form types.
* Invoices—Analyze and extract common fields from invoices, using a pre-trained invoice model.
To learn more about Form Recognizer features and development options, visit our
## Analyze document
-Form Recognizer v3.0 consolidates the analyze document and get analyze result (GET) operations for layout, prebuilt models, and custom models into a single pair of operations by assigningΓÇ»`modelIds` to the POST and GET operations:
+Form Recognizer v3.0 consolidates the analyze document (POST) and get analyze results (GET) operations for layout, prebuilt models, and custom models into a single pair of operations by assigning `modelIds` to the POST and GET operations:
```http
POST /documentModels/{modelId}:analyze
GET /documentModels/{modelId}/analyzeResults/{resultId}
```
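A minimal sketch of how an application might build both URLs from a `modelId`. The endpoint, helper names, and IDs below are placeholders, not part of any SDK:

```python
# Build the consolidated v3.0 analyze and results URLs from a modelId.
# One pattern covers layout, prebuilt, and custom models.
API_VERSION = "2022-01-30-preview"

def analyze_url(endpoint: str, model_id: str) -> str:
    """URL for the POST that starts an analysis."""
    return (f"{endpoint}/formrecognizer/documentModels/"
            f"{model_id}:analyze?api-version={API_VERSION}")

def results_url(endpoint: str, model_id: str, result_id: str) -> str:
    """URL for the GET that retrieves the analysis results."""
    return (f"{endpoint}/formrecognizer/documentModels/"
            f"{model_id}/analyzeResults/{result_id}?api-version={API_VERSION}")

endpoint = "https://contoso.cognitiveservices.azure.com"  # placeholder resource
print(analyze_url(endpoint, "prebuilt-layout"))
print(results_url(endpoint, "prebuilt-layout", "sample-result-id"))
```

Swapping `prebuilt-layout` for `prebuilt-receipt` or a custom model ID is the only change needed to target a different model.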
The following table illustrates the updates to the REST API calls.
|Receipt | `/prebuilt/receipt/analyze` | `/documentModels/prebuilt-receipt:analyze` |
|ID document| `/prebuilt/idDocument/analyze` | `/documentModels/prebuilt-idDocument:analyze`|
|Business card| `/prebuilt/businessCard/analyze` | `/documentModels/prebuilt-businessCard:analyze` |
+|W-2 tax document| | `/documentModels/prebuilt-tax.us.w2:analyze`|
|Custom| `/custom/{modelId}/analyze` |`/documentModels/{modelId}:analyze`|

In this quickstart you'll use the following features to analyze and extract data and values from forms and documents:
In this quickstart you'll use following features to analyze and extract data and
#### Request

```bash
-curl -v -i POST "https://{endpoint}/formrecognizer/documentModels/prebuilt-document:analyze?api-version=2021-09-30-preview" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{'urlSource': '{your-document-url}'}"
+curl -v -i POST "https://{endpoint}/formrecognizer/documentModels/prebuilt-document:analyze?api-version=2022-01-30-preview" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{'urlSource': '{your-document-url}'}"
```

#### Operation-Location

You'll receive a `202 (Success)` response that includes an **Operation-Location** header. The value of this header contains a result ID that you can use to query the status of the asynchronous operation and get the results:
-https://{host}/formrecognizer/documentModels/{modelId}/analyzeResults/**{resultId}**?api-version=2021-09-30-preview
+https://{host}/formrecognizer/documentModels/{modelId}/analyzeResults/**{resultId}**?api-version=2022-01-30-preview
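A small sketch of extracting the result ID from that header value (the URL below uses a made-up result ID):

```python
# Parse the result ID out of the Operation-Location header of the 202 response.
from urllib.parse import urlparse

operation_location = (
    "https://contoso.cognitiveservices.azure.com/formrecognizer/documentModels/"
    "prebuilt-document/analyzeResults/0f044242-sample?api-version=2022-01-30-preview"
)

# The result ID is the last path segment; the query string is not part of the path.
path = urlparse(operation_location).path
result_id = path.rsplit("/", 1)[-1]
print(result_id)  # 0f044242-sample
```

In practice you can also poll the Operation-Location URL directly; parsing out the ID is only needed if you want to construct the GET URL yourself.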
### Get general document results
After you've called the **[Analyze document](https://westus.dev.cognitive.micros
#### Request ```bash
-curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/prebuilt-document/analyzeResults/{resultId}?api-version=2021-09-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/prebuilt-document/analyzeResults/{resultId}?api-version=2022-01-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
```

### Examine the response
The `"analyzeResults"` node contains all of the recognized text. Text is organiz
"createdDateTime": "2021-09-28T16:52:51Z", "lastUpdatedDateTime": "2021-09-28T16:53:08Z", "analyzeResult": {
- "apiVersion": "2021-09-30-preview",
+ "apiVersion": "2022-01-30-preview",
"modelId": "prebuilt-document", "stringIndexType": "textElements", "content": "content extracted",
The `"analyzeResults"` node contains all of the recognized text. Text is organiz
#### Request ```bash
-curl -v -i POST "https://{endpoint}/formrecognizer/documentModels/prebuilt-layout:analyze?api-version=2021-09-30-preview" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{'urlSource': '{your-document-url}'}"
+curl -v -i POST "https://{endpoint}/formrecognizer/documentModels/prebuilt-layout:analyze?api-version=2022-01-30-preview" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{'urlSource': '{your-document-url}'}"
```
curl -v -i POST "https://{endpoint}/formrecognizer/documentModels/prebuilt-layou
You'll receive a `202 (Success)` response that includes an **Operation-Location** header. The value of this header contains a result ID that you can use to query the status of the asynchronous operation and get the results:
-`https://{host}/formrecognizer/documentModels/{modelId}/analyzeResults/**{resultId}**?api-version=2021-09-30-preview`
+`https://{host}/formrecognizer/documentModels/{modelId}/analyzeResults/**{resultId}**?api-version=2022-01-30-preview`
### Get layout results
-After you've called the **[Analyze document](https://westus.api.cognitive.microsoft.com/formrecognizer/documentModels/prebuilt-layout:analyze?api-version=2021-09-30-preview&stringIndexType=textElements)** API, call the **[Get analyze result](https://westus.api.cognitive.microsoft.com/formrecognizer/documentModels/prebuilt-layout/analyzeResults/{resultId}?api-version=2021-09-30-preview)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
+After you've called the **[Analyze document](https://westus.api.cognitive.microsoft.com/formrecognizer/documentModels/prebuilt-layout:analyze?api-version=2022-01-30-preview&stringIndexType=textElements)** API, call the **[Get analyze result](https://westus.api.cognitive.microsoft.com/formrecognizer/documentModels/prebuilt-layout/analyzeResults/{resultId}?api-version=2022-01-30-preview)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
1. Replace `{endpoint}` with the endpoint that you obtained with your Form Recognizer subscription. 1. Replace `{subscription key}` with the subscription key you copied from the previous step.
After you've called the **[Analyze document](https://westus.api.cognitive.micros
#### Request ```bash
-curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/prebuilt-layout/analyzeResults/{resultId}?api-version=2021-09-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/prebuilt-layout/analyzeResults/{resultId}?api-version=2022-01-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
``` ### Examine the response
-You'll receive a `200 (Success)` response with JSON output. The first field, `"status"`, indicates the status of the operation. If the operation is not complete, the value of `"status"` will be `"running"` or `"notStarted"`, and you should call the API again, either manually or through a script. We recommend an interval of one second or more between calls.
+You'll receive a `200 (Success)` response with JSON output. The first field, `"status"`, indicates the status of the operation. If the operation isn't complete, the value of `"status"` will be `"running"` or `"notStarted"`, and you should call the API again, either manually or through a script. We recommend an interval of one second or more between calls.
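The polling loop described above can be sketched as follows. `get_status` is a hypothetical stand-in for the GET request shown in this section (which in practice would send the `Ocp-Apim-Subscription-Key` header and read the `"status"` field of the JSON response); it's stubbed here so the sketch is self-contained:

```python
import time

def poll_until_done(get_status, interval=1.0, timeout=60.0):
    """Poll an analyze operation until "status" leaves running/notStarted.

    get_status is any callable returning the "status" field of the
    Get-analyze-result response; stubbed below for illustration.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status not in ("running", "notStarted"):
            return status            # e.g. "succeeded" or "failed"
        time.sleep(interval)         # >= 1 second between calls, per the guidance
    raise TimeoutError("analyze operation did not complete in time")

# Stubbed status sequence standing in for successive GET responses.
responses = iter(["notStarted", "running", "succeeded"])
print(poll_until_done(lambda: next(responses), interval=0))  # → succeeded
```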
## **Try it**: Prebuilt model
This sample demonstrates how to analyze data from certain common document types
##### Choose the invoice prebuilt model ID
-You are not limited to invoices—there are several prebuilt models to choose from, each of which has its own set of supported fields. The model to use for the analyze operation depends on the type of document to be analyzed. Here are the model IDs for the prebuilt models currently supported by the Form Recognizer service:
+You aren't limited to invoices—there are several prebuilt models to choose from, each of which has its own set of supported fields. The model to use for the analyze operation depends on the type of document to be analyzed. Here are the model IDs for the prebuilt models currently supported by the Form Recognizer service:
* **prebuilt-invoice**: extracts text, selection marks, tables, key-value pairs, and key information from invoices. * **prebuilt-businessCard**: extracts text and key information from business cards.
Before you run the command, make these changes:
#### Request ```bash
-curl -v -i POST "https://{endpoint}/formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2021-09-30-preview" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{'urlSource': '{your-document-url}'}"
+curl -v -i POST "https://{endpoint}/formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2022-01-30-preview" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{'urlSource': '{your-document-url}'}"
``` #### Operation-Location You'll receive a `202 (Success)` response that includes an **Operation-Location** header. The value of this header contains a result ID that you can use to query the status of the asynchronous operation and get the results:
-https://{host}/formrecognizer/documentModels/{modelId}/analyzeResults/**{resultId}**?api-version=2021-09-30-preview
+https://{host}/formrecognizer/documentModels/{modelId}/analyzeResults/**{resultId}**?api-version=2022-01-30-preview
### Get invoice results
-After you've called the **[Analyze document](https://westus.api.cognitive.microsoft.com/formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2021-09-30-preview&stringIndexType=textElements)** API, call the **[Get analyze result](https://westus.api.cognitive.microsoft.com/formrecognizer/documentModels/prebuilt-invoice/analyzeResults/{resultId}?api-version=2021-09-30-preview)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
+After you've called the **[Analyze document](https://westus.api.cognitive.microsoft.com/formrecognizer/documentModels/prebuilt-invoice:analyze?api-version=2022-01-30-preview&stringIndexType=textElements)** API, call the **[Get analyze result](https://westus.api.cognitive.microsoft.com/formrecognizer/documentModels/prebuilt-invoice/analyzeResults/{resultId}?api-version=2022-01-30-preview)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
1. Replace `{endpoint}` with the endpoint that you obtained with your Form Recognizer subscription. 1. Replace `{subscription key}` with the subscription key you copied from the previous step.
After you've called the **[Analyze document](https://westus.api.cognitive.micros
#### Request ```bash
-curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/prebuilt-invoice/analyzeResults/{resultId}?api-version=2021-09-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/prebuilt-invoice/analyzeResults/{resultId}?api-version=2022-01-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
``` ### Examine the response
-You'll receive a `200 (Success)` response with JSON output. The first field, `"status"`, indicates the status of the operation. If the operation is not complete, the value of `"status"` will be `"running"` or `"notStarted"`, and you should call the API again, either manually or through a script. We recommend an interval of one second or more between calls.
+You'll receive a `200 (Success)` response with JSON output. The first field, `"status"`, indicates the status of the operation. If the operation isn't complete, the value of `"status"` will be `"running"` or `"notStarted"`, and you should call the API again, either manually or through a script. We recommend an interval of one second or more between calls.
### Improve results
You'll receive a `200 (Success)` response with JSON output. The first field, `"s
The preview v3.0 [List models](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/GetModels) request returns a paged list of prebuilt models in addition to custom models. Only models with a status of succeeded are included. In-progress or failed models can be enumerated via the [List Operations](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/GetOperations) request. Use the nextLink property to access the next page of models, if any. To get more information about each returned model, including the list of supported documents and their fields, pass the modelId to the [Get Model](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/GetOperations) request. ```bash
-curl -v -X GET "https://{endpoint}/formrecognizer/documentModels?api-version=2021-09-30-preview"
+curl -v -X GET "https://{endpoint}/formrecognizer/documentModels?api-version=2022-01-30-preview"
``` ### Get a specific model
curl -v -X GET "https://{endpoint}/formrecognizer/documentModels?api-version=202
The preview v3.0 [Get model](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/GetModel) retrieves information about a specific model with a status of succeeded. For failed and in-progress models, use the [Get Operation](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/GetOperation) to track the status of model creation operations and any resulting errors. ```bash
-curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/{modelId}?api-version=2021-09-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/{modelId}?api-version=2022-01-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
``` ### Delete a Model
curl -v -X GET "https://{endpoint}/formrecognizer/documentModels/{modelId}?api-v
The preview v3.0 [Delete model](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/DeleteModel) request removes the custom model and the modelId can no longer be accessed by future operations. New models can be created using the same modelId without conflict. ```bash
-curl -v -X DELETE "https://{endpoint}/formrecognizer/documentModels/{modelId}?api-version=2021-09-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -v -X DELETE "https://{endpoint}/formrecognizer/documentModels/{modelId}?api-version=2022-01-30-preview" -H "Ocp-Apim-Subscription-Key: {subscription key}"
``` ## Next steps
curl -v -X DELETE "https://{endpoint}/formrecognizer/documentModels/{modelId}?ap
In this quickstart, you used the Form Recognizer REST API preview (v3.0) to analyze forms in different ways. Next, explore the reference documentation to learn about Form Recognizer API in more depth. > [!div class="nextstepaction"]
-> [REST API preview (v3.0) reference documentation](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument)
+> [REST API preview (v3.0) reference documentation](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument)
applied-ai-services Service Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/service-limits.md
Previously updated : 09/30/2021 Last updated : 02/15/2022
For the usage with [Form Recognizer SDK](quickstarts/try-v3-csharp-sdk.md), [For
| **Concurrent Request limit** | 1 | 15 (default value) | | Adjustable | No<sup>2</sup> | Yes<sup>2</sup> | | **Compose Model limit** | 5 | 100 (default value) |-
+| **Custom neural model train** | 10 per month | 10 per month |
+| Adjustable | No<sup>2</sup> | Yes<sup>2</sup> |
<sup>1</sup> For **Free (F0)** pricing tier see also monthly allowances at the [pricing page](https://azure.microsoft.com/pricing/details/form-recognizer/). <sup>2</sup> See [best practices](#example-of-a-workload-pattern-best-practice), and [adjustment instructions](#create-and-submit-support-request). ## Detailed description, Quota adjustment, and best practices
-Before requesting a quota increase (where applicable), ensure that it is necessary. Form Recognizer service uses autoscaling to bring the required computational resources in "on-demand" and at the same time to keep the customer costs low, deprovision unused resources by not maintaining an excessive amount of hardware capacity. Every time your application receives a Response Code 429 ("Too many requests") while your workload is within the defined limits (see [Quotas and Limits quick reference](#form-recognizer-service-quotas-and-limits)) the most likely explanation is that the Service is scaling up to your demand and did not reach the required scale yet, thus it does not immediately have enough resources to serve the request. This state is usually transient and should not last long.
+Before requesting a quota increase (where applicable), ensure that it's necessary. The Form Recognizer service uses autoscaling to bring the required computational resources online on demand while keeping customer costs low: it deprovisions unused resources rather than maintaining excess hardware capacity. If your application receives Response Code 429 ("Too many requests") while your workload is within the defined limits (see [Quotas and Limits quick reference](#form-recognizer-service-quotas-and-limits)), the most likely explanation is that the service is scaling up to meet your demand and hasn't yet reached the required scale, so it doesn't immediately have enough resources to serve the request. This state is transient and shouldn't last long.
### General best practices to mitigate throttling during autoscaling To minimize issues related to throttling (Response Code 429), we recommend using the following techniques:
The next sections describe specific cases of adjusting quotas.
Jump to [Form Recognizer: increasing concurrent request limit](#create-and-submit-support-request) ### Increasing transactions per second request limit
-By default the number of concurrent requests is limited to 15 transactions per second for a Form Recognizer resource. For the Standard pricing tier, this amount can be increased. Before submitting the request, ensure you are familiar with the material in [this section](#detailed-description-quota-adjustment-and-best-practices) and aware of these [best practices](#example-of-a-workload-pattern-best-practice).
+By default, the number of concurrent requests is limited to 15 transactions per second for a Form Recognizer resource. For the Standard pricing tier, this amount can be increased. Before submitting the request, ensure you're familiar with the material in [this section](#detailed-description-quota-adjustment-and-best-practices) and aware of these [best practices](#example-of-a-workload-pattern-best-practice).
Increasing the Concurrent Request limit does **not** directly affect your costs. The Form Recognizer service uses a "pay only for what you use" model. The limit defines how high the service may scale before it starts throttling your requests. The existing value of the Concurrent Request limit parameter is **not** visible via the Azure portal, command-line tools, or API requests. To verify the existing value, create an Azure support request. #### Have the required information ready:+ - Form Recognizer Resource ID - Region
Initiate the increase of transactions per second(TPS) limit for your resource by
- A new window will appear with auto-populated information about your Azure Subscription and Azure Resource - Enter *Summary* (like "Increase Form Recognizer TPS limit") - In *Problem type* select "Quota or usage validation"-- Click *Next: Solutions*
+- Select *Next: Solutions*
- Proceed further with the request creation-- When in *Details* tab enter in the *Description* field:
- - a note, that the request is about **Form Recognizer** quota
- - Provide a TPS expectation you would like to scale to
- - Azure resource information you [collected before](#have-the-required-information-ready)
- - Complete entering the required information and click *Create* button in *Review + create* tab
- - Note the support request number in Azure portal notifications. You will be contacted shortly for further processing
+- Under the *Details* tab, enter the following in the *Description* field:
+  - A note that the request is about the **Form Recognizer** quota.
+  - The TPS value you would like to scale to.
+  - The Azure resource information you [collected](#have-the-required-information-ready).
+  - Complete entering the required information and select the *Create* button in the *Review + create* tab.
+  - Note the support request number in Azure portal notifications. You'll be contacted shortly for further processing.
## Example of a workload pattern best practice
-This example presents the approach we recommend following to mitigate possible request throttling due to [Autoscaling being in progress](#detailed-description-quota-adjustment-and-best-practices). It is not an "exact recipe", but merely a template we invite to follow and adjust as necessary.
+This example presents the approach we recommend to mitigate possible request throttling due to [Autoscaling being in progress](#detailed-description-quota-adjustment-and-best-practices). It isn't an exact recipe, but merely a template we invite you to follow and adjust as necessary.
-Let us suppose that a Form Recognizer resource has the default limit set. Start the workload to submit your analyze requests. If you find that you are seeing frequent throttling with response code 429, start by backing off on the GET analyze response request and retry using the 2-3-5-8 pattern. In general it is recommended that you not call the get analyze response more than once every 2 seconds for a corresponding POST request.
+Let's suppose that a Form Recognizer resource has the default limit set. Start the workload to submit your analyze requests. If you find that you're seeing frequent throttling with response code 429, start by backing off on the GET analyze response request and retry using the 2-3-5-8 pattern. In general, we recommend that you not call the GET analyze response more than once every 2 seconds for a corresponding POST request.
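The 2-3-5-8 backoff pattern mentioned above can be sketched like this. `fetch` is a hypothetical stand-in for the GET analyze-result call, stubbed here so the example is self-contained:

```python
import time

BACKOFF_SECONDS = (2, 3, 5, 8)  # the 2-3-5-8 pattern from the guidance above

def get_result_with_backoff(fetch, delays=BACKOFF_SECONDS):
    """Retry the GET analyze-result call, backing off 2, 3, 5, then 8 seconds.

    fetch is any callable returning (http_status, body); a real
    implementation would issue the GET request shown in this article.
    """
    for delay in delays:
        status, body = fetch()
        if status != 429:            # not throttled: return the response body
            return body
        time.sleep(delay)            # back off before the next attempt
    raise RuntimeError("still throttled after the 2-3-5-8 backoff sequence")

# Stub: throttled twice, then a successful response (delays zeroed for demo).
responses = iter([(429, None), (429, None), (200, {"status": "succeeded"})])
print(get_result_with_backoff(lambda: next(responses), delays=(0, 0, 0, 0)))
# → {'status': 'succeeded'}
```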
-If you find that you are being throttled on the number of POST requests for documents being submitted, consider adding a delay between the requests. If your workload requires a higher degree of concurrent processing, you will then need to create a support request to increase your service limits on transactions per second.
+If you find that you're being throttled on the number of POST requests for documents being submitted, consider adding a delay between the requests. If your workload requires a higher degree of concurrent processing, you'll then need to create a support request to increase your service limits on transactions per second.
-Generally, it is highly recommended to test the workload and the workload patterns before going to production.
+Generally, it's highly recommended to test the workload and the workload patterns before going to production.
applied-ai-services Supervised Table Tags https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/supervised-table-tags.md
Title: "How to use table tags to train your custom form model - Form Recognizer"
+ Title: "How to use table tags to train your custom template model - Form Recognizer"
description: Learn how to effectively use supervised table tag labeling.
-# Use table tags to train your custom form model
+# Use table tags to train your custom template model
-In this article, you'll learn how to train your custom form model with table tags (labels). Some scenarios require more complex labeling than simply aligning key-value pairs. Such scenarios include extracting information from forms with complex hierarchical structures or encountering items that not automatically detected and extracted by the service. In these cases, you can use table tags to train your custom form model.
+In this article, you'll learn how to train your custom template model with table tags (labels). Some scenarios require more complex labeling than simply aligning key-value pairs. Such scenarios include extracting information from forms with complex hierarchical structures or encountering items that aren't automatically detected and extracted by the service. In these cases, you can use table tags to train your custom template model.
## When should I use table tags?
applied-ai-services Tutorial Ai Builder https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/tutorial-ai-builder.md
Title: "Tutorial: Create a form processing app with AI Builder - Form Recognizer"
+ Title: "Tutorial: Create a document processing app with AI Builder - Form Recognizer"
-description: In this tutorial, you'll use AI Builder to create and train a form processing application.
+description: In this tutorial, you'll use AI Builder to create and train a document processing application.
# Tutorial: Create a form-processing app with AI Builder
-[AI Builder](/ai-builder/overview) is a Power Platform capability that allows you to automate processes and predict outcomes to improve business performance. You can use AI Builder form processing to create AI models that identify and extract key-value pairs and table data from form documents.
+[AI Builder](/ai-builder/overview) is a Power Platform capability that allows you to automate processes and predict outcomes to improve business performance. You can use AI Builder document processing to create AI models that identify and extract key-value pairs and table data from form documents.
> [!NOTE] > This project is also available as a [Microsoft Learn module](/learn/modules/get-started-with-form-processing/).
In this tutorial, you learn how to: > [!div class="checklist"]
-> * Create a form processing AI model
+> * Create a document processing AI model
> * Train your model > * Publish your model to use in Azure Power Apps or Power Automate
In this tutorial, you learn how to:
* An AI Builder [add-on or trial](https://go.microsoft.com/fwlink/?LinkId=2113956&clcid=0x409).
-## Create a form processing project
+## Create a document processing project
1. Go to [Power Apps](https://make.powerapps.com/) or [Power Automate](https://flow.microsoft.com/signin), and sign in with your organization account. 1. In the left pane, select **AI Builder** > **Build**.
-1. Select the **Form Processing** card.
+1. Select the **Document processing** card.
1. Type a name for your model. 1. Select **Create**.
In this tutorial, you learn how to:
On the **Add documents** page, you need to provide sample documents to train your model for the type of form you want to extract information from. After you upload your documents, AI Builder analyzes them to check that they're sufficient to train a model. > [!NOTE]
-> AI Builder does not currently support the following types of form processing input data:
+> AI Builder does not currently support the following types of document processing input data:
> > - Complex tables (nested tables, merged headers or cells, and so on) > - Check boxes or radio buttons
If you're happy with your model, select **Publish** to publish it. When publish
> [!div class="mx-imgBorder"] > ![publish model page](./media/tutorial-ai-builder/model-page.png)
-After you've published your form processing model, you can use it in a [Power Apps canvas app](/ai-builder/form-processor-component-in-powerapps) or in [Power Automate](/ai-builder/form-processing-model-in-flow).
+After you've published your document processing model, you can use it in a [Power Apps canvas app](/ai-builder/form-processor-component-in-powerapps) or in [Power Automate](/ai-builder/form-processing-model-in-flow).
## Next steps
applied-ai-services Tutorial Azure Function https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/tutorial-azure-function.md
Open Azure Storage Explorer and upload a sample PDF document to the **Test** con
Stop the script before continuing.
-## Add form processing code
+## Add document processing code
Next, you'll add your own code to the Python script to call the Form Recognizer service and parse the uploaded documents using the Form Recognizer [Layout API](concept-layout.md).
applied-ai-services Tutorial Logic Apps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/tutorial-logic-apps.md
recommendations: false
# Tutorial: Use Azure Logic Apps with Form Recognizer
+> [!IMPORTANT]
+>
+> This tutorial and the Logic App Form Recognizer connector target Form Recognizer REST API v2.1.
++ Azure Logic Apps is a cloud-based platform that can be used to automate workflows without writing a single line of code. The platform enables you to easily integrate Microsoft and third-party applications with your apps, data, services, and systems. A Logic App is the Azure resource you create when you want to develop a workflow. Here are a few examples of what you can do with a Logic App: * Create business processes and workflows visually.
For more information, *see* [Logic Apps Overview](../../logic-apps/logic-apps-ov
## Prerequisites
-To complete this tutorial, you'll need the following:
+To complete this tutorial, you'll need the following resources:
* **An Azure subscription**. You can [create a free Azure subscription](https://azure.microsoft.com/free/cognitive-services/)
Congratulations! You've officially completed this tutorial.
## Next steps > [!div class="nextstepaction"]
-> [Use the invoice processing prebuilt model in Power Automate](/ai-builder/flow-invoice-processing?toc=/azure/applied-ai-services/form-recognizer/toc.json&bc=/azure/applied-ai-services/form-recognizer/breadcrumb/toc.json)
+> [Use the invoice processing prebuilt model in Power Automate](/ai-builder/flow-invoice-processing?toc=/azure/applied-ai-services/form-recognizer/toc.json&bc=/azure/applied-ai-services/form-recognizer/breadcrumb/toc.json)
applied-ai-services V3 Migration Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/v3-migration-guide.md
Previously updated : 12/13/2021 Last updated : 02/15/2022 recommendations: false- # Form Recognizer v3.0 migration | Preview
Form Recognizer v3.0 (preview) introduces several new features and capabilities:
* [Form Recognizer REST API](quickstarts/try-v3-rest-api.md) has been redesigned for better usability. * [**General document (v3.0)**](concept-general-document.md) model is a new API that extracts text, tables, structure, key-value pairs, and named entities from forms and documents.
+* [**Custom document model (v3.0)**](concept-custom-neural.md) is a new custom model type for extracting fields from structured and unstructured documents.
* [**Receipt (v3.0)**](concept-receipt.md) model supports single-page hotel receipt processing. * [**ID document (v3.0)**](concept-id-document.md) model supports endorsements, restrictions, and vehicle classification extraction from US driver's licenses. * [**Custom model API (v3.0)**](concept-custom.md) supports signature detection for custom forms.
In this article, you'll learn the differences between Form Recognizer v2.1 and v
### POST request ```http
-https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}?api-version=2021-07-30-preview
+https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}?api-version=2022-01-30-preview
``` ### GET request ```http
-https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}/AnalyzeResult/{resultId}?api-version=2021-07-30-preview
+https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}/AnalyzeResult/{resultId}?api-version=2022-01-30-preview
``` ### Analyze operation
https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}/
| **Receipt** | /prebuilt/receipt/analyze | /documentModels/prebuilt-receipt:analyze | | **ID document** | /prebuilt/idDocument/analyze | /documentModels/prebuilt-idDocument:analyze | |**Business card**| /prebuilt/businessCard/analyze| /documentModels/prebuilt-businessCard:analyze|
+|**W-2**| /prebuilt/w-2/analyze| /documentModels/prebuilt-w-2:analyze|
### Analyze request body
Analyze response has been refactored to the following top-level results to suppo
{ // Basic analyze result metadata
-"apiVersion": "2021-07-30-preview", // REST API version used
+"apiVersion": "2022-01-30-preview", // REST API version used
"modelId": "prebuilt-invoice", // ModelId used "stringIndexType": "textElements", // Character unit used for string offsets and lengths: // textElements, unicodeCodePoint, utf16CodeUnit // Concatenated content in global reading order across pages.
Analyze response has been refactored to the following top-level results to suppo
## Build or train model
-The model object has two updates in the new API
+The model object has three updates in the new API:
* ```modelId``` is now a property that can be set on a model for a human readable name. * ```modelName``` has been renamed to ```description```
+* ```buildMode``` is a new property with values of ```template``` for custom form models or ```neural``` for custom document models.
The ```build``` operation is invoked to train a model. The request payload and call pattern remain unchanged. The build operation specifies the model and training dataset and returns the result via the Operation-Location header in the response. Poll this model operation URL via a GET request to check the status of the build operation (the minimum recommended interval between requests is 1 second). Unlike v2.1, this URL is not the resource location of the model. Instead, the model URL can be constructed from the given modelId, also retrieved from the resourceLocation property in the response. Upon success, status is set to ```succeeded``` and result contains the custom model info. If errors are encountered, status is set to ```failed``` and the error is returned. The following code is a sample build request using a SAS token. Note the trailing slash when setting the prefix or folder path. ```json
-POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:build?api-version=2021-09-30-preview
+POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:build?api-version=2022-01-30-preview
{ "modelId": {modelId}, "description": "Sample model",
+ "buildMode": "template",
"azureBlobSource": { "containerUrl": "https://{storageAccount}.blob.core.windows.net/{containerName}?{sasToken}", "prefix": "{folderName/}"
POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:build
Model compose is now limited to single level of nesting. Composed models are now consistent with custom models with the addition of ```modelId``` and ```description``` properties. ```json
-POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:compose?api-version=2021-09-30-preview
+POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:compose?api-version=2022-01-30-preview
{ "modelId": "{composedModelId}", "description": "{composedModelDescription}",
POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:compo
{ "modelId": "{modelId2}" }, ] }-
+
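As a sketch, the single-level compose payload can be assembled like this; the `componentModels` field name is an assumption to verify against the v3.0 REST reference.

```python
def compose_request_body(composed_model_id, description, component_model_ids):
    """Build the compose payload. Only one level of nesting is allowed, so
    the component IDs must refer to plain custom models, not composed ones.
    The componentModels field name is assumed, not confirmed, here."""
    return {
        "modelId": composed_model_id,
        "description": description,
        "componentModels": [{"modelId": m} for m in component_model_ids],
    }
```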
``` ## Changes to copy model
The only changes to the copy model function are:
***Authorize the copy*** ```json
-POST https://{targetHost}/formrecognizer/documentModels:authorizeCopy?api-version=2021-09-30-preview
+POST https://{targetHost}/formrecognizer/documentModels:authorizeCopy?api-version=2022-01-30-preview
{ "modelId": "{targetModelId}", "description": "{targetModelDescription}",
POST https://{targetHost}/formrecognizer/documentModels:authorizeCopy?api-versio
Use the response body from the authorize action to construct the request for the copy. ```json
-POST https://{sourceHost}/formrecognizer/documentModels/{sourceModelId}:copy-to?api-version=2021-09-30-preview
+POST https://{sourceHost}/formrecognizer/documentModels/{sourceModelId}:copy-to?api-version=2022-01-30-preview
{ "targetResourceId": "{targetResourceId}", "targetResourceRegion": "{targetResourceRegion}",
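The two-step copy flow can be sketched as follows. Hosts, model IDs, and the exact api-version string are placeholders; the pass-through of the authorization response as the copy body follows the copy-model description in this guide.

```python
API_VERSION = "2022-01-30-preview"  # placeholder preview api-version

def authorize_copy_request(target_host, target_model_id, description=""):
    """Step 1: POST to the *target* resource to authorize the copy."""
    url = (f"https://{target_host}/formrecognizer/documentModels"
           f":authorizeCopy?api-version={API_VERSION}")
    return url, {"modelId": target_model_id, "description": description}

def copy_to_request(source_host, source_model_id, authorization):
    """Step 2: POST to the *source* resource; the authorization response
    body from step 1 is forwarded unchanged as the request body."""
    url = (f"https://{source_host}/formrecognizer/documentModels/"
           f"{source_model_id}:copy-to?api-version={API_VERSION}")
    return url, authorization
```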
List models has been extended to return prebuilt and custom models. All pre
***Sample list models request*** ```json
-GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels?api-version=2021-09-30-preview
+GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels?api-version=2022-01-30-preview
``` ## Change to get model
GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels?api-ve
As get model now includes prebuilt models, the get operation returns a ```docTypes``` dictionary. Each document type is described by its name, optional description, field schema, and optional field confidence. The field schema describes the list of fields potentially returned with the document type. ```json
-GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}?api-version=2021-09-30-preview
+GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{modelId}?api-version=2022-01-30-preview
``` ## New get info operation
GET https://{your-form-recognizer-endpoint}/formrecognizer/documentModels/{model
The ```info``` operation on the service returns the custom model count and custom model limit. ```json
-GET https://{your-form-recognizer-endpoint}/formrecognizer/info?api-version=2021-09-30-preview
+GET https://{your-form-recognizer-endpoint}/formrecognizer/info?api-version=2022-01-30-preview
``` ***Sample response***
In this migration guide, you've learned how to upgrade your existing Form Recogn
* [Review the new REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-1/operations/AnalyzeDocument) * [What is Form Recognizer?](overview.md)
-* [Form Recognizer quickstart](./quickstarts/try-sdk-rest-api.md)
+* [Form Recognizer quickstart](./quickstarts/try-sdk-rest-api.md)
applied-ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/whats-new.md
Previously updated : 12/02/2021 Last updated : 02/15/2022
Form Recognizer service is updated on an ongoing basis. Bookmark this page to stay up to date with release notes, feature enhancements, and documentation updates.
+## February 2022
+
+### Form Recognizer v3.0 preview release
+
+ Form Recognizer v3.0 preview release introduces several new features and capabilities and enhances existing ones:
+
+* [**Form Recognizer Studio**](https://formrecognizer.appliedai.azure.com) adds new demos for Read, W-2, and hotel receipt samples, and support for training the new custom neural models.
+* [🆕 **W-2 prebuilt model**](concept-w2.md) is a new prebuilt model to extract fields from W-2 tax documents.
+* [🆕 **Read**](concept-read.md) API extracts text lines, words, their locations, detected languages, and handwritten style if detected.
+* [🆕 **Custom neural model**](concept-custom-neural.md) is a new custom model to extract text and selection marks from structured forms and **unstructured documents**.
+* [**Language Expansion**](language-support.md) Form Recognizer Read, Layout, and Custom Form add support for 42 new languages including Arabic, Hindi, and other languages using Arabic and Devanagari scripts to expand the coverage to 164 languages. Handwritten support for the same features expands to Japanese and Korean in addition to English, Chinese Simplified, French, German, Italian, Portuguese, and Spanish languages.
+* [**Invoice API**](language-support.md#invoice-model) expands support to Spanish invoices.
+* [**General document**](concept-general-document.md) pre-trained model is now updated to support selection marks in addition to text, tables, structure, key-value pairs, and named entities from forms and documents.
+
+Get started with the new [REST API](https://westus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v3-0-preview-2/operations/AnalyzeDocument), [Python](quickstarts/try-v3-python-sdk.md), or [.NET](quickstarts/try-v3-csharp-sdk.md) SDK for the v3.0 preview API.
+
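As a minimal REST sketch (not using the SDKs), the call that starts an analyze operation against the new Read model can be constructed like this. The endpoint, key, and document URL are placeholders, and only the request object is built; nothing is sent.

```python
import json
import urllib.request

def build_analyze_request(endpoint, key, model_id, document_url,
                          api_version="2022-01-30-preview"):
    """Construct (without sending) the POST that starts an analyze
    operation; the caller would then poll the Operation-Location header
    of the response. endpoint, key, and document_url are placeholders."""
    url = (f"{endpoint}/formrecognizer/documentModels/{model_id}"
           f":analyze?api-version={api_version}")
    body = json.dumps({"urlSource": document_url}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/json",
        },
    )
```

Passing the built request to `urllib.request.urlopen` would start the operation; the SDK quickstarts above wrap this same call pattern.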
+#### Form Recognizer model data extraction
+
+ | **Model** | **Text extraction** | **Key-Value pairs** | **Selection Marks** | **Tables** | **Entities** |
+ |---|:---:|:---:|:---:|:---:|:---:|
+ | 🆕 Read | ✓ | | | | |
+ | 🆕 General document | ✓ | ✓ | ✓ | ✓ | ✓ |
+ | Layout | ✓ | | ✓ | ✓ | |
+ | Invoice | ✓ | ✓ | ✓ | ✓ | |
+ | Receipt | ✓ | ✓ | | | |
+ | ID document | ✓ | ✓ | | | |
+ | Business card | ✓ | ✓ | | | |
+ | Custom | ✓ | ✓ | ✓ | ✓ | ✓ |
+ ## November 2021

### Form Recognizer v3.0 preview SDK release update (beta.2)
pip package version 3.1.0b4
* **REST API reference is available** - View the [v2.1-preview.1 reference](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-1/operations/AnalyzeBusinessCardAsync) * **New languages supported** - In addition to English, the following [languages](language-support.md) are now supported for `Layout` and `Train Custom Model`: English (`en`), Chinese (Simplified) (`zh-Hans`), Dutch (`nl`), French (`fr`), German (`de`), Italian (`it`), Portuguese (`pt`), and Spanish (`es`).
-* **Checkbox / Selection Mark detection** – Form Recognizer supports detection and extraction of selection marks such as check boxes and radio buttons. Selection Marks are extracted in `Layout` and you can now also label and train in `Train Custom Model` - _Train with Labels_ to extract key value pairs for selection marks.
+* **Checkbox / Selection Mark detection** – Form Recognizer supports detection and extraction of selection marks such as check boxes and radio buttons. Selection Marks are extracted in `Layout` and you can now also label and train in `Train Custom Model` - _Train with Labels_ to extract key-value pairs for selection marks.
* **Model Compose** - allows multiple models to be composed and called with a single model ID. When you submit a document to be analyzed with a composed model ID, a classification step is first performed to route it to the correct custom model. Model Compose is available for `Train Custom Model` - _Train with labels_. * **Model name** - add a friendly name to your custom models for easier management and tracking. * **[New pre-built model for Business Cards](./concept-business-card.md)** for extracting common fields in English-language business cards.
pip package version 3.1.0b4
* [Python SDK](/python/api/overview/azure/ai-formrecognizer-readme) * [JavaScript SDK](/javascript/api/overview/azure/ai-form-recognizer-readme)
- The new SDK supports all the features of the v2.0 REST API for Form Recognizer. For example, you can train a model with or without labels and extract text, key value pairs and tables from your forms, extract data from receipts with the pre-built receipts service and extract text and tables with the layout service from your documents. You can share your feedback on the SDKs through the [SDK Feedback form](https://aka.ms/FR_SDK_v1_feedback).
+ The new SDK supports all the features of the v2.0 REST API for Form Recognizer. For example, you can train a model with or without labels and extract text, key-value pairs and tables from your forms, extract data from receipts with the pre-built receipts service and extract text and tables with the layout service from your documents. You can share your feedback on the SDKs through the [SDK Feedback form](https://aka.ms/FR_SDK_v1_feedback).
* **Copy Custom Model** You can now copy models between regions and subscriptions using the new Copy Custom Model feature. Before invoking the Copy Custom Model API, you must first obtain authorization to copy into the target resource by calling the Copy Authorization operation against the target resource endpoint.
automanage Automanage Virtual Machines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automanage/automanage-virtual-machines.md
In the Machine selection pane in the portal, you will notice the **Eligibility**
- Machine is not located in a supported [region](#supported-regions) - Machine's log analytics workspace is not located in a supported [region](#supported-regions) - User does not have permissions to the log analytics workspace's subscription. Check out the [required permissions](#required-rbac-permissions)-- The Automanage resource provider is not registered on the subscription. Check out [how to register a Resource Provider](/azure/azure-resource-manager/management/resource-providers-and-types#register-resource-provider-1) with the Automanage resource provider: *Microsoft.Automanage*-- Machine does not have necessary VM agents installed which the Automanage service requires. Check out the [Windows agent installation](/azure/virtual-machines/extensions/agent-windows) and the [Linux agent installation](/azure/virtual-machines/extensions/agent-linux)-- Arc machine is not connected. Learn more about the [Arc agent status](/azure/azure-arc/servers/overview#agent-status) and [how to connect](/azure/azure-arc/servers/agent-overview#connected-machine-agent-technical-overview)
+- The Automanage resource provider is not registered on the subscription. Check out [how to register a Resource Provider](../azure-resource-manager/management/resource-providers-and-types.md#register-resource-provider-1) with the Automanage resource provider: *Microsoft.Automanage*
+- Machine does not have necessary VM agents installed which the Automanage service requires. Check out the [Windows agent installation](../virtual-machines/extensions/agent-windows.md) and the [Linux agent installation](../virtual-machines/extensions/agent-linux.md)
+- Arc machine is not connected. Learn more about the [Arc agent status](../azure-arc/servers/overview.md#agent-status) and [how to connect](../azure-arc/servers/agent-overview.md#connected-machine-agent-technical-overview)
Once you have selected your eligible machines, Click **Enable**, and you're done.
In this article, you learned that Automanage for machines provides a means for w
Try enabling Automanage for Azure virtual machines or Arc-enabled servers in the Azure portal. > [!div class="nextstepaction"]
-> [Enable Automanage for virtual machines in the Azure portal](quick-create-virtual-machines-portal.md)
+> [Enable Automanage for virtual machines in the Azure portal](quick-create-virtual-machines-portal.md)
automation Add User Assigned Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/add-user-assigned-identity.md
If you don't have an Azure subscription, create a [free account](https://azure.m
- Windows Hybrid Runbook Worker: version 7.3.1125.0 - Linux Hybrid Runbook Worker: version 1.7.4.0 -- To assign an Azure role, you must have ```Microsoft.Authorization/roleAssignments/write``` permissions, such as [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](/azure/role-based-access-control/built-in-roles#owner).
+- To assign an Azure role, you must have ```Microsoft.Authorization/roleAssignments/write``` permissions, such as [User Access Administrator](../role-based-access-control/built-in-roles.md#user-access-administrator) or [Owner](../role-based-access-control/built-in-roles.md#owner).
## Add user-assigned managed identity for Azure Automation account
print(response.text)
- If you need to disable a managed identity, see [Disable your Azure Automation account managed identity](disable-managed-identity-for-automation.md). -- For an overview of Azure Automation account security, see [Automation account authentication overview](automation-security-overview.md).
+- For an overview of Azure Automation account security, see [Automation account authentication overview](automation-security-overview.md).
automation Automation Create Standalone Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-create-standalone-account.md
Review your new Automation account.
:::image type="content" source="./media/automation-create-standalone-account/automation-account-overview.png" alt-text="Automation account overview page":::
-When the Automation account is successfully created, several resources are automatically created for you. After creation, these runbooks can be safely deleted if you do not wish to keep them. The managed identities can be used to authenticate to your account in a runbook, and should be left unless you create another one or do not require them. The following table summarizes resources for the account.
+When the Automation account is successfully created, several resources are automatically created for you. After creation, these runbooks can be safely deleted if you do not wish to keep them. The managed identities can be used to authenticate to your account in a runbook, and should be left unless you create another one or do not require them. The Automation access keys are also created during Automation account creation. The following table summarizes resources for the account.
|Resource |Description | ||||
When the Automation account is successfully created, several resources are autom
> [!NOTE] > The tutorial runbooks have not been updated to authenticate using a managed identity. Review the [Using system-assigned identity](enable-managed-identity-for-automation.md#assign-role-to-a-system-assigned-managed-identity) or [Using user-assigned identity](add-user-assigned-identity.md#assign-a-role-to-a-user-assigned-managed-identity) to learn how to grant the managed identity access to resources and configure your runbooks to authenticate using either type of managed identity.
+## Manage Automation account keys
+
+When you create an Automation account, Azure generates two 512-bit automation account access keys for that account. These keys are shared access keys that are used as registration keys for registering [DSC nodes](/azure/automation/automation-dsc-onboarding#use-dsc-metaconfiguration-to-register-hybrid-machines) as well as [Windows](/azure/automation/automation-windows-hrw-install#manual-deployment) and [Linux](/azure/automation/automation-linux-hrw-install#manually-run-powershell-commands) Hybrid runbook workers. These keys are only used while registering DSC nodes and Hybrid workers. Existing machines configured as DSC nodes or hybrid workers won't be affected after rotation of these keys.
+
+### View Automation account keys
+
+To view and copy your Automation account access keys, follow these steps:
+1. In the [Azure portal](https://portal.azure.com/), go to your Automation account.
+1. Under **Account Settings**, select **Keys** to view your Automation account's primary and secondary access keys.
+You can use either of the two keys to access your Automation account. However, we recommend that you use the first key and reserve the use of the second key.
+
+ :::image type="content" source="./media/automation-create-standalone-account/automation-demo-keys-inline.png" alt-text="Automation Keys page" lightbox="./media/automation-create-standalone-account/automation-demo-keys-expanded.png" :::
+
+### Manually rotate access keys
+
+We recommend that you rotate your access keys periodically to keep the Automation account secure. Because you have two access keys, you can rotate them by using the Azure portal or an Azure PowerShell cmdlet.
+
+Choose a client
+
+# [Azure portal](#tab/azureportal)
+
+Follow these steps:
+1. Go to your Automation account in [Azure portal](https://portal.azure.com/).
+1. Under **Account Settings**, select **Keys**.
+1. Select **Regenerate primary** to regenerate the primary access key for your Automation account.
+1. Select **Regenerate secondary** to regenerate the secondary access key.
+ :::image type="content" source="./media/automation-create-standalone-account/regenerate-keys.png" alt-text="Regenerate keys":::
+
+# [Azure PowerShell](#tab/azurepowershell)
+
+Run the [New-AzAutomationKey](/powershell/module/az.automation/new-azautomationkey) command to regenerate the primary access key, as shown in the following example:
+
+ ```azurepowershell
+ New-AzAutomationKey -KeyType Primary -ResourceGroup <ResourceGroup> -AutomationAccountName <AutomationAccount>
+ ```
+
+
+### View registration URL
+The DSC node registers with the State Configuration service using the registration URL and authenticates using a registration access key along with the Automation Account access keys.
++ ## Next steps * To get started with PowerShell runbooks, see [Tutorial: Create a PowerShell runbook](./learn/powershell-runbook-managed-identity.md).
automation Automation Linux Hrw Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-linux-hrw-install.md
To remove a Hybrid Runbook Worker group of Linux machines, you use the same step
## Manage Role permissions for Hybrid Worker Groups and Hybrid Workers
-You can create custom Azure Automation roles and grant following permissions to Hybrid Worker Groups and Hybrid Workers. To learn more about how to create Azure Automation custom roles, see [Azure custom roles](/azure/role-based-access-control/custom-roles)
+You can create custom Azure Automation roles and grant the following permissions to Hybrid Worker Groups and Hybrid Workers. To learn more about how to create Azure Automation custom roles, see [Azure custom roles](../role-based-access-control/custom-roles.md)
**Actions** | **Description** |
automation Automation Windows Hrw Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-windows-hrw-install.md
This process can take several seconds to finish. You can track its progress unde
## Manage Role permissions for Hybrid Worker Groups and Hybrid Workers
-You can create custom Azure Automation roles and grant following permissions to Hybrid Worker Groups and Hybrid Workers. To learn more about how to create Azure Automation custom roles, see [Azure custom roles](/azure/role-based-access-control/custom-roles)
+You can create custom Azure Automation roles and grant the following permissions to Hybrid Worker Groups and Hybrid Workers. To learn more about how to create Azure Automation custom roles, see [Azure custom roles](../role-based-access-control/custom-roles.md)
**Actions** | **Description** |
automation Enable Managed Identity For Automation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/enable-managed-identity-for-automation.md
If you don't have an Azure subscription, create a [free account](https://azure.m
- Windows Hybrid Runbook Worker: version 7.3.1125.0 - Linux Hybrid Runbook Worker: version 1.7.4.0 -- To assign an Azure role, you must have ```Microsoft.Authorization/roleAssignments/write``` permissions, such as [User Access Administrator](/azure/role-based-access-control/built-in-roles#user-access-administrator) or [Owner](/azure/role-based-access-control/built-in-roles#owner).
+- To assign an Azure role, you must have ```Microsoft.Authorization/roleAssignments/write``` permissions, such as [User Access Administrator](../role-based-access-control/built-in-roles.md#user-access-administrator) or [Owner](../role-based-access-control/built-in-roles.md#owner).
## Enable a system-assigned managed identity for an Azure Automation account
Azure Automation provided authentication for managing Azure Resource Manager res
- If you need to disable a managed identity, see [Disable your Azure Automation account managed identity](disable-managed-identity-for-automation.md). -- For an overview of Azure Automation account security, see [Automation account authentication overview](automation-security-overview.md).
+- For an overview of Azure Automation account security, see [Automation account authentication overview](automation-security-overview.md).
automation Extension Based Hybrid Runbook Worker Install https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/extension-based-hybrid-runbook-worker-install.md
Review the parameters used in this template.
### Prerequisites
-You would require an Azure VM or Arc-enabled server. You can follow the steps [here](/azure/azure-arc/servers/onboard-portal) to create an Arc connected machine.
+You need an Azure VM or an Arc-enabled server. Follow the steps [here](../azure-arc/servers/onboard-portal.md) to create an Arc-connected machine.
### Install and use Hybrid Worker extension using REST API
To install and use Hybrid Worker extension using REST API, follow these steps. T
## Manage Role permissions for Hybrid Worker Groups
-You can create custom Azure Automation roles and grant following permissions to Hybrid Worker Groups. To learn more about how to create Azure Automation custom roles, see [Azure custom roles](/azure/role-based-access-control/custom-roles).
+You can create custom Azure Automation roles and grant the following permissions to Hybrid Worker Groups. To learn more about how to create Azure Automation custom roles, see [Azure custom roles](../role-based-access-control/custom-roles.md).
**Actions** | **Description** |
Microsoft.Automation/automationAccounts/hybridRunbookWorkerGroups/delete | Delet
* To learn how to configure your runbooks to automate processes in your on-premises datacenter or other cloud environment, see [Run runbooks on a Hybrid Runbook Worker](automation-hrw-run-runbooks.md).
-* To learn how to troubleshoot your Hybrid Runbook Workers, see [Troubleshoot Hybrid Runbook Worker issues](troubleshoot/extension-based-hybrid-runbook-worker.md).
+* To learn how to troubleshoot your Hybrid Runbook Workers, see [Troubleshoot Hybrid Runbook Worker issues](troubleshoot/extension-based-hybrid-runbook-worker.md).
automation Hybrid Runbook Worker https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/troubleshoot/hybrid-runbook-worker.md
You might also need to update the date or time zone of your computer. If you sel
##### Log Analytics gateway not configured
-Follow the steps mentioned [here](/azure/azure-monitor/agents/gateway#configure-for-automation-hybrid-runbook-workers) to add Hybrid Runbook Worker endpoints to the Log Analytics Gateway.
+Follow the steps mentioned [here](../../azure-monitor/agents/gateway.md#configure-for-automation-hybrid-runbook-workers) to add Hybrid Runbook Worker endpoints to the Log Analytics Gateway.
### <a name="set-azstorageblobcontent-execution-fails"></a>Scenario: Set-AzStorageBlobContent fails on a Hybrid Runbook Worker
automation Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/whats-new.md
This page is updated monthly, so revisit it regularly. If you're looking for ite
**Type:** New feature
-New scripts are added to the Azure Automation [GitHub repository](https://github.com/azureautomation) to address one of Azure Automation's key scenarios of VM management based on Azure Monitor alert. For more information, see [Trigger runbook from Azure alert](/azure/automation/automation-create-alert-triggered-runbook).
+New scripts are added to the Azure Automation [GitHub repository](https://github.com/azureautomation) to address one of Azure Automation's key scenarios of VM management based on Azure Monitor alert. For more information, see [Trigger runbook from Azure alert](./automation-create-alert-triggered-runbook.md).
- Stop-Azure-VM-On-Alert - Restart-Azure-VM-On-Alert
New scripts are added to the Azure Automation [GitHub repository](https://github
**Type:** New feature
-Azure Automation now supports Managed Identities in Azure public, Azure Gov, and Azure China cloud. [System Assigned Managed Identities](/azure/automation/enable-managed-identity-for-automation) is supported for cloud as well as hybrid jobs, while [User Assigned Managed Identities](/azure/automation/automation-security-overview#managed-identities-preview) is supported only for cloud jobs. Read the [announcement](https://azure.microsoft.com/updates/azure-automation-managed-identities-ga/) for more information.
+Azure Automation now supports Managed Identities in Azure public, Azure Gov, and Azure China cloud. [System Assigned Managed Identities](./enable-managed-identity-for-automation.md) is supported for cloud as well as hybrid jobs, while [User Assigned Managed Identities](./automation-security-overview.md) is supported only for cloud jobs. Read the [announcement](https://azure.microsoft.com/updates/azure-automation-managed-identities-ga/) for more information.
### Preview support for PowerShell 7.1
availability-zones Az Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/availability-zones/az-overview.md
Datacenter locations are selected by using rigorous vulnerability risk assessmen
With availability zones, you can design and operate applications and databases that automatically transition between zones without interruption. Azure availability zones are highly available, fault tolerant, and more scalable than traditional single or multiple datacenter infrastructures.
-Each data center is assigned to a physical zone. Physical zones are mapped to logical zones in your Azure subscription. Azure subscriptions are automatically assigned this mapping at the time a subscription is created. You can use the dedicated ARM API called: [checkZonePeers](https://docs.microsoft.com/rest/api/resources/subscriptions/check-zone-peers) to compare zone mapping for resilient solutions that span across multiple subscriptions.
+Each data center is assigned to a physical zone. Physical zones are mapped to logical zones in your Azure subscription. Azure subscriptions are automatically assigned this mapping at the time a subscription is created. You can use the dedicated ARM API [checkZonePeers](/rest/api/resources/subscriptions/check-zone-peers) to compare zone mapping for resilient solutions that span multiple subscriptions.
You can design resilient solutions by using Azure services that use availability zones. Co-locate your compute, storage, networking, and data resources across an availability zone, and replicate this arrangement in other availability zones.
Azure provides the most extensive global footprint of any cloud provider and is
- [Microsoft commitment to expand Azure availability zones to more regions](https://azure.microsoft.com/blog/our-commitment-to-expand-azure-availability-zones-to-more-regions/) - [Azure services that support availability zones](az-region.md)-- [Azure services](region-types-service-categories-azure.md)-
+- [Azure services](region-types-service-categories-azure.md)
azure-arc Monitor Grafana Kibana https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/monitor-grafana-kibana.md
# View logs and metrics using Kibana and Grafana
-Kibana and Grafana web dashboards are provided to bring insight and clarity to the Kubernetes namespaces being used by Azure Arc-enabled data services. To access Kibana and Grafana web dashboards view service endpoints check [Azure Data Studio dashboards](/azure/azure-arc/data/azure-data-studio-dashboards) documentation.
+Kibana and Grafana web dashboards are provided to bring insight and clarity to the Kubernetes namespaces being used by Azure Arc-enabled data services. To access Kibana and Grafana web dashboards view service endpoints check [Azure Data Studio dashboards](./azure-data-studio-dashboards.md) documentation.
az network nsg rule create -n ports_30777 --nsg-name azurearcvmNSG --priority 60
- [Introduction](https://www.elastic.co/webinars/getting-started-kibana?baymax=default&elektra=docs&storm=top-video) - [Kibana guide](https://www.elastic.co/guide/en/kibana/current/https://docsupdatetracker.net/index.html) - [Introduction to dashboard drilldowns with data visualizations in Kibana](https://www.elastic.co/webinars/dashboard-drilldowns-with-data-visualizations-in-kibana/)
- - [How to build Kibana dashboards](https://www.elastic.co/webinars/how-to-build-kibana-dashboards/)
+ - [How to build Kibana dashboards](https://www.elastic.co/webinars/how-to-build-kibana-dashboards/)
azure-arc Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/kubernetes/faq.md
Title: "Azure Arc-enabled Kubernetes frequently asked questions"
+ Title: "Azure Arc-enabled Kubernetes and GitOps frequently asked questions"
Previously updated : 02/19/2021 Last updated : 02/15/2022 --
-description: "This article contains a list of frequently asked questions related to Azure Arc-enabled Kubernetes"
-keywords: "Kubernetes, Arc, Azure, containers, configuration, GitOps, faq"
++
+description: "This article contains a list of frequently asked questions related to Azure Arc-enabled Kubernetes and Azure GitOps"
+keywords: "Kubernetes, Arc, Azure, containers, configuration, GitOps, Flux, faq"
# Frequently Asked Questions - Azure Arc-enabled Kubernetes
If the value of `managedIdentityCertificateExpirationTime` indicates a timestamp
> [!NOTE] > `az connectedk8s delete` will also delete configurations and cluster extensions on top of the cluster. After running `az connectedk8s connect`, recreate the configurations and cluster extensions on the cluster, either manually or using Azure Policy.
-## If I am already using CI/CD pipelines, can I still use Azure Arc-enabled Kubernetes and configurations?
+## If I am already using CI/CD pipelines, can I still use Azure Arc-enabled Kubernetes or AKS and GitOps configurations?
-Yes, you can still use configurations on a cluster receiving deployments via a CI/CD pipeline. Compared to traditional CI/CD pipelines, configurations feature two extra benefits:
+Yes, you can still use configurations on a cluster receiving deployments via a CI/CD pipeline. Compared to traditional CI/CD pipelines, GitOps configurations feature some extra benefits:
**Drift reconciliation**
The CI/CD pipeline applies changes only once during pipeline run. However, the G
CI/CD pipelines are useful for event-driven deployments to your Kubernetes cluster (for example, a push to a Git repository). However, if you want to deploy the same configuration to all of your Kubernetes clusters, you would need to manually configure each Kubernetes cluster's credentials to the CI/CD pipeline.
-For Azure Arc-enabled Kubernetes, since Azure Resource Manager manages your configurations, you can automate creating the same configuration across all Azure Arc-enabled Kubernetes resources using Azure Policy, within scope of a subscription or a resource group. This capability is even applicable to Azure Arc-enabled Kubernetes resources created after the policy assignment.
+For Azure Arc-enabled Kubernetes, since Azure Resource Manager manages your GitOps configurations, you can automate creating the same configuration across all Azure Arc-enabled Kubernetes and AKS resources using Azure Policy, within scope of a subscription or a resource group. This capability is even applicable to Azure Arc-enabled Kubernetes and AKS resources created after the policy assignment.
This feature applies baseline configurations (like network policies, role bindings, and pod security policies) across the entire Kubernetes cluster inventory to meet compliance and governance requirements.
+**Cluster compliance**
+
+The compliance state of each GitOps configuration is reported back to Azure. This lets you keep track of any failed deployments.
+
+## Error installing the microsoft.flux extension (Flux v2)
+
+The `microsoft.flux` extension installs the Flux controllers and Azure GitOps agents into your Azure Arc-enabled Kubernetes or AKS clusters. If you experience an error during installation, the following troubleshooting actions can help.
+
+* Error message
+
+ ```console
+ {'code':'DeploymentFailed','message':'At least one resource deployment operation failed. Please list deployment operations for details. Please see https://aka.ms/DeployOperations for usage details.','details':[{'code':'ExtensionCreationFailed','message':' Request failed to https://management.azure.com/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.ContainerService/managedclusters/<CLUSTER_NAME>/extensionaddons/flux?api-version=2021-03-01. Error code: BadRequest. Reason: Bad Request'}]}
+ ```
+
+* For AKS clusters, ensure that the subscription has the following feature flag enabled: `Microsoft.ContainerService/AKS-ExtensionManager`.
+
+ ```console
+ az feature register --namespace Microsoft.ContainerService --name AKS-ExtensionManager
+ ```
+
+* Force delete the extension.
+
+ ```console
+ az k8s-extension delete --force -g <RESOURCE_GROUP> -c <CLUSTER_NAME> -n flux -t <managedClusters OR connectedClusters>
+ ```
+
+* Ensure that the cluster does not have any policies that restrict creation of the `flux-system` namespace or resources in that namespace.
+
+After you have verified the above, you can re-install the extension.
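The force-delete step varies only in the cluster type argument. As a minimal sketch (the helper below is hypothetical; the `az k8s-extension delete` arguments it emits are the ones shown in the step above):

```python
# Sketch only: build_force_delete is a hypothetical helper; the az CLI arguments
# it emits come from the force-delete troubleshooting step above.
def build_force_delete(resource_group, cluster_name, cluster_type):
    """cluster_type is 'managedClusters' for AKS, 'connectedClusters' for Azure Arc."""
    if cluster_type not in ("managedClusters", "connectedClusters"):
        raise ValueError(f"unsupported cluster type: {cluster_type}")
    return (
        f"az k8s-extension delete --force -g {resource_group} "
        f"-c {cluster_name} -n flux -t {cluster_type}"
    )
```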
+ ## Does Azure Arc-enabled Kubernetes store any customer data outside of the cluster's region? The feature to enable storing customer data in a single region is currently only available in the Southeast Asia Region (Singapore) of the Asia Pacific Geo and Brazil South (Sao Paulo State) Region of Brazil Geo. For all other regions, customer data is stored in Geo. For more information, see [Trust Center](https://azure.microsoft.com/global-infrastructure/data-residency/).
## Next steps

* Walk through our quickstart to [connect a Kubernetes cluster to Azure Arc](./quickstart-connect-cluster.md).
-* Already have a Kubernetes cluster connected Azure Arc? [Create configurations on your Azure Arc-enabled Kubernetes cluster](./tutorial-use-gitops-connected-cluster.md).
+* Already have an AKS cluster or an Azure Arc-enabled Kubernetes cluster? [Create GitOps configurations on your Azure Arc-enabled Kubernetes cluster](./tutorial-use-gitops-flux2.md).
+* Learn how to [set up a CI/CD pipeline with GitOps](./tutorial-gitops-flux2-ci-cd.md).
* Learn how to [use Azure Policy to apply configurations at scale](./use-azure-policy.md).
azure-arc Onboard Configuration Manager Custom Task https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/onboard-configuration-manager-custom-task.md
To verify that the machines have been successfully connected to Azure Arc, verif
- Review the [Planning and deployment guide](plan-at-scale-deployment.md) to plan for deploying Azure Arc-enabled servers at any scale and implement centralized management and monitoring. - Review connection troubleshooting information in the [Troubleshoot Connected Machine agent guide](troubleshoot-agent-onboard.md).-- Learn how to manage your machine using [Azure Policy](/azure/governance/policy/overview) for such things as VM [guest configuration](/azure/governance/policy/concepts/guest-configuration), verifying that the machine is reporting to the expected Log Analytics workspace, enabling monitoring with [VM insights](/azure/azure-monitor/vm/vminsights-enable-policy), and much more.
+- Learn how to manage your machine using [Azure Policy](../../governance/policy/overview.md) for such things as VM [guest configuration](../../governance/policy/concepts/guest-configuration.md), verifying that the machine is reporting to the expected Log Analytics workspace, enabling monitoring with [VM insights](../../azure-monitor/vm/vminsights-enable-policy.md), and much more.
azure-arc Onboard Configuration Manager Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/onboard-configuration-manager-powershell.md
The script status monitoring will indicate whether the script has successfully i
- Review the [Planning and deployment guide](plan-at-scale-deployment.md) to plan for deploying Azure Arc-enabled servers at any scale and implement centralized management and monitoring. - Review connection troubleshooting information in the [Troubleshoot Connected Machine agent guide](troubleshoot-agent-onboard.md).-- Learn how to manage your machine using [Azure Policy](/azure/governance/policy/overview) for such things as VM [guest configuration](/azure/governance/policy/concepts/guest-configuration), verifying that the machine is reporting to the expected Log Analytics workspace, enabling monitoring with [VM insights](/azure/azure-monitor/vm/vminsights-enable-policy), and much more.
+- Learn how to manage your machine using [Azure Policy](../../governance/policy/overview.md) for such things as VM [guest configuration](../../governance/policy/concepts/guest-configuration.md), verifying that the machine is reporting to the expected Log Analytics workspace, enabling monitoring with [VM insights](../../azure-monitor/vm/vminsights-enable-policy.md), and much more.
azure-arc Onboard Service Principal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/onboard-service-principal.md
Title: Connect hybrid machines to Azure at scale description: In this article, you learn how to connect machines to Azure using Azure Arc-enabled servers using a service principal. Previously updated : 08/17/2021 Last updated : 02/10/2022
You can enable Azure Arc-enabled servers for multiple Windows or Linux machines in your environment with several flexible options depending on your requirements. Using the template script we provide, you can automate every step of the installation, including establishing the connection to Azure Arc. However, you are required to interactively execute this script with an account that has elevated permissions on the target machine and in Azure.
-To connect the machines to Azure Arc-enabled servers, you can use an Azure Active Directory [service principal](../../active-directory/develop/app-objects-and-service-principals.md) instead of using your privileged identity to [interactively connect the machine](onboard-portal.md). A service principal is a special limited management identity that is granted only the minimum permission necessary to connect machines to Azure using the `azcmagent` command. This is safer than using a higher privileged account like a Tenant Administrator, and follows our access control security best practices. The service principal is used only during onboarding, it is not used for any other purpose.
+To connect the machines to Azure Arc-enabled servers, you can use an Azure Active Directory [service principal](../../active-directory/develop/app-objects-and-service-principals.md) instead of using your privileged identity to [interactively connect the machine](onboard-portal.md). This service principal is a special limited management identity that is granted only the minimum permission necessary to connect machines to Azure using the `azcmagent` command. This is safer than using a higher privileged account like a Tenant Administrator, and follows our access control security best practices. The service principal is used only during onboarding; it is not used for any other purpose.
-The installation methods to install and configure the Connected Machine agent requires that the automated method you use has administrator permissions on the machines. On Linux, by using the root account and on Windows, as a member of the Local Administrators group.
+The installation methods to install and configure the Connected Machine agent requires that the automated method you use has administrator permissions on the machines: on Linux by using the root account, and on Windows as a member of the Local Administrators group.
Before you get started, be sure to review the [prerequisites](agent-overview.md#prerequisites) and verify that your subscription and resources meet the requirements. For information about supported regions and other related considerations, see [supported Azure regions](overview.md#supported-regions). Also review our [at-scale planning guide](plan-at-scale-deployment.md) to understand the design and deployment criteria, as well as our management and monitoring recommendations. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-## Create a Service Principal for onboarding at scale
+## Create a service principal for onboarding at scale
-You can use [Azure PowerShell](/powershell/azure/install-az-ps) to create a service principal with the [New-AzADServicePrincipal](/powershell/module/Az.Resources/New-AzADServicePrincipal) cmdlet. Or you can follow the steps listed under [Create a Service Principal using the Azure portal](../../active-directory/develop/howto-create-service-principal-portal.md) to complete this task.
+You can create a service principal in the Azure portal or by using Azure PowerShell.
> [!NOTE]
-> Before you create a service principal, your account must be a member of the **Owner** or **User Access Administrator** role in the subscription that you want to use for onboarding. If you don't have sufficient permissions to configure role assignments, the service principal might be created, but it won't be able to onboard machines.
->
+> To create a service principal and assign roles, your account must be a member of the **Owner** or **User Access Administrator** role in the subscription that you want to use for onboarding. If you don't have sufficient permissions to configure role assignments, the service principal might still be created, but it won't be able to onboard machines.
-To create the service principal using PowerShell, perform the following steps.
+### Azure portal
+
+The Azure Arc service in the Azure portal provides a streamlined way to create a service principal that can be used to connect your hybrid machines to Azure.
+
+1. In the Azure portal, navigate to Azure Arc, then select **Service principals** in the left menu.
+1. Select **Add**.
+1. Enter a name for your service principal.
+1. Choose whether the service principal will have access to an entire subscription, or only to a specific resource group.
+1. Select the subscription (and resource group, if applicable) to which the service principal will have access.
+1. In the **Client secret** section, select the duration for which your generated client secret will be in use. You can optionally enter a friendly name of your choice in the **Description** field.
+1. In the **Role assignment** section, select **Azure Connected Machine Onboarding**.
+1. Select **Create**.
++
+### Azure PowerShell
+
+You can use [Azure PowerShell](/powershell/azure/install-az-ps) to create a service principal with the [New-AzADServicePrincipal](/powershell/module/Az.Resources/New-AzADServicePrincipal) cmdlet.
1. Run the following command. You must store the output of the [`New-AzADServicePrincipal`](/powershell/module/az.resources/new-azadserviceprincipal) cmdlet in a variable, or you will not be able to retrieve the password needed in a later step.
To create the service principal using PowerShell, perform the following steps.
$credential.GetNetworkCredential().password ```
-3. In the output, find the password value under the field **password** and copy it. Also find the value under the field **ApplicationId** and copy it also. Save them for later in a secure place. If you forget or lose your service principal password, you can reset it using the [`New-AzADSpCredential`](/powershell/module/az.resources/new-azadspcredential) cmdlet.
+3. In the output, find the values for the fields **password** and **ApplicationId**. You'll need these values later, so save them in a secure place. If you forget or lose your service principal password, you can reset it using the [`New-AzADSpCredential`](/powershell/module/az.resources/new-azadspcredential) cmdlet.
The values from the following properties are used with parameters passed to the `azcmagent`:
-* The value from the **ApplicationId** property is used for the `--service-principal-id` parameter value
-* The value from the **password** property is used for the `--service-principal-secret` parameter used to connect the agent.
+- The value from the **ApplicationId** property is used for the `--service-principal-id` parameter value
+- The value from the **password** property is used for the `--service-principal-secret` parameter used to connect the agent.
-> [!NOTE]
+> [!TIP]
> Make sure to use the service principal **ApplicationId** property, not the **Id** property.
->
-The **Azure Connected Machine Onboarding** role contains only the permissions required to onboard a machine. You can assign the service principal permission to allow its scope to include a resource group or a subscription. To add role assignment, see [Assign Azure roles using the Azure portal](../../role-based-access-control/role-assignments-portal.md) or [Assign Azure roles using Azure CLI](../../role-based-access-control/role-assignments-cli.md).
+The **Azure Connected Machine Onboarding** role contains only the permissions required to onboard a machine. You can assign the service principal permission to allow its scope to include a resource group or a subscription. To add role assignments, see [Assign Azure roles using the Azure portal](../../role-based-access-control/role-assignments-portal.md) or [Assign Azure roles using Azure CLI](../../role-based-access-control/role-assignments-cli.md).
## Generate the installation script from the Azure portal
Taking the script template created earlier, you can install and configure the Co
The following are the settings that you configure the `azcmagent` command to use for the service principal.
-* `service-principal-id` : The unique identifier (GUID) that represents the application ID of the service principal.
-* `service-principal-secret` | The service principal password.
-* `tenant-id` : The unique identifier (GUID) that represents your dedicated instance of Azure AD.
-* `subscription-id` : The subscription ID (GUID) of your Azure subscription that you want the machines in.
-* `resource-group` : The resource group name where you want your connected machines to belong to.
-* `location` : See [supported Azure regions](overview.md#supported-regions). This location can be the same or different, as the resource group's location.
-* `resource-name` : (*Optional*) Used for the Azure resource representation of your on-premises machine. If you do not specify this value, the machine hostname is used.
+- `service-principal-id` : The unique identifier (GUID) that represents the application ID of the service principal.
+- `service-principal-secret` : The service principal password.
+- `tenant-id` : The unique identifier (GUID) that represents your dedicated instance of Azure AD.
+- `subscription-id` : The subscription ID (GUID) of your Azure subscription that you want the machines in.
+- `resource-group` : The resource group name where you want your connected machines to belong to.
+- `location` : See [supported Azure regions](overview.md#supported-regions). This location can be the same as, or different from, the resource group's location.
+- `resource-name` : (*Optional*) Used for the Azure resource representation of your on-premises machine. If you do not specify this value, the machine hostname is used.
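To illustrate how these settings map onto an `azcmagent connect` invocation, here is a minimal sketch. The helper itself is hypothetical; only the parameter names come from the list above:

```python
# Sketch only: build_azcmagent_connect is a hypothetical helper; the parameter
# names are the azcmagent settings documented above.
def build_azcmagent_connect(settings):
    required = ["service-principal-id", "service-principal-secret", "tenant-id",
                "subscription-id", "resource-group", "location"]
    missing = [k for k in required if k not in settings]
    if missing:
        raise ValueError(f"missing settings: {missing}")
    parts = ["azcmagent connect"]
    for key in required + ["resource-name"]:  # resource-name is optional
        if key in settings:
            parts.append(f'--{key} "{settings[key]}"')
    return " ".join(parts)
```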
You can learn more about the `azcmagent` command-line tool by reviewing the [Azcmagent Reference](./manage-agent.md).

>[!NOTE]
>The Windows PowerShell script only supports running from a 64-bit version of Windows PowerShell.
->
After you install the agent and configure it to connect to Azure Arc-enabled servers, go to the Azure portal to verify that the server has successfully connected. View your machines in the [Azure portal](https://aka.ms/hybridmachineportal).
-![A successful server connection](./media/onboard-portal/arc-for-servers-successful-onboard.png)
+![Screenshot showing a successful server connection in the Azure portal.](./media/onboard-portal/arc-for-servers-successful-onboard.png)
## Next steps -- Troubleshooting information can be found in the [Troubleshoot Connected Machine agent guide](troubleshoot-agent-onboard.md).- - Review the [Planning and deployment guide](plan-at-scale-deployment.md) to plan for deploying Azure Arc-enabled servers at any scale and implement centralized management and monitoring.--- Learn how to manage your machine using [Azure Policy](../../governance/policy/overview.md), for such things as VM [guest configuration](../../governance/policy/concepts/guest-configuration.md), verify the machine is reporting to the expected Log Analytics workspace, enable monitoring with [VM insights](../../azure-monitor/vm/vminsights-enable-policy.md), and much more.
+- Learn how to [troubleshoot agent connection issues](troubleshoot-agent-onboard.md).
+- Learn how to manage your machines using [Azure Policy](../../governance/policy/overview.md) for such things as VM [guest configuration](../../governance/policy/concepts/guest-configuration.md), verifying that machines are reporting to the expected Log Analytics workspace, monitoring with [VM insights](../../azure-monitor/vm/vminsights-enable-policy.md), and more.
azure-arc Quick Start Connect Vcenter To Arc Using Script https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/vmware-vsphere/quick-start-connect-vcenter-to-arc-using-script.md
First, the script deploys a virtual appliance, called [Azure Arc resource bridge
- An external virtual network/switch and internet access, directly or through a proxy.
+> [!NOTE]
+> Azure Arc-enabled VMware vSphere (preview) supports vCenters with a maximum of 2500 VMs. If your vCenter has more than 2500 VMs, we don't recommend using Azure Arc-enabled VMware vSphere with it at this time.
+ ### vSphere accounts

A vSphere account that can:
azure-cache-for-redis Cache How To Redis Cli Tool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-how-to-redis-cli-tool.md
If you want to run the command-line tool on another platform, download open-sour
You can gather the information needed to access the cache using three methods:
-1. Azure CLI using [az redis list-keys](/cli/azure/redis#az_redis_list_keys)
+1. Azure CLI using [az redis list-keys](/cli/azure/redis#az-redis-list-keys)
2. Azure PowerShell using [Get-AzRedisCacheKey](/powershell/module/az.rediscache/Get-AzRedisCacheKey) 3. Using the Azure portal
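Whichever method you use, the hostname and access key then plug into `redis-cli`. A minimal sketch, assuming the standard `-h`, `-p`, `-a`, and `--tls` redis-cli flags (the helper itself is hypothetical; 6380 is the cache's TLS port, 6379 the non-TLS port):

```python
# Sketch only: redis_cli_command is a hypothetical helper. -h/-p/-a/--tls are
# standard redis-cli flags; 6380 is the TLS port, 6379 the non-TLS port.
def redis_cli_command(host, access_key, use_tls=True):
    cmd = ["redis-cli", "-h", host, "-p", "6380" if use_tls else "6379", "-a", access_key]
    if use_tls:
        cmd.append("--tls")  # requires a redis-cli build (6.0+) with TLS support
    return cmd
```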
azure-cache-for-redis Cache Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-cache-for-redis/cache-managed-identity.md
# Managed identity with Azure Cache for Redis (Preview)
-[Managed identities](/azure/active-directory/managed-identities-azure-resources/overview) are a common tool used in Azure to help developers minimize the burden of managing secrets and login information. Managed identities are useful when Azure services connect to each other. Instead of managing authorization between each service, [Azure Active Directory](/azure/active-directory/fundamentals/active-directory-whatis) (Azure AD) can be used to provide a managed identity that makes the authentication process more streamlined and secure.
+[Managed identities](../active-directory/managed-identities-azure-resources/overview.md) are a common tool used in Azure to help developers minimize the burden of managing secrets and login information. Managed identities are useful when Azure services connect to each other. Instead of managing authorization between each service, [Azure Active Directory](../active-directory/fundamentals/active-directory-whatis.md) (Azure AD) can be used to provide a managed identity that makes the authentication process more streamlined and secure.
## Managed identity with storage accounts
Managed identity lets you simplify the process of securely connecting to your ch
> This functionality does not yet support authentication for connecting to a cache instance. >
-Azure Cache for Redis supports [both types of managed identity](/azure/active-directory/managed-identities-azure-resources/overview):
+Azure Cache for Redis supports [both types of managed identity](../active-directory/managed-identities-azure-resources/overview.md):
- **System-assigned identity** is specific to the resource. In this case, the cache is the resource. When the cache is deleted, the identity is deleted.
To use managed identity, you must have a premium-tier cache.
:::image type="content" source="media/cache-managed-identity/identity-add.png" alt-text="User assigned identity status is on":::
-1. A sidebar pops up to allow you to select any available user-assigned identity to your subscription. Choose an identity and select **Add**. For more information on user assigned managed identities, see [manage user-assigned identity](/azure/active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities).
+1. A sidebar pops up to allow you to select any available user-assigned identity in your subscription. Choose an identity and select **Add**. For more information on user-assigned managed identities, see [manage user-assigned identity](../active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities.md).
>[!Note]
- >You need to [create a user assigned identity](/azure/active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities?pivots=identity-mi-methods-azp) in advance of this step.
+ >You need to [create a user assigned identity](../active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities.md?pivots=identity-mi-methods-azp) in advance of this step.
> :::image type="content" source="media/cache-managed-identity/choose-identity.png" alt-text="New object (principal) ID shown for the user-assigned identity":::
Set-AzRedisCache -ResourceGroupName \"MyGroup\" -Name \"MyCache\" -IdentityType
:::image type="content" source="media/cache-managed-identity/blob-data.png" alt-text="Storage blob data contributor list":::

> [!NOTE]
-> Adding an Azure Cache for Redis instance as a storage blog data contributor through system-assigned identity will conveniently add the cache instance to the [trusted services list](/azure/storage/common/storage-network-security?tabs=azure-portal), making firewall exceptions easier to implement.
+> Adding an Azure Cache for Redis instance as a storage blob data contributor through system-assigned identity will conveniently add the cache instance to the [trusted services list](../storage/common/storage-network-security.md?tabs=azure-portal), making firewall exceptions easier to implement.
## Use managed identity to access a storage account
Set-AzRedisCache -ResourceGroupName \"MyGroup\" -Name \"MyCache\" -IdentityType
## Next steps - [Learn more](cache-overview.md#service-tiers) about Azure Cache for Redis features-- [What are managed identifies](/azure/active-directory/managed-identities-azure-resources/overview)
+- [What are managed identities](../active-directory/managed-identities-azure-resources/overview.md)
azure-functions Consumption Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/consumption-plan.md
To learn more about how to estimate costs when running in a Consumption plan, se
## Create a Consumption plan function app
-When you create a function app in the Azure portal, the Consumption plan is the default. When using APIs to create you function app, you don't have to first create an App Service plan as you do with Premium and Dedicated plans.
+When you create a function app in the Azure portal, the Consumption plan is the default. When using APIs to create your function app, you don't have to first create an App Service plan as you do with Premium and Dedicated plans.
Use the following links to learn how to create a serverless function app in a Consumption plan, either programmatically or in the Azure portal:
azure-functions Disable Function https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/disable-function.md
Even when you publish to your function app from a local project, you can still u
# [Azure CLI](#tab/azurecli)
-In the Azure CLI, you use the [`az functionapp config appsettings set`](/cli/azure/functionapp/config/appsettings#az_functionapp_config_appsettings_set) command to create and modify the app setting. The following command disables a function named `QueueTrigger` by creating an app setting named `AzureWebJobs.QueueTrigger.Disabled` and setting it to `true`.
+In the Azure CLI, you use the [`az functionapp config appsettings set`](/cli/azure/functionapp/config/appsettings#az-functionapp-config-appsettings-set) command to create and modify the app setting. The following command disables a function named `QueueTrigger` by creating an app setting named `AzureWebJobs.QueueTrigger.Disabled` and setting it to `true`.
```azurecli-interactive
az functionapp config appsettings set --name <FUNCTION_APP_NAME> \
azure-functions Functions Bindings Storage Table Input https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-bindings-storage-table-input.md
Title: Azure Table storage input bindings for Azure Functions
-description: Understand how to use Azure Table storage input bindings in Azure Functions.
+ Title: Azure Tables input bindings for Azure Functions
+description: Understand how to use Azure Tables input bindings in Azure Functions.
- Previously updated : 09/03/2018 Last updated : 01/23/2022 ms.devlang: csharp, java, javascript, powershell, python
+zone_pivot_groups: programming-languages-set-functions-lang-workers
-# Azure Table storage input bindings for Azure Functions
+# Azure Tables input bindings for Azure Functions
+
+Use the Azure Tables input binding to read a table in an Azure Storage or Cosmos DB account.
-Use the Azure Table storage input binding to read a table in an Azure Storage account.
+For information on setup and configuration details, see the [overview](./functions-bindings-storage-table.md).
## Example
-# [C#](#tab/csharp)
+
+The usage of the binding depends on the extension package version and the C# modality used in your function app, which can be one of the following:
+
+# [In-process](#tab/in-process)
-### One entity
+An in-process class library is a compiled C# function that runs in the same process as the Functions runtime.
+
+# [Isolated process](#tab/isolated-process)
-The following example shows a [C# function](functions-dotnet-class-library.md) that reads a single table row. For every message sent to the queue, the function will be triggered.
+An isolated process class library is a compiled C# function that runs in a process isolated from the runtime. An isolated process is required to support C# functions running on .NET 5.0.
+
+# [C# script](#tab/csharp-script)
+
+C# script is used primarily when creating C# functions in the Azure portal.
++
-The row key value "{queueTrigger}" indicates that the row key comes from the queue message string.
+Choose a version to see examples for the mode and version.
+
+# [Combined Azure Storage extension](#tab/storage-extension/in-process)
+
+The following example shows a [C# function](./functions-dotnet-class-library.md) that reads a single table row. For every message sent to the queue, the function will be triggered.
+
+The row key value `{queueTrigger}` binds the row key to the message metadata, which is the message string.
```csharp public class TableStorage
public class TableStorage
} ```
-### CloudTable
-
-`CloudTable` is only supported in the [Functions v2 and and higher runtimes](functions-versions.md).
- Use a `CloudTable` method parameter to read the table by using the Azure Storage SDK. Here's an example of a function that queries an Azure Functions log table: ```csharp
For more information about how to use CloudTable, see [Get started with Azure Ta
If you try to bind to `CloudTable` and get an error message, make sure that you have a reference to [the correct Storage SDK version](./functions-bindings-storage-table.md#azure-storage-sdk-version-in-functions-1x).
-### IQueryable
+# [Table API extension (preview)](#tab/table-api/in-process)
+
+The following example shows a [C# function](./functions-dotnet-class-library.md) that reads a single table row. For every message sent to the queue, the function will be triggered.
+
+The row key value `{queueTrigger}` binds the row key to the message metadata, which is the message string.
+
+```csharp
+public class TableStorage
+{
+ public class MyPoco : ITableEntity
+ {
+ public string Text { get; set; }
+
+ public string PartitionKey { get; set; }
+ public string RowKey { get; set; }
+ public DateTimeOffset? Timestamp { get; set; }
+ public ETag ETag { get; set; }
+ }
++
+ [FunctionName("TableInput")]
+ public static void TableInput(
+ [QueueTrigger("table-items")] string input,
+ [Table("MyTable", "MyPartition", "{queueTrigger}")] MyPoco poco,
+ ILogger log)
+ {
+ log.LogInformation($"PK={poco.PartitionKey}, RK={poco.RowKey}, Text={poco.Text}");
+ }
+}
+```
+
+Use a `TableClient` method parameter to read the table by using the Azure SDK. Here's an example of a function that queries an Azure Functions log table:
+
+```csharp
+using Microsoft.Azure.WebJobs;
+using Microsoft.Extensions.Logging;
+using Azure.Data.Tables;
+using System;
+using System.Threading.Tasks;
+using Azure;
+namespace FunctionAppCloudTable2
+{
+ public class LogEntity : ITableEntity
+ {
+ public string OriginalName { get; set; }
+
+ public string PartitionKey { get; set; }
+ public string RowKey { get; set; }
+ public DateTimeOffset? Timestamp { get; set; }
+ public ETag ETag { get; set; }
+ }
+ public static class CloudTableDemo
+ {
+ [FunctionName("CloudTableDemo")]
+ public static async Task Run(
+ [TimerTrigger("0 */1 * * * *")] TimerInfo myTimer,
+ [Table("AzureWebJobsHostLogscommon")] TableClient tableClient,
+ ILogger log)
+ {
+ log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
+ AsyncPageable<LogEntity> queryResults = tableClient.QueryAsync<LogEntity>(filter: $"PartitionKey eq 'FD2' and RowKey gt 't'");
+ await foreach (LogEntity entity in queryResults)
+ {
+ log.LogInformation($"{entity.PartitionKey}\t{entity.RowKey}\t{entity.Timestamp}\t{entity.OriginalName}");
+ }
+ }
+ }
+}
+```
+For more information about how to use `TableClient`, see the [Azure.Data.Tables API Reference](/dotnet/api/azure.data.tables.tableclient).
+
+# [Functions 1.x](#tab/functionsv1/in-process)
-`IQueryable` is only supported in the [Functions v1 runtime](functions-versions.md).
+The following example shows a [C# function](./functions-dotnet-class-library.md) that reads a single table row. For every message sent to the queue, the function will be triggered.
+
+The row key value `{queueTrigger}` binds the row key to the message metadata, which is the message string.
+
+```csharp
+public class TableStorage
+{
+ public class MyPoco
+ {
+ public string PartitionKey { get; set; }
+ public string RowKey { get; set; }
+ public string Text { get; set; }
+ }
+
+ [FunctionName("TableInput")]
+ public static void TableInput(
+ [QueueTrigger("table-items")] string input,
+ [Table("MyTable", "MyPartition", "{queueTrigger}")] MyPoco poco,
+ ILogger log)
+ {
+ log.LogInformation($"PK={poco.PartitionKey}, RK={poco.RowKey}, Text={poco.Text}");
+ }
+}
+```
-The following example shows a [C# function](functions-dotnet-class-library.md) that reads multiple table rows where the `MyPoco` class derives from `TableEntity`.
+The following example shows a [C# function](./functions-dotnet-class-library.md) that reads multiple table rows where the `MyPoco` class derives from `TableEntity`.
```csharp public class TableStorage
public class TableStorage
} ```
-# [C# Script](#tab/csharp-script)
+# [Combined Azure Storage extension](#tab/storage-extension/isolated-process)
-### One entity
+The following `MyTableData` class represents a row of data in the table:
-The following example shows a table input binding in a *function.json* file and [C# script](functions-reference-csharp.md) code that uses the binding. The function uses a queue trigger to read a single table row.
-The *function.json* file specifies a `partitionKey` and a `rowKey`. The `rowKey` value "{queueTrigger}" indicates that the row key comes from the queue message string.
+The following function, started by a Queue Storage trigger, reads a row key from the queue message and uses it to retrieve a row from the input table. The expression `{queueTrigger}` indicates that the row key comes from the queue message string.
++
+The following Queue-triggered function returns the first 5 entities as an `IEnumerable<T>`, with the partition key value set as the queue message.
+
+```csharp
+[Function("TestFunction")]
+public static void Run([QueueTrigger("myqueue", Connection = "AzureWebJobsStorage")] string partition,
+ [TableInput("inTable", "{queueTrigger}", Take = 5, Filter = "Text eq 'test'",
+ Connection = "AzureWebJobsStorage")] IEnumerable<MyTableData> tableInputs,
+ FunctionContext context)
+{
+ var logger = context.GetLogger("TestFunction");
+ logger.LogInformation(partition);
+ foreach (MyTableData tableInput in tableInputs)
+ {
+ logger.LogInformation($"PK={tableInput.PartitionKey}, RK={tableInput.RowKey}, Text={tableInput.Text}");
+ }
+}
+```
+The `Filter` property limits which entities are returned, and `Take` limits how many.
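The `MyTableData` type referenced above isn't shown in this excerpt. A minimal sketch, with property names inferred from the function body rather than taken from the article's original definition, might look like this:

```csharp
// Hypothetical sketch of the MyTableData row type; property names
// are assumptions based on the fields the function logs.
public class MyTableData
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Text { get; set; }
}
```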
+
+# [Table API extension (preview)](#tab/table-api/isolated-process)
+
+The Table API extension does not currently support isolated process. You will instead need to use the combined Azure Storage extension.
+
+# [Functions 1.x](#tab/functionsv1/isolated-process)
+
+Functions version 1.x doesn't support isolated process.
+
+# [Combined Azure Storage extension](#tab/storage-extension/csharp-script)
+
+The following example shows a table input binding in a *function.json* file and [C# script](./functions-reference-csharp.md) code that uses the binding. The function uses a queue trigger to read a single table row.
+
+The *function.json* file specifies a `partitionKey` and a `rowKey`. The `rowKey` value `{queueTrigger}` indicates that the row key comes from the queue message string.
```json {
public class Person
} ```
-### CloudTable
-
-`IQueryable` isn't supported in the Functions runtime for [versions 2.x and higher)](functions-versions.md). An alternative is to use a `CloudTable` method parameter to read the table by using the Azure Storage SDK. Here's an example of a function that queries an Azure Functions log table:
+To read more than one row, use a `CloudTable` method parameter to read the table by using the Azure Storage SDK. Here's an example of a function that queries an Azure Functions log table:
```json {
For more information about how to use CloudTable, see [Get started with Azure Ta
If you try to bind to `CloudTable` and get an error message, make sure that you have a reference to [the correct Storage SDK version](./functions-bindings-storage-table.md#azure-storage-sdk-version-in-functions-1x).
-### IQueryable
+# [Table API extension (preview)](#tab/table-api/csharp-script)
+
+Version 3.x of the extension bundle doesn't currently include the Table API bindings. For now, you need to instead use version 2.x of the extension bundle, which uses the combined Azure Storage extension.
+
+# [Functions 1.x](#tab/functionsv1/csharp-script)
+
+The following example shows a table input binding in a *function.json* file and [C# script](./functions-reference-csharp.md) code that uses the binding. The function uses a queue trigger to read a single table row.
+
+The *function.json* file specifies a `partitionKey` and a `rowKey`. The `rowKey` value `{queueTrigger}` indicates that the row key comes from the queue message string.
+
+```json
+{
+ "bindings": [
+ {
+ "queueName": "myqueue-items",
+ "connection": "MyStorageConnectionAppSetting",
+ "name": "myQueueItem",
+ "type": "queueTrigger",
+ "direction": "in"
+ },
+ {
+ "name": "personEntity",
+ "type": "table",
+ "tableName": "Person",
+ "partitionKey": "Test",
+ "rowKey": "{queueTrigger}",
+ "connection": "MyStorageConnectionAppSetting",
+ "direction": "in"
+ }
+ ],
+ "disabled": false
+}
+```
+
+The [configuration](#configuration) section explains these properties.
+
+Here's the C# script code:
+
+```csharp
+public static void Run(string myQueueItem, Person personEntity, ILogger log)
+{
+ log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
+ log.LogInformation($"Name in Person entity: {personEntity.Name}");
+}
+
+public class Person
+{
+ public string PartitionKey { get; set; }
+ public string RowKey { get; set; }
+ public string Name { get; set; }
+}
+```
-The following example shows a table input binding in a *function.json* file and [C# script](functions-reference-csharp.md) code that uses the binding. The function reads entities for a partition key that is specified in a queue message.
+The following example shows a table input binding in a *function.json* file and [C# script](./functions-reference-csharp.md) code that uses the binding. The function uses `IQueryable<T>` to read entities for a partition key that is specified in a queue message. `IQueryable<T>` is only supported by version 1.x of the Functions runtime.
Here's the *function.json* file:
public class Person : TableEntity
} ```
-# [Java](#tab/java)
++
The following example shows an HTTP-triggered function that returns a list of person objects in a specified partition in Table storage. In the example, the partition key is extracted from the HTTP route, and the `tableName` and `connection` are taken from the function settings.
public HttpResponseMessage get(
} ```
-The following example uses the Filter to query for persons with a specific name in an Azure Table, and limits the number of possible matches to 10 results.
+The following example uses a filter to query for persons with a specific name in an Azure Table, and limits the number of possible matches to 10 results.
```java @FunctionName("getPersonsByName")
public Person[] get(
} ```
-# [JavaScript](#tab/javascript)
The following example shows a table input binding in a *function.json* file and [JavaScript code](functions-reference-node.md) that uses the binding. The function uses a queue trigger to read a single table row.
module.exports = async function (context, myQueueItem) {
}; ```
-# [PowerShell](#tab/powershell)
The following function uses a queue trigger to read a single table row as input to a function.
Binding configuration in _function.json_:
```json {
-  "bindings": [
-    {
-      "queueName": "myqueue-items",
-      "connection": "MyStorageConnectionAppSetting",
-      "name": "MyQueueItem",
-      "type": "queueTrigger",
-      "direction": "in"
-    },
-    {
-      "name": "PersonEntity",
-      "type": "table",
-      "tableName": "Person",
-      "partitionKey": "Test",
-      "rowKey": "{queueTrigger}",
-      "connection": "MyStorageConnectionAppSetting",
-      "direction": "in"
-    }
-  ],
-  "disabled": false
+ "bindings": [
+ {
+ "queueName": "myqueue-items",
+ "connection": "MyStorageConnectionAppSetting",
+ "name": "MyQueueItem",
+ "type": "queueTrigger",
+ "direction": "in"
+ },
+ {
+ "name": "PersonEntity",
+ "type": "table",
+ "tableName": "Person",
+ "partitionKey": "Test",
+ "rowKey": "{queueTrigger}",
+ "connection": "MyStorageConnectionAppSetting",
+ "direction": "in"
+ }
+ ],
+ "disabled": false
}
```

PowerShell code in _run.ps1_:

```powershell
-param($MyQueueItem,ΓÇ»$PersonEntity,ΓÇ»$TriggerMetadata)
-Write-Host "PowerShell queue trigger function processed work item: $MyQueueItem"
-Write-Host "Person entity name: $($PersonEntity.Name)"
+param($MyQueueItem, $PersonEntity, $TriggerMetadata)
+Write-Host "PowerShell queue trigger function processed work item: $MyQueueItem"
+Write-Host "Person entity name: $($PersonEntity.Name)"
```
-# [Python](#tab/python)
The following function uses a queue trigger to read a single table row as input to a function.
def main(req: func.HttpRequest, messageJSON) -> func.HttpResponse:
return func.HttpResponse(f"Table row: {messageJSON}") ```
-With this simple binding, you can't programatically handle a case in which no row that has a row key ID is found. For more fine-grained data selection, use the [storage SDK](/azure/developer/python/azure-sdk-example-storage-use?tabs=cmd).
+With this simple binding, you can't programmatically handle a case in which no row that has a row key ID is found. For more fine-grained data selection, use the [storage SDK](/azure/developer/python/azure-sdk-example-storage-use?tabs=cmd).
-## Attributes and annotations
-
-# [C#](#tab/csharp)
-
- In [C# class libraries](functions-dotnet-class-library.md), use the following attributes to configure a table input binding:
-* [TableAttribute](https://github.com/Azure/azure-webjobs-sdk/blob/master/src/Microsoft.Azure.WebJobs.Extensions.Storage/Tables/TableAttribute.cs)
+## Attributes
- The attribute's constructor takes the table name, partition key, and row key. The attribute can be used on an `out` parameter or on the return value of the function, as shown in the following example:
+Both [in-process](functions-dotnet-class-library.md) and [isolated process](dotnet-isolated-process-guide.md) C# libraries use attributes to define the function. C# script instead uses a function.json configuration file.
- ```csharp
- [FunctionName("TableInput")]
- public static void Run(
- [QueueTrigger("table-items")] string input,
- [Table("MyTable", "Http", "{queueTrigger}")] MyPoco poco,
- ILogger log)
- {
- ...
- }
- ```
+# [In-process](#tab/in-process)
- You can set the `Connection` property to specify the storage account to use, as shown in the following example:
+In [C# class libraries](functions-dotnet-class-library.md), the `TableAttribute` supports the following properties:
- ```csharp
- [FunctionName("TableInput")]
- public static void Run(
- [QueueTrigger("table-items")] string input,
- [Table("MyTable", "Http", "{queueTrigger}", Connection = "StorageConnectionAppSetting")] MyPoco poco,
- ILogger log)
- {
- ...
- }
- ```
+| Attribute property |Description|
+|||
+| **TableName** | The name of the table.|
+| **PartitionKey** |Optional. The partition key of the table entity to read. See the [usage](#usage) section for guidance on how to use this property.|
+|**RowKey** | Optional. The row key of a single table entity to read. Can't be used with `Take` or `Filter`. |
+|**Take** | Optional. The maximum number of entities to return. Can't be used with `RowKey`. |
+|**Filter** | Optional. An OData filter expression for the entities to return from the table. Can't be used with `RowKey`.|
+|**Connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
- For a complete example, see the C# example.
+The attribute's constructor takes the table name, partition key, and row key, as shown in the following example:
-* [StorageAccountAttribute](https://github.com/Azure/azure-webjobs-sdk/blob/master/src/Microsoft.Azure.WebJobs/StorageAccountAttribute.cs)
-
- Provides another way to specify the storage account to use. The constructor takes the name of an app setting that contains a storage connection string. The attribute can be applied at the parameter, method, or class level. The following example shows class level and method level:
+```csharp
+[FunctionName("TableInput")]
+public static void Run(
+ [QueueTrigger("table-items")] string input,
+ [Table("MyTable", "Http", "{queueTrigger}")] MyPoco poco,
+ ILogger log)
+{
+ ...
+}
+```
- ```csharp
- [StorageAccount("ClassLevelStorageAppSetting")]
- public static class AzureFunctions
- {
- [FunctionName("TableInput")]
- [StorageAccount("FunctionLevelStorageAppSetting")]
- public static void Run( //...
- {
- ...
- }
- ```
+You can set the `Connection` property to specify the connection to the table service, as shown in the following example:
-The storage account to use is determined in the following order:
+```csharp
+[FunctionName("TableInput")]
+public static void Run(
+ [QueueTrigger("table-items")] string input,
+ [Table("MyTable", "Http", "{queueTrigger}", Connection = "StorageConnectionAppSetting")] MyPoco poco,
+ ILogger log)
+{
+ ...
+}
+```
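The `Take` and `Filter` properties listed in the table above can be combined with an `IEnumerable<T>` parameter. The following is an unverified sketch; the table name, partition key, and filter expression are placeholders:

```csharp
[FunctionName("TableFilterInput")]
public static void Run(
    [QueueTrigger("table-items")] string input,
    // Placeholder table, partition, and OData filter values.
    [Table("MyTable", "MyPartition", Take = 10, Filter = "Text eq 'example'")] IEnumerable<MyPoco> rows,
    ILogger log)
{
    foreach (MyPoco row in rows)
    {
        log.LogInformation(row.RowKey);
    }
}
```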
-* The `Table` attribute's `Connection` property.
-* The `StorageAccount` attribute applied to the same parameter as the `Table` attribute.
-* The `StorageAccount` attribute applied to the function.
-* The `StorageAccount` attribute applied to the class.
-* The default storage account for the function app ("AzureWebJobsStorage" app setting).
-# [C# Script](#tab/csharp-script)
+# [Isolated process](#tab/isolated-process)
-Attributes are not supported by C# Script.
+In [C# class libraries](dotnet-isolated-process-guide.md), the `TableInputAttribute` supports the following properties:
-# [Java](#tab/java)
+| Attribute property |Description|
+|||
+| **TableName** | The name of the table.|
+| **PartitionKey** |Optional. The partition key of the table entity to read. |
+|**RowKey** | Optional. The row key of the table entity to read. |
+| **Take** | Optional. The maximum number of entities to read into an [`IEnumerable<T>`]. Can't be used with `RowKey`.|
+|**Filter** | Optional. An OData filter expression for entities to read into an [`IEnumerable<T>`]. Can't be used with `RowKey`. |
+|**Connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
-In the [Java functions runtime library](/java/api/overview/azure/functions/runtime), use the `@TableInput` annotation on parameters whose value would come from Table storage. This annotation can be used with native Java types, POJOs, or nullable values using `Optional<T>`.
+# [C# script](#tab/csharp-script)
-# [JavaScript](#tab/javascript)
+C# script uses a function.json file for configuration instead of attributes.
-Attributes are not supported by JavaScript.
+The following table explains the binding configuration properties for C# script that you set in the *function.json* file.
-# [PowerShell](#tab/powershell)
+|function.json property | Description|
+||-|
+|**type** | Must be set to `table`. This property is set automatically when you create the binding in the Azure portal.|
+|**direction** | Must be set to `in`. This property is set automatically when you create the binding in the Azure portal. |
+|**name** | The name of the variable that represents the table or entity in function code. |
+|**tableName** | The name of the table.|
+|**partitionKey** | Optional. The partition key of the table entity to read. |
+|**rowKey** |Optional. The row key of the table entity to read. Can't be used with `take` or `filter`.|
+|**take** | Optional. The maximum number of entities to return. Can't be used with `rowKey`. |
+|**filter** | Optional. An OData filter expression for the entities to return from the table. Can't be used with `rowKey`.|
+|**connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
-Attributes are not supported by PowerShell.
+
-# [Python](#tab/python)
+## Annotations
-Attributes are not supported by Python.
+In the [Java functions runtime library](/java/api/overview/azure/functions/runtime), use the `@TableInput` annotation on parameters whose value would come from Table storage. This annotation can be used with native Java types, POJOs, or nullable values using `Optional<T>`. This annotation supports the following elements:
-
+| Element |Description|
+|||
+| **[TableInputName](/java/api/com.microsoft.azure.functions.annotation.tableinput.name)** | The name of the table. |
+| **[PartitionKey](/java/api/com.microsoft.azure.functions.annotation.tableinput.partitionkey)** |Optional. The partition key of the table entity to read. |
+|**[RowKey](/java/api/com.microsoft.azure.functions.annotation.tableinput.rowkey)** | The row key of the table entity to read. |
+|**[Take](/java/api/com.microsoft.azure.functions.annotation.tableinput.take)** | Optional. The maximum number of entities to read.|
+|**[Filter](/java/api/com.microsoft.azure.functions.annotation.tableinput.filter)** | Optional. An OData filter expression for table input. |
+|**[Connection](/java/api/com.microsoft.azure.functions.annotation.tableinput.connection)** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
## Configuration

The following table explains the binding configuration properties that you set in the *function.json* file.
-|function.json property | Attribute property |Description|
-|||-|
-|**type** | n/a | Must be set to `table`. This property is set automatically when you create the binding in the Azure portal.|
-|**direction** | n/a | Must be set to `in`. This property is set automatically when you create the binding in the Azure portal. |
-|**name** | n/a | The name of the variable that represents the table or entity in function code. |
-|**tableName** | **TableName** | The name of the table.|
-|**partitionKey** | **PartitionKey** |Optional. The partition key of the table entity to read. See the [usage](#usage) section for guidance on how to use this property.|
-|**rowKey** |**RowKey** | Optional. The row key of the table entity to read. See the [usage](#usage) section for guidance on how to use this property.|
-|**take** |**Take** | Optional. The maximum number of entities to read in JavaScript. See the [usage](#usage) section for guidance on how to use this property.|
-|**filter** |**Filter** | Optional. An OData filter expression for table input in JavaScript. See the [usage](#usage) section for guidance on how to use this property.|
-|**connection** |**Connection** | The name of an app setting that contains the Storage connection string to use for this binding. The setting can be the name of an "AzureWebJobs" prefixed app setting or connection string name. For example, if your setting name is "AzureWebJobsMyStorage", you can specify "MyStorage" here. The Functions runtime will automatically look for an app setting that named "AzureWebJobsMyStorage". If you leave `connection` empty, the Functions runtime uses the default Storage connection string in the app setting that is named `AzureWebJobsStorage`.|
+|function.json property | Description|
+||-|
+|**type** | Must be set to `table`. This property is set automatically when you create the binding in the Azure portal.|
+|**direction** | Must be set to `in`. This property is set automatically when you create the binding in the Azure portal. |
+|**name** | The name of the variable that represents the table or entity in function code. |
+|**tableName** | The name of the table.|
+|**partitionKey** | Optional. The partition key of the table entity to read. |
+|**rowKey** |Optional. The row key of the table entity to read. Can't be used with `take` or `filter`.|
+|**take** | Optional. The maximum number of entities to return. Can't be used with `rowKey`. |
+|**filter** | Optional. An OData filter expression for the entities to return from the table. Can't be used with `rowKey`.|
+|**connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
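As a sketch of the properties above in context, a *function.json* input binding that uses `take` and `filter` might look like the following; the names and values here are placeholders, not taken from the article:

```json
{
  "name": "tableRows",
  "type": "table",
  "direction": "in",
  "tableName": "MyTable",
  "partitionKey": "MyPartition",
  "take": 10,
  "filter": "Text eq 'example'",
  "connection": "MyStorageConnectionAppSetting"
}
```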
[!INCLUDE [app settings to local.settings.json](../../includes/functions-app-settings-local.md)]

## Usage
-# [C#](#tab/csharp)
-* **Read one row in**
+The usage of the binding depends on the extension package version and the C# modality used in your function app, which can be one of the following:
- Set `partitionKey` and `rowKey`. Access the table data by using a method parameter `T <paramName>`. In C# script, `paramName` is the value specified in the `name` property of *function.json*. `T` is typically a type that implements `ITableEntity` or derives from `TableEntity`. The `filter` and `take` properties are not used in this scenario.
+# [In-process](#tab/in-process)
-* **Read one or more rows**
+An in-process class library is a compiled C# function that runs in the same process as the Functions runtime.
+
+# [Isolated process](#tab/isolated-process)
- Access the table data by using a method parameter `IQueryable<T> <paramName>`. In C# script, `paramName` is the value specified in the `name` property of *function.json*. `T` must be a type that implements `ITableEntity` or derives from `TableEntity`. You can use `IQueryable` methods to do any filtering required. The `partitionKey`, `rowKey`, `filter`, and `take` properties are not used in this scenario.
+An isolated process class library is a compiled C# function that runs in a process isolated from the runtime. Isolated process is required to support C# functions running on .NET 5.0.
+
+# [C# script](#tab/csharp-script)
- > [!NOTE]
- > `IQueryable` isn't supported in the [Functions v2 runtime](functions-versions.md). An alternative is to [use a CloudTable paramName method parameter](https://stackoverflow.com/questions/48922485/binding-to-table-storage-in-v2-azure-functions-using-cloudtable) to read the table by using the Azure Storage SDK. If you try to bind to `CloudTable` and get an error message, make sure that you have a reference to [the correct Storage SDK version](./functions-bindings-storage-table.md#azure-storage-sdk-version-in-functions-1x).
+C# script is used primarily when creating C# functions in the Azure portal.
-# [C# Script](#tab/csharp-script)
+
-* **Read one row in**
+Choose a version to see usage details for the mode and version.
- Set `partitionKey` and `rowKey`. Access the table data by using a method parameter `T <paramName>`. In C# script, `paramName` is the value specified in the `name` property of *function.json*. `T` is typically a type that implements `ITableEntity` or derives from `TableEntity`. The `filter` and `take` properties are not used in this scenario.
+# [Combined Azure Storage extension](#tab/storage-extension/in-process)
-* **Read one or more rows**
+To return a specific entity by key, use a binding parameter that derives from [TableEntity](/dotnet/api/azure.data.tables.tableentity).
- Access the table data by using a method parameter `IQueryable<T> <paramName>`. In C# script, `paramName` is the value specified in the `name` property of *function.json*. `T` must be a type that implements `ITableEntity` or derives from `TableEntity`. You can use `IQueryable` methods to do any filtering required. The `partitionKey`, `rowKey`, `filter`, and `take` properties are not used in this scenario.
+To execute queries that return multiple entities, bind to a [CloudTable] object. You can then use this object to create and execute queries against the bound table. Note that [CloudTable] and related APIs belong to the [Microsoft.Azure.Cosmos.Table](/dotnet/api/microsoft.azure.cosmos.table) namespace.
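As a sketch of such a query through a bound [CloudTable] (the filter and type are placeholders, and the code requires the Microsoft.Azure.Cosmos.Table package):

```csharp
// Hypothetical sketch: run a filtered query against a bound CloudTable.
// MyPoco is assumed to derive from TableEntity, as in earlier examples.
TableQuery<MyPoco> query = new TableQuery<MyPoco>()
    .Where(TableQuery.GenerateFilterCondition(
        "PartitionKey", QueryComparisons.Equal, "MyPartition"))
    .Take(10);

foreach (MyPoco row in cloudTable.ExecuteQuery(query))
{
    log.LogInformation(row.RowKey);
}
```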
- > [!NOTE]
- > `IQueryable` isn't supported in the [Functions v2 runtime](functions-versions.md). An alternative is to [use a CloudTable paramName method parameter](https://stackoverflow.com/questions/48922485/binding-to-table-storage-in-v2-azure-functions-using-cloudtable) to read the table by using the Azure Storage SDK. If you try to bind to `CloudTable` and get an error message, make sure that you have a reference to [the correct Storage SDK version](./functions-bindings-storage-table.md#azure-storage-sdk-version-in-functions-1x).
+# [Table API extension (preview)](#tab/table-api/in-process)
-# [Java](#tab/java)
+To return a specific entity by key, use a binding parameter that derives from [TableEntity](/dotnet/api/azure.data.tables.tableentity).
-The [TableInput](/java/api/com.microsoft.azure.functions.annotation.tableinput) attribute gives you access to the table row that triggered the function.
+To execute queries that return multiple entities, bind to a [TableClient] object. You can then use this object to create and execute queries against the bound table. Note that [TableClient] and related APIs belong to the [Azure.Data.Tables](/dotnet/api/azure.data.tables) namespace.
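As a sketch of such a query through a bound [TableClient] (the filter expression is a placeholder, `MyPoco` is assumed to implement `ITableEntity`, and the code requires the Azure.Data.Tables package):

```csharp
// Hypothetical sketch: run a filtered query against a bound TableClient.
Azure.Pageable<MyPoco> rows =
    tableClient.Query<MyPoco>(filter: "PartitionKey eq 'MyPartition'");

foreach (MyPoco row in rows)
{
    log.LogInformation(row.RowKey);
}
```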
-# [JavaScript](#tab/javascript)
+# [Functions 1.x](#tab/functionsv1/in-process)
-Set the `filter` and `take` properties. Don't set `partitionKey` or `rowKey`. Access the input table entity (or entities) using `context.bindings.<BINDING_NAME>`. The deserialized objects have `RowKey` and `PartitionKey` properties.
+To return a specific entity by key, use a binding parameter that derives from [TableEntity]. The `TableName`, `PartitionKey`, and `RowKey` values are used to try to get a specific entity from the table.
-# [PowerShell](#tab/powershell)
+To execute queries that return multiple entities, bind to an [`IQueryable<T>`] of a type that inherits from [TableEntity].
-Data is passed to the input parameter as specified by the `name` key in the *function.json* file. Specifying The `partitionKey` and `rowKey` allows you to filter to specific records. See the [PowerShell example](#example) for more detail.
+# [Combined Azure Storage extension](#tab/storage-extension/isolated-process)
-# [Python](#tab/python)
+To return a specific entity by key, use a plain old CLR object (POCO). The `TableName`, `PartitionKey`, and `RowKey` values are used to try to get a specific entity from the table.
-Table data is passed to the function as a JSON string. De-serialize the message by calling `json.loads` as shown in the input [example](#example).
+When returning multiple entities as an [`IEnumerable<T>`], you can instead use the `Take` and `Filter` properties to restrict the result set.
+
+# [Table API extension (preview)](#tab/table-api/isolated-process)
+
+The Table API extension does not currently support isolated process. You will instead need to use the combined Azure Storage extension.
+
+# [Functions 1.x](#tab/functionsv1/isolated-process)
+
+Functions version 1.x doesn't support isolated process.
+
+# [Combined Azure Storage extension](#tab/storage-extension/csharp-script)
+
+To return a specific entity by key, use a binding parameter that derives from [TableEntity](/dotnet/api/azure.data.tables.tableentity).
+
+To execute queries that return multiple entities, bind to a [CloudTable] object. You can then use this object to create and execute queries against the bound table. Note that [CloudTable] and related APIs belong to the [Microsoft.Azure.Cosmos.Table](/dotnet/api/microsoft.azure.cosmos.table) namespace.
+
+# [Table API extension (preview)](#tab/table-api/csharp-script)
+
+Version 3.x of the extension bundle doesn't currently include the Table API bindings. For now, you need to instead use version 2.x of the extension bundle, which uses the combined Azure Storage extension.
+
+# [Functions 1.x](#tab/functionsv1/csharp-script)
+
+To return a specific entity by key, use a binding parameter that derives from [TableEntity]. The `TableName`, `PartitionKey`, and `RowKey` values are used to try to get a specific entity from the table.
+
+To execute queries that return multiple entities, bind to an [`IQueryable<T>`] of a type that inherits from [TableEntity].
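A bound [`IQueryable<T>`] can then be filtered with LINQ. The following is a sketch only; `peopleTable` and `Person` are placeholder names for the bound parameter and its `TableEntity`-derived type:

```csharp
// Hypothetical sketch (Functions 1.x): filter a bound IQueryable<Person>.
var matches = peopleTable
    .Where(p => p.PartitionKey == "MyPartition")
    .Take(10);

foreach (Person p in matches)
{
    log.LogInformation(p.RowKey);
}
```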
+The [TableInput](/java/api/com.microsoft.azure.functions.annotation.tableinput) attribute gives you access to the table row that triggered the function.
+Set the `filter` and `take` properties. Don't set `partitionKey` or `rowKey`. Access the input table entity (or entities) using `context.bindings.<BINDING_NAME>`. The deserialized objects have `RowKey` and `PartitionKey` properties.
+Data is passed to the input parameter as specified by the `name` key in the *function.json* file. Specifying the `partitionKey` and `rowKey` allows you to filter to specific records.
+Table data is passed to the function as a JSON string. Deserialize the message by calling `json.loads` as shown in the input [example](#example).
+
+For specific usage details, see [Example](#example).
+ ## Next steps
-* [Write table storage data from a function](./functions-bindings-storage-table-output.md)
+* [Write table data from a function](./functions-bindings-storage-table-output.md)
+
+[TableInputAttribute]: /dotnet/api/microsoft.azure.webjobs.tableinputattribute
+[CloudTable]: /dotnet/api/microsoft.azure.cosmos.table.cloudtable
+[TableEntity]: /dotnet/api/azure.data.tables.tableentity
+[`IQueryable<T>`]: /dotnet/api/system.linq.iqueryable-1
+[`IEnumerable<T>`]: /dotnet/api/system.collections.generic.ienumerable-1
azure-functions Functions Bindings Storage Table Output https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-bindings-storage-table-output.md
Title: Azure Table storage output bindings for Azure Functions
-description: Understand how to use Azure Table storage output bindings in Azure Functions.
+ Title: Azure Tables output bindings for Azure Functions
+description: Understand how to use Azure Tables output bindings in Azure Functions.
- Previously updated : 09/03/2018 Last updated : 01/23/2022 ms.devlang: csharp, java, javascript, powershell, python
+zone_pivot_groups: programming-languages-set-functions-lang-workers
-# Azure Table storage output bindings for Azure Functions
-Use an Azure Table storage output binding to write entities to a table in an Azure Storage account.
+# Azure Tables output bindings for Azure Functions
+
+Use an Azure Tables output binding to write entities to a table in an Azure Storage or Cosmos DB account.
+
+For information on setup and configuration details, see the [overview](./functions-bindings-storage-table.md).
> [!NOTE]
-> This output binding does not support updating existing entities. Use the `TableOperation.Replace` operation [from the Azure Storage SDK](../cosmos-db/table/table-support.md) to update an existing entity.
+> This output binding only supports creating new entities in a table. If you need to update an existing entity from your function code, instead use an Azure Tables SDK directly.
## Example
-# [C#](#tab/csharp)
++
+# [In-process](#tab/in-process)
The following example shows a [C# function](functions-dotnet-class-library.md) that uses an HTTP trigger to write a single table row.
public class TableStorage
} ``` +
+# [Isolated process](#tab/isolated-process)
+
+The following `MyTableData` class represents a row of data in the table:
++
+The following function, which is started by a Queue Storage trigger, writes a new `MyTableData` entity to a table named **OutputTable**.
++
# [C# Script](#tab/csharp-script)

The following example shows a table output binding in a *function.json* file and [C# script](functions-reference-csharp.md) code that uses the binding. The function writes multiple table entities.
Here's the *function.json* file:
} ```
-The [configuration](#configuration) section explains these properties.
+The [attributes](#attributes) section explains these properties.
Here's the C# script code:
public class Person
```
-# [Java](#tab/java)
++ The following example shows a Java function that uses an HTTP trigger to write a single table row.
public class AddPersons {
} ```
-# [JavaScript](#tab/javascript)
-The following example shows a table output binding in a *function.json* file and a [JavaScript function](functions-reference-node.md) that uses the binding. The function writes multiple table entities.
+The following example shows a table output binding in a *function.json* file and a [JavaScript function](functions-reference-node.md) that uses the binding. The function writes multiple table entities.
Here's the *function.json* file:
module.exports = async function (context) {
}; ```
-# [PowerShell](#tab/powershell)
The following example demonstrates how to write multiple entities to a table from a function.
Binding configuration in _function.json_:
```json
{
-  "bindings": [
-    {
-      "name": "InputData",
-      "type": "manualTrigger",
-      "direction": "in"
-    },
-    {
-      "tableName": "Person",
-      "connection": "MyStorageConnectionAppSetting",
-      "name": "TableBinding",
-      "type": "table",
-      "direction": "out"
-    }
-  ],
-  "disabled": false
+ "bindings": [
+ {
+ "name": "InputData",
+ "type": "manualTrigger",
+ "direction": "in"
+ },
+ {
+ "tableName": "Person",
+ "connection": "MyStorageConnectionAppSetting",
+ "name": "TableBinding",
+ "type": "table",
+ "direction": "out"
+ }
+ ],
+ "disabled": false
}
```

PowerShell code in _run.ps1_:

```powershell
-param($InputData,ΓÇ»$TriggerMetadata)
-ΓÇ»
-foreach ($i in 1..10) {
-    Push-OutputBinding -Name TableBinding -Value @{
-        PartitionKey = 'Test'
-        RowKey = "$i"
-        Name = "Name $i"
-    }
+param($InputData, $TriggerMetadata)
+
+foreach ($i in 1..10) {
+ Push-OutputBinding -Name TableBinding -Value @{
+ PartitionKey = 'Test'
+ RowKey = "$i"
+ Name = "Name $i"
+ }
} ```
-# [Python](#tab/python)
-The following example demonstrates how to use the Table storage output binding. The `table` binding is configured in the *function.json* by assigning values to `name`, `tableName`, `partitionKey`, and `connection`:
+The following example demonstrates how to use the Table storage output binding. Configure the `table` binding in the *function.json* by assigning values to `name`, `tableName`, `partitionKey`, and `connection`:
```json {
def main(req: func.HttpRequest, message: func.Out[str]) -> func.HttpResponse:
-## Attributes and annotations
+## Attributes
+
+Both [in-process](functions-dotnet-class-library.md) and [isolated process](dotnet-isolated-process-guide.md) C# libraries use attributes to define the function. C# script instead uses a function.json configuration file.
-# [C#](#tab/csharp)
+# [In-process](#tab/in-process)
-In [C# class libraries](functions-dotnet-class-library.md), use the [TableAttribute](https://github.com/Azure/azure-webjobs-sdk/blob/master/src/Microsoft.Azure.WebJobs.Extensions.Storage/Tables/TableAttribute.cs).
+In [C# class libraries](functions-dotnet-class-library.md), the `TableAttribute` supports the following properties:
-The attribute's constructor takes the table name. The attribute can be used on an `out` parameter or on the return value of the function, as shown in the following example:
+| Attribute property |Description|
+|||
+|**TableName** | The name of the table to which to write.|
+|**PartitionKey** | The partition key of the table entity to write. |
+|**RowKey** | The row key of the table entity to write. |
+|**Connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
+
+The attribute's constructor takes the table name. Use the attribute on an `out` parameter or on the return value of the function, as shown in the following example:
```csharp
[FunctionName("TableOutput")]
public static MyPoco TableOutput(
} ```
-You can set the `Connection` property to specify the storage account to use, as shown in the following example:
+You can set the `Connection` property to specify a connection to the table service, as shown in the following example:
```csharp
[FunctionName("TableOutput")]
public static MyPoco TableOutput(
} ```
-For a complete example, see the [C# example](#example).
-You can use the `StorageAccount` attribute to specify the storage account at class, method, or parameter level. For more information, see [Input - attributes](./functions-bindings-storage-table-input.md#attributes-and-annotations).
+# [Isolated process](#tab/isolated-process)
-# [C# Script](#tab/csharp-script)
+In [C# class libraries](dotnet-isolated-process-guide.md), the `TableOutputAttribute` supports the following properties:
-Attributes are not supported by C# Script.
+| Attribute property |Description|
+|||
+|**TableName** | The name of the table to which to write.|
+|**PartitionKey** | The partition key of the table entity to write. |
+|**RowKey** | The row key of the table entity to write. |
+|**Connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
+
+# [C# script](#tab/csharp-script)
-# [Java](#tab/java)
+C# script uses a function.json file for configuration instead of attributes.
-In the [Java functions runtime library](/java/api/overview/azure/functions/runtime), use the [TableOutput](https://github.com/Azure/azure-functions-java-library/blob/master/src/main/java/com/microsoft/azure/functions/annotation/TableOutput.java/) annotation on parameters to write values into table storage.
+The following table explains the binding configuration properties for C# script that you set in the *function.json* file.
-See the [example for more detail](#example).
+|function.json property | Description|
+|||
+|**type** |Must be set to `table`. This property is set automatically when you create the binding in the Azure portal.|
+|**direction** | Must be set to `out`. This property is set automatically when you create the binding in the Azure portal. |
+|**name** | The variable name used in function code that represents the table or entity. Set to `$return` to reference the function return value.|
+|**tableName** |The name of the table to which to write.|
+|**partitionKey** |The partition key of the table entity to write. |
+|**rowKey** | The row key of the table entity to write. |
+|**connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
-# [JavaScript](#tab/javascript)
+
-Attributes are not supported by JavaScript.
+## Annotations
-# [PowerShell](#tab/powershell)
+In the [Java functions runtime library](/java/api/overview/azure/functions/runtime), use the [TableOutput](https://github.com/Azure/azure-functions-java-library/blob/master/src/main/java/com/microsoft/azure/functions/annotation/TableOutput.java/) annotation on parameters to write values into your tables. The annotation supports the following elements:
-Attributes are not supported by PowerShell.
+| Element |Description|
+|||
+|**name**| The variable name used in function code that represents the table or entity. |
+|**dataType**| Defines how Functions runtime should treat the parameter value. To learn more, see [dataType](/java/api/com.microsoft.azure.functions.annotation.tableoutput.datatype).
+|**tableName** | The name of the table to which to write.|
+|**partitionKey** | The partition key of the table entity to write. |
+|**rowKey** | The row key of the table entity to write. |
+|**connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
-# [Python](#tab/python)
+## Configuration
-Attributes are not supported by Python.
+The following table explains the binding configuration properties that you set in the *function.json* file.
+
+|function.json property | Description|
+|||
+|**type** |Must be set to `table`. This property is set automatically when you create the binding in the Azure portal.|
+|**direction** | Must be set to `out`. This property is set automatically when you create the binding in the Azure portal. |
+|**name** | The variable name used in function code that represents the table or entity. Set to `$return` to reference the function return value.|
+|**tableName** |The name of the table to which to write.|
+|**partitionKey** |The partition key of the table entity to write. |
+|**rowKey** | The row key of the table entity to write. |
+|**connection** | The name of an app setting or setting collection that specifies how to connect to the table service. See [Connections](#connections). |
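As a minimal sketch of how these properties fit together (the table name `Person` and the setting name `MyStorageConnectionAppSetting` are illustrative, not required values), a *function.json* table output binding might look like:

```json
{
  "bindings": [
    {
      "name": "tableBinding",
      "type": "table",
      "direction": "out",
      "tableName": "Person",
      "partitionKey": "Test",
      "connection": "MyStorageConnectionAppSetting"
    }
  ]
}
```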
+++
+## Usage
++
+The usage of the binding depends on the extension package version and the C# modality used in your function app, which can be one of the following:
+
+# [In-process](#tab/in-process)
+
+An in-process class library is a compiled C# function that runs in the same process as the Functions runtime.
+
+# [Isolated process](#tab/isolated-process)
+
+An isolated process class library is a compiled C# function that runs in a process isolated from the runtime. Isolated process is required to support C# functions running on .NET 5.0.
+
+# [C# script](#tab/csharp-script)
+
+C# script is used primarily when creating C# functions in the Azure portal.
-## Configuration
+Choose a version to see usage details for the mode and version.
-The following table explains the binding configuration properties that you set in the *function.json* file and the `Table` attribute.
+# [Combined Azure Storage extension](#tab/storage-extension/in-process)
-|function.json property | Attribute property |Description|
-|||-|
-|**type** | n/a | Must be set to `table`. This property is set automatically when you create the binding in the Azure portal.|
-|**direction** | n/a | Must be set to `out`. This property is set automatically when you create the binding in the Azure portal. |
-|**name** | n/a | The variable name used in function code that represents the table or entity. Set to `$return` to reference the function return value.|
-|**tableName** |**TableName** | The name of the table.|
-|**partitionKey** |**PartitionKey** | The partition key of the table entity to write. See the [usage section](#usage) for guidance on how to use this property.|
-|**rowKey** |**RowKey** | The row key of the table entity to write. See the [usage section](#usage) for guidance on how to use this property.|
-|**connection** |**Connection** | The name of an app setting that contains the Storage connection string to use for this binding. If the app setting name begins with "AzureWebJobs", you can specify only the remainder of the name here. For example, if you set `connection` to "MyStorage", the Functions runtime looks for an app setting that is named "MyStorage". If you leave `connection` empty, the Functions runtime uses the default Storage connection string in the app setting that is named `AzureWebJobsStorage`.|
+The following types are supported for `out` parameters and return types:
+- A plain-old CLR object (POCO) that includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.
+- `ICollector<T>` or `IAsyncCollector<T>` where `T` includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.
-## Usage
+You can also bind to `CloudTable` [from the Storage SDK](/dotnet/api/microsoft.azure.cosmos.table.cloudtable) as a method parameter. You can then use that object to write to the table.
-# [C#](#tab/csharp)
+# [Table API extension (preview)](#tab/table-api/in-process)
-Access the output table entity by using a method parameter `ICollector<T> paramName` or `IAsyncCollector<T> paramName` where `T` includes the `PartitionKey` and `RowKey` properties. These properties are often accompanied by implementing `ITableEntity` or inheriting `TableEntity`.
+The following types are supported for `out` parameters and return types:
-Alternatively you can use a `CloudTable` method parameter to write to the table by using the Azure Storage SDK. If you try to bind to `CloudTable` and get an error message, make sure that you have a reference to [the correct Storage SDK version](./functions-bindings-storage-table.md#azure-storage-sdk-version-in-functions-1x).
+- A plain-old CLR object (POCO) that includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity`.
+- `ICollector<T>` or `IAsyncCollector<T>` where `T` includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity`.
-# [C# Script](#tab/csharp-script)
+You can also bind to `TableClient` [from the Azure SDK](/dotnet/api/azure.data.tables.tableclient). You can then use that object to write to the table.
-Access the output table entity by using a method parameter `ICollector<T> paramName` or `IAsyncCollector<T> paramName` where `T` includes the `PartitionKey` and `RowKey` properties. These properties are often accompanied by implementing `ITableEntity` or inheriting `TableEntity`. The `paramName` value is specified in the `name` property of *function.json*.
+# [Functions 1.x](#tab/functionsv1/in-process)
-Alternatively you can use a `CloudTable` method parameter to write to the table by using the Azure Storage SDK. If you try to bind to `CloudTable` and get an error message, make sure that you have a reference to [the correct Storage SDK version](./functions-bindings-storage-table.md#azure-storage-sdk-version-in-functions-1x).
+The following types are supported for `out` parameters and return types:
-# [Java](#tab/java)
+- A plain-old CLR object (POCO) that includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.
+- `ICollector<T>` or `IAsyncCollector<T>` where `T` includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.
-There are two options for outputting a Table storage row from a function by using the [TableStorageOutput](/java/api/com.microsoft.azure.functions.annotation.tableoutput) annotation:
+You can also bind to `CloudTable` [from the Storage SDK](/dotnet/api/microsoft.azure.cosmos.table.cloudtable) as a method parameter. You can then use that object to write to the table.
-- **Return value**: By applying the annotation to the function itself, the return value of the function is persisted as a Table storage row.
+# [Combined Azure Storage extension](#tab/storage-extension/isolated-process)
-- **Imperative**: To explicitly set the message value, apply the annotation to a specific parameter of the type [`OutputBinding<T>`](/java/api/com.microsoft.azure.functions.outputbinding), where `T` includes the `PartitionKey` and `RowKey` properties. These properties are often accompanied by implementing `ITableEntity` or inheriting `TableEntity`.
+Return a plain-old CLR object (POCO) with properties that can be mapped to the table entity.
-# [JavaScript](#tab/javascript)
+# [Table API extension (preview)](#tab/table-api/isolated-process)
-Access the output event by using `context.bindings.<name>` where `<name>` is the value specified in the `name` property of *function.json*.
+The Table API extension does not currently support isolated process. You will instead need to use the combined Azure Storage extension.
-# [PowerShell](#tab/powershell)
+# [Functions 1.x](#tab/functionsv1/isolated-process)
-To write to table data, use the `Push-OutputBinding` cmdlet, set the `-Name TableBinding` parameter and `-Value` parameter equal to the row data. See the [PowerShell example](#example) for more detail.
+Functions version 1.x doesn't support isolated process.
-# [Python](#tab/python)
+# [Combined Azure Storage extension](#tab/storage-extension/csharp-script)
-There are two options for outputting a Table storage row message from a function:
+The following types are supported for `out` parameters and return types:
+
+- A plain-old CLR object (POCO) that includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.
+- `ICollector<T>` or `IAsyncCollector<T>` where `T` includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.
+
+You can also bind to `CloudTable` [from the Storage SDK](/dotnet/api/microsoft.azure.cosmos.table.cloudtable) as a method parameter. You can then use that object to write to the table.
-- **Return value**: Set the `name` property in *function.json* to `$return`. With this configuration, the function's return value is persisted as a Table storage row.
+# [Table API extension (preview)](#tab/table-api/csharp-script)
-- **Imperative**: Pass a value to the [set](/python/api/azure-functions/azure.functions.out#set-val--t--none) method of the parameter declared as an [Out](/python/api/azure-functions/azure.functions.out) type. The value passed to `set` is persisted as an Event Hub message.
+Version 3.x of the extension bundle doesn't currently include the Table API bindings. For now, you need to instead use version 2.x of the extension bundle, which uses the combined Azure Storage extension.
+
+# [Functions 1.x](#tab/functionsv1/csharp-script)
+
+The following types are supported for `out` parameters and return types:
+
+- A plain-old CLR object (POCO) that includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.
+- `ICollector<T>` or `IAsyncCollector<T>` where `T` includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.
+
+You can also bind to `CloudTable` [from the Storage SDK](/dotnet/api/microsoft.azure.cosmos.table.cloudtable) as a method parameter. You can then use that object to write to the table.
+There are two options for outputting a Table storage row from a function by using the [TableStorageOutput](/java/api/com.microsoft.azure.functions.annotation.tableoutput) annotation:
+
+| Options | Description |
+|||
+| **Return value**| By applying the annotation to the function itself, the return value of the function persists as a Table storage row. |
+|**Imperative**| To explicitly set the table row, apply the annotation to a specific parameter of the type [`OutputBinding<T>`](/java/api/com.microsoft.azure.functions.outputbinding), where `T` includes the `PartitionKey` and `RowKey` properties. You can accompany these properties by implementing `ITableEntity` or inheriting `TableEntity`.|
+
+Access the output event by using `context.bindings.<name>` where `<name>` is the value specified in the `name` property of *function.json*.
+
+To write to table data, use the `Push-OutputBinding` cmdlet, set the `-Name TableBinding` parameter and `-Value` parameter equal to the row data. See the [PowerShell example](#example) for more detail.
++
+There are two options for outputting a Table storage row message from a function:
+
+| Options | Description |
+|||
+| **Return value**| Set the `name` property in *function.json* to `$return`. With this configuration, the function's return value persists as a Table storage row.|
+|**Imperative**| Pass a value to the [set](/python/api/azure-functions/azure.functions.out#set-val--t--none) method of the parameter declared as an [Out](/python/api/azure-functions/azure.functions.out) type. The value passed to `set` is persisted as a Table storage row.|
+
+For specific usage details, see [Example](#example).
+ ## Exceptions and return codes | Binding | Reference |
azure-functions Functions Bindings Storage Table https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-bindings-storage-table.md
Title: Azure Table storage bindings for Azure Functions
-description: Understand how to use Azure Table storage bindings in Azure Functions.
+ Title: Azure Tables bindings for Azure Functions
+description: Understand how to use Azure Tables bindings in Azure Functions.
- Previously updated : 09/03/2018 Last updated : 01/23/2022
+zone_pivot_groups: programming-languages-set-functions-lang-workers
-# Azure Table storage bindings for Azure Functions
-Azure Functions integrates with [Azure Storage](../storage/index.yml) via [triggers and bindings](./functions-triggers-bindings.md). Integrating with Table storage allows you to build functions that read and write Table storage data.
+# Azure Tables bindings for Azure Functions
+
+Azure Functions integrates with [Azure Tables](../cosmos-db/table/introduction.md) via [triggers and bindings](./functions-triggers-bindings.md). Integrating with Azure Tables allows you to build functions that read and write data using the Tables API for [Azure Storage](../storage/index.yml) and [Cosmos DB](../cosmos-db/introduction.md).
+
+> [!NOTE]
+> The Table bindings have historically only supported Azure Storage. Support for Cosmos DB is currently in preview. See [Table API extension (preview)](#table-api-extension).
| Action | Type |
|||
-| Read table storage data in a function | [Input binding](./functions-bindings-storage-table-input.md) |
-| Allow a function to write table storage data |[Output binding](./functions-bindings-storage-table-output.md) |
+| Read table data in a function | [Input binding](./functions-bindings-storage-table-input.md) |
+| Allow a function to write table data |[Output binding](./functions-bindings-storage-table-output.md) |
+
+## Install extension
+
+The extension NuGet package you install depends on the C# mode you're using in your function app:
+
+# [In-process](#tab/in-process)
+
+Functions execute in the same process as the Functions host. To learn more, see [Develop C# class library functions using Azure Functions](functions-dotnet-class-library.md).
+
+# [Isolated process](#tab/isolated-process)
+
+Functions execute in an isolated C# worker process. To learn more, see [Guide for running functions on .NET 5.0 in Azure](dotnet-isolated-process-guide.md).
+
+# [C# script](#tab/csharp-script)
+
+Functions run as C# script, which is supported primarily for C# portal editing. To update existing binding extensions for C# script apps running in the portal without having to republish your function app, see [Update your extensions].
+++
+The process for installing the extension varies depending on the extension version:
-## Packages - Functions 2.x and higher
+<a name="storage-extension"></a>
+<a name="table-api-extension"></a>
-The Table storage bindings are provided in the [Microsoft.Azure.WebJobs.Extensions.Storage](https://www.nuget.org/packages/Microsoft.Azure.WebJobs.Extensions.Storage/4.0.5) NuGet package, version 4.x. Source code for the package is in the [azure-webjobs-sdk](https://github.com/Azure/azure-webjobs-sdk/tree/dev/src/Microsoft.Azure.WebJobs.Extensions.Storage/Tables) GitHub repository.
+# [Combined Azure Storage extension](#tab/storage-extension/in-process)
+Working with the bindings requires that you reference the appropriate NuGet package. Tables are included in a combined package for Azure Storage. Install the [Microsoft.Azure.WebJobs.Extensions.Storage NuGet package][storage-4.x], version 3.x or 4.x.
-## Packages - Functions 1.x
+> [!NOTE]
+> Tables have been moved out of this package starting in its 5.x version. You need to instead use version 4.x of the extension NuGet package or additionally include the [Table API extension](#table-api-extension) when using version 5.x.
-The Table storage bindings are provided in the [Microsoft.Azure.WebJobs](https://www.nuget.org/packages/Microsoft.Azure.WebJobs) NuGet package, version 2.x. Source code for the package is in the [azure-webjobs-sdk](https://github.com/Azure/azure-webjobs-sdk/tree/v2.x/src/Microsoft.Azure.WebJobs.Storage/Table) GitHub repository.
+# [Table API extension (preview)](#tab/table-api/in-process)
+A new Table API extension is now in preview. The new version introduces the ability to use Cosmos DB Table APIs and to [connect to Azure Storage using an identity instead of a secret](./functions-reference.md#configure-an-identity-based-connection). For a tutorial on configuring your function apps with managed identities, see the tutorial [creating a function app with identity-based connections](./functions-identity-based-connections-tutorial.md). For .NET applications, the new extension version also changes the types that you can bind to, replacing the types from `WindowsAzure.Storage` and `Microsoft.Azure.Storage` with newer types from [Azure.Data.Tables](/dotnet/api/azure.data.tables).
+
+This new extension is available by installing the [Microsoft.Azure.WebJobs.Extensions.Tables NuGet package][table-api-package] to a project using version 5.x or higher of the storage extension for [blobs](./functions-bindings-storage-blob.md?tabs=in-process%2Cextensionv5) and [queues](./functions-bindings-storage-queue.md?tabs=in-process%2Cextensionv5).
+
+Using the .NET CLI:
+
+```dotnetcli
+# Install the Tables API extension
+dotnet add package Microsoft.Azure.WebJobs.Extensions.Tables --version 1.0.0-beta.1
+
+# Update the combined Azure Storage extension (to a version which no longer includes Tables)
+dotnet add package Microsoft.Azure.WebJobs.Extensions.Storage --version 5.0.0
+```
+
+> [!IMPORTANT]
+> If you install the Table API extension with the [Microsoft.Azure.WebJobs.Extensions.Tables NuGet package][table-api-package], ensure that you are using [Microsoft.Azure.WebJobs.Extensions.Storage version 5.x or higher][storage-5.x], as prior versions of that package also include the older version of the table bindings. Using an older version of the storage extension will result in conflicts.
+
+Any existing functions in your project that use table bindings may need to be updated to account for changes in allowed parameter types.
+
+# [Functions 1.x](#tab/functionsv1/in-process)
+
+Functions 1.x apps automatically have a reference to the [Microsoft.Azure.WebJobs](https://www.nuget.org/packages/Microsoft.Azure.WebJobs) NuGet package, version 2.x.
[!INCLUDE [functions-storage-sdk-version](../../includes/functions-storage-sdk-version.md)]
+# [Combined Azure Storage extension](#tab/storage-extension/isolated-process)
+
+Tables are included in a combined package for Azure Storage. Install the [Microsoft.Azure.Functions.Worker.Extensions.Storage NuGet package](https://www.nuget.org/packages/Microsoft.Azure.Functions.Worker.Extensions.Storage/4.0.4), version 4.x.
+
+> [!NOTE]
+> Tables have been moved out of this package starting in its 5.x version. You need to instead use version 4.x.
+
+# [Table API extension (preview)](#tab/table-api/isolated-process)
+
+The Table API extension does not currently support isolated process. You will instead need to use the [Storage extension](#storage-extension).
+
+# [Functions 1.x](#tab/functionsv1/isolated-process)
+
+Functions version 1.x doesn't support isolated process.
+
+# [Combined Azure Storage extension](#tab/storage-extension/csharp-script)
+
+You can install this version of the extension in your function app by registering the [extension bundle], version 2.x.
+
+> [!NOTE]
+> Version 3.x of the extension bundle doesn't include the Table Storage bindings. You need to instead use version 2.x for now.
+
+# [Table API extension (preview)](#tab/table-api/csharp-script)
+
+Version 3.x of the extension bundle doesn't currently include the Table API bindings. For now, you need to instead use version 2.x of the extension bundle, which uses the [Storage extension](#storage-extension).
+
+# [Functions 1.x](#tab/functionsv1/csharp-script)
+
+Functions 1.x apps automatically have a reference to the [Microsoft.Azure.WebJobs](https://www.nuget.org/packages/Microsoft.Azure.WebJobs) NuGet package, version 2.x.
+++++
+## Install bundle
+
+The Azure Tables bindings are part of an [extension bundle], which is specified in your host.json project file. If bundles aren't already installed, or you need to change the version of the bindings, you may need to modify this bundle. To learn more, see [extension bundle].
+
+# [Bundle v3.x](#tab/extensionv3)
+
+Version 3.x of the extension bundle doesn't currently include the Azure Tables bindings. You need to instead use version 2.x of the extension bundle.
+
+# [Bundle v2.x](#tab/extensionv2)
+
+You can install this version of the extension in your function app by registering the [extension bundle], version 2.x.
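A sketch of the corresponding *host.json* entry (the version range shown pins the bundle to 2.x releases; adjust it if your app's requirements differ):

```json
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[2.*, 3.0.0)"
  }
}
```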
+
+# [Functions 1.x](#tab/functions1)
+
+Functions 1.x apps automatically have a reference to the extension.
+++

## Next steps

-- [Read table storage data when a function runs](./functions-bindings-storage-table-input.md)
-- [Write table storage data from a function](./functions-bindings-storage-table-output.md)
+- [Read table data when a function runs](./functions-bindings-storage-table-input.md)
+- [Write table data from a function](./functions-bindings-storage-table-output.md)
+
+[NuGet package]: https://www.nuget.org/packages/Microsoft.Azure.WebJobs.Extensions.Storage
+[storage-4.x]: https://www.nuget.org/packages/Microsoft.Azure.WebJobs.Extensions.Storage/4.0.5
+[storage-5.x]: https://www.nuget.org/packages/Microsoft.Azure.WebJobs.Extensions.Storage/5.0.0
+[table-api-package]: https://www.nuget.org/packages/Microsoft.Azure.WebJobs.Extensions.Tables/
+
+[extension bundle]: ./functions-bindings-register.md#extension-bundles
+
+[Update your extensions]: ./functions-bindings-register.md
azure-functions Functions Deployment Slots https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-deployment-slots.md
All slots scale to the same number of workers as the production slot.
## Add a slot
-You can add a slot via the [CLI](/cli/azure/functionapp/deployment/slot#az_functionapp_deployment_slot_create) or through the portal. The following steps demonstrate how to create a new slot in the portal:
+You can add a slot via the [CLI](/cli/azure/functionapp/deployment/slot#az-functionapp-deployment-slot-create) or through the portal. The following steps demonstrate how to create a new slot in the portal:
1. Navigate to your function app.
You can add a slot via the [CLI](/cli/azure/functionapp/deployment/slot#az_funct
## Swap slots
-You can swap slots via the [CLI](/cli/azure/functionapp/deployment/slot#az_functionapp_deployment_slot_swap) or through the portal. The following steps demonstrate how to swap slots in the portal:
+You can swap slots via the [CLI](/cli/azure/functionapp/deployment/slot#az-functionapp-deployment-slot-swap) or through the portal. The following steps demonstrate how to swap slots in the portal:
1. Navigate to the function app.
1. Select **Deployment slots**, and then select **Swap**.
If a swap results in an error or you simply want to "undo" a swap, you can roll
## Remove a slot
-You can remove a slot via the [CLI](/cli/azure/functionapp/deployment/slot#az_functionapp_deployment_slot_delete) or through the portal. The following steps demonstrate how to remove a slot in the portal:
+You can remove a slot via the [CLI](/cli/azure/functionapp/deployment/slot#az-functionapp-deployment-slot-delete) or through the portal. The following steps demonstrate how to remove a slot in the portal:
1. Navigate to **Deployment slots** in the function app, and then select the slot name.
You can remove a slot via the [CLI](/cli/azure/functionapp/deployment/slot#az_fu
Using the [Azure CLI](/cli/azure/functionapp/deployment/slot), you can automate the following actions for a slot:

-- [create](/cli/azure/functionapp/deployment/slot#az_functionapp_deployment_slot_create)
-- [delete](/cli/azure/functionapp/deployment/slot#az_functionapp_deployment_slot_delete)
-- [list](/cli/azure/functionapp/deployment/slot#az_functionapp_deployment_slot_list)
-- [swap](/cli/azure/functionapp/deployment/slot#az_functionapp_deployment_slot_swap)
-- [auto-swap](/cli/azure/functionapp/deployment/slot#az_functionapp_deployment_slot_auto_swap)
+- [create](/cli/azure/functionapp/deployment/slot#az-functionapp-deployment-slot-create)
+- [delete](/cli/azure/functionapp/deployment/slot#az-functionapp-deployment-slot-delete)
+- [list](/cli/azure/functionapp/deployment/slot#az-functionapp-deployment-slot-list)
+- [swap](/cli/azure/functionapp/deployment/slot#az-functionapp-deployment-slot-swap)
+- [auto-swap](/cli/azure/functionapp/deployment/slot#az-functionapp-deployment-slot-auto-swap)
## Change App Service plan
azure-functions Functions Deployment Technologies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-deployment-technologies.md
The following deployment methods are available in Azure Functions.
You can use an external package URL to reference a remote package (.zip) file that contains your function app. The file is downloaded from the provided URL, and the app runs in [Run From Package](run-functions-from-deployment-package.md) mode.
->__How to use it:__ Add [`WEBSITE_RUN_FROM_PACKAGE`](functions-app-settings.md#website_run_from_package) to your application settings. The value of this setting should be a URL (the location of the specific package file you want to run). You can add settings either [in the portal](functions-how-to-use-azure-function-app-settings.md#settings) or [by using the Azure CLI](/cli/azure/functionapp/config/appsettings#az_functionapp_config_appsettings_set).
+>__How to use it:__ Add [`WEBSITE_RUN_FROM_PACKAGE`](functions-app-settings.md#website_run_from_package) to your application settings. The value of this setting should be a URL (the location of the specific package file you want to run). You can add settings either [in the portal](functions-how-to-use-azure-function-app-settings.md#settings) or [by using the Azure CLI](/cli/azure/functionapp/config/appsettings#az-functionapp-config-appsettings-set).
>
> If you use Azure Blob storage, use a private container with a [shared access signature (SAS)](../vs-azure-tools-storage-manage-with-storage-explorer.md#generate-a-sas-in-storage-explorer) to give Functions access to the package. Any time the application restarts, it fetches a copy of the content. Your reference must be valid for the lifetime of the application.
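A hedged sketch of adding this setting from the Azure CLI (placeholder names; the package URL, including any SAS token, is whatever location you host the .zip file at):

```azurecli-interactive
az functionapp config appsettings set --name <FUNCTION_APP_NAME> \
  --resource-group <RESOURCE_GROUP_NAME> \
  --settings WEBSITE_RUN_FROM_PACKAGE="<URL_OF_PACKAGE_ZIP>"
```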
azure-functions Functions Develop Vs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-develop-vs.md
You can also manage application settings in one of these other ways:
* [Use the Azure portal](functions-how-to-use-azure-function-app-settings.md#settings).
* [Use the `--publish-local-settings` publish option in the Azure Functions Core Tools](functions-run-local.md#publish).
-* [Use the Azure CLI](/cli/azure/functionapp/config/appsettings#az_functionapp_config_appsettings_set).
+* [Use the Azure CLI](/cli/azure/functionapp/config/appsettings#az-functionapp-config-appsettings-set).
## Monitoring functions
azure-functions Functions How To Azure Devops https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-how-to-azure-devops.md
Use the option **Deploy to Slot** in the **Azure Function App Deploy** task to s
## Create a pipeline with Azure CLI
-To create a build pipeline in Azure, use the `az functionapp devops-pipeline create` [command](/cli/azure/functionapp/devops-pipeline#az_functionapp_devops_pipeline_create). The build pipeline is created to build and release any code changes that are made in your repo. The command generates a new YAML file that defines the build and release pipeline and then commits it to your repo. The prerequisites for this command depend on the location of your code.
+To create a build pipeline in Azure, use the `az functionapp devops-pipeline create` [command](/cli/azure/functionapp/devops-pipeline#az-functionapp-devops-pipeline-create). The build pipeline is created to build and release any code changes that are made in your repo. The command generates a new YAML file that defines the build and release pipeline and then commits it to your repo. The prerequisites for this command depend on the location of your code.
- If your code is in GitHub:
azure-functions Functions How To Use Azure Function App Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-how-to-use-azure-function-app-settings.md
To add a setting in the portal, select **New application setting** and add the n
# [Azure CLI](#tab/azure-cli)
-The [`az functionapp config appsettings list`](/cli/azure/functionapp/config/appsettings#az_functionapp_config_appsettings_list) command returns the existing application settings, as in the following example:
+The [`az functionapp config appsettings list`](/cli/azure/functionapp/config/appsettings#az-functionapp-config-appsettings-list) command returns the existing application settings, as in the following example:
```azurecli-interactive
az functionapp config appsettings list --name <FUNCTION_APP_NAME> \
  --resource-group <RESOURCE_GROUP_NAME>
```
-The [`az functionapp config appsettings set`](/cli/azure/functionapp/config/appsettings#az_functionapp_config_appsettings_set) command adds or updates an application setting. The following example creates a setting with a key named `CUSTOM_FUNCTION_APP_SETTING` and a value of `12345`:
+The [`az functionapp config appsettings set`](/cli/azure/functionapp/config/appsettings#az-functionapp-config-appsettings-set) command adds or updates an application setting. The following example creates a setting with a key named `CUSTOM_FUNCTION_APP_SETTING` and a value of `12345`:
```azurecli-interactive
az functionapp config appsettings set --name <FUNCTION_APP_NAME> \
  --resource-group <RESOURCE_GROUP_NAME> \
  --settings CUSTOM_FUNCTION_APP_SETTING=12345
```
When you configure the **Allowed origins** list for your function app, the `Acce
When the wildcard (`*`) is used, all other domains are ignored.
-Use the [`az functionapp cors add`](/cli/azure/functionapp/cors#az_functionapp_cors_add) command to add a domain to the allowed origins list. The following example adds the contoso.com domain:
+Use the [`az functionapp cors add`](/cli/azure/functionapp/cors#az-functionapp-cors-add) command to add a domain to the allowed origins list. The following example adds the contoso.com domain:
```azurecli-interactive
az functionapp cors add --name <FUNCTION_APP_NAME> \
  --allowed-origins https://contoso.com
```
-Use the [`az functionapp cors show`](/cli/azure/functionapp/cors#az_functionapp_cors_show) command to list the current allowed origins.
+Use the [`az functionapp cors show`](/cli/azure/functionapp/cors#az-functionapp-cors-show) command to list the current allowed origins.
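The `show` command takes the same identifying parameters as `add`; a hedged sketch with placeholder names:

```azurecli-interactive
az functionapp cors show --name <FUNCTION_APP_NAME> \
  --resource-group <RESOURCE_GROUP_NAME>
```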
### <a name="auth"></a>Authentication
azure-functions Functions Identity Access Azure Sql With Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-identity-access-azure-sql-with-managed-identity.md
# Tutorial: Connect a function app to Azure SQL with managed identity and SQL bindings
-Azure Functions provides a [managed identity](/azure/active-directory/managed-identities-azure-resources/overview), which is a turn-key solution for securing access to [Azure SQL Database](/azure/sql-database/) and other Azure services. Managed identities make your app more secure by eliminating secrets from your app, such as credentials in the connection strings. In this tutorial, you'll add managed identity to an Azure Function that utilizes [Azure SQL bindings](/azure/azure-functions/functions-bindings-azure-sql). A sample Azure Function project with SQL bindings is available in the [ToDo backend example](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/).
+Azure Functions provides a [managed identity](../active-directory/managed-identities-azure-resources/overview.md), which is a turn-key solution for securing access to [Azure SQL Database](/azure/sql-database/) and other Azure services. Managed identities make your app more secure by eliminating secrets from your app, such as credentials in the connection strings. In this tutorial, you'll add managed identity to an Azure Function that utilizes [Azure SQL bindings](./functions-bindings-azure-sql.md). A sample Azure Function project with SQL bindings is available in the [ToDo backend example](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/).
When you're finished with this tutorial, your Azure Function will connect to Azure SQL Database without the need for a username and password.
An overview of the steps you'll take:
First enable Azure AD authentication to SQL database by assigning an Azure AD user as the Active Directory admin of the server. This user is different from the Microsoft account you used to sign up for your Azure subscription. It must be a user that you created, imported, synced, or invited into Azure AD. For more information on allowed Azure AD users, see [Azure AD features and limitations in SQL database](../azure-sql/database/authentication-aad-overview.md#azure-ad-features-and-limitations).
-Enabling Azure AD authentication can be completed via the Azure portal, PowerShell, or Azure CLI. Directions for Azure CLI are below and information completing this via Azure portal and PowerShell is available in the [Azure SQL documentation on Azure AD authentication](/azure/azure-sql/database/authentication-aad-configure).
+Enabling Azure AD authentication can be completed via the Azure portal, PowerShell, or Azure CLI. Directions for the Azure CLI are below; information on completing this via the Azure portal and PowerShell is available in the [Azure SQL documentation on Azure AD authentication](../azure-sql/database/authentication-aad-configure.md).
1. If your Azure AD tenant doesn't have a user yet, create one by following the steps at [Add or delete users using Azure Active Directory](../active-directory/fundamentals/add-users-azure-active-directory.md).
To enable system-assigned managed identity in the Azure portal:
![Turn on system assigned identity for Function app](./media/functions-identity-access-sql-with-managed-identity/function-system-identity.png)
-For information on enabling system-assigned managed identity through Azure CLI or PowerShell, check out more information on [using managed identities with Azure Functions](/azure/app-service/overview-managed-identity?toc=%2Fazure%2Fazure-functions%2Ftoc.json&tabs=dotnet#add-a-system-assigned-identity).
+For information on enabling system-assigned managed identity through Azure CLI or PowerShell, check out more information on [using managed identities with Azure Functions](../app-service/overview-managed-identity.md?tabs=dotnet&toc=%2fazure%2fazure-functions%2ftoc.json#add-a-system-assigned-identity).
## Grant SQL database access to the managed identity
In this step we'll connect to the SQL database with an Azure AD user account and
In the final step we'll configure the Azure Function SQL connection string to use Azure AD managed identity authentication.
-The connection string setting name is identified in our Functions code as the binding attribute "ConnectionStringSetting", as seen in the SQL input binding [attributes and annotations](/azure/azure-functions/functions-bindings-azure-sql-input?tabs=csharp#attributes-and-annotations).
+The connection string setting name is identified in our Functions code as the binding attribute "ConnectionStringSetting", as seen in the SQL input binding [attributes and annotations](./functions-bindings-azure-sql-input.md?tabs=csharp#attributes-and-annotations).
In the application settings of our Function App the SQL connection string setting should be updated to follow this format:
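The exact value isn't reproduced here, but based on the SqlClient managed identity syntax it typically looks something like the following (server and database names are placeholders):

```
Server=demo.database.windows.net; Authentication=Active Directory Managed Identity; Database=testdb
```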
In the application settings of our Function App the SQL connection string settin
- [Read data from a database (Input binding)](./functions-bindings-azure-sql-input.md)
- [Save data to a database (Output binding)](./functions-bindings-azure-sql-output.md)
-- [Review ToDo API sample with Azure SQL bindings](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/)
+- [Review ToDo API sample with Azure SQL bindings](/samples/azure-samples/azure-sql-binding-func-dotnet-todo/todo-backend-dotnet-azure-sql-bindings-azure-functions/)
azure-functions Set Runtime Version https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/set-runtime-version.md
az functionapp config appsettings set --name <FUNCTION_APP> \
Replace `<FUNCTION_APP>` with the name of your function app. Also replace `<RESOURCE_GROUP>` with the name of the resource group for your function app. Also, replace `<VERSION>` with either a specific version, or `~4`, `~3`, `~2`, or `~1`.
-Choose **Try it** in the previous code example to run the command in [Azure Cloud Shell](../cloud-shell/overview.md). You can also run the [Azure CLI locally](/cli/azure/install-azure-cli) to execute this command. When running locally, you must first run [az login](/cli/azure/reference-index#az_login) to sign in.
+Choose **Try it** in the previous code example to run the command in [Azure Cloud Shell](../cloud-shell/overview.md). You can also run the [Azure CLI locally](/cli/azure/install-azure-cli) to execute this command. When running locally, you must first run [az login](/cli/azure/reference-index#az-login) to sign in.
# [PowerShell](#tab/powershell)
az functionapp config set --name <FUNCTION_APP> \
Replace `<FUNCTION_APP>` with the name of your function app. Also replace `<RESOURCE_GROUP>` with the name of the resource group for your function app. Also, replace `<LINUX_FX_VERSION>` with the value of a specific image as described above.
-You can run this command from the [Azure Cloud Shell](../cloud-shell/overview.md) by choosing **Try it** in the preceding code sample. You can also use the [Azure CLI locally](/cli/azure/install-azure-cli) to execute this command after executing [az login](/cli/azure/reference-index#az_login) to sign in.
+You can run this command from the [Azure Cloud Shell](../cloud-shell/overview.md) by choosing **Try it** in the preceding code sample. You can also use the [Azure CLI locally](/cli/azure/install-azure-cli) to execute this command after executing [az login](/cli/azure/reference-index#az-login) to sign in.
# [PowerShell](#tab/powershell)
azure-functions Storage Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/storage-considerations.md
Because Functions use Azure Files during parts of the dynamic scale-out proc
_This functionality is current only available when running on Linux._
-You can mount existing Azure Files shares to your Linux function apps. By mounting a share to your Linux function app, you can leverage existing machine learning models or other data in your functions. You can use the [`az webapp config storage-account add`](/cli/azure/webapp/config/storage-account#az_webapp_config_storage_account_add) command to mount an existing share to your Linux function app.
+You can mount existing Azure Files shares to your Linux function apps. By mounting a share to your Linux function app, you can leverage existing machine learning models or other data in your functions. You can use the [`az webapp config storage-account add`](/cli/azure/webapp/config/storage-account#az-webapp-config-storage-account-add) command to mount an existing share to your Linux function app.
In this command, `share-name` is the name of the existing Azure Files share, and `custom-id` can be any string that uniquely defines the share when mounted to the function app. Also, `mount-path` is the path from which the share is accessed in your function app. `mount-path` must be in the format `/dir-name`, and it can't start with `/home`.
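A hedged example of the mount command (all names are placeholders; `--storage-type AzureFiles` selects an Azure Files share, and `/models` is an illustrative mount path):

```azurecli-interactive
az webapp config storage-account add \
  --name <FUNCTION_APP_NAME> \
  --resource-group <RESOURCE_GROUP_NAME> \
  --custom-id <CUSTOM_ID> \
  --storage-type AzureFiles \
  --account-name <STORAGE_ACCOUNT_NAME> \
  --share-name <SHARE_NAME> \
  --access-key <ACCESS_KEY> \
  --mount-path /models
```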
azure-government Compare Azure Government Global Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-government/compare-azure-government-global-azure.md
recommendations: false Previously updated : 01/19/2022 Last updated : 02/11/2022 # Compare Azure Government and global Azure
Table below lists API endpoints in Azure vs. Azure Government for accessing and
||Azure Database for PostgreSQL|postgres.database.azure.com|postgres.database.usgovcloudapi.net||
||Azure SQL Database|database.windows.net|database.usgovcloudapi.net||
|**Identity**|Azure AD|login.microsoftonline.com|login.microsoftonline.us||
+|||certauth.login.microsoftonline.com|certauth.login.microsoftonline.us||
|**Integration**|Service Bus|servicebus.windows.net|servicebus.usgovcloudapi.net||
|**Internet of Things**|Azure IoT Hub|azure-devices.net|azure-devices.us||
||Azure Maps|atlas.microsoft.com|atlas.azure.us||
azure-government Documentation Government Csp List https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-government/documentation-government-csp-list.md
Below you can find a list of all the authorized Cloud Solution Providers (CSPs),
|CDW Corp.|cdwgsales@cdwg.com|800-808-4239|
|Dell Corp.|Get_Azure@Dell.com|888-375-9857|
|Insight Public Sector|federal@insight.com|800-467-4448|
-|PC Connection|govtssms@connection.com|800-998-0009|
+|PC Connection|govtssms@connection.com|800-800-0019|
|SHI, Inc.|msftgov@shi.com|888-764-8888|
|Minburn Technology Group|microsoft@minburntech.com |571-699-0705 Opt. 1|
azure-government Documentation Government Plan Compliance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-government/documentation-government-plan-compliance.md
You can access Azure and Azure Government audit reports and related documentatio
- STP [Audit Reports](https://servicetrust.microsoft.com/ViewPage/MSComplianceGuideV3), which has a subsection for FedRAMP Reports.
- STP [Data Protection Resources](https://servicetrust.microsoft.com/ViewPage/TrustDocumentsV3), which is further divided into Compliance Guides, FAQ and White Papers, and Pen Test and Security Assessments subsections.
-You must sign in to access audit reports on the STP. For more information, see [Get started with the Microsoft Service Trust Portal](https://aka.ms/stphelp).
+You must sign in to access audit reports on the STP. For more information, see [Get started with the Microsoft Service Trust Portal](/microsoft-365/compliance/get-started-with-service-trust-portal).
Alternatively, you can access certain audit reports and certificates in the Azure or Azure Government portal by navigating to *Home > Security Center > Regulatory compliance > Audit reports* or using direct links based on your subscription (sign in required):
Regulatory compliance in Azure Policy provides built-in initiative definitions t
- [Compare Azure Government and global Azure](./compare-azure-government-global-azure.md)
- [Azure Government services by audit scope](./compliance/azure-services-in-fedramp-auditscope.md#azure-government-services-by-audit-scope)
- [Azure Government isolation guidelines for Impact Level 5 workloads](./documentation-government-impact-level-5.md)
-- [Azure Government DoD overview](./documentation-government-overview-dod.md)
+- [Azure Government DoD overview](./documentation-government-overview-dod.md)
azure-monitor Alerts Troubleshoot Metric https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/alerts-troubleshoot-metric.md
description: Common issues with Azure Monitor metric alerts and possible solutio
Previously updated : 2/10/2022 Last updated : 2/15/2022 # Troubleshooting problems in Azure Monitor metric alerts
Consider one of the following options:
* If your data has weekly seasonality, but not enough history is available for the metric, the calculated thresholds can result in having broad upper and lower bounds. For example, the calculation can treat weekdays and weekends in the same way, and build wide borders that don't always fit the data. This should resolve itself once enough metric history is available, at which point the correct seasonality will be detected and the calculated thresholds will update accordingly.
+## When configuring an alert rule's condition, why is Dynamic threshold disabled?
+While dynamic thresholds are supported for the vast majority of metrics, some metrics can't use them.
+
+The table below lists the metrics that aren't supported by dynamic thresholds.
+
+| Resource Type | Metric Name |
+| | |
+| Microsoft.ClassicStorage/storageAccounts | UsedCapacity |
+| Microsoft.ClassicStorage/storageAccounts/blobServices | BlobCapacity |
+| Microsoft.ClassicStorage/storageAccounts/blobServices | BlobCount |
+| Microsoft.ClassicStorage/storageAccounts/blobServices | IndexCapacity |
+| Microsoft.ClassicStorage/storageAccounts/fileServices | FileCapacity |
+| Microsoft.ClassicStorage/storageAccounts/fileServices | FileCount |
+| Microsoft.ClassicStorage/storageAccounts/fileServices | FileShareCount |
+| Microsoft.ClassicStorage/storageAccounts/fileServices | FileShareSnapshotCount |
+| Microsoft.ClassicStorage/storageAccounts/fileServices | FileShareSnapshotSize |
+| Microsoft.ClassicStorage/storageAccounts/fileServices | FileShareQuota |
+| Microsoft.Compute/disks | Composite Disk Read Bytes/sec |
+| Microsoft.Compute/disks | Composite Disk Read Operations/sec |
+| Microsoft.Compute/disks | Composite Disk Write Bytes/sec |
+| Microsoft.Compute/disks | Composite Disk Write Operations/sec |
+| Microsoft.ContainerService/managedClusters | NodesCount |
+| Microsoft.ContainerService/managedClusters | PodCount |
+| Microsoft.ContainerService/managedClusters | CompletedJobsCount |
+| Microsoft.ContainerService/managedClusters | RestartingContainerCount |
+| Microsoft.ContainerService/managedClusters | OomKilledContainerCount |
+| Microsoft.Devices/IotHubs | TotalDeviceCount |
+| Microsoft.Devices/IotHubs | ConnectedDeviceCount |
+| Microsoft.DocumentDB/databaseAccounts | CassandraConnectionClosures |
+| Microsoft.EventHub/clusters | Size |
+| Microsoft.EventHub/namespaces | Size |
+| Microsoft.IoTCentral/IoTApps | connectedDeviceCount |
+| Microsoft.IoTCentral/IoTApps | provisionedDeviceCount |
+| Microsoft.Kubernetes/connectedClusters | NodesCount |
+| Microsoft.Kubernetes/connectedClusters | PodCount |
+| Microsoft.Kubernetes/connectedClusters | CompletedJobsCount |
+| Microsoft.Kubernetes/connectedClusters | RestartingContainerCount |
+| Microsoft.Kubernetes/connectedClusters | OomKilledContainerCount |
+| Microsoft.MachineLearningServices/workspaces/onlineEndpoints | RequestsPerMinute |
+| Microsoft.MachineLearningServices/workspaces/onlineEndpoints/deployments | DeploymentCapacity |
+| Microsoft.Maps/accounts | CreatorUsage |
+| Microsoft.Media/mediaservices/streamingEndpoints | EgressBandwidth |
+| Microsoft.Network/applicationGateways | Throughput |
+| Microsoft.Network/azureFirewalls | Throughput |
+| Microsoft.Network/expressRouteGateways | ExpressRouteGatewayPacketsPerSecond |
+| Microsoft.Network/expressRouteGateways | ExpressRouteGatewayNumberOfVmInVnet |
+| Microsoft.Network/expressRouteGateways | ExpressRouteGatewayFrequencyOfRoutesChanged |
+| Microsoft.Network/virtualNetworkGateways | ExpressRouteGatewayPacketsPerSecond |
+| Microsoft.Network/virtualNetworkGateways | ExpressRouteGatewayNumberOfVmInVnet |
+| Microsoft.Network/virtualNetworkGateways | ExpressRouteGatewayFrequencyOfRoutesChanged |
+| Microsoft.ServiceBus/namespaces | Size |
+| Microsoft.ServiceBus/namespaces | Messages |
+| Microsoft.ServiceBus/namespaces | ActiveMessages |
+| Microsoft.ServiceBus/namespaces | DeadletteredMessages |
+| Microsoft.ServiceBus/namespaces | ScheduledMessages |
+| Microsoft.ServiceFabricMesh/applications | AllocatedCpu |
+| Microsoft.ServiceFabricMesh/applications | AllocatedMemory |
+| Microsoft.ServiceFabricMesh/applications | ActualCpu |
+| Microsoft.ServiceFabricMesh/applications | ActualMemory |
+| Microsoft.ServiceFabricMesh/applications | ApplicationStatus |
+| Microsoft.ServiceFabricMesh/applications | ServiceStatus |
+| Microsoft.ServiceFabricMesh/applications | ServiceReplicaStatus |
+| Microsoft.ServiceFabricMesh/applications | ContainerStatus |
+| Microsoft.ServiceFabricMesh/applications | RestartCount |
+| Microsoft.Storage/storageAccounts | UsedCapacity |
+| Microsoft.Storage/storageAccounts/blobServices | BlobCapacity |
+| Microsoft.Storage/storageAccounts/blobServices | BlobCount |
+| Microsoft.Storage/storageAccounts/blobServices | BlobProvisionedSize |
+| Microsoft.Storage/storageAccounts/blobServices | IndexCapacity |
+| Microsoft.Storage/storageAccounts/fileServices | FileCapacity |
+| Microsoft.Storage/storageAccounts/fileServices | FileCount |
+| Microsoft.Storage/storageAccounts/fileServices | FileShareCount |
+| Microsoft.Storage/storageAccounts/fileServices | FileShareSnapshotCount |
+| Microsoft.Storage/storageAccounts/fileServices | FileShareSnapshotSize |
+| Microsoft.Storage/storageAccounts/fileServices | FileShareCapacityQuota |
+| Microsoft.Storage/storageAccounts/fileServices | FileShareProvisionedIOPS |
+
## Next steps

- For general troubleshooting information about alerts and notifications, see [Troubleshooting problems in Azure Monitor alerts](alerts-troubleshoot.md).
azure-monitor Annotations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/annotations.md
Select any annotation marker to open details about the release, including reques
Release annotations are a feature of the cloud-based Azure Pipelines service of Azure DevOps.

> [!IMPORTANT]
-> Annotations using API keys is deprecated. We recommend using [Azure CLI](https://docs.microsoft.com/azure/azure-monitor/app/annotations#create-release-annotations-with-azure-cli) instead.
+> Annotations that use API keys are deprecated. We recommend using the [Azure CLI](#create-release-annotations-with-azure-cli) instead.
### Install the annotations extension (one time)
To use the new release annotations:
## Next steps

* [Create work items](./diagnostic-search.md#create-work-item)
-* [Automation with PowerShell](./powershell.md)
+* [Automation with PowerShell](./powershell.md)
azure-monitor App Insights Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/app-insights-overview.md
There are several ways to get started with Application Insights. Begin with what
### Prerequisites

-- You need an Azure account. Application Insights is hosted in Azure, and sends its telemetry to Azure for analysis and presentation. If you don't have an Azure subscription, you can [sign up for free](https://azure.microsoft.com/free). If your organization already has an Azure subscription, an administrator can [add you to it](/azure/active-directory/fundamentals/add-users-azure-active-directory).
+- You need an Azure account. Application Insights is hosted in Azure, and sends its telemetry to Azure for analysis and presentation. If you don't have an Azure subscription, you can [sign up for free](https://azure.microsoft.com/free). If your organization already has an Azure subscription, an administrator can [add you to it](../../active-directory/fundamentals/add-users-azure-active-directory.md).
- The basic [Application Insights pricing plan](https://azure.microsoft.com/pricing/details/application-insights/) has no charge until your app has substantial usage.
When you receive an alert or discover a problem:
[platforms]: ./platforms.md
[portal]: https://portal.azure.com/
[qna]: ../faq.yml
-[redfield]: ./status-monitor-v2-overview.md
-
+[redfield]: ./status-monitor-v2-overview.md
azure-monitor Cloudservices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/cloudservices.md
Did you build for .NET 4.6? .NET 4.6 is not automatically supported in Azure clo
[portal]: https://portal.azure.com/
[qna]: ../faq.yml
[redfield]: ./status-monitor-v2-overview.md
-[start]: ./app-insights-overview.md
+[start]: ./app-insights-overview.md
azure-monitor Convert Classic Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/convert-classic-resource.md
```azurecli
az monitor app-insights component update --app your-app-insights-resource-name -g your_resource_group --workspace "/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/test1234/providers/microsoft.operationalinsights/workspaces/test1234555"
```
-For the full Azure CLI documentation for this command, consult the [Azure CLI documentation](/cli/azure/monitor/app-insights/component#az_monitor_app_insights_component_update).
+For the full Azure CLI documentation for this command, consult the [Azure CLI documentation](/cli/azure/monitor/app-insights/component#az-monitor-app-insights-component-update).
### Azure PowerShell
azure-monitor Create Workspace Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/create-workspace-resource.md
```azurecli
az monitor app-insights component create --app demoApp --location eastus --kind web -g my_resource_group --workspace "/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/test1234/providers/microsoft.operationalinsights/workspaces/test1234555"
```
-For the full Azure CLI documentation for this command, consult the [Azure CLI documentation](/cli/azure/monitor/app-insights/component#az_monitor_app_insights_component_create).
+For the full Azure CLI documentation for this command, consult the [Azure CLI documentation](/cli/azure/monitor/app-insights/component#az-monitor-app-insights-component-create).
### Azure PowerShell
azure-monitor Migrate From Instrumentation Keys To Connection Strings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/migrate-from-instrumentation-keys-to-connection-strings.md
+
+ Title: Migrate from instrumentation keys to connection strings
+description: Learn the steps required to upgrade from Azure Monitor Application Insights instrumentation keys to connection strings
+ Last updated : 02/14/2022++
+# Migration process to connection strings for Application Insights resources
+
+This guide walks through migrating from [instrumentation keys](separate-resources.md#about-resources-and-instrumentation-keys) to [connection strings](sdk-connection-string.md#overview).
+
+## Prerequisites
+
+- A [supported SDK version](#supported-sdk-versions)
+- An existing [Application Insights resource](create-workspace-resource.md)
+
+## Migration process
+
+1. Find your connection string displayed on the Overview blade of your Application Insights resource.
+
+ :::image type="content" source="media/migrate-from-instrumentation-keys-to-connection-strings/migrate-from-instrumentation-keys-to-connection-strings.png" alt-text="Screenshot displaying Application Insights overview and connection string" lightbox="media/migrate-from-instrumentation-keys-to-connection-strings/migrate-from-instrumentation-keys-to-connection-strings.png":::
+
+2. Hover over the connection string and select the "Copy to clipboard" icon.
+
+3. Configure the Application Insights SDK by following [How to set connection strings](sdk-connection-string.md#how-to-set-a-connection-string).
+
+> [!IMPORTANT]
+> Using both a connection string and instrumentation key isn't recommended. Whichever was set last takes precedence.
+
+## Migration at scale (for multiple subscriptions)
+
+You can use environment variables to easily pass a connection string to the Application Insights SDK or agent. If you hardcode an instrumentation key in your application code, that value may take precedence over environment variables.
+
+To set a connection string via environment variable, place the value of the connection string into an environment variable named `APPLICATIONINSIGHTS_CONNECTION_STRING`. This process can be automated in your Azure deployments. For example, the following ARM template shows how you can automatically include the correct connection string with an App Services deployment (be sure to include any other App Settings your app requires):
+
+```JSON
+{
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "appServiceName": {
+ "type": "string",
+ "metadata": {
+ "description": "Name of the App Services resource"
+ }
+ },
+ "appServiceLocation": {
+ "type": "string",
+ "metadata": {
+ "description": "Location to deploy the App Services resource"
+ }
+ },
+ "appInsightsName": {
+ "type": "string",
+ "metadata": {
+ "description": "Name of the existing Application Insights resource to use with this App Service. Expected to be in the same Resource Group."
+ }
+ }
+ },
+ "resources": [
+ {
+ "apiVersion": "2016-03-01",
+ "name": "[parameters('appServiceName')]",
+ "type": "microsoft.web/sites",
+ "location": "[parameters('appServiceLocation')]",
+ "properties": {
+ "siteConfig": {
+ "appSettings": [
+ {
+ "name": "APPLICATIONINSIGHTS_CONNECTION_STRING",
+ "value": "[reference(concat('microsoft.insights/components/', parameters('appInsightsName')), '2015-05-01').ConnectionString]"
+ }
+ ]
+ },
+ "name": "[parameters('appServiceName')]"
+ }
+ }
+ ]
+}
+
+```
+
+## Supported SDK Versions
+
+- .NET and .NET Core v2.12.0
+- Java v2.5.1 and Java 3.0
+- JavaScript v2.3.0
+- NodeJS v1.5.0
+- Python v1.0.0
+
+## New capabilities
+
+Just like instrumentation keys, connection strings identify a resource to associate your telemetry data with. Connection strings provide a single configuration setting and eliminate the need for multiple proxy settings. They're a reliable, secure way to send data to the monitoring service.
+
+Connection strings allow you to take advantage of the latest capabilities of Application Insights.
+
+- **Reliability:** Connection strings make telemetry ingestion more reliable by removing dependencies on global ingestion endpoints.
+
+- **Security:** Connection strings allow authenticated telemetry ingestion by using [Azure AD authentication for Application Insights](azure-ad-authentication.md).
+
+- **Customized endpoints (sovereign or hybrid cloud environments):** Endpoint settings allow sending data to a specific [Azure Government region](custom-endpoints.md#regions-that-require-endpoint-modification). ([see examples](sdk-connection-string.md#how-to-set-a-connection-string))
+
+- **Privacy (regional endpoints):** Connection strings ease privacy concerns by sending data to regional endpoints, ensuring data doesn't leave a geographic region.
+
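Several of these capabilities (customized and regional endpoints) rely on the connection string being a semicolon-delimited list of `key=value` segments. As an illustrative sketch only (not an SDK feature; the sample values are placeholders), the endpoint segment can be pulled out with standard shell tools:

```shell
# Illustrative only: a connection string is a semicolon-delimited list of
# key=value segments. Extract the IngestionEndpoint segment (the sample
# value below is a placeholder, not a real resource).
CS="InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://westus2-0.in.applicationinsights.azure.com/"
ENDPOINT=$(printf '%s' "$CS" | tr ';' '\n' | grep '^IngestionEndpoint=' | cut -d= -f2-)
echo "$ENDPOINT"
```

The SDKs perform this parsing internally; you only need to supply the full string unmodified.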
+## Troubleshooting
+
+Follow these steps if data isn't arriving after migration:
+
+1. Confirm you're using an SDK or agent version that supports connection strings. If you use Application Insights integration in another Azure product offering, check its documentation for how to configure a connection string.
+
+2. Confirm you aren't setting both an instrumentation key and connection string at the same time. Instrumentation key settings should be removed from your configuration.
+
+3. Confirm your connection string is exactly as provided in the Azure portal.
+
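Steps 2 and 3 above can be spot-checked from a shell when you configured the SDK through the environment variable described earlier. The helper below is hypothetical, not part of any SDK:

```shell
# Hypothetical helper: confirm APPLICATIONINSIGHTS_CONNECTION_STRING is set
# and contains an InstrumentationKey segment before digging deeper.
check_cs() {
  case "${APPLICATIONINSIGHTS_CONNECTION_STRING:-}" in
    *InstrumentationKey=*) echo "ok" ;;
    *) echo "missing or malformed" ;;
  esac
}
# Placeholder value for demonstration only.
APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=00000000-0000-0000-0000-000000000000"
check_cs
```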
+## FAQ
+
+### Where else can I find my connection string?
+
+The connection string is also included in the ARM resource properties for your Application Insights resource, under the field name "ConnectionString".
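As a sketch of retrieving it programmatically with the Azure CLI (the resource ID below is a placeholder; the JMESPath query mirrors the ARM property named above). This fragment requires an Azure sign-in, so it's shown for illustration only:

```shell
# Illustrative: read the ConnectionString property from the ARM resource.
az resource show \
  --ids "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/microsoft.insights/components/<app-insights-name>" \
  --query properties.ConnectionString \
  --output tsv
```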
+### How does this impact auto instrumentation?
+
+Auto instrumentation scenarios aren't impacted.
+
+### Can I use Azure AD authentication with auto instrumentation?
+
+You can't enable [Azure AD authentication](azure-ad-authentication.md) for [auto instrumentation](codeless-overview.md) scenarios. We have plans to address this limitation in the future.
+
+### What is the difference between global and regional ingestion?
+
+Global ingestion sends all telemetry data to a single endpoint, regardless of where that data will ultimately be stored. Regional ingestion lets you define specific endpoints per region for data ingestion, ensuring data stays within a specific region during processing and storage.
+
+### How do connection strings impact billing?
+
+Billing isn't impacted.
+
+### Microsoft Q&A
+
+Post questions to the [answers forum](https://docs.microsoft.com/answers/topics/24223/azure-monitor.html).
azure-monitor Profiler Bring Your Own Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/profiler-bring-your-own-storage.md
A BYOS storage account will be linked to an Application Insights resource. There
First, the Application Insights Profiler and Snapshot Debugger services need to be granted access to the storage account. To grant access, add the role `Storage Blob Data Contributor` to the AAD application named `Diagnostic Services Trusted Storage Access` via the Access Control (IAM) page in your storage account, as shown in Figure 1.0. Steps:
-1. Click on the "Add" button in the "Add a role assignment" section
-1. Select "Storage Blob Data Contributor" role
-1. Select "Azure AD user, group, or service principal" in the "Assign access to" section
-1. Search & select "Diagnostic Services Trusted Storage Access" app
-1. Save changes
-
-_![Figure 1.0](media/profiler-bring-your-own-storage/figure-10.png)_
-_Figure 1.0_
+
+1. Select **Access control (IAM)**.
+
+1. Select **Add** > **Add role assignment** to open the Add role assignment page.
+
+1. Assign the following role. For detailed steps, see [Assign Azure roles using the Azure portal](../../role-based-access-control/role-assignments-portal.md).
+
+ | Setting | Value |
+ | | |
+ | Role | Storage Blob Data Contributor |
+ | Assign access to | User, group, or service principal |
+ | Members | Diagnostic Services Trusted Storage Access |
+
+ ![Add role assignment page in Azure portal.](../../../includes/role-based-access-control/media/add-role-assignment-page.png)
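The portal steps above can also be scripted. A sketch with the Azure CLI, assuming you have looked up the object ID of the `Diagnostic Services Trusted Storage Access` service principal in your tenant (all bracketed values are placeholders):

```shell
# Sketch only: grant Storage Blob Data Contributor on the storage account
# to the Diagnostic Services Trusted Storage Access principal.
az role assignment create \
  --role "Storage Blob Data Contributor" \
  --assignee "<object-id-of-the-principal>" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```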
After you add the role, it appears under the "Role assignments" section, as shown in Figure 1.1. _![Figure 1.1](media/profiler-bring-your-own-storage/figure-11.png)_
azure-monitor Profiler https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/profiler.md
As of today, Profiler only supports Azure AD authentication when you reference a
Below you can find all the steps required to enable Azure AD for profiles ingestion: 1. Create and add the managed identity you want to use to authenticate against your Application Insights resource to your App Service.
- a. For System-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-system-assigned-identity)
+ a. For System-Assigned Managed identity, see the following [documentation](../../app-service/overview-managed-identity.md?tabs=portal%2chttp#add-a-system-assigned-identity)
- b. For User-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-user-assigned-identity)
+ b. For User-Assigned Managed identity, see the following [documentation](../../app-service/overview-managed-identity.md?tabs=portal%2chttp#add-a-user-assigned-identity)
-2. Configure and enable Azure AD in your Application Insights resource. For more information, see the following [documentation](https://docs.microsoft.com/azure/azure-monitor/app/azure-ad-authentication?tabs=net#configuring-and-enabling-azure-ad-based-authentication)
+2. Configure and enable Azure AD in your Application Insights resource. For more information, see the following [documentation](./azure-ad-authentication.md?tabs=net#configuring-and-enabling-azure-ad-based-authentication)
3. Add the following application setting, which tells the Profiler agent which managed identity to use: For System-Assigned Identity:
Profiler's files can be deleted when using WebDeploy to deploy changes to your w
[Enablement UI]: ./media/profiler/Enablement_UI.png [profiler-app-setting]:./media/profiler/profiler-app-setting.png
-[disable-profiler-webjob]: ./media/profiler/disable-profiler-webjob.png
+[disable-profiler-webjob]: ./media/profiler/disable-profiler-webjob.png
azure-monitor Snapshot Debugger Appservice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/snapshot-debugger-appservice.md
As of today, Snapshot Debugger only supports Azure AD authentication when you re
Below you can find all the steps required to enable Azure AD for profiles ingestion: 1. Create and add the managed identity you want to use to authenticate against your Application Insights resource to your App Service.
- a. For System-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-system-assigned-identity)
+ a. For System-Assigned Managed identity, see the following [documentation](../../app-service/overview-managed-identity.md?tabs=portal%2chttp#add-a-system-assigned-identity)
- b. For User-Assigned Managed identity, see the following [documentation](https://docs.microsoft.com/azure/app-service/overview-managed-identity?tabs=portal%2Chttp#add-a-user-assigned-identity)
+ b. For User-Assigned Managed identity, see the following [documentation](../../app-service/overview-managed-identity.md?tabs=portal%2chttp#add-a-user-assigned-identity)
-2. Configure and enable Azure AD in your Application Insights resource. For more information, see the following [documentation](https://docs.microsoft.com/azure/azure-monitor/app/azure-ad-authentication?tabs=net#configuring-and-enabling-azure-ad-based-authentication)
+2. Configure and enable Azure AD in your Application Insights resource. For more information, see the following [documentation](./azure-ad-authentication.md?tabs=net#configuring-and-enabling-azure-ad-based-authentication)
3. Add the following application setting, which tells the Snapshot Debugger agent which managed identity to use: For System-Assigned Identity:
For an Azure App Service, you can set app settings within the Azure Resource Man
- For help with troubleshooting Snapshot Debugger issues, see [Snapshot Debugger troubleshooting](snapshot-debugger-troubleshoot.md?toc=/azure/azure-monitor/toc.json). [Enablement UI]: ./media/snapshot-debugger/enablement-ui.png
-[snapshot-debugger-app-setting]:./media/snapshot-debugger/snapshot-debugger-app-setting.png
+[snapshot-debugger-app-setting]:./media/snapshot-debugger/snapshot-debugger-app-setting.png
azure-monitor Diagnostic Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/diagnostic-settings.md
Set-AzDiagnosticSetting -Name KeyVault-Diagnostics -ResourceId /subscriptions/xx
## Create using Azure CLI
-Use the [az monitor diagnostic-settings create](/cli/azure/monitor/diagnostic-settings#az_monitor_diagnostic_settings_create) command to create a diagnostic setting with [Azure CLI](/cli/azure/monitor). See the documentation for this command for descriptions of its parameters.
+Use the [az monitor diagnostic-settings create](/cli/azure/monitor/diagnostic-settings#az-monitor-diagnostic-settings-create) command to create a diagnostic setting with [Azure CLI](/cli/azure/monitor). See the documentation for this command for descriptions of its parameters.
> [!IMPORTANT] > You cannot use this method for the Azure Activity log. Instead, use [Create diagnostic setting in Azure Monitor using a Resource Manager template](./resource-manager-diagnostic-settings.md) to create a Resource Manager template and deploy it with CLI.
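As a hedged sketch of the command (all IDs are placeholders; valid log and metric categories depend on the target resource type, so check your resource's documentation before copying these values):

```shell
# Illustrative only: route a resource's logs and metrics to a Log Analytics
# workspace. Category names vary by resource type.
az monitor diagnostic-settings create \
  --name myDiagnosticSetting \
  --resource "<resource-id>" \
  --workspace "<log-analytics-workspace-id>" \
  --logs '[{"category": "AuditEvent", "enabled": true}]' \
  --metrics '[{"category": "AllMetrics", "enabled": true}]'
```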
azure-monitor Solutions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/insights/solutions.md
Select the name of a solution to open its summary page. This page displays any v
### [Azure CLI](#tab/azure-cli)
-To list the monitoring solutions installed in your subscription, use the [az monitor log-analytics solution list](/cli/azure/monitor/log-analytics/solution#az_monitor_log_analytics_solution_list) command. Before you run the command, follow the prerequisites in [Install a monitoring solution](#install-a-monitoring-solution).
+To list the monitoring solutions installed in your subscription, use the [az monitor log-analytics solution list](/cli/azure/monitor/log-analytics/solution#az-monitor-log-analytics-solution-list) command. Before you run the command, follow the prerequisites in [Install a monitoring solution](#install-a-monitoring-solution).
```azurecli # List all log-analytics solutions in the current subscription.
Members of the community can submit management solutions to Azure Quickstart Tem
1. Sign in.
- If you're using a local installation of the CLI, sign in by using the [az login](/cli/azure/reference-index#az_login) command. Follow the steps displayed in your terminal to complete the authentication process.
+ If you're using a local installation of the CLI, sign in by using the [az login](/cli/azure/reference-index#az-login) command. Follow the steps displayed in your terminal to complete the authentication process.
```azurecli az login
To remove an installed solution by using the portal, find it in the [list of ins
### [Azure CLI](#tab/azure-cli)
-To remove an installed solution by using the Azure CLI, use the [az monitor log-analytics solution delete](/cli/azure/monitor/log-analytics/solution#az_monitor_log_analytics_solution_delete) command.
+To remove an installed solution by using the Azure CLI, use the [az monitor log-analytics solution delete](/cli/azure/monitor/log-analytics/solution#az-monitor-log-analytics-solution-delete) command.
```azurecli az monitor log-analytics solution delete --name
azure-monitor Monitor Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/monitor-reference.md
The table below lists the available curated visualizations and more detailed inf
|Name with docs link| State | [Azure portal Link](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/more)| Description | |:--|:--|:--|:--|
-| [Azure Monitor Workbooks for Azure Active Directory](/azure/active-directory/reports-monitoring/howto-use-azure-monitor-workbooks) | GA (General availability) | [Yes](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Workbooks) | Azure Active Directory provides workbooks to understand the effect of your Conditional Access policies, to troubleshoot sign-in failures, and to identify legacy authentications. |
-| [Azure Backup](/azure/backup/backup-azure-monitoring-use-azuremonitor) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_DataProtection/BackupCenterMenuBlade/backupReportsConfigure/menuId/backupReportsConfigure) | Provides built-in monitoring and alerting capabilities in a Recovery Services vault. |
-| [Azure Monitor for Azure Cache for Redis (preview)](/azure/azure-monitor/insights/redis-cache-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/redisCacheInsights) | Provides a unified, interactive view of overall performance, failures, capacity, and operational health |
-| [Azure Cosmos DB Insights](/azure/azure-monitor/insights/cosmosdb-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/cosmosDBInsights) | Provides a view of the overall performance, failures, capacity, and operational health of all your Azure Cosmos DB resources in a unified interactive experience. |
+| [Azure Monitor Workbooks for Azure Active Directory](../active-directory/reports-monitoring/howto-use-azure-monitor-workbooks.md) | GA (General availability) | [Yes](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Workbooks) | Azure Active Directory provides workbooks to understand the effect of your Conditional Access policies, to troubleshoot sign-in failures, and to identify legacy authentications. |
+| [Azure Backup](../backup/backup-azure-monitoring-use-azuremonitor.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_DataProtection/BackupCenterMenuBlade/backupReportsConfigure/menuId/backupReportsConfigure) | Provides built-in monitoring and alerting capabilities in a Recovery Services vault. |
+| [Azure Monitor for Azure Cache for Redis (preview)](./insights/redis-cache-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/redisCacheInsights) | Provides a unified, interactive view of overall performance, failures, capacity, and operational health |
+| [Azure Cosmos DB Insights](./insights/cosmosdb-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/cosmosDBInsights) | Provides a view of the overall performance, failures, capacity, and operational health of all your Azure Cosmos DB resources in a unified interactive experience. |
| [Azure Container Insights](/azure/azure-monitor/insights/container-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/containerInsights) | Monitors the performance of container workloads that are deployed to managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS). It gives you performance visibility by collecting metrics from controllers, nodes, and containers that are available in Kubernetes through the Metrics API. Container logs are also collected. After you enable monitoring from Kubernetes clusters, these metrics and logs are automatically collected for you through a containerized version of the Log Analytics agent for Linux. | | [Azure Data Explorer insights](/azure/azure-monitor/insights/data-explorer) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/adxClusterInsights) | Azure Data Explorer Insights provides comprehensive monitoring of your clusters by delivering a unified view of your cluster performance, operations, usage, and failures. | | [Azure HDInsight (preview)](/azure/hdinsight/log-analytics-migration#insights) | Preview | No | An Azure Monitor workbook that collects important performance metrics from your HDInsight cluster and provides the visualizations and dashboards for most common scenarios. Gives a complete view of a single HDInsight cluster including resource utilization and application status|
The table below lists the available curated visualizations and more detailed inf
| [Azure Service Bus Insights](/azure/service-bus-messaging/service-bus-insights) | Preview | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/serviceBusInsights) | Azure Service Bus insights provide a view of the overall performance, failures, capacity, and operational health of all your Service Bus resources in a unified interactive experience. | | [Azure SQL insights](/azure/azure-monitor/insights/sql-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/sqlWorkloadInsights) | A comprehensive interface for monitoring any product in the Azure SQL family. SQL insights uses dynamic management views to expose the data you need to monitor health, diagnose problems, and tune performance. Note: If you are just setting up SQL monitoring, use this instead of the SQL Analytics solution. | | [Azure Storage Insights](/azure/azure-monitor/insights/storage-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/storageInsights) | Provides comprehensive monitoring of your Azure Storage accounts by delivering a unified view of your Azure Storage services performance, capacity, and availability. |
- | [Azure Network Insights](/azure/azure-monitor/insights/network-insights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/networkInsights) | Provides a comprehensive view of health and metrics for all your network resource. The advanced search capability helps you identify resource dependencies, enabling scenarios like identifying resource that are hosting your website, by simply searching for your website name. |
- | [Azure Monitor for Resource Groups](/azure/azure-monitor/insights/resource-group-insights) | GA | No | Triage and diagnose any problems your individual resources encounter, while offering context as to the health and performance of the resource group as a whole. |
- | [Azure Monitor SAP](/azure/virtual-machines/workloads/sap/monitor-sap-on-azure) | GA | No | An Azure-native monitoring product for anyone running their SAP landscapes on Azure. It works with both SAP on Azure Virtual Machines and SAP on Azure Large Instances. Collects telemetry data from Azure infrastructure and databases in one central location and visually correlate the data for faster troubleshooting. You can monitor different components of an SAP landscape, such as Azure virtual machines (VMs), high-availability cluster, SAP HANA database, SAP NetWeaver, and so on, by adding the corresponding provider for that component. |
+ | [Azure Network Insights](./insights/network-insights-overview.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/networkInsights) | Provides a comprehensive view of health and metrics for all your network resources. The advanced search capability helps you identify resource dependencies, enabling scenarios like identifying the resources that host your website, simply by searching for your website name. |
+ | [Azure Monitor for Resource Groups](./insights/resource-group-insights.md) | GA | No | Triage and diagnose any problems your individual resources encounter, while offering context as to the health and performance of the resource group as a whole. |
+ | [Azure Monitor SAP](../virtual-machines/workloads/sap/monitor-sap-on-azure.md) | GA | No | An Azure-native monitoring product for anyone running their SAP landscapes on Azure. It works with both SAP on Azure Virtual Machines and SAP on Azure Large Instances. It collects telemetry data from Azure infrastructure and databases in one central location and visually correlates the data for faster troubleshooting. You can monitor different components of an SAP landscape, such as Azure virtual machines (VMs), high-availability cluster, SAP HANA database, SAP NetWeaver, and so on, by adding the corresponding provider for that component. |
| [Azure Stack HCI insights](/azure-stack/hci/manage/azure-stack-hci-insights) | Preview | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/azureStackHCIInsights) | Azure Monitor Workbook based. Provides health, performance, and usage insights about registered Azure Stack HCI, version 21H2 clusters that are connected to Azure and are enrolled in monitoring. It stores its data in a Log Analytics workspace, which allows it to deliver powerful aggregation and filtering and analyze data trends over time. | | [Azure VM Insights](/azure/azure-monitor/insights/vminsights-overview) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_Monitoring/AzureMonitoringBrowseBlade/virtualMachines) | Monitors your Azure virtual machines (VM) and virtual machine scale sets at scale. It analyzes the performance and health of your Windows and Linux VMs, and monitors their processes and dependencies on other resources and external processes. |
- | [Azure Virtual Desktop Insights](/azure/virtual-desktop/azure-monitor) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/insights/menuId/insights) | Azure Virtual Desktop Insights is a dashboard built on Azure Monitor Workbooks that helps IT professionals understand their Windows Virtual Desktop environments. |
+ | [Azure Virtual Desktop Insights](../virtual-desktop/azure-monitor.md) | GA | [Yes](https://portal.azure.com/#blade/Microsoft_Azure_WVD/WvdManagerMenuBlade/insights/menuId/insights) | Azure Virtual Desktop Insights is a dashboard built on Azure Monitor Workbooks that helps IT professionals understand their Windows Virtual Desktop environments. |
## Product integrations
The other services and older monitoring solutions in the following table store t
|:|:| | [Azure Automation](../automation/index.yml) | Manage operating system updates and track changes on Windows and Linux computers. See [Change Tracking](../automation/change-tracking/overview.md) and [Update Management](../automation/update-management/overview.md). | | [Azure Information Protection](/azure/information-protection/) | Classify and optionally protect documents and emails. See [Central reporting for Azure Information Protection](/azure/information-protection/reports-aip#configure-a-log-analytics-workspace-for-the-reports). |
-| [Defender for the Cloud (was Azure Security Center)](/azure/defender-for-cloud/defender-for-cloud-introduction/) | Collect and analyze security events and perform threat analysis. See [Data collection in Defender for the Cloud](/azure/defender-for-cloud/enable-data-collection) |
+| [Defender for the Cloud (was Azure Security Center)](../defender-for-cloud/defender-for-cloud-introduction.md) | Collect and analyze security events and perform threat analysis. See [Data collection in Defender for the Cloud](../defender-for-cloud/enable-data-collection.md) |
| [Microsoft Sentinel](../sentinel/index.yml) | Connects to different sources including Office 365 and Amazon Web Services Cloud Trail. See [Connect data sources](../sentinel/connect-data-sources.md). | | [Microsoft Intune](/intune/) | Create a diagnostic setting to send logs to Azure Monitor. See [Send log data to storage, Event Hubs, or log analytics in Intune (preview)](/intune/fundamentals/review-logs-using-azure-monitor). | | Network [Traffic Analytics](../network-watcher/traffic-analytics.md) | Analyzes Network Watcher network security group (NSG) flow logs to provide insights into traffic flow in your Azure cloud. |
The following table lists Azure services and the data they collect into Azure Mo
- Read more about the [Azure Monitor data platform which stores the logs and metrics collected by insights and solutions](data-platform.md). - Complete a [tutorial on monitoring an Azure resource](essentials/tutorial-resource-logs.md). - Complete a [tutorial on writing a log query to analyze data in Azure Monitor Logs](essentials/tutorial-resource-logs.md).-- Complete a [tutorial on creating a metrics chart to analyze data in Azure Monitor Metrics](essentials/tutorial-metrics.md).
+- Complete a [tutorial on creating a metrics chart to analyze data in Azure Monitor Metrics](essentials/tutorial-metrics.md).
azure-netapp-files Azacsnap Preview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/azacsnap-preview.md
> PREVIEWS ARE PROVIDED "AS-IS," "WITH ALL FAULTS," AND "AS AVAILABLE," AND ARE EXCLUDED FROM THE SERVICE LEVEL AGREEMENTS AND LIMITED WARRANTY > ref: https://azure.microsoft.com/support/legal/preview-supplemental-terms/
-This article provides a guide on setup and usage of the new features in preview for **AzAcSnap v5.1**. These new features can be used with Azure NetApp Files, Azure BareMetal, and now Azure Managed Disk. This guide should be read along with the documentation for the generally available version of AzAcSnap at [aka.ms/azacsnap](https://aka.ms/azacsnap).
+This article provides a guide on setup and usage of the new features in preview for **AzAcSnap v5.1**. These new features can be used with Azure NetApp Files, Azure BareMetal, and now Azure Managed Disk. This guide should be read along with the documentation for the generally available version of AzAcSnap at [aka.ms/azacsnap](./azacsnap-introduction.md).
The four new preview features provided with AzAcSnap v5.1 are: - Oracle Database support
to this document for details on using the preview features.
New database platforms and operating systems supported with this preview release. - **Databases**
- - Oracle Database release 12 or later (refer to [Oracle VM images and their deployment on Microsoft Azure](/azure/virtual-machines/workloads/oracle/oracle-vm-solutions) for details)
+ - Oracle Database release 12 or later (refer to [Oracle VM images and their deployment on Microsoft Azure](../virtual-machines/workloads/oracle/oracle-vm-solutions.md) for details)
- **Operating Systems** - Oracle Linux 7+
PORTAL_GENERATED_SAS="https://<targetstorageaccount>.blob.core.windows.net/<blob
- [Get started](azacsnap-get-started.md) - [Test AzAcSnap](azacsnap-cmd-ref-test.md)-- [Back up using AzAcSnap](azacsnap-cmd-ref-backup.md)
+- [Back up using AzAcSnap](azacsnap-cmd-ref-backup.md)
azure-netapp-files Create Active Directory Connections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/create-active-directory-connections.md
na Previously updated : 01/21/2022 Last updated : 02/15/2022 # Create and manage Active Directory connections for Azure NetApp Files
Several features of Azure NetApp Files require that you have an Active Directory
* If you change the password of the Active Directory user account that is used in Azure NetApp Files, be sure to update the password configured in the [Active Directory Connections](#create-an-active-directory-connection). Otherwise, you will not be able to create new volumes, and your access to existing volumes might also be affected depending on the setup.
+* Before you can remove an Active Directory connection from your NetApp account, you must first remove all volumes associated with it.
+ * Proper ports must be open on the applicable Windows Active Directory (AD) server. The required ports are as follows:
This setting is configured in the **Active Directory Connections** under **NetAp
## Create an Active Directory connection
-1. From your NetApp account, click **Active Directory connections**, then click **Join**.
+1. From your NetApp account, select **Active Directory connections**, then select **Join**.
Azure NetApp Files supports only one Active Directory connection within the same region and the same subscription. If Active Directory is already configured by another NetApp account in the same subscription and region, you cannot configure and join a different Active Directory from your NetApp account. However, you can enable the Shared AD feature to allow an Active Directory configuration to be shared by multiple NetApp accounts within the same subscription and the same region. See [Map multiple NetApp accounts in the same subscription and region to an AD connection](#shared_ad).
This setting is configured in the **Active Directory Connections** under **NetAp
![Active Directory credentials](../media/azure-netapp-files/active-directory-credentials.png)
-3. Click **Join**.
+3. Select **Join**.
The Active Directory connection you created appears.
azure-portal How To Create Azure Support Request https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-portal/supportability/how-to-create-azure-support-request.md
You can get to **Help + support** in the Azure portal. It's available from the A
### Azure role-based access control
-To create a support request, you must have the [Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor) role, or a custom role with [Microsoft.Support/*](/azure/role-based-access-control/resource-provider-operations#microsoftsupport), at the subscription level.
+To create a support request, you must have the [Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor) role, or a custom role with [Microsoft.Support/*](../../role-based-access-control/resource-provider-operations.md#microsoftsupport), at the subscription level.
To create a support request without a subscription, for example an Azure Active Directory scenario, you must be an [Admin](../../active-directory/roles/permissions-reference.md). > [!IMPORTANT]
-> If a support request requires investigation into multiple subscriptions, you must have the required access for each subscription involved ([Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), [Reader](../../role-based-access-control/built-in-roles.md#reader), [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor), or a custom role with the [Microsoft.Support/supportTickets/read](/azure/role-based-access-control/resource-provider-operations#microsoftsupport) permission).
+> If a support request requires investigation into multiple subscriptions, you must have the required access for each subscription involved ([Owner](../../role-based-access-control/built-in-roles.md#owner), [Contributor](../../role-based-access-control/built-in-roles.md#contributor), [Reader](../../role-based-access-control/built-in-roles.md#reader), [Support Request Contributor](../../role-based-access-control/built-in-roles.md#support-request-contributor), or a custom role with the [Microsoft.Support/supportTickets/read](../../role-based-access-control/resource-provider-operations.md#microsoftsupport) permission).
### Go to Help + support from the global header
Follow these links to learn more:
* [Azure support ticket REST API](/rest/api/support) * Engage with us on [Twitter](https://twitter.com/azuresupport) * Get help from your peers in the [Microsoft Q&A question page](/answers/products/azure)
-* Learn more in [Azure Support FAQ](https://azure.microsoft.com/support/faq)
+* Learn more in [Azure Support FAQ](https://azure.microsoft.com/support/faq)
azure-resource-manager Deploy Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/bicep/deploy-cli.md
You need a Bicep file to deploy. The file must be local.
You need Azure CLI and to be connected to Azure: - **Install Azure CLI commands on your local computer.** To deploy Bicep files, you need [Azure CLI](/cli/azure/install-azure-cli) version **2.20.0 or later**.-- **Connect to Azure by using [az login](/cli/azure/reference-index#az_login)**. If you have multiple Azure subscriptions, you might also need to run [az account set](/cli/azure/account#az_account_set).
+- **Connect to Azure by using [az login](/cli/azure/reference-index#az-login)**. If you have multiple Azure subscriptions, you might also need to run [az account set](/cli/azure/account#az-account-set).
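For example (the subscription name below is a placeholder):

```azurecli-interactive
az login
az account set --subscription "<subscription-name-or-id>"
```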
Samples for the Azure CLI are written for the `bash` shell. To run this sample in Windows PowerShell or Command Prompt, you may need to change elements of the script.
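The main difference is line continuation. In `bash`, a trailing backslash continues the command (the resource group and file names here are placeholders):

```azurecli-interactive
az deployment group create \
  --resource-group exampleGroup \
  --template-file main.bicep
```

In Windows PowerShell, replace each trailing backslash with a backtick (`` ` ``); in Command Prompt, use a caret (`^`).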
If you don't have Azure CLI installed, you can use Azure Cloud Shell. For more i
You can target your deployment to a resource group, subscription, management group, or tenant. Depending on the scope of the deployment, you use different commands.
-* To deploy to a **resource group**, use [az deployment group create](/cli/azure/deployment/group#az_deployment_group_create):
+* To deploy to a **resource group**, use [az deployment group create](/cli/azure/deployment/group#az-deployment-group-create):
```azurecli-interactive az deployment group create --resource-group <resource-group-name> --template-file <path-to-bicep> ```
-* To deploy to a **subscription**, use [az deployment sub create](/cli/azure/deployment/sub#az_deployment_sub_create):
+* To deploy to a **subscription**, use [az deployment sub create](/cli/azure/deployment/sub#az-deployment-sub-create):
```azurecli-interactive az deployment sub create --location <location> --template-file <path-to-bicep>
You can target your deployment to a resource group, subscription, management gro
For more information about subscription level deployments, see [Create resource groups and resources at the subscription level](deploy-to-subscription.md).
-* To deploy to a **management group**, use [az deployment mg create](/cli/azure/deployment/mg#az_deployment_mg_create):
+* To deploy to a **management group**, use [az deployment mg create](/cli/azure/deployment/mg#az-deployment-mg-create):
```azurecli-interactive az deployment mg create --location <location> --template-file <path-to-bicep>
You can target your deployment to a resource group, subscription, management gro
For more information about management group level deployments, see [Create resources at the management group level](deploy-to-management-group.md).
-* To deploy to a **tenant**, use [az deployment tenant create](/cli/azure/deployment/tenant#az_deployment_tenant_create):
+* To deploy to a **tenant**, use [az deployment tenant create](/cli/azure/deployment/tenant#az-deployment-tenant-create):
```azurecli-interactive az deployment tenant create --location <location> --template-file <path-to-bicep>
The deployment can take a few minutes to complete. When it finishes, you see a m
## Deploy remote Bicep file
-Currently, Azure CLI doesn't support deploying remote Bicep files. You can use [Bicep CLI](./install.md#vs-code-and-bicep-extension) to [build](/cli/azure/bicep#az_bicep_build) the Bicep file to a JSON template, and then load the JSON file to the remote location.
+Currently, Azure CLI doesn't support deploying remote Bicep files. You can use [Bicep CLI](./install.md#vs-code-and-bicep-extension) to [build](/cli/azure/bicep) the Bicep file to a JSON template, and then load the JSON file to the remote location.
## Parameters
azure-resource-manager Deploy To Resource Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/bicep/deploy-to-resource-group.md
To deploy to a resource group, use the resource group deployment commands.
# [Azure CLI](#tab/azure-cli)
-For Azure CLI, use [az deployment group create](/cli/azure/deployment/group#az_deployment_group_create). The following example deploys a template to create a resource group:
+For Azure CLI, use [az deployment group create](/cli/azure/deployment/group#az-deployment-group-create). The following example deploys a template to create a resource group:
```azurecli-interactive az deployment group create \
azure-resource-manager Lock Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/lock-resources.md
Remove-AzResourceLock -LockId $lockId
### Azure CLI
-You lock deployed resources with Azure CLI by using the [az lock create](/cli/azure/lock#az_lock_create) command.
+You lock deployed resources with Azure CLI by using the [az lock create](/cli/azure/lock#az-lock-create) command.
To lock a resource, provide the name of the resource, its resource type, and its resource group name.
To lock a resource group, provide the name of the resource group.
az lock create --name LockGroup --lock-type CanNotDelete --resource-group exampleresourcegroup ```
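To lock an individual resource rather than the whole group, a sketch along these lines should work (the web app name and type are illustrative):

```azurecli-interactive
az lock create --name LockSite --lock-type CanNotDelete \
  --resource-group exampleresourcegroup \
  --resource-name examplesite \
  --resource-type Microsoft.Web/sites
```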
-To get information about a lock, use [az lock list](/cli/azure/lock#az_lock_list). To get all the locks in your subscription, use:
+To get information about a lock, use [az lock list](/cli/azure/lock#az-lock-list). To get all the locks in your subscription, use:
```azurecli az lock list
azure-resource-manager Manage Resource Groups Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/manage-resource-groups-cli.md
The resource group stores metadata about the resources. When you specify a locat
## Create resource groups
-to create a resource group, use [az group create](/cli/azure/group#az_group_create).
+To create a resource group, use [az group create](/cli/azure/group#az-group-create).
```azurecli-interactive az group create --name demoResourceGroup --location westus
az group create --name demoResourceGroup --location westus
## List resource groups
-To list the resource groups in your subscription, use [az group list](/cli/azure/group#az_group_list).
+To list the resource groups in your subscription, use [az group list](/cli/azure/group#az-group-list).
```azurecli-interactive az group list ```
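The output can be narrowed with a JMESPath query; for example, to show only the groups in a given location:

```azurecli-interactive
az group list --query "[?location=='westus']" --output table
```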
-To get one resource group, use [az group show](/cli/azure/group#az_group_show).
+To get one resource group, use [az group show](/cli/azure/group#az-group-show).
```azurecli-interactive az group show --name exampleGroup
az group show --name exampleGroup
## Delete resource groups
-To delete a resource group, use [az group delete](/cli/azure/group#az_group_delete).
+To delete a resource group, use [az group delete](/cli/azure/group#az-group-delete).
```azurecli-interactive az group delete --name exampleGroup
The following example creates a storage account. The name you provide for the st
az storage account create --resource-group exampleGroup --name examplestore --location westus --sku Standard_LRS --kind StorageV2 ```
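Because storage account names must be globally unique, you may want to verify that the name is free before creating the account:

```azurecli-interactive
az storage account check-name --name examplestore
```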
-To deploy an ARM template or Bicep file, use [az deployment group create](/cli/azure/deployment/group#az_deployment_group_create).
+To deploy an ARM template or Bicep file, use [az deployment group create](/cli/azure/deployment/group#az-deployment-group-create).
```azurecli-interactive az deployment group create --resource-group exampleGroup --template-file storage.bicep
For more information about deploying a Bicep file, see [Deploy resources with Bi
Locking prevents other users in your organization from accidentally deleting or modifying critical resources.
-To prevent a resource group and its resources from being deleted, use [az lock create](/cli/azure/lock#az_lock_create).
+To prevent a resource group and its resources from being deleted, use [az lock create](/cli/azure/lock#az-lock-create).
```azurecli-interactive az lock create --name LockGroup --lock-type CanNotDelete --resource-group exampleGroup ```
-To get the locks for a resource group, use [az lock list](/cli/azure/lock#az_lock_list).
+To get the locks for a resource group, use [az lock list](/cli/azure/lock#az-lock-list).
```azurecli-interactive az lock list --resource-group exampleGroup ```
-To delete a lock, use [az lock delete](/cli/azure/lock#az_lock_delete)
+To delete a lock, use [az lock delete](/cli/azure/lock#az-lock-delete).
```azurecli-interactive az lock delete --name exampleLock --resource-group exampleGroup
azure-resource-manager Tag Resources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/tag-resources.md
Remove-AzTag -ResourceId "/subscriptions/$subscription"
### Apply tags
-Azure CLI offers two commands for applying tags: [az tag create](/cli/azure/tag#az_tag_create) and [az tag update](/cli/azure/tag#az_tag_update). You must have Azure CLI 2.10.0 or later. You can check your version with `az version`. To update or install, see [Install the Azure CLI](/cli/azure/install-azure-cli).
+Azure CLI offers two commands for applying tags: [az tag create](/cli/azure/tag#az-tag-create) and [az tag update](/cli/azure/tag#az-tag-update). You must have Azure CLI 2.10.0 or later. You can check your version with `az version`. To update or install, see [Install the Azure CLI](/cli/azure/install-azure-cli).
The `az tag create` command replaces all tags on the resource, resource group, or subscription. When calling the command, pass in the resource ID of the entity you wish to tag.
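A sketch of tagging a resource group (the subscription lookup and the group name are illustrative):

```azurecli-interactive
sub=$(az account show --query id --output tsv)
az tag create \
  --resource-id "/subscriptions/$sub/resourceGroups/exampleGroup" \
  --tags Dept=Finance Status=Normal
```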
az tag update --resource-id /subscriptions/$sub --operation Merge --tags Team="W
### List tags
-To get the tags for a resource, resource group, or subscription, use the [az tag list](/cli/azure/tag#az_tag_list) command and pass in the resource ID for the entity.
+To get the tags for a resource, resource group, or subscription, use the [az tag list](/cli/azure/tag#az-tag-list) command and pass in the resource ID for the entity.
To see the tags for a resource, use:
The specified tags are removed.
}, ```
-To remove all tags, use the [az tag delete](/cli/azure/tag#az_tag_delete) command.
+To remove all tags, use the [az tag delete](/cli/azure/tag#az-tag-delete) command.
```azurecli-interactive az tag delete --resource-id $resource
The following limitations apply to tags:
* Each resource, resource group, and subscription can have a maximum of 50 tag name/value pairs. If you need to apply more tags than the maximum allowed number, use a JSON string for the tag value. The JSON string can contain many values that are applied to a single tag name. A resource group or subscription can contain many resources that each have 50 tag name/value pairs. * The tag name is limited to 512 characters, and the tag value is limited to 256 characters. For storage accounts, the tag name is limited to 128 characters, and the tag value is limited to 256 characters. * Tags can't be applied to classic resources such as Cloud Services.
-* Azure IP Groups and Azure Firewall Policies don't support PATCH operations, which means they don't support updating tags through the portal. Instead, use the update commands for those resources. For example, you can update tags for an IP group with the [az network ip-group update](/cli/azure/network/ip-group#az_network_ip_group_update) command.
+* Azure IP Groups and Azure Firewall Policies don't support PATCH operations, which means they don't support updating tags through the portal. Instead, use the update commands for those resources. For example, you can update tags for an IP group with the [az network ip-group update](/cli/azure/network/ip-group#az-network-ip-group-update) command.
* Tag names can't contain these characters: `<`, `>`, `%`, `&`, `\`, `?`, `/` > [!NOTE]
azure-resource-manager Deploy Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/deploy-cli.md
If you don't have Azure CLI installed, you can use Azure Cloud Shell. For more i
You can target your Azure deployment template to a resource group, subscription, management group, or tenant. Depending on the scope of the deployment, you use different commands.
-* To deploy to a **resource group**, use [az deployment group create](/cli/azure/deployment/group#az_deployment_group_create):
+* To deploy to a **resource group**, use [az deployment group create](/cli/azure/deployment/group#az-deployment-group-create):
```azurecli-interactive az deployment group create --resource-group <resource-group-name> --template-file <path-to-template> ```
-* To deploy to a **subscription**, use [az deployment sub create](/cli/azure/deployment/sub#az_deployment_sub_create):
+* To deploy to a **subscription**, use [az deployment sub create](/cli/azure/deployment/sub#az-deployment-sub-create):
```azurecli-interactive az deployment sub create --location <location> --template-file <path-to-template>
You can target your Azure deployment template to a resource group, subscription,
For more information about subscription level deployments, see [Create resource groups and resources at the subscription level](deploy-to-subscription.md).
-* To deploy to a **management group**, use [az deployment mg create](/cli/azure/deployment/mg#az_deployment_mg_create):
+* To deploy to a **management group**, use [az deployment mg create](/cli/azure/deployment/mg#az-deployment-mg-create):
```azurecli-interactive az deployment mg create --location <location> --template-file <path-to-template>
You can target your Azure deployment template to a resource group, subscription,
For more information about management group level deployments, see [Create resources at the management group level](deploy-to-management-group.md).
-* To deploy to a **tenant**, use [az deployment tenant create](/cli/azure/deployment/tenant#az_deployment_tenant_create):
+* To deploy to a **tenant**, use [az deployment tenant create](/cli/azure/deployment/tenant#az-deployment-tenant-create):
```azurecli-interactive az deployment tenant create --location <location> --template-file <path-to-template>
azure-resource-manager Deploy To Resource Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/deploy-to-resource-group.md
To deploy to a resource group, use the resource group deployment commands.
# [Azure CLI](#tab/azure-cli)
-For Azure CLI, use [az deployment group create](/cli/azure/deployment/group#az_deployment_group_create). The following example deploys a template to create a resource group:
+For Azure CLI, use [az deployment group create](/cli/azure/deployment/group#az-deployment-group-create). The following example deploys a template to create a resource group:
```azurecli-interactive az deployment group create \
azure-resource-manager Deploy What If https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/deploy-what-if.md
The preceding commands return a text summary that you can manually inspect. To g
To preview changes before deploying a template, use:
-* [az deployment group what-if](/cli/azure/deployment/group#az_deployment_group_what_if) for resource group deployments
-* [az deployment sub what-if](/cli/azure/deployment/sub#az_deployment_sub_what_if) for subscription level deployments
-* [az deployment mg what-if](/cli/azure/deployment/mg#az_deployment_mg_what_if) for management group deployments
-* [az deployment tenant what-if](/cli/azure/deployment/tenant#az_deployment_tenant_what_if) for tenant deployments
+* [az deployment group what-if](/cli/azure/deployment/group#az-deployment-group-what-if) for resource group deployments
+* [az deployment sub what-if](/cli/azure/deployment/sub#az-deployment-sub-what-if) for subscription level deployments
+* [az deployment mg what-if](/cli/azure/deployment/mg#az-deployment-mg-what-if) for management group deployments
+* [az deployment tenant what-if](/cli/azure/deployment/tenant#az-deployment-tenant-what-if) for tenant deployments
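For example, a resource group what-if preview looks like this (the names are placeholders):

```azurecli-interactive
az deployment group what-if \
  --resource-group exampleGroup \
  --template-file azuredeploy.json
```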
You can use the `--confirm-with-what-if` switch (or its short form `-c`) to preview the changes and get prompted to continue with the deployment. Add this switch to:
-* [az deployment group create](/cli/azure/deployment/group#az_deployment_group_create)
-* [az deployment sub create](/cli/azure/deployment/sub#az_deployment_sub_create).
-* [az deployment mg create](/cli/azure/deployment/mg#az_deployment_mg_create)
-* [az deployment tenant create](/cli/azure/deployment/tenant#az_deployment_tenant_create)
+* [az deployment group create](/cli/azure/deployment/group#az-deployment-group-create)
+* [az deployment sub create](/cli/azure/deployment/sub#az-deployment-sub-create)
+* [az deployment mg create](/cli/azure/deployment/mg#az-deployment-mg-create)
+* [az deployment tenant create](/cli/azure/deployment/tenant#az-deployment-tenant-create)
For example, use `az deployment group create --confirm-with-what-if` or `-c` for resource group deployments.
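Spelled out, the resource group variant looks like this (names are placeholders):

```azurecli-interactive
az deployment group create \
  --resource-group exampleGroup \
  --template-file azuredeploy.json \
  --confirm-with-what-if
```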
You see the expected changes and can confirm that you want the deployment to run
You can use the what-if operation through the Azure SDKs.
-* For Python, use [what-if](/python/api/azure-mgmt-resource/azure.mgmt.resource.resources.v2019_10_01.operations.deploymentsoperations#what-if-resource-group-name--deployment-name--properties--location-none--custom-headers-none--raw-false--polling-true-operation-config-).
+* For Python, use [what-if](/python/api/azure-mgmt-resource/azure.mgmt.resource.resources.v2019_10_01.operations.deploymentsoperations).
* For Java, use [DeploymentWhatIf Class](/java/api/com.azure.resourcemanager.resources.models.deploymentwhatif). * For .NET, use [DeploymentWhatIf Class](/dotnet/api/microsoft.azure.management.resourcemanager.models.deploymentwhatif).
azure-resource-manager Deployment History https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/deployment-history.md
To get the correlation ID, use:
# [Azure CLI](#tab/azure-cli)
-To list all the deployments for a resource group, use [az deployment group list](/cli/azure/deployment/group#az_deployment_group_list).
+To list all the deployments for a resource group, use [az deployment group list](/cli/azure/deployment/group#az-deployment-group-list).
```azurecli-interactive az deployment group list --resource-group ExampleGroup ```
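To narrow the history, for example to failed deployments only, a JMESPath filter such as the following should work:

```azurecli-interactive
az deployment group list --resource-group ExampleGroup \
  --query "[?properties.provisioningState=='Failed']"
```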
-To get a specific deployment, use the [az deployment group show](/cli/azure/deployment/group#az_deployment_group_show).
+To get a specific deployment, use the [az deployment group show](/cli/azure/deployment/group#az-deployment-group-show).
```azurecli-interactive az deployment group show --resource-group ExampleGroup --name ExampleDeployment
To get the correlation ID, use:
# [Azure CLI](#tab/azure-cli)
-To list all the deployments for the current subscription, use [az deployment sub list](/cli/azure/deployment/sub?#az_deployment_sub_list).
+To list all the deployments for the current subscription, use [az deployment sub list](/cli/azure/deployment/sub#az-deployment-sub-list).
```azurecli-interactive az deployment sub list ```
-To get a specific deployment, use the [az deployment sub show](/cli/azure/deployment/sub#az_deployment_sub_show).
+To get a specific deployment, use the [az deployment sub show](/cli/azure/deployment/sub#az-deployment-sub-show).
```azurecli-interactive az deployment sub show --name ExampleDeployment
To get the correlation ID, use:
# [Azure CLI](#tab/azure-cli)
-To list all the deployments for a management group, use [az deployment mg list](/cli/azure/deployment/mg#az_deployment_mg_list). If you don't have sufficient permissions to view deployments for the management group, you'll get an error.
+To list all the deployments for a management group, use [az deployment mg list](/cli/azure/deployment/mg#az-deployment-mg-list). If you don't have sufficient permissions to view deployments for the management group, you'll get an error.
```azurecli-interactive az deployment mg list --management-group-id examplemg ```
-To get a specific deployment, use the [az deployment mg show](/cli/azure/deployment/mg#az_deployment_mg_show).
+To get a specific deployment, use the [az deployment mg show](/cli/azure/deployment/mg#az-deployment-mg-show).
```azurecli-interactive az deployment mg show --management-group-id examplemg --name ExampleDeployment
To get the correlation ID, use:
# [Azure CLI](#tab/azure-cli)
-To list all the deployments for the current tenant, use [az deployment tenant list](/cli/azure/deployment/tenant#az_deployment_tenant_list). If you don't have sufficient permissions to view deployments for the tenant, you'll get an error.
+To list all the deployments for the current tenant, use [az deployment tenant list](/cli/azure/deployment/tenant#az-deployment-tenant-list). If you don't have sufficient permissions to view deployments for the tenant, you'll get an error.
```azurecli-interactive az deployment tenant list ```
-To get a specific deployment, use the [az deployment tenant show](/cli/azure/deployment/tenant#az_deployment_tenant_show).
+To get a specific deployment, use the [az deployment tenant show](/cli/azure/deployment/tenant#az-deployment-tenant-show).
```azurecli-interactive az deployment tenant show --name ExampleDeployment
To view deployment operations for other scopes, use:
# [Azure CLI](#tab/azure-cli)
-To view the deployment operations for deployment to a resource group, use the [az deployment operation group list](/cli/azure/deployment/operation/group#az_deployment_operation_group_list) command. You must have Azure CLI 2.6.0 or later.
+To view the deployment operations for deployment to a resource group, use the [az deployment operation group list](/cli/azure/deployment/operation/group#az-deployment-operation-group-list) command. You must have Azure CLI 2.6.0 or later.
```azurecli-interactive az deployment operation group list --resource-group ExampleGroup --name ExampleDeployment
az deployment operation group list --resource-group ExampleGroup --name ExampleD
To view deployment operations for other scopes, use:
-* [az deployment operation sub list](/cli/azure/deployment/operation/sub#az_deployment_operation_sub_list)
-* [az deployment operation mg list](/cli/azure/deployment/operation/sub#az_deployment_operation_mg_list)
-* [az deployment operation tenant list](/cli/azure/deployment/operation/sub#az_deployment_operation_tenant_list).
+* [az deployment operation sub list](/cli/azure/deployment/operation/sub#az-deployment-operation-sub-list)
+* [az deployment operation mg list](/cli/azure/deployment/operation/mg#az-deployment-operation-mg-list)
+* [az deployment operation tenant list](/cli/azure/deployment/operation/tenant#az-deployment-operation-tenant-list)
# [HTTP](#tab/http)
azure-resource-manager Quickstart Create Templates Use Visual Studio Code https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/quickstart-create-templates-use-visual-studio-code.md
To complete this quickstart, you need [Visual Studio Code](https://code.visualst
If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [Quickstart: Create Bicep files with Visual Studio Code](../bicep/quickstart-create-bicep-use-visual-studio-code.md).
+ ## Create an ARM template In Visual Studio Code, create and open a new file named *azuredeploy.json*. Enter `arm` in the code editor, which initiates Azure Resource Manager snippets for scaffolding out an ARM template.
This snippet creates the basic building blocks for an ARM template.
![Image showing a fully scaffolded ARM template](./media/quickstart-create-templates-use-visual-studio-code/2.png)
-Notice that the Visual Studio Code language mode has changed from *JSON* to *Azure Resource Manager Template*. The extension includes a language server specific to ARM templates which provides ARM template-specific validation, completion, and other language services.
+Notice that the Visual Studio Code language mode has changed from *JSON* to *Azure Resource Manager Template*. The extension includes a language server specific to ARM templates that provides ARM template-specific validation, completion, and other language services.
![Image showing Azure Resource Manager as the Visual Studio Code language mode](./media/quickstart-create-templates-use-visual-studio-code/3.png)
The **tab** key can be used to tab through configurable properties on the storag
One of the most powerful capabilities of the extension is its integration with Azure schemas. Azure schemas provide the extension with validation and resource-aware completion capabilities. Let's modify the storage account to see validation and completion in action.
-First, update the storage account kind to an invalid value such as `megaStorage`. Notice that this action produces a warning indicating that `megaStorage` is not a valid value.
+First, update the storage account kind to an invalid value such as `megaStorage`. Notice that this action produces a warning indicating that `megaStorage` isn't a valid value.
![Image showing an invalid storage configuration](./media/quickstart-create-templates-use-visual-studio-code/7.png)
Now that the parameter file has been mapped to the template, the extension valid
![Image showing an invalidated template due to parameter file issue](./media/quickstart-create-templates-use-visual-studio-code/17.png)
-Navigate back to the ARM template and notice that an error has been raised indicating that the value does not meet the parameter criteria.
+Navigate back to the ARM template and notice that an error has been raised indicating that the value doesn't meet the parameter criteria.
![Image showing a valid ARM template](./media/quickstart-create-templates-use-visual-studio-code/18.png)
azure-resource-manager Template Functions Array https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-array.md
Title: Template functions - arrays description: Describes the functions to use in an Azure Resource Manager template (ARM template) for working with arrays. Previously updated : 09/08/2021 Last updated : 02/11/2022 # Array functions for ARM templates
Resource Manager provides several functions for working with arrays in your Azur
To get an array of string values delimited by a value, see [split](template-functions-string.md#split).
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [array](../bicep/bicep-functions-array.md) functions.
+ ## array `array(convertToArray)`
azure-resource-manager Template Functions Comparison https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-comparison.md
Title: Template functions - comparison description: Describes the functions to use in an Azure Resource Manager template (ARM template) to compare values. Previously updated : 09/08/2021 Last updated : 02/11/2022 # Comparison functions for ARM templates
Resource Manager provides several functions for making comparisons in your Azure
* [less](#less) * [lessOrEquals](#lessorequals)
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see the [coalesce](../bicep/operators-logical.md) logical operator and [comparison](../bicep/operators-comparison.md) operators.
+ ## coalesce `coalesce(arg1, arg2, arg3, ...)`
azure-resource-manager Template Functions Date https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-date.md
Title: Template functions - date description: Describes the functions to use in an Azure Resource Manager template (ARM template) to work with dates. Previously updated : 09/09/2021 Last updated : 02/11/2022 # Date functions for ARM templates
Resource Manager provides the following functions for working with dates in your
* [dateTimeAdd](#datetimeadd) * [utcNow](#utcnow)
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [date](../bicep/bicep-functions-date.md) functions.
+ ## dateTimeAdd `dateTimeAdd(base, duration, [format])`
Returns the current (UTC) datetime value in the specified format. If no format i
You can only use this function within an expression for the default value of a parameter. Using this function anywhere else in a template returns an error. The function isn't allowed in other parts of the template because it returns a different value each time it's called. Deploying the same template with the same parameters wouldn't reliably produce the same results.
-If you use the [option to rollback on error](rollback-on-error.md) to an earlier successful deployment, and the earlier deployment includes a parameter that uses utcNow, the parameter isn't reevaluated. Instead, the parameter value from the earlier deployment is automatically reused in the rollback deployment.
+If you use the [option to rollback on error](rollback-on-error.md) to an earlier successful deployment, and the earlier deployment includes a parameter that uses `utcNow`, the parameter isn't reevaluated. Instead, the parameter value from the earlier deployment is automatically reused in the rollback deployment.
-Be careful redeploying a template that relies on the utcNow function for a default value. When you redeploy and don't provide a value for the parameter, the function is reevaluated. If you want to update an existing resource rather than create a new one, pass in the parameter value from the earlier deployment.
+Be careful redeploying a template that relies on the `utcNow` function for a default value. When you redeploy and don't provide a value for the parameter, the function is reevaluated. If you want to update an existing resource rather than create a new one, pass in the parameter value from the earlier deployment.
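One way to do that, sketched with a hypothetical `timeStamp` parameter: read the value recorded by the earlier deployment and pass it back in explicitly.

```azurecli-interactive
# Look up the parameter value the earlier deployment recorded
stamp=$(az deployment group show \
  --resource-group exampleGroup --name firstDeployment \
  --query "properties.parameters.timeStamp.value" --output tsv)

# Reuse it so utcNow() isn't reevaluated on redeploy
az deployment group create \
  --resource-group exampleGroup \
  --template-file azuredeploy.json \
  --parameters timeStamp="$stamp"
```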
### Return value
azure-resource-manager Template Functions Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-deployment.md
Title: Template functions - deployment description: Describes the functions to use in an Azure Resource Manager template (ARM template) to retrieve deployment information. Previously updated : 09/09/2021 Last updated : 02/11/2022 # Deployment functions for ARM templates
Resource Manager provides the following functions for getting values related to
To get values from resources, resource groups, or subscriptions, see [Resource functions](template-functions-resource.md).
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [deployment](../bicep/bicep-functions-deployment.md) functions.
+ ## deployment `deployment()`
azure-resource-manager Template Functions Logical https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-logical.md
Title: Template functions - logical description: Describes the functions to use in an Azure Resource Manager template (ARM template) to determine logical values. Previously updated : 09/09/2021 Last updated : 02/11/2022 # Logical functions for ARM templates
Resource Manager provides several functions for making comparisons in your Azure
* [or](#or) * [true](#true)
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see the [bool](../bicep/bicep-functions-logical.md) logical function and [logical](../bicep/operators-logical.md) operators.
+ ## and `and(arg1, arg2, ...)`
azure-resource-manager Template Functions Numeric https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-numeric.md
Title: Template functions - numeric description: Describes the functions to use in an Azure Resource Manager template (ARM template) to work with numbers. Previously updated : 09/09/2021 Last updated : 02/11/2022 # Numeric functions for ARM templates
Resource Manager provides the following functions for working with integers in y
* [mul](#mul) * [sub](#sub)
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more about using `int`, `min`, and `max` in Bicep, see [numeric](../bicep/bicep-functions-numeric.md) functions. For other numeric values, see [numeric](../bicep/operators-numeric.md) operators.
+ ## add `add(operand1, operand2)`
azure-resource-manager Template Functions Object https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-object.md
Title: Template functions - objects description: Describes the functions to use in an Azure Resource Manager template (ARM template) for working with objects. Previously updated : 09/09/2021 Last updated : 02/11/2022 # Object functions for ARM templates
Resource Manager provides several functions for working with objects in your Azu
* [null](#null) * [union](#union)
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [object](../bicep/bicep-functions-object.md) functions.
+ ## contains `contains(container, itemToFind)`
azure-resource-manager Template Functions Resource https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-resource.md
Title: Template functions - resources description: Describes the functions to use in an Azure Resource Manager template (ARM template) to retrieve values about resources. Previously updated : 12/28/2021 Last updated : 02/11/2022
To get values from parameters, variables, or the current deployment, see [Deploy
To get deployment scope values, see [Scope functions](template-functions-scope.md).
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [resource](../bicep/bicep-functions-resource.md) functions.
+ ## extensionResourceId `extensionResourceId(baseResourceId, resourceType, resourceName1, [resourceName2], ...)`
azure-resource-manager Template Functions Scope https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-scope.md
Title: Template functions - scope description: Describes the functions to use in an Azure Resource Manager template (ARM template) to retrieve values about deployment scope. Previously updated : 11/23/2021 Last updated : 02/11/2022 # Scope functions for ARM templates
Resource Manager provides the following functions for getting deployment scope v
To get values from parameters, variables, or the current deployment, see [Deployment value functions](template-functions-deployment.md).
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [scope](../bicep/bicep-functions-scope.md) functions.
+ ## managementGroup `managementGroup()`
azure-resource-manager Template Functions String https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions-string.md
Title: Template functions - string description: Describes the functions to use in an Azure Resource Manager template (ARM template) to work with strings. Previously updated : 10/29/2021 Last updated : 02/11/2022 # String functions for ARM templates
Resource Manager provides the following functions for working with strings in yo
* [uriComponent](#uricomponent) * [uriComponentToString](#uricomponenttostring)
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [string](../bicep/bicep-functions-string.md) functions.
+ ## base64 `base64(inputString)`
azure-resource-manager Template Functions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/templates/template-functions.md
Title: Template functions description: Describes the functions to use in an Azure Resource Manager template (ARM template) to retrieve values, work with strings and numerics, and retrieve deployment information. Previously updated : 01/20/2022 Last updated : 02/11/2022 # ARM template functions
To create your own functions, see [User-defined functions](./syntax.md#functions
Most functions work the same when deployed to a resource group, subscription, management group, or tenant. A few functions can't be used in all scopes. They're noted in the lists below.
+> [!TIP]
+> We recommend [Bicep](../bicep/overview.md) because it offers the same capabilities as ARM templates and the syntax is easier to use. To learn more, see [Bicep functions](../bicep/bicep-functions.md) and [Bicep operators](../bicep/operators.md).
+ <a id="array" aria-hidden="true"></a> <a id="concatarray" aria-hidden="true"></a> <a id="contains" aria-hidden="true"></a>
Resource Manager provides several functions for working with arrays.
* [take](template-functions-array.md#take) * [union](template-functions-array.md#union)
-For Bicep files, use the Bicep [array](../bicep/bicep-functions-array.md) functions.
+For Bicep files, use the [array](../bicep/bicep-functions-array.md) functions.
<a id="coalesce" aria-hidden="true"></a> <a id="equals" aria-hidden="true"></a>
Resource Manager provides several functions for making comparisons in your templ
* [greater](template-functions-comparison.md#greater) * [greaterOrEquals](template-functions-comparison.md#greaterorequals)
-For Bicep files, use the Bicep [coalesce](../bicep/operators-logical.md) logical operator. For comparisons, use the Bicep [comparison](../bicep/operators-comparison.md) operators.
-
-<a id="deployment" aria-hidden="true"></a>
-<a id="parameters" aria-hidden="true"></a>
-<a id="variables" aria-hidden="true"></a>
+For Bicep files, use the [coalesce](../bicep/operators-logical.md) logical operator. For comparisons, use the [comparison](../bicep/operators-comparison.md) operators.
## Date functions
Resource Manager provides the following functions for working with dates.
* [dateTimeAdd](template-functions-date.md#datetimeadd) * [utcNow](template-functions-date.md#utcnow)
-For Bicep files, use the Bicep [date](../bicep/bicep-functions-date.md) functions.
+For Bicep files, use the [date](../bicep/bicep-functions-date.md) functions.
+
+<a id="deployment" aria-hidden="true"></a>
+<a id="parameters" aria-hidden="true"></a>
+<a id="variables" aria-hidden="true"></a>
## Deployment value functions
Resource Manager provides the following functions for getting values from sectio
* [parameters](template-functions-deployment.md#parameters) * [variables](template-functions-deployment.md#variables)
-For Bicep files, use the Bicep [deployment](../bicep/bicep-functions-deployment.md) functions.
+For Bicep files, use the [deployment](../bicep/bicep-functions-deployment.md) functions.
<a id="and" aria-hidden="true"></a> <a id="bool" aria-hidden="true"></a>
Resource Manager provides the following functions for working with logical condi
* [or](template-functions-logical.md#or) * [true](template-functions-logical.md#true)
-For Bicep files, use the Bicep [bool](../bicep/bicep-functions-logical.md) logical function. For other logical values, use Bicep [logical](../bicep/operators-logical.md) operators.
+For Bicep files, use the [bool](../bicep/bicep-functions-logical.md) logical function. For other logical values, use [logical](../bicep/operators-logical.md) operators.
<a id="add" aria-hidden="true"></a> <a id="copyindex" aria-hidden="true"></a>
Resource Manager provides the following functions for working with integers:
* [mul](template-functions-numeric.md#mul) * [sub](template-functions-numeric.md#sub)
-For Bicep files that use `int`, `min`, and `max` use Bicep [numeric](../bicep/bicep-functions-numeric.md) functions. For other numeric values, use Bicep [numeric](../bicep/operators-numeric.md) operators.
+For Bicep files that use `int`, `min`, and `max`, use [numeric](../bicep/bicep-functions-numeric.md) functions. For other numeric values, use [numeric](../bicep/operators-numeric.md) operators.
<a id="json" aria-hidden="true"></a>
Resource Manager provides several functions for working with objects.
* [null](template-functions-object.md#null) * [union](template-functions-object.md#union)
-For Bicep files, use the Bicep [object](../bicep/bicep-functions-object.md) functions.
+For Bicep files, use the [object](../bicep/bicep-functions-object.md) functions.
<a id="extensionResourceId" aria-hidden="true"></a> <a id="listkeys" aria-hidden="true"></a>
Resource Manager provides the following functions for getting resource values:
* [subscriptionResourceId](template-functions-resource.md#subscriptionresourceid) * [tenantResourceId](template-functions-resource.md#tenantresourceid)
-For Bicep files, use the Bicep [resource](../bicep/bicep-functions-resource.md) functions.
+For Bicep files, use the [resource](../bicep/bicep-functions-resource.md) functions.
<a id="managementgroup" aria-hidden="true"></a> <a id="resourcegroup" aria-hidden="true"></a>
Resource Manager provides the following functions for getting deployment scope v
* [subscription](template-functions-scope.md#subscription) - can only be used in deployments to a resource group or subscription. * [tenant](template-functions-scope.md#tenant) - can be used for deployments at any scope.
-For Bicep files, use the Bicep [scope](../bicep/bicep-functions-scope.md) functions.
+For Bicep files, use the [scope](../bicep/bicep-functions-scope.md) functions.
<a id="base64" aria-hidden="true"></a> <a id="base64tojson" aria-hidden="true"></a>
Resource Manager provides the following functions for working with strings:
* [uriComponent](template-functions-string.md#uricomponent) * [uriComponentToString](template-functions-string.md#uricomponenttostring)
-For Bicep files, use the Bicep [string](../bicep/bicep-functions-string.md) functions.
+For Bicep files, use the [string](../bicep/bicep-functions-string.md) functions.
## Next steps
azure-signalr Availability Zones https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/availability-zones.md
+
+ Title: Availability zones support in Azure SignalR Service
+description: Availability zones support in Azure SignalR Service
+++ Last updated : 02/15/2022++
+# Availability zones support in Azure SignalR Service
+
+[Availability zones](../availability-zones/az-overview.md#availability-zones) are unique physical locations within an Azure region. To ensure resiliency, there's a minimum of three separate zones in all enabled regions. Each zone has one or more datacenters equipped with independent power, cooling, and networking.
+
+## Zone redundancy
+
+Azure SignalR Service uses availability zones in a zone-redundant manner. That means the service isn't pinned to a specific zone. Instead, workloads are distributed evenly across multiple zones in a region. If a single zone fails, traffic is automatically routed to the other zones, keeping the service available.
+
+## Region support
+
+Not all Azure regions support availability zones. For the list of supported regions, see [regions that support availability zones](../availability-zones/az-region.md).
+
+## Tier support
+
+Zone redundancy is a Premium tier feature. It is implicitly enabled when you create or upgrade to a Premium tier resource. Standard tier resources can be upgraded to Premium tier without downtime.
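As a sketch with the Azure CLI (the resource and group names below are placeholders), an existing resource can be moved to the Premium tier, which implicitly turns on zone redundancy in supported regions:

```azurecli-interactive
# Upgrade an existing Azure SignalR Service resource to the Premium tier.
az signalr update \
  --name MySignalR \
  --resource-group MyResourceGroup \
  --sku Premium_P1 \
  --unit-count 1
```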
+
+## Next steps
+
+* Learn more about [regions that support availability zones](../availability-zones/az-region.md).
+* Learn more about building for [reliability](/azure/architecture/framework/resiliency/app-design) in Azure.
azure-signalr Concept Upstream https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/concept-upstream.md
POST
#### Connected
-Content-Type: application/json
+Content-Type: `application/json`
#### Disconnected
azure-signalr Signalr Howto Scale Autoscale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/signalr-howto-scale-autoscale.md
+
+ Title: Auto scale Azure SignalR Service
+description: Learn how to autoscale Azure SignalR Service.
+++
+ms.devlang: csharp
+ Last updated : 02/11/2022+++
+# Automatically scale units of an Azure SignalR Service
+Autoscale lets you run the right number of units for the load on your application. You can add units to handle increases in load, and save money by removing units that sit idle. See [Overview of autoscale in Microsoft Azure](../azure-monitor/autoscale/autoscale-overview.md) to learn more about the Autoscale feature of Azure Monitor.
+
+> [!IMPORTANT]
+> This article applies to only the **Premium** tier of Azure SignalR Service.
+
+By using the Autoscale feature for Azure SignalR Service, you can specify a minimum and maximum number of units and add or remove units automatically based on a set of rules.
+
+For example, you can implement the following scaling scenarios using the Autoscale feature.
+
+- Increase units when Connection Quota Utilization is above 70%.
+- Decrease units when Connection Quota Utilization is below 20%.
+- Use more units during business hours and fewer during off hours.
+
+This article shows you how you can automatically scale units in the Azure portal.
++
+## Autoscale setting page
+First, follow these steps to navigate to the **Scale out** page for your Azure SignalR Service.
+
+1. In your browser, open the [Azure portal](https://portal.azure.com).
+
+2. In your SignalR Service page, from the left menu, select **Scale out**.
+
+3. Make sure the resource is in the Premium tier. You'll then see a **Custom autoscale** setting.
++
+## Custom autoscale - Default condition
+You can configure automatic scaling of units by using conditions. The default condition is executed when none of the other scale conditions match. You can set the default condition in one of the following ways:
+
+- Scale based on a metric
+- Scale to specific units
+
+You can't set a schedule to autoscale on specific days or a date range for the default condition. The default condition is executed when none of the other scale conditions with schedules match.
+
+### Scale based on a metric
+The following procedure shows you how to add a condition to automatically increase units (scale out) when the Connection Quota Utilization is greater than 70% and decrease units (scale in) when the Connection Quota Utilization is less than 20%. Increments or decrements are done between available units.
+
+1. On the **Scale out** page, select **Custom autoscale** for the **Choose how to scale your resource** option.
+1. Select **Scale based on a metric** for **Scale mode**.
+1. Select **+ Add a rule**.
+
+ :::image type="content" source="./media/signalr-howto-scale-autoscale/default-autoscale.png" alt-text="Default - scale based on a metric":::
+
+1. On the **Scale rule** page, follow these steps:
+ 1. Select a metric from the **Metric name** drop-down list. In this example, it's **Connection Quota Utilization**.
+ 1. Select an operator and threshold values. In this example, they're **Greater than** and **70** for **Metric threshold to trigger scale action**.
+ 1. Select an **operation** in the **Action** section. In this example, it's set to **Increase**.
+    1. Then, select **Add**.
+
+ :::image type="content" source="./media/signalr-howto-scale-autoscale/default-scale-out.png" alt-text="Default - scale out if Connection Quota Utilization is greater than 70%":::
+
+1. Select **+ Add a rule** again, and follow these steps on the **Scale rule** page:
+ 1. Select a metric from the **Metric name** drop-down list. In this example, it's **Connection Quota Utilization**.
+ 1. Select an operator and threshold values. In this example, they're **Less than** and **20** for **Metric threshold to trigger scale action**.
+ 1. Select an **operation** in the **Action** section. In this example, it's set to **Decrease**.
+    1. Then, select **Add**.
+
+ :::image type="content" source="./media/signalr-howto-scale-autoscale/default-scale-in.png" alt-text="Default - scale in if Connection Quota Utilization is less than 20%":::
+
+1. Set the **minimum** and **maximum** and **default** number of units.
+
+1. Select **Save** on the toolbar to save the autoscale setting.
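The same pair of rules can be sketched with the Azure CLI instead of the portal. The setting and resource names below are placeholders, and the metric name `ConnectionQuotaUtilization` is an assumption based on the metric shown in the portal:

```azurecli-interactive
# Create the autoscale setting with minimum, maximum, and default unit counts.
az monitor autoscale create \
  --resource-group MyResourceGroup \
  --resource MySignalR \
  --resource-type Microsoft.SignalRService/signalR \
  --name signalr-autoscale \
  --min-count 1 --max-count 10 --count 2

# Scale out by one unit when average Connection Quota Utilization exceeds 70%.
az monitor autoscale rule create \
  --resource-group MyResourceGroup \
  --autoscale-name signalr-autoscale \
  --condition "ConnectionQuotaUtilization > 70 avg 10m" \
  --scale out 1

# Scale in by one unit when it drops below 20%.
az monitor autoscale rule create \
  --resource-group MyResourceGroup \
  --autoscale-name signalr-autoscale \
  --condition "ConnectionQuotaUtilization < 20 avg 10m" \
  --scale in 1
```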
+
+### Scale to specific number of units
+Follow these steps to configure the rule to scale to a specific number of units. Again, the default condition is applied when none of the other scale conditions match.
+
+1. On the **Scale out** page, select **Custom autoscale** for the **Choose how to scale your resource** option.
+1. Select **Scale to a specific units** for **Scale mode**.
+1. For **Units**, select the default number of units.
+
+ :::image type="content" source="./media/signalr-howto-scale-autoscale/default-specific-units.png" alt-text="Default - scale to specific units":::
+
+## Custom autoscale - Additional conditions
+The previous section shows you how to add a default condition for the autoscale setting. This section shows you how to add more conditions to the autoscale setting. For these additional non-default conditions, you can set a schedule based on specific days of a week or a date range.
+
+### Scale based on a metric
+1. On the **Scale out** page, select **Custom autoscale** for the **Choose how to scale your resource** option.
+1. Select **Add a scale condition** under the **Default** block.
+
+ :::image type="content" source="./media/signalr-howto-scale-autoscale/additional-add-condition.png" alt-text="Custom - add a scale condition link":::
+1. Confirm that the **Scale based on a metric** option is selected.
+1. Select **+ Add a rule** to add a rule to increase units when the **Connection Quota Utilization** goes above 70%. Follow steps from the [default condition](#custom-autoscaledefault-condition) section.
+5. Set the **minimum** and **maximum** and **default** number of units.
+6. You can also set a **schedule** on a custom condition (but not on the default condition). You can either specify start and end dates for the condition, or select specific days (Monday, Tuesday, and so on) of a week.
+ 1. If you select **Specify start/end dates**, select the **Timezone**, **Start date and time** and **End date and time** (as shown in the following image) for the condition to be in effect.
+ 1. If you select **Repeat specific days**, select the days of the week, timezone, start time, and end time when the condition should apply.
azure-signalr Signalr Howto Scale Signalr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-signalr/signalr-howto-scale-signalr.md
For a table of service limits, quotas, and constraints in each tier, see [Signal
In this guide, you learned how to scale a single SignalR Service instance.
+Autoscale is supported in the Azure SignalR Service Premium tier.
+
+> [!div class="nextstepaction"]
+> [Automatically scale units of an Azure SignalR Service](./signalr-howto-scale-autoscale.md)
+ Multiple endpoints are also supported for scaling, sharding, and cross-region scenarios. > [!div class="nextstepaction"]
azure-sql Auto Failover Group Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/auto-failover-group-overview.md
As discussed previously, auto-failover groups can also be managed programmatical
| Command | Description | | | |
-| [az sql failover-group create](/cli/azure/sql/failover-group#az_sql_failover_group_create) |This command creates a failover group and registers it on both primary and secondary servers|
-| [az sql failover-group delete](/cli/azure/sql/failover-group#az_sql_failover_group_delete) | Removes a failover group from the server |
-| [az sql failover-group show](/cli/azure/sql/failover-group#az_sql_failover_group_show) | Retrieves a failover group configuration |
-| [az sql failover-group update](/cli/azure/sql/failover-group#az_sql_failover_group_update) |Modifies a failover group's configuration and/or adds one or more databases to a failover group|
-| [az sql failover-group set-primary](/cli/azure/sql/failover-group#az_sql_failover_group_set_primary) | Triggers failover of a failover group to the secondary server |
+| [az sql failover-group create](/cli/azure/sql/failover-group#az-sql-failover-group-create) |This command creates a failover group and registers it on both primary and secondary servers|
+| [az sql failover-group delete](/cli/azure/sql/failover-group#az-sql-failover-group-delete) | Removes a failover group from the server |
+| [az sql failover-group show](/cli/azure/sql/failover-group#az-sql-failover-group-show) | Retrieves a failover group configuration |
+| [az sql failover-group update](/cli/azure/sql/failover-group#az-sql-failover-group-update) |Modifies a failover group's configuration and/or adds one or more databases to a failover group|
+| [az sql failover-group set-primary](/cli/azure/sql/failover-group#az-sql-failover-group-set-primary) | Triggers failover of a failover group to the secondary server |
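For example, a failover group covering one database might be created and then failed over like this (the server, group, and database names are placeholders):

```azurecli-interactive
# Create the failover group on the primary server and add a database.
az sql failover-group create \
  --name myfailovergroup \
  --resource-group MyResourceGroup \
  --server myprimaryserver \
  --partner-server mysecondaryserver \
  --add-db mydb \
  --failover-policy Automatic \
  --grace-period 1

# Later, promote the secondary server to primary.
az sql failover-group set-primary \
  --name myfailovergroup \
  --resource-group MyResourceGroup \
  --server mysecondaryserver
```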
# [Rest API](#tab/rest-api)
As discussed previously, auto-failover groups can also be managed programmatical
| Command | Description | | | |
-| [az sql failover-group create](/cli/azure/sql/failover-group#az_sql_failover_group_create) |This command creates a failover group and registers it on both primary and secondary servers|
-| [az sql failover-group delete](/cli/azure/sql/failover-group#az_sql_failover_group_delete) | Removes a failover group from the server |
-| [az sql failover-group show](/cli/azure/sql/failover-group#az_sql_failover_group_show) | Retrieves a failover group configuration |
-| [az sql failover-group update](/cli/azure/sql/failover-group#az_sql_failover_group_update) |Modifies a failover group's configuration and/or adds one or more databases to a failover group|
-| [az sql failover-group set-primary](/cli/azure/sql/failover-group#az_sql_failover_group_set_primary) | Triggers failover of a failover group to the secondary server |
+| [az sql failover-group create](/cli/azure/sql/failover-group#az-sql-failover-group-create) |This command creates a failover group and registers it on both primary and secondary servers|
+| [az sql failover-group delete](/cli/azure/sql/failover-group#az-sql-failover-group-delete) | Removes a failover group from the server |
+| [az sql failover-group show](/cli/azure/sql/failover-group#az-sql-failover-group-show) | Retrieves a failover group configuration |
+| [az sql failover-group update](/cli/azure/sql/failover-group#az-sql-failover-group-update) |Modifies a failover group's configuration and/or adds one or more databases to a failover group|
+| [az sql failover-group set-primary](/cli/azure/sql/failover-group#az-sql-failover-group-set-primary) | Triggers failover of a failover group to the secondary server |
# [Rest API](#tab/rest-api)
azure-sql Automated Backups Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/automated-backups-overview.md
To perform a restore, see [Restore database from backups](recovery-using-backups
||||| | **Change backup retention** | [SQL Database](#change-the-short-term-retention-policy-using-the-azure-portal) <br/> [SQL Managed Instance](#change-the-short-term-retention-policy-using-the-azure-portal) | [SQL Database](#change-the-short-term-retention-policy-using-azure-cli) <br/> [SQL Managed Instance](#change-the-short-term-retention-policy-using-azure-cli) | [SQL Database](#change-the-short-term-retention-policy-using-powershell) <br/>[SQL Managed Instance](#change-the-short-term-retention-policy-using-powershell) | | **Change long-term backup retention** | [SQL Database](long-term-backup-retention-configure.md#create-long-term-retention-policies)<br/> [SQL Managed Instance](../managed-instance/long-term-backup-retention-configure.md) | [SQL Database](long-term-backup-retention-configure.md) <br/> [SQL Managed Instance](../managed-instance/long-term-backup-retention-configure.md) | [SQL Database](long-term-backup-retention-configure.md)<br/>[SQL Managed Instance](../managed-instance/long-term-backup-retention-configure.md) |
-| **Restore a database from a point in time** | [SQL Database](recovery-using-backups.md#point-in-time-restore)<br>[SQL Managed Instance](../managed-instance/point-in-time-restore.md) | [SQL Database](/cli/azure/sql/db#az_sql_db_restore) <br/> [SQL Managed Instance](/cli/azure/sql/midb#az_sql_midb_restore) | [SQL Database](/powershell/module/az.sql/restore-azsqldatabase) <br/> [SQL Managed Instance](/powershell/module/az.sql/restore-azsqlinstancedatabase) |
+| **Restore a database from a point in time** | [SQL Database](recovery-using-backups.md#point-in-time-restore)<br>[SQL Managed Instance](../managed-instance/point-in-time-restore.md) | [SQL Database](/cli/azure/sql/db#az-sql-db-restore) <br/> [SQL Managed Instance](/cli/azure/sql/midb#az-sql-midb-restore) | [SQL Database](/powershell/module/az.sql/restore-azsqldatabase) <br/> [SQL Managed Instance](/powershell/module/az.sql/restore-azsqlinstancedatabase) |
| **Restore a deleted database** | [SQL Database](recovery-using-backups.md)<br>[SQL Managed Instance](../managed-instance/point-in-time-restore.md#restore-a-deleted-database) | [SQL Database](long-term-backup-retention-configure.md#restore-from-ltr-backups) <br/> [SQL Managed Instance](../managed-instance/long-term-backup-retention-configure.md#restore-from-ltr-backups) | [SQL Database](/powershell/module/az.sql/get-azsqldeleteddatabasebackup) <br/> [SQL Managed Instance](/powershell/module/az.sql/get-azsqldeletedinstancedatabasebackup)| | **Restore a database from Azure Blob storage** | | | <br/>[SQL Managed Instance](../managed-instance/restore-sample-database-quickstart.md) |
az sql db update \
--name mydb \ --backup-storage-redundancy Local ```
-For more details, see [az sql db create](/cli/azure/sql/db#az_sql_db_create) and [az sql db update](/cli/azure/sql/db#az_sql_db_update).
+For more details, see [az sql db create](/cli/azure/sql/db#az-sql-db-create) and [az sql db update](/cli/azure/sql/db#az-sql-db-update).
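Backup storage redundancy can also be set when the database is created. The names below are placeholders, and `Zone` is shown as one of the accepted values (`Local`, `Zone`, `Geo`):

```azurecli-interactive
# Create a database with zone-redundant backup storage.
az sql db create \
  --resource-group MyResourceGroup \
  --server myserver \
  --name mydb \
  --backup-storage-redundancy Zone
```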
#### [SQL Managed Instance](#tab/managed-instance)
azure-sql High Availability Sla https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/high-availability-sla.md
A failover can be initiated using PowerShell, REST API, or Azure CLI:
|Deployment type|PowerShell|REST API| Azure CLI| |:|:|:|:|
-|Database|[Invoke-AzSqlDatabaseFailover](/powershell/module/az.sql/invoke-azsqldatabasefailover)|[Database failover](/rest/api/sql/databases/failover)|[az rest](/cli/azure/reference-index#az_rest) may be used to invoke a REST API call from Azure CLI|
-|Elastic pool|[Invoke-AzSqlElasticPoolFailover](/powershell/module/az.sql/invoke-azsqlelasticpoolfailover)|[Elastic pool failover](/javascript/api/@azure/arm-sql/elasticpools#failover_string__string__string__msRest_RequestOptionsBase)|[az rest](/cli/azure/reference-index#az_rest) may be used to invoke a REST API call from Azure CLI|
-|Managed Instance|[Invoke-AzSqlInstanceFailover](/powershell/module/az.sql/Invoke-AzSqlInstanceFailover/)|[Managed Instances - Failover](/rest/api/sql/managed%20instances%20-%20failover/failover)|[az sql mi failover](/cli/azure/sql/mi/#az_sql_mi_failover)|
+|Database|[Invoke-AzSqlDatabaseFailover](/powershell/module/az.sql/invoke-azsqldatabasefailover)|[Database failover](/rest/api/sql/databases/failover)|[az rest](/cli/azure/reference-index#az-rest) may be used to invoke a REST API call from Azure CLI|
+|Elastic pool|[Invoke-AzSqlElasticPoolFailover](/powershell/module/az.sql/invoke-azsqlelasticpoolfailover)|[Elastic pool failover](/javascript/api/@azure/arm-sql/elasticpools)|[az rest](/cli/azure/reference-index#az-rest) may be used to invoke a REST API call from Azure CLI|
+|Managed Instance|[Invoke-AzSqlInstanceFailover](/powershell/module/az.sql/Invoke-AzSqlInstanceFailover/)|[Managed Instances - Failover](/rest/api/sql/managed%20instances%20-%20failover/failover)|[az sql mi failover](/cli/azure/sql/mi/#az-sql-mi-failover)|
> [!IMPORTANT] > The Failover command is not available for readable secondary replicas of Hyperscale databases.
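To illustrate the `az rest` option from the table above, a database failover can be invoked directly against the REST endpoint. The resource names are placeholders, and the API version shown is an assumption; check the REST reference for the current version:

```azurecli-interactive
az rest --method post \
  --url "https://management.azure.com/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.Sql/servers/{server-name}/databases/{database-name}/failover?api-version=2021-02-01-preview"
```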
azure-sql Ledger Create A Single Database With Ledger Enabled https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/ledger-create-a-single-database-with-ledger-enabled.md
az storage account create \
### Grant the server permissions to write ledger digests
-Assign the managed identity of the server to the [Storage Blob Data Contributor](/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) role with the [az role assignment create](/cli/azure/sql/db) command. This gives the SQL server the appropriate permissions to publish database digests to the storage account.
+Assign the managed identity of the server to the [Storage Blob Data Contributor](../../role-based-access-control/built-in-roles.md#storage-blob-data-contributor) role with the [az role assignment create](/cli/azure/sql/db) command. This gives the SQL server the appropriate permissions to publish database digests to the storage account.
```azurecli-interactive az role assignment create \
$storage
### Grant the server permissions to write ledger digests
-Assign the managed identity of the server to the [Storage Blob Data Contributor](/azure/role-based-access-control/built-in-roles#storage-blob-data-contributor) role with the [New-AzRoleAssignment](/powershell/module/az.Resources/New-azRoleAssignment) cmdlet. This gives the SQL server the appropriate permissions to publish database digests to the storage account.
+Assign the managed identity of the server to the [Storage Blob Data Contributor](../../role-based-access-control/built-in-roles.md#storage-blob-data-contributor) role with the [New-AzRoleAssignment](/powershell/module/az.Resources/New-azRoleAssignment) cmdlet. This gives the SQL server the appropriate permissions to publish database digests to the storage account.
```azurepowershell-interactive Write-host "Granting the server access to the storage account..."
Remove-AzResourceGroup -Name $resourceGroupName
Connect and query your database by using different tools and languages: - [Create and use updatable ledger tables](ledger-how-to-updatable-ledger-tables.md)-- [Create and use append-only ledger tables](ledger-how-to-append-only-ledger-tables.md)
+- [Create and use append-only ledger tables](ledger-how-to-append-only-ledger-tables.md)
azure-sql Logical Servers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/logical-servers.md
To create and manage servers, databases, and firewalls with the [Azure CLI](/cli
| Cmdlet | Description | | | |
-|[az sql db create](/cli/azure/sql/db#az_sql_db_create) |Creates a database|
-|[az sql db list](/cli/azure/sql/db#az_sql_db_list)|Lists all databases managed by a server, or all databases in an elastic pool|
-|[az sql db list-editions](/cli/azure/sql/db#az_sql_db_list_editions)|Lists available service objectives and storage limits|
-|[az sql db list-usages](/cli/azure/sql/db#az_sql_db_list_usages)|Returns database usages|
-|[az sql db show](/cli/azure/sql/db#az_sql_db_show)|Gets a database
-|[az sql db update](/cli/azure/sql/db#az_sql_db_update)|Updates a database|
-|[az sql db delete](/cli/azure/sql/db#az_sql_db_delete)|Removes a database|
-|[az group create](/cli/azure/group#az_group_create)|Creates a resource group|
-|[az sql server create](/cli/azure/sql/server#az_sql_server_create)|Creates a server|
-|[az sql server list](/cli/azure/sql/server#az_sql_server_list)|Lists servers|
-|[az sql server list-usages](/cli/azure/sql/server#az_sql_server_list-usages)|Returns server usages|
-|[az sql server show](/cli/azure/sql/server#az_sql_server_show)|Gets a server|
-|[az sql server update](/cli/azure/sql/server#az_sql_server_update)|Updates a server|
-|[az sql server delete](/cli/azure/sql/server#az_sql_server_delete)|Deletes a server|
-|[az sql server firewall-rule create](/cli/azure/sql/server/firewall-rule#az_sql_server_firewall_rule_create)|Creates a server firewall rule|
-|[az sql server firewall-rule list](/cli/azure/sql/server/firewall-rule#az_sql_server_firewall_rule_list)|Lists the firewall rules on a server|
-|[az sql server firewall-rule show](/cli/azure/sql/server/firewall-rule#az_sql_server_firewall_rule_show)|Shows the detail of a firewall rule|
-|[az sql server firewall-rule update](/cli/azure/sql/server/firewall-rule##az_sql_server_firewall_rule_update)|Updates a firewall rule|
-|[az sql server firewall-rule delete](/cli/azure/sql/server/firewall-rule#az_sql_server_firewall_rule_delete)|Deletes a firewall rule|
+|[az sql db create](/cli/azure/sql/db#az-sql-db-create) |Creates a database|
+|[az sql db list](/cli/azure/sql/db#az-sql-db-list)|Lists all databases managed by a server, or all databases in an elastic pool|
+|[az sql db list-editions](/cli/azure/sql/db#az-sql-db-list-editions)|Lists available service objectives and storage limits|
+|[az sql db list-usages](/cli/azure/sql/db#az-sql-db-list-usages)|Returns database usages|
+|[az sql db show](/cli/azure/sql/db#az-sql-db-show)|Gets a database|
+|[az sql db update](/cli/azure/sql/db#az-sql-db-update)|Updates a database|
+|[az sql db delete](/cli/azure/sql/db#az-sql-db-delete)|Removes a database|
+|[az group create](/cli/azure/group#az-group-create)|Creates a resource group|
+|[az sql server create](/cli/azure/sql/server#az-sql-server-create)|Creates a server|
+|[az sql server list](/cli/azure/sql/server#az-sql-server-list)|Lists servers|
+|[az sql server list-usages](/cli/azure/sql/server#az-sql-server-list-usages)|Returns server usages|
+|[az sql server show](/cli/azure/sql/server#az-sql-server-show)|Gets a server|
+|[az sql server update](/cli/azure/sql/server#az-sql-server-update)|Updates a server|
+|[az sql server delete](/cli/azure/sql/server#az-sql-server-delete)|Deletes a server|
+|[az sql server firewall-rule create](/cli/azure/sql/server/firewall-rule#az-sql-server-firewall-rule-create)|Creates a server firewall rule|
+|[az sql server firewall-rule list](/cli/azure/sql/server/firewall-rule#az-sql-server-firewall-rule-list)|Lists the firewall rules on a server|
+|[az sql server firewall-rule show](/cli/azure/sql/server/firewall-rule#az-sql-server-firewall-rule-show)|Shows the detail of a firewall rule|
+|[az sql server firewall-rule update](/cli/azure/sql/server/firewall-rule#az-sql-server-firewall-rule-update)|Updates a firewall rule|
+|[az sql server firewall-rule delete](/cli/azure/sql/server/firewall-rule#az-sql-server-firewall-rule-delete)|Deletes a firewall rule|
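As a hedged end-to-end sketch of how these commands fit together (the resource names and IP address below are placeholders, not values from the article):

```azurecli-interactive
# Create a resource group, a logical server, a firewall rule, and a database
az group create --name myResourceGroup --location eastus

az sql server create \
    --name my-logical-server \
    --resource-group myResourceGroup \
    --location eastus \
    --admin-user azureuser \
    --admin-password '<strong-password>'

az sql server firewall-rule create \
    --server my-logical-server \
    --resource-group myResourceGroup \
    --name AllowMyIp \
    --start-ip-address 203.0.113.4 \
    --end-ip-address 203.0.113.4

az sql db create \
    --server my-logical-server \
    --resource-group myResourceGroup \
    --name mySampleDatabase \
    --service-objective S0
```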
> [!TIP] > For an Azure CLI quickstart, see [Create a database in Azure SQL Database using the Azure CLI](az-cli-script-samples-content-guide.md). For Azure CLI example scripts, see [Use the CLI to create a database in Azure SQL Database and configure a firewall rule](scripts/create-and-configure-database-cli.md) and [Use Azure CLI to monitor and scale a database in Azure SQL Database](scripts/monitor-and-scale-database-cli.md).
azure-sql Private Endpoint Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/private-endpoint-overview.md
To establish connectivity from an on-premises environment to the database in SQL
- [Site-to-Site VPN connection](../../vpn-gateway/vpn-gateway-create-site-to-site-rm-powershell.md) - [ExpressRoute circuit](../../expressroute/expressroute-howto-linkvnet-portal-resource-manager.md)
-Consider [DNS configuration scenarios](/azure/private-link/private-endpoint-dns#dns-configuration-scenarios) as well, as the FQDN of the service can resolve to the public IP address.
+Consider [DNS configuration scenarios](../../private-link/private-endpoint-dns.md#dns-configuration-scenarios) as well, as the FQDN of the service can resolve to the public IP address.
## Connecting from Azure Synapse Analytics to Azure Storage using Polybase and the COPY statement
With Private Link, customers can now set up network access controls like NSGs to
[6]: media/private-endpoint/pec-select.png [7]: media/private-endpoint/pec-click.png [8]: media/private-endpoint/pec-nic-click.png
-[9]: media/private-endpoint/pec-ip-display.png
+[9]: media/private-endpoint/pec-ip-display.png
azure-sql Recovery Using Backups https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/recovery-using-backups.md
To restore a database by using the REST API:
#### SQL Database
-To restore a database by using the Azure CLI, see [az sql db restore](/cli/azure/sql/db#az_sql_db_restore).
+To restore a database by using the Azure CLI, see [az sql db restore](/cli/azure/sql/db#az-sql-db-restore).
#### SQL Managed Instance
-To restore a managed instance database by using the Azure CLI, see [az sql midb restore](/cli/azure/sql/midb#az_sql_midb_restore).
+To restore a managed instance database by using the Azure CLI, see [az sql midb restore](/cli/azure/sql/midb#az-sql-midb-restore).
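For illustration, a hedged point-in-time restore sketch for SQL Database (the server, database names, and timestamp are placeholders):

```azurecli-interactive
# Restore mySampleDatabase to a new database at a point in time (UTC)
az sql db restore \
    --server my-logical-server \
    --resource-group myResourceGroup \
    --name mySampleDatabase \
    --dest-name mySampleDatabase-restored \
    --time "2022-02-14T13:10:00"
```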
## Summary
azure-sql Serverless Tier Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/serverless-tier-overview.md
Modifying the maximum or minimum vCores, and autopause delay, is performed by us
### Use Azure CLI
-Modifying the maximum or minimum vCores, and autopause delay, is performed by using the [az sql db update](/cli/azure/sql/db#az_sql_db_update) command in Azure CLI using the `capacity`, `min-capacity`, and `auto-pause-delay` arguments.
+Modifying the maximum or minimum vCores, and autopause delay, is performed by using the [az sql db update](/cli/azure/sql/db#az-sql-db-update) command in Azure CLI using the `capacity`, `min-capacity`, and `auto-pause-delay` arguments.
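A minimal sketch of such an update, with placeholder resource names (the argument names come from the text above):

```azurecli-interactive
# Set max vCores to 4, min vCores to 0.5, and autopause delay to 60 minutes
az sql db update \
    --server my-logical-server \
    --resource-group myResourceGroup \
    --name mySampleDatabase \
    --capacity 4 \
    --min-capacity 0.5 \
    --auto-pause-delay 60
```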
## Monitoring
azure-sql Service Tier Hyperscale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/service-tier-hyperscale.md
With the ability to rapidly spin up/down additional read-only compute nodes, the
## Create a Hyperscale database
-A Hyperscale database can be created using the [Azure portal](https://portal.azure.com), [T-SQL](/sql/t-sql/statements/create-database-transact-sql), [PowerShell](/powershell/module/azurerm.sql/new-azurermsqldatabase), or [CLI](/cli/azure/sql/db#az_sql_db_create). Hyperscale databases are available only using the [vCore-based purchasing model](service-tiers-vcore.md).
+A Hyperscale database can be created using the [Azure portal](https://portal.azure.com), [T-SQL](/sql/t-sql/statements/create-database-transact-sql), [PowerShell](/powershell/module/azurerm.sql/new-azurermsqldatabase), or [CLI](/cli/azure/sql/db#az-sql-db-create). Hyperscale databases are available only using the [vCore-based purchasing model](service-tiers-vcore.md).
The following T-SQL command creates a Hyperscale database. You must specify both the edition and service objective in the `CREATE DATABASE` statement. Refer to the [resource limits](./resource-limits-vcore-single-databases.md#hyperscaleprovisioned-computegen4) for a list of valid service objectives.
This will create a Hyperscale database on Gen5 hardware with four cores.
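A hedged CLI-equivalent sketch of creating such a database (placeholder names; `HS_Gen5_4` is the service objective for Gen5 hardware with four vCores):

```azurecli-interactive
az sql db create \
    --server my-logical-server \
    --resource-group myResourceGroup \
    --name myHyperscaleDatabase \
    --edition Hyperscale \
    --service-objective HS_Gen5_4
```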
## Upgrade existing database to Hyperscale
-You can move your existing databases in Azure SQL Database to Hyperscale using the [Azure portal](https://portal.azure.com), [T-SQL](/sql/t-sql/statements/alter-database-transact-sql), [PowerShell](/powershell/module/azurerm.sql/set-azurermsqldatabase), or [CLI](/cli/azure/sql/db#az_sql_db_update). At this time, this is a one-way migration. You can't move databases from Hyperscale to another service tier, other than by exporting and importing data. For proofs of concept (POCs), we recommend making a copy of your production databases, and migrating the copy to Hyperscale. Migrating an existing database in Azure SQL Database to the Hyperscale tier is a size of data operation.
+You can move your existing databases in Azure SQL Database to Hyperscale using the [Azure portal](https://portal.azure.com), [T-SQL](/sql/t-sql/statements/alter-database-transact-sql), [PowerShell](/powershell/module/azurerm.sql/set-azurermsqldatabase), or [CLI](/cli/azure/sql/db#az-sql-db-update). At this time, this is a one-way migration. You can't move databases from Hyperscale to another service tier, other than by exporting and importing data. For proofs of concept (POCs), we recommend making a copy of your production databases, and migrating the copy to Hyperscale. Migrating an existing database in Azure SQL Database to the Hyperscale tier is a size of data operation.
The following T-SQL command moves a database into the Hyperscale service tier. You must specify both the edition and service objective in the `ALTER DATABASE` statement.
These are the current limitations to the Hyperscale service tier as of GA. We'r
| When changing Azure SQL Database service tier to Hyperscale, the operation fails if the database has any data files larger than 1 TB | In some cases, it may be possible to work around this issue by [shrinking](file-space-manage.md#shrinking-data-files) the large files to be less than 1 TB before attempting to change the service tier to Hyperscale. Use the following query to determine the current size of database files. `SELECT file_id, name AS file_name, size * 8. / 1024 / 1024 AS file_size_GB FROM sys.database_files WHERE type_desc = 'ROWS';`| | SQL Managed Instance | Azure SQL Managed Instance isn't currently supported with Hyperscale databases. | | Elastic Pools | Elastic Pools aren't currently supported with Hyperscale.|
-| Migration to Hyperscale is currently a one-way operation | Once a database is migrated to Hyperscale, it can't be migrated directly to a non-Hyperscale service tier. At present, the only way to migrate a database from Hyperscale to non-Hyperscale is to export/import using a bacpac file or other data movement technologies (Bulk Copy, Azure Data Factory, Azure Databricks, SSIS, etc.) Bacpac export/import from Azure portal, from PowerShell using [New-AzSqlDatabaseExport](/powershell/module/az.sql/new-azsqldatabaseexport) or [New-AzSqlDatabaseImport](/powershell/module/az.sql/new-azsqldatabaseimport), from Azure CLI using [az sql db export](/cli/azure/sql/db#az_sql_db_export) and [az sql db import](/cli/azure/sql/db#az_sql_db_import), and from [REST API](/rest/api/sql/) is not supported. Bacpac import/export for smaller Hyperscale databases (up to 200 GB) is supported using SSMS and [SqlPackage](/sql/tools/sqlpackage) version 18.4 and later. For larger databases, bacpac export/import may take a long time, and may fail for various reasons.|
+| Migration to Hyperscale is currently a one-way operation | Once a database is migrated to Hyperscale, it can't be migrated directly to a non-Hyperscale service tier. At present, the only way to migrate a database from Hyperscale to non-Hyperscale is to export/import using a bacpac file or other data movement technologies (Bulk Copy, Azure Data Factory, Azure Databricks, SSIS, etc.) Bacpac export/import from Azure portal, from PowerShell using [New-AzSqlDatabaseExport](/powershell/module/az.sql/new-azsqldatabaseexport) or [New-AzSqlDatabaseImport](/powershell/module/az.sql/new-azsqldatabaseimport), from Azure CLI using [az sql db export](/cli/azure/sql/db#az-sql-db-export) and [az sql db import](/cli/azure/sql/db#az-sql-db-import), and from [REST API](/rest/api/sql/) is not supported. Bacpac import/export for smaller Hyperscale databases (up to 200 GB) is supported using SSMS and [SqlPackage](/sql/tools/sqlpackage) version 18.4 and later. For larger databases, bacpac export/import may take a long time, and may fail for various reasons.|
| Migration of databases with In-Memory OLTP objects | Hyperscale supports a subset of In-Memory OLTP objects, including memory-optimized table types, table variables, and natively compiled modules. However, when any kind of In-Memory OLTP objects are present in the database being migrated, migration from Premium and Business Critical service tiers to Hyperscale is not supported. To migrate such a database to Hyperscale, all In-Memory OLTP objects and their dependencies must be dropped. After the database is migrated, these objects can be recreated. Durable and non-durable memory-optimized tables are not currently supported in Hyperscale, and must be changed to disk tables.| | Geo-replication | [Geo-replication](active-geo-replication-overview.md) and [auto-failover groups](auto-failover-group-overview.md) on Hyperscale is now in public preview. | | Intelligent Database Features | With the exception of the "Force Plan" option, all other Automatic Tuning options aren't yet supported on Hyperscale: options may appear to be enabled, but there won't be any recommendations or actions made. |
azure-sql Single Database Scale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/single-database-scale.md
After initially picking the number of vCores or DTUs, you can scale a single dat
* [Transact-SQL](/sql/t-sql/statements/alter-database-transact-sql#overview-sql-database) * [Azure portal](single-database-manage.md#the-azure-portal) * [PowerShell](/powershell/module/az.sql/set-azsqldatabase)
-* [Azure CLI](/cli/azure/sql/db#az_sql_db_update)
+* [Azure CLI](/cli/azure/sql/db#az-sql-db-update)
* [REST API](/rest/api/sql/databases/update) > [!IMPORTANT]
You're billed for each hour a database exists using the highest service tier + c
### vCore-based purchasing model - Storage can be provisioned up to the data storage max size limit using 1-GB increments. The minimum configurable data storage is 1 GB. For data storage max size limits in each service objective, see resource limit documentation pages for [Resource limits for single databases using the vCore purchasing model](resource-limits-vcore-single-databases.md) and [Resource limits for single databases using the DTU purchasing model](resource-limits-dtu-single-databases.md).-- Data storage for a single database can be provisioned by increasing or decreasing its max size using the [Azure portal](https://portal.azure.com), [Transact-SQL](/sql/t-sql/statements/alter-database-transact-sql#examples-1), [PowerShell](/powershell/module/az.sql/set-azsqldatabase), [Azure CLI](/cli/azure/sql/db#az_sql_db_update), or [REST API](/rest/api/sql/databases/update). If the max size value is specified in bytes, it must be a multiple of 1 GB (1073741824 bytes).
+- Data storage for a single database can be provisioned by increasing or decreasing its max size using the [Azure portal](https://portal.azure.com), [Transact-SQL](/sql/t-sql/statements/alter-database-transact-sql#examples-1), [PowerShell](/powershell/module/az.sql/set-azsqldatabase), [Azure CLI](/cli/azure/sql/db#az-sql-db-update), or [REST API](/rest/api/sql/databases/update). If the max size value is specified in bytes, it must be a multiple of 1 GB (1073741824 bytes).
- The amount of data that can be stored in the data files of a database is limited by the configured data storage max size. In addition to that storage, Azure SQL Database automatically allocates 30% more storage to be used for the transaction log. - Azure SQL Database automatically allocates 32 GB per vCore for the `tempdb` database. `tempdb` is located on the local SSD storage in all service tiers. - The price of storage for a single database or an elastic pool is the sum of data storage and transaction log storage amounts multiplied by the storage unit price of the service tier. The cost of `tempdb` is included in the price. For details on storage price, see [Azure SQL Database pricing](https://azure.microsoft.com/pricing/details/sql-database/).
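A hedged sketch of resizing data storage with the CLI (placeholder names; as noted above, a bytes value must be a multiple of 1073741824, so whole-gigabyte sizes are simplest):

```azurecli-interactive
# Provision a 250-GB data storage max size using the CLI's size suffix
az sql db update \
    --server my-logical-server \
    --resource-group myResourceGroup \
    --name mySampleDatabase \
    --max-size 250GB
```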
You're billed for each hour a database exists using the highest service tier + c
### DTU-based purchasing model - The DTU price for a single database includes a certain amount of storage at no additional cost. Extra storage beyond the included amount can be provisioned for an additional cost up to the max size limit in increments of 250 GB up to 1 TB, and then in increments of 256 GB beyond 1 TB. For included storage amounts and max size limits, see [Single database: Storage sizes and compute sizes](resource-limits-dtu-single-databases.md#single-database-storage-sizes-and-compute-sizes).-- Extra storage for a single database can be provisioned by increasing its max size using the Azure portal, [Transact-SQL](/sql/t-sql/statements/alter-database-transact-sql#examples-1), [PowerShell](/powershell/module/az.sql/set-azsqldatabase), the [Azure CLI](/cli/azure/sql/db#az_sql_db_update), or the [REST API](/rest/api/sql/databases/update).
+- Extra storage for a single database can be provisioned by increasing its max size using the Azure portal, [Transact-SQL](/sql/t-sql/statements/alter-database-transact-sql#examples-1), [PowerShell](/powershell/module/az.sql/set-azsqldatabase), the [Azure CLI](/cli/azure/sql/db#az-sql-db-update), or the [REST API](/rest/api/sql/databases/update).
- The price of extra storage for a single database is the extra storage amount multiplied by the extra storage unit price of the service tier. For details on the price of extra storage, see [Azure SQL Database pricing](https://azure.microsoft.com/pricing/details/sql-database/). > [!IMPORTANT]
azure-sql Long Term Backup Retention Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/managed-instance/long-term-backup-retention-configure.md
This example shows how to list the LTR policies within an instance for a single
```powershell # gets the current version of LTR policy for a database
-$LTRPolicies = @{
+$LTRPolicy = @{
InstanceName = $instanceName DatabaseName = $dbName ResourceGroupName = $resourceGroup
azure-video-analyzer Video Indexer Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-overview.md
Title: What is Azure Video Analyzer for Media (formerly Video Indexer)? description: This article gives an overview of the Azure Video Analyzer for Media (formerly Video Indexer) service. Previously updated : 01/04/2022 Last updated : 02/15/2022
When indexing by one channel, partial results for those models will be available.
* **Keywords extraction**: Extracts keywords from speech and visual text. * **Named entities extraction**: Extracts brands, locations, and people from speech and visual text via natural language processing (NLP).
-* **Topic inference**: Makes inference of main topics from transcripts. The 2nd-level IPTC taxonomy is included.
+* **Topic inference**: Extracts topics based on various keywords (for example, the keywords 'Stock Exchange' and 'Wall Street' will produce the topic 'Economics'). The model uses three different ontologies ([IPTC](https://iptc.org/standards/media-topics/), [Wikipedia](https://www.wikipedia.org/), and the Video Indexer hierarchical topic ontology). The model uses transcription (spoken words), OCR content (visual text), and celebrities recognized in the video using the Video Indexer facial recognition model.
* **Artifacts**: Extracts rich set of "next level of details" artifacts for each of the models. * **Sentiment analysis**: Identifies positive, negative, and neutral sentiments from speech and visual text.
backup Back Up Azure Stack Hyperconverged Infrastructure Virtual Machines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/back-up-azure-stack-hyperconverged-infrastructure-virtual-machines.md
Title: Back up Azure Stack HCI virtual machines with MABS description: This article contains the procedures to back up and recover virtual machines using Microsoft Azure Backup Server (MABS). Previously updated : 07/27/2021 Last updated : 02/15/2022+++ # Back up Azure Stack HCI virtual machines with Azure Backup Server This article explains how to back up virtual machines on Azure Stack HCI using Microsoft Azure Backup Server (MABS).
-> [!NOTE]
-> This support applies to Azure Stack HCI version 20H2. Backup of virtual machines on Azure Stack HCI version 21H2 is not supported.
- ## Supported scenarios MABS can back up Azure Stack HCI virtual machines in the following scenarios:
backup Backup Azure Delete Vault https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-delete-vault.md
To delete existing Recovery Services vault, perform the following steps:
[--yes] ```
- For more information, see this [article](/cli/azure/backup/protection#az_backup_protection_disable).
+ For more information, see this [article](/cli/azure/backup/protection#az-backup-protection-disable).
- Delete an existing Recovery Services vault:
backup Backup Azure Vms Enhanced Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-vms-enhanced-policy.md
Title: Back up Azure VMs with Enhanced policy (in preview) description: Learn how to configure Enhanced policy to back up VMs. Previously updated : 12/21/2021 Last updated : 02/11/2022
# Back up an Azure VM using Enhanced policy (in preview)
-This article explains how to use _Enhanced policy_ to configure _Multiple Backups Per Day_ and back up [Trusted Launch VMs](../virtual-machines/trusted-launch.md) with the Azure Backup service. _Enhanced policy_ for backup of VMs is in preview.
+This article explains how to use _Enhanced policy_ to configure _Multiple Backups Per Day_ and back up [Trusted Launch VMs](../virtual-machines/trusted-launch.md) with Azure Backup service. _Enhanced policy_ for VM backup is in preview.
Azure Backup now supports _Enhanced policy_ that's needed to support new Azure offerings. For example, [Trusted Launch VM](../virtual-machines/trusted-launch.md) is supported with _Enhanced policy_ only. To enroll your subscription for backup of Trusted Launch VM, write to us at [askazurebackupteam@microsoft.com](mailto:askazurebackupteam@microsoft.com). >[!Important] >The existing [default policy](./backup-during-vm-creation.md#create-a-vm-with-backup-configured) won't support protecting newer Azure offerings, such as Trusted Launch VM, UltraSSD, Shared disk, and Confidential Azure VMs.
-You must enable backup for Trusted Launch VM through enhanced policy only. The Enhanced policy provides the following features:
+You must enable backup of Trusted Launch VM through enhanced policy only. Enhanced policy provides the following features:
- Supports _Multiple Backups Per Day_. To enroll your subscription for this feature, write to us at [askazurebackupteam@microsoft.com](mailto:askazurebackupteam@microsoft.com). - Instant Restore tier is zonally redundant using Zone-redundant storage (ZRS) resiliency. See the [pricing details for Enhanced policy storage here](https://azure.microsoft.com/pricing/details/managed-disks/).
The following screenshot shows _Multiple Backups_ occurred in a day.
:::image type="content" source="./media/backup-azure-vms-enhanced-policy/multiple-backups-per-day-inline.png" alt-text="Screenshot showing the multiple backup instances occurred in a day." lightbox="./media/backup-azure-vms-enhanced-policy/multiple-backups-per-day-expanded.png"::: >[!Note]
->The above screenshot shows that one of the backups is transferred to the Vault-Standard tier.
+>The above screenshot shows that one of the backups is transferred to Vault-Standard tier.
## Create an Enhanced policy and configure VM backup
Follow these steps:
:::image type="content" source="./media/backup-azure-vms-enhanced-policy/select-enhanced-backup-policy-sub-type.png" alt-text="Screenshot showing to select backup policies sub-type as enhanced.":::
- - **Backup schedule**: You can select frequency as **Hourly**/Daily/Weekly.
+ - **Backup schedule**: You can select frequency as **Hourly**/Daily/Weekly.
- By default, the enhanced backup schedule is set to **Hourly**, with **8 AM** as the start time, **Every 4 hours** as the schedule, and **24 Hours** as duration. You can choose to modify the settings as needed.
+ By default, enhanced backup schedule is set to **Hourly**, with **8 AM** as start time, **Every 4 hours** as schedule, and **24 Hours** as duration. You can choose to modify the settings as needed.
- Note that the Hourly backup frequency is in preview. To enroll your subscription for this feature, write to us at [askazurebackupteam@microsoft.com](mailto:askazurebackupteam@microsoft.com).
+ Note that Hourly backup frequency is in preview. To enroll your subscription for this feature, write to us at [askazurebackupteam@microsoft.com](mailto:askazurebackupteam@microsoft.com).
- - **Instant Restore**: You can set the retention of recovery snapshot from 1 to 30 days. The default value is set to 7.
- - **Retention range**: The options for retention range are auto-selected based on the backup frequency you choose. The default retention for daily, weekly, monthly, and yearly backup points are set to 180 days, 12 weeks, 60 months, and 10 years respectively. You can customize the values as per the requirement.
+ - **Instant Restore**: You can set the retention of recovery snapshot from _1_ to _30_ days. The default value is set to _7_.
+ - **Retention range**: Options for retention range are auto-selected based on backup frequency you choose. The default retention for daily, weekly, monthly, and yearly backup points are set to 180 days, 12 weeks, 60 months, and 10 years respectively. You can customize these values as required.
:::image type="content" source="./media/backup-azure-vms-enhanced-policy/enhanced-backup-policy-settings.png" alt-text="Screenshot showing to configure the enhanced backup policy."::: 6. Click **Create**. >[!Note]
->- We support the Enhanced policy configuration through [Recovery Services vault](./backup-azure-arm-vms-prepare.md) and [VM Manage blade](./backup-during-vm-creation.md#start-a-backup-after-creating-the-vm) only. Configuration through Backup center is currently not supported.
->- For hourly backups, the last backup of the day is transferred to the vault. If the backup fails, the first backup of the next day is transferred to the vault.
+>- We support Enhanced policy configuration through [Recovery Services vault](./backup-azure-arm-vms-prepare.md) and [VM Manage blade](./backup-during-vm-creation.md#start-a-backup-after-creating-the-vm) only. Configuration through Backup center is currently not supported.
+>- For hourly backups, the last backup of the day is transferred to vault. If backup fails, the first backup of the next day is transferred to vault.
>- Enhanced policy can be only availed for unprotected VMs that are new to Azure Backup. Note that Azure VMs that are protected with existing policy can't be moved to Enhanced policy. ## Next steps
backup Backup Mabs Protection Matrix https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-mabs-protection-matrix.md
Title: MABS (Azure Backup Server) V3 UR1 protection matrix description: This article provides a support matrix listing all workloads, data types, and installations that Azure Backup Server protects. Previously updated : 01/18/2022 Last updated : 02/15/2022
The following sections detail the protection support matrix for MABS:
| **Workload** | **Version** | **Azure Backup Server installation** | **Supported Azure Backup Server** | **Protection and recovery** | | | - | | - | | | Hyper-V host - MABS protection agent on Hyper-V host server, cluster, or VM | Windows Server 2022, 2019, 2016, 2012 R2, 2012 | Physical server <br><br> Hyper-V virtual machine <br><br> VMware virtual machine | V3 UR1 and V3 UR2 | Protect: Virtual machines, cluster shared volumes (CSVs) <br><br> Recover: Virtual machine, Item-level recovery of files and folders available only for Windows, volumes, virtual hard drives |
-| Azure Stack HCI | V1 and 20H2 | Physical server <br><br> Hyper-V / Azure Stack HCI virtual machine <br><br> VMware virtual machine | V3 UR2 and later | Protect: Virtual machines, cluster shared volumes (CSVs) <br><br> Recover: Virtual machine, Item-level recovery of files and folders available only for Windows, volumes, virtual hard drives |
+| Azure Stack HCI | V1, 20H2, and 21H2 | Physical server <br><br> Hyper-V / Azure Stack HCI virtual machine <br><br> VMware virtual machine | V3 UR2 and later | Protect: Virtual machines, cluster shared volumes (CSVs) <br><br> Recover: Virtual machine, Item-level recovery of files and folders available only for Windows, volumes, virtual hard drives |
| VMware VMs | VMware server 5.5, 6.0, or 6.5, 6.7 (Licensed Version) | Hyper-V virtual machine <br><br> VMware virtual machine | V3 UR1 | Protect: VMware VMs on cluster-shared volumes (CSVs), NFS, and SAN storage <br><br> Recover: Virtual machine, Item-level recovery of files and folders available only for Windows, volumes, virtual hard drives <br><br> VMware vApps aren't supported. | | VMware VMs | VMware server 7.0, 6.7, 6.5 or 6.0 (Licensed Version) | Hyper-V virtual machine <br><br> VMware virtual machine | V3 UR2 and later | Protect: VMware VMs on cluster-shared volumes (CSVs), NFS, and SAN storage <br><br> Recover: Virtual machine, Item-level recovery of files and folders available only for Windows, volumes, virtual hard drives <br><br> VMware vApps aren't supported. |
backup Backup Reports Email https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-reports-email.md
Title: Email Azure Backup Reports description: Create automated tasks to receive periodic reports via email Previously updated : 10/19/2021 Last updated : 02/14/2022+++ # Email Azure Backup Reports
To troubleshoot this issue:
* **Azure Monitor Logs Connector has not been authorized**: To fix this issue, follow the authorization steps as provided above. * **Error in the LA query**: In case you have customized the logic app with your own queries, an error in any of the LA queries might be causing the logic app to fail. You can select the relevant step and view the error which is causing the query to run incorrectly.
+### Scenario 3: Error in authorizing O365 API connection
+
+When attempting to authorize the O365 API connection, you might see an error of the form _Test connection failed. Error 'REST API is not yet supported for this mailbox. This error can occur for sandbox (test) accounts or for accounts that are on a dedicated (on-premises) mail server.'_
+
+This error can occur if the mailbox is on a dedicated Microsoft Exchange Server and isn't a valid Office 365 mailbox. [Learn more](/connectors/office365/#common-errors)
+
+To get a valid Office 365 mailbox, submit a request to your Exchange or Global administrator to migrate the mailbox account. Users who don't have administrator permissions can't migrate accounts. For information on how to migrate the mailbox account, see [How to migrate mailbox data by using the Exchange Admin Center in Office 365](/exchange/troubleshoot/move-or-migrate-mailboxes/migrate-data-with-admin-center).
+
+### Scenario 4: Error in authorizing Azure Monitor Logs connection
+
+When attempting to authorize the Azure Monitor logs connection, you might see an _InvalidAuthenticationTokenTenant_ error. This generally happens when you're logged in to a different tenant at the time of authorizing the connection to Azure Monitor logs. You need to log in to the tenant where the Log Analytics workspace exists to complete the authorization successfully.
+
+To ensure you're logged in to the right tenant, you can open _portal.azure.com/< tenant-id-of-workspace >_ in the browser and perform the authorization. To find the tenant ID, go to **Azure Active Directory** -> **Overview** -> **Manage Tenants**.
+ If the issues persist, contact Microsoft support. ## Next steps
backup Backup Support Matrix Iaas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-support-matrix-iaas.md
Title: Support matrix for Azure VM backup description: Provides a summary of support settings and limitations when backing up Azure VMs with the Azure Backup service. Previously updated : 12/08/2021 Last updated : 02/11/2022
Backup of Azure VMs with locks | Unsupported for unmanaged VMs. <br><br> Support
Windows Storage Spaces configuration of standalone Azure VMs | Supported
[Azure Virtual Machine Scale Sets](../virtual-machine-scale-sets/virtual-machine-scale-sets-orchestration-modes.md#scale-sets-with-flexible-orchestration) | Supported for flexible orchestration model to back up and restore Single Azure VM.
Restore with Managed identities | Yes, supported for managed Azure VMs, and not supported for classic and unmanaged Azure VMs. <br><br> Cross Region Restore isn't supported with managed identities. <br><br> Currently, this is available in all Azure public and national cloud regions. <br><br> [Learn more](backup-azure-arm-restore-vms.md#restore-vms-with-managed-identities).
-<a name="tvm-backup">Trusted Launch VM</a> | Backup supported (in preview) <br><br> To enroll your subscription for this feature, write to us at [askazurebackupteam@microsoft.com](mailto:askazurebackupteam@microsoft.com). <br><br> Backup for Trusted Launch VM is supported through [Enhanced policy](backup-azure-vms-enhanced-policy.md). You can enable backup only through [Recovery Services vault](./backup-azure-arm-vms-prepare.md) and [VM Manage blade](./backup-during-vm-creation.md#start-a-backup-after-creating-the-vm). <br><br> **Feature details** <br> <ul><li> Migration of an existing [Generation 2](../virtual-machines/generation-2.md) VM (protected with Azure Backup) to Trusted Launch VM is currently not supported. Learn about how to [create a Trusted Launch VM](../virtual-machines/trusted-launch-portal.md?tabs=portal#deploy-a-trusted-vm). </li><li> Configurations of Backup, Alerts, and Monitoring for Trusted Launch VM are currently not supported through Backup center. </li><li> Currently, you can restore as [Create VM](./backup-azure-arm-restore-vms.md#create-a-vm), or [Restore disk](./backup-azure-arm-restore-vms.md#restore-disks) only. </li><li> [vTPM state](../virtual-machines/trusted-launch.md#vtpm) doesn't persist while you restore a VM from a recovery point. Therefore, scenarios that require vTPM persistence may not work across the backup and restore operation. </li></ul>
+<a name="tvm-backup">Trusted Launch VM</a> | Backup supported (in preview) <br><br> To enroll your subscription for this feature, write to us at [askazurebackupteam@microsoft.com](mailto:askazurebackupteam@microsoft.com). <br><br> Backup of Trusted Launch VM is supported through [Enhanced policy](backup-azure-vms-enhanced-policy.md). You can enable backup only through [Recovery Services vault](./backup-azure-arm-vms-prepare.md) and [VM Manage blade](./backup-during-vm-creation.md#start-a-backup-after-creating-the-vm). <br><br> **Feature details** <br> <ul><li> Migration of an existing [Generation 2](../virtual-machines/generation-2.md) VM (protected with Azure Backup) to Trusted Launch VM is currently not supported. Learn about how to [create a Trusted Launch VM](../virtual-machines/trusted-launch-portal.md?tabs=portal#deploy-a-trusted-vm). </li><li> Configurations of Backup, Alerts, and Monitoring for Trusted Launch VM are currently not supported through Backup center. </li><li> Currently, you can restore as [Create VM](./backup-azure-arm-restore-vms.md#create-a-vm), or [Restore disk](./backup-azure-arm-restore-vms.md#restore-disks) only. </li><li> [vTPM state](../virtual-machines/trusted-launch.md#vtpm) doesn't persist while you restore a VM from a recovery point. Therefore, scenarios that require vTPM persistence may not work across backup and restore operations. </li></ul>
## VM storage support
backup Backup Vault Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-vault-overview.md
A Backup vault is an entity that stores the backups and recovery points created
- Azure Backup automatically handles storage for the vault. Choose the storage redundancy that matches your business needs when creating the Backup vault. -- To learn more about storage redundancy, see these articles on [geo](../storage/common/storage-redundancy.md#geo-redundant-storage), [zonal](../storage/common/storage-redundancy.md#zone-redundant-storage), and [local](../storage/common/storage-redundancy.md#locally-redundant-storage) redundancy.
+- To learn more about storage redundancy, see these articles on [geo](../storage/common/storage-redundancy.md#geo-redundant-storage), [zonal (preview)](../storage/common/storage-redundancy.md#zone-redundant-storage), and [local](../storage/common/storage-redundancy.md#locally-redundant-storage) redundancy.
## Encryption settings in the Backup vault
In the **Backup Instances** tile, you get a summarized view of all backup instan
This section explains how to move a Backup vault (configured for Azure Backup) across Azure subscriptions and resource groups using the Azure portal. >[!Note]
->You can also move Backup vaults to a different resource group or subscription using [PowerShell](/powershell/module/az.resources/move-azresource?view=azps-6.3.0&preserve-view=true) and [CLI](/cli/azure/resource#az_resource_move).
+>You can also move Backup vaults to a different resource group or subscription using [PowerShell](/powershell/module/az.resources/move-azresource?view=azps-6.3.0&preserve-view=true) and [CLI](/cli/azure/resource#az-resource-move).
### Supported regions
backup Configure Reports https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/configure-reports.md
Title: Configure Azure Backup reports description: Configure and view reports for Azure Backup by using Log Analytics and Azure workbooks Previously updated : 02/10/2020 Last updated : 02/14/2022+++ # Configure Azure Backup reports
In the case of items backed up weekly, this grid helps you identify all items th
![Policy Adherence By Time Period](./media/backup-azure-configure-backup-reports/policy-adherence-by-time-period.png)
-* **Policy Adherence by Backup Instance**: Using this view, you can policy adherence details at a backup instance level. A cell which is green denotes that the backup instance had at least one successful backup on the given day. A cell which is red denotes that the backup instance did not have even one successful backup on the given day. Daily, weekly and monthly aggregations follow the same behavior as the Policy Adherence by Time Period view. You can click on any row to view all backup jobs on the given backup instance in the selected time range.
+* **Policy Adherence by Backup Instance**: Using this view, you can view policy adherence details at a backup instance level. A cell which is green denotes that the backup instance had at least one successful backup on the given day. A cell which is red denotes that the backup instance did not have even one successful backup on the given day. Daily, weekly and monthly aggregations follow the same behavior as the Policy Adherence by Time Period view. You can click on any row to view all backup jobs on the given backup instance in the selected time range.
![Policy Adherence By Backup Instance](./media/backup-azure-configure-backup-reports/policy-adherence-by-backup-instance.png)
Once the logic app is created, you'll need to authorize connections to Azure Mon
Backup Reports uses [system functions on Azure Monitor logs](backup-reports-system-functions.md). These functions operate on data in the raw Azure Backup tables in LA and return formatted data that helps you easily retrieve information of all your backup-related entities, using simple queries.
-To create your own reporting workbooks using Backup Reports as a base, you can navigate to Backup Reports, click on **Edit** at the top of the report, and view/edit the queries being used in the reports. Refer to [Azure workbooks documentation](../azure-monitor/visualize/workbooks-overview.md) to learn more about how to create custom reports.
+To create your own reporting workbooks using Backup Reports as a base, you can go to **Backup Reports**, click **Edit** at the top of the report, and view/edit the queries being used in the reports. Refer to [Azure workbooks documentation](../azure-monitor/visualize/workbooks-overview.md) to learn more about how to create custom reports.
## Export to Excel
If you use [Azure Lighthouse](../lighthouse/index.yml) with delegated access to
- If the selected time range spans a period of 30 days or less, charts are rendered in daily view, where there is one data point for every day. If the time range spans a period greater than 30 days and less than (or equal to) 90 days, charts are rendered in weekly view. For larger time ranges, charts are rendered in monthly view. Aggregating data weekly or monthly helps in better performance of queries and easier readability of data in charts.
- The Policy Adherence grids also follow a similar aggregation logic as described above. However, there are a couple of minor differences. The first difference is that for items with weekly backup policy, there is no daily view (only weekly and monthly views are available). Further, in the grids for items with weekly backup policy, a 'month' is considered as a 4-week period (28 days), and not 30 days, to eliminate partial weeks from consideration.
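The view-selection rules above can be sketched as a small helper. This is illustrative only; `chart_view` and its parameters are hypothetical names, not part of the Backup Reports product:

```python
def chart_view(range_days: int, weekly_policy: bool = False) -> str:
    """Pick the chart aggregation for a report time range, per the rules
    described above: up to 30 days renders daily, 31-90 days renders
    weekly, larger ranges render monthly. Items on a weekly backup
    policy have no daily view, so they fall through to weekly."""
    if range_days <= 30 and not weekly_policy:
        return "daily"
    if range_days <= 90:
        return "weekly"
    return "monthly"
```

For example, a 45-day range renders weekly, while a 20-day range for a weekly-policy item also renders weekly rather than daily.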
+## How to troubleshoot?
+
+If you observe data discrepancy issues in Backup Reports, perform these preliminary checks:
+
+1. Ensure that all vaults are sending the required [diagnostics logs to the Log Analytics workspace](#2-configure-diagnostics-settings-for-your-vaults).
+1. Ensure that you've selected the right filters in Backup Reports.
+1. Review the following limits in Backup Reports:
+
+ - After you configure diagnostics, it might take up to 24 hours for the initial data push to complete. After data starts flowing into the Log Analytics workspace, you might not see data in the reports immediately because data for the current partial day isn't shown in the reports. We recommend you start viewing the reports two days after you configure your vaults to send data to Log Analytics.
+ - SQL log backup jobs are currently not displayed in Backup Reports.
+ - As mentioned above, the reports don't show data for the current partial day, and take only full days (UTC) into consideration.
+
+ For example, in the report, even if you select a time range of 23/3 4:30 PM – 24/3 10:00 AM, internally the query runs for the period 23/3 12:00 AM UTC – 24/3 11:59 PM UTC. This means that the time component of the datetime is overridden by the query.
+
+ Similarly, if today's date is March 29, data is only shown up to the end (11:59 PM UTC) of March 28. For jobs that were created on March 29, you can see them when you check the reports on the next day, that is, March 30.
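The full-UTC-day rounding described in the example above can be sketched as follows. This is a minimal illustration; `report_query_window` is a hypothetical helper, not part of the product:

```python
from datetime import datetime, time, timezone

def report_query_window(start: datetime, end: datetime):
    """Expand a selected range to full UTC days: the time component is
    overridden so the query runs from 12:00 AM UTC on the start date
    to 11:59 PM UTC on the end date."""
    s = datetime.combine(start.astimezone(timezone.utc).date(),
                         time.min, tzinfo=timezone.utc)
    e = datetime.combine(end.astimezone(timezone.utc).date(),
                         time(23, 59), tzinfo=timezone.utc)
    return s, e
```

So a selection of 23/3 4:30 PM – 24/3 10:00 AM expands to 23/3 12:00 AM UTC – 24/3 11:59 PM UTC.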
+
+If none of the above explains the data seen in the report, please contact Microsoft Support.
+ ## Query load times The widgets in the Backup report are powered by Kusto queries, which run on the user's Log Analytics workspaces. These queries typically involve the processing of large amounts of data, with multiple joins to enable richer insights. As a result, the widgets might not load instantaneously when the user views reports across a large backup estate. This table provides a rough estimate of the time that different widgets can take to load, based on the number of Backup items and the time range for which the report is being viewed.
backup Metrics Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/metrics-overview.md
Title: Monitor the health of your backups using Azure Backup Metrics (preview)
description: In this article, learn about the metrics available for Azure Backup to monitor your backup health Previously updated : 10/20/2021 Last updated : 02/14/2022
Based on the alert rules configuration, the fired alert appears under the **Data
You can use the different programmatic clients, such as PowerShell, CLI, or REST API, to access the metrics functionality. See [Azure Monitor REST API documentation](../azure-monitor/essentials/rest-api-walkthrough.md) for more details.
+### Sample alert scenarios
+
+#### Fire a single alert if all backups for a vault were successful in the last 24 hours
+
+**Alert Rule: Fire an alert if Backup Health Events < 1 in the last 24 hours for**:
+
+Dimensions["HealthStatus"]="Persistent Unhealthy / Transient Unhealthy / Persistent Degraded / Transient Degraded"
+
+#### Fire an alert after every failed backup job
+
+**Alert Rule: Fire an alert if Backup Health Events > 0 in the last 5 minutes for**:
+
+- Dimensions["HealthStatus"]= "Persistent Unhealthy / Transient Unhealthy / Persistent Degraded / Transient Degraded"
+- Dimensions["DatasourceId"]= "All current and future values"
+
+#### Fire an alert if there were consecutive backup failures for the same item in the last 24 hours
+
+**Alert Rule: Fire an alert if Backup Health Events > 1 in the last 24 hours for**:
+
+- Dimensions["HealthStatus"]= "Persistent Unhealthy / Transient Unhealthy / Persistent Degraded / Transient Degraded"
+- Dimensions["DatasourceId"]= "All current and future values"
+
+#### Fire an alert if no backup job was executed for an item in the last 24 hours
+
+**Alert Rule: Fire an alert if Backup Health Events < 1 in the last 24 hours for**:
+
+Dimensions["DatasourceId"]= "All current and future values"
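As a toy model of the threshold logic in these scenarios, the following sketch evaluates the same conditions over a list of health events. The function names and event-dictionary shape are hypothetical, not the Azure Monitor API; the real evaluation happens server-side against the metric dimensions:

```python
# Unhealthy HealthStatus dimension values used by the alert rules above.
UNHEALTHY = {"Persistent Unhealthy", "Transient Unhealthy",
             "Persistent Degraded", "Transient Degraded"}

def all_backups_healthy(events):
    """Scenario 1: fire (return True) if unhealthy events < 1 in the window."""
    return sum(e["HealthStatus"] in UNHEALTHY for e in events) < 1

def failed_job_alert(events, datasource_id):
    """Scenario 2: fire if > 0 unhealthy events for a given datasource."""
    return sum(e["HealthStatus"] in UNHEALTHY
               and e["DatasourceId"] == datasource_id for e in events) > 0

def consecutive_failures(events, datasource_id):
    """Scenario 3: fire if > 1 unhealthy events for the same item in 24 hours."""
    return sum(e["HealthStatus"] in UNHEALTHY
               and e["DatasourceId"] == datasource_id for e in events) > 1
```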
+ ## Next steps+ - [Learn more about monitoring and reporting in Azure Backup](monitoring-and-alerts-overview.md). - [Learn more about Azure Monitor metrics](../azure-monitor/essentials/data-platform-metrics.md). - [Learn more about Azure alerts](../azure-monitor/alerts/alerts-overview.md).
bastion Configuration Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/bastion/configuration-settings.md
You can configure this setting using the following methods:
| | | | | Azure portal | Subnet |[Quickstart - Configure Bastion from VM settings](quickstart-host-portal.md)<br>[Tutorial - Configure Bastion](tutorial-create-host-portal.md)| | Azure PowerShell | -subnetName|[cmdlet](/powershell/module/az.network/new-azbastion#parameters) |
-| Azure CLI | --subnet-name | [command](/cli/azure/network/vnet#az_network_vnet_create) |
+| Azure CLI | --subnet-name | [command](/cli/azure/network/vnet#az-network-vnet-create) |
## <a name="public-ip"></a>Public IP address
bastion Connect Native Client Windows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/bastion/connect-native-client-windows.md
This article helps you configure Bastion, and then connect to a VM in the VNet using the native client (SSH or RDP) on your local workstation. This feature lets you connect to your target VMs via Bastion using Azure CLI, and expands your sign-in options to include local SSH key pair and Azure Active Directory (Azure AD). For more information about Azure Bastion, see [What is Azure Bastion?](bastion-overview.md) > [!NOTE]
-> This configuration requires the Standard SKU tier for Azure Bastion.
->
+> * This configuration requires the Standard SKU tier for Azure Bastion.
+> * The user's capabilities on the VM using a native client are dependent on what is enabled on the native client. Controlling access to features such as file transfer via the Bastion is not supported.
Currently, this feature has the following limitation:
certification Program Requirements Edge Secured Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/certification/program-requirements-edge-secured-core.md
Overview content
## Windows IoT OS Support Edge Secured-core for Windows IoT requires Windows 10 IoT Enterprise version 1903 or greater
-* [Windows 10 IoT Enterprise Lifecycle](https://docs.microsoft.com/lifecycle/products/windows-10-iot-enterprise)
+* [Windows 10 IoT Enterprise Lifecycle](/lifecycle/products/windows-10-iot-enterprise)
> [!Note] > The Windows secured-core tests require you to download and run the following package (https://aka.ms/Scforwiniot) from an Administrator Command Prompt on the IoT device being validated.
Validation|Device to be validated through toolset to ensure the device supports
|Resources|| </br>
cognitive-services Copy Move Projects https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Custom-Vision-Service/copy-move-projects.md
After you've created and trained a Custom Vision project, you may want to copy your project to another resource. If your app or business depends on the use of a Custom Vision project, we recommend you copy your model to another Custom Vision account in another region. Then if a regional outage occurs, you can access your project in the region where it was copied.
-As a part of Azure, Custom Vision Service has components that are maintained across multiple regions. Service zones and regions are used by all of our services to provide continued service to our customers. For more information on zones and regions, see [Azure regions](/azure/availability-zones/az-overview). If you need additional information or have any issues, please [contact support](/answers/topics/azure-custom-vision.html).
+As a part of Azure, Custom Vision Service has components that are maintained across multiple regions. Service zones and regions are used by all of our services to provide continued service to our customers. For more information on zones and regions, see [Azure regions](../../availability-zones/az-overview.md). If you need additional information or have any issues, please [contact support](/answers/topics/azure-custom-vision.html).
The **[ExportProject](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc6548b571998fddeb3)** and **[ImportProject](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc7548b571998fddee3)** APIs enable this scenario by allowing you to copy projects from one Custom Vision account into others. This guide shows you how to use these REST APIs with cURL. You can also use an HTTP request service like Postman to issue the requests.
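As a rough sketch of how such requests can be composed programmatically, the helpers below build the export and import request URLs. The URL shape follows the v3.3 Training API linked above, but treat the paths, header names, and helper names as assumptions to confirm against the API reference:

```python
import urllib.parse

def export_project_request(endpoint, project_id, training_key):
    """Build the ExportProject request (GET). Assumed v3.3 Training API
    path shape; the Training-key header authenticates the source resource."""
    url = f"{endpoint}/customvision/v3.3/Training/projects/{project_id}/export"
    headers = {"Training-key": training_key}
    return url, headers

def import_project_request(endpoint, token, training_key):
    """Build the ImportProject request (POST), passing the token returned
    by the export call as a query parameter (assumed shape)."""
    url = (f"{endpoint}/customvision/v3.3/Training/projects/import?"
           + urllib.parse.urlencode({"token": token}))
    headers = {"Training-key": training_key}
    return url, headers
```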
You'll get a `200/OK` response with metadata about your newly imported project.
## Next steps In this guide, you learned how to copy and move a project between Custom Vision resources. Next, explore the API reference docs to see what else you can do with Custom Vision.
-* [REST API reference documentation](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc6548b571998fddeb3)
+* [REST API reference documentation](https://southcentralus.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc6548b571998fddeb3)
cognitive-services Utterances https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/concepts/utterances.md
Collect utterances that you think users will enter. Include utterances, which me
* Pluralization * Stemming * Noun and verb choice
-* [Punctuation](/azure/cognitive-services/luis/luis-reference-application-settings#punctuation-normalization) - using both correct and incorrect grammar
+* [Punctuation](../luis-reference-application-settings.md#punctuation-normalization) - using both correct and incorrect grammar
## Choose varied utterances
When you start [adding example utterances](/azure/cognitive-services/luis/luis-
## Utterances aren't always well formed
-Your app may need to process sentences, like "Book a ticket to Paris for me", or a fragment of a sentence, like "Booking" or "Paris flight" Users also often make spelling mistakes. When planning your app, consider whether or not you want to use [Bing Spell Check](/azure/cognitive-services/luis/luis-tutorial-bing-spellcheck) to correct user input before passing it to LUIS.
+Your app may need to process sentences, like "Book a ticket to Paris for me", or a fragment of a sentence, like "Booking" or "Paris flight". Users also often make spelling mistakes. When planning your app, consider whether or not you want to use [Bing Spell Check](../luis-tutorial-bing-spellcheck.md) to correct user input before passing it to LUIS.
If you do not spell check user utterances, you should train LUIS on utterances that include typos and misspellings.
If you turn on a normalization setting, scores in the **Test** pane, batch tes
When you clone a version in the LUIS portal, the version settings are kept in the new cloned version.
-Set your app's version settings using the LUIS portal by selecting **Manage** from the top navigation menu, in the **Application Settings** page. You can also use the [Update Version Settings API](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/versions-update-application-version-settings). See the [Reference](/azure/cognitive-services/luis/luis-reference-application-settings) documentation for more information.
+Set your app's version settings using the LUIS portal by selecting **Manage** from the top navigation menu, in the **Application Settings** page. You can also use the [Update Version Settings API](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/versions-update-application-version-settings). See the [Reference](../luis-reference-application-settings.md) documentation for more information.
## Word forms
Training is generally non-deterministic: utterance prediction can vary slightly
## Testing utterances
-Developers should start testing their LUIS application with real data by sending utterances to the [prediction endpoint](/azure/cognitive-services/luis/luis-how-to-azure-subscription) URL. These utterances are used to improve the performance of the intents and entities with [Review utterances](/azure/cognitive-services/luis/luis-how-to-review-endpoint-utterances). Tests submitted using the testing pane in the LUIS portal are not sent through the endpoint, and don't contribute to active learning.
+Developers should start testing their LUIS application with real data by sending utterances to the [prediction endpoint](../luis-how-to-azure-subscription.md) URL. These utterances are used to improve the performance of the intents and entities with [Review utterances](/azure/cognitive-services/luis/luis-how-to-review-endpoint-utterances). Tests submitted using the testing pane in the LUIS portal are not sent through the endpoint, and don't contribute to active learning.
## Review utterances
-After your model is trained, published, and receiving [endpoint](/azure/cognitive-services/luis/luis-glossary#endpoint) queries, [review the utterances](/azure/cognitive-services/luis/luis-how-to-review-endpoint-utterances) suggested by LUIS. LUIS selects endpoint utterances that have low scores for either the intent or entity.
+After your model is trained, published, and receiving [endpoint](../luis-glossary.md#endpoint) queries, [review the utterances](/azure/cognitive-services/luis/luis-how-to-review-endpoint-utterances) suggested by LUIS. LUIS selects endpoint utterances that have low scores for either the intent or entity.
## Best practices
After the app is published, only add utterances from active learning in the deve
## Next steps * [Intents](intents.md)
-* [Patterns and features concepts](patterns-features.md)
+* [Patterns and features concepts](patterns-features.md)
cognitive-services Faq https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/faq.md
LUIS has a monthly quota and a per-second quota, based on the pricing tier of th
If your LUIS app request rate exceeds the allowed [quota rate](https://azure.microsoft.com/pricing/details/cognitive-services/language-understanding-intelligent-services/), you can:
-* Spread the load to more LUIS apps with the [same app definition](luis-concept-enterprise.md#use-multiple-apps-with-same-app-definition). This includes, optionally, running LUIS from a [container](/azure/cognitive-services/luis/luis-container-howto).
+* Spread the load to more LUIS apps with the [same app definition](luis-concept-enterprise.md#use-multiple-apps-with-same-app-definition). This includes, optionally, running LUIS from a [container](./luis-container-howto.md).
* Create and [assign multiple keys](luis-concept-enterprise.md#assign-multiple-luis-keys-to-same-app) to the app. ## Can I Use multiple apps with same app definition?
To get the same top intent between all the apps, make sure the intent prediction
When training these apps, make sure to [train with all data](luis-how-to-train.md#train-with-all-data).
-Designate a single main app. Any utterances that are suggested for review should be added to the main app, then moved back to all the other apps. This is either a full export of the app, or loading the labeled utterances from the main app to the other apps. Loading can be done from either the [LUIS](/azure/cognitive-services/luis/luis-reference-regions) website or the authoring API for a [single utterance](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5890b47c39e2bb052c5b9c08) or for a [batch](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5890b47c39e2bb052c5b9c09).
+Designate a single main app. Any utterances that are suggested for review should be added to the main app, then moved back to all the other apps. This is either a full export of the app, or loading the labeled utterances from the main app to the other apps. Loading can be done from either the [LUIS](./luis-reference-regions.md) website or the authoring API for a [single utterance](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5890b47c39e2bb052c5b9c08) or for a [batch](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/5890b47c39e2bb052c5b9c09).
Schedule a periodic review, such as every two weeks, of [endpoint utterances](luis-how-to-review-endpoint-utterances.md) for active learning, then retrain and republish the app.
Yes, you can use the LUIS [container](luis-container-howto.md) for these scenari
## How do I integrate LUIS with Azure Bot Services?
-Use this [tutorial](/composer/how-to-add-luis) to integrate LUIS app with a Bot
+Use this [tutorial](/composer/how-to-add-luis) to integrate LUIS app with a Bot
cognitive-services Improve Application https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/how-to/improve-application.md
Use this article to learn how you can improve your LUIS apps, such as reviewing
## Active Learning
-The process of reviewing endpoint utterances for correct predictions is called Active learning. Active learning captures queries that are sent to the endpoint, and selects user utterances that it is unsure of. You review these utterances to select the intent and mark the entities for these real-world utterances. Then you can accept these changes into your app's example utterances, then [train](/azure/cognitive-services/luis/how-to/train-test?branch=pr-en-us-181263) and [publish](/azure/cognitive-services/luis/how-to/publish?branch=pr-en-us-181263) the app. This helps LUIS identify utterances more accurately.
+The process of reviewing endpoint utterances for correct predictions is called Active learning. Active learning captures queries that are sent to the endpoint, and selects user utterances that it is unsure of. You review these utterances to select the intent and mark the entities for these real-world utterances. Then you can accept these changes into your app's example utterances, then [train](./train-test.md?branch=pr-en-us-181263) and [publish](./publish.md?branch=pr-en-us-181263) the app. This helps LUIS identify utterances more accurately.
## Log user queries to enable active learning
-To enable active learning, you must log user queries. This is accomplished by calling the [endpoint query](/azure/cognitive-services/luis/luis-get-started-create-app#query-the-v3-api-prediction-endpoint) with the `log=true` query string parameter and value.
+To enable active learning, you must log user queries. This is accomplished by calling the [endpoint query](../luis-get-started-create-app.md#query-the-v3-api-prediction-endpoint) with the `log=true` query string parameter and value.
> [!Note] > To disable active learning, don't log user queries. You can change the query parameters by setting log=false in the endpoint query or omit the log parameter because the default value is false for the V3 endpoint.
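A minimal sketch of building a V3 prediction endpoint query with logging enabled. The helper name is hypothetical; the path and `log` parameter follow the V3 prediction API shape, but confirm the exact format against the LUIS reference:

```python
from urllib.parse import urlencode

def prediction_url(endpoint, app_id, slot, query, log=True):
    """Build a V3 prediction endpoint URL; log=true enables query logging,
    which feeds the active-learning review described above."""
    base = f"{endpoint}/luis/prediction/v3.0/apps/{app_id}/slots/{slot}/predict"
    params = {"query": query, "log": str(log).lower()}
    return f"{base}?{urlencode(params)}"
```

Omitting `log=true` (or passing `log=false`) leaves logging off, since false is the V3 default.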
The optional square brackets syntax "*[ ]*" lets you add optional text to the te
### Next Steps:
-To test how performance improves, you can access the test console by selecting **Test** in the top panel. For instructions on how to test your app using the test console, see [Train and test your app](/azure/cognitive-services/luis/luis-interactive-test).
+To test how performance improves, you can access the test console by selecting **Test** in the top panel. For instructions on how to test your app using the test console, see [Train and test your app](/azure/cognitive-services/luis/luis-interactive-test).
cognitive-services Intents https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/how-to/intents.md
When the filters and view are applied and there are example utterances with erro
Each row shows the current training's prediction score for the example utterance, and the nearest other intent score, which is the difference between these two scores. > [!Tip]
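The "nearest other intent" difference can be illustrated with a small helper (hypothetical; not part of the LUIS portal or API):

```python
def nearest_rival_delta(scores):
    """Given a mapping of intent -> prediction score, return the gap
    between the top intent's score and the nearest rival's score.
    A small gap flags utterances that are easily confused."""
    top, rival = sorted(scores.values(), reverse=True)[:2]
    return top - rival
```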
-> To fix intent prediction errors, use the [Summary dashboard](/azure/cognitive-services/luis/luis-how-to-use-dashboard). The summary dashboard provides analysis for the active version's last training and offers the top suggestions to fix your model.
+> To fix intent prediction errors, use the [Summary dashboard](../luis-how-to-use-dashboard.md). The summary dashboard provides analysis for the active version's last training and offers the top suggestions to fix your model.
## Add a prebuilt intent
Now imagine you want to quickly create a confirmation intent. You can use one of
* [Add entities](entities.md) * [Label entities](label-utterances.md)
-* [Train and test](train-test.md)
+* [Train and test](train-test.md)
cognitive-services Orchestration Projects https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/how-to/orchestration-projects.md
Last updated 01/06/2022
# Combine LUIS and question answering capabilities
-Cognitive Services provides two natural language processing services, [Language Understanding](/azure/cognitive-services/luis/what-is-luis) (LUIS) and question answering, each with a different purpose. Understand when to use each service and how they compliment each other.
+Cognitive Services provides two natural language processing services, [Language Understanding](../what-is-luis.md) (LUIS) and question answering, each with a different purpose. Understand when to use each service and how they complement each other.
Natural language processing (NLP) allows your client application, such as a chat bot, to work with your users' natural language.
You need to follow the following steps to change LUIS authoring resource to a La
## Next steps
-[Conversational language understanding documentation](../../language-service/conversational-language-understanding/how-to/create-project.md#create-an-orchestration-workflow-project).
+[Conversational language understanding documentation](../../language-service/conversational-language-understanding/how-to/create-project.md#create-an-orchestration-workflow-project).
cognitive-services Publish https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/how-to/publish.md
If you need the endpoint URL, select the link or select **Manage** in the top
## Next steps -- See [Manage keys](/azure/cognitive-services/luis/luis-how-to-azure-subscription) to add keys to Azure subscription key to LUIS-- See [Train and test your app](/azure/cognitive-services/luis/luis-interactive-test) for instructions on how to test your published app in the test console.
+- See [Manage keys](../luis-how-to-azure-subscription.md) to add your Azure subscription keys to LUIS
+- See [Train and test your app](/azure/cognitive-services/luis/luis-interactive-test) for instructions on how to test your published app in the test console.
cognitive-services Sign In https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/how-to/sign-in.md
Use this article to get started with the LUIS portal, and create an authoring re
* **Tenant Name** - the tenant your Azure subscription is associated with. You will not be able to switch tenants from the existing window. You can switch tenants by closing this window and selecting the avatar at the top-right corner of the screen, containing your initials. Select **Choose a different authoring resource** from the top to reopen the window. * **Azure Resource group name** - a custom resource group name you choose in your subscription. Resource groups allow you to group Azure resources for access and management. If you currently do not have a resource group in your subscription, you will not be allowed to create one in the LUIS portal. Go to the [Azure portal](https://portal.azure.com/#create/Microsoft.ResourceGroup) to create one, then go to LUIS to continue the sign-in process. * **Azure Resource name** - a custom name you choose, used as part of the URL for your authoring transactions. Your resource name can only include alphanumeric characters, `-`, and can't start or end with `-`. If any other symbols are included in the name, creating a resource will fail.
-* **Location** - Choose to author your applications in one of the [three authoring locations](/azure/cognitive-services/luis/luis-reference-regions) that are currently supported by LUIS including: West US, West Europe and East Australia
-* **Pricing tier** - By default, F0 authoring pricing tier is selected as it is the recommended. Create a [customer managed key](/azure/cognitive-services/luis/encrypt-data-at-rest#customer-managed-keys-for-language-understanding) from the Azure portal if you are looking for an extra layer of security.
+* **Location** - Choose to author your applications in one of the [three authoring locations](../luis-reference-regions.md) currently supported by LUIS: West US, West Europe, and Australia East.
+* **Pricing tier** - By default, the F0 authoring pricing tier is selected, as it is the recommended tier. Create a [customer managed key](../encrypt-data-at-rest.md#customer-managed-keys-for-language-understanding) from the Azure portal if you are looking for an extra layer of security.
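The resource-name rule described above (alphanumeric characters and `-` only, and the name can't start or end with `-`) can be expressed as a regular expression. A minimal sketch, with a helper name of our own choosing:

```python
import re

# Sketch of the naming rule stated above: alphanumeric characters and '-',
# and the name can't start or end with '-'. The helper name is hypothetical.
NAME_RE = re.compile(r"^[A-Za-z0-9](?:[A-Za-z0-9-]*[A-Za-z0-9])?$")

def is_valid_resource_name(name):
    return bool(NAME_RE.match(name))
```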
8. Now you have successfully signed in to LUIS. You can now start creating applications.
There are a couple of ways to create a LUIS app. You can create a LUIS app in th
**Using the LUIS portal** You can create a new app in the portal in several ways: * Start with an empty app and create intents, utterances, and entities.
-* Start with an empty app and add a [prebuilt domain](/azure/cognitive-services/luis/luis-concept-prebuilt-model?branch=pr-en-us-181263).
+* Start with an empty app and add a [prebuilt domain](../luis-concept-prebuilt-model.md?branch=pr-en-us-181263).
* Import a LUIS app from a .lu or .json file that already contains intents, utterances, and entities. **Using the authoring APIs** You can create a new app with the authoring APIs in a couple of ways:
There are a couple of ways to create a LUIS app. You can create a LUIS app in th
## Next steps
-If your app design includes intent detection, [create new intents](intents.md), and add example utterances. If your app design is only data extraction, add example utterances to the None intent, then [create entities](entities.md), and label the example utterances with those entities.
+If your app design includes intent detection, [create new intents](intents.md), and add example utterances. If your app design is only data extraction, add example utterances to the None intent, then [create entities](entities.md), and label the example utterances with those entities.
cognitive-services Train Test https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/how-to/train-test.md
Testing an app is an iterative process. After training your LUIS app, test it wi
Interactive testing is done from the **Test** panel of the LUIS portal. You can enter an utterance to see how intents and entities are identified and scored. If LUIS isn't predicting an utterance's intents and entities as you would expect, copy the utterance to the **Intent** page as a new utterance. Then label parts of that utterance for entities to train your LUIS app.
-See [batch testing](/azure/cognitive-services/luis/luis-how-to-batch-test) if you are testing more than one utterance at a time, and the [Prediction scores](/azure/cognitive-services/luis/luis-concept-prediction-score) article to learn more about prediction scores.
+See [batch testing](../luis-how-to-batch-test.md) if you are testing more than one utterance at a time, and the [Prediction scores](../luis-concept-prediction-score.md) article to learn more about prediction scores.
## Test an utterance
If you are using [Patterns](/azure/cognitive-services/luis/luis-concept-patterns
## Compare with published version
-You can test the active version of your app with the published [endpoint](/azure/cognitive-services/luis/luis-glossary#endpoint) version. In the **Inspect** panel, select **Compare with published**.
+You can test the active version of your app with the published [endpoint](../luis-glossary.md#endpoint) version. In the **Inspect** panel, select **Compare with published**.
> [!NOTE] > Any testing against the published model is deducted from your Azure subscription quota balance.
You can view the endpoint JSON returned for the comparison by selecting the **Sh
## Next steps
-If testing requires testing a batch of utterances, See [batch testing](/azure/cognitive-services/luis/luis-how-to-batch-test).
+If you need to test a batch of utterances, see [batch testing](../luis-how-to-batch-test.md).
If testing indicates that your LUIS app doesn't recognize the correct intents and entities, you can work to improve your LUIS app's accuracy by labeling more utterances or adding features.
-* [Improve your application](/azure/cognitive-services/luis/how-to/improve-application?branch=pr-en-us-181263)
-* [Publishing your application](/azure/cognitive-services/luis/how-to/publish?branch=pr-en-us-181263)
+* [Improve your application](./improve-application.md?branch=pr-en-us-181263)
+* [Publishing your application](./publish.md?branch=pr-en-us-181263)
cognitive-services Data Sources And Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/QnAMaker/Concepts/data-sources-and-content.md
The table below summarizes the types of content and file formats that are suppor
|Source Type|Content Type| Examples| |--|--|--|
-|URL|FAQs<br> (Flat, with sections or with a topics homepage)<br>Support pages <br> (Single page how-to articles, troubleshooting articles etc.)|[Plain FAQ](../troubleshooting.md), <br>[FAQ with links](https://www.microsoft.com/en-us/software-download/faq),<br> [FAQ with topics homepage](https://www.microsoft.com/Licensing/servicecenter/Help/Faq.aspx)<br>[Support article](./best-practices.md)|
+|URL|FAQs<br> (Flat, with sections or with a topics homepage)<br>Support pages <br> (Single page how-to articles, troubleshooting articles etc.)|[Plain FAQ](../troubleshooting.md), <br>[FAQ with links](https://www.microsoft.com/microsoft-365/microsoft-365-for-home-and-school-faq),<br> [FAQ with topics homepage](https://www.microsoft.com/Licensing/servicecenter/Help/Faq.aspx)<br>[Support article](./best-practices.md)|
|PDF / DOC|FAQs,<br> Product Manual,<br> Brochures,<br> Paper,<br> Flyer Policy,<br> Support guide,<br> Structured QnA,<br> etc.|**Without Multi-turn**<br>[Structured QnA.docx](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/structured.docx),<br> [Sample Product Manual.pdf](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/product-manual.pdf),<br> [Sample semi-structured.docx](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/semi-structured.docx),<br> [Sample white paper.pdf](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/white-paper.pdf),<br> [Unstructured blog.pdf](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/Introducing-surface-laptop-4-and-new-access.pdf),<br> [Unstructured white paper.pdf](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/sample-unstructured-paper.pdf)<br><br>**Multi-turn**:<br>[Surface Pro (docx)](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/multi-turn.docx)<br>[Contoso Benefits (docx)](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/Multiturn-ContosoBenefits.docx)<br>[Contoso Benefits (pdf)](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/Multiturn-ContosoBenefits.pdf)| |*Excel|Structured QnA file<br> (including RTF, HTML support)|**Without Multi-turn**:<br>[Sample QnA FAQ.xls](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/QnA%20Maker%20Sample%20FAQ.xlsx)<br><br>**Multi-turn**:<br>[Structured simple 
FAQ.xls](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/Structured-multi-turn-format.xlsx)<br>[Surface laptop FAQ.xls](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/Multiturn-Surface-Pro.xlsx)| |*TXT/TSV|Structured QnA file|[Sample chit-chat.tsv](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/qna-maker/data-source-formats/Scenario_Responses_Friendly.tsv)|
cognitive-services Add Sharepoint Datasources https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/QnAMaker/How-To/add-sharepoint-datasources.md
If the QnA Maker knowledge base manager is not the Active Directory manager, you
## Add supported file types to knowledge base
-You can add all QnA Maker-supported [file types](/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types) from a SharePoint site to your knowledge base. You may have to grant [permissions](#permissions) if the file resource is secured.
+You can add all QnA Maker-supported [file types](../concepts/data-sources-and-content.md#file-and-url-data-types) from a SharePoint site to your knowledge base. You may have to grant [permissions](#permissions) if the file resource is secured.
1. From the library with the SharePoint site, select the file's ellipsis menu, `...`. 1. Copy the file's URL.
Use the **@microsoft.graph.downloadUrl** from the previous section as the `fileu
## Next steps > [!div class="nextstepaction"]
-> [Collaborate on your knowledge base](/azure/cognitive-services/qnamaker/concepts/data-sources-and-content#file-and-url-data-types.yml)
+> [Collaborate on your knowledge base](../concepts/data-sources-and-content.md)
cognitive-services Beginners Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Translator/custom-translator/v2-preview/beginners-guide.md
Finding in-domain quality data is often a challenging task that varies based on
| Source | What it does | Rules to follow | ||||
-| Bilingual training documents | Teaches the system your terminology and style. | **Be liberal**. Any in-domain human translation is better than machine translation. Add and remove documents as you go and try to improve the [BLEU score](/azure/cognitive-services/translator/custom-translator/what-is-bleu-score?WT.mc_id=aiml-43548-heboelma). |
+| Bilingual training documents | Teaches the system your terminology and style. | **Be liberal**. Any in-domain human translation is better than machine translation. Add and remove documents as you go and try to improve the [BLEU score](../what-is-bleu-score.md?WT.mc_id=aiml-43548-heboelma). |
| Tuning documents | Trains the Neural Machine Translation parameters. | **Be strict**. Compose them to be optimally representative of what you are going to translate in the future. |
-| Test documents | Calculate the [BLEU score](/azure/cognitive-services/translator/custom-translator/what-is-bleu-score?WT.mc_id=aiml-43548-heboelma).| **Be strict**. Compose test documents to be optimally representative of what you plan to translate in the future. |
+| Test documents | Calculate the [BLEU score](../what-is-bleu-score.md?WT.mc_id=aiml-43548-heboelma).| **Be strict**. Compose test documents to be optimally representative of what you plan to translate in the future. |
| Phrase dictionary | Forces the given translation 100% of the time. | **Be restrictive**. A phrase dictionary is case-sensitive and any word or phrase listed is translated in the way you specify. In many cases, it is better to not use a phrase dictionary and let the system learn. | | Sentence dictionary | Forces the given translation 100% of the time. | **Be strict**. A sentence dictionary is case-insensitive and good for common in domain short sentences. For a sentence dictionary match to occur, the entire submitted sentence must match the source dictionary entry. If only a portion of the sentence matches, the entry won't match. |
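The dictionary rows above describe two different matching rules: phrase entries are case-sensitive, while sentence entries are case-insensitive and apply only when the entire submitted sentence matches an entry. A toy sketch of those semantics (not Custom Translator internals; the function names are ours):

```python
# Illustrative sketch of the matching rules in the table above. Sentence
# dictionary: case-insensitive, whole-sentence match only. Phrase dictionary:
# case-sensitive, matches anywhere in the sentence.
def sentence_dictionary_lookup(sentence, sentence_dict):
    """Return the forced translation only on a whole-sentence, case-insensitive match."""
    return {k.lower(): v for k, v in sentence_dict.items()}.get(sentence.lower())

def phrase_dictionary_hits(sentence, phrase_dict):
    """Return the case-sensitive phrase entries found in the sentence."""
    return {p: t for p, t in phrase_dict.items() if p in sentence}
```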
BLEU (Bilingual Evaluation Understudy) is an algorithm for evaluating the precis
A BLEU score is a number between zero and 100. A score of zero indicates a low quality translation where nothing in the translation matched the reference. A score of 100 indicates a perfect translation that is identical to the reference. It's not necessary to attain a score of 100 - a BLEU score between 40 and 60 indicates a high-quality translation.
-[Read more](/azure/cognitive-services/translator/custom-translator/what-is-bleu-score?WT.mc_id=aiml-43548-heboelma)
+[Read more](../what-is-bleu-score.md?WT.mc_id=aiml-43548-heboelma)
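For intuition about what a BLEU score measures, here is a deliberately simplified sketch: clipped unigram precision times a brevity penalty. Real BLEU combines clipped 1- to 4-gram precisions with a geometric mean, so this toy version is for illustration only.

```python
import math
from collections import Counter

# Toy version of the BLEU idea: clipped unigram precision scaled by a brevity
# penalty, on the 0-100 scale described above. Not the full BLEU algorithm.
def unigram_bleu(candidate, reference):
    cand, ref = candidate.split(), reference.split()
    overlap = Counter(cand) & Counter(ref)           # clipped word matches
    precision = sum(overlap.values()) / len(cand)
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / len(cand))
    return 100 * bp * precision
```

A candidate identical to the reference scores 100; one sharing no words scores zero; short or partial matches land in between.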
## What happens if I don't submit tuning or testing data?
After your model is successfully trained, you can view the model's BLEU score an
## Next steps > [!div class="nextstepaction"]
-> [Try our Quickstart](quickstart.md)
+> [Try our Quickstart](quickstart.md)
cognitive-services Create Manage Training Documents https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Translator/custom-translator/v2-preview/how-to/create-manage-training-documents.md
Finding in-domain quality data is often a challenging task that varies based on
| Source | What it does | Rules to follow | ||||
-| Bilingual training documents | Teaches the system your terminology and style. | **Be liberal**. Any in-domain human translation is better than machine translation. Add and remove documents as you go and try to improve the [BLEU score](/azure/cognitive-services/translator/custom-translator/what-is-bleu-score?WT.mc_id=aiml-43548-heboelma). |
+| Bilingual training documents | Teaches the system your terminology and style. | **Be liberal**. Any in-domain human translation is better than machine translation. Add and remove documents as you go and try to improve the [BLEU score](../../what-is-bleu-score.md?WT.mc_id=aiml-43548-heboelma). |
| Tuning documents | Trains the Neural Machine Translation parameters. | **Be strict**. Compose them to be optimally representative of what you are going to translate in the future. | | Test documents | Calculate the [BLEU score](../beginners-guide.md#what-is-a-bleu-score).| **Be strict**. Compose test documents to be optimally representative of what you plan to translate in the future. | | Phrase dictionary | Forces the given translation 100% of the time. | **Be restrictive**. A phrase dictionary is case-sensitive and any word or phrase listed is translated in the way you specify. In many cases, it is better to not use a phrase dictionary and let the system learn. |
At this point, Custom Translator is processing your documents and attempting to
- Learn [how to train a model](train-custom-model.md). - Learn [how to test and evaluate model quality](view-model-test-translation.md). - Learn [how to publish model](publish-model.md).-- Learn [how to translate with custom models](translate-with-custom-model.md).
+- Learn [how to translate with custom models](translate-with-custom-model.md).
cognitive-services View Model Test Translation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Translator/custom-translator/v2-preview/how-to/view-model-test-translation.md
BLEU (Bilingual Evaluation Understudy) is an algorithm for evaluating the precis
A BLEU score is a number between zero and 100. A score of zero indicates a low-quality translation where nothing in the translation matched the reference. A score of 100 indicates a perfect translation that is identical to the reference. It's not necessary to attain a score of 100 - a BLEU score between 40 and 60 indicates a high-quality translation.
-[Read more](/azure/cognitive-services/translator/custom-translator/what-is-bleu-score?WT.mc_id=aiml-43548-heboelma)
+[Read more](../../what-is-bleu-score.md?WT.mc_id=aiml-43548-heboelma)
## Model details
A BLEU score is a number between zero and 100. A score of zero indicates a low-q
## Next steps - Learn [how to publish/deploy a custom model](publish-model.md).-- Learn [how to translate documents with a custom model](translate-with-custom-model.md).
+- Learn [how to translate documents with a custom model](translate-with-custom-model.md).
cognitive-services Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Translator/custom-translator/v2-preview/quickstart.md
Once you have the above prerequisites, sign in to the [Custom Translator](https:
You can read an overview of translation and custom translation, learn some tips, and watch a getting started video in the [Azure AI technical blog](https://techcommunity.microsoft.com/t5/azure-ai/customize-a-translation-to-make-sense-in-a-specific-context/ba-p/2811956). >[!Note]
->Custom Translator does not support creating workspace for a Translator Text API resource created inside an [Enabled VNet](/azure/api-management/api-management-using-with-vnet?tabs=stv2).
+>Custom Translator does not support creating a workspace for a Translator Text API resource created inside an [Enabled VNet](../../../../api-management/api-management-using-with-vnet.md?tabs=stv2).
## Process summary
Publishing your model makes it available for use with the Translator API. A proj
## Next steps > [!div class="nextstepaction"]
-> [Learn how to manage workspaces](how-to/create-manage-workspace.md)
+> [Learn how to manage workspaces](how-to/create-manage-workspace.md)
cognitive-services Start Translation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Translator/document-translation/reference/start-translation.md
Destination for the finished translated documents.
The following are examples of batch requests. > [!NOTE]
-> In the following examples, limited access has been granted to the contents of an Azure Storage container [using a shared access signature(SAS)](/azure/storage/common/storage-sas-overview) token.
+> In the following examples, limited access has been granted to the contents of an Azure Storage container [using a shared access signature (SAS)](../../../../storage/common/storage-sas-overview.md) token.
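A SAS token is, at its core, an HMAC-SHA256 signature over a string-to-sign, computed with the storage account key. The exact string-to-sign fields are defined by Azure Storage and are more involved than shown here; this sketch (with a made-up key and input) only illustrates the signing mechanics:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

# Simplified illustration of producing a SAS signature: HMAC-SHA256 of a
# string-to-sign using the base64-encoded account key. The real string-to-sign
# format has additional fields defined by Azure Storage; values here are made up.
def sign_sas(string_to_sign, base64_account_key):
    key = base64.b64decode(base64_account_key)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return quote(base64.b64encode(digest).decode(), safe="")

demo_key = base64.b64encode(b"demo-key").decode()
sig = sign_sas("r\n2022-02-16T00:00:00Z", demo_key)
```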
**Translating all documents in a container**
Operation-Location: https://<NAME-OF-YOUR-RESOURCE>.cognitiveservices.azure.com/
Follow our quickstart to learn more about using Document Translation and the client library. > [!div class="nextstepaction"]
-> [Get started with Document Translation](../get-started-with-document-translation.md)
-
+> [Get started with Document Translation](../get-started-with-document-translation.md)
cognitive-services Create Project https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/language-service/custom-named-entity-recognition/how-to/create-project.md
To set proper roles on your storage account:
[!INCLUDE [Storage connection note](../../custom-classification/includes/storage-account-note.md)]
-For information on authorizing access to your Azure blob storage account and data, see [Authorize access to data in Azure storage](/azure/storage/common/authorize-data-access?toc=/azure/storage/blobs/toc.json).
+For information on authorizing access to your Azure blob storage account and data, see [Authorize access to data in Azure storage](../../../../storage/common/authorize-data-access.md?toc=%2fazure%2fstorage%2fblobs%2ftoc.json).
## Prepare training data
communication-services Custom Teams Endpoint Authentication Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/interop/custom-teams-endpoint-authentication-overview.md
The following sequence diagram shows the detailed steps of the authentication:
:::image type="content" source="./media/custom-teams-endpoint/authentication-case-single-tenant-azure-rbac.svg" alt-text="Sequence diagram is describing detailed set of steps, that happens to authenticate Teams user. In the end, client application retrieves an Azure Communication Services access token for single tenant Azure AD application." lightbox="./media/custom-teams-endpoint/authentication-case-single-tenant-azure-rbac.svg"::: Prerequisites:-- Alice or her Azure AD Administrator needs to provide consent to the Fabrikam's Azure Active Directory Application before first sign in. To learn more about [consent flow](https://docs.microsoft.com/azure/active-directory/develop/consent-framework).-- The admin of the Azure Communication Services resource must grant Alice permission to perform this action. You can learn about the [Azure RBAC role assignment](https://docs.microsoft.com/azure/role-based-access-control/role-assignments-portal).
+- Alice or her Azure AD Administrator needs to provide consent to Fabrikam's Azure Active Directory application before the first sign-in. To learn more, see the [consent flow](../../../active-directory/develop/consent-framework.md).
+- The admin of the Azure Communication Services resource must grant Alice permission to perform this action. You can learn about the [Azure RBAC role assignment](../../../role-based-access-control/role-assignments-portal.md).
Steps: 1. Authentication of Alice from Fabrikam against Fabrikam's Azure Active Directory: This step is a standard OAuth flow leveraging Microsoft Authentication Library (MSAL) to authenticate against Fabrikam's Azure Active Directory. Alice is authenticating for Fabrikam's Azure AD application. If the authentication of Alice is successful, Fabrikam's client application receives Azure AD access token 'A'. Details of the token are captured below. Developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
The following sequence diagram shows the detailed steps of the authentication:
:::image type="content" source="./media/custom-teams-endpoint/authentication-case-multiple-tenants-hmac.svg" alt-text="Sequence diagram is describing detailed set of steps, that happens to authenticate Teams user and retrieve Azure Communication Services access token for multi-tenant Azure AD application." lightbox="./media/custom-teams-endpoint/authentication-case-multiple-tenants-hmac.svg"::: Prerequisites:-- Alice or her Azure AD Administrator needs to provide consent to the Contoso's Azure Active Directory Application before first sign in. To learn more about [consent flow](https://docs.microsoft.com/azure/active-directory/develop/consent-framework).
+- Alice or her Azure AD Administrator needs to provide consent to Contoso's Azure Active Directory application before the first sign-in. To learn more, see the [consent flow](../../../active-directory/develop/consent-framework.md).
Steps: 1. Authentication of Alice from Fabrikam against Fabrikam's Azure Active Directory: This step is a standard OAuth flow using Microsoft Authentication Library (MSAL) to authenticate against Fabrikam's Azure Active Directory. Alice is authenticating for Contoso's Azure AD application. If the authentication of Alice is successful, Contoso's client application receives Azure AD access token 'A'. Details of the token are captured below. Developer experience is captured in the [quickstart](../../quickstarts/manage-teams-identity.md).
The following articles might be of interest to you:
- Learn more about [authentication](../authentication.md). - Try [quickstart for authentication of Teams users](../../quickstarts/manage-teams-identity.md).-- Try [quickstart for calling to a Teams user](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
+- Try [quickstart for calling to a Teams user](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
communication-services Custom Teams Endpoint Firewall Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/interop/custom-teams-endpoint-firewall-configuration.md
Azure Communication Services provides the ability to leverage Communication Serv
The following articles might be of interest to you: - Learn more about [Azure Communication Services firewall configuration](../voice-video-calling/network-requirements.md).-- Learn about [Microsoft Teams firewall configuration](https://docs.microsoft.com/microsoft-365/enterprise/urls-and-ip-address-ranges?view=o365-worldwide#skype-for-business-online-and-microsoft-teams).
+- Learn about [Microsoft Teams firewall configuration](/microsoft-365/enterprise/urls-and-ip-address-ranges?view=o365-worldwide#skype-for-business-online-and-microsoft-teams).
communication-services Custom Teams Endpoint Use Cases https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/interop/custom-teams-endpoint-use-cases.md
The following sequence diagram shows detailed steps for initiation of a Teams Vo
### Steps 1. Authenticate Alice from Fabrikam in Contoso's client application: Alice is using a browser to open Fabrikam's web page and authenticates. You can find more details about [the authentication with Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
-2. Load users from Fabrikam's organization and their identifiers: Contoso client application utilizes Graph API to get a list of users from Fabrikam's tenant. Alice or her Admin needs to provide consent to Graph API to perform this action. You can learn more about [the Graph API command in the documentation](https://docs.microsoft.com/graph/api/user-list).
+2. Load users from Fabrikam's organization and their identifiers: Contoso client application utilizes Graph API to get a list of users from Fabrikam's tenant. Alice or her Admin needs to provide consent to Graph API to perform this action. You can learn more about [the Graph API command in the documentation](/graph/api/user-list).
```
GET https://graph.microsoft.com/v1.0/users
```
The following sequence diagram shows detailed steps for joining a Teams meeting:
### Steps 1. Authenticate Alice from Fabrikam in Contoso's client application: Alice is using a browser to open Fabrikam's web page and authenticates. You can find more details about [the authentication with Teams identity](./custom-teams-endpoint-authentication-overview.md). If the authentication is successful, Alice is redirected to the initial page.
-2. Load Teams meetings and their identifiers: Contoso client application utilizes Graph API to get a list of Teams meetings for Fabrikam's users. Alice or her Admin needs to provide consent to Graph API to perform this action. You can learn more about [the Graph API command in the documentation](https://docs.microsoft.com/graph/api/user-list-calendarview).
+2. Load Teams meetings and their identifiers: Contoso client application utilizes Graph API to get a list of Teams meetings for Fabrikam's users. Alice or her Admin needs to provide consent to Graph API to perform this action. You can learn more about [the Graph API command in the documentation](/graph/api/user-list-calendarview).
```
GET https://graph.microsoft.com/v1.0/me/calendar/calendarView?startDateTime={start_datetime}&endDateTime={end_datetime}
```
The following articles might be of interest to you:
- Learn more about [authentication](../authentication.md). - Try [quickstart for authentication of Teams users](../../quickstarts/manage-teams-identity.md).-- Try [quickstart for calling to a Teams user](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
+- Try [quickstart for calling to a Teams user](../../quickstarts/voice-video-calling/get-started-with-voice-video-calling-custom-teams-client.md).
communication-services Join Teams Meeting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/join-teams-meeting.md
During a meeting, Communication Services users will be able to use core audio, v
Additional information on required dataflows for joining Teams meetings is available at the [client and server architecture page](client-and-server-architecture.md). The [Group Calling Hero Sample](../samples/calling-hero-sample.md) provides example code for joining a Teams meeting from a web application. ## Diagnostics and call analytics
-After a Teams meeting ends, diagnostic information about the meeting is available using the [Communication Services logging and diagnostics](/azure/communication-services/concepts/logging-and-diagnostics) and using the [Teams Call Analytics](/MicrosoftTeams/use-call-analytics-to-troubleshoot-poor-call-quality) in the Teams admin center. Communication Services users will appear as "Anonymous" in Call Analytics screens. Communication Services users aren't included in the [Teams real-time Analytics](/microsoftteams/use-real-time-telemetry-to-troubleshoot-poor-meeting-quality).
+After a Teams meeting ends, diagnostic information about the meeting is available using the [Communication Services logging and diagnostics](./logging-and-diagnostics.md) and using the [Teams Call Analytics](/MicrosoftTeams/use-call-analytics-to-troubleshoot-poor-call-quality) in the Teams admin center. Communication Services users will appear as "Anonymous" in Call Analytics screens. Communication Services users aren't included in the [Teams real-time Analytics](/microsoftteams/use-real-time-telemetry-to-troubleshoot-poor-meeting-quality).
## Privacy

Interoperability between Azure Communication Services and Microsoft Teams enables your applications and users to participate in Teams calls, meetings, and chat. It is your responsibility to ensure that the users of your application are notified when recording or transcription are enabled in a Teams call or meeting.
Microsoft will indicate to you via the Azure Communication Services API that rec
- PowerPoint presentations aren't rendered for Communication Services users.
- Teams meetings support up to 1000 participants, but the Azure Communication Services Calling SDK currently only supports 350 participants and Chat SDK supports 250 participants.
- With [Cloud Video Interop for Microsoft Teams](/microsoftteams/cloud-video-interop), some devices have seen issues when a Communication Services user shares their screen.
-- [Communication Services voice and video calling events](/azure/event-grid/communication-services-voice-video-events) aren't raised for Teams meeting.
+- [Communication Services voice and video calling events](../../event-grid/communication-services-voice-video-events.md) aren't raised for Teams meeting.
- Features such as reactions, raised hand, together mode, and breakout rooms are only available for Teams users.
- Communication Services users can't interact with poll or Q&A apps in meetings.
- Communication Services won't have access to all chat features supported by Teams. They can send and receive text messages, use typing indicators, read receipts and other features supported by Chat SDK. However, features like file sharing, reply, or react to a message aren't supported for Communication Services users.
communication-services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/router/concepts.md
An exception policy controls the behavior of a Job based on a trigger and execut
[nuget]: https://www.nuget.org/
[netstandars2mappings]: https://github.com/dotnet/standard/blob/master/docs/versions.md
[useraccesstokens]: https://docs.microsoft.com/azure/communication-services/quickstarts/access-tokens?pivots=programming-language-csharp
-[communication_resource_docs]: https://docs.microsoft.com/azure/communication-services/quickstarts/create-communication-resource?tabs=windows&pivots=platform-azp
-[communication_resource_create_portal]: https://docs.microsoft.com/azure/communication-services/quickstarts/create-communication-resource?tabs=windows&pivots=platform-azp
-[communication_resource_create_power_shell]: https://docs.microsoft.com/powershell/module/az.communication/new-azcommunicationservice
-[communication_resource_create_net]: https://docs.microsoft.com/azure/communication-services/quickstarts/create-communication-resource?tabs=windows&pivots=platform-net
+[communication_resource_docs]: ../../quickstarts/create-communication-resource.md?pivots=platform-azp&tabs=windows
+[communication_resource_create_portal]: ../../quickstarts/create-communication-resource.md?pivots=platform-azp&tabs=windows
+[communication_resource_create_power_shell]: /powershell/module/az.communication/new-azcommunicationservice
+[communication_resource_create_net]: ../../quickstarts/create-communication-resource.md?pivots=platform-net&tabs=windows
[subscribe_events]: ../../how-tos/router-sdk/subscribe-events.md
[worker_registered_event]: ../../how-tos/router-sdk/subscribe-events.md#microsoftcommunicationrouterworkerregistered
[job_classified_event]: ../../how-tos/router-sdk/subscribe-events.md#microsoftcommunicationrouterjobclassified
[offer_issued_event]: ../../how-tos/router-sdk/subscribe-events.md#microsoftcommunicationrouterworkerofferissued
-[offer_accepted_event]: ../../how-tos/router-sdk/subscribe-events.md#microsoftcommunicationrouterworkerofferaccepted
+[offer_accepted_event]: ../../how-tos/router-sdk/subscribe-events.md#microsoftcommunicationrouterworkerofferaccepted
communication-services Sdk Options https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/sdk-options.md
Development of Calling and Chat applications can be accelerated by the [Azure C
| Chat | [REST](/rest/api/communication/) with proprietary signaling | Client & Service | Add real-time text chat to your applications |
| Calling | Proprietary transport | Client | Voice, video, screen-sharing, and other real-time communication |
| Calling Server | [REST](/rest/api/communication/callautomation/server-calls) | Service | Make and manage calls, play audio, and configure recording |
-| Network Traversal | [REST](/azure/communication-services/concepts/network-traversal)| Service| Access TURN servers for low-level data transport |
+| Network Traversal | [REST](./network-traversal.md)| Service| Access TURN servers for low-level data transport |
| UI Library | N/A | Client | Production-ready UI components for chat and calling apps |

### Languages and publishing locations
For more information, see the following SDK overviews:
To get started with Azure Communication

- [Create an Azure Communication Services resource](../quickstarts/create-communication-resource.md)
-- Generate [User Access Tokens](../quickstarts/access-tokens.md)
+- Generate [User Access Tokens](../quickstarts/access-tokens.md)
communication-services Closed Captions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/voice-video-calling/closed-captions.md
Azure Communication Services allows one to enable Closed Captions for the VoIP c
Closed Captions is the conversion of a voice or video call audio track into written words that appear in real time. Closed Captions are never saved and are only visible to the user that has enabled it. Here are main scenarios where Closed Captions are useful:

-- **Accessibility**. In the workplace or consumer apps, Closed Captioning for meetings, conference calls, and training videos can make a huge difference.
-- **Accessibility**. Scenarios when audio can't be heard, either because of a noisy environment, such as an airport, or because of an environment that must be kept quiet, such as a hospital.
+- **Accessibility**. In the workplace or consumer apps, Closed Captioning for meetings, conference calls, and training videos can make a huge difference. It also helps in scenarios where audio can't be heard, either because of a noisy environment, such as an airport, or because of an environment that must be kept quiet, such as a hospital.
- **Inclusivity**. Closed Captioning was developed to aid hearing-impaired people, but it could be useful for a language proficiency as well. ![closed captions work flow](../media/call-closed-caption.png)
Here are main scenarios where Closed Captions are useful:
- Support for multiple platforms with cross-platform support.
- Async processing with client subscription to events and callbacks.
- Multiple languages to choose from for recognition.
-- Support for existing SkypeToken Authentication

## Availability
The private preview will be available on all platforms.
## Next steps

-- Get started with a [Closed Captions Quickstart](Thttps://docs.microsoft.com/azure/communication-services/quickstarts/voice-video-calling/get-started-with-closed-captions?pivots=platform-iosBD)----
+- Get started with a [Closed Captions Quickstart](/azure/communication-services/quickstarts/voice-video-calling/get-started-with-closed-captions?pivots=platform-ios)
communication-services Media Comp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/voice-video-calling/media-comp.md
These media streams are typically arrayed in a grid and broadcast to call partic
- Connect devices and services using streaming protocols such as [RTMP](https://datatracker.ietf.org/doc/html/rfc7016) or [SRT](https://datatracker.ietf.org/doc/html/draft-sharabayko-srt)
- Compose media streams into complex scenes
-RTMP & SRT connectivity can be used for both input and output. Using RTMP/SRT input, a videography studio that emits RTMP/SRT can join an Azure Communication Services call. RTMP/SRT output allows you to stream media from Azure Communication Services into [Azure Media Services](https://docs.microsoft.com/azure/media-services/latest/concepts-overview), YouTube Live, and many other broadcasting channels. The ability to attach industry standard RTMP/SRT emitters and to output content to RTMP/SRT subscribers for broadcasting transforms a small group call into a virtual event that reaches millions of people in real time.
+RTMP & SRT connectivity can be used for both input and output. Using RTMP/SRT input, a videography studio that emits RTMP/SRT can join an Azure Communication Services call. RTMP/SRT output allows you to stream media from Azure Communication Services into [Azure Media Services](../../../media-services/latest/concepts-overview.md), YouTube Live, and many other broadcasting channels. The ability to attach industry standard RTMP/SRT emitters and to output content to RTMP/SRT subscribers for broadcasting transforms a small group call into a virtual event that reaches millions of people in real time.
Media Composition REST APIs (and open-source SDKs) allow you to command the Azure service to cloud compose these media streams. For example, a **presenter layout** can be used to compose a speaker and a translator together in a classic picture-in-picture style. Media Composition allows for all clients and services connected to the media data plane to enjoy a particular dynamic layout without local processing or application complexity.
The presenter layout is one of several layouts available through the media compo
<!-- To try out media composition, check out following content: -->
<!-- [Quick Start - Applying Media Composition to a video call](../../quickstarts/media-composition/get-started-media-composition.md) -->
-<!- [Tutorial - Media Composition Layouts](../../quickstarts/media-composition/media-composition-layouts.md) -->
+<!- [Tutorial - Media Composition Layouts](../../quickstarts/media-composition/media-composition-layouts.md) -->
communication-services Quickstart Botframework Integration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/chat/quickstart-botframework-integration.md
After creating the Azure Bot resource, next step would be to set a password for
### Create a Web App where actual bot logic resides
-Create a Web App where actual bot logic resides. You could check out some samples at [Bot Builder Samples](https://github.com/Microsoft/BotBuilder-Samples) and tweak them or use Bot Builder SDK to create one: [Bot Builder documentation](https://docs.microsoft.com/composer/introduction). One of the simplest ones to play around with is Echo Bot located here with steps on how to use it and it's the one being used in this example [Echo Bot](https://github.com/microsoft/BotBuilder-Samples/tree/main/samples/csharp_dotnetcore/02.echo-bot). Generally, the Bot Service expects the Bot Application Web App Controller to expose an endpoint `/api/messages`, which handles all the messages reaching the bot. To create the Bot application, follow these steps.
+Create a Web App where the actual bot logic resides. You can check out some samples at [Bot Builder Samples](https://github.com/Microsoft/BotBuilder-Samples) and tweak them, or use the Bot Builder SDK to create one: [Bot Builder documentation](/composer/introduction). One of the simplest to start with, and the one used in this example, is the [Echo Bot](https://github.com/microsoft/BotBuilder-Samples/tree/main/samples/csharp_dotnetcore/02.echo-bot), which includes steps on how to use it. Generally, the Bot Service expects the Bot Application Web App Controller to expose an endpoint `/api/messages`, which handles all the messages reaching the bot. To create the Bot application, follow these steps.
1. As previously shown, create a resource and choose `Web App` in the search.
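The `/api/messages` contract amounts to receiving an activity JSON payload and replying with another. Below is a minimal sketch of the echo logic in plain Python — this is *not* the Bot Framework SDK, and `handle_activity` is a hypothetical name used only for illustration:

```python
# Hypothetical sketch of echo-bot message handling (not the Bot Framework SDK).
# A Web App controller behind /api/messages would deserialize the incoming
# activity and return a reply activity shaped roughly like this.

def handle_activity(activity: dict) -> dict:
    """Reply to an incoming 'message' activity by echoing its text."""
    if activity.get("type") != "message":
        # Non-message activities (e.g. typing, conversationUpdate) get no echo.
        return {"type": "message", "text": "Unsupported activity type."}
    return {"type": "message", "text": f"Echo: {activity.get('text', '')}"}

reply = handle_activity({"type": "message", "text": "hello"})
print(reply["text"])  # Echo: hello
```

In a real deployment, the SDK handles authentication and serialization; only the per-activity logic resembles the function above.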
And on the ACS User side, the ACS message's metadata field will indicate this is
## Next steps
-Try the [Sample App](https://github.com/Azure/communication-preview/tree/master/samples/AzureBotService-Sample-App), which showcases a 1:1 chat between the end user and chat bot, and uses BotFramework's WebChat UI component.
+Try the [Sample App](https://github.com/Azure/communication-preview/tree/master/samples/AzureBotService-Sample-App), which showcases a 1:1 chat between the end user and chat bot, and uses BotFramework's WebChat UI component.
container-apps Vnet Custom https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/container-apps/vnet-custom.md
Additionally, subnets must have a size between /21 and /12.
As a Container Apps environment is created, you provide resource IDs for two different subnets. Both subnets must be defined in the same virtual network.

- **App subnet**: Subnet for user app containers. Subnet that contains IP ranges mapped to applications deployed as containers.
-- **Control plane subnet**: Subnet for [control plane infrastructure](/azure/azure-resource-manager/management/control-plane-and-data-plane) components and user app containers.
+- **Control plane subnet**: Subnet for [control plane infrastructure](../azure-resource-manager/management/control-plane-and-data-plane.md) components and user app containers.
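The subnet-size rule stated above (between /21 and /12) can be checked with Python's standard `ipaddress` module. The helper name `subnet_size_ok` is hypothetical, used only to illustrate the constraint:

```python
import ipaddress

# Hypothetical helper: verify a subnet's prefix length falls in the
# documented Container Apps range (/21 through /12 means a prefix
# length between 12 and 21 inclusive -- smaller prefix = larger subnet).
def subnet_size_ok(cidr: str) -> bool:
    prefix = ipaddress.ip_network(cidr).prefixlen
    return 12 <= prefix <= 21

print(subnet_size_ok("10.0.0.0/21"))  # True  (smallest allowed)
print(subnet_size_ok("10.0.0.0/24"))  # False (too small)
```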
::: zone pivot="azure-cli"
az group delete `
## Additional resources

-- Refer to [What is Azure Private Endpoint](/azure/private-link/private-endpoint-overview) for more details on configuring your private endpoint.
+- Refer to [What is Azure Private Endpoint](../private-link/private-endpoint-overview.md) for more details on configuring your private endpoint.
-- To set up DNS name resolution for internal services, you must [set up your own DNS server](/azure/dns/).
+- To set up DNS name resolution for internal services, you must [set up your own DNS server](../dns/index.yml).
## Next steps

> [!div class="nextstepaction"]
-> [Managing autoscaling behavior](scale-app.md)
+> [Managing autoscaling behavior](scale-app.md)
cosmos-db Concepts Limits https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/concepts-limits.md
Cosmos DB automatically takes backups of your data at regular intervals. For det
| | |
| Maximum number of databases | 500 |
| Maximum number of containers per database with shared throughput | 25 |
-| Maximum number of containers per database or account with dedicated throughput | 500 |
+| Maximum number of containers per account | 500 |
| Maximum number of regions | No limit (All Azure regions) |

### Serverless
cosmos-db Modeling Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/modeling-data.md
Previously updated : 08/26/2021
Last updated : 02/15/2022

# Data modeling in Azure Cosmos DB
Take this JSON snippet.
} ```
-This might be what a post entity with embedded comments would look like if we were modeling a typical blog, or CMS, system. The problem with this example is that the comments array is **unbounded**, meaning that there's no (practical) limit to the number of comments any single post can have. This may become a problem as the size of the item could grow infinitely large.
+This might be what a post entity with embedded comments would look like if we were modeling a typical blog, or CMS, system. The problem with this example is that the comments array is **unbounded**, meaning that there's no (practical) limit to the number of comments any single post can have. This may become a problem as the size of the item could grow infinitely large, so it's a design you should avoid.
As the size of the item grows the ability to transmit the data over the wire as well as reading and updating the item, at scale, will be impacted.
Post item:
}

Comment items:
-{
- "postId": "1"
- "comments": [
- {"id": 4, "author": "anon", "comment": "more goodness"},
- {"id": 5, "author": "bob", "comment": "tails from the field"},
- ...
- {"id": 99, "author": "angry", "comment": "blah angry blah angry"}
- ]
-},
-{
- "postId": "1"
- "comments": [
- {"id": 100, "author": "anon", "comment": "yet more"},
- ...
- {"id": 199, "author": "bored", "comment": "will this ever end?"}
- ]
-}
+[
+ {"id": 4, "postId": "1", "author": "anon", "comment": "more goodness"},
+ {"id": 5, "postId": "1", "author": "bob", "comment": "tails from the field"},
+ ...
+ {"id": 99, "postId": "1", "author": "angry", "comment": "blah angry blah angry"},
+ {"id": 100, "postId": "2", "author": "anon", "comment": "yet more"},
+ ...
+ {"id": 199, "postId": "2", "author": "bored", "comment": "will this ever end?"}
+]
```
-This model has the three most recent comments embedded in the post container, which is an array with a fixed set of attributes. The other comments are grouped in to batches of 100 comments and stored as separate items. The size of the batch was chosen as 100 because our fictitious application allows the user to load 100 comments at a time.
+This model has a document for each comment, with a property that contains the post ID. This allows posts to contain any number of comments and to grow efficiently. Users wanting to see more than the most recent comments would query this container, passing the postId, which should be the partition key for the comments container.
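The one-item-per-comment model above can be sketched in plain Python. This is an illustration only, not the Cosmos DB SDK — in the service, the filter below would be the query `SELECT * FROM c WHERE c.postId = "1"`, served efficiently because postId is the partition key:

```python
# Illustration of the one-item-per-comment model: each comment is its own
# item carrying the postId it belongs to, so a post can accumulate any
# number of comments without growing a single unbounded document.
comments = [
    {"id": 4,   "postId": "1", "author": "anon", "comment": "more goodness"},
    {"id": 5,   "postId": "1", "author": "bob",  "comment": "tails from the field"},
    {"id": 100, "postId": "2", "author": "anon", "comment": "yet more"},
]

# Equivalent in intent to: SELECT * FROM c WHERE c.postId = "1"
def comments_for_post(post_id: str):
    return [c for c in comments if c["postId"] == post_id]

print(len(comments_for_post("1")))  # 2
```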
Another case where embedding data isn't a good idea is when the embedded data is used often across items and will change frequently.
cosmos-db Sql Api Sdk Java Spring V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-api-sdk-java-spring-v2.md
Spring Data Azure Cosmos DB version 2 for Core (SQL) allows developers to use Azure Cosmos DB in Spring applications. Spring Data Azure Cosmos DB exposes the Spring Data interface for manipulating databases and collections, working with documents, and issuing queries. Both Sync and Async (Reactive) APIs are supported in the same Maven artifact.
+> [!IMPORTANT]
+> This is *not* the latest Azure Spring Data Cosmos SDK for Azure Cosmos DB and is outdated. Because of performance issues and instability in Azure Spring Data Cosmos SDK V2, we strongly recommend using [Azure Spring Data Cosmos v3](sql-api-sdk-java-spring-v3.md) for your project. To upgrade, follow the instructions in the [Migrate to Azure Cosmos DB Java SDK v4](migrate-java-v4-sdk.md) guide to understand the differences in the underlying Java SDK V4.
+>
+ The [Spring Framework](https://spring.io/projects/spring-framework) is a programming and configuration model that streamlines Java application development. Spring streamlines the "plumbing" of applications by using dependency injection. Many developers like Spring because it makes building and testing applications more straightforward. [Spring Boot](https://spring.io/projects/spring-boot) extends this handling of the plumbing with an eye toward web application and microservices development. [Spring Data](https://spring.io/projects/spring-data) is a programming model for accessing datastores like Azure Cosmos DB from the context of a Spring or Spring Boot application. You can use Spring Data Azure Cosmos DB in your [Azure Spring Cloud](https://azure.microsoft.com/services/spring-cloud/) applications.
cosmos-db Sql Query Getting Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/sql-query-getting-started.md
In Azure Cosmos DB SQL API accounts, there are two ways to read data:
Here are some examples of how to do **Point reads** with each SDK:

- [.NET SDK](/dotnet/api/microsoft.azure.cosmos.container.readitemasync)
-- [Java SDK](/java/api/com.azure.cosmos.cosmoscontainer.readitem#com_azure_cosmos_CosmosContainer__T_readItem_java_lang_String_com_azure_cosmos_models_PartitionKey_com_azure_cosmos_models_CosmosItemRequestOptions_java_lang_Class_T__)
-- [Node.js SDK](/javascript/api/@azure/cosmos/item#read-requestoptions-)
-- [Python SDK](/python/api/azure-cosmos/azure.cosmos.containerproxy#read-item-item--partition-key--populate-query-metrics-none--post-trigger-include-none-kwargs-)
+- [Java SDK](/java/api/com.azure.cosmos.cosmoscontainer.readitem#com-azure-cosmos-cosmoscontainer-(t)readitem(java-lang-string-com-azure-cosmos-models-partitionkey-com-azure-cosmos-models-cosmositemrequestoptions-java-lang-class(t)))
+- [Node.js SDK](/javascript/api/@azure/cosmos/item#@azure-cosmos-item-read)
+- [Python SDK](/python/api/azure-cosmos/azure.cosmos.containerproxy#azure-cosmos-containerproxy-read-item)
**SQL queries** - You can query data by writing queries using the Structured Query Language (SQL) as a JSON query language. Queries always cost at least 2.3 request units and, in general, will have a higher and more variable latency than point reads. Queries can return many items.
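The point read vs. query distinction above can be sketched in plain Python. This is an illustration only, not the Cosmos SDK — the real calls are `container.read_item(...)` and `container.query_items(...)` in the Python SDK:

```python
# Conceptual sketch (not the Cosmos SDK): a point read addresses exactly
# one item by (partition key, id), while a query filters over many items
# and can return any number of them -- hence the higher, more variable cost.
store = {
    ("pk1", "item1"): {"id": "item1", "pk": "pk1", "value": 42},
    ("pk1", "item2"): {"id": "item2", "pk": "pk1", "value": 7},
}

def point_read(pk: str, item_id: str) -> dict:
    return store[(pk, item_id)]  # direct lookup, exactly one item

def query(predicate) -> list:
    return [v for v in store.values() if predicate(v)]  # may return many

print(point_read("pk1", "item1")["value"])  # 42
```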
cosmos-db Sql Query Keywords https://github.com/Microsof