Updates from: 02/06/2021 04:09:05
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c https://docs.microsoft.com/en-us/azure/active-directory-b2c/custom-policy-trust-frameworks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/custom-policy-trust-frameworks.md
@@ -111,7 +111,7 @@ Each starter pack includes the following files:
The inheritance model is as follows:

- The child policy at any level can inherit from the parent policy and extend it by adding new elements.
-- For more complex scenarios, you can add more inheritance levels (up to 5 in total).
+- For more complex scenarios, you can add more inheritance levels (up to 10 in total).
- You can add more relying party policies. For example: delete my account, change a phone number, a SAML relying party policy, and more.

The following diagram shows the relationship between the policy files and the relying party applications.
active-directory-b2c https://docs.microsoft.com/en-us/azure/active-directory-b2c/javascript-and-page-layout https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/javascript-and-page-layout.md
@@ -59,7 +59,7 @@ To specify a page layout version for your user flow pages:
1. In your Azure AD B2C tenant, select **User flows**.
1. Select your policy (for example, "B2C_1_SignupSignin") to open it.
-1. Select **Page layouts**. Under **Layout name**, select a user flow page and choose the **Page Layout Version (Preview)**.
+1. Select **Page layouts**. Choose a **Layout name**, and then choose the **Page Layout Version (Preview)**.
For information about the different page layout versions, see the [Page layout version change log](page-layout.md).
active-directory-b2c https://docs.microsoft.com/en-us/azure/active-directory-b2c/whats-new-docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/whats-new-docs.md
@@ -6,8 +6,8 @@
--++
active-directory https://docs.microsoft.com/en-us/azure/active-directory/authentication/howto-authentication-passwordless-phone https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/howto-authentication-passwordless-phone.md
@@ -54,10 +54,13 @@ Azure AD lets you choose which authentication methods can be used during the sig
To enable the authentication method for passwordless phone sign-in, complete the following steps:

1. Sign in to the [Azure portal](https://portal.azure.com) with a *global administrator* account.
-1. Search for and select *Azure Active Directory*, then browse to **Security** > **Authentication methods** > **Authentication method policy (Preview)**
-1. Under **Passwordless phone sign-in**, choose the following options:
+1. Search for and select *Azure Active Directory*, then browse to **Security** > **Authentication methods** > **Policies**.
+1. Under **Microsoft Authenticator (preview)**, choose the following options:
   1. **Enable** - Yes or No
   1. **Target** - All users or Select users
+1. Each added group or user is enabled by default to use Microsoft Authenticator in both passwordless and push notification modes ("Any" mode). To change this, for each row:
+ 1. Browse to **...** > **Configure**.
+ 1. For **Authentication mode** - Any, Passwordless, or Push
1. To apply the new policy, select **Save**.

## User registration and management of Microsoft Authenticator
active-directory https://docs.microsoft.com/en-us/azure/active-directory/authentication/howto-password-ban-bad-on-premises-faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/howto-password-ban-bad-on-premises-faq.md
@@ -148,6 +148,146 @@ Audit mode is only supported in the on-premises Active Directory environment. Az
No. The error message seen by users when a password is rejected by a domain controller is controlled by the client machine, not by the domain controller. This behavior happens whether a password is rejected by the default Active Directory password policies or by a password-filter-based solution such as Azure AD Password Protection.
+## Password testing procedures
+
+You may want to do some basic testing of various passwords in order to validate proper operation of the software and to gain a better understanding of the [password evaluation algorithm](concept-password-ban-bad.md#how-are-passwords-evaluated). This section outlines a method for such testing that is designed to produce repeatable results.
+
+Why is it necessary to follow such steps? There are several factors that make it difficult to do controlled, repeatable testing of passwords in the on-premises Active Directory environment:
+
+* The password policy is configured and persisted in Azure, and copies of the policy are synced periodically by the on-premises DC agent(s) using a polling mechanism. The latency inherent in this polling cycle may cause confusion. For example, if you configure the policy in Azure but forget to sync it to the DC agent, then your tests may not yield the expected results. The polling interval is currently hardcoded to be once per hour, but waiting an hour between policy changes is non-ideal for an interactive testing scenario.
+* Once a new password policy is synced down to a domain controller, more latency will occur while it replicates to other domain controllers. These delays can cause unexpected results if you test a password change against a domain controller that has not yet received the latest version of the policy.
+* Testing password changes via a user interface makes it difficult to have confidence in your results. For example, it is easy to mis-type an invalid password into a user interface, especially since most password user interfaces hide user input (for example, the Windows Ctrl-Alt-Delete -> Change password UI).
+* It is not possible to strictly control which domain controller is used when testing password changes from domain-joined clients. The Windows client OS selects a domain controller based on factors such as Active Directory site and subnet assignments, environment-specific network configuration, etc.
+
+In order to avoid these problems, the steps below are based on command-line testing of password resets while logged into a domain controller.
+
+> [!WARNING]
+> These procedures should be used only in a test environment: all incoming password changes and resets are accepted without validation while the DC agent service is stopped, and logging on to a domain controller carries increased risk.
+
+The following steps assume that you have installed the DC agent on at least one domain controller, have installed at least one proxy, and have registered both the proxy and the forest.
+
+1. Log on to a domain controller that has the DC agent software installed and has been rebooted, using Domain Admin credentials (or other credentials that have sufficient privileges to create test user accounts and reset passwords).
+1. Open up Event Viewer and navigate to the [DC Agent Admin event log](howto-password-ban-bad-on-premises-monitor.md#dc-agent-admin-event-log).
+1. Open an elevated command prompt window.
+1. Create a test account for password testing.
+
+ There are many ways to create a user account, but a command-line option is offered here as a way to make it easy during repetitive testing cycles:
+
+ ```text
+ net.exe user <testuseraccountname> /add <password>
+ ```
+
+ For discussion purposes below, assume that we have created a test account named "ContosoUser", for example:
+
+ ```text
+ net.exe user ContosoUser /add <password>
+ ```
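
   If you prefer PowerShell over net.exe, the same test account can be created with the ActiveDirectory module (a sketch; it assumes the ActiveDirectory module available on domain controllers, and the password shown is a placeholder you should replace):

   ```powershell
   # Create the ContosoUser test account (the ActiveDirectory module is
   # present on domain controllers)
   Import-Module ActiveDirectory

   $password = ConvertTo-SecureString 'Placeholder-Initial-Pw-1!' -AsPlainText -Force
   New-ADUser -Name 'ContosoUser' -SamAccountName 'ContosoUser' `
       -AccountPassword $password -Enabled $true
   ```

   Note that the initial password is itself validated by the DC agent, so pick one that is not banned by your policy.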
+
+1. Open a web browser (you may need to use a separate device instead of your domain controller), sign in to the [Azure portal](https://portal.azure.com), and browse to Azure Active Directory > Security > Authentication methods > Password protection.
+1. Modify the Azure AD Password Protection policy as needed for the testing you want to perform. For example, you may decide to configure either Enforced or Audit Mode, or you may decide to modify the list of banned terms in your custom banned passwords list.
+1. Synchronize the new policy by stopping and restarting the DC agent service.
+
+ This step can be accomplished in various ways. One way is to use the Service Management administrative console, by right-clicking the Azure AD Password Protection DC Agent service and choosing "Restart". Another is from the command prompt window:
+
+ ```text
+ net stop AzureADPasswordProtectionDCAgent && net start AzureADPasswordProtectionDCAgent
+ ```
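
   Equivalently, the service can be cycled with the built-in service cmdlet (using the service name shown above):

   ```powershell
   # Restart the DC agent service so it downloads the latest policy from Azure
   Restart-Service -Name AzureADPasswordProtectionDCAgent
   ```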
+
+1. Check the Event Viewer to verify that a new policy has been downloaded.
+
+ Each time the DC agent service is stopped and started, you should see two 30006 events issued in close succession. The first 30006 event will reflect the policy that was cached on disk in the sysvol share. The second 30006 event (if present) should have an updated Tenant policy date, and if so will reflect the policy that was downloaded from Azure. The Tenant policy date value is currently coded to display the approximate timestamp that the policy was downloaded from Azure.
+
+ If the second 30006 event does not appear, you should troubleshoot the problem before continuing.
+
+ The 30006 events will look similar to this example:
+
+ ```text
+ The service is now enforcing the following Azure password policy.
+
+ Enabled: 1
+ AuditOnly: 0
+ Global policy date: 2018-05-15T00:00:00.000000000Z
+ Tenant policy date: 2018-06-10T20:15:24.432457600Z
+ Enforce tenant policy: 1
+ ```
+
+ For example, changing between Enforced and Audit mode modifies the AuditOnly flag (the policy above, with AuditOnly=0, is in Enforced mode). Changes to the custom banned password list are not directly reflected in the 30006 event (and are not logged anywhere else, for security reasons); however, a policy successfully downloaded from Azure after such a change does include the modified custom banned password list.
+
+1. Run a test by trying to reset a new password on the test user account.
+
+ This step can be done from the command prompt window like so:
+
+ ```text
+ net.exe user ContosoUser <password>
+ ```
+
+ After running the command, you can get more information about the outcome of the command by looking in the event viewer. Password validation outcome events are documented in the [DC Agent Admin event log](howto-password-ban-bad-on-premises-monitor.md#dc-agent-admin-event-log) topic; you will use such events to validate the outcome of your test in addition to the interactive output from the net.exe commands.
+
+ Let's try an example: attempting to set a password that is banned by the Microsoft global list (note that the list is [not documented](concept-password-ban-bad.md#global-banned-password-list), but we can test here against a known banned term). This example assumes that you have configured the policy to be in Enforced mode, and have added zero terms to the custom banned password list.
+
+ ```text
+ net.exe user ContosoUser PassWord
+ The password does not meet the password policy requirements. Check the minimum password length, password complexity and password history requirements.
+
+ More help is available by typing NET HELPMSG 2245.
+ ```
+
+ Per the documentation, because our test was a password reset operation you should see a 10017 and a 30005 event for the ContosoUser user.
+
+ The 10017 event should look like this example:
+
+ ```text
+ The reset password for the specified user was rejected because it did not comply with the current Azure password policy. Please see the correlated event log message for more details.
+
+ UserName: ContosoUser
+ FullName:
+ ```
+
+ The 30005 event should look like this example:
+
+ ```text
+ The reset password for the specified user was rejected because it matched at least one of the tokens present in the Microsoft global banned password list of the current Azure password policy.
+
+ UserName: ContosoUser
+ FullName:
+ ```
+
+ That was fun - let's try another example! This time we will attempt to set a password that is banned by the custom banned list while the policy is in Audit mode. This example assumes that you have done the following steps: configured the policy to be in Audit mode, added the term "lachrymose" to the custom banned password list, and synchronized the resultant new policy to the domain controller by cycling the DC agent service as described above.
+
+ Ok, set a variation of the banned password:
+
+ ```text
+ net.exe user ContosoUser LaChRymoSE!1
+ The command completed successfully.
+ ```
+
+ Remember, this time it succeeded because the policy is in Audit mode. You should see a 10025 and a 30007 event for the ContosoUser user.
+
+ The 10025 event should look like this example:
+
+ ```text
+ The reset password for the specified user would normally have been rejected because it did not comply with the current Azure password policy. The current Azure password policy is configured for audit-only mode so the password was accepted. Please see the correlated event log message for more details.
+
+ UserName: ContosoUser
+ FullName:
+ ```
+
+ The 30007 event should look like this example:
+
+ ```text
+ The reset password for the specified user would normally have been rejected because it matches at least one of the tokens present in the per-tenant banned password list of the current Azure password policy. The current Azure password policy is configured for audit-only mode so the password was accepted.
+
+ UserName: ContosoUser
+ FullName:
+ ```
+
+1. Continue testing various passwords of your choice and checking the results in the event viewer using the procedures outlined in the previous steps. If you need to change the policy in the Azure portal, don't forget to synchronize the new policy down to the DC agent as described earlier.
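
The repetitive reset-and-check cycle can also be scripted. A minimal PowerShell sketch (the candidate passwords are illustrative; run it in the same elevated window on the domain controller):

```powershell
# Try each candidate password against the test account and report the outcome.
# Detailed pass/fail reasons still come from the DC Agent Admin event log.
$candidates = 'PassWord', 'LaChRymoSE!1', 'Zt4$wq!pLm88'

foreach ($pw in $candidates) {
    net.exe user ContosoUser $pw > $null 2>&1
    if ($LASTEXITCODE -eq 0) {
        Write-Output "$pw : accepted"
    } else {
        Write-Output "$pw : rejected"
    }
}
```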
+
+We've covered procedures that enable you to do controlled testing of Azure AD Password Protection's password validation behavior. Resetting user passwords from the command line directly on a domain controller may seem an odd means of doing such testing, but as described previously it is designed to produce repeatable results. As you are testing various passwords, keep the [password evaluation algorithm](concept-password-ban-bad.md#how-are-passwords-evaluated) in mind as it may help to explain results that you did not expect.
+
+> [!WARNING]
+> When all testing is completed do not forget to delete any user accounts created for testing purposes!
## Additional content

The following links are not part of the core Azure AD Password Protection documentation but may be a useful source of additional information on the feature.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/cloud-sync/how-to-inbound-synch-ms-graph https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/cloud-sync/how-to-inbound-synch-ms-graph.md
@@ -1,5 +1,5 @@
Title: 'Inbound synchronization for cloud sync using MS Graph API'
+ Title: 'How to programmatically configure cloud sync using MS Graph API'
description: This topic describes how to enable inbound synchronization using just the Graph API
@@ -12,7 +12,7 @@
-# Inbound synchronization for cloud sync using MS Graph API
+# How to programmatically configure cloud sync using MS Graph API
The following document describes how to replicate a synchronization profile from scratch using only MS Graph APIs. The process consists of the following steps:
@@ -22,6 +22,7 @@ The structure of how to do this consists of the following steps. They are:
- [Create Sync Job](#create-sync-job)
- [Update targeted domain](#update-targeted-domain)
- [Enable sync password hashes](#enable-sync-password-hashes-on-configuration-blade)
+- [Accidental deletes](#accidental-deletes)
- [Start sync job](#start-sync-job)
- [Review status](#review-status)
@@ -210,6 +211,71 @@ Here, the highlighted "Domain" value is the name of the on-premises Active Direc
Add the Schema in the request body.
+## Accidental deletes
+This section covers how to programmatically enable, disable, and use [accidental deletes](how-to-accidental-deletes.md).
++
+### Enabling and setting the threshold
+There are two per-job settings that you can use:
+
+ - DeleteThresholdEnabled - Enables accidental delete prevention for the job when set to 'true'. Set to 'true' by default.
+ - DeleteThresholdValue - Defines the maximum number of deletes that will be allowed in each execution of the job when accidental deletes prevention is enabled. The value is set to 500 by default. So, if the value is set to 500, the maximum number of deletes allowed will be 499 in each execution.
+
+The delete threshold settings are a part of the `SyncNotificationSettings` and can be modified via graph.
+
+To change these settings, update the `SyncNotificationSettings` value in the configuration's secrets:
+
+ ```
 PUT https://graph.microsoft.com/beta/servicePrincipals/[SERVICE_PRINCIPAL_ID]/synchronization/secrets
+ ```
+
 Add the following key/value pair in the value array below, based on what you're trying to do:
+
+```
+ Request body -
+ {
+ "value":[
+ {
+ "key":"SyncNotificationSettings",
+ "value": "{\"Enabled\":true,\"Recipients\":\"foobar@xyz.com\",\"DeleteThresholdEnabled\":true,\"DeleteThresholdValue\":50}"
+ }
+ ]
+ }
+```
+
+The "Enabled" setting in the example above is for enabling/disabling notification emails when the job is quarantined.
++
+Currently, PATCH requests for secrets are not supported, so you need to include all the values in the body of the PUT request (as in the example above) in order to preserve the other values.
+
+The existing values for all the secrets can be retrieved by using:
+
+```
+GET https://graph.microsoft.com/beta/servicePrincipals/{id}/synchronization/secrets
+```
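+
Since PATCH is not supported, a read-modify-write pattern helps avoid accidentally dropping other secrets. A hedged PowerShell sketch (`$token` and `$spId` are assumed to already hold a valid Graph access token and the service principal's object ID):

```powershell
$headers = @{ Authorization = "Bearer $token" }
$uri = "https://graph.microsoft.com/beta/servicePrincipals/$spId/synchronization/secrets"

# Read the current secrets so unrelated values can be preserved
$current = Invoke-RestMethod -Method Get -Uri $uri -Headers $headers

# Replace (or add) the SyncNotificationSettings entry
$settings = '{"Enabled":true,"Recipients":"foobar@xyz.com","DeleteThresholdEnabled":true,"DeleteThresholdValue":50}'
$values = @($current.value | Where-Object { $_.key -ne 'SyncNotificationSettings' })
$values += @{ key = 'SyncNotificationSettings'; value = $settings }

# Write the full set back in a single PUT
$body = @{ value = $values } | ConvertTo-Json -Depth 5
Invoke-RestMethod -Method Put -Uri $uri -Headers $headers -Body $body -ContentType 'application/json'
```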
+
+### Allowing deletes
+To allow the deletes to flow through after the job goes into quarantine, you need to issue a restart with just "ForceDeletes" as the scope.
+
+```
+Request:
+POST https://graph.microsoft.com/beta/servicePrincipals/{id}/synchronization/jobs/{jobId}/restart
+```
+
+```
+Request Body:
+{
+ "criteria": {"resetScope": "ForceDeletes"}
+}
+```
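+
The same restart can be issued from PowerShell (again assuming `$token`, `$spId`, and `$jobId` are already populated):

```powershell
# Restart the sync job with only ForceDeletes in scope
$body = @{ criteria = @{ resetScope = 'ForceDeletes' } } | ConvertTo-Json
Invoke-RestMethod -Method Post `
    -Uri "https://graph.microsoft.com/beta/servicePrincipals/$spId/synchronization/jobs/$jobId/restart" `
    -Headers @{ Authorization = "Bearer $token" } `
    -Body $body -ContentType 'application/json'
```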
## Start sync job

The job can be retrieved again via the following command:
active-directory https://docs.microsoft.com/en-us/azure/active-directory/cloud-sync/reference-powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/cloud-sync/reference-powershell.md
@@ -15,44 +15,40 @@
# AADCloudSyncTools PowerShell Module for Azure AD Connect cloud sync
-With the release of public preview refresh 2, Microsoft has introduced the AADCloudSyncTools PowerShell Module. This module provides a set of useful tools that you can use to help manage your Azure AD Connect Cloud Sync deployments.
+The AADCloudSyncTools module provides a set of useful tools that you can use to help manage your Azure AD Connect Cloud Sync deployments.
## Pre-requisites

The following pre-requisites are required:
-- This module uses MSAL authentication, so it requires MSAL.PS module installed. It no longer depends on Azure AD or Azure AD Preview. To verify, in an Admin PowerShell window, execute `Get-module MSAL.PS`. If the module is installed correctly you will get a response. You can use `Install-AADCloudSyncToolsPrerequisites` to install the latest version of MSAL.PS
-- The AzureAD PowerShell module. Some of the cmdlets rely on pieces of the AzureAD PowerShell module to accomplish their tasks. To verify, in an Admin PowerShell window execute `Get-module AzureAD`. You should get a response. You can use `Install-AADCloudSyncToolsPrerequisites` to install the latest version of the AzureAD PowerShell module.
-- Installing modules from PowerShell may enforce using TLS 1.2. To ensure you can install modules, set the following:
-`[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12 `
+
+- All the prerequisites for this module can be automatically installed using `Install-AADCloudSyncToolsPrerequisites`
+- This module uses MSAL authentication, so it requires the MSAL.PS module to be installed. To verify, in a PowerShell window, execute `Get-module MSAL.PS -ListAvailable`. If the module is installed correctly, you will get a response. You can use `Install-AADCloudSyncToolsPrerequisites` to install the latest version of MSAL.PS.
+- Although the AzureAD PowerShell module is not a pre-requisite for any functionality of this module, it is useful to have, so it is also installed automatically by `Install-AADCloudSyncToolsPrerequisites`.
+- Manually installing modules from PowerShell requires TLS 1.2 enforcement. To ensure you can install modules, set the following in the PowerShell session before using `Install-Module`:
+  ```
+  [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
+  ```
+
## Install the AADCloudSyncTools PowerShell module

To install and use the AADCloudSyncTools module, use the following steps:
-1. Open Windows PowerShell with administrative privileges
-2. Type `[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12` and hit enter.
-3. Type or copy and paste the following:
- ``` powershell
- Import-module -Name "C:\Program Files\Microsoft Azure AD Connect Provisioning Agent\Utility\AADCloudSyncTools"
- ```
-3. Hit enter.
-4. To verify the module was installed, enter or copy and paste the following"
- ```powershell
- Get-module AADCloudSyncTools
- ```
-5. You should now see information about the module.
-6. Next run
- ``` powershell
- Install-AADCloudSyncToolsPrerequisites
- ```
-7. This will install the PowerShell Get modules. Close the PowerShell Window.
-8. Open Windows PowerShell with administrative privileges
-9. Import the module again using step 3.
-10. Run `Install-AADCloudSyncToolsPrerequisites` to install the MSAL and AzureAD modules
+1. Open Windows PowerShell with administrative privileges
+2. Type or copy and paste the following: `Import-module -Name "C:\Program Files\Microsoft Azure AD Connect Provisioning Agent\Utility\AADCloudSyncTools"`
+3. Hit enter.
+4. To verify the module was imported, enter or copy and paste the following: `Get-module AADCloudSyncTools`
+5. You should now see information about the module.
+6. Next, to install the AADCloudSyncTools module pre-requisites run: `Install-AADCloudSyncToolsPrerequisites`
+7. On the first run, the PowerShellGet module will be installed if not present. To load the new PowerShellGet module, close the PowerShell window and open a new PowerShell session with administrative privileges.
+8. Import the module again using step 3.
+9. Run `Install-AADCloudSyncToolsPrerequisites` to install the MSAL and AzureAD modules
10. All pre-reqs should be successfully installed. ![Install module](media/reference-powershell/install-1.png)

## AADCloudSyncTools Cmdlets

### Connect-AADCloudSyncTools
-Uses AzureAD module to connect to Azure AD and the MSAL.PS module to request a token for Microsoft Graph
+Uses the MSAL.PS module to request a token for the Azure AD administrator to access Microsoft Graph
### Export-AADCloudSyncToolsLogs
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-certificate-credentials https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/active-directory-certificate-credentials.md
@@ -97,12 +97,12 @@ In the Azure app registration for the client application:
### Updating the application manifest
-Having hold of a certificate, you need to compute:
+After acquiring a certificate, compute these values:
- `$base64Thumbprint` - Base64-encoded value of the certificate hash - `$base64Value` - Base64-encoded value of the certificate raw data
-You also need to provide a GUID to identify the key in the application manifest (`$keyId`).
+Provide a GUID to identify the key in the application manifest (`$keyId`).
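+
These values can be computed with a few lines of PowerShell (a sketch; the certificate path is a placeholder for your own .cer file):

```powershell
# Load the certificate from disk
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\certs\example.cer")

# Base64-encoded certificate hash and raw certificate data
$base64Thumbprint = [System.Convert]::ToBase64String($cert.GetCertHash())
$base64Value      = [System.Convert]::ToBase64String($cert.GetRawCertData())

# New GUID to identify the key in the application manifest
$keyId = [System.Guid]::NewGuid().ToString()
```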
In the Azure app registration for the client application:

1. Select **Manifest** to open the application manifest.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-enterprise-app-role-management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/active-directory-enterprise-app-role-management.md
@@ -26,7 +26,7 @@ By using Azure Active Directory (Azure AD), you can customize the claim type for
## When to use this feature
-If your application expects custom roles to be passed in a SAML response, you need to use this feature. You can create as many roles as you need to be passed back from Azure AD to your application.
+Use this feature if your application expects custom roles in the SAML response returned by Azure AD. You can create as many roles as you need.
## Create roles for an application
@@ -135,7 +135,7 @@ If your application expects custom roles to be passed in a SAML response, you ne
!["Edit Assignment" pane and "Select Role" pane](./media/active-directory-enterprise-app-role-management/graph-explorer-new6.png)
- You need to refresh your session in the Azure portal to see new roles.
+ Refresh your session in the Azure portal to see new roles.
1. Update the **Attributes** table to define a customized mapping of the role claim.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/api-find-an-api-how-to https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/api-find-an-api-how-to.md
@@ -16,7 +16,7 @@
# How to find a specific API needed for a custom-developed application
-Access to APIs require configuration of access scopes and roles. If you want to expose your resource application web APIs to client applications, you need to configure access scopes and roles for the API. If you want a client application to access a web API, you need to configure permissions to access the API in the app registration.
+Access to APIs requires configuration of access scopes and roles. If you want to expose your resource application web APIs to client applications, configure access scopes and roles for the API. If you want a client application to access a web API, configure permissions to access the API in the app registration.
## Configuring a resource application to expose web APIs
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/authentication-flows-app-scenarios https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/authentication-flows-app-scenarios.md
@@ -159,7 +159,7 @@ For more information, see [Mobile app that calls web APIs](scenario-mobile-overv
You can use the Microsoft identity platform endpoint to secure web services like your app's RESTful web API. A protected web API is called through an access token. The token helps secure the API's data and authenticate incoming requests. The caller of a web API appends an access token in the authorization header of an HTTP request.
-If you want to protect your ASP.NET or ASP.NET Core web API, you need to validate the access token. For this validation, you use the ASP.NET JWT middleware. The validation is done by the [IdentityModel extensions for .NET](https://github.com/AzureAD/azure-activedirectory-identitymodel-extensions-for-dotnet/wiki) library and not by MSAL.NET.
+If you want to protect your ASP.NET or ASP.NET Core web API, validate the access token. For this validation, you use the ASP.NET JWT middleware. The validation is done by the [IdentityModel extensions for .NET](https://github.com/AzureAD/azure-activedirectory-identitymodel-extensions-for-dotnet/wiki) library and not by MSAL.NET.
For more information, see [Protected web API](scenario-protected-web-api-overview.md).
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/authentication-vs-authorization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/authentication-vs-authorization.md
@@ -31,7 +31,7 @@ This article defines authentication and authorization. It also briefly covers ho
## Authentication and authorization using the Microsoft identity platform
-Creating apps that each maintain their own username and password information incurs a high administrative burden when you need to add or remove users across multiple apps. Instead, your apps can delegate that responsibility to a centralized identity provider.
+Creating apps that each maintain their own username and password information incurs a high administrative burden when adding or removing users across multiple apps. Instead, your apps can delegate that responsibility to a centralized identity provider.
Azure Active Directory (Azure AD) is a centralized identity provider in the cloud. Delegating authentication and authorization to it enables scenarios such as:
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-add-branding-in-azure-ad-apps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-add-branding-in-azure-ad-apps.md
@@ -23,7 +23,7 @@ When developing applications with the Microsoft identity platform, you'll need t
In this article, you will:

- Learn about the two kinds of user accounts managed by Microsoft and how to refer to Azure AD accounts in your application
-- Find out what you need to do to add the Microsoft logo for use in your app
+- Learn the requirements for using the Microsoft logo in your app
- Download the official **Sign in** or **Sign in with Microsoft** images to use in your app
- Learn about the branding and navigation do's and don'ts
@@ -106,4 +106,4 @@ To download the official images for use in your app, right-click the one you wan
## Navigation Do's and Don'ts
-**DO** provide a way for users to sign out and switch to another user account. While most people have a single personal account from Microsoft/Facebook/Google/Twitter, people are often associated with more than one organization. Support for multiple signed-in users is coming soon.
+**DO** provide a way for users to sign out and switch to another user account. While most people have a single personal account from Microsoft/Facebook/Google/Twitter, people are often associated with more than one organization. Support for multiple signed-in users is coming soon.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-authenticate-service-principal-powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-authenticate-service-principal-powershell.md
@@ -68,7 +68,7 @@ The example sleeps for 20 seconds to allow some time for the new service princip
You can scope the role assignment to a specific resource group by using the **ResourceGroupName** parameter. You can scope to a specific resource by also using the **ResourceType** and **ResourceName** parameters.
-If you **do not have Windows 10 or Windows Server 2016**, you need to download the [Self-signed certificate generator](https://gallery.technet.microsoft.com/scriptcenter/Self-signed-certificate-5920a7c6/) from Microsoft Script Center. Extract its contents and import the cmdlet you need.
+If you **do not have Windows 10 or Windows Server 2016**, download the [Self-signed certificate generator](https://gallery.technet.microsoft.com/scriptcenter/Self-signed-certificate-5920a7c6/) from Microsoft Script Center. Extract its contents and import the cmdlet you need.
```powershell
# Only run if you could not use New-SelfSignedCertificate
@@ -87,7 +87,7 @@ $cert = Get-ChildItem -path Cert:\CurrentUser\my | where {$PSitem.Subject -eq 'C
### Provide certificate through automated PowerShell script
-Whenever you sign in as a service principal, you need to provide the tenant ID of the directory for your AD app. A tenant is an instance of Azure AD.
+Whenever you sign in as a service principal, provide the tenant ID of the directory for your AD app. A tenant is an instance of Azure AD.
```powershell $TenantId = (Get-AzSubscription -SubscriptionName "Contoso Default").TenantId
@@ -147,7 +147,7 @@ Param (
``` ### Provide certificate through automated PowerShell script
-Whenever you sign in as a service principal, you need to provide the tenant ID of the directory for your AD app. A tenant is an instance of Azure AD.
+Whenever you sign in as a service principal, provide the tenant ID of the directory for your AD app. A tenant is an instance of Azure AD.
```powershell Param (
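The two hunks above both turn on passing the tenant ID when signing in as a service principal: the tenant ID selects the tenant-specific authority the token request targets. A minimal sketch of that URL construction (the helper name and the GUID are made-up placeholders, not a real tenant):

```python
# Sketch: build the tenant-specific authority URL that a service principal
# sign-in targets. The tenant ID below is a placeholder GUID.
def build_authority(tenant_id: str,
                    instance: str = "https://login.microsoftonline.com") -> str:
    # Normalize the instance so we never emit a double slash before the tenant.
    return f"{instance.rstrip('/')}/{tenant_id}"

authority = build_authority("00000000-1111-2222-3333-444444444444")
```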
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-convert-app-to-be-multi-tenant https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-convert-app-to-be-multi-tenant.md
@@ -121,7 +121,7 @@ App-only permissions always require a tenant administrator's consent. If your
Certain delegated permissions also require a tenant administrator's consent. For example, the ability to write back to Azure AD as the signed-in user requires a tenant administrator's consent. Like app-only permissions, if an ordinary user tries to sign in to an application that requests a delegated permission that requires administrator consent, your application receives an error. Whether a permission requires admin consent is determined by the developer that published the resource, and can be found in the documentation for the resource. The permissions documentation for the [Microsoft Graph API][MSFT-Graph-permission-scopes] indicates which permissions require admin consent.
-If your application uses permissions that require admin consent, you need to have a gesture such as a button or link where the admin can initiate the action. The request your application sends for this action is the usual OAuth2/OpenID Connect authorization request that also includes the `prompt=admin_consent` query string parameter. Once the admin has consented and the service principal is created in the customer's tenant, subsequent sign-in requests do not need the `prompt=admin_consent` parameter. Since the administrator has decided the requested permissions are acceptable, no other users in the tenant are prompted for consent from that point forward.
+If your application uses permissions that require admin consent, have a gesture such as a button or link where the admin can initiate the action. The request your application sends for this action is the usual OAuth2/OpenID Connect authorization request that also includes the `prompt=admin_consent` query string parameter. Once the admin has consented and the service principal is created in the customer's tenant, subsequent sign-in requests do not need the `prompt=admin_consent` parameter. Since the administrator has decided the requested permissions are acceptable, no other users in the tenant are prompted for consent from that point forward.
A tenant administrator can disable the ability for regular users to consent to applications. If this capability is disabled, admin consent is always required for the application to be used in the tenant. If you want to test your application with end-user consent disabled, you can find the configuration switch in the [Azure portal][AZURE-portal] in the **[User settings](https://portal.azure.com/#blade/Microsoft_AAD_IAM/StartboardApplicationsMenuBlade/UserSettings/menuId/)** section under **Enterprise applications**.
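As the hunk above notes, the admin-consent gesture is just the usual OAuth2/OpenID Connect authorization request with an extra `prompt=admin_consent` query string parameter. A rough sketch of composing that URL (client ID and redirect URI are hypothetical placeholders):

```python
from urllib.parse import urlencode

# Sketch: compose the authorization URL that triggers the admin consent
# prompt. The client ID and redirect URI are placeholders for illustration.
def admin_consent_url(client_id: str, redirect_uri: str) -> str:
    params = {
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "prompt": "admin_consent",  # the parameter this section is about
    }
    return ("https://login.microsoftonline.com/common/oauth2/authorize?"
            + urlencode(params))

url = admin_consent_url("11111111-aaaa-bbbb-cccc-222222222222",
                        "https://contoso.example/callback")
```

Once the admin has consented, subsequent sign-in requests omit the `prompt=admin_consent` parameter.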
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-create-service-principal-portal.md
@@ -111,7 +111,7 @@ The next section shows how to get values that are needed when signing in program
## Get tenant and app ID values for signing in
-When programmatically signing in, you need to pass the tenant ID with your authentication request and the application ID. You also need a certificate or an authentication key (described in the following section). To get those values, use the following steps:
+When programmatically signing in, pass the tenant ID with your authentication request and the application ID. You also need a certificate or an authentication key (described in the following section). To get those values, use the following steps:
1. Select **Azure Active Directory**. 1. From **App registrations** in Azure AD, select your application.
@@ -158,7 +158,7 @@ To upload the certificate:
1. Select **Add**.
-After registering the certificate with your application in the application registration portal, you need to enable the client application code to use the certificate.
+After registering the certificate with your application in the application registration portal, enable the client application code to use the certificate.
### Option 2: Create a new application secret
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/migrate-adal-msal-java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/migrate-adal-msal-java.md
@@ -78,7 +78,7 @@ MSAL for Java adds a [token cache](msal-acquire-cache-tokens.md) to simplify man
In v1.0, if you use the `https://login.microsoftonline.com/common` authority, users can sign in with any Azure Active Directory (AAD) account (for any organization).
-If you use the `https://login.microsoftonline.com/common` authority in v2.0, users can sign in with any AAD organization, or even a Microsoft personal account (MSA). In MSAL for Java, if you want to restrict login to any AAD account, you need to use the `https://login.microsoftonline.com/organizations` authority (which is the same behavior as with ADAL4J). To specify an authority, set the `authority` parameter in the [PublicClientApplication.Builder](https://javadoc.io/doc/com.microsoft.azure/msal4j/1.0.0/com/microsoft/aad/msal4j/PublicClientApplication.Builder.html) method when you create your `PublicClientApplication` class.
+If you use the `https://login.microsoftonline.com/common` authority in v2.0, users can sign in with any AAD organization, or even a Microsoft personal account (MSA). In MSAL for Java, if you want to restrict login to any AAD account, use the `https://login.microsoftonline.com/organizations` authority (which is the same behavior as with ADAL4J). To specify an authority, set the `authority` parameter in the [PublicClientApplication.Builder](https://javadoc.io/doc/com.microsoft.azure/msal4j/1.0.0/com/microsoft/aad/msal4j/PublicClientApplication.Builder.html) method when you create your `PublicClientApplication` class.
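The v1.0-versus-v2.0 authority behavior described above amounts to choosing the authority path for the sign-in audience you want. A sketch of that mapping (the dictionary and helper are invented for illustration, not part of MSAL):

```python
# Sketch: map the desired v2.0 sign-in audience to an authority URL.
# "common" also admits personal Microsoft accounts (MSA); "organizations"
# restricts sign-in to Azure AD work/school accounts, matching ADAL behavior.
AUTHORITY_BY_AUDIENCE = {
    "any_account": "https://login.microsoftonline.com/common",
    "aad_only": "https://login.microsoftonline.com/organizations",
}

def authority_for(audience: str) -> str:
    return AUTHORITY_BY_AUDIENCE[audience]
```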
## v1.0 and v2.0 tokens
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/migrate-android-adal-msal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/migrate-android-adal-msal.md
@@ -68,7 +68,7 @@ In your app registration in the portal, you will see an **API permissions** tab.
With ADAL and the Azure AD v1 endpoint, user consent to resources they own was granted on first use. With MSAL and the Microsoft identity platform, consent can be requested incrementally. Incremental consent is useful for permissions that a user may consider high privilege, or may otherwise question if not provided with a clear explanation of why the permission is required. In ADAL, those permissions may have resulted in the user abandoning signing in to your app. > [!TIP]
-> We recommend the use of incremental consent in scenarios where you need to provide additional context to your user about why your app needs a permission.
+> Use incremental consent to provide additional context to your users about why your app needs a permission.
### Admin consent
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-client-application-configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-client-application-configuration.md
@@ -70,7 +70,7 @@ If you don't specify an instance, your app will target the Azure public cloud in
The sign-in audience depends on the business needs for your app: -- If you're a line of business (LOB) developer, you'll probably produce a single-tenant application that will be used only in your organization. In that case, you need to specify the organization, either by its tenant ID (the ID of your Azure AD instance) or by a domain name associated with the Azure AD instance.
+- If you're a line of business (LOB) developer, you'll probably produce a single-tenant application that will be used only in your organization. In that case, specify the organization by its tenant ID (the ID of your Azure AD instance) or by a domain name associated with the Azure AD instance.
- If you're an ISV, you might want to sign in users with their work and school accounts in any organization or in some organizations (multitenant app). But you might also want to have users sign in with their personal Microsoft accounts. ### How to specify the audience in your code/configuration
@@ -120,10 +120,9 @@ If you're a public client app developer who's using MSAL:
| UWP | value of `WebAuthenticationBroker.GetCurrentApplicationCallbackUri()`. This enables SSO with the browser by setting the value to the result of WebAuthenticationBroker.GetCurrentApplicationCallbackUri() which you need to register | | .NET Core | `https://localhost`. This enables the user to use the system browser for interactive authentication since .NET Core doesn't have a UI for the embedded web view at the moment. | -- You don't need to add a redirect URI if you're building a Xamarin Android and iOS application that doesn't support broker (the
- redirect URI is automatically set to `msal{ClientId}://auth` for Xamarin Android and iOS
+- You don't need to add a redirect URI if you're building a Xamarin Android or iOS application that doesn't support the broker. The redirect URI is automatically set to `msal{ClientId}://auth` for Xamarin Android and iOS.
-- You need to configure the redirect URI in [App registrations](https://aka.ms/appregistrations):
+- Configure the redirect URI in [App registrations](https://aka.ms/appregistrations):
![Redirect URI in App registrations](media/msal-client-application-configuration/redirect-uri.png)
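The `msal{ClientId}://auth` pattern in the hunk above is a fixed template derived from the application (client) ID. A trivial sketch (the GUID is a placeholder):

```python
# Sketch: the default MSAL redirect URI for broker-less Xamarin Android/iOS
# apps is derived from the client ID. The GUID below is a placeholder.
def default_xamarin_redirect_uri(client_id: str) -> str:
    return f"msal{client_id}://auth"

uri = default_xamarin_redirect_uri("11111111-aaaa-bbbb-cccc-222222222222")
```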
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-compare-msal-js-and-adal-js https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-compare-msal-js-and-adal-js.md
@@ -45,7 +45,7 @@ However, you still need to use ADAL.js if your application needs to sign in user
In v1.0, using the `https://login.microsoftonline.com/common` authority will allow users to sign in with any Azure AD account (for any organization).
-In v2.0, using the `https://login.microsoftonline.com/common` authority, will allow users to sign in with any Azure AD organization account or a Microsoft personal account (MSA). To restrict the sign in to only Azure AD accounts (same behavior as with ADAL.js), you need to use `https://login.microsoftonline.com/organizations`. For details, see the `authority` config option in [Initialize using MSAL.js](msal-js-initializing-client-applications.md).
+In v2.0, using the `https://login.microsoftonline.com/common` authority will allow users to sign in with any Azure AD organization account or a Microsoft personal account (MSA). To restrict sign-in to only Azure AD accounts (same behavior as with ADAL.js), use `https://login.microsoftonline.com/organizations`. For details, see the `authority` config option in [Initialize using MSAL.js](msal-js-initializing-client-applications.md).
### Scopes for acquiring tokens * Scope instead of resource parameter in authentication requests to acquire tokens
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-error-handling-dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-error-handling-dotnet.md
@@ -33,8 +33,8 @@ Here are the common exceptions that might be thrown and some possible mitigation
| Exception | Error code | Mitigation| | | | |
-| [MsalUiRequiredException](/dotnet/api/microsoft.identity.client.msaluirequiredexception) | AADSTS65001: The user or administrator has not consented to use the application with ID '{appId}' named '{appName}'. Send an interactive authorization request for this user and resource.| You need to get user consent first. If you aren't using .NET Core (which doesn't have any Web UI), call (once only) `AcquireTokeninteractive`. If you are using .NET core or don't want to do an `AcquireTokenInteractive`, the user can navigate to a URL to give consent: `https://login.microsoftonline.com/common/oauth2/v2.0/authorize?client_id={clientId}&response_type=code&scope=user.read`. to call `AcquireTokenInteractive`: `app.AcquireTokenInteractive(scopes).WithAccount(account).WithClaims(ex.Claims).ExecuteAsync();`|
-| [MsalUiRequiredException](/dotnet/api/microsoft.identity.client.msaluirequiredexception) | AADSTS50079: The user is required to use [multi-factor authentication (MFA)](../authentication/concept-mfa-howitworks.md).| There is no mitigation. If MFA is configured for your tenant and Azure Active Directory (AAD) decides to enforce it, you need to fall back to an interactive flow such as `AcquireTokenInteractive` or `AcquireTokenByDeviceCode`.|
+| [MsalUiRequiredException](/dotnet/api/microsoft.identity.client.msaluirequiredexception) | AADSTS65001: The user or administrator has not consented to use the application with ID '{appId}' named '{appName}'. Send an interactive authorization request for this user and resource.| Get user consent first. If you aren't using .NET Core (which doesn't have any web UI), call `AcquireTokenInteractive` (once only). If you are using .NET Core or don't want to do an `AcquireTokenInteractive`, the user can navigate to a URL to give consent: `https://login.microsoftonline.com/common/oauth2/v2.0/authorize?client_id={clientId}&response_type=code&scope=user.read`. To call `AcquireTokenInteractive`: `app.AcquireTokenInteractive(scopes).WithAccount(account).WithClaims(ex.Claims).ExecuteAsync();`|
+| [MsalUiRequiredException](/dotnet/api/microsoft.identity.client.msaluirequiredexception) | AADSTS50079: The user is required to use [multi-factor authentication (MFA)](../authentication/concept-mfa-howitworks.md).| There is no mitigation. If MFA is configured for your tenant and Azure Active Directory (AAD) decides to enforce it, fall back to an interactive flow such as `AcquireTokenInteractive` or `AcquireTokenByDeviceCode`.|
| [MsalServiceException](/dotnet/api/microsoft.identity.client.msalserviceexception) |AADSTS90010: The grant type isn't supported over the */common* or */consumers* endpoints. Use the */organizations* or tenant-specific endpoint. You used */common*.| As explained in the message from Azure AD, the authority needs to have a tenant or otherwise */organizations*.| | [MsalServiceException](/dotnet/api/microsoft.identity.client.msalserviceexception) | AADSTS70002: The request body must contain the following parameter: `client_secret or client_assertion`.| This exception can be thrown if your application was not registered as a public client application in Azure AD. In the Azure portal, edit the manifest for your application and set `allowPublicClient` to `true`. | | [MsalClientException](/dotnet/api/microsoft.identity.client.msalclientexception)| `unknown_user Message`: Could not identify logged in user| The library was unable to query the current Windows logged-in user or this user isn't AD or Azure AD joined (work-place joined users aren't supported). Mitigation 1: on UWP, check that the application has the following capabilities: Enterprise Authentication, Private Networks (Client and Server), User Account Information. Mitigation 2: Implement your own logic to fetch the username (for example, john@contoso.com) and use the `AcquireTokenByIntegratedWindowsAuth` form that takes in the username.|
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-js-use-ie-browser https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-js-use-ie-browser.md
@@ -36,7 +36,7 @@ This is because Internet Explorer does not support JavaScript promises natively.
Deploying your application to production (for instance in Azure Web apps) normally works fine, provided the end user has accepted popups. We tested it with Internet Explorer 11. ### Running locally
-If you want to run and debug locally your application running in Internet Explorer, you need to be aware of the following considerations (assume that you want to run your application as *http://localhost:1234*):
+If you want to run and debug your application locally in Internet Explorer, be aware of the following considerations (assume that you want to run your application as *http://localhost:1234*):
- Internet Explorer has a security mechanism named "protected mode", which prevents MSAL.js from working correctly. Among the symptoms, after you sign in, the page can be redirected to http://localhost:1234/null.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-national-cloud https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-national-cloud.md
@@ -37,7 +37,7 @@ Before you start, make sure that you meet these prerequisites.
### Choose the appropriate identities
-[Azure Government](../../azure-government/index.yml) applications can use Azure AD Government identities and Azure AD Public identities to authenticate users. Because you can use any of these identities, you need to decide which authority endpoint you should choose for your scenario:
+[Azure Government](../../azure-government/index.yml) applications can use Azure AD Government identities and Azure AD Public identities to authenticate users. Because you can use any of these identities, decide which authority endpoint you should choose for your scenario:
- Azure AD Public: Commonly used if your organization already has an Azure AD Public tenant to support Microsoft 365 (Public or GCC) or another application. - Azure AD Government: Commonly used if your organization already has an Azure AD Government tenant to support Office 365 (GCC High or DoD) or is creating a new tenant in Azure AD Government.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-net-aad-b2c-considerations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-aad-b2c-considerations.md
@@ -23,7 +23,7 @@
You can use MSAL.NET to sign in users with social identities by using [Azure Active Directory B2C (Azure AD B2C)](../../active-directory-b2c/overview.md). Azure AD B2C is built around the notion of policies. In MSAL.NET, specifying a policy translates to providing an authority. -- When you instantiate the public client application, you need to specify the policy as part of the authority.
+- When you instantiate the public client application, specify the policy as part of the authority.
- When you want to apply a policy, call an override of `AcquireTokenInteractive` that accepts the `authority` parameter. This article applies to MSAL.NET 3.x. For MSAL.NET 2.x, see [Azure AD B2C specifics in MSAL 2.x](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/wiki/AAD-B2C-Specifics-MSAL-2.x) in the MSAL.NET Wiki on GitHub.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-net-instantiate-confidential-client-config-options https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-instantiate-confidential-client-config-options.md
@@ -25,7 +25,7 @@ Before initializing an application, you first need to [register](quickstart-regi
- The client ID (a string representing a GUID) - The identity provider URL (named the instance) and the sign-in audience for your application. These two parameters are collectively known as the authority.-- The tenant ID if you are writing a line of business application solely for your organization (also named single-tenant application).
+- The tenant ID if you are writing a line-of-business application solely for your organization (also named single-tenant application).
- The application secret (client secret string) or certificate (of type X509Certificate2) if it's a confidential client app. - For web apps, and sometimes for public client apps (in particular when your app needs to use a broker), you'll have also set the redirectUri where the identity provider will contact back your application with the security tokens.
@@ -57,7 +57,7 @@ An ASP.NET Core application configuration is described in an *appsettings.json*
Starting in MSAL.NET v3.x, you can configure your confidential client application from the config file.
-In the class where you want configure and instantiate your application, you need to declare a `ConfidentialClientApplicationOptions` object. Bind the configuration read from the source (including the appconfig.json file) to the instance of the application options, using the `IConfigurationRoot.Bind()` method from the [Microsoft.Extensions.Configuration.Binder nuget package](https://www.nuget.org/packages/Microsoft.Extensions.Configuration.Binder):
+In the class where you want to configure and instantiate your application, declare a `ConfidentialClientApplicationOptions` object. Bind the configuration read from the source (including the appconfig.json file) to the instance of the application options, using the `IConfigurationRoot.Bind()` method from the [Microsoft.Extensions.Configuration.Binder NuGet package](https://www.nuget.org/packages/Microsoft.Extensions.Configuration.Binder):
```csharp using Microsoft.Identity.Client;
@@ -76,7 +76,7 @@ app = ConfidentialClientApplicationBuilder.CreateWithApplicationOptions(_applica
``` ## Add runtime configuration
-In a confidential client application, you usually have a cache per user. Therefore you will need to get the cache associated with the user and inform the application builder that you want to use it. In the same way, you might have a dynamically computed redirect URI. In this case the code is the following:
+In a confidential client application, you usually have a cache per user. Therefore you will need to get the cache associated with the user and inform the application builder that you want to use it. In the same way, you might have a dynamically computed redirect URI. In this case, the code is as follows:
```csharp IConfidentialClientApplication app;
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-net-migration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-migration.md
@@ -143,7 +143,7 @@ MSAL.NET makes the token cache a sealed class, removing the ability to extend it
In v1.0, if you use the `https://login.microsoftonline.com/common` authority, you will allow users to sign in with any AAD account (for any organization). See [Authority Validation in ADAL.NET](https://github.com/AzureAD/azure-activedirectory-library-for-dotnet/wiki/AuthenticationContext:-the-connection-to-Azure-AD#authority-validation)
-If you use the `https://login.microsoftonline.com/common` authority in v2.0, you will allow users to sign in with any AAD organization or a Microsoft personal account (MSA). In MSAL.NET, if you want to restrict login to any AAD account (same behavior as with ADAL.NET), you need to use `https://login.microsoftonline.com/organizations`. For details, see the `authority` parameter in [public client application](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/wiki/Client-Applications#publicclientapplication).
+If you use the `https://login.microsoftonline.com/common` authority in v2.0, you will allow users to sign in with any AAD organization or a Microsoft personal account (MSA). In MSAL.NET, if you want to restrict login to any AAD account (same behavior as with ADAL.NET), use `https://login.microsoftonline.com/organizations`. For details, see the `authority` parameter in [public client application](https://github.com/AzureAD/microsoft-authentication-library-for-dotnet/wiki/Client-Applications#publicclientapplication).
## v1.0 and v2.0 tokens
@@ -180,7 +180,7 @@ string[] scopes = { ResourceId + "Directory.Read", ResourceId + "Directory.Write
#### Warning: Should you have one or two slashes in the scope corresponding to a v1.0 web API
-If you want to write the scope corresponding to the Azure Resource Manager API (https://management.core.windows.net/), you need to request the following scope (note the two slashes)
+If you want to write the scope corresponding to the Azure Resource Manager API (https://management.core.windows.net/), request the following scope (note the two slashes).
```csharp var scopes = new[] {"https://management.core.windows.net//user_impersonation"};
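The two slashes called out above fall out mechanically: the Azure Resource Manager resource ID keeps its trailing slash (ARM expects a slash in the token's audience claim), and a second slash then joins the resource to the scope name. Reproduced as a sketch:

```python
# Sketch: why the Azure Resource Manager scope string contains two slashes.
# The resource ID's trailing slash is required in the token's audience (aud)
# claim, and a second slash separates the resource from the scope name.
resource_id = "https://management.core.windows.net/"  # trailing slash kept
scope = resource_id + "/user_impersonation"
```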
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-net-token-cache-serialization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-token-cache-serialization.md
@@ -31,7 +31,7 @@ In MSAL.NET, an in-memory token cache is provided by default. Serialization is p
## Custom serialization for Windows desktop apps and web apps/web APIs
-Remember, custom serialization isn't available on mobile platforms (UWP, Xamarin.iOS, and Xamarin.Android). MSAL already defines a secure and performant serialization mechanism for these platforms. .NET desktop and .NET Core applications, however, have varied architectures and MSAL can't implement a general-purpose serialization mechanism. For example, web sites may choose to store tokens in a Redis cache, or desktop apps store tokens in an encrypted file. So serialization isn't provided out-of-the-box. To have a persistent token cache application in .NET desktop or .NET Core, you need to customize the serialization.
+Remember, custom serialization isn't available on mobile platforms (UWP, Xamarin.iOS, and Xamarin.Android). MSAL already defines a secure and performant serialization mechanism for these platforms. .NET desktop and .NET Core applications, however, have varied architectures and MSAL can't implement a general-purpose serialization mechanism. For example, web sites may choose to store tokens in a Redis cache, or desktop apps store tokens in an encrypted file. So serialization isn't provided out-of-the-box. To have a persistent token cache application in .NET desktop or .NET Core, customize the serialization.
The following classes and interfaces are used in token cache serialization:
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-net-use-brokers-with-xamarin-apps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-use-brokers-with-xamarin-apps.md
@@ -88,7 +88,7 @@ This method is invoked every time the application is started. It's used as an op
### Step 4: Set UIViewController()
-Still in the *AppDelegate.cs* file, you need to set an object window. You don't typically need to set the object window for Xamarin iOS, but you do need an object window to send and receive responses from the broker.
+Still in the *AppDelegate.cs* file, set an object window. You don't typically need to set the object window for Xamarin iOS, but you do need an object window to send and receive responses from the broker.
To set up the object window:
@@ -234,7 +234,7 @@ result = await app.AcquireTokenInteractive(scopes)
### Step 4: Add a redirect URI to your app registration
-MSAL uses URLs to invoke the broker and then return to your app. To complete that round trip, you need to register a **Redirect URI** for your app by using the [Azure portal](https://portal.azure.com).
+MSAL uses URLs to invoke the broker and then return to your app. To complete that round trip, register a **Redirect URI** for your app by using the [Azure portal](https://portal.azure.com).
The format of the redirect URI for your application depends on the certificate used to sign the APK. For example:
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-net-user-gets-consent-for-multiple-resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-user-gets-consent-for-multiple-resources.md
@@ -49,7 +49,7 @@ var result = await app.AcquireTokenInteractive(scopesForCustomerApi)
.ExecuteAsync(); ```
-This will get you an access token for the first web API. Then, when you need to access the second web API you can silently acquire the token from the token cache:
+This will get you an access token for the first web API. Then, to access the second web API you can silently acquire the token from the token cache:
```csharp AcquireTokenSilent(scopesForVendorApi, accounts.FirstOrDefault()).ExecuteAsync();
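The interactive-then-silent pattern above works because the token cache keys entries by account and scope set, so a second API's token can be looked up without prompting. A toy, library-free sketch of that lookup (class and method names are invented for illustration; this is not MSAL's actual cache implementation):

```python
# Toy sketch of a token cache keyed by (account, frozenset of scopes),
# illustrating the silent-acquisition lookup. Not MSAL's real cache.
class TokenCache:
    def __init__(self):
        self._entries = {}

    def store(self, account, scopes, token):
        self._entries[(account, frozenset(scopes))] = token

    def acquire_silent(self, account, scopes):
        # Returns a cached token, or None when interaction is required.
        return self._entries.get((account, frozenset(scopes)))

cache = TokenCache()
cache.store("user@contoso.example", ["customer.read"], "token-1")
```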
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-net-web-browsers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-net-web-browsers.md
@@ -38,7 +38,7 @@ It's important to understand that when acquiring a token interactively, the cont
MSAL.NET is a multi-framework library and has framework-specific code to host a browser in a UI control (for example, on .NET Classic it uses WinForms, on Xamarin it uses native mobile controls etc.). This control is called `embedded` web UI. Alternatively, MSAL.NET is also able to kick off the system OS browser.
-Generally, it's recommended that you use the platform default, and this is typically the system browser. The system browser is better at remembering the users that have logged in before. If you need to change this behavior, use `WithUseEmbeddedWebView(bool)`
+Generally, it's recommended that you use the platform default, and this is typically the system browser. The system browser is better at remembering the users that have logged in before. To change this behavior, use `WithUseEmbeddedWebView(bool)`.
### At a glance
@@ -138,7 +138,7 @@ You can also enable embedded webviews in Xamarin.iOS and Xamarin.Android apps. S
As a developer using MSAL.NET targeting Xamarin, you may choose to use either embedded webviews or system browsers. This is your choice depending on the user experience and security concerns you want to target.
-Currently, MSAL.NET doesn't yet support the Android and iOS brokers. Therefore if you need to provide single sign-on (SSO), the system browser might still be a better option. Supporting brokers with the embedded web browser is on the MSAL.NET backlog.
+Currently, MSAL.NET doesn't yet support the Android and iOS brokers. Therefore to provide single sign-on (SSO), the system browser might still be a better option. Supporting brokers with the embedded web browser is on the MSAL.NET backlog.
### Differences between embedded webview and system browser There are some visual differences between embedded webview and system browser in MSAL.NET.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-python-adfs-support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-python-adfs-support.md
@@ -54,7 +54,7 @@ When you connect directory to AD FS, the authority you'll want to use to build y
MSAL Python supports ADFS 2019.
-It does not support a direct connection to ADFS 2016 or ADFS v2. If you need to support scenarios requiring a direct connection to ADFS 2016, use the latest version of ADAL Python. Once you have upgraded your on-premises system to ADFS 2019, you can use MSAL Python.
+It does not support a direct connection to ADFS 2016 or ADFS v2. To support scenarios requiring a direct connection to ADFS 2016, use the latest version of ADAL Python. Once you have upgraded your on-premises system to ADFS 2019, you can use MSAL Python.
## Next steps
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/msal-v1-app-scopes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-v1-app-scopes.md
@@ -34,7 +34,7 @@ var scopes = new [] { ResourceId+"/user_impersonation"};
var scopes = [ ResourceId + "/user_impersonation"]; ```
-To read and write with MSAL.NET Azure AD using the Microsoft Graph API (https:\//graph.microsoft.com/), you need to create a list of scopes as shown in the following examples:
+To read from and write to Azure AD with MSAL.NET by using the Microsoft Graph API (https:\//graph.microsoft.com/), create a list of scopes as shown in the following examples:
```csharp string ResourceId = "https://graph.microsoft.com/";
@@ -46,7 +46,7 @@ var ResourceId = "https://graph.microsoft.com/";
var scopes = [ ResourceId + "Directory.Read", ResourceID + "Directory.Write"]; ```
-To write the scope corresponding to the Azure Resource Manager API (https:\//management.core.windows.net/), you need to request the following scope (note the two slashes):
+To write the scope corresponding to the Azure Resource Manager API (https:\//management.core.windows.net/), request the following scope (note the two slashes):
```csharp var scopes = new[] {"https://management.core.windows.net//user_impersonation"};
@@ -56,7 +56,7 @@ var result = await app.AcquireTokenInteractive(scopes).ExecuteAsync();
``` > [!NOTE]
-> You need to use two slashes because the Azure Resource Manager API expects a slash in its audience claim (aud), and then there is a slash to separate the API name from the scope.
+> Use two slashes because the Azure Resource Manager API expects a slash in its audience claim (aud), and then there is a slash to separate the API name from the scope.
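The same scope composition can be sketched in Python; the resource URLs come straight from the text above, and only the string concatenation is illustrated:

```python
# Sketch of the scope strings discussed above. Note how the ARM
# audience's trailing slash plus the scope separator yields the
# required double slash.
GRAPH_RESOURCE = "https://graph.microsoft.com/"
ARM_RESOURCE = "https://management.core.windows.net/"  # trailing slash is part of the audience

graph_scopes = [GRAPH_RESOURCE + "Directory.Read", GRAPH_RESOURCE + "Directory.Write"]

# The audience already ends in a slash, and a second slash separates the
# API name from the scope name -- hence the double slash:
arm_scope = ARM_RESOURCE + "/user_impersonation"

print(arm_scope)  # https://management.core.windows.net//user_impersonation
```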
The logic used by Azure AD is the following:
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-configure-app-access-web-apis https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-configure-app-access-web-apis.md
@@ -107,7 +107,7 @@ Some permissions, like Microsoft Graph's *Files.Read.All* permission, require ad
### Configure client credentials
-Apps that use application permissions authenticate as themselves by using their own credentials, without requiring any user interaction. Before your application (or API) can access Microsoft Graph, your own web API, or any another API by using application permissions, you need to configure that client app's credentials.
+Apps that use application permissions authenticate as themselves by using their own credentials, without requiring any user interaction. Before your application (or API) can access Microsoft Graph, your own web API, or any other API by using application permissions, you must configure that client app's credentials.
For more information about configuring an app's credentials, see the [Add credentials](quickstart-register-app.md#add-credentials) section of [Quickstart: Register an application with the Microsoft identity platform](quickstart-register-app.md).
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-create-new-tenant https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-create-new-tenant.md
@@ -19,57 +19,62 @@
# Quickstart: Set up a tenant
-The Microsoft identity platform allows developers to build apps targeting a wide variety of custom Microsoft 365 environments and identities. To get started using Microsoft identity platform, you will need access to an environment, also called an Azure AD tenant, that can register and manage apps, have access to Microsoft 365 data, and deploy custom Conditional Access and tenant restrictions.
+To build apps that use the Microsoft identity platform for identity and access management, you need access to an Azure Active Directory (Azure AD) *tenant*. It's in the Azure AD tenant that you register and manage your apps, configure their access to data in Microsoft 365 and other web APIs, and enable features like Conditional Access.
-A tenant is a representation of an organization. It's a dedicated instance of Azure AD that an organization or app developer receives when the organization or app developer creates a relationship with Microsoft-- like signing up for Azure, Microsoft Intune, or Microsoft 365.
+A tenant represents an organization. It's a dedicated instance of Azure AD that an organization or app developer receives at the beginning of a relationship with Microsoft. That relationship could start with signing up for Azure, Microsoft Intune, or Microsoft 365, for example.
-Each Azure AD tenant is distinct and separate from other Azure AD tenants and has its own representation of work and school identities, consumer identities (if it's an Azure AD B2C tenant), and app registrations. An app registration inside of your tenant can allow authentications from accounts only within your tenant or all tenants.
+Each Azure AD tenant is distinct and separate from other Azure AD tenants. It has its own representation of work and school identities, consumer identities (if it's an Azure AD B2C tenant), and app registrations. An app registration inside your tenant can allow authentications only from accounts within your tenant or all tenants.
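In code, the your-tenant-only versus all-tenants choice typically surfaces as the authority URL an auth library is configured with. A hedged sketch — the endpoint segments `organizations`, `common`, and `consumers` are the standard Azure AD authority values, and the tenant GUID is a placeholder:

```python
# Sketch: mapping an app registration's sign-in audience to the Azure AD
# authority URL an auth library would be configured with.
LOGIN_HOST = "https://login.microsoftonline.com"

def authority_for(audience: str, tenant_id: str = "") -> str:
    endpoints = {
        "single-tenant": tenant_id,       # accounts in this directory only
        "multi-tenant": "organizations",  # accounts in any org directory
        "multi-tenant+msa": "common",     # org directories plus personal accounts
        "consumers": "consumers",         # personal Microsoft accounts only
    }
    return f"{LOGIN_HOST}/{endpoints[audience]}"

# Placeholder tenant ID (a GUID):
print(authority_for("single-tenant", "00000000-0000-0000-0000-000000000000"))
```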
## Prerequisites -- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+An Azure account that has an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-## Determining environment type
+## Determining the environment type
-There are two types of environments you can create. Deciding which you need is based solely on the types of users your app will authenticate.
+You can create two types of environments. Which type you need depends solely on the types of users your app will authenticate.
-* Work and school (Azure AD accounts) or Microsoft accounts (such as outlook.com and live.com)
-* Social and local accounts (Azure AD B2C)
+This quickstart addresses two scenarios for the type of app you want to build:
-The quickstart is broken into two scenarios depending on the type of app you want to build.
+* Work and school (Azure AD) accounts or Microsoft accounts (such as Outlook.com and Live.com)
+* Social and local (Azure AD B2C) accounts
## Work and school accounts, or personal Microsoft accounts
-### Use an existing tenant
+To build an environment for either work and school accounts or personal Microsoft accounts, you can use an existing Azure AD tenant or create a new one.
+### Use an existing Azure AD tenant
-Many developers already have tenants through services or subscriptions that are tied to Azure AD tenants such as Microsoft 365 or Azure subscriptions.
+Many developers already have tenants through services or subscriptions that are tied to Azure AD tenants, such as Microsoft 365 or Azure subscriptions.
-1. To check the tenant, sign in to the <a href="https://portal.azure.com/" target="_blank">Azure portal<span class="docon docon-navigate-external x-hidden-focus"></span></a> with the account you want to use to manage your application.
-1. Check the upper right corner. If you have a tenant, you'll automatically be logged in and can see the tenant name directly under your account name.
- * Hover over your account name on the upper right-hand side of the Azure portal to see your name, email, directory / tenant ID (a GUID), and your domain.
+To check the tenant:
+
+1. Sign in to the <a href="https://portal.azure.com/" target="_blank">Azure portal<span class="docon docon-navigate-external x-hidden-focus"></span></a>. Use the account you'll use to manage your application.
+1. Check the upper-right corner. If you have a tenant, you'll automatically be signed in. You see the tenant name directly under your account name.
+ * Hover over your account name to see your name, email address, directory or tenant ID (a GUID), and domain.
* If your account is associated with multiple tenants, you can select your account name to open a menu where you can switch between tenants. Each tenant has its own tenant ID. > [!TIP]
-> If you need to find the tenant ID, you can:
-> * Hover over your account name to get the directory / tenant ID, or
-> * Search and select **Azure Active Directory > Properties > Tenant ID** in the Azure portal
+> To find the tenant ID, you can:
+> * Hover over your account name to get the directory or tenant ID.
+> * Search and select **Azure Active Directory** > **Properties** > **Tenant ID** in the Azure portal.
-If you don't have an existing tenant associated with your account, you'll see a GUID under your account name and you won't be able to perform actions like registering apps until you follow the steps of the next section.
+If you don't have a tenant associated with your account, you'll see a GUID under your account name. You won't be able to do actions like registering apps until you create an Azure AD tenant.
### Create a new Azure AD tenant
-If you don't already have an Azure AD tenant or want to create a new one for development, see the [quickstart](../fundamentals/active-directory-access-create-new-tenant.md) or simply follow the [directory creation experience](https://portal.azure.com/#create/Microsoft.AzureActiveDirectory). You will have to provide the following info to create your new tenant:
+If you don't already have an Azure AD tenant or if you want to create a new one for development, see [Create a new tenant in Azure AD](../fundamentals/active-directory-access-create-new-tenant.md). Or use the [directory creation experience](https://portal.azure.com/#create/Microsoft.AzureActiveDirectory) in the Azure portal.
+
+You'll provide the following information to create your new tenant:
- **Organization name**-- **Initial domain** - this will be part of *.onmicrosoft.com. You can customize the domain more later.
+- **Initial domain** - This domain is part of *.onmicrosoft.com. You can customize the domain later.
- **Country or region** > [!NOTE]
-> When naming your tenant, use alphanumeric characters. Special characters are not allowed. The name must not exceed 256 characters.
+> When naming your tenant, use alphanumeric characters. Special characters aren't allowed. The name must not exceed 256 characters.
## Social and local accounts
-To begin building apps that sign in social and local accounts, you'll need to create an Azure AD B2C tenant. To begin, follow [creating an Azure AD B2C tenant](../../active-directory-b2c/tutorial-create-tenant.md).
+To begin building apps that sign in social and local accounts, create an Azure AD B2C tenant. To begin, see [Create an Azure AD B2C tenant](../../active-directory-b2c/tutorial-create-tenant.md).
## Next steps
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-register-app https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-register-app.md
@@ -22,12 +22,12 @@
In this quickstart, you register an app in the Azure portal so the Microsoft identity platform can provide authentication and authorization services for your application and its users.
-Each application you want the Microsoft identity platform to perform identity and access management (IAM) for needs to be registered. Whether it's a client application like a web or mobile app, or it's a web API that backs a client app, registering it establishes a trust relationship between your application and the identity provider, the Microsoft identity platform.
+The Microsoft identity platform performs identity and access management (IAM) only for registered applications. Whether it's a client application like a web or mobile app, or it's a web API that backs a client app, registering it establishes a trust relationship between your application and the identity provider, the Microsoft identity platform.
## Prerequisites
-* An Azure account with an active subscription - [create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F)
-* Completion of [Quickstart: Set up a tenant](quickstart-create-new-tenant.md)
+* An Azure account that has an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+* Completion of the [Set up a tenant](quickstart-create-new-tenant.md) quickstart.
## Register an application
@@ -36,33 +36,33 @@ Registering your application establishes a trust relationship between your app a
Follow these steps to create the app registration: 1. Sign in to the <a href="https://portal.azure.com/" target="_blank">Azure portal<span class="docon docon-navigate-external x-hidden-focus"></span></a>.
-1. If you have access to multiple tenants, use the **Directory + subscription** filter :::image type="icon" source="./media/common/portal-directory-subscription-filter.png" border="false"::: in the top menu to select the tenant in which you want to register an application.
+1. If you have access to multiple tenants, in the top menu, use the **Directory + subscription** filter :::image type="icon" source="./media/common/portal-directory-subscription-filter.png" border="false"::: to select the tenant in which you want to register an application.
1. Search for and select **Azure Active Directory**.
1. Under **Manage**, select **App registrations** > **New registration**.
-1. Enter a **Name** for your application. Users of your app might see this name, and you can change it later.
-1. Specify who can use the application, sometimes referred to as the *sign-in audience*.
+1. Enter a **Name** for your application. Users of your app might see this name. You can change it later.
+1. Specify who can use the application, sometimes called its *sign-in audience*.
| Supported account types | Description |
|-|-|
- | **Accounts in this organizational directory only** | Select this option if you're building an application for use only by users (or guests) in *your* tenant.<br><br>Often called a *line-of-business* (LOB) application, this is a **single-tenant** application in the Microsoft identity platform. |
- | **Accounts in any organizational directory** | Select this option if you'd like users in *any* Azure AD tenant to be able to use your application. This option is appropriate if, for example, you're building a software-as-a-service (SaaS) application that you intend to provide to multiple organizations.<br><br>This is known as a **multi-tenant** application in the Microsoft identity platform. |
- | **Accounts in any organizational directory and personal Microsoft accounts** | Select this option to target the widest set of customers.<br><br>By selecting this option, you're registering a **multi-tenant** application that can also support users with personal **Microsoft accounts** (MSA). |
- | **Personal Microsoft accounts** | Select this option if you're building an application for use only by users with personal Microsoft accounts. Personal Microsoft accounts include Skype, Xbox, Live, and Hotmail accounts. |
+ | **Accounts in this organizational directory only** | Select this option if you're building an application for use only by users (or guests) in *your* tenant.<br><br>Often called a *line-of-business* (LOB) application, this app is a *single-tenant* application in the Microsoft identity platform. |
+ | **Accounts in any organizational directory** | Select this option if you want users in *any* Azure Active Directory (Azure AD) tenant to be able to use your application. This option is appropriate if, for example, you're building a software-as-a-service (SaaS) application that you intend to provide to multiple organizations.<br><br>This type of app is known as a *multitenant* application in the Microsoft identity platform. |
+ | **Accounts in any organizational directory and personal Microsoft accounts** | Select this option to target the widest set of customers.<br><br>By selecting this option, you're registering a *multitenant* application that can also support users who have personal *Microsoft accounts*. |
+ | **Personal Microsoft accounts** | Select this option if you're building an application only for users who have personal Microsoft accounts. Personal Microsoft accounts include Skype, Xbox, Live, and Hotmail accounts. |
-1. Don't enter anything for **Redirect URI (optional)**, you'll configure one in the next section.
+1. Don't enter anything for **Redirect URI (optional)**. You'll configure a redirect URI in the next section.
1. Select **Register** to complete the initial app registration.
- :::image type="content" source="media/quickstart-register-app/portal-02-app-reg-01.png" alt-text="Screenshot of the Azure portal in a web browser showing the Register an application pane.":::
+ :::image type="content" source="media/quickstart-register-app/portal-02-app-reg-01.png" alt-text="Screenshot of the Azure portal in a web browser, showing the Register an application pane.":::
-When registration completes, the Azure portal displays the app registration's **Overview** pane, which includes its **Application (client) ID**. Also referred to as just *client ID*, this value uniquely identifies your application in the Microsoft identity platform.
+When registration finishes, the Azure portal displays the app registration's **Overview** pane. You see the **Application (client) ID**. Also called the *client ID*, this value uniquely identifies your application in the Microsoft identity platform.
-Your application's code, or more typically an authentication library used in your application, also uses the client ID as one aspect in validating the security tokens it receives from the identity platform.
+Your application's code, or more typically an authentication library used in your application, also uses the client ID. The ID is used as part of validating the security tokens the app receives from the identity platform.
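One simplified illustration of how the client ID participates in token validation: among other checks, a library compares the token's audience (`aud`) claim with the app's client ID. The claims dictionary and GUID below are fabricated for illustration; real validation also covers issuer, signature, and expiry.

```python
# Simplified illustration, not a real validation routine: one of the
# checks a library performs is comparing the token's audience claim
# to the registered application's client ID.
CLIENT_ID = "11111111-1111-1111-1111-111111111111"  # placeholder client ID

def audience_matches(decoded_claims: dict, client_id: str) -> bool:
    return decoded_claims.get("aud") == client_id

claims = {"aud": CLIENT_ID}  # pretend these came from a decoded token
print(audience_matches(claims, CLIENT_ID))  # True
```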
## Add a redirect URI
-A redirect URI is the location where the Microsoft identity platform redirects a user's client and sends security tokens after authentication.
+A *redirect URI* is the location where the Microsoft identity platform redirects a user's client and sends security tokens after authentication.
In a production web application, for example, the redirect URI is often a public endpoint where your app is running, like `https://contoso.com/auth-response`. During development, it's common to also add the endpoint where you run your app locally, like `https://127.0.0.1/auth-response` or `http://localhost/auth-response`.
@@ -70,65 +70,66 @@ You add and modify redirect URIs for your registered applications by configuring
### Configure platform settings
-Settings for each application type, including redirect URIs, are configured in **Platform configurations** in the Azure portal. Some platforms, like **Web** and **Single-page applications**, require you to manually specify a redirect URI. For other platforms like mobile and desktop, you can select from redirect URIs generated for you when you configure their other settings.
+Settings for each application type, including redirect URIs, are configured in **Platform configurations** in the Azure portal. Some platforms, like **Web** and **Single-page applications**, require you to manually specify a redirect URI. For other platforms, like mobile and desktop, you can select from redirect URIs generated for you when you configure their other settings.
To configure application settings based on the platform or device you're targeting:
-1. Select your application in **App registrations** in the Azure portal.
+1. In the Azure portal, in **App registrations**, select your application.
1. Under **Manage**, select **Authentication**.
1. Under **Platform configurations**, select **Add a platform**.
-1. In **Configure platforms**, select the tile for your application type (platform) to configure its settings.
+1. Under **Configure platforms**, select the tile for your application type (platform) to configure its settings.
- :::image type="content" source="media/quickstart-register-app/portal-04-app-reg-03-platform-config.png" alt-text="Screenshot of the Platform configuration pane in the Azure portal" border="false":::
+ :::image type="content" source="media/quickstart-register-app/portal-04-app-reg-03-platform-config.png" alt-text="Screenshot of the platform configuration pane in the Azure portal." border="false":::
| Platform | Configuration settings |
| -- | - |
- | **Web** | Enter a **Redirect URI** for your app, the location where the Microsoft identity platform redirects a user's client and sends security tokens after authentication.<br/><br/>Select this platform for standard web applications that run on a server. |
- | **Single-page application** | Enter a **Redirect URI** for your app, the location where the Microsoft identity platform redirects a user's client and sends security tokens after authentication.<br/><br/>Select this platform if you're building a client-side web app in JavaScript or with a framework like Angular, Vue.js, React.js, or Blazor WebAssembly. |
- | **iOS / macOS** | Enter the app **Bundle ID**, found in XCode in *Info.plist* or Build Settings.<br/><br/>A redirect URI is generated for you when you specify a Bundle ID. |
- | **Android** | Enter the app **Package name**, which you can find in the *AndroidManifest.xml* file, and generate and enter the **Signature hash**.<br/><br/>A redirect URI is generated for you when you specify these settings. |
- | **Mobile and desktop applications** | Select one of the **Suggested redirect URIs** or specify a **Custom redirect URI**.<br/>For desktop applications, we recommend:<br/>`https://login.microsoftonline.com/common/oauth2/nativeclient`<br/><br/>Select this platform for mobile applications that aren't using the latest Microsoft Authentication Library (MSAL) or are not using a broker. Also select this platform for desktop applications. |
+ | **Web** | Enter a **Redirect URI** for your app. This URI is the location where the Microsoft identity platform redirects a user's client and sends security tokens after authentication.<br/><br/>Select this platform for standard web applications that run on a server. |
+ | **Single-page application** | Enter a **Redirect URI** for your app. This URI is the location where the Microsoft identity platform redirects a user's client and sends security tokens after authentication.<br/><br/>Select this platform if you're building a client-side web app by using JavaScript or a framework like Angular, Vue.js, React.js, or Blazor WebAssembly. |
+ | **iOS / macOS** | Enter the app **Bundle ID**. Find it in **Build Settings** or in Xcode in *Info.plist*.<br/><br/>A redirect URI is generated for you when you specify a **Bundle ID**. |
+ | **Android** | Enter the app **Package name**. Find it in the *AndroidManifest.xml* file. Also generate and enter the **Signature hash**.<br/><br/>A redirect URI is generated for you when you specify these settings. |
+ | **Mobile and desktop applications** | Select one of the **Suggested redirect URIs**. Or specify a **Custom redirect URI**.<br/><br/>For desktop applications, we recommend<br/>`https://login.microsoftonline.com/common/oauth2/nativeclient`<br/><br/>Select this platform for mobile applications that aren't using the latest Microsoft Authentication Library (MSAL) or aren't using a broker. Also select this platform for desktop applications. |
1. Select **Configure** to complete the platform configuration.

### Redirect URI restrictions
-There are certain restrictions on the format of the redirect URIs you add to an app registration. For details on these restrictions, see [Redirect URI (reply URL) restrictions and limitations](reply-url.md).
+There are some restrictions on the format of the redirect URIs you add to an app registration. For details about these restrictions, see [Redirect URI (reply URL) restrictions and limitations](reply-url.md).
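One of the best-known restrictions can be sketched as a quick local check: redirect URIs must use HTTPS, with `http` permitted only for localhost addresses used during development. This sketch covers that single rule, not the full list in the linked article.

```python
# Sketch of one redirect URI rule: https is required, except that http
# is allowed for localhost addresses used during development.
from urllib.parse import urlparse

def scheme_allowed(redirect_uri: str) -> bool:
    parsed = urlparse(redirect_uri)
    if parsed.scheme == "https":
        return True
    return parsed.scheme == "http" and parsed.hostname in ("localhost", "127.0.0.1")

print(scheme_allowed("https://contoso.com/auth-response"))  # True
print(scheme_allowed("http://localhost/auth-response"))     # True
print(scheme_allowed("http://contoso.com/auth-response"))   # False
```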
## Add credentials
-Credentials are used by [confidential client applications](msal-client-applications.md) that access a web API. Examples of confidential clients are [web apps](scenario-web-app-call-api-overview.md), other [web APIs](scenario-protected-web-api-overview.md), or [service- and daemon-type applications](scenario-daemon-overview.md). Credentials allow your application to authenticate as itself, requiring no interaction from a user at runtime.
+Credentials are used by [confidential client applications](msal-client-applications.md) that access a web API. Examples of confidential clients are [web apps](scenario-web-app-call-api-overview.md), other [web APIs](scenario-protected-web-api-overview.md), or [service-type and daemon-type applications](scenario-daemon-overview.md). Credentials allow your application to authenticate as itself, requiring no interaction from a user at runtime.
You can add both certificates and client secrets (a string) as credentials to your confidential client app registration.

### Add a certificate
-Sometimes called a *public key*, certificates are the recommended credential type as they provide a higher level of assurance than a client secret. For more information on details about using certificate as an authentication method in your application , see [Microsoft identity platform application authentication certificate credentials](active-directory-certificate-credentials.md)
+Sometimes called a *public key*, a certificate is the recommended credential type. It provides more assurance than a client secret. For more information about using a certificate as an authentication method in your application, see [Microsoft identity platform application authentication certificate credentials](active-directory-certificate-credentials.md).
-1. Select your application in **App registrations** in the Azure portal.
+1. In the Azure portal, in **App registrations**, select your application.
1. Select **Certificates & secrets** > **Upload certificate**.
-1. Select the file you'd like to upload. It must be one of the following file types: .cer, .pem, .crt.
+1. Select the file you want to upload. It must be one of the following file types: *.cer*, *.pem*, *.crt*.
1. Select **Add**.

### Add a client secret
-The client secret, known also as an *application password*, is a string value your app can use in place of a certificate to identity itself. It's the easier of the two credential types to use and is often used during development, but is considered less secure than a certificate. You should use certificates in your applications running in production. For more information on application security recommendations, please see [Microsoft identity platform best practices and recommendations](identity-platform-integration-checklist.md#security)
+The client secret is also known as an *application password*. It's a string value your app can use in place of a certificate to identify itself. The client secret is the easier of the two credential types to use. It's often used during development, but it's considered less secure than a certificate. Use certificates in your applications that are running in production.
-1. Select your application in **App registrations** in the Azure portal.
+For more information about application security recommendations, see [Microsoft identity platform best practices and recommendations](identity-platform-integration-checklist.md#security).
+
+1. In the Azure portal, in **App registrations**, select your application.
1. Select **Certificates & secrets** > **New client secret**.
1. Add a description for your client secret.
1. Select a duration.
1. Select **Add**.
-1. **Record the secret's value** for use in your client application code - it's *never displayed again* after you leave this page.
+1. *Record the secret's value* for use in your client application code. This secret value is *never displayed again* after you leave this page.
-**Note:** The ID generated along with the secret's value is the ID of the secret, which is different than Application ID.
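The recorded secret value typically ends up in the client app's configuration. A hedged sketch — the secret is loaded from an environment variable rather than source code, and the comment shows how it might feed MSAL Python's `ConfidentialClientApplication`, though this quickstart doesn't mandate any particular library:

```python
# Sketch: where the recorded client secret typically lives. Never
# hard-code the secret; load it from an environment variable or a
# secret store at runtime.
import os

config = {
    "client_id": "22222222-2222-2222-2222-222222222222",      # placeholder app (client) ID
    "client_secret": os.environ.get("APP_CLIENT_SECRET", ""),  # the value you recorded
    "authority": "https://login.microsoftonline.com/common",
}

# With MSAL Python (one option among several), this would map to:
#   msal.ConfidentialClientApplication(config["client_id"],
#       client_credential=config["client_secret"],
#       authority=config["authority"])
print(sorted(config))  # ['authority', 'client_id', 'client_secret']
```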
## Next steps
-Client applications typically need to access resources in a web API. In addition to protecting your client application with the Microsoft identity platform, you can use the platform for authorizing scoped, permissions-based access to your web API.
+Client applications typically need to access resources in a web API. You can protect your client application by using the Microsoft identity platform. You can also use the platform for authorizing scoped, permissions-based access to your web API.
-Move on to the next quickstart in the series to create another app registration for your web API and expose its scopes.
+Go to the next quickstart in the series to create another app registration for your web API and expose its scopes.
> [!div class="nextstepaction"] > [Configure an application to expose a web API](quickstart-configure-app-expose-web-apis.md)
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-android https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-android.md
@@ -32,7 +32,7 @@ Applications must be represented by an app object in Azure Active Directory so t
> [!div class="sxs-lookup" renderon="portal"] > ### Step 1: Configure your application in the Azure portal
-> This quickstart's sample code requires a redirect URI compatible with the Auth broker.
+> For the code sample in this quickstart to work, add a **Redirect URI** compatible with the Auth broker.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make these changes for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-angular https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-angular.md
@@ -51,7 +51,7 @@ In this quickstart, you download and run a code sample that demonstrates how an
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure the application in the Azure portal
-> For the code sample for this quickstart to work, you need to add a redirect URI as **http://localhost:4200/** and enable ****Implicit grant**.
+> For the code sample in this quickstart to work, you need to add a redirect URI as **http://localhost:4200/** and enable **Implicit grant**.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make these changes for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-aspnet-core-webapp-calls-graph https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-aspnet-core-webapp-calls-graph.md
@@ -61,7 +61,7 @@ See [How the sample works](#how-the-sample-works) for an illustration.
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in the Azure portal
-> This quickstart's sample code requires a **Redirect URI** of `https://localhost:44321/signin-oidc` and **Front-channel logout URL** of `https://localhost:44321/signout-oidc` in the app registration.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `https://localhost:44321/signin-oidc` and **Front-channel logout URL** of `https://localhost:44321/signout-oidc` in the app registration.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make this change for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-aspnet-core-webapp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-aspnet-core-webapp.md
@@ -59,7 +59,7 @@ See [How the sample works](#how-the-sample-works) for an illustration.
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in the Azure portal
-> This quickstart's sample code requires a **Redirect URI** of `https://localhost:44321/` and `https://localhost:44321/signin-oidc` and a **Front-channel logout URL** of `https://localhost:44321/signout-oidc`. Request ID tokens will be issued by the authorization endpoint.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `https://localhost:44321/` and `https://localhost:44321/signin-oidc` and a **Front-channel logout URL** of `https://localhost:44321/signout-oidc`. Request ID tokens will be issued by the authorization endpoint.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make this change for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-aspnet-webapp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-aspnet-webapp.md
@@ -56,7 +56,7 @@ See [How the sample works](#how-the-sample-works) for an illustration.
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in Azure portal
-> This quickstart's sample code requires a **Redirect URI** of `https://localhost:44368/`.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `https://localhost:44368/`.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make this change for me]()
@@ -176,7 +176,7 @@ public void Configuration(IAppBuilder app)
> [!NOTE]
-> Setting `ValidateIssuer = false` is a simplification for this quickstart. In real applications you need to validate the issuer.
+> Setting `ValidateIssuer = false` is a simplification for this quickstart. In real applications, validate the issuer.
> See the samples to understand how to do that. ### Initiate an authentication challenge
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-ios https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-ios.md
@@ -66,7 +66,7 @@ The quickstart applies to both iOS and macOS apps. Some steps are needed only fo
> [!div renderon="portal" class="sxs-lookup"] > > #### Step 1: Configure your application
-> For the code sample for this quickstart to work, you need to add a redirect URI compatible with the Auth broker.
+> For the code sample for this quickstart to work, add a **Redirect URI** compatible with the Auth broker.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make this change for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-java-webapp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-java-webapp.md
@@ -64,7 +64,7 @@ To run this sample, you need:
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in the Azure portal >
-> To use the code sample in this quickstart, you need to:
+> To use the code sample in this quickstart:
>
> 1. Add reply URLs `https://localhost:8443/msal4jsample/secure/aad` and `https://localhost:8443/msal4jsample/graph/me`.
> 1. Create a client secret.
@@ -157,7 +157,7 @@ To run the web application from an IDE, select run, and then go to the home page
##### Running the project from Tomcat
-If you want to deploy the web sample to Tomcat, you need to make a couple changes to the source code.
+If you want to deploy the web sample to Tomcat, make a couple changes to the source code.
1. Open *ms-identity-java-webapp/pom.xml*.
   - Under `<name>msal-web-sample</name>`, add `<packaging>war</packaging>`.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-javascript-auth-code-angular https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-javascript-auth-code-angular.md
@@ -62,7 +62,7 @@ This quickstart uses MSAL Angular v2 with the authorization code flow. For a sim
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in the Azure portal
-> To make the code sample in this quickstart work, you need to add a `redirectUri` as `http://localhost:4200/`.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `http://localhost:4200/`.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make these changes for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-javascript-auth-code-react https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-javascript-auth-code-react.md
@@ -63,7 +63,7 @@ This quickstart uses MSAL React with the authorization code flow. For a similar
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in the Azure portal
-> To make the code sample in this quickstart work, you need to add a `redirectUri` as `http://localhost:3000/`.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `http://localhost:3000/`.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make these changes for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-javascript-auth-code https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-javascript-auth-code.md
@@ -60,7 +60,7 @@ This quickstart uses MSAL.js 2.0 with the authorization code flow. For a similar
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in the Azure portal
-> This quickstart's sample code requires a **Redirect URI** of `http://localhost:3000/`.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `http://localhost:3000/`.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make these changes for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-javascript https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-javascript.md
@@ -60,7 +60,7 @@ See [How the sample works](#how-the-sample-works) for an illustration.
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in the Azure portal
-> This quickstart's sample code requires a **Redirect URI** of `http://localhost:3000/` and enable **Implicit grant**.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `http://localhost:3000/` and enable **Implicit grant**.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make these changes for me]() >
@@ -265,7 +265,7 @@ myMSALObj.acquireTokenSilent(tokenRequest)
#### Get a user token interactively
-There are situations where you need to force users to interact with the Microsoft identity platform. For example:
+There are situations where you must force users to interact with the Microsoft identity platform. For example:
* Users might need to reenter their credentials because their password has expired.
* Your application is requesting access to additional resource scopes that the user needs to consent to.
* Two-factor authentication is required.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-netcore-daemon https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-netcore-daemon.md
@@ -62,7 +62,7 @@ This quickstart requires [.NET Core 3.1](https://www.microsoft.com/net/download/
> ### Download and configure your quickstart app
>
> #### Step 1: Configure your application in Azure portal
-> For the code sample for this quickstart to work, you need to create a client secret, and add Graph API's **User.Read.All** application permission.
+> For the code sample in this quickstart to work, create a client secret and add Graph API's **User.Read.All** application permission.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make these changes for me]() >
@@ -127,7 +127,7 @@ If you try to run the application at this point, you'll receive *HTTP 403 - Forb
##### Standard user
-If you're a standard user of your tenant, then you need to ask a global administrator to grant admin consent for your application. To do this, give the following URL to your administrator:
+If you're a standard user of your tenant, ask a global administrator to grant admin consent for your application. To do this, give the following URL to your administrator:
```url https://login.microsoftonline.com/Enter_the_Tenant_Id_Here/adminconsent?client_id=Enter_the_Application_Id_Here
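The admin consent URL above follows a fixed shape, so it's easy to assemble for your administrator; a minimal sketch (the tenant and client IDs below are placeholders you substitute with your own values):

```python
from urllib.parse import urlencode

def admin_consent_url(tenant_id: str, client_id: str) -> str:
    # Builds the tenant admin consent link for an app registration;
    # both IDs are caller-supplied placeholders.
    base = f"https://login.microsoftonline.com/{tenant_id}/adminconsent"
    return f"{base}?{urlencode({'client_id': client_id})}"

print(admin_consent_url("Enter_the_Tenant_Id_Here", "Enter_the_Application_Id_Here"))
```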
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-python-daemon https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-python-daemon.md
@@ -65,7 +65,7 @@ To run this sample, you need:
> ### Download and configure your quickstart app
>
> #### Step 1: Configure your application in Azure portal
-> For the code sample for this quickstart to work, you need to create a client secret, and add Graph API's **User.Read.All** application permission.
+> For the code sample in this quickstart to work, create a client secret and add Graph API's **User.Read.All** application permission.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make these changes for me]() >
@@ -125,7 +125,7 @@ If you try to run the application at this point, you'll receive *HTTP 403 - Forb
##### Standard user
-If you're a standard user of your tenant, then you need to ask a global administrator to grant admin consent for your application. To do this, give the following URL to your administrator:
+If you're a standard user of your tenant, ask a global administrator to grant admin consent for your application. To do this, give the following URL to your administrator:
```url https://login.microsoftonline.com/Enter_the_Tenant_Id_Here/adminconsent?client_id=Enter_the_Application_Id_Here
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-python-webapp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-python-webapp.md
@@ -61,7 +61,7 @@ See [How the sample works](#how-the-sample-works) for an illustration.
> 1. Type a key description (for instance app secret), leave the default expiration, and select **Add**.
> 1. Note the **Value** of the **Client Secret** for later use.
> 1. Under **Manage**, select **API permissions** > **Add a permission**.
->1. Ensure that the **Microsoft APIs** tab is selected.
+> 1. Ensure that the **Microsoft APIs** tab is selected.
> 1. From the *Commonly used Microsoft APIs* section, select **Microsoft Graph**.
> 1. From the **Delegated permissions** section, ensure that the right permissions are checked: **User.ReadBasic.All**. Use the search box if necessary.
> 1. Select the **Add permissions** button.
@@ -70,7 +70,7 @@ See [How the sample works](#how-the-sample-works) for an illustration.
> > #### Step 1: Configure your application in Azure portal >
-> For the code sample for this quickstart to work, you need to:
+> For the code sample in this quickstart to work:
>
> 1. Add a reply URL as `http://localhost:5000/getAToken`.
> 1. Create a Client Secret.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-uwp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-uwp.md
@@ -58,7 +58,7 @@ See [How the sample works](#how-the-sample-works) for an illustration.
> [!div renderon="portal" class="sxs-lookup"] > #### Step 1: Configure the application
-> For the code sample for this quickstart to work, you need to add a redirect URI as **https://login.microsoftonline.com/common/oauth2/nativeclient**.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `https://login.microsoftonline.com/common/oauth2/nativeclient`.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make this change for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-v2-windows-desktop https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-v2-windows-desktop.md
@@ -56,7 +56,7 @@ See [How the sample works](#how-the-sample-works) for an illustration.
> [!div class="sxs-lookup" renderon="portal"] > #### Step 1: Configure your application in Azure portal
-> For the code sample for this quickstart to work, you need to add a reply URL as **https://login.microsoftonline.com/common/oauth2/nativeclient**.
+> For the code sample in this quickstart to work, add a **Redirect URI** of `https://login.microsoftonline.com/common/oauth2/nativeclient`.
> > [!div renderon="portal" id="makechanges" class="nextstepaction"] > > [Make this change for me]() >
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/reference-app-manifest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/reference-app-manifest.md
@@ -22,7 +22,7 @@ The application manifest contains a definition of all the attributes of an appli
You can configure an app's attributes through the Azure portal or programmatically using [REST API](/graph/api/resources/application) or [PowerShell](/powershell/module/azuread#applications). However, there are some scenarios where you'll need to edit the app manifest to configure an app's attribute. These scenarios include: * If you registered the app as Azure AD multi-tenant and personal Microsoft accounts, you can't change the supported Microsoft accounts in the UI. Instead, you must use the application manifest editor to change the supported account type.
-* If you need to define permissions and roles that your app supports, you must modify the application manifest.
+* To define permissions and roles that your app supports, you must modify the application manifest.
## Configure the app manifest
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/registration-config-change-token-lifetime-how-to https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/registration-config-change-token-lifetime-how-to.md
@@ -23,7 +23,7 @@ This article shows how to use Azure AD PowerShell to set an access token lifetim
> After May 2020, tenants will no longer be able to configure refresh and session token lifetimes. Azure Active Directory will stop honoring existing refresh and session token configuration in policies after January 30, 2021. You can still configure access token lifetimes after the deprecation. For more information, read [Configurable token lifetimes in Azure AD](./active-directory-configurable-token-lifetimes.md). > We’ve implemented [authentication session management capabilities](../conditional-access/howto-conditional-access-session-lifetime.md) in Azure AD Conditional Access. You can use this new feature to configure refresh token lifetimes by setting sign in frequency.
-To set an access token lifetime policy, you need to download the [Azure AD PowerShell Module](https://www.powershellgallery.com/packages/AzureADPreview).
+To set an access token lifetime policy, download the [Azure AD PowerShell Module](https://www.powershellgallery.com/packages/AzureADPreview).
Run the **Connect-AzureAD -Confirm** command. Here's an example policy that requires users to authenticate more frequently in your web app. This policy sets the lifetime of the access token for the service principal of your web app. Create the policy and assign it to your service principal. You also need to get the ObjectId of your service principal.
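As a hedged illustration of what such a policy contains (not the full PowerShell workflow): the definition handed to `New-AzureADPolicy` is a JSON document, and lifetimes are `hh:mm:ss` strings. The two-hour value below is an example, not a recommendation.

```python
import json

# Example TokenLifetimePolicy definition; AccessTokenLifetime uses an
# hh:mm:ss string, and the value here is illustrative only.
definition = json.dumps({
    "TokenLifetimePolicy": {
        "Version": 1,
        "AccessTokenLifetime": "02:00:00",
    }
})
print(definition)
```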
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/reply-url https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/reply-url.md
@@ -39,7 +39,7 @@ You can use a maximum of 256 characters for each redirect URI you add to an app
The Azure Active Directory (Azure AD) application model currently supports both HTTP and HTTPS schemes for apps that sign in work or school accounts in any organization's Azure AD tenant. These account types are specified by the `AzureADMyOrg` and `AzureADMultipleOrgs` values in the `signInAudience` field of the application manifest. For apps that sign in personal Microsoft accounts (MSA) *and* work and school accounts (that is, the `signInAudience` is set to `AzureADandPersonalMicrosoftAccount`), only the HTTPS scheme is allowed.
-To add redirect URIs with an HTTP scheme to app registrations that sign in work or school accounts, you need to use the application manifest editor in [App registrations](https://go.microsoft.com/fwlink/?linkid=2083908) in the Azure portal. However, though it's possible to set an HTTP-based redirect URI by using the manifest editor, we *strongly* recommend that you use the HTTPS scheme for your redirect URIs.
+To add redirect URIs with an HTTP scheme to app registrations that sign in work or school accounts, use the application manifest editor in [App registrations](https://go.microsoft.com/fwlink/?linkid=2083908) in the Azure portal. However, though it's possible to set an HTTP-based redirect URI by using the manifest editor, we *strongly* recommend that you use the HTTPS scheme for your redirect URIs.
## Localhost exceptions
@@ -59,7 +59,7 @@ From a development standpoint, this means a few things:
* Do not register multiple redirect URIs where only the port differs. The login server will pick one arbitrarily and use the behavior associated with that redirect URI (for example, whether it's a `web`-, `native`-, or `spa`-type redirect). This is especially important when you want to use different authentication flows in the same application registration, for example both the authorization code grant and implicit flow. To associate the correct response behavior with each redirect URI, the login server must be able to distinguish between the redirect URIs and cannot do so when only the port differs.
-* If you need to register multiple redirect URIs on localhost to test different flows during development, differentiate them using the *path* component of the URI. For example, `http://localhost/MyWebApp` doesn't match `http://localhost/MyNativeApp`.
+* To register multiple redirect URIs on localhost to test different flows during development, differentiate them using the *path* component of the URI. For example, `http://localhost/MyWebApp` doesn't match `http://localhost/MyNativeApp`.
* The IPv6 loopback address (`[::1]`) is not currently supported. #### Prefer 127.0.0.1 over localhost
@@ -78,7 +78,7 @@ Wildcard URIs like `https://*.contoso.com` may seem convenient, but should be av
Wildcard URIs are currently unsupported in app registrations configured to sign in personal Microsoft accounts and work or school accounts. Wildcard URIs are allowed, however, for apps that are configured to sign in only work or school accounts in an organization's Azure AD tenant.
-To add redirect URIs with wildcards to app registrations that sign in work or school accounts, you need to use the application manifest editor in [App registrations](https://go.microsoft.com/fwlink/?linkid=2083908) in the Azure portal. Though it's possible to set a redirect URI with a wildcard by using the manifest editor, we *strongly* recommend you adhere to [section 3.1.2 of RFC 6749](https://tools.ietf.org/html/rfc6749#section-3.1.2) and use only absolute URIs.
+To add redirect URIs with wildcards to app registrations that sign in work or school accounts, use the application manifest editor in [App registrations](https://go.microsoft.com/fwlink/?linkid=2083908) in the Azure portal. Though it's possible to set a redirect URI with a wildcard by using the manifest editor, we *strongly* recommend you adhere to [section 3.1.2 of RFC 6749](https://tools.ietf.org/html/rfc6749#section-3.1.2) and use only absolute URIs.
If your scenario requires more redirect URIs than the maximum limit allowed, consider the following [state parameter approach](#use-a-state-parameter) instead of adding a wildcard redirect URI.
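The state parameter approach mentioned above can be sketched as follows: register one fixed redirect URI and round-trip the intended destination through the OAuth `state` value instead of registering many URIs. This is an illustrative sketch with made-up helper names, not the article's own code; a real app must also bind the nonce to the user's session to prevent CSRF.

```python
import base64
import json
import secrets

def build_state(destination_path: str) -> str:
    # Pack a CSRF nonce plus the post-login destination into the state value.
    payload = {"nonce": secrets.token_urlsafe(8), "dest": destination_path}
    return base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()

def read_state(state: str) -> str:
    # On the redirect back, recover the destination (after nonce validation).
    payload = json.loads(base64.urlsafe_b64decode(state.encode()))
    return payload["dest"]

s = build_state("/tenant-a/dashboard")
assert read_state(s) == "/tenant-a/dashboard"
```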
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-app-configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-daemon-app-configuration.md
@@ -108,7 +108,7 @@ When you build a confidential client with certificates, the [parameters.json](ht
### Instantiate the MSAL application
-To instantiate the MSAL application, you need to add, reference, or import the MSAL package (depending on the language).
+To instantiate the MSAL application, add, reference, or import the MSAL package (depending on the language).
The construction is different, depending on whether you're using client secrets or certificates (or, as an advanced scenario, signed assertions).
@@ -286,7 +286,7 @@ MSAL.NET has two methods to provide signed assertions to the confidential client
- `.WithClientAssertion()` - `.WithClientClaims()`
-When you use `WithClientAssertion`, you need to provide a signed JWT. This advanced scenario is detailed in [Client assertions](msal-net-client-assertions.md).
+When you use `WithClientAssertion`, provide a signed JWT. This advanced scenario is detailed in [Client assertions](msal-net-client-assertions.md).
```csharp string signedClientAssertion = ComputeAssertion();
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-app-registration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-daemon-app-registration.md
@@ -23,7 +23,7 @@ For a daemon application, here's what you need to know when you register the app
## Supported account types
-Daemon applications make sense only in Azure AD tenants. So when you create the application, you need to choose one of the following options:
+Daemon applications make sense only in Azure AD tenants. So when you create the application, choose one of the following options:
- **Accounts in this organizational directory only**. This choice is the most common one because daemon applications are usually written by line-of-business (LOB) developers.
- **Accounts in any organizational directory**. You'll make this choice if you're an ISV providing a utility tool to your customers. You'll need your customers' tenant admins to approve it.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-call-api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-daemon-call-api.md
@@ -63,7 +63,7 @@ JSONObject responseObject = HttpClientHelper.processResponse(responseCode, respo
## Calling several APIs
-For daemon apps, the web APIs that you call need to be pre-approved. There's no incremental consent with daemon apps. (There's no user interaction.) The tenant admin needs to provide consent in advance for the application and all the API permissions. If you want to call several APIs, you need to acquire a token for each resource, each time calling `AcquireTokenForClient`. MSAL will use the application token cache to avoid unnecessary service calls.
+For daemon apps, the web APIs that you call need to be pre-approved. There's no incremental consent with daemon apps. (There's no user interaction.) The tenant admin needs to provide consent in advance for the application and all the API permissions. If you want to call several APIs, acquire a token for each resource, each time calling `AcquireTokenForClient`. MSAL will use the application token cache to avoid unnecessary service calls.
## Next steps
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-daemon-overview.md
@@ -51,7 +51,7 @@ Applications that acquire a token for their own identities:
For developers, the end-to-end experience for this scenario has the following aspects:
- Daemon applications can work only in Azure AD tenants. It wouldn't make sense to build a daemon application that attempts to manipulate Microsoft personal accounts. If you're a line-of-business (LOB) app developer, you'll create your daemon app in your tenant. If you're an ISV, you might want to create a multitenant daemon application. Each tenant admin will need to provide consent.
-- During [application registration](./scenario-daemon-app-registration.md), the reply URI isn't needed. You need to share secrets or certificates or signed assertions with Azure AD. You also need to request application permissions and grant admin consent to use those app permissions.
+- During [application registration](./scenario-daemon-app-registration.md), the reply URI isn't needed. Share secrets or certificates or signed assertions with Azure AD. You also need to request application permissions and grant admin consent to use those app permissions.
- The [application configuration](./scenario-daemon-app-configuration.md) needs to provide client credentials as shared with Azure AD during the application registration. - The [scope](scenario-daemon-acquire-token.md#scopes-to-request) used to acquire a token with the client credentials flow needs to be a static scope.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-production https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-daemon-production.md
@@ -24,7 +24,7 @@ Now that you know how to acquire and use a token for a service-to-service call,
## Deployment - multitenant daemon apps
-If you're an ISV creating a daemon application that can run in several tenants, you need to make sure that the tenant admin:
+If you're an ISV creating a daemon application that can run in several tenants, make sure that the tenant admin:
- Provisions a service principal for the application. - Grants consent to the application.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-desktop-acquire-token https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-desktop-acquire-token.md
@@ -442,7 +442,7 @@ For more information on consent, see the [Microsoft identity platform permission
# [.NET](#tab/dotnet)
-In MSAL.NET, you need to use:
+In MSAL.NET, use:
```csharp AcquireTokenByIntegratedWindowsAuth(IEnumerable<string> scopes)
@@ -919,7 +919,7 @@ This flow isn't supported on MSAL for macOS.
### Device code flow
-If you're writing a command-line tool that doesn't have web controls, and you can't or don't want to use the previous flows, you need to use the device code flow.
+If you're writing a command-line tool that doesn't have web controls, and you can't or don't want to use the previous flows, use the device code flow.
Interactive authentication with Azure AD requires a web browser. For more information, see [Usage of web browsers](https://aka.ms/msal-net-uses-web-browser). To authenticate users on devices or operating systems that don't provide a web browser, device code flow lets the user use another device such as a computer or a mobile phone to sign in interactively. By using the device code flow, the application obtains tokens through a two-step process that's designed for these devices or operating systems. Examples of such applications are applications that run on IoT devices or command-line tools (CLI). The idea is that:
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-desktop-app-registration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-desktop-app-registration.md
@@ -44,7 +44,7 @@ The redirect URIs to use in a desktop application depend on the flow you want to
- If you build a native Objective-C or Swift app for macOS, register the redirect URI based on your application's bundle identifier in the following format: `msauth.<your.app.bundle.id>://auth`. Replace `<your.app.bundle.id>` with your application's bundle identifier.
- If your app uses only Integrated Windows Authentication or a username and a password, you don't need to register a redirect URI for your application. These flows do a round trip to the Microsoft identity platform v2.0 endpoint. Your application won't be called back on any specific URI.
-- To distinguish [device code flow](scenario-desktop-acquire-token.md#device-code-flow), [Integrated Windows Authentication](scenario-desktop-acquire-token.md#integrated-windows-authentication), and a [username and a password](scenario-desktop-acquire-token.md#username-and-password) from a confidential client application using a client credential flow used in [daemon applications](scenario-daemon-overview.md), none of which requires a redirect URI, you need to configure it as a public client application. To achieve this configuration:
+- To distinguish [device code flow](scenario-desktop-acquire-token.md#device-code-flow), [Integrated Windows Authentication](scenario-desktop-acquire-token.md#integrated-windows-authentication), and a [username and a password](scenario-desktop-acquire-token.md#username-and-password) from a confidential client application using a client credential flow used in [daemon applications](scenario-daemon-overview.md), none of which requires a redirect URI, configure it as a public client application. To achieve this configuration:
1. In the <a href="https://portal.azure.com/" target="_blank">Azure portal<span class="docon docon-navigate-external x-hidden-focus"></span></a>, select your app in **App registrations**, and then select **Authentication**. 1. In **Advanced settings** > **Allow public client flows** > **Enable the following mobile and desktop flows:**, select **Yes**.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-desktop-production https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-desktop-production.md
@@ -93,7 +93,7 @@ application.acquireToken(with: interactiveParameters, completionBlock: { (result
This call gets you an access token for the first web API.
-When you need to call the second web API, call the `AcquireTokenSilent` API.
+To call the second web API, call the `AcquireTokenSilent` API.
```csharp AcquireTokenSilent(scopesForVendorApi, accounts.FirstOrDefault()).ExecuteAsync();
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-mobile-acquire-token https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-mobile-acquire-token.md
@@ -23,7 +23,7 @@ Before your app can call protected web APIs, it needs an access token. This arti
## Define a scope
-When you request a token, you need to define a scope. The scope determines what data your app can access.
+When you request a token, define a scope. The scope determines what data your app can access.
The easiest way to define a scope is to combine the desired web API's `App ID URI` with the scope `.default`. This definition tells the Microsoft identity platform that your app requires all scopes that are set in the portal.
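Combining the App ID URI with `.default`, as described above, is a purely mechanical step; a minimal sketch (the App ID URI below is a placeholder, not a real registration):

```python
def default_scope(app_id_uri: str) -> str:
    # Derive the static ".default" scope from a web API's App ID URI,
    # which asks for all permissions configured for the app in the portal.
    return app_id_uri.rstrip("/") + "/.default"

scope = default_scope("api://11111111-1111-1111-1111-111111111111")
print(scope)
```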
@@ -233,7 +233,7 @@ The class defines the following constants:
This option can be useful if the token acquisition fails and you want to let the user sign in again. In this case, MSAL sends `prompt=login` to the identity provider. You might want to use this option in security-focused applications where the organization governance requires the user to sign in each time they access specific parts of the application. - `Never` is for only .NET 4.5 and Windows Runtime (WinRT). This constant won't prompt the user, but it will try to use the cookie that's stored in the hidden embedded web view. For more information, see [Using web browsers with MSAL.NET](./msal-net-web-browsers.md).
- If this option fails, then `AcquireTokenInteractive` throws an exception to notify you that a UI interaction is needed. Then you need to use another `Prompt` parameter.
+ If this option fails, then `AcquireTokenInteractive` throws an exception to notify you that a UI interaction is needed. Then use another `Prompt` parameter.
- `NoPrompt` doesn't send a prompt to the identity provider. This option is useful only for edit-profile policies in Azure Active Directory B2C. For more information, see [B2C specifics](https://aka.ms/msal-net-b2c-specificities).
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-mobile-app-configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-mobile-app-configuration.md
@@ -79,7 +79,7 @@ The following sections provide more information about instantiating the applicat
##### Specify the parent UI, window, or activity
-On Android, you need to pass the parent activity before you do the interactive authentication. On iOS, when you use a broker, you need to pass-in `ViewController`. In the same way on UWP, you might want to pass-in the parent window. You pass it in when you acquire the token. But when you're creating the app, you can also specify a callback as a delegate that returns `UIParent`.
+On Android, pass the parent activity before you do the interactive authentication. On iOS, when you use a broker, pass in `ViewController`. In the same way, on UWP, you might want to pass in the parent window. You pass it in when you acquire the token. But when you're creating the app, you can also specify a callback as a delegate that returns `UIParent`.
```csharp
IPublicClientApplication application = PublicClientApplicationBuilder.Create(clientId)
```
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-mobile-app-registration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-mobile-app-registration.md
@@ -76,9 +76,9 @@ If you prefer to manually configure the redirect URI, you can do so through the
### Username-password authentication
-If your app uses only username-password authentication, you don't need to register a redirect URI for your application. This flow does a round trip to the Microsoft identity platform version 2.0 endpoint. Your application won't be called back on any specific URI.
+If your app uses only username-password authentication, you don't need to register a redirect URI for your application. This flow does a round trip to the Microsoft identity platform. Your application won't be called back on any specific URI.
-However, you need to identify your application as a public client application. To do so:
+However, identify your application as a public client application. To do so:
1. Still in the <a href="https://portal.azure.com/" target="_blank">Azure portal<span class="docon docon-navigate-external x-hidden-focus"></span></a>, select your app in **App registrations**, and then select **Authentication**.
1. In **Advanced settings** > **Allow public client flows** > **Enable the following mobile and desktop flows:**, select **Yes**.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-mobile-call-api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-mobile-call-api.md
@@ -114,7 +114,7 @@ task.resume()
## Make several API requests
-If you need to call the same API several times, or if you need to call multiple APIs, then consider the following subjects when you build your app:
+To call the same API several times, or to call multiple APIs, consider the following subjects when you build your app:
- **Incremental consent**: The Microsoft identity platform allows apps to get user consent when permissions are required rather than all at the start. Each time your app is ready to call an API, it should request only the scopes that it needs.
@@ -122,7 +122,7 @@ If you need to call the same API several times, or if you need to call multiple
## Call several APIs by using incremental consent and conditional access
-If you need to call several APIs for the same user, after you acquire a token for the user, you can avoid repeatedly asking the user for credentials by subsequently calling `AcquireTokenSilent` to get a token:
+To call several APIs for the same user, after you acquire a token for the user, you can avoid repeatedly asking the user for credentials by subsequently calling `AcquireTokenSilent` to get a token:
```csharp
var result = await app.AcquireTokenXX("scopeApi1")
```
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-protected-web-api-app-configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-protected-web-api-app-configuration.md
@@ -18,7 +18,7 @@
# Protected web API: Code configuration
-To configure the code for your protected web API, you need to understand:
+To configure the code for your protected web API, understand:
- What defines APIs as protected.
- How to configure a bearer token.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-protected-web-api-app-registration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-protected-web-api-app-registration.md
@@ -55,7 +55,7 @@ Other settings specific to web APIs are the exposed API and the exposed scopes o
Scopes usually have the form `resourceURI/scopeName`. For Microsoft Graph, the scopes have shortcuts. For example, `User.Read` is a shortcut for `https://graph.microsoft.com/user.read`.
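The shortcut expansion described above can be sketched as follows; the helper name is invented for illustration, and the lowercasing mirrors the `User.Read` example rather than an official API:

```javascript
// Expand a Microsoft Graph scope shortcut into the resourceURI/scopeName form.
const GRAPH_RESOURCE_URI = "https://graph.microsoft.com";

function expandGraphScope(shortcut) {
  // "User.Read" -> "https://graph.microsoft.com/user.read"
  return `${GRAPH_RESOURCE_URI}/${shortcut.toLowerCase()}`;
}

console.log(expandGraphScope("User.Read"));
// → https://graph.microsoft.com/user.read
```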
-During app registration, you need to define these parameters:
+During app registration, define these parameters:
- The resource URI
- One or more scopes
@@ -65,7 +65,7 @@ By default, the application registration portal recommends that you use the reso
To client applications, scopes show up as *delegated permissions* and app roles show up as *application permissions* for your web API.
-Scopes also appear on the consent window that's presented to users of your app. So you need to provide the corresponding strings that describe the scope:
+Scopes also appear on the consent window that's presented to users of your app. Therefore, provide the corresponding strings that describe the scope:
- As seen by a user.
- As seen by a tenant admin, who can grant admin consent.
@@ -96,7 +96,7 @@ In this section, you learn how to register your protected web API so that daemon
#### Exposing application permissions (app roles)
-To expose application permissions, you need to edit the manifest.
+To expose application permissions, edit the manifest.
1. In the application registration for your application, select **Manifest**.
1. To edit the manifest, find the `appRoles` setting and add application roles. The role definitions are provided in the following sample JSON block.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-protected-web-api-overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-protected-web-api-overview.md
@@ -20,7 +20,7 @@
In this scenario, you learn how to expose a web API. You also learn how to protect the web API so that only authenticated users can access it.
-To use your web API, you need to either enable authenticated users with both work and school accounts or enable Microsoft personal accounts.
+To use your web API, you either enable authenticated users with both work and school accounts or enable Microsoft personal accounts.
## Specifics
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-spa-sign-in https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-spa-sign-in.md
@@ -205,7 +205,7 @@ myMsal.loginRedirect(loginRequest);
# [JavaScript (MSAL.js 1.x)](#tab/javascript1)
-The redirect methods don't return a promise because of the move away from the main app. To process and access the returned tokens, you need to register success and error callbacks before you call the redirect methods.
+The redirect methods don't return a promise because of the move away from the main app. To process and access the returned tokens, register success and error callbacks before you call the redirect methods.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-token-exchange-saml-oauth https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-token-exchange-saml-oauth.md
@@ -17,7 +17,7 @@
SAML and OpenID Connect (OIDC) / OAuth are popular protocols used to implement Single Sign-On (SSO). Some apps might only implement SAML and others might only implement OIDC/OAuth. Both protocols use tokens to communicate secrets. To learn more about SAML, see [Single Sign-On SAML protocol](single-sign-on-saml-protocol.md). To learn more about OIDC/OAuth, see [OAuth 2.0 and OpenID Connect protocols on Microsoft identity platform](active-directory-v2-protocols.md).
-This article outlines a common scenario where an app implements SAML but you need to call into the Graph API, which uses OIDC/OAuth. Basic guidance is provided for people working with this scenario.
+This article outlines a common scenario where an app implements SAML but calls the Graph API, which uses OIDC/OAuth. Basic guidance is provided for people working with this scenario.
## Scenario: You have a SAML token and want to call the Graph API

Many apps are implemented with SAML. However, the Graph API uses the OIDC/OAuth protocols. It's possible, though not trivial, to add OIDC/OAuth functionality to a SAML app. Once OAuth functionality is available in an app, the Graph API can be used.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-web-api-call-api-app-configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-web-api-call-api-app-configuration.md
@@ -30,7 +30,7 @@ Microsoft recommends that you use the [Microsoft.Identity.Web](https://www.nuget
## Client secrets or client certificates
-Given that your web API now calls a downstream web API, you need to provide a client secret or client certificate in the *appsettings.json* file. You can also add a section that specifies:
+Given that your web API now calls a downstream web API, provide a client secret or client certificate in the *appsettings.json* file. You can also add a section that specifies:
- The URL of the downstream web API
- The scopes required for calling the API
@@ -166,7 +166,7 @@ The following image shows the various possibilities of *Microsoft.Identity.Web*
:::image type="content" source="media/scenarios/microsoft-identity-web-startup-cs.svg" alt-text="Block diagram showing service configuration options in startup dot C S for calling a web API and specifying a token cache implementation":::

> [!NOTE]
-> To fully understand the code examples here, you need to be familiar with [ASP.NET Core fundamentals](/aspnet/core/fundamentals), and in particular with [dependency injection](/aspnet/core/fundamentals/dependency-injection) and [options](/aspnet/core/fundamentals/configuration/options).
+> To fully understand the code examples here, be familiar with [ASP.NET Core fundamentals](/aspnet/core/fundamentals), and in particular with [dependency injection](/aspnet/core/fundamentals/dependency-injection) and [options](/aspnet/core/fundamentals/configuration/options).
# [Java](#tab/java)
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-web-api-call-api-app-registration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-web-api-call-api-app-registration.md
@@ -18,7 +18,7 @@
# A web API that calls web APIs: App registration
-A web API that calls downstream web APIs has the same registration as a protected web API. Therefore, you need to follow the instructions in [Protected web API: App registration](scenario-protected-web-api-app-registration.md).
+A web API that calls downstream web APIs has the same registration as a protected web API. Follow the instructions in [Protected web API: App registration](scenario-protected-web-api-app-registration.md).
Because the web app now calls web APIs, it becomes a confidential client application. That's why extra registration information is required: the app needs to share secrets (client credentials) with the Microsoft identity platform.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-web-app-call-api-app-configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-web-app-call-api-app-configuration.md
@@ -41,7 +41,7 @@ Select the tab for the platform you're interested in:
## Client secrets or client certificates
-Given that your web app now calls a downstream web API, you need to provide a client secret or client certificate in the *appsettings.json* file. You can also add a section that specifies:
+Given that your web app now calls a downstream web API, provide a client secret or client certificate in the *appsettings.json* file. You can also add a section that specifies:
- The URL of the downstream web API
- The scopes required for calling the API
@@ -181,7 +181,7 @@ The following image shows the various possibilities of *Microsoft.Identity.Web*
:::image type="content" source="media/scenarios/microsoft-identity-web-startup-cs.svg" alt-text="Block diagram showing service configuration options in startup dot C S for calling a web API and specifying a token cache implementation":::

> [!NOTE]
-> To fully understand the code examples here, you need to be familiar with [ASP.NET Core fundamentals](/aspnet/core/fundamentals), and in particular with [dependency injection](/aspnet/core/fundamentals/dependency-injection) and [options](/aspnet/core/fundamentals/configuration/options).
+> To fully understand the code examples here, be familiar with [ASP.NET Core fundamentals](/aspnet/core/fundamentals), and in particular with [dependency injection](/aspnet/core/fundamentals/dependency-injection) and [options](/aspnet/core/fundamentals/configuration/options).
# [ASP.NET](#tab/aspnet)
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-web-app-sign-user-app-configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-web-app-sign-user-app-configuration.md
@@ -61,13 +61,13 @@ You might want to refer to this sample for full implementation details.
## Configuration files
-Web applications that sign in users by using the Microsoft identity platform are configured through configuration files. The settings that you need to fill in are:
+Web applications that sign in users by using the Microsoft identity platform are configured through configuration files. These are the values you're required to specify in the configuration:
- The cloud instance (`Instance`) if you want your app to run in national clouds, for example
- The audience in the tenant ID (`TenantId`)
- The client ID (`ClientId`) for your application, as copied from the Azure portal
-Sometimes, applications can be parametrized by `Authority`, which is a concatenation of `Instance` and `TenantId`.
+You might also see references to the `Authority`. The `Authority` value is the concatenation of the `Instance` and `TenantId` values.
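That concatenation can be sketched in one line; the function name is invented for illustration, and `https://login.microsoftonline.com/` is the instance for the Azure public cloud:

```javascript
// Authority = Instance + TenantId, per the configuration values above.
function buildAuthority(instance, tenantId) {
  // Ensure exactly one slash between the two parts.
  return instance.endsWith("/") ? instance + tenantId : `${instance}/${tenantId}`;
}

console.log(buildAuthority("https://login.microsoftonline.com/", "common"));
// → https://login.microsoftonline.com/common
```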
# [ASP.NET Core](#tab/aspnetcore)
@@ -130,7 +130,7 @@ In ASP.NET Core, another file ([properties\launchSettings.json](https://github.c
}
```
-In the Azure portal, the reply URIs that you need to register on the **Authentication** page for your application need to match these URLs. For the two preceding configuration files, they would be `https://localhost:44321/signin-oidc`. The reason is that `applicationUrl` is `http://localhost:3110`, but `sslPort` is specified (44321). `CallbackPath` is `/signin-oidc`, as defined in `appsettings.json`.
+In the Azure portal, the redirect URIs that you register on the **Authentication** page for your application need to match these URLs. For the two preceding configuration files, they would be `https://localhost:44321/signin-oidc`. The reason is that `applicationUrl` is `http://localhost:3110`, but `sslPort` is specified (44321). `CallbackPath` is `/signin-oidc`, as defined in `appsettings.json`.
In the same way, the sign-out URI would be set to `https://localhost:44321/signout-oidc`.
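A sketch of how those pieces combine, using the values from the paragraph above (the helper name is invented for illustration):

```javascript
// Compose the redirect URI: https scheme, host taken from applicationUrl,
// the sslPort from launchSettings.json, then the CallbackPath.
function composeRedirectUri(applicationUrl, sslPort, callbackPath) {
  const host = new URL(applicationUrl).hostname;
  return `https://${host}:${sslPort}${callbackPath}`;
}

console.log(composeRedirectUri("http://localhost:3110", 44321, "/signin-oidc"));
// → https://localhost:44321/signin-oidc
```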
@@ -158,7 +158,7 @@ In ASP.NET, the application is configured through the [Web.config](https://githu
</appSettings>
```
-In the Azure portal, the reply URIs that you need to register on the **Authentication** page for your application need to match these URLs. That is, they should be `https://localhost:44326/`.
+In the Azure portal, the reply URIs that you register on the **Authentication** page for your application need to match these URLs. That is, they should be `https://localhost:44326/`.
# [Java](#tab/java)
@@ -172,7 +172,7 @@ aad.redirectUriSignin=http://localhost:8080/msal4jsample/secure/aad
aad.redirectUriGraph=http://localhost:8080/msal4jsample/graph/me ```
-In the Azure portal, the reply URIs that you need to register on the **Authentication** page for your application need to match the `redirectUri` instances that the application defines. That is, they should be `http://localhost:8080/msal4jsample/secure/aad` and `http://localhost:8080/msal4jsample/graph/me`.
+In the Azure portal, the reply URIs that you register on the **Authentication** page for your application need to match the `redirectUri` instances that the application defines. That is, they should be `http://localhost:8080/msal4jsample/secure/aad` and `http://localhost:8080/msal4jsample/graph/me`.
# [Python](#tab/python)
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-web-app-sign-user-production https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/scenario-web-app-sign-user-production.md
@@ -25,7 +25,7 @@ Now that you know how to get a token to call web APIs, here are some things to c
## Troubleshooting

When users sign in to the web application for the first time, they will need to consent. However, in some organizations, users can see a message like the following: *AppName needs permissions to access resources in your organization that only an admin can grant. Please ask an admin to grant permission to this app before you can use it.*
-This is because your tenant administrator has **disabled** the ability for users to consent. In that case, you need to contact your tenant administrators so that they do an admin-consent for the scopes required by the application.
+This is because your tenant administrator has **disabled** the ability for users to consent. In that case, contact your tenant administrators so that they do an admin-consent for the scopes required by the application.
## Same site
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/sso-between-adal-msal-apps-macos-ios https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/sso-between-adal-msal-apps-macos-ios.md
@@ -30,7 +30,7 @@ This section covers SSO differences between MSAL and ADAL 2.7.x
### Cache format
-ADAL 2.7.x can read the MSAL cache format. You don't need to do anything special for cross-app SSO with version ADAL 2.7.x. However, you need to be aware of differences in account identifiers that those two libraries support.
+ADAL 2.7.x can read the MSAL cache format. You don't need to do anything special for cross-app SSO with version ADAL 2.7.x. However, be aware of differences in account identifiers that those two libraries support.
### Account identifier differences
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/troubleshoot-publisher-verification https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/troubleshoot-publisher-verification.md
@@ -48,8 +48,8 @@ Below are some common issues that may occur during the process.
1. Go to your [partner profile](https://partner.microsoft.com/pcv/accountsettings/connectedpartnerprofile) and verify that:
   - The MPN ID is correct.
   - There are no errors or "pending actions" shown, and the verification status under Legal business profile and Partner info both say "authorized" or "success".
- 1. Go to the [MPN tenant management page](https://partner.microsoft.com/dashboard/account/v3/tenantmanagement) and confirm that the tenant the app is registered in and that you are signing with a user account from is on the list of associated tenants. If you need to add an additional tenant, follow the instructions [here](/partner-center/multi-tenant-account). Please be aware that all Global Admins of any tenant you add will be granted Global Admin privileges on your Partner Center account.
- 1. Go to the [MPN User Management page](https://partner.microsoft.com/pcv/users) and confirm the user you are signing in as is either a Global Admin, MPN Admin, or Accounts Admin. If you need to add a user to a role in Partner Center, follow the instructions [here](/partner-center/create-user-accounts-and-set-permissions).
+ 1. Go to the [MPN tenant management page](https://partner.microsoft.com/dashboard/account/v3/tenantmanagement) and confirm that the tenant the app is registered in and that you are signing with a user account from is on the list of associated tenants. To add an additional tenant, follow the instructions [here](/partner-center/multi-tenant-account). Please be aware that all Global Admins of any tenant you add will be granted Global Admin privileges on your Partner Center account.
+ 1. Go to the [MPN User Management page](https://partner.microsoft.com/pcv/users) and confirm the user you are signing in as is either a Global Admin, MPN Admin, or Accounts Admin. To add a user to a role in Partner Center, follow the instructions [here](/partner-center/create-user-accounts-and-set-permissions).
- **When I sign into the Azure AD portal, I do not see any apps registered. Why?** Your app registrations may have been created using a different user account in this tenant, a personal/consumer account, or in a different tenant. Ensure you are signed in with the correct account in the tenant where your app registrations were created.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/tutorial-v2-ios https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/tutorial-v2-ios.md
@@ -151,7 +151,7 @@ var webViewParameters : MSALWebviewParameters?
var currentAccount: MSALAccount? ```
-The only value you need to modify above is the value assigned to `kClientID`to be your [Application ID](./developer-glossary.md#application-id-client-id). This value is part of the MSAL Configuration data that you saved during the step at the beginning of this tutorial to register the application in the Azure portal.
+The only value you modify above is the value assigned to `kClientID` to be your [Application ID](./developer-glossary.md#application-id-client-id). This value is part of the MSAL Configuration data that you saved during the step at the beginning of this tutorial to register the application in the Azure portal.
## Configure Xcode project settings
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/tutorial-v2-javascript-spa https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/tutorial-v2-javascript-spa.md
@@ -403,13 +403,13 @@ Create a new .js file named `authPopup.js`, which will contain your authenticati
### More information
-After a user selects the **Sign In** button for the first time, the `signIn` method calls `loginPopup` to sign in the user. This method opens a pop-up window with the *Microsoft identity platform endpoint* to prompt and validate the user's credentials. After a successful sign-in, the user is redirected back to the original *https://docsupdatetracker.net/index.html* page. A token is received, processed by `msal.js`, and the information contained in the token is cached. This token is known as the *ID token* and contains basic information about the user, such as the user display name. If you plan to use any data provided by this token for any purposes, you need to make sure this token is validated by your backend server to guarantee that the token was issued to a valid user for your application.
+After a user selects the **Sign In** button for the first time, the `signIn` method calls `loginPopup` to sign in the user. This method opens a pop-up window with the *Microsoft identity platform endpoint* to prompt and validate the user's credentials. After a successful sign-in, the user is redirected back to the original *https://docsupdatetracker.net/index.html* page. A token is received, processed by `msal.js`, and the information contained in the token is cached. This token is known as the *ID token* and contains basic information about the user, such as the user display name. If you plan to use any data provided by this token for any purposes, make sure this token is validated by your backend server to guarantee that the token was issued to a valid user for your application.
The SPA generated by this guide calls `acquireTokenSilent` and/or `acquireTokenPopup` to acquire an *access token* used to query the Microsoft Graph API for user profile info. If you need a sample that validates the ID token, take a look at [this](https://github.com/Azure-Samples/active-directory-javascript-singlepageapp-dotnet-webapi-v2 "GitHub active-directory-javascript-singlepageapp-dotnet-webapi-v2 sample") sample application in GitHub. The sample uses an ASP.NET web API for token validation.

#### Get a user token interactively
-After the initial sign-in, you do not want to ask users to reauthenticate every time they need to request a token to access a resource. So *acquireTokenSilent* should be used most of the time to acquire tokens. There are situations, however, where you need to force users to interact with Microsoft identity platform. Examples include:
+After the initial sign-in, you do not want to ask users to reauthenticate every time they need to request a token to access a resource. So *acquireTokenSilent* should be used most of the time to acquire tokens. There are situations, however, where you force users to interact with Microsoft identity platform. Examples include:
- Users need to reenter their credentials because the password has expired.
- Your application is requesting access to a resource, and you need the user's consent.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/tutorial-v2-windows-uwp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/tutorial-v2-windows-uwp.md
@@ -288,7 +288,7 @@ private async void SignOutButton_Click(object sender, RoutedEventArgs e)
} ```
-MSAL.NET uses asynchronous methods to acquire tokens or manipulate accounts. You need to support UI actions in the UI thread. This is the reason for the `Dispatcher.RunAsync` call and the precautions to call `ConfigureAwait(false)`.
+MSAL.NET uses asynchronous methods to acquire tokens or manipulate accounts. As such, support UI actions in the UI thread. This is the reason for the `Dispatcher.RunAsync` call and the precautions to call `ConfigureAwait(false)`.
#### More information about signing out<a name="more-information-on-sign-out"></a>
@@ -339,7 +339,7 @@ private async Task DisplayMessageAsync(string message)
## Register your application
-Now you need to register your application:
+Now, register your application:
1. Sign in to the <a href="https://portal.azure.com/" target="_blank">Azure portal<span class="docon docon-navigate-external x-hidden-focus"></span></a>.
1. If you have access to multiple tenants, use the **Directory + subscription** filter :::image type="icon" source="./media/common/portal-directory-subscription-filter.png" border="false"::: in the top menu to select the tenant in which you want to register an application.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/v2-conditional-access-dev-guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/v2-conditional-access-dev-guide.md
@@ -39,9 +39,9 @@ Knowledge of [single](quickstart-register-app.md) and [multi-tenant](howto-conve
### App types impacted
-In most common cases, Conditional Access does not change an app's behavior or requires any changes from the developer. Only in certain cases when an app indirectly or silently requests a token for a service, an app requires code changes to handle Conditional Access "challenges". It may be as simple as performing an interactive sign-in request.
+In most common cases, Conditional Access does not change an app's behavior or require any changes from the developer. Only in certain cases, when an app indirectly or silently requests a token for a service, does an app require code changes to handle Conditional Access challenges. It may be as simple as performing an interactive sign-in request.
-Specifically, the following scenarios require code to handle Conditional Access "challenges":
+Specifically, the following scenarios require code to handle Conditional Access challenges:
* Apps performing the on-behalf-of flow
* Apps accessing multiple services/resources
@@ -50,7 +50,7 @@ Specifically, the following scenarios require code to handle Conditional Access
Conditional Access policies can be applied to the app, but also can be applied to a web API your app accesses. To learn more about how to configure a Conditional Access policy, see [Quickstart: Require MFA for specific apps with Azure Active Directory Conditional Access](../authentication/tutorial-enable-azure-mfa.md).
-Depending on the scenario, an enterprise customer can apply and remove Conditional Access policies at any time. In order for your app to continue functioning when a new policy is applied, you need to implement the "challenge" handling. The following examples illustrate challenge handling.
+Depending on the scenario, an enterprise customer can apply and remove Conditional Access policies at any time. For your app to continue functioning when a new policy is applied, implement challenge handling. The following examples illustrate challenge handling.
### Conditional Access examples
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/v2-howto-app-gallery-listing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/v2-howto-app-gallery-listing.md
@@ -80,7 +80,7 @@ You can get a free test account with all the premium Azure AD features - 90 days
## Step 1 - Choose the right single sign-on standard for your app
-To list an application in the Azure AD app gallery, you need to implement at least one of the supported single sign-on options. To understand the single sign-on options, and how customers will configure them in Azure AD, see [SSO options](../manage-apps/sso-options.md).
+To list an application in the Azure AD app gallery, implement at least one of the supported single sign-on options. To understand the single sign-on options, and how customers will configure them in Azure AD, see [SSO options](../manage-apps/sso-options.md).
The following table compares the main standards: Open Authentication 2.0 (OAuth 2.0) with OpenID Connect (OIDC), Security Assertion Markup Language (SAML), and Web Services Federation (WS-Fed).
@@ -181,7 +181,7 @@ You will need an Azure AD tenant in order to test your app. To set up your devel
Alternatively, an Azure AD tenant comes with every Microsoft 365 subscription. To set up a free Microsoft 365 development environment, see [Join the Microsoft 365 Developer Program](/office/developer-program/microsoft-365-developer-program).
-Once you have a tenant, you need to test single-sign on and [provisioning](../app-provisioning/use-scim-to-provision-users-and-groups.md#step-4-integrate-your-scim-endpoint-with-the-azure-ad-scim-client).
+Once you have a tenant, test single-sign on and [provisioning](../app-provisioning/use-scim-to-provision-users-and-groups.md#step-4-integrate-your-scim-endpoint-with-the-azure-ad-scim-client).
+**For OIDC or OAuth applications**, [Register your application](quickstart-register-app.md) as a multi-tenant application. Select the Accounts in any organizational directory and personal Microsoft accounts option in Supported Account types.
active-directory https://docs.microsoft.com/en-us/azure/active-directory/develop/v2-oauth2-auth-code-flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/v2-oauth2-auth-code-flow.md
@@ -40,7 +40,7 @@ If you attempt to use the authorization code flow and see this error:
`access to XMLHttpRequest at 'https://login.microsoftonline.com/common/v2.0/oauth2/token' from origin 'yourApp.com' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.`
-Then you need to visit your app registration and update the redirect URI for your app to type `spa`.
+Then, visit your app registration and update the redirect URI for your app to type `spa`.
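A SPA redirect URI implies the authorization code flow with PKCE. As a hedged sketch (the tenant, client ID, and redirect URI below are placeholders, not real values), here is how the authorization request that starts the flow can be built:

```python
import base64
import hashlib
import secrets
from urllib.parse import urlencode

# Placeholder values -- substitute your app registration's client ID and the
# redirect URI you registered with type `spa`.
TENANT = "common"
CLIENT_ID = "11111111-2222-3333-4444-555555555555"
REDIRECT_URI = "https://yourApp.com/auth/callback"

# PKCE: generate a one-time code verifier and send its SHA-256 hash
# (base64url, unpadded) as the code challenge.
verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
challenge = (
    base64.urlsafe_b64encode(hashlib.sha256(verifier.encode()).digest())
    .rstrip(b"=")
    .decode()
)

auth_url = "https://login.microsoftonline.com/%s/oauth2/v2.0/authorize?%s" % (
    TENANT,
    urlencode(
        {
            "client_id": CLIENT_ID,
            "response_type": "code",
            "redirect_uri": REDIRECT_URI,
            "scope": "openid profile offline_access",
            "code_challenge": challenge,
            "code_challenge_method": "S256",
        }
    ),
)
print(auth_url)
```

The app later redeems the returned code at the token endpoint, sending the original `verifier` as `code_verifier`.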
## Request an authorization code
active-directory https://docs.microsoft.com/en-us/azure/active-directory/external-identities/whats-new-docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/external-identities/whats-new-docs.md
@@ -6,8 +6,8 @@
--++
active-directory https://docs.microsoft.com/en-us/azure/active-directory/roles/permissions-reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/permissions-reference.md
@@ -530,6 +530,7 @@ Can create and manage all aspects of app registrations and enterprise apps.
> | microsoft.directory/policies/applicationConfiguration/owners/read | Read policies.applicationConfiguration property in Azure Active Directory. |
> | microsoft.directory/policies/applicationConfiguration/owners/update | Update policies.applicationConfiguration property in Azure Active Directory. |
> | microsoft.directory/policies/applicationConfiguration/policyAppliedTo/read | Read policies.applicationConfiguration property in Azure Active Directory. |
+> | microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs. |
> | microsoft.directory/servicePrincipals/appRoleAssignedTo/update | Update servicePrincipals.appRoleAssignedTo property in Azure Active Directory. |
> | microsoft.directory/servicePrincipals/appRoleAssignments/update | Update servicePrincipals.appRoleAssignments property in Azure Active Directory. |
> | microsoft.directory/servicePrincipals/audience/update | Update servicePrincipals.audience property in Azure Active Directory. |
@@ -692,6 +693,7 @@ Can create and manage all aspects of app registrations and enterprise apps excep
> | microsoft.directory/policies/applicationConfiguration/owners/read | Read policies.applicationConfiguration property in Azure Active Directory. |
> | microsoft.directory/policies/applicationConfiguration/owners/update | Update policies.applicationConfiguration property in Azure Active Directory. |
> | microsoft.directory/policies/applicationConfiguration/policyAppliedTo/read | Read policies.applicationConfiguration property in Azure Active Directory. |
+> | microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs. |
> | microsoft.directory/servicePrincipals/appRoleAssignedTo/update | Update servicePrincipals.appRoleAssignedTo property in Azure Active Directory. |
> | microsoft.directory/servicePrincipals/appRoleAssignments/update | Update servicePrincipals.appRoleAssignments property in Azure Active Directory. |
> | microsoft.directory/servicePrincipals/audience/update | Update servicePrincipals.audience property in Azure Active Directory. |
@@ -761,6 +763,7 @@ Can manage all aspects of Azure AD and Microsoft services that use Azure AD iden
> | microsoft.directory/oAuth2PermissionGrants/allProperties/allTasks | Create and delete oAuth2PermissionGrants, and read and update all properties in Azure Active Directory. |
> | microsoft.directory/organization/allProperties/allTasks | Create and delete organization, and read and update all properties in Azure Active Directory. |
> | microsoft.directory/policies/allProperties/allTasks | Create and delete policies, and read and update all properties in Azure Active Directory. |
+> | microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs. |
> | microsoft.directory/roleAssignments/allProperties/allTasks | Create and delete roleAssignments, and read and update all properties in Azure Active Directory. |
> | microsoft.directory/roleDefinitions/allProperties/allTasks | Create and delete roleDefinitions, and read and update all properties in Azure Active Directory. |
> | microsoft.directory/scopedRoleMemberships/allProperties/allTasks | Create and delete scopedRoleMemberships, and read and update all properties in Azure Active Directory. |
@@ -1138,6 +1141,7 @@ Can read everything that a Global Administrator can, but not edit anything.
> | microsoft.directory/organization/basic/read | Read basic properties on organization in Azure Active Directory. |
> | microsoft.directory/organization/trustedCAsForPasswordlessAuth/read | Read organization.trustedCAsForPasswordlessAuth property in Azure Active Directory. |
> | microsoft.directory/policies/standard/read | Read standard policies in Azure Active Directory. |
+> | microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs. |
> | microsoft.directory/roleAssignments/basic/read | Read basic properties on roleAssignments in Azure Active Directory. |
> | microsoft.directory/roleDefinitions/basic/read | Read basic properties on roleDefinitions in Azure Active Directory. |
> | microsoft.directory/servicePrincipals/appRoleAssignedTo/read | Read servicePrincipals.appRoleAssignedTo property in Azure Active Directory. |
@@ -1678,6 +1682,7 @@ Can read sign-in and audit reports.
> | Actions | Description |
> | | |
> | microsoft.directory/auditLogs/allProperties/read | Read all properties (including privileged properties) on auditLogs in Azure Active Directory. |
+> | microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs. |
> | microsoft.directory/signInReports/allProperties/read | Read all properties (including privileged properties) on signInReports in Azure Active Directory. |
> | microsoft.azure.serviceHealth/allEntities/allTasks | Read and configure Azure Service Health. |
> | microsoft.office365.usageReports/allEntities/read | Read Office 365 usage reports. |
@@ -1738,6 +1743,7 @@ Can read security information and reports,and manage configuration in Azure AD a
> | microsoft.directory/policies/owners/update | Update policies.owners property in Azure Active Directory. |
> | microsoft.directory/policies/tenantDefault/update | Update policies.tenantDefault property in Azure Active Directory. |
> | microsoft.directory/privilegedIdentityManagement/allProperties/read | Read all resources in microsoft.aad.privilegedIdentityManagement. |
+> | microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs. |
> | microsoft.directory/servicePrincipals/policies/update | Update servicePrincipals.policies property in Azure Active Directory. |
> | microsoft.directory/signInReports/allProperties/read | Read all properties (including privileged properties) on signInReports in Azure Active Directory. |
> | microsoft.office365.protectionCenter/allEntities/read | Read all aspects of Office 365 Protection Center. |
@@ -1787,6 +1793,7 @@ Can read security information and reports in Azure AD and Microsoft 365.
> | microsoft.directory/signInReports/allProperties/read | Read all properties (including privileged properties) on signInReports in Azure Active Directory. |
> | microsoft.aad.identityProtection/allEntities/read | Read all resources in microsoft.aad.identityProtection. |
> | microsoft.aad.privilegedIdentityManagement/allEntities/read | Read all resources in microsoft.aad.privilegedIdentityManagement. |
+> | microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs. |
> | microsoft.azure.serviceHealth/allEntities/allTasks | Read and configure Azure Service Health. |
> | microsoft.office365.webPortal/allEntities/basic/read | Read basic properties on all resources in microsoft.office365.webPortal. |
> | microsoft.office365.protectionCenter/allEntities/read | Read all aspects of Office 365 Protection Center. |
active-directory https://docs.microsoft.com/en-us/azure/active-directory/saas-apps/globesmart-tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/globesmart-tutorial.md
@@ -0,0 +1,180 @@
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with GlobeSmart | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and GlobeSmart.
++++++++ Last updated : 02/05/2021++++
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with GlobeSmart
+
+In this tutorial, you'll learn how to integrate GlobeSmart with Azure Active Directory (Azure AD). When you integrate GlobeSmart with Azure AD, you can:
+
+* Control in Azure AD who has access to GlobeSmart.
+* Enable your users to be automatically signed-in to GlobeSmart with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* GlobeSmart single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* GlobeSmart supports **SP and IDP** initiated SSO
+* GlobeSmart supports **Just In Time** user provisioning
+
+## Adding GlobeSmart from the gallery
+
+To configure the integration of GlobeSmart into Azure AD, you need to add GlobeSmart from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **GlobeSmart** in the search box.
+1. Select **GlobeSmart** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
++
+## Configure and test Azure AD SSO for GlobeSmart
+
+Configure and test Azure AD SSO with GlobeSmart using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in GlobeSmart.
+
+To configure and test Azure AD SSO with GlobeSmart, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure GlobeSmart SSO](#configure-globesmart-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create GlobeSmart test user](#create-globesmart-test-user)** - to have a counterpart of B.Simon in GlobeSmart that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **GlobeSmart** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+
+ a. In the **Identifier** box, type a value using the following pattern:
+
+ | Environment | URL |
+ |-|-|
+ | Sandbox | `urn:auth0:aperianglobal-staging:<INSTANCE_NAME>`|
+ | Production | `urn:auth0:aperianglobal-production:<INSTANCE_NAME>`|
+ | | |
++
+ b. In the **Reply URL** text box, type a URL using the following pattern:
+
+ | Environment | URL |
+ |-|-|
+ | Sandbox | `https://aperianglobal-staging.auth0.com/login/callback?connection=<INSTANCE_NAME>`|
+ | Production | `https://auth.aperianglobal.com/login/callback?connection=<INSTANCE_NAME>`|
+ | | |
+
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type a URL using the following pattern:
+
+ | Environment | URL |
+ |-|-|
+ | Sandbox | `https://staging.aperianglobal.com?sp=<INSTANCE_NAME>`|
+ | Production | `https://globesmart.aperianglobal.com?sp=<INSTANCE_NAME>`|
+ | | |
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [GlobeSmart Client support team](mailto:support@aperianglobal.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. The GlobeSmart application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![image](common/default-attributes.png)
+
+1. In addition to the above, the GlobeSmart application expects a few more attributes to be passed back in the SAML response; these are shown below. These attributes are also pre-populated, but you can review and adjust them to fit your requirements.
+
+ | Name | Source attribute|
+ | - | |
+ | firstName | user.givenname |
+ | lastName | user.surname |
+ | user_id | user.userprincipalname |
+ | email | user.mail |
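The table above reads as a claim-to-source lookup. A small illustrative sketch (the sample directory user below is invented for the example):

```python
# Claim names GlobeSmart expects, mapped to their Azure AD source attributes
# (mirrors the table above).
claim_mappings = {
    "firstName": "user.givenname",
    "lastName": "user.surname",
    "user_id": "user.userprincipalname",
    "email": "user.mail",
}

# Invented sample directory user, for illustration only.
directory_user = {
    "user.givenname": "Britta",
    "user.surname": "Simon",
    "user.userprincipalname": "B.Simon@contoso.com",
    "user.mail": "B.Simon@contoso.com",
}

# The SAML response carries one attribute per mapping.
saml_attributes = {claim: directory_user[src] for claim, src in claim_mappings.items()}
print(saml_attributes)
```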
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/metadataxml.png)
+
+1. On the **Set up GlobeSmart** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to GlobeSmart.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **GlobeSmart**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure GlobeSmart SSO
+
+To configure single sign-on on the **GlobeSmart** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [GlobeSmart support team](mailto:support@aperianglobal.com). They use them to configure the SAML SSO connection properly on both sides.
+
+### Create GlobeSmart test user
+
+In this section, a user called Britta Simon is created in GlobeSmart. GlobeSmart supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in GlobeSmart, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click **Test this application** in the Azure portal. This redirects to the GlobeSmart Sign-on URL, where you can initiate the login flow.
+
+* Go to the GlobeSmart Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click **Test this application** in the Azure portal, and you should be automatically signed in to the GlobeSmart instance for which you set up SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the GlobeSmart tile in My Apps, if the app is configured in SP mode, you are redirected to the application sign-on page to initiate the login flow; if it is configured in IDP mode, you should be automatically signed in to the GlobeSmart instance for which you set up SSO. For more information about My Apps, see [Introduction to the My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
++
+## Next steps
+
+Once you configure GlobeSmart, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
++
active-directory https://docs.microsoft.com/en-us/azure/active-directory/saas-apps/poolparty-semantic-suite-tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/poolparty-semantic-suite-tutorial.md
@@ -0,0 +1,140 @@
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with PoolParty Semantic Suite | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and PoolParty Semantic Suite.
++++++++ Last updated : 02/05/2021++++
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with PoolParty Semantic Suite
+
+In this tutorial, you'll learn how to integrate PoolParty Semantic Suite with Azure Active Directory (Azure AD). When you integrate PoolParty Semantic Suite with Azure AD, you can:
+
+* Control in Azure AD who has access to PoolParty Semantic Suite.
+* Enable your users to be automatically signed-in to PoolParty Semantic Suite with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* PoolParty Semantic Suite single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* PoolParty Semantic Suite supports **SP** initiated SSO
+
+## Adding PoolParty Semantic Suite from the gallery
+
+To configure the integration of PoolParty Semantic Suite into Azure AD, you need to add PoolParty Semantic Suite from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **PoolParty Semantic Suite** in the search box.
+1. Select **PoolParty Semantic Suite** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
++
+## Configure and test Azure AD SSO for PoolParty Semantic Suite
+
+Configure and test Azure AD SSO with PoolParty Semantic Suite using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in PoolParty Semantic Suite.
+
+To configure and test Azure AD SSO with PoolParty Semantic Suite, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure PoolParty Semantic Suite SSO](#configure-poolparty-semantic-suite-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create PoolParty Semantic Suite test user](#create-poolparty-semantic-suite-test-user)** - to have a counterpart of B.Simon in PoolParty Semantic Suite that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **PoolParty Semantic Suite** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, enter the values for the following fields:
+
+ a. In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://<CustomerName>.poolparty.biz/PoolParty/`
+
+ b. In the **Identifier** box, type a URL using the following pattern:
+ `https://<CustomerName>.poolparty.biz/<ID>`
+
+ c. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://<CustomerName>.poolparty.biz/<ID>`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Sign-On URL, Identifier and Reply URL. Contact [PoolParty Semantic Suite Client support team](mailto:support@poolparty.biz) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/metadataxml.png)
+
+1. On the **Set up PoolParty Semantic Suite** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to PoolParty Semantic Suite.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **PoolParty Semantic Suite**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure PoolParty Semantic Suite SSO
+
+To configure single sign-on on the **PoolParty Semantic Suite** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [PoolParty Semantic Suite support team](mailto:support@poolparty.biz). They use them to configure the SAML SSO connection properly on both sides.
+
+### Create PoolParty Semantic Suite test user
+
+In this section, you create a user called Britta Simon in PoolParty Semantic Suite. Work with [PoolParty Semantic Suite support team](mailto:support@poolparty.biz) to add the users in the PoolParty Semantic Suite platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click **Test this application** in the Azure portal. This redirects to the PoolParty Semantic Suite Sign-on URL, where you can initiate the login flow.
+
+* Go to the PoolParty Semantic Suite Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the PoolParty Semantic Suite tile in My Apps, you are redirected to the PoolParty Semantic Suite Sign-on URL. For more information about My Apps, see [Introduction to the My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
++
+## Next steps
+
+Once you configure PoolParty Semantic Suite, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
api-management https://docs.microsoft.com/en-us/azure/api-management/plan-manage-costs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/api-management/plan-manage-costs.md
@@ -85,10 +85,10 @@ You can also [export your cost data](../cost-management-billing/costs/tutorial-e
### Choose tier
-Review the [Feature-based comparison of the Azure API Management tiers](api-management-features.md) to help decide which service tier may be appropriate for your scenarios. The different service tiers support combinations of features and capabilities designed for various use cases, with different costs. [Upgrade](upgrade-and-scale.md) to a different service tier at any time.
+Review the [Feature-based comparison of the Azure API Management tiers](api-management-features.md) to help decide which service tier may be appropriate for your scenarios. The different service tiers support combinations of features and capabilities designed for various use cases, with different costs.
* The **Consumption** service tier provides a lightweight, serverless option that incurs no fixed costs. You are billed based on the number of API calls to the service above a certain threshold. Capacity also scales automatically based on the load on the service.
-* Other API Management tiers incur monthly costs, and provide greater throughput and richer feature sets for evaluation and production workloads.
+* The **Developer**, **Basic**, **Standard**, and **Premium** API Management tiers incur monthly costs, and provide greater throughput and richer feature sets for evaluation and production workloads. [Upgrade](upgrade-and-scale.md) to a different service tier at any time.
### Scale using capacity units
@@ -107,4 +107,4 @@ As you add or remove units, capacity and cost scale proportionally. For example,
- Learn about how to [prevent unexpected costs](../cost-management-billing/cost-management-billing-overview.md?WT.mc_id=costmanagementcontent_docsacmhorizontal_-inproduct-learn).
- Take the [Cost Management](/learn/paths/control-spending-manage-bills?WT.mc_id=costmanagementcontent_docsacmhorizontal_-inproduct-learn) guided learning course.
- Learn about API Management [capacity](api-management-capacity.md).
-- See steps to scale and upgrade API Management using the [Azure portal](upgrade-and-scale.md), and learn about [autoscaling](api-management-howto-autoscale.md).
+- See steps to scale and upgrade API Management using the [Azure portal](upgrade-and-scale.md), and learn about [autoscaling](api-management-howto-autoscale.md).
app-service https://docs.microsoft.com/en-us/azure/app-service/app-service-hybrid-connections https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/app-service-hybrid-connections.md
@@ -5,7 +5,7 @@
ms.assetid: 66774bde-13f5-45d0-9a70-4e9536a4f619 Previously updated : 02/04/2020 Last updated : 02/05/2020
@@ -197,13 +197,20 @@ Anyone with `Reader` access to the Relay will be able to _see_ the Hybrid Connec
## Troubleshooting ##
-The status of "Connected" means that at least one HCM is configured with that Hybrid Connection, and is able to reach Azure. If the status for your Hybrid Connection does not say **Connected**, your Hybrid Connection is not configured on any HCM that has access to Azure.
+The status of "Connected" means that at least one HCM is configured with that Hybrid Connection, and is able to reach Azure. If the status for your Hybrid Connection does not say **Connected**, your Hybrid Connection is not configured on any HCM that has access to Azure. When your HCM shows **Not Connected**, there are a few things to check:
-The primary reason that clients cannot connect to their endpoint is because the endpoint was specified by using an IP address instead of a DNS name. If your app cannot reach the desired endpoint and you used an IP address, switch to using a DNS name that is valid on the host where the HCM is running. Also check that the DNS name resolves properly on the host where the HCM is running. Confirm that there is connectivity from the host where the HCM is running to the Hybrid Connection endpoint.
+* Does your host have outbound access to Azure on port 443? You can test from your HCM host by using the PowerShell command *Test-NetConnection Destination -Port 443*.
+* Is your HCM potentially in a bad state? Try restarting the "Azure Hybrid Connection Manager Service" local service.
-In App Service, the **tcpping** command line tool can be invoked from the Advanced Tools (Kudu) console. This tool can tell you if you have access to a TCP endpoint, but it does not tell you if you have access to a Hybrid Connection endpoint. When you use the tool in the console against a Hybrid Connection endpoint, you are only confirming that it uses a host:port combination.
+If your status says **Connected** but your app cannot reach your endpoint, then:
-If you have a command line client for your endpoint, you can test connectivity from the app console. For example, you can test access to web server endpoints by using curl.
+* Make sure you are using a DNS name in your Hybrid Connection. If you use an IP address, the required client DNS lookup may not happen. If the client running in your web app does not do a DNS lookup, the Hybrid Connection will not work.
+* Check that the DNS name used in your Hybrid Connection can resolve from the HCM host. Check the resolution using *nslookup EndpointDNSname*, where EndpointDNSname is an exact match to what is used in your Hybrid Connection definition.
+* Test access from your HCM host to your endpoint using the PowerShell command *Test-NetConnection EndpointDNSname -Port Port*. If you cannot reach the endpoint from your HCM host, check firewalls between the two hosts, including any host-based firewalls on the destination host.
+
+In App Service, the **tcpping** command-line tool can be invoked from the Advanced Tools (Kudu) console. This tool can tell you if you have access to a TCP endpoint, but it does not tell you if you have access to a Hybrid Connection endpoint. When you use the tool in the console against a Hybrid Connection endpoint, you are only confirming that it uses a host:port combination.
+
+If you have a command-line client for your endpoint, you can test connectivity from the app console. For example, you can test access to web server endpoints by using curl.
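When PowerShell is not available on a host, the same DNS-then-TCP checks can be approximated with Python's standard library. This is an illustrative sketch only; the `check_endpoint` helper and its return shape are invented here and are not part of any Azure tooling:

```python
import socket

def check_endpoint(host: str, port: int, timeout: float = 5.0) -> dict:
    """Roughly mirror the nslookup + Test-NetConnection checks:
    first resolve the DNS name, then attempt a TCP connect."""
    result = {"resolved": False, "addresses": [], "tcp_connect": False}
    try:
        infos = socket.getaddrinfo(host, port, type=socket.SOCK_STREAM)
        result["resolved"] = True
        result["addresses"] = sorted({info[4][0] for info in infos})
    except socket.gaierror:
        return result  # DNS resolution failed; no point trying TCP
    try:
        with socket.create_connection((host, port), timeout=timeout):
            result["tcp_connect"] = True
    except OSError:
        pass  # resolved but unreachable: check firewalls between the hosts
    return result
```

A `resolved` of `False` points at the DNS issues described above, while `resolved` without `tcp_connect` suggests a firewall between the HCM host and the endpoint.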
<!--Image references-->
app-service https://docs.microsoft.com/en-us/azure/app-service/networking-features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/networking-features.md
@@ -110,6 +110,9 @@ The IP-based access restrictions feature helps when you want to restrict the IP
To learn how to enable this feature, see [Configuring access restrictions][iprestrictions].
+> [!NOTE]
+> IP-based access restriction rules only handle virtual network address ranges when your app is in an App Service Environment. If your app is in the multitenant service, you need to use [service endpoints](../virtual-network/virtual-network-service-endpoints-overview.md) to restrict traffic to select subnets in your virtual network.
+ #### Access restriction rules based on service endpoints Service endpoints allow you to lock down *inbound* access to your app so that the source address must come from a set of subnets that you select. This feature works together with IP access restrictions. Service endpoints aren't compatible with remote debugging. If you want to use remote debugging with your app, your client can't be in a subnet that has service endpoints enabled. The process for setting service endpoints is similar to the process for setting IP access restrictions. You can build an allow/deny list of access rules that includes public addresses and subnets in your virtual networks.
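Conceptually, an allow/deny list of access restriction rules is evaluated in order against the source address of each request. The sketch below illustrates that evaluation model only; the rule data is hypothetical, and this is not App Service's actual implementation:

```python
from ipaddress import ip_address, ip_network

# Ordered (cidr, action) rules, most specific first. Hypothetical data:
# one allowed subnet (e.g. one with a service endpoint) plus a deny-all.
RULES = [
    ("10.2.0.0/24", "Allow"),
    ("0.0.0.0/0", "Deny"),
]

def evaluate(source_ip: str) -> str:
    """Return the action of the first rule whose range contains source_ip."""
    addr = ip_address(source_ip)
    for cidr, action in RULES:
        if addr in ip_network(cidr):
            return action
    return "Allow"  # no rules configured: all traffic is allowed
```

Under these assumed rules, `evaluate("10.2.0.7")` returns `"Allow"` and any other public address falls through to the deny-all entry.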
app-service https://docs.microsoft.com/en-us/azure/app-service/overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/overview.md
@@ -40,19 +40,19 @@ App Service can also host web apps natively on Linux for supported application s
### Built-in languages and frameworks
-App Service on Linux supports a number of language specific built-in images. Just deploy your code. Supported languages include: Node.js, Java (JRE 8 & JRE 11), PHP, Python, .NET Core and Ruby. Run [`az webapp list-runtimes --linux`](/cli/azure/webapp#az-webapp-list-runtimes) to view the latest languages and supported versions. If the runtime your application requires is not supported in the built-in images, you can deploy it with a custom container.
+App Service on Linux supports a number of language-specific built-in images. Just deploy your code. Supported languages include: Node.js, Java (JRE 8 & JRE 11), PHP, Python, .NET Core, and Ruby. Run [`az webapp list-runtimes --linux`](/cli/azure/webapp#az-webapp-list-runtimes) to view the latest languages and supported versions. If the runtime your application requires is not supported in the built-in images, you can deploy it with a custom container.
Outdated runtimes are periodically removed from the Web Apps Create and Configuration blades in the Portal. These runtimes are hidden from the Portal when they are deprecated by the maintaining organization or found to have significant vulnerabilities. These options are hidden to guide customers to the latest runtimes where they will be the most successful. When an outdated runtime is hidden from the Portal, any of your existing sites using that version will continue to run. If a runtime is fully removed from the App Service platform, your Azure subscription owner(s) will receive an email notice before the removal.
-If you need to create another web app with an outdated runtime version that is no longer shown on the Portal see the language configuration guides for instructions on how to get the runtime version of your site. You can use the Azure CLI to create another site with the same runtime. Alternatively, you can use the **Export Template** button on the web app blade in the Portal to export an ARM template of the site. You can re-use this template to deploy a new site with the same runtime and configuration.
+If you need to create another web app with an outdated runtime version that is no longer shown on the Portal, see the language configuration guides for instructions on how to get the runtime version of your site. You can use the Azure CLI to create another site with the same runtime. Alternatively, you can use the **Export Template** button on the web app blade in the Portal to export an ARM template of the site. You can reuse this template to deploy a new site with the same runtime and configuration.
### Limitations - App Service on Linux is not supported on [Shared](https://azure.microsoft.com/pricing/details/app-service/plans/) pricing tier. - You can't mix Windows and Linux apps in the same App Service plan. -- Within the same resource group, you can't mix Windows and Linux apps in the same region.
+- Historically, you couldn't mix Windows and Linux apps in the same resource group. However, all resource groups created on or after January 21, 2021 do support this scenario. For resource groups created before January 21, 2021, the ability to add mixed-platform deployments will be rolled out across Azure regions (including National cloud regions) soon.
- The Azure portal shows only features that currently work for Linux apps. As features are enabled, they're activated on the portal. - When deployed to built-in images, your code and content are allocated a storage volume for web content, backed by Azure Storage. The disk latency of this volume is higher and more variable than the latency of the container filesystem. Apps that require heavy read-only access to content files may benefit from the custom container option, which places files in the container filesystem instead of on the content volume.
application-gateway https://docs.microsoft.com/en-us/azure/application-gateway/application-gateway-backend-health-troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/application-gateway-backend-health-troubleshooting.md
@@ -21,7 +21,7 @@ successfully, Application Gateway resumes forwarding the requests.
### How to check backend health To check the health of your backend pool, you can use the
-**Backend Health** page on the Azure portal. Or, you can use [Azure PowerShell](/powershell/module/az.network/get-azapplicationgatewaybackendhealth?view=azps-2.6.0), [CLI](/cli/azure/network/application-gateway?view=azure-cli-latest#az-network-application-gateway-show-backend-health), or [REST API](/rest/api/application-gateway/applicationgateways/backendhealth).
+**Backend Health** page on the Azure portal. Or, you can use [Azure PowerShell](/powershell/module/az.network/get-azapplicationgatewaybackendhealth), [CLI](/cli/azure/network/application-gateway#az-network-application-gateway-show-backend-health), or [REST API](/rest/api/application-gateway/applicationgateways/backendhealth).
The status retrieved by any of these methods can be any one of the following:
@@ -133,9 +133,9 @@ this message is displayed, it suggests that Application Gateway couldn't success
1. If the domain is private or internal, try to resolve it from a VM in the same virtual network. If you can resolve it, restart Application Gateway and check again. To restart Application Gateway, you need to
- [stop](/powershell/module/azurerm.network/stop-azurermapplicationgateway?view=azurermps-6.13.0)
+ [stop](/powershell/module/azurerm.network/stop-azurermapplicationgateway)
and
- [start](/powershell/module/azurerm.network/start-azurermapplicationgateway?view=azurermps-6.13.0)
+ [start](/powershell/module/azurerm.network/start-azurermapplicationgateway)
by using the PowerShell commands described in these linked resources. #### TCP connect error
@@ -449,4 +449,4 @@ This behavior can occur for one or more of the following reasons:
Next steps -
-Learn more about [Application Gateway diagnostics and logging](./application-gateway-diagnostics.md).
+Learn more about [Application Gateway diagnostics and logging](./application-gateway-diagnostics.md).
application-gateway https://docs.microsoft.com/en-us/azure/application-gateway/custom-error https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/custom-error.md
@@ -76,8 +76,8 @@ $listener01 = Get-AzApplicationGatewayHttpListener -Name <listener-name> -Applic
$updatedlistener = Add-AzApplicationGatewayHttpListenerCustomError -HttpListener $listener01 -StatusCode HttpStatus502 -CustomErrorPageUrl "http://<website-url>" ```
-For more information, see [Add-AzApplicationGatewayCustomError](/powershell/module/az.network/add-azapplicationgatewaycustomerror?view=azps-1.2.0) and [Add-AzApplicationGatewayHttpListenerCustomError](/powershell/module/az.network/add-azapplicationgatewayhttplistenercustomerror?view=azps-1.3.0).
+For more information, see [Add-AzApplicationGatewayCustomError](/powershell/module/az.network/add-azapplicationgatewaycustomerror) and [Add-AzApplicationGatewayHttpListenerCustomError](/powershell/module/az.network/add-azapplicationgatewayhttplistenercustomerror).
## Next steps
-For information about Application Gateway diagnostics, see [Back-end health, diagnostic logs, and metrics for Application Gateway](application-gateway-diagnostics.md).
+For information about Application Gateway diagnostics, see [Back-end health, diagnostic logs, and metrics for Application Gateway](application-gateway-diagnostics.md).
application-gateway https://docs.microsoft.com/en-us/azure/application-gateway/ingress-controller-install-new https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/ingress-controller-install-new.md
@@ -27,7 +27,7 @@ Alternatively, launch Cloud Shell from Azure portal using the following icon:
Your [Azure Cloud Shell](https://shell.azure.com/) already has all necessary tools. Should you choose to use another environment, please ensure the following command-line tools are installed:
-* `az` - Azure CLI: [installation instructions](/cli/azure/install-azure-cli?view=azure-cli-latest)
+* `az` - Azure CLI: [installation instructions](/cli/azure/install-azure-cli)
* `kubectl` - Kubernetes command-line tool: [installation instructions](https://kubernetes.io/docs/tasks/tools/install-kubectl) * `helm` - Kubernetes package manager * `jq` - command-line JSON processor: [installation instructions](https://stedolan.github.io/jq/download/)
@@ -350,4 +350,4 @@ kubectl apply -f aspnetapp.yaml
## Other Examples This [how-to guide](ingress-controller-expose-service-over-http-https.md) contains more examples on how to expose an AKS
-service via HTTP or HTTPS, to the Internet with Application Gateway.
+service via HTTP or HTTPS, to the Internet with Application Gateway.
application-gateway https://docs.microsoft.com/en-us/azure/application-gateway/migrate-v1-v2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/migrate-v1-v2.md
@@ -120,7 +120,7 @@ To run the script:
$trustedCert = New-AzApplicationGatewayTrustedRootCertificate -Name "trustedCert1" -CertificateFile $certFilePath ```
- To create a list of PSApplicationGatewayTrustedRootCertificate objects, see [New-AzApplicationGatewayTrustedRootCertificate](/powershell/module/Az.Network/New-AzApplicationGatewayTrustedRootCertificate?view=azps-2.1.0&viewFallbackFrom=azps-2.0.0).
+ To create a list of PSApplicationGatewayTrustedRootCertificate objects, see [New-AzApplicationGatewayTrustedRootCertificate](/powershell/module/Az.Network/New-AzApplicationGatewayTrustedRootCertificate).
* **privateIpAddress: [String]: Optional**. A specific private IP address that you want to associate to your new v2 gateway. This must be from the same VNet that you allocate for your new v2 gateway. If this isn't specified, the script allocates a private IP address for your v2 gateway. * **publicIpResourceId: [String]: Optional**. The resourceId of existing public IP address (standard SKU) resource in your subscription that you want to allocate to the new v2 gateway. If this isn't specified, the script allocates a new public IP in the same resource group. The name is the v2 gateway's name with *-IP* appended. * **validateMigration: [switch]: Optional**. Use this parameter if you want the script to do some basic configuration comparison validations after the v2 gateway creation and the configuration copy. By default, no validation is done.
@@ -197,4 +197,4 @@ You can contact Azure Support under the topic "Configuration and Setup/Migrate t
## Next steps
-[Learn about Application Gateway v2](application-gateway-autoscaling-zone-redundant.md)
+[Learn about Application Gateway v2](application-gateway-autoscaling-zone-redundant.md)
application-gateway https://docs.microsoft.com/en-us/azure/application-gateway/scripts/create-vmss-cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/scripts/create-vmss-cli.md
@@ -46,8 +46,8 @@ This script uses the following commands to create the deployment. Each item in t
| [az group create](/cli/azure/group) | Creates a resource group in which all resources are stored. | | [az network vnet create](/cli/azure/network/vnet) | Creates a virtual network. | | [az network vnet subnet create](/cli/azure/network/vnet/subnet#az-network-vnet-subnet-create) | Creates a subnet in a virtual network. |
-| [az network public-ip create](/cli/azure/network/public-ip?view=azure-cli-latest) | Creates the public IP address for the application gateway. |
-| [az network application-gateway create](/cli/azure/network/application-gateway?view=azure-cli-latest) | Create an application gateway. |
+| [az network public-ip create](/cli/azure/network/public-ip) | Creates the public IP address for the application gateway. |
+| [az network application-gateway create](/cli/azure/network/application-gateway) | Creates an application gateway. |
| [az vmss create](/cli/azure/vmss) | Creates a virtual machine scale set. | | [az network public-ip show](/cli/azure/network/public-ip) | Gets the public IP address of the application gateway. |
@@ -55,4 +55,4 @@ This script uses the following commands to create the deployment. Each item in t
For more information on the Azure CLI, see [Azure CLI documentation](/cli/azure/overview).
-Additional application gateway CLI script samples can be found in the [Azure Windows VM documentation](../cli-samples.md).
+Additional application gateway CLI script samples can be found in the [Azure Windows VM documentation](../cli-samples.md).
application-gateway https://docs.microsoft.com/en-us/azure/application-gateway/scripts/create-vmss-waf-cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/scripts/create-vmss-waf-cli.md
@@ -46,8 +46,8 @@ This script uses the following commands to create the deployment. Each item in t
| [az group create](/cli/azure/group#az-group-create) | Creates a resource group in which all resources are stored. | | [az network vnet create](/cli/azure/network/vnet#az-network-vnet-create) | Creates a virtual network. | | [az network vnet subnet create](/cli/azure/network/vnet/subnet#az-network-vnet-subnet-create) | Creates a subnet in a virtual network. |
-| [az network public-ip create](/cli/azure/network/public-ip?view=azure-cli-latest) | Creates the public IP address for the application gateway. |
-| [az network application-gateway create](/cli/azure/network/application-gateway?view=azure-cli-latest) | Create an application gateway. |
+| [az network public-ip create](/cli/azure/network/public-ip) | Creates the public IP address for the application gateway. |
+| [az network application-gateway create](/cli/azure/network/application-gateway) | Creates an application gateway. |
| [az vmss create](/cli/azure/vmss#az-vmss-create) | Creates a virtual machine scale set. | | [az storage account create](/cli/azure/storage/account#az-storage-account-create) | Creates a storage account. | | [az monitor diagnostic-settings create](/cli/azure/monitor/diagnostic-settings#az-monitor-diagnostic-settings-create) | Creates a diagnostic setting. |
@@ -57,4 +57,4 @@ This script uses the following commands to create the deployment. Each item in t
For more information on the Azure CLI, see [Azure CLI documentation](/cli/azure/overview).
-Additional application gateway CLI script samples can be found in the [Azure application gateway documentation](../cli-samples.md).
+Additional application gateway CLI script samples can be found in the [Azure application gateway documentation](../cli-samples.md).
application-gateway https://docs.microsoft.com/en-us/azure/application-gateway/tutorial-ingress-controller-add-on-existing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/tutorial-ingress-controller-add-on-existing.md
@@ -62,7 +62,7 @@ In the following example, you'll be deploying a new AKS cluster named *myCluster
az aks create -n myCluster -g myResourceGroup --network-plugin azure --enable-managed-identity ```
-To configure additional parameters for the `az aks create` command, visit references [here](/cli/azure/aks?view=azure-cli-latest#az-aks-create).
+To configure additional parameters for the `az aks create` command, visit references [here](/cli/azure/aks#az-aks-create).
## Deploy a new Application Gateway
@@ -137,4 +137,4 @@ az group delete --name myResourceGroup
## Next steps > [!div class="nextstepaction"]
-> [Learn more about disabling the AGIC add-on](./ingress-controller-disable-addon.md)
+> [Learn more about disabling the AGIC add-on](./ingress-controller-disable-addon.md)
application-gateway https://docs.microsoft.com/en-us/azure/application-gateway/tutorial-ingress-controller-add-on-new https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/tutorial-ingress-controller-add-on-new.md
@@ -75,7 +75,7 @@ Deploying a new AKS cluster with the AGIC add-on enabled without specifying an e
az aks create -n myCluster -g myResourceGroup --network-plugin azure --enable-managed-identity -a ingress-appgw --appgw-name myApplicationGateway --appgw-subnet-prefix "10.2.0.0/16" --generate-ssh-keys ```
-To configure additional parameters for the `az aks create` command, see [these references](/cli/azure/aks?view=azure-cli-latest#az-aks-create).
+To configure additional parameters for the `az aks create` command, see [these references](/cli/azure/aks#az-aks-create).
> [!NOTE] > The AKS cluster that you created will appear in the resource group that you created, *myResourceGroup*. However, the automatically created Application Gateway instance will be in the node resource group, where the agent pools are. The node resource group is named *MC_resource-group-name_cluster-name_location* by default, but can be modified.
attestation https://docs.microsoft.com/en-us/azure/attestation/quickstart-portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/attestation/quickstart-portal.md
@@ -1,6 +1,6 @@
Title: Set up Azure Attestation with Azure portal
-description: How to set up and configure an attestation provider using Azure portal.
+ Title: 'Quickstart: Set up Azure Attestation by using the Azure portal'
+description: In this quickstart, you'll learn how to set up and configure an attestation provider by using the Azure portal.
@@ -10,165 +10,151 @@
-# Quickstart: Set up Azure Attestation with Azure portal
+# Quickstart: Set up Azure Attestation by using the Azure portal
+
+Follow this quickstart to get started with Azure Attestation. Learn how to manage an attestation provider, a policy signer, and a policy by using the Azure portal.
## Prerequisites If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-Follow the below steps to manage an attestation provider using Azure portal.
+## Attestation provider
-## 1. Attestation provider
+In this section, you'll create an attestation provider and configure it with either unsigned policies or signed policies. You'll also learn how to view and delete the attestation provider.
-### 1.1 Create an attestation provider
+### Create and configure the provider with unsigned policies
-#### 1.1.1 To configure the provider with unsigned policies
+1. Go to the Azure portal menu or the home page and select **Create a resource**.
+1. In the search box, enter **attestation**.
+1. In the results list, select **Microsoft Azure Attestation**.
+1. On the **Microsoft Azure Attestation** page, select **Create**.
+1. On the **Create attestation provider** page, provide the following inputs:
-1. From the Azure portal menu, or from the Home page, select **Create a resource**
-2. In the Search box, enter **attestation**
-3. From the results list, choose **Microsoft Azure Attestation**
-4. On the Microsoft Azure Attestation page, choose **Create**
-5. On the Create attestation provider page, provide the following inputs:
-
- **Subscription**: Choose a subscription
-
- **Resource Group**: select an existing resource group or choose **Create new** and enter a resource group name
-
- **Name**: A unique name is required
+ - **Subscription**: Choose a subscription.
+ - **Resource Group**: Select an existing resource group, or select **Create new** and enter a resource group name.
+ - **Name**: Enter a unique name.
+ - **Location**: Choose a location.
+ - **Policy signer certificates file**: Don't upload the policy signer certificates file to configure the provider with unsigned policies.
- **Location**: choose a location
-
- **Policy signer certificates file**: Do not upload policy signer certificates file to configure the provider with unsigned policies
-6. After providing the required inputs, click **Review+Create**
-7. Fix validation issues if any and click **Create**.
+1. After you provide the required inputs, select **Review+Create**.
+1. If there are validation issues, fix them and then select **Create**.
-#### 1.1.2 To configure the provider with signed policies
+### Create and configure the provider with signed policies
-1. From the Azure portal menu, or from the Home page, select **Create a resource**
-2. In the Search box, enter **attestation**
-3. From the results list, choose **Microsoft Azure Attestation**
-4. On the Microsoft Azure Attestation page, choose **Create**
-5. On the Create attestation provider page, provide the following information:
-
- a. **Subscription**: Choose a subscription
-
- b. **Resource Group**: select an existing resource group or choose **Create new** and enter a resource group name
-
- c. **Name**: A unique name is required
+1. Go to the Azure portal menu or the home page and select **Create a resource**.
+1. In the search box, enter **attestation**.
+1. In the results list, select **Microsoft Azure Attestation**.
+1. On the **Microsoft Azure Attestation** page, select **Create**.
+1. On the **Create attestation provider** page, provide the following information:
- d. **Location**: choose a location
-
- e. **Policy signer certificates file**: To configure the attestation provider with policy signing certs, upload certs file. See examples [here](./policy-signer-examples.md)
-6. After providing the required inputs, click **Review+Create**
-7. Fix validation issues if any and click **Create**.
+ - **Subscription**: Choose a subscription.
+ - **Resource Group**: Select an existing resource group, or select **Create new** and enter a resource group name.
+ - **Name**: Enter a unique name.
+ - **Location**: Choose a location.
+ - **Policy signer certificates file**: Upload the policy signer certificates file to configure the attestation provider with signed policies. [See examples of policy signer certificates](./policy-signer-examples.md).
-### 1.2 View attestation provider
+1. After you provide the required inputs, select **Review+Create**.
+1. If there are validation issues, fix them and then select **Create**.
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name and select it
+### View the attestation provider
-### 1.3 Delete attestation provider
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name and select it.
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the checkbox and click **Delete**
-4. Type yes and click **Delete**
-[OR]
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the attestation provider and navigate to overview page
-4. Click **Delete** in the top menu and click **Yes**
+### Delete the attestation provider
+There are two ways to delete the attestation provider. You can:
-## 2. Attestation policy signers
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name.
+1. Select the check box and select **Delete**.
+1. Enter **yes** and select **Delete**.
-### 2.1 View policy signer certificates
+Or you can:
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the attestation provider and navigate to overview page
-4. Click **Policy signer certificates** in left-side resource menu or in the bottom pane
-5. Click **Download policy signer certificates** (The button will be disabled for the attestation providers created without policy signing requirement)
-6. The text file downloaded will have all certs in a JWS format.
-a. Verify the certificates count and certs downloaded.
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name.
+1. Select the attestation provider and go to the overview page.
+1. Select **Delete** on the menu bar and select **Yes**.
-### 2.2 Add policy signer certificate
+## Attestation policy signers
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the attestation provider and navigate to overview page
-4. Click **Policy signer certificates** in left-side resource menu or in the bottom pane
-5. Click **Add** in the top menu (The button will be disabled for the attestation providers created without policy signing requirement)
-6. Upload policy signer certificate file and click **Add**. See examples [here](./policy-signer-examples.md)
+Follow the steps in this section to view, add, and delete policy signer certificates.
-### 2.3 Delete policy signer certificate
+### View the policy signer certificates
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the attestation provider and navigate to overview page
-4. Click **Policy signer certificates** in left-side resource menu or in the bottom pane
-5. Click **Delete** in the top menu (The button will be disabled for the attestation providers created without policy signing requirement)
-6. Upload policy signer certificate file and click **Delete**. See examples [here](./policy-signer-examples.md)
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name.
+1. Select the attestation provider and go to the overview page.
+1. Select **Policy signer certificates** on the resource menu on the left side of the window or on the lower pane.
+1. Select **Download policy signer certificates**. The button will be disabled for attestation providers created without the policy signing requirement.
+1. The downloaded text file will have all certificates in a JWS format.
+1. Verify the certificate count and the downloaded certificates.
-## 3. Attestation policy
+### Add the policy signer certificate
-### 3.1 View attestation policy
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name.
+1. Select the attestation provider and go to the overview page.
+1. Select **Policy signer certificates** on the resource menu on the left side of the window or on the lower pane.
+1. Select **Add** on the upper menu. The button will be disabled for attestation providers created without the policy signing requirement.
+1. Upload the policy signer certificate file and select **Add**. [See examples of policy signer certificates](./policy-signer-examples.md).
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the attestation provider and navigate to overview page
-4. Click **Policy** in left-side resource menu or in the bottom pane
-5. Select the preferred **Attestation Type** and view the **Current policy**
+### Delete the policy signer certificates
-### 3.2 Configure attestation policy
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name.
+1. Select the attestation provider and go to the overview page.
+1. Select **Policy signer certificates** on the resource menu on the left side of the window or on the lower pane.
+1. Select **Delete** on the upper menu. The button will be disabled for attestation providers created without the policy signing requirement.
+1. Upload the policy signer certificate file and select **Delete**. [See examples of policy signer certificates](./policy-signer-examples.md).
-#### 3.2.1 When attestation provider is created without policy signing requirement
+## Attestation policy
-##### Upload policy in JWT format
+This section describes how to view an attestation policy and how to configure policies that were created with and without a policy signing requirement.
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the attestation provider and navigate to overview page
-4. Click **Policy** in left-side resource menu or in the bottom pane
-5. Click **Configure** in the top menu
-6. When the attestation provider is created without policy signing requirement, user can upload a policy in **JWT** or **Text** format
-7. Select **Policy Format** as **JWT**
-8. Upload policy file with policy content in an **unsigned/signed JWT** format and click **Save**. See examples [here](./policy-examples.md)
-
- For file upload option, policy preview will be shown in text format and policy preview is not editable.
+### View an attestation policy
+
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name.
+1. Select the attestation provider and go to the overview page.
+1. Select **Policy** on the resource menu on the left side of the window or on the lower pane.
+1. Select the preferred **Attestation Type** and view the **Current policy**.
+
+### Configure an attestation policy
+
+Follow these steps to upload a policy in JWT or text format if the attestation provider was created without a policy signing requirement.
-7. Click **Refresh** in the top menu to view the configured policy
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name.
+1. Select the attestation provider and go to the overview page.
+1. Select **Policy** on the resource menu on the left side of the window or on the lower pane.
+1. Select **Configure** on the upper menu.
+1. Select **Policy Format** as **JWT** or as **Text**.
-##### Upload policy in Text format
+   If the attestation provider was created without a policy signing requirement, you can upload a policy in either **JWT** or **Text** format.
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the attestation provider and navigate to overview page
-4. Click **Policy** in left-side resource menu or in the bottom pane
-5. Click **Configure** in the top menu
-6. When the attestation provider is created without policy signing requirement, user can upload a policy in **JWT** or **Text** format
-7. Select **Policy Format** as **Text**
-8. Upload policy file with content in **Text** format or enter policy content in text area and click **Save**. See examples [here](./policy-examples.md)
+ - If you chose JWT format, upload the policy file with the policy content in **unsigned/signed JWT** format and select **Save**. [See policy examples](./policy-examples.md).
+ - If you chose text format, upload the policy file with the content in **Text** format or enter the policy content in the text area and select **Save**. [See policy examples](./policy-examples.md).
- For file upload option, policy preview will be shown in text format and policy preview is not editable.
+ For the file upload option, the policy preview is shown in text format and isn't editable.
-8. Click **Refresh** to view the configured policy
+1. Select **Refresh** on the upper menu to view the configured policy.
-#### 3.2.2 When attestation provider is created with policy signing requirement
+If the attestation provider was created with a policy signing requirement, follow these steps to upload a policy in JWT format.
-##### Upload policy in JWT format
+1. Go to the Azure portal menu or the home page and select **All resources**.
+1. In the filter box, enter the attestation provider name.
+1. Select the attestation provider and go to the overview page.
+1. Select **Policy** on the resource menu on the left side of the window or on the lower pane.
+1. Select **Configure** on the upper menu.
+1. Upload the policy file in **signed JWT format** and select **Save**. [See policy examples](./policy-examples.md).
-1. From the Azure portal menu, or from the Home page, select **All resources**
-2. In the filter box, enter attestation provider name
-3. Select the attestation provider and navigate to overview page
-4. Click **Policy** in left-side resource menu or in the bottom pane
-5. Click **Configure** in the top menu
-6. When the attestation provider is created with policy signing requirement, user can upload a policy only in **signed JWT format**
-7. Upload policy file is **signed JWT format** and click **Save**. See examples [here](./policy-examples.md)
+   If the attestation provider was created with a policy signing requirement, you can upload a policy only in **signed JWT format**.
- For file upload option, policy preview will be shown in text format and policy preview is not editable.
+ For the file upload option, the policy preview is shown in text format and isn't editable.
-8. Click **Refresh** to view the configured policy
+1. Select **Refresh** to view the configured policy.
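As an illustration of what gets uploaded in the steps above, a policy in text format typically has this general shape. This is a minimal sketch; the claim names are hypothetical placeholders, and you should start from the linked policy examples rather than this fragment:

```
version=1.0;
authorizationrules
{
    => permit();
};
issuancerules
{
    c:[type=="<incoming-claim>"] => issue(type="<outgoing-claim>", value=c.value);
};
```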
## Next steps
azure-arc https://docs.microsoft.com/en-us/azure/azure-arc/servers/agent-overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/agent-overview.md
@@ -50,6 +50,7 @@ The following versions of the Windows and Linux operating system are officially
- SUSE Linux Enterprise Server (SLES) 15 (x64) - Red Hat Enterprise Linux (RHEL) 7 (x64) - Amazon Linux 2 (x64)
+- Oracle Linux 7
> [!WARNING] > The Linux hostname or Windows computer name cannot use one of the reserved words or trademarks in the name, otherwise attempting to register the connected machine with Azure will fail. See [Resolve reserved resource name errors](../../azure-resource-manager/templates/error-reserved-resource-name.md) for a list of the reserved words.
azure-functions https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-phone-verification https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/durable/durable-functions-phone-verification.md
@@ -33,6 +33,9 @@ This article walks through the following functions in the sample app:
* `E4_SmsPhoneVerification`: An [orchestrator function](durable-functions-bindings.md#orchestration-trigger) that performs the phone verification process, including managing timeouts and retries. * `E4_SendSmsChallenge`: An [activity function](durable-functions-bindings.md#activity-trigger) that sends a code via text message.
+> [!NOTE]
+> The `HttpStart` function in the [sample app and the quickstart](#prerequisites) acts as an [orchestration client](durable-functions-bindings.md#orchestration-client) that triggers the orchestrator function.
+ ### E4_SmsPhoneVerification orchestrator function # [C#](#tab/csharp)
azure-functions https://docs.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-triggers-bindings.md
@@ -35,16 +35,19 @@ These examples are not meant to be exhaustive, but are provided to illustrate ho
### Trigger and binding definitions
-Triggers and bindings are defined differently depending on the development approach.
+Triggers and bindings are defined differently depending on the development language.
-| Platform | Triggers and bindings are configured by... |
+| Language | Triggers and bindings are configured by... |
|-|--| | C# class library | &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;decorating methods and parameters with C# attributes |
-| All others (including Azure portal) | &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;updating [function.json](./functions-reference.md) ([schema](http://json.schemastore.org/function)) |
+| Java | &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;decorating methods and parameters with Java annotations |
+| JavaScript/PowerShell/Python/TypeScript | &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;updating [function.json](./functions-reference.md) ([schema](http://json.schemastore.org/function)) |
-The portal provides a UI for this configuration, but you can edit the file directly by opening the **Advanced editor** available via the **Integrate** tab of your function.
+For languages that rely on function.json, the portal provides a UI for adding bindings in the **Integration** tab. You can also edit the file directly in the portal in the **Code + test** tab of your function. Visual Studio Code lets you easily [add a binding to a function.json file](functions-develop-vs-code.md?tabs=nodejs#add-a-function-to-your-project) by following a convenient set of prompts.
-In .NET, the parameter type defines the data type for input data. For instance, use `string` to bind to the text of a queue trigger, a byte array to read as binary and a custom type to de-serialize to an object.
+In .NET and Java, the parameter type defines the data type for input data. For instance, use `string` to bind to the text of a queue trigger, a byte array to read as binary, and a custom type to de-serialize to an object. Since .NET class library functions and Java functions don't rely on *function.json* for binding definitions, they can't be created and edited in the portal. C# portal editing is based on C# script, which uses *function.json* instead of attributes.
+
+To learn more about adding bindings to existing functions, see [Connect functions to Azure services using bindings](add-bindings-existing-function.md).
For languages that are dynamically typed such as JavaScript, use the `dataType` property in the *function.json* file. For example, to read the content of an HTTP request in binary format, set `dataType` to `binary`:
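For example, a *function.json* entry for an HTTP trigger that reads the request body as binary might look like the following sketch (the binding names `req` and `res` are illustrative, not required):

```json
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "dataType": "binary"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
```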
azure-monitor https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-cost-storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/platform/manage-cost-storage.md
@@ -417,10 +417,9 @@ find where TimeGenerated > ago(24h) project _ResourceId, _BilledSize, _IsBillabl
For data from nodes hosted in Azure, you can get the **size** of ingested data __per Azure subscription__ by using the `_SubscriptionId` property: ```kusto
-find where TimeGenerated > ago(24h) project _ResourceId, _BilledSize, _IsBillable
+find where TimeGenerated > ago(24h) project _ResourceId, _BilledSize, _IsBillable, _SubscriptionId
| where _IsBillable == true
-| summarize BillableDataBytes = sum(_BilledSize) by _ResourceId
-| summarize BillableDataBytes = sum(BillableDataBytes) by _SubscriptionId | sort by BillableDataBytes nulls last
+| summarize BillableDataBytes = sum(_BilledSize) by _SubscriptionId | sort by BillableDataBytes nulls last
``` To get data volume by resource group, you can parse `_ResourceId`:
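One possible shape for that query, assuming the standard resource ID layout `/subscriptions/<id>/resourcegroups/<name>/...` so that the resource group is the fifth path segment:

```kusto
find where TimeGenerated > ago(24h) project _ResourceId, _BilledSize, _IsBillable
| where _IsBillable == true
| extend ResourceGroup = tostring(split(_ResourceId, "/")[4])
| summarize BillableDataBytes = sum(_BilledSize) by ResourceGroup | sort by BillableDataBytes nulls last
```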
@@ -664,4 +663,4 @@ There are some additional Log Analytics limits, some of which depend on the Log
- Change [performance counter configuration](data-sources-performance-counters.md). - To modify your event collection settings, review [event log configuration](data-sources-windows-events.md). - To modify your syslog collection settings, review [syslog configuration](data-sources-syslog.md).-- To modify your syslog collection settings, review [syslog configuration](data-sources-syslog.md).
+- To modify your syslog collection settings, review [syslog configuration](data-sources-syslog.md).
azure-netapp-files https://docs.microsoft.com/en-us/azure/azure-netapp-files/manage-manual-qos-capacity-pool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/manage-manual-qos-capacity-pool.md
@@ -13,7 +13,7 @@
na ms.devlang: na Previously updated : 09/28/2020 Last updated : 02/04/2021 # Manage a manual QoS capacity pool
@@ -54,7 +54,8 @@ To create a new capacity pool using the manual QoS type:
You can change a capacity pool that currently uses the auto QoS type to use the manual QoS type. > [!IMPORTANT]
-> Setting the capacity type to manual QoS is a permanent change. You cannot convert a manual QoS type capacity tool to an auto QoS capacity pool.
+> Setting the capacity type to manual QoS is a permanent change. You cannot convert a manual QoS type capacity pool to an auto QoS capacity pool.
+> At conversion time, throughput levels might be capped to conform to the throughput limits for volumes of the manual QoS type. See [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md#resource-limits).
1. From the management blade for your NetApp account, click **Capacity pools** to display existing capacity pools.
@@ -91,4 +92,4 @@ If a volume is contained in a manual QoS capacity pool, you can modify the allot
* [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md) * [Create an NFS volume](azure-netapp-files-create-volumes.md) * [Create an SMB volume](azure-netapp-files-create-volumes-smb.md)
-* [Create a dual-protocol volume](create-volumes-dual-protocol.md)
+* [Create a dual-protocol volume](create-volumes-dual-protocol.md)
azure-netapp-files https://docs.microsoft.com/en-us/azure/azure-netapp-files/manual-qos-capacity-pool-introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/manual-qos-capacity-pool-introduction.md
@@ -13,7 +13,7 @@
na ms.devlang: na Previously updated : 10/12/2020 Last updated : 02/04/2021 # Manual QoS capacity pool
@@ -24,7 +24,7 @@ This article provides an introduction to the manual Quality of Service (QoS) cap
The [QoS type](azure-netapp-files-understand-storage-hierarchy.md#qos_types) is an attribute of a capacity pool. Azure NetApp Files provides two QoS types of capacity pools: auto (default) and manual.
-In a *manual* QoS capacity pool, you can assign the capacity and throughput for a volume independently. The total throughput of all volumes created with a manual QoS capacity pool is limited by the total throughput of the pool. It is determined by the combination of the pool size and the service-level throughput.
+In a *manual* QoS capacity pool, you can assign the capacity and throughput for a volume independently. For minimum and maximum throughput levels, see [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md#resource-limits). The total throughput of all volumes created with a manual QoS capacity pool is limited by the total throughput of the pool. It is determined by the combination of the pool size and the service-level throughput.
In an *auto* QoS capacity pool, throughput is assigned automatically to the volumes in the pool, proportional to the size quota assigned to the volumes.
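To illustrate how pool size and service level combine to determine the total pool throughput mentioned above, here's a small sketch. The per-TiB rates used (16, 64, and 128 MiB/s for Standard, Premium, and Ultra) are assumptions based on commonly documented values; check the current resource limits before relying on them:

```python
# Hypothetical illustration of manual QoS pool throughput.
# The per-TiB rates below are assumptions; verify against current
# Azure NetApp Files service-level documentation.
SERVICE_LEVEL_MIBS_PER_TIB = {"Standard": 16, "Premium": 64, "Ultra": 128}

def pool_throughput_mibs(service_level: str, pool_size_tib: int) -> int:
    """Total MiB/s the pool can distribute across its volumes."""
    return SERVICE_LEVEL_MIBS_PER_TIB[service_level] * pool_size_tib

# A 4-TiB Premium pool can distribute 256 MiB/s across its volumes.
print(pool_throughput_mibs("Premium", 4))
```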
azure-netapp-files https://docs.microsoft.com/en-us/azure/azure-netapp-files/snapshots-introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/snapshots-introduction.md
@@ -13,7 +13,7 @@
na ms.devlang: na Previously updated : 01/12/2021 Last updated : 02/05/2021 # How Azure NetApp Files snapshots work
@@ -156,7 +156,7 @@ See [Delete snapshots](azure-netapp-files-manage-snapshots.md#delete-snapshots)
* [Troubleshoot snapshot policies](troubleshoot-snapshot-policies.md) * [Resource limits for Azure NetApp Files](azure-netapp-files-resource-limits.md) * [Azure NetApp Files Snapshots 101 video](https://www.youtube.com/watch?v=uxbTXhtXCkw)
-* [NetApp Snapshot - NetApp Video Library](https://tv.netapp.com/detail/video/2579133646001/snapshot)
+* [Azure NetApp Files Snapshot Overview](https://anfcommunity.com/2021/01/31/azure-netapp-files-snapshot-overview/)
azure-resource-manager https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/add-template-to-azure-pipelines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/add-template-to-azure-pipelines.md
@@ -2,19 +2,23 @@
Title: CI/CD with Azure Pipelines and templates description: Describes how to configure continuous integration in Azure Pipelines by using Azure Resource Manager templates. It shows how to use a PowerShell script, or copy files to a staging location and deploy from there. Previously updated : 10/01/2020 Last updated : 02/05/2021 # Integrate ARM templates with Azure Pipelines
-You can integrate Azure Resource Manager templates (ARM templates) with Azure Pipelines for continuous integration and continuous deployment (CI/CD). The tutorial [Continuous integration of ARM templates with Azure Pipelines](deployment-tutorial-pipeline.md) shows how to use the [ARM template deployment task](https://github.com/microsoft/azure-pipelines-tasks/blob/master/Tasks/AzureResourceManagerTemplateDeploymentV3/README.md) to deploy a template from your GitHub repo. This approach works when you want to deploy a template directly from a repository.
+You can integrate Azure Resource Manager templates (ARM templates) with Azure Pipelines for continuous integration and continuous deployment (CI/CD). In this article, you learn two more advanced ways to deploy templates with Azure Pipelines.
-In this article, you learn two more ways to deploy templates with Azure Pipelines. This article shows how to:
+## Select your option
-* **Add task that runs an Azure PowerShell script**. This option has the advantage of providing consistency throughout the development life cycle because you can use the same script that you used when running local tests. Your script deploys the template but can also perform other operations such as getting values to use as parameters.
+Before proceeding with this article, let's consider the different options for deploying an ARM template from a pipeline.
+
+* **Use ARM template deployment task**. This is the easiest option and works when you want to deploy a template directly from a repository. It isn't covered in this article; instead, see the tutorial [Continuous integration of ARM templates with Azure Pipelines](deployment-tutorial-pipeline.md), which shows how to use the [ARM template deployment task](https://github.com/microsoft/azure-pipelines-tasks/blob/master/Tasks/AzureResourceManagerTemplateDeploymentV3/README.md) to deploy a template from your GitHub repo.
+
+* **Add task that runs an Azure PowerShell script**. This option has the advantage of providing consistency throughout the development life cycle because you can use the same script that you used when running local tests. Your script deploys the template but can also perform other operations such as getting values to use as parameters. This option is shown in this article. See [Azure PowerShell task](#azure-powershell-task).
Visual Studio provides the [Azure Resource Group project](create-visual-studio-deployment-project.md) that includes a PowerShell script. The script stages artifacts from your project to a storage account that Resource Manager can access. Artifacts are items in your project such as linked templates, scripts, and application binaries. If you want to continue using the script from the project, use the PowerShell script task shown in this article.
-* **Add tasks to copy and deploy tasks**. This option offers a convenient alternative to the project script. You configure two tasks in the pipeline. One task stages the artifacts to an accessible location. The other task deploys the template from that location.
+* **Add tasks to copy and deploy tasks**. This option offers a convenient alternative to the project script. You configure two tasks in the pipeline. One task stages the artifacts to an accessible location. The other task deploys the template from that location. This option is shown in this article. See [Copy and deploy tasks](#copy-and-deploy-tasks).
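As an illustration of the copy-and-deploy approach, the two tasks in the pipeline might be sketched roughly like this YAML. Task versions, input names, and all values here (service connection, storage account, resource group, template URL variable) are assumptions for illustration only; consult the task reference for the exact inputs:

```yaml
steps:
# Stage templates and artifacts to an accessible storage location.
- task: AzureFileCopy@4
  inputs:
    SourcePath: 'templates'
    azureSubscription: 'my-service-connection'  # assumption: your service connection name
    Destination: 'AzureBlob'
    storage: 'mystagingaccount'                 # assumption: your storage account name
    ContainerName: 'templates'
# Deploy the template from the staged location.
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    azureResourceManagerConnection: 'my-service-connection'
    resourceGroupName: 'demo-rg'
    location: 'westus'
    templateLocation: 'URL of the file'
    csmFileLink: '$(templateLink)'              # assumption: URL produced by the copy step
```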
## Prepare your project
azure-resource-manager https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/deployment-manager-overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/deployment-manager-overview.md
@@ -16,10 +16,10 @@ Azure Deployment Manager is in preview. Help us improve the feature by providing
To use Deployment Manager, you need to create four files:
-* Topology template
-* Rollout template
-* Parameter file for topology
-* Parameter file for rollout
+* Topology template.
+* Rollout template.
+* Parameter file for topology.
+* Parameter file for rollout.
You deploy the topology template before deploying the rollout template.
azure-resource-manager https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/deployment-manager-tutorial-health-check https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/deployment-manager-tutorial-health-check.md
@@ -15,7 +15,7 @@ Learn how to integrate health check in [Azure Deployment Manager](./deployment-m
In the rollout template used in [Use Azure Deployment Manager with Resource Manager templates](./deployment-manager-tutorial.md), you used a wait step. In this tutorial, you replace the wait step with a health check step. > [!IMPORTANT]
-> If your subscription is marked for Canary to test out new Azure features, you can only use Azure Deployment Manager to deploy to the Canary regions. 
+> If your subscription is marked for Canary to test out new Azure features, you can only use Azure Deployment Manager to deploy to the Canary regions.
This tutorial covers the following tasks:
@@ -31,26 +31,23 @@ This tutorial covers the following tasks:
Additional resources:
-* The [Azure Deployment Manager REST API reference](/rest/api/deploymentmanager/).
+* [Azure Deployment Manager REST API reference](/rest/api/deploymentmanager/).
* [An Azure Deployment Manager sample](https://github.com/Azure-Samples/adm-quickstart).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
- ## Prerequisites
-To complete this article, you need:
+To complete this tutorial, you need:
+* An Azure subscription. If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
* Complete [Use Azure Deployment Manager with Resource Manager templates](./deployment-manager-tutorial.md). ## Install the artifacts
-Download [the templates and the artifacts](https://github.com/Azure/azure-docs-json-samples/raw/master/tutorial-adm/ADMTutorial.zip) and unzip it locally if you haven't done so. And then run the PowerShell script found at [Prepare the artifacts](./deployment-manager-tutorial.md#prepare-the-artifacts). The script creates a resource group, creates a storage container, creates a blob container, upload the downloaded files, and then create a SAS token.
-
-Make a copy of the URL with SAS token. This URL is needed to populate a field in the two parameter files, topology parameters file and rollout parameters file.
-
-Open CreateADMServiceTopology.Parameters.json, and update the values of **projectName** and **artifactSourceSASLocation**.
+If you haven't already downloaded the samples used in the prerequisite tutorial, you can download [the templates and the artifacts](https://github.com/Azure/azure-docs-json-samples/raw/master/tutorial-adm/ADMTutorial.zip) and unzip the file locally. Then, run the PowerShell script from the prerequisite tutorial's section [Prepare the artifacts](./deployment-manager-tutorial.md#prepare-the-artifacts). The script creates a resource group, creates a storage container, creates a blob container, uploads the downloaded files, and then creates a SAS token.
-Open CreateADMRollout.Parameters.json, and update the values of **projectName** and **artifactSourceSASLocation**.
+* Make a copy of the URL with SAS token. This URL is needed to populate a field in the two parameter files: topology parameters file and rollout parameters file.
+* Open _CreateADMServiceTopology.Parameters.json_ and update the values of `projectName` and `artifactSourceSASLocation`.
+* Open _CreateADMRollout.Parameters.json_ and update the values of `projectName` and `artifactSourceSASLocation`.
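Those parameter files follow the standard ARM parameter-file shape; a sketch with placeholder values (the real files in the download contain additional parameters):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "projectName": { "value": "<your project name>" },
    "artifactSourceSASLocation": { "value": "<URL with SAS token from the script output>" }
  }
}
```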
## Create a health check service simulator
@@ -61,40 +58,40 @@ The following two files are used for deploying the Azure Function. You don't nee
* A Resource Manager template located at [https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/tutorial-adm/deploy_hc_azure_function.json](https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/tutorial-adm/deploy_hc_azure_function.json). You deploy this template to create an Azure Function. * A zip file of the Azure Function source code, [https://github.com/Azure/azure-docs-json-samples/raw/master/tutorial-adm/ADMHCFunction0417.zip](https://github.com/Azure/azure-docs-json-samples/raw/master/tutorial-adm/ADMHCFunction0417.zip). This zip file is called by the Resource Manager template.
-To deploy the Azure function, select **Try it** to open the Azure Cloud shell, and then paste the following script into the shell window. To paste the code, right-click the shell window and then select **Paste**.
+To deploy the Azure function, select **Try it** to open the Azure Cloud Shell, and then paste the following script into the shell window. To paste the code, right-click the shell window and then select **Paste**.
-```azurepowershell
+```azurepowershell-interactive
New-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName -TemplateUri "https://raw.githubusercontent.com/Azure/azure-docs-json-samples/master/tutorial-adm/deploy_hc_azure_function.json" -projectName $projectName ``` To verify and test the Azure function: 1. Open the [Azure portal](https://portal.azure.com).
-1. Open the resource group. The default name is the project name with **rg** appended.
-1. Select the app service from the resource group. The default name of the app service is the project name with **webapp** appended.
+1. Open the resource group. The default name is the project name with **rg** appended.
+1. Select the app service from the resource group. The default name of the app service is the project name with **webapp** appended.
1. Expand **Functions**, and then select **HttpTrigger1**. ![Azure Deployment Manager health check Azure Function](./media/deployment-manager-tutorial-health-check/azure-deployment-manager-hc-function.png) 1. Select **&lt;/> Get function URL**.
-1. Select **Copy** to copy the URL to the clipboard. The URL is similar to:
+1. Select **Copy** to copy the URL to the clipboard. The URL is similar to:
```url https://myhc0417webapp.azurewebsites.net/api/healthStatus/{healthStatus}?code=hc4Y1wY4AqsskAkVw6WLAN1A4E6aB0h3MbQ3YJRF3XtXgHvooaG0aw== ```
- Replace `{healthStatus}` in the URL with a status code. In this tutorial, use **unhealthy** to test the unhealthy scenario, and use either **healthy** or **warning** to test the healthy scenario. Create two URLs, one with the unhealthy status, and the other with healthy status. For examples:
+ Replace `{healthStatus}` in the URL with a status code. In this tutorial, use *unhealthy* to test the unhealthy scenario, and use either *healthy* or *warning* to test the healthy scenario. Create two URLs, one with the *unhealthy* status, and the other with *healthy* status. For example:
```url https://myhc0417webapp.azurewebsites.net/api/healthStatus/unhealthy?code=hc4Y1wY4AqsskAkVw6WLAN1A4E6aB0h3MbQ3YJRF3XtXgHvooaG0aw== https://myhc0417webapp.azurewebsites.net/api/healthStatus/healthy?code=hc4Y1wY4AqsskAkVw6WLAN1A4E6aB0h3MbQ3YJRF3XtXgHvooaG0aw== ```
- You need both URLs to completed this tutorial.
+ You need both URLs to complete this tutorial.
-1. To test the health monitoring simulator, open the URLs that you created in the last step. The results for the unhealthy status shall be similar to:
+1. To test the health monitoring simulator, open the URLs that you created in the previous step. The results for the unhealthy status will be similar to:
- ```
+ ```Output
Status: unhealthy ```
@@ -102,7 +99,7 @@ To verify and test the Azure function:
The purpose of this section is to show you how to include a health check step in the rollout template.
-1. Open **CreateADMRollout.json** that you created in [Use Azure Deployment Manager with Resource Manager templates](./deployment-manager-tutorial.md). This JSON file is a part of the download. See [Prerequisites](#prerequisites).
+1. Open _CreateADMRollout.json_ that you created in [Use Azure Deployment Manager with Resource Manager templates](./deployment-manager-tutorial.md). This JSON file is a part of the download. See [Prerequisites](#prerequisites).
1. Add two more parameters: ```json
@@ -171,7 +168,7 @@ The purpose of this section is to show you how to include a health check step in
Based on the definition, the rollout proceeds if the health status is either *healthy* or *warning*.
-1. Update the **dependsON** of the rollout definition to include the newly defined health check step:
+1. Update the `dependsOn` of the rollout definition to include the newly defined health check step:
```json "dependsOn": [
@@ -180,7 +177,7 @@ The purpose of this section is to show you how to include a health check step in
], ```
-1. Update **stepGroups** to include the health check step. The **healthCheckStep** is called in **postDeploymentSteps** of **stepGroup2**. **stepGroup3** and **stepGroup4** are only deployed if the healthy status is either *healthy* or *warning*.
+1. Update `stepGroups` to include the health check step. The `healthCheckStep` is called in `postDeploymentSteps` of `stepGroup2`. `stepGroup3` and `stepGroup4` are only deployed if the healthy status is either *healthy* or *warning*.
```json "stepGroups": [
@@ -218,15 +215,15 @@ The purpose of this section is to show you how to include a health check step in
] ```
- If you compare the **stepGroup3** section before and after it is revised, this section now depends on **stepGroup2**. This is necessary when **stepGroup3** and the subsequent step groups depend on the results of health monitoring.
+ If you compare the `stepGroup3` section before and after it's revised, this section now depends on `stepGroup2`. This is necessary when `stepGroup3` and the subsequent step groups depend on the results of health monitoring.
- The following screenshot illustrates the areas modified, and how the health check step is used:
+ The following screenshot illustrates the modified areas and how the health check step is used:
![Azure Deployment Manager health check template](./media/deployment-manager-tutorial-health-check/azure-deployment-manager-hc-rollout-template.png) ## Deploy the topology
-Run the following PowerShell script to deploy the topology. You need the same **CreateADMServiceTopology.json** and **CreateADMServiceTopology.Parameters.json** that you used in [Use Azure Deployment Manager with Resource Manager templates](./deployment-manager-tutorial.md).
+Run the following PowerShell script to deploy the topology. You need the same _CreateADMServiceTopology.json_ and _CreateADMServiceTopology.Parameters.json_ that you used in [Use Azure Deployment Manager with Resource Manager templates](./deployment-manager-tutorial.md).
```azurepowershell # Create the service topology
@@ -244,7 +241,7 @@ Verify the service topology and the underlined resources have been created succe
## Deploy the rollout with the unhealthy status
-Use the unhealthy status URL you created in [Create a health check service simulator](#create-a-health-check-service-simulator). You need the revised **CreateADMServiceTopology.json** and the same **CreateADMServiceTopology.Parameters.json** that you used in [Use Azure Deployment Manager with Resource Manager templates](./deployment-manager-tutorial.md).
+Use the unhealthy status URL you created in [Create a health check service simulator](#create-a-health-check-service-simulator). You need the revised _CreateADMServiceTopology.json_ and the same _CreateADMServiceTopology.Parameters.json_ that you used in [Use Azure Deployment Manager with Resource Manager templates](./deployment-manager-tutorial.md).
```azurepowershell-interactive $healthCheckUrl = Read-Host -Prompt "Enter the health check Azure function URL"
@@ -263,7 +260,7 @@ New-AzResourceGroupDeployment `
> [!NOTE] > `New-AzResourceGroupDeployment` is an asynchronous call. The success message only means the deployment has successfully begun. To verify the deployment, use `Get-AZDeploymentManagerRollout`. See the next procedure.
-To check the rollout progress using the following PowerShell script:
+To check the rollout progress, use the following PowerShell script:
```azurepowershell $projectName = Read-Host -Prompt "Enter the same project name used earlier in this tutorial"
@@ -279,7 +276,7 @@ Get-AzDeploymentManagerRollout `
The following sample output shows the deployment failed due to the unhealthy status:
-```output
+```Output
Service: myhc0417ServiceWUSrg TargetLocation: WestUS TargetSubscriptionId: <Subscription ID>
@@ -336,32 +333,32 @@ Id : /subscriptions/<Subscription ID>/resourcegroups/myhc04
Tags : ```
-After the rollout is completed, you shall see one additional resource group created for West US.
+After the rollout is completed, you'll see one additional resource group created for West US.
## Deploy the rollout with the healthy status
-Repeat this section to redeploy the rollout with the healthy status URL. After the rollout is completed, you shall see one more resource group created for East US.
+Repeat this section to redeploy the rollout with the healthy status URL. After the rollout is completed, you'll see one more resource group created for East US.
## Verify the deployment

1. Open the [Azure portal](https://portal.azure.com).
-2. Browse to the newly create web applications under the new resource groups created by the rollout deployment.
-3. Open the web application in a web browser. Verify the location and the version on the index.html file.
+1. Browse to the new web applications under the new resource groups created by the rollout deployment.
+1. Open the web application in a web browser. Verify the location and the version on the _index.html_ file.
## Clean up resources

When the Azure resources are no longer needed, clean up the resources you deployed by deleting the resource group.

1. From the Azure portal, select **Resource group** from the left menu.
-2. Use the **Filter by name** field to narrow down the resource groups created in this tutorial. There shall be 3-4:
+1. Use the **Filter by name** field to narrow down the resource groups created in this tutorial.
    * **&lt;projectName>rg**: contains the Deployment Manager resources.
    * **&lt;projectName>ServiceWUSrg**: contains the resources defined by ServiceWUS.
    * **&lt;projectName>ServiceEUSrg**: contains the resources defined by ServiceEUS.
    * The resource group for the user-defined managed identity.
-3. Select the resource group name.
-4. Select **Delete resource group** from the top menu.
-5. Repeat the last two steps to delete other resource groups created by this tutorial.
+1. Select the resource group name.
+1. Select **Delete resource group** from the top menu.
+1. Repeat the last two steps to delete other resource groups created by this tutorial.
## Next steps
azure-resource-manager https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/deployment-manager-tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/deployment-manager-tutorial.md
@@ -18,7 +18,7 @@ To use Deployment Manager, you need to create two templates:
* **A rollout template**: describes the steps to take when deploying your applications.

> [!IMPORTANT]
-> If your subscription is marked for Canary to test out new Azure features, you can only use Azure Deployment Manager to deploy to the Canary regions. 
+> If your subscription is marked for Canary to test out new Azure features, you can only use Azure Deployment Manager to deploy to the Canary regions.
This tutorial covers the following tasks:
@@ -36,17 +36,16 @@ This tutorial covers the following tasks:
Additional resources:
-* The [Azure Deployment Manager REST API reference](/rest/api/deploymentmanager/).
+* [Azure Deployment Manager REST API reference](/rest/api/deploymentmanager/).
* [Tutorial: Use health check in Azure Deployment Manager](./deployment-manager-tutorial-health-check.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
- [!INCLUDE [updated-for-az](../../../includes/updated-for-az.md)] ## Prerequisites
-To complete this article, you need:
+To complete this tutorial, you need:
+* Azure subscription. If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
* Some experience with developing [Azure Resource Manager templates](overview.md).
* Azure PowerShell. For more information, see [Get started with Azure PowerShell](/powershell/azure/get-started-azureps).
* Deployment Manager cmdlets. To install these prerelease cmdlets, you need the latest version of PowerShellGet. To get the latest version, see [Installing PowerShellGet](/powershell/scripting/gallery/installing-psget). After installing PowerShellGet, close your PowerShell window. Open a new elevated PowerShell window, and use the following command:
@@ -67,23 +66,23 @@ The following diagram illustrates the service topology used in this tutorial:
![Azure Deployment Manager tutorial scenario diagram](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-scenario-diagram.png)
-There are two services allocated in the west U.S. and the east U.S. locations. Each service has two service units - a web application frontend and a storage account for the backend. The service unit definitions contain links to the template and parameter files for creating the web applications and the storage accounts.
+There are two services allocated in the West US and the East US locations. Each service has two service units: a front-end web application and a back-end storage account. The service unit definitions contain links to the template and parameter files that create the web applications and the storage accounts.
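The service/service-unit hierarchy described above maps onto nested `Microsoft.DeploymentManager` resources. As a simplified sketch (the resource names, locations, and relative paths shown are illustrative assumptions, not the tutorial's exact template):

```json
{
  "type": "Microsoft.DeploymentManager/serviceTopologies",
  "apiVersion": "2018-09-01-preview",
  "name": "demoServiceTopology",
  "location": "centralus",
  "properties": {
    "artifactSourceId": "<resource ID of the template artifact source>"
  },
  "resources": [
    {
      "type": "services",
      "apiVersion": "2018-09-01-preview",
      "name": "demoServiceWUS",
      "location": "centralus",
      "dependsOn": [ "demoServiceTopology" ],
      "properties": {
        "targetLocation": "westus",
        "targetSubscriptionId": "<subscription ID>"
      },
      "resources": [
        {
          "type": "serviceUnits",
          "apiVersion": "2018-09-01-preview",
          "name": "demoServiceWUSWeb",
          "location": "centralus",
          "dependsOn": [ "demoServiceWUS" ],
          "properties": {
            "targetResourceGroup": "demoServiceWUSrg",
            "deploymentMode": "Incremental",
            "artifacts": {
              "templateArtifactSourceRelativePath": "ServiceWUS/CreateWebApplication.json",
              "parametersArtifactSourceRelativePath": "ServiceWUS/CreateWebApplicationParameters.json"
            }
          }
        }
      ]
    }
  ]
}
```

The storage-account service unit and the East US service follow the same shape, with their own names and artifact paths.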
## Download the tutorial files

1. Download [the templates and the artifacts](https://github.com/Azure/azure-docs-json-samples/raw/master/tutorial-adm/ADMTutorial.zip) used by this tutorial.
-2. Unzip the files to your location computer.
+1. Unzip the files to your local computer.
Under the root folder, there are two folders:
-* **ADMTemplates**: contains the Deployment Manager templates, that include:
- * CreateADMServiceTopology.json
- * CreateADMServiceTopology.Parameters.json
- * CreateADMRollout.json
- * CreateADMRollout.Parameters.json
-* **ArtifactStore**: contains both the template artifacts and the binary artifacts. See [Prepare the artifacts](#prepare-the-artifacts).
+* _ADMTemplates_: contains the Deployment Manager templates, which include:
+ * _CreateADMServiceTopology.json_
+ * _CreateADMServiceTopology.Parameters.json_
+ * _CreateADMRollout.json_
+ * _CreateADMRollout.Parameters.json_
+* _ArtifactStore_: contains both the template artifacts and the binary artifacts. See [Prepare the artifacts](#prepare-the-artifacts).
-Note there are two sets of templates. One set is the Deployment Manager templates that are used to deploy the service topology and the rollout; the other set is called from the service units to create web services and storage accounts.
+There are two sets of templates. One set is the Deployment Manager templates that are used to deploy the service topology and the rollout. The other set is called from the service units to create web services and storage accounts.
## Prepare the artifacts
@@ -91,23 +90,23 @@ The ArtifactStore folder from the download contains two folders:
![Azure Deployment Manager tutorial artifact source diagram](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-artifact-source-diagram.png)
-* The **templates** folder: contains the template artifacts. **1.0.0.0** and **1.0.0.1** represent the two versions of the binary artifacts. Within each version, there is a folder for each service (Service East U.S. and Service West U.S.). Each service has a pair of template and parameter files for creating a storage account, and another pair for creating a web application. The web application template calls a compressed package, which contains the web application files. The compressed file is a binary artifact stored in the binaries folder.
-* The **binaries** folder: contains the binary artifacts. **1.0.0.0** and **1.0.0.1** represent the two versions of the binary artifacts. Within each version, there is one zip file for creating the web application in the west U.S. location, and the other zip file to create the web application in the east U.S. location.
+* The _templates_ folder: contains the template artifacts. The folders _1.0.0.0_ and _1.0.0.1_ represent the two versions of the binary artifacts. Within each version, there is a folder for each service: _ServiceEUS_ (Service East US) and _ServiceWUS_ (Service West US). Each service has a pair of template and parameter files for creating a storage account, and another pair for creating a web application. The web application template calls a compressed package, which contains the web application files. The compressed file is a binary artifact stored in the binaries folder.
+* The _binaries_ folder: contains the binary artifacts. The folders _1.0.0.0_ and _1.0.0.1_ represent the two versions of the binary artifacts. Within each version, there is one zip file to create the web application in the West US location, and the other zip file to create the web application in the East US location.
The two versions (1.0.0.0 and 1.0.0.1) are for the [revision deployment](#deploy-the-revision). Even though both the template artifacts and the binary artifacts have two versions, only the binary artifacts are different between the two versions. In practice, binary artifacts are updated more frequently than template artifacts.
-1. Open **\ArtifactStore\templates\1.0.0.0\ServiceWUS\CreateStorageAccount.json** in a text editor. It is a basic template for creating a storage account.
-2. Open **\ArtifactStore\templates\1.0.0.0\ServiceWUS\CreateWebApplication.json**.
+1. Open _\ArtifactStore\templates\1.0.0.0\ServiceWUS\CreateStorageAccount.json_ in a text editor. It's a basic template to create a storage account.
+1. Open _\ArtifactStore\templates\1.0.0.0\ServiceWUS\CreateWebApplication.json_.
![Azure Deployment Manager tutorial create web application template](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-create-web-application-packageuri.png)
- The template calls a deploy package, which contains the files of the web application. In this tutorial, the compressed package only contains an index.html file.
-3. Open **\ArtifactStore\templates\1.0.0.0\ServiceWUS\CreateWebApplicationParameters.json**.
+ The template calls a deploy package, which contains the files of the web application. In this tutorial, the compressed package only contains an _index.html_ file.
+1. Open _\ArtifactStore\templates\1.0.0.0\ServiceWUS\CreateWebApplicationParameters.json_.
![Azure Deployment Manager tutorial create web application template parameters containerRoot](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-create-web-application-parameters-deploypackageuri.png)
- The value of deployPackageUri is the path to the deployment package. The parameter contains a **$containerRoot** variable. The value of $containerRoot is provided in the [rollout template](#create-the-rollout-template) by concatenating the artifact source SAS location, artifact root, and deployPackageUri.
-4. Open **\ArtifactStore\binaries\1.0.0.0\helloWorldWebAppWUS.zip\index.html**.
+ The value of `deployPackageUri` is the path to the deployment package. The parameter contains a `$containerRoot` variable. The value of `$containerRoot` is provided in the [rollout template](#create-the-rollout-template) by concatenating the artifact source SAS location, artifact root, and `deployPackageUri`.
+1. Open _\ArtifactStore\binaries\1.0.0.0\helloWorldWebAppWUS.zip\index.html_.
```html
<html>
@@ -121,16 +120,16 @@ The two versions (1.0.0.0 and 1.0.0.1) are for the [revision deployment](#deploy
</html> ```
- The html shows the location and the version information. The binary file in the 1.0.0.1 folder shows "Version 1.0.0.1". After you deploy the service, you can browse to these pages.
-5. Check out other artifact files. It helps you to understand the scenario better.
+ The HTML shows the location and the version information. The binary file in the _1.0.0.1_ folder shows _Version 1.0.0.1_. After you deploy the service, you can browse to these pages.
+1. Review the other artifact files to better understand the scenario.
Template artifacts are used by the service topology template, and binary artifacts are used by the rollout template. Both the topology template and the rollout template define an artifact source Azure resource, which is a resource used to point Resource Manager to the template and binary artifacts that are used in the deployment. To simplify the tutorial, one storage account is used to store both the template artifacts and the binary artifacts. Both artifact sources point to the same storage account.

Run the following PowerShell script to create a resource group, create a storage account, create a blob container, upload the downloaded files, and then create a SAS token.

> [!IMPORTANT]
-> **projectName** in the PowerShell script is used to generate names for the Azure services that are deployed in this tutorial. Different Azure services have different requirements on the names. To ensure the deployment is successful, choose a name with less than 12 characters with only lower case letters and numbers.
-> Save a copy of the project name. You use the same projectName through the tutorial.
+> `projectName` in the PowerShell script is used to generate names for the Azure services that are deployed in this tutorial. Different Azure services have different requirements on the names. To ensure the deployment is successful, choose a name of fewer than 12 characters, with only lowercase letters and numbers.
+> Save a copy of the project name. You use the same `projectName` throughout the tutorial.
```azurepowershell
$projectName = Read-Host -Prompt "Enter a project name that is used to generate Azure resource names"
@@ -172,9 +171,9 @@ $url = $storageAccount.PrimaryEndpoints.Blob + $containerName + $token
Write-Host $url ```
-Make a copy of the URL with the SAS token. This URL is needed to populate a field in the two parameter files, topology parameters file and rollout parameters file.
+Make a copy of the URL with the SAS token. This URL is needed to populate a field in the two parameter files: topology parameters file and rollout parameters file.
-Open the container from the Azure portal and verify that both the **binaries** and the **templates** folders and the files are uploaded.
+Open the container from the Azure portal and verify that the _binaries_ and _templates_ folders and their files are uploaded.
## Create the user-assigned managed identity
@@ -183,43 +182,43 @@ Later in the tutorial, you deploy a rollout. A user-assigned managed identity is
You need to create a user-assigned managed identity and configure the access control for your subscription.

1. Sign in to the [Azure portal](https://portal.azure.com).
-2. Create a [user-assigned managed identity](../../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md).
-3. From the portal, select **Subscriptions** from the left menu, and then select your subscription.
-4. Select **Access control (IAM)**, and then select **Add role assignment**.
-5. Enter or select the following values:
+1. Create a [user-assigned managed identity](../../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md).
+1. From the portal, select **Subscriptions** from the left menu, and then select your subscription.
+1. Select **Access control (IAM)**, and then select **Add role assignment**.
+1. Enter or select the following values:
    ![Azure Deployment Manager tutorial user-assigned managed identity access control](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-access-control.png)

    * **Role**: give sufficient permission to complete the artifact deployment (the web applications and the storage accounts). Select **Contributor** in this tutorial. In practice, you want to restrict the permissions to the minimum.
- * **Assigned access to**: select **User Assigned Managed Identity**.
+ * **Assign access to**: select **User Assigned Managed Identity**.
* Select the user-assigned managed identity you created earlier in the tutorial.
-6. Select **Save**.
+1. Select **Save**.
## Create the service topology template
-Open **\ADMTemplates\CreateADMServiceTopology.json**.
+Open _\ADMTemplates\CreateADMServiceTopology.json_.
### The parameters

The template contains the following parameters:
-* **projectName**: This name is used to create the names for the Deployment Manager resources. For example, using "jdoe", the service topology name is **jdoe**ServiceTopology. The resource names are defined in the variables section of this template.
-* **azureResourcelocation**: To simplify the tutorial, all resources share this location unless it is specified otherwise.
-* **artifactSourceSASLocation**: The SAS URI to the Blob container where service unit template and parameters files are stored for deployment. See [Prepare the artifacts](#prepare-the-artifacts).
-* **templateArtifactRoot**: The offset path from the Blob container where the templates and parameters are stored. The default value is **templates/1.0.0.0**. Don't change this value unless you want to change the folder structure explained in [Prepare the artifacts](#prepare-the-artifacts). Relative paths are used in this tutorial. The full path is constructed by concatenating **artifactSourceSASLocation**, **templateArtifactRoot**, and **templateArtifactSourceRelativePath** (or **parametersArtifactSourceRelativePath**).
-* **targetSubscriptionID**: The subscription ID to which the Deployment Manager resources are going to be deployed and billed. Use your subscription ID in this tutorial.
+* `projectName`: This name is used to create the names for the Deployment Manager resources. For example, using **demo**, the service topology name is **demo**ServiceTopology. The resource names are defined in the template's `variables` section.
+* `azureResourcelocation`: To simplify the tutorial, all resources share this location unless it's specified otherwise.
+* `artifactSourceSASLocation`: The SAS URI to the Blob container where service unit template and parameters files are stored for deployment. See [Prepare the artifacts](#prepare-the-artifacts).
+* `templateArtifactRoot`: The offset path from the Blob container where the templates and parameters are stored. The default value is _templates/1.0.0.0_. Don't change this value unless you want to change the folder structure explained in [Prepare the artifacts](#prepare-the-artifacts). Relative paths are used in this tutorial. The full path is constructed by concatenating `artifactSourceSASLocation`, `templateArtifactRoot`, and `templateArtifactSourceRelativePath` (or `parametersArtifactSourceRelativePath`).
+* `targetSubscriptionID`: The subscription ID to which the Deployment Manager resources are going to be deployed and billed. Use your subscription ID in this tutorial.
### The variables
-The variables section defines the names of the resources, the Azure locations for the two
+The variables section defines the names of the resources, the Azure locations for the two services, and the artifact paths.
![Azure Deployment Manager tutorial topology template variables](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-topology-template-variables.png)
-Compare the artifact paths with the folder structure that you uploaded to the storage account. Notice the artifact paths are relative paths. The full path is constructed by concatenating **artifactSourceSASLocation**, **templateArtifactRoot**, and **templateArtifactSourceRelativePath** (or **parametersArtifactSourceRelativePath**).
+Compare the artifact paths with the folder structure that you uploaded to the storage account. Notice the artifact paths are relative paths. The full path is constructed by concatenating `artifactSourceSASLocation`, `templateArtifactRoot`, and `templateArtifactSourceRelativePath` (or `parametersArtifactSourceRelativePath`).
### The resources
-On the root level, there are two resources defined: *an artifact source*, and *a service topology*.
+On the root level, two resources are defined: *an artifact source* and *a service topology*.
The artifact source definition is:
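In outline, an artifact source resource points Resource Manager at the storage container via the SAS URI. A simplified sketch, assuming the variable and parameter names used elsewhere in this tutorial (your template may differ in detail):

```json
{
  "type": "Microsoft.DeploymentManager/artifactSources",
  "apiVersion": "2018-09-01-preview",
  "name": "[variables('templateArtifactSourceName')]",
  "location": "[parameters('azureResourceLocation')]",
  "properties": {
    "sourceType": "AzureStorage",
    "artifactRoot": "[parameters('templateArtifactRoot')]",
    "authentication": {
      "type": "Sas",
      "properties": {
        "sasUri": "[parameters('artifactSourceSASLocation')]"
      }
    }
  }
}
```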
@@ -229,28 +228,28 @@ The following screenshot only shows some parts of the service topology, services
![Azure Deployment Manager tutorial topology template resources service topology](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-topology-template-resources-service-topology.png)
-* **artifactSourceId** is used to associate the artifact source resource to the service topology resource.
-* **dependsOn**: All the service topology resources depend on the artifact source resource.
-* **artifacts** point to the template artifacts. Relative paths are used here. The full path is constructed by concatenating artifactSourceSASLocation (defined in the artifact source), artifactRoot (defined in the artifact source), and templateArtifactSourceRelativePath (or parametersArtifactSourceRelativePath).
+* `artifactSourceId`: Used to associate the artifact source resource to the service topology resource.
+* `dependsOn`: All the service topology resources depend on the artifact source resource.
+* `artifacts`: Point to the template artifacts. Relative paths are used here. The full path is constructed by concatenating `artifactSourceSASLocation` (defined in the artifact source), `artifactRoot` (defined in the artifact source), and `templateArtifactSourceRelativePath` (or `parametersArtifactSourceRelativePath`).
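As a concrete example of that concatenation, assume a hypothetical storage account named `demostore` and an abbreviated SAS token. The fragment below is purely illustrative; `resolvedTemplateUri` is not a real property, it just shows the URL that results from joining the three pieces:

```json
{
  "artifactSourceSASLocation": "https://demostore.blob.core.windows.net/admfiles?sv=2019-02-02&sig=<signature>",
  "artifactRoot": "templates/1.0.0.0",
  "templateArtifactSourceRelativePath": "ServiceWUS/CreateWebApplication.json",
  "resolvedTemplateUri": "https://demostore.blob.core.windows.net/admfiles/templates/1.0.0.0/ServiceWUS/CreateWebApplication.json?sv=2019-02-02&sig=<signature>"
}
```

Note that the SAS query string is carried along, so the resolved URI stays authorized against the private container.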
### Topology parameters file

You create a parameters file used with the topology template.
-1. Open **\ADMTemplates\CreateADMServiceTopology.Parameters** in Visual Studio Code or any text editor.
-2. Fill the parameter values:
+1. Open _\ADMTemplates\CreateADMServiceTopology.Parameters.json_ in Visual Studio Code or any text editor.
+1. Enter the parameter values:
- * **projectName**: Enter a string with 4-5 characters. This name is used to create unique azure resource names.
- * **azureResourceLocation**: If you are not familiar with Azure locations, use **centralus** in this tutorial.
- * **artifactSourceSASLocation**: Enter the SAS URI to the root directory (the Blob container) where service unit template and parameters files are stored for deployment. See [Prepare the artifacts](#prepare-the-artifacts).
- * **templateArtifactRoot**: Unless you change the folder structure of the artifacts, use **templates/1.0.0.0** in this tutorial.
+ * `projectName`: Enter a string with 4-5 characters. This name is used to create unique Azure resource names.
+ * `azureResourceLocation`: If you're not familiar with Azure locations, use **centralus** in this tutorial.
+ * `artifactSourceSASLocation`: Enter the SAS URI to the root directory (the Blob container) where service unit template and parameters files are stored for deployment. See [Prepare the artifacts](#prepare-the-artifacts).
+ * `templateArtifactRoot`: Unless you change the folder structure of the artifacts, use _templates/1.0.0.0_ in this tutorial.
> [!IMPORTANT]
-> The topology template and the rollout template share some common parameters. These parameters must have the same values. These parameters are: **projectName**, **azureResourceLocation**, and **artifactSourceSASLocation** (both artifact sources share the same storage account in this tutorial).
+> The topology template and the rollout template share some common parameters. These parameters must have the same values. These parameters are: `projectName`, `azureResourceLocation`, and `artifactSourceSASLocation` (both artifact sources share the same storage account in this tutorial).
## Create the rollout template
-Open **\ADMTemplates\CreateADMRollout.json**.
+Open _\ADMTemplates\CreateADMRollout.json_.
### The parameters
@@ -258,15 +257,15 @@ The template contains the following parameters:
![Azure Deployment Manager tutorial rollout template parameters](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-rollout-template-parameters.png)
-* **projectName**: This name is used to create the names for the Deployment Manager resources. For example, using "jdoe", the rollout name is **jdoe**Rollout. The names are defined in the variables section of the template.
-* **azureResourcelocation**: To simplify the tutorial, all Deployment Manager resources share this location unless it is specified otherwise.
-* **artifactSourceSASLocation**: The SAS URI to the root directory (the Blob container) where service unit template and parameters files are stored for deployment. See [Prepare the artifacts](#prepare-the-artifacts).
-* **binaryArtifactRoot**: The default value is **binaries/1.0.0.0**. Don't change this value unless you want to change the folder structure explained in [Prepare the artifacts](#prepare-the-artifacts). Relative paths are used in this tutorial. The full path is constructed by concatenating **artifactSourceSASLocation**, **binaryArtifactRoot**, and the **deployPackageUri** specified in the CreateWebApplicationParameters.json. See [Prepare the artifacts](#prepare-the-artifacts).
-* **managedIdentityID**: The user-assigned managed identity that performs the deployment actions. See [Create the user-assigned managed identity](#create-the-user-assigned-managed-identity).
+* `projectName`: This name is used to create the names for the Deployment Manager resources. For example, using **demo**, the rollout name is **demo**Rollout. The names are defined in the template's `variables` section.
+* `azureResourcelocation`: To simplify the tutorial, all Deployment Manager resources share this location unless it's specified otherwise.
+* `artifactSourceSASLocation`: The SAS URI to the root directory (the Blob container) where service unit template and parameters files are stored for deployment. See [Prepare the artifacts](#prepare-the-artifacts).
+* `binaryArtifactRoot`: The default value is _binaries/1.0.0.0_. Don't change this value unless you want to change the folder structure explained in [Prepare the artifacts](#prepare-the-artifacts). Relative paths are used in this tutorial. The full path is constructed by concatenating `artifactSourceSASLocation`, `binaryArtifactRoot`, and the `deployPackageUri` specified in _CreateWebApplicationParameters.json_. See [Prepare the artifacts](#prepare-the-artifacts).
+* `managedIdentityID`: The user-assigned managed identity that performs the deployment actions. See [Create the user-assigned managed identity](#create-the-user-assigned-managed-identity).
### The variables
-The variables section defines the names of the resources. Make sure the service topology name, the service names, and the service unit names match the names defined in the [topology template](#create-the-service-topology-template).
+The `variables` section defines the names of the resources. Make sure the service topology name, the service names, and the service unit names match the names defined in the [topology template](#create-the-service-topology-template).
![Azure Deployment Manager tutorial rollout template variables](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-rollout-template-variables.png)
@@ -274,44 +273,44 @@ The variables section defines the names of the resources. Make sure the service
On the root level, there are three resources defined: an artifact source, a step, and a rollout.
-The artifact source definition is identical to the one defined in the topology template. See [Create the service topology template](#create-the-service-topology-template) for more information.
+The artifact source definition is identical to the one defined in the topology template. See [Create the service topology template](#create-the-service-topology-template) for more information.
-The following screenshot shows the wait step definition:
+The following screenshot shows the `wait` step definition:
![Azure Deployment Manager tutorial rollout template resources wait step](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-rollout-template-resources-wait-step.png)
-The duration is using the [ISO 8601 standard](https://en.wikipedia.org/wiki/ISO_8601#Durations). **PT1M** (capital letters are required) is an example of a 1-minute wait.
+The duration uses the [ISO 8601 standard](https://en.wikipedia.org/wiki/ISO_8601#Durations). **PT1M** (capital letters are required) is an example of a 1-minute wait.
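A wait step is itself a small `Microsoft.DeploymentManager/steps` resource. As a sketch (the step name is an illustrative assumption):

```json
{
  "type": "Microsoft.DeploymentManager/steps",
  "apiVersion": "2018-09-01-preview",
  "name": "waitStep",
  "location": "[parameters('azureResourcelocation')]",
  "properties": {
    "stepType": "wait",
    "attributes": {
      "duration": "PT1M"
    }
  }
}
```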
The following screenshot only shows some parts of the rollout definition:

![Azure Deployment Manager tutorial rollout template resources rollout](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-rollout-template-resources-rollout.png)
-* **dependsOn**: The rollout resource depends on the artifact source resource, and any of the steps defined.
-* **artifactSourceId**: used to associate the artifact source resource to the rollout resource.
-* **targetServiceTopologyId**: used to associate the service topology resource to the rollout resource.
-* **deploymentTargetId**: It is the service unit resource ID of the service topology resource.
-* **preDeploymentSteps** and **postDeploymentSteps**: contains the rollout steps. In the template, a wait step is called.
-* **dependsOnStepGroups**: configure the dependencies between the step groups.
+* `dependsOn`: The rollout resource depends on the artifact source resource and any of the defined steps.
+* `artifactSourceId`: Used to associate the artifact source resource to the rollout resource.
+* `targetServiceTopologyId`: Used to associate the service topology resource to the rollout resource.
+* `deploymentTargetId`: It's the service unit resource ID of the service topology resource.
+* `preDeploymentSteps` and `postDeploymentSteps`: Contain the rollout steps. In the template, a `wait` step is called.
+* `dependsOnStepGroups`: Configures the dependencies between the step groups.
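Putting those properties together, a rollout resource has roughly the following shape. This is a simplified sketch: the names (`demoRollout`, `demoBinaryArtifactSource`, `waitStep`, `stepGroup1`) and the single step group are illustrative assumptions, not the tutorial's exact template.

```json
{
  "type": "Microsoft.DeploymentManager/rollouts",
  "apiVersion": "2018-09-01-preview",
  "name": "demoRollout",
  "location": "[parameters('azureResourcelocation')]",
  "identity": {
    "type": "userAssigned",
    "identityIds": [
      "[parameters('managedIdentityID')]"
    ]
  },
  "dependsOn": [
    "demoBinaryArtifactSource",
    "waitStep"
  ],
  "properties": {
    "buildVersion": "1.0.0.0",
    "artifactSourceId": "<resource ID of the binary artifact source>",
    "targetServiceTopologyId": "<resource ID of the service topology>",
    "stepGroups": [
      {
        "name": "stepGroup1",
        "preDeploymentSteps": [
          { "stepId": "<resource ID of the wait step>" }
        ],
        "deploymentTargetId": "<resource ID of a service unit>",
        "postDeploymentSteps": []
      }
    ]
  }
}
```

In practice there is one step group per service unit, and `dependsOnStepGroups` sequences them.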
### Rollout parameters file

You create a parameters file used with the rollout template.
-1. Open **\ADMTemplates\CreateADMRollout.Parameters** in Visual Studio Code or any text editor.
-2. Fill the parameter values:
+1. Open _\ADMTemplates\CreateADMRollout.Parameters.json_ in Visual Studio Code or any text editor.
+1. Enter the parameter values:
- * **projectName**: Enter a string with 4-5 characters. This name is used to create unique azure resource names.
- * **azureResourceLocation**: Specify an Azure location.
- * **artifactSourceSASLocation**: Enter the SAS URI to the root directory (the Blob container) where service unit template and parameters files are stored for deployment. See [Prepare the artifacts](#prepare-the-artifacts).
- * **binaryArtifactRoot**: Unless you change the folder structure of the artifacts, use **binaries/1.0.0.0** in this tutorial.
- * **managedIdentityID**: Enter the user-assigned managed identity. See [Create the user-assigned managed identity](#create-the-user-assigned-managed-identity). The syntax is:
+ * `projectName`: Enter a string with 4-5 characters. This name is used to create unique Azure resource names.
+ * `azureResourceLocation`: Specify an Azure location.
+ * `artifactSourceSASLocation`: Enter the SAS URI to the root directory (the Blob container) where service unit template and parameters files are stored for deployment. See [Prepare the artifacts](#prepare-the-artifacts).
+ * `binaryArtifactRoot`: Unless you change the folder structure of the artifacts, use _binaries/1.0.0.0_ in this tutorial.
+ * `managedIdentityID`: Enter the user-assigned managed identity. See [Create the user-assigned managed identity](#create-the-user-assigned-managed-identity). The syntax is:
- ```
+ ```json
"/subscriptions/<SubscriptionID>/resourcegroups/<ResourceGroupName>/providers/Microsoft.ManagedIdentity/userassignedidentities/<ManagedIdentityName>"
```

> [!IMPORTANT]
-> The topology template and the rollout template share some common parameters. These parameters must have the same values. These parameters are: **projectName**, **azureResourceLocation**, and **artifactSourceSASLocation** (both artifact sources share the same storage account in this tutorial).
+> The topology template and the rollout template share some common parameters. These parameters must have the same values. These parameters are: `projectName`, `azureResourceLocation`, and `artifactSourceSASLocation` (both artifact sources share the same storage account in this tutorial).
## Deploy the templates
@@ -327,18 +326,18 @@ Azure PowerShell can be used to deploy the templates.
-TemplateParameterFile "$filePath\ADMTemplates\CreateADMServiceTopology.Parameters.json" ```
- If you run this script from a different PowerShell session from the one you ran the [Prepare the artifacts](#prepare-the-artifacts) script, you need to repopulate the variables first, which include **$resourceGroupName** and **$filePath**.
+ If you run this script from a different PowerShell session from the one you ran the [Prepare the artifacts](#prepare-the-artifacts) script, you need to repopulate the variables first, which include `$resourceGroupName` and `$filePath`.
> [!NOTE]
- > `New-AzResourceGroupDeployment` is an asynchronous call. The success message only means the deployment has successfully begun. To verify the deployment, see step 2 and step 4 of this procedure.
+ > `New-AzResourceGroupDeployment` is an asynchronous call. The success message only means the deployment has successfully begun. To verify the deployment, see Step 2 and Step 4 of this procedure.
-2. Verify the service topology and the underlined resources have been created successfully using the Azure portal:
+1. Verify that the service topology and the underlying resources have been created successfully by using the Azure portal:
![Azure Deployment Manager tutorial deployed service topology resources](./media/deployment-manager-tutorial/azure-deployment-manager-tutorial-deployed-topology-resources.png) **Show hidden types** must be selected to see the resources.
-3. <a id="deploy-the-rollout-template"></a>Deploy the rollout template:
+1. <a id="deploy-the-rollout-template"></a>Deploy the rollout template:
```azurepowershell # Create the rollout
@@ -348,7 +347,7 @@ Azure PowerShell can be used to deploy the templates.
-TemplateParameterFile "$filePath\ADMTemplates\CreateADMRollout.Parameters.json" ```
-4. Check the rollout progress using the following PowerShell script:
+1. Check the rollout progress using the following PowerShell script:
```azurepowershell # Get the rollout status
@@ -359,11 +358,11 @@ Azure PowerShell can be used to deploy the templates.
-Verbose ```
- The Deployment Manager PowerShell cmdlets must be installed before you can run this cmdlet. See Prerequisites. The -Verbose switch can be used to see the whole output.
+ The Deployment Manager PowerShell cmdlets must be installed before you can run this cmdlet. See [Prerequisites](#prerequisites). The `-Verbose` parameter can be used to see the whole output.
The following sample shows the running status:
- ```
+ ```Output
VERBOSE: Status: Succeeded
@@ -419,37 +418,37 @@ Azure PowerShell can be used to deploy the templates.
Tags : ```
- After the rollout is deployed successfully, you shall see two more resource groups created, one for each service.
+ After the rollout is deployed successfully, you'll see two more resource groups created, one for each service.
## Verify the deployment

1. Open the [Azure portal](https://portal.azure.com).
-2. Browse to the newly create web applications under the new resource groups created by the rollout deployment.
-3. Open the web application in a web browser. Verify the location and the version on the index.html file.
+1. Browse to the newly created web applications under the new resource groups created by the rollout deployment.
+1. Open the web application in a web browser. Verify the location and the version in the _index.html_ file.
## Deploy the revision

When you have a new version (1.0.0.1) of the web application, you can use the following procedure to redeploy it.
-1. Open CreateADMRollout.Parameters.json.
-2. Update **binaryArtifactRoot** to **binaries/1.0.0.1**.
-3. Redeploy the rollout as instructed in [Deploy the templates](#deploy-the-rollout-template).
-4. Verify the deployment as instructed in [Verify the deployment](#verify-the-deployment). The web page shall show the 1.0.0.1 version.
+1. Open _CreateADMRollout.Parameters.json_.
+1. Update `binaryArtifactRoot` to _binaries/1.0.0.1_.
+1. Redeploy the rollout as instructed in [Deploy the templates](#deploy-the-rollout-template).
+1. Verify the deployment as instructed in [Verify the deployment](#verify-the-deployment). The web page should show version 1.0.0.1.
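Steps 1 and 2 above can also be scripted. A minimal sketch, assuming the rollout parameters file uses the standard ARM parameters-file shape (`parameters.binaryArtifactRoot.value`); the `bump_binary_artifact_root` helper name is an invention for illustration:

```python
import json

def bump_binary_artifact_root(parameters_path, new_root):
    """Point the rollout at a new artifact folder, for example 'binaries/1.0.0.1'."""
    with open(parameters_path) as f:
        doc = json.load(f)
    # Rewrite the value in place, preserving the rest of the file.
    doc["parameters"]["binaryArtifactRoot"]["value"] = new_root
    with open(parameters_path, "w") as f:
        json.dump(doc, f, indent=2)
    return new_root
```

After running it against _CreateADMRollout.Parameters.json_, redeploy the rollout as before.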
## Clean up resources

When the Azure resources are no longer needed, clean up the resources you deployed by deleting the resource group.

1. From the Azure portal, select **Resource group** from the left menu.
-2. Use the **Filter by name** field to narrow down the resource groups created in this tutorial. There shall be 3-4:
+1. Use the **Filter by name** field to narrow down the resource groups created in this tutorial.
    * **&lt;projectName>rg**: contains the Deployment Manager resources.
    * **&lt;projectName>ServiceWUSrg**: contains the resources defined by ServiceWUS.
    * **&lt;projectName>ServiceEUSrg**: contains the resources defined by ServiceEUS.
    * The resource group for the user-defined managed identity.
-3. Select the resource group name.
-4. Select **Delete resource group** from the top menu.
-5. Repeat the last two steps to delete other resource groups created by this tutorial.
+1. Select the resource group name.
+1. Select **Delete resource group** from the top menu.
+1. Repeat the last two steps to delete other resource groups created by this tutorial.
## Next steps
azure-resource-manager https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/template-deploy-what-if https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/template-deploy-what-if.md
@@ -3,7 +3,7 @@ Title: Template deployment what-if
description: Determine what changes will happen to your resources before deploying an Azure Resource Manager template. Previously updated : 12/15/2020 Last updated : 02/05/2021 # ARM template deployment what-if operation
@@ -16,10 +16,6 @@ You can use the what-if operation with Azure PowerShell, Azure CLI, or REST API
To use what-if in PowerShell, you must have version **4.2 or later of the Az module**.
-But, before installing the required module, make sure you have PowerShell Core (6.x or 7.x). If you have PowerShell 5.x or earlier, [update your version of PowerShell](/powershell/scripting/install/installing-powershell). You can't install the required module on PowerShell 5.x or earlier.
-
-### Install latest version
- To install the module, use: ```powershell
@@ -320,7 +316,7 @@ results=$(az deployment group what-if --resource-group ExampleGroup --template-u
The what-if operation supports using [deployment mode](deployment-modes.md). When set to complete mode, resources not in the template are deleted. The following example deploys a [template that has no resources defined](https://github.com/Azure/azure-docs-json-samples/blob/master/empty-template/azuredeploy.json) in complete mode.
-To preview changes before deploying a template, use the confirm switch parameter with the deployment command. If the changes are as you expected, acknowledge that you want the deployment to complete.
+To preview changes before deploying a template, use the confirm switch parameter with the deployment command. If the changes are as you expected, respond that you want the deployment to complete.
# [PowerShell](#tab/azure-powershell)
azure-sql https://docs.microsoft.com/en-us/azure/azure-sql/database/high-availability-sla https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/high-availability-sla.md
@@ -8,9 +8,9 @@
ms.devlang: ---+++ Last updated 10/28/2020
@@ -55,7 +55,7 @@ The zone redundant version of the high availability architecture for the general
![Zone redundant configuration for general purpose](./media/high-availability-sla/zone-redundant-for-general-purpose.png) > [!IMPORTANT]
-> For up to date information about the regions that support zone redundant databases, see [Services support by region](../../availability-zones/az-region.md). Zone redundant configuration is only available when the Gen5 compute hardware is selected. This feature is not available in SQL Managed Instance.
+> Zone redundant configuration is only available when the Gen5 compute hardware is selected. This feature is not available in SQL Managed Instance. Zone redundant configuration for general purpose tier is only available in the following regions: East US, East US 2, West US 2, North Europe, West Europe, Southeast Asia, Australia East, Japan East, UK South, and France Central.
> [!NOTE] > General Purpose databases with a size of 80 vcore may experience performance degradation with zone redundant configuration. Additionally, operations such as backup, restore, database copy, and setting up Geo-DR relationships may experience slower performance for any single databases larger than 1 TB.
@@ -134,4 +134,4 @@ Azure SQL Database and Azure SQL Managed Instance feature a built-in high availa
- Learn about [Service Fabric](../../service-fabric/service-fabric-overview.md) - Learn about [Azure Traffic Manager](../../traffic-manager/traffic-manager-overview.md) - Learn [How to initiate a manual failover on SQL Managed Instance](../managed-instance/user-initiated-failover.md)-- For more options for high availability and disaster recovery, see [Business Continuity](business-continuity-high-availability-disaster-recover-hadr-overview.md)
+- For more options for high availability and disaster recovery, see [Business Continuity](business-continuity-high-availability-disaster-recover-hadr-overview.md)
azure-sql https://docs.microsoft.com/en-us/azure/azure-sql/managed-instance/transact-sql-tsql-differences-sql-server https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/transact-sql-tsql-differences-sql-server.md
@@ -486,7 +486,7 @@ Cross-instance service broker isn't supported:
- `scan for startup procs` - `sp_execute_external_scripts` isn't supported. See [sp_execute_external_scripts](/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql#examples). - `xp_cmdshell` isn't supported. See [xp_cmdshell](/sql/relational-databases/system-stored-procedures/xp-cmdshell-transact-sql).-- `Extended stored procedures` aren't supported, which includes `sp_addextendedproc` and `sp_dropextendedproc`. See [Extended stored procedures](/sql/relational-databases/system-stored-procedures/general-extended-stored-procedures-transact-sql).
+- `Extended stored procedures` aren't supported, and this includes `sp_addextendedproc` and `sp_dropextendedproc`. This functionality won't be supported because it's on a deprecation path for SQL Server. For more details, see [Extended Stored Procedures](/sql/relational-databases/extended-stored-procedures-programming/database-engine-extended-stored-procedures-programming).
- `sp_attach_db`, `sp_attach_single_file_db`, and `sp_detach_db` aren't supported. See [sp_attach_db](/sql/relational-databases/system-stored-procedures/sp-attach-db-transact-sql), [sp_attach_single_file_db](/sql/relational-databases/system-stored-procedures/sp-attach-single-file-db-transact-sql), and [sp_detach_db](/sql/relational-databases/system-stored-procedures/sp-detach-db-transact-sql). ### System functions and variables
azure-vmware https://docs.microsoft.com/en-us/azure/azure-vmware/azure-security-integration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/azure-security-integration.md
@@ -2,7 +2,7 @@
Title: Protect your Azure VMware Solution VMs with Azure Security Center integration description: Learn how to protect your Azure VMware Solution VMs with Azure's native security tools from a single dashboard in Azure Security Center. Previously updated : 11/06/2020 Last updated : 02/04/2021 # Protect your Azure VMware Solution VMs with Azure Security Center integration
@@ -31,7 +31,12 @@ You can configure the Log Analytics workspace with Azure Sentinel for alert dete
- Azure native services can be used for hybrid environment security in Azure, Azure VMware Solution, and on-premises services. - Using a Log Analytics workspace, you can collect the data or the logs to a single point and present the same data to different Azure native services.-- Azure Security Center provides security features like file integrity monitoring, fileless attack detection, operating system patch assessment, security misconfigurations assessment, and endpoint protection assessment.
+- Azure Security Center offers a number of features, including:
+ - File integrity monitoring
+ - Fileless attack detection
+ - Operating system patch assessment
+ - Security misconfigurations assessment
+ - Endpoint protection assessment
- Azure Sentinel allows you to: - Collect data at cloud scale across all users, devices, applications, and infrastructure, both on premises and in multiple clouds. - Detect previously undetected threats.
@@ -178,7 +183,7 @@ After connecting data sources to Azure Sentinel, you can create rules to generat
6. On the **Incident settings** tab, enable **Create incidents from alerts triggered by this analytics rule** and select **Next: Automated response >**.
- :::image type="content" source="media/azure-security-integration/create-new-analytic-rule-wizard.png" alt-text="Screenshot of the Analytic rule wizard for creating a new rule in Azure Sentinel showing Create incidents from alerts triggered by this analytics rule as enabled.":::
+ :::image type="content" source="media/azure-security-integration/create-new-analytic-rule-wizard.png" alt-text="Screenshot of the Analytic rule wizard for creating a new rule in Azure Sentinel. Shows Create incidents from alerts triggered by this rule as enabled.":::
7. Select **Next: Review >**.
@@ -230,6 +235,8 @@ You can create queries or use the available pre-defined query in Azure Sentinel
## Next steps -- Learn to use the [Azure Defender dashboard](../security-center/azure-defender-dashboard.md).-- Explore the full range of protection offered by [Azure Defender](../security-center/azure-defender.md).-- Learn about [Advanced multistage attack detection in Azure Sentinel](../azure-monitor/learn/quick-create-workspace.md).
+Now that you've covered how to protect your Azure VMware Solution VMs, you may want to learn about:
+
+- Using the [Azure Defender dashboard](../security-center/azure-defender-dashboard.md).
+- [Advanced multistage attack detection in Azure Sentinel](../azure-monitor/learn/quick-create-workspace.md).
+- [Lifecycle management of Azure VMware Solution VMs](lifecycle-management-of-azure-vmware-solution-vms.md).
azure-vmware https://docs.microsoft.com/en-us/azure/azure-vmware/backup-azure-vmware-solution-virtual-machines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/backup-azure-vmware-solution-virtual-machines.md
@@ -2,7 +2,7 @@
Title: Back up Azure VMware Solution VMs with Azure Backup Server description: Configure your Azure VMware Solution environment to back up virtual machines by using Azure Backup Server. Previously updated : 06/09/2020 Last updated : 02/04/2021 # Back up Azure VMware Solution VMs with Azure Backup Server
@@ -351,7 +351,7 @@ You can restore individual files from a protected VM recovery point. This featur
## Next steps
-For troubleshooting issues when setting up backups, review the troubleshooting guide for Azure Backup Server.
+Now that you've covered backing up your Azure VMware Solution VMs with Azure Backup Server, you may want to learn about:
-> [!div class="nextstepaction"]
-> [Troubleshooting guide for Azure Backup Server](../backup/backup-azure-mabs-troubleshoot.md)
+- [Troubleshooting when setting up backups in Azure Backup Server](../backup/backup-azure-mabs-troubleshoot.md).
+- [Lifecycle management of Azure VMware Solution VMs](lifecycle-management-of-azure-vmware-solution-vms.md).
azure-vmware https://docs.microsoft.com/en-us/azure/azure-vmware/public-ip-usage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/public-ip-usage.md
@@ -2,7 +2,7 @@
Title: How to use the public IP functionality in Azure VMware Solution description: This article explains how to use the public IP functionality in Azure Virtual WAN. Previously updated : 10/28/2020 Last updated : 02/04/2021 # How to use the public IP functionality in Azure VMware Solution
@@ -110,11 +110,11 @@ Once all components are deployed, you can see them in the added Resource group.
:::image type="content" source="media/public-ip-usage/create-firewall-policy.png" alt-text="Screenshot of how to create a firewall policy in Firewall Manager" border="true" lightbox="media/public-ip-usage/create-firewall-policy.png":::
-1. Under the **Basics** tab, provide the required details and select **Next : DNS Settings**.
+1. Under the **Basics** tab, provide the required details and select **Next: DNS Settings**.
-1. Under the **DNS** tab, select **Disable**, and then select **Next : Rules**.
+1. Under the **DNS** tab, select **Disable**, and then select **Next: Rules**.
-1. Select **Add a rule collection**, provide the below details, and select **Add** and then select **Next : Threat intelligence**.
+1. Select **Add a rule collection**, provide the following details, select **Add**, and then select **Next: Threat intelligence**.
    - Name
    - Rules collection Type - DNAT
@@ -130,7 +130,7 @@ Once all components are deployed, you can see them in the added Resource group.
    - Translated address - **Azure VMware Solution Web Server private IP Address**
    - Translated port - **Azure VMware Solution Web Server port**
-1. Leave the default value, and then select **Next : Hubs**.
+1. Leave the default value, and then select **Next: Hubs**.
1. Select **Associate virtual hub**.
@@ -138,11 +138,11 @@ Once all components are deployed, you can see them in the added Resource group.
:::image type="content" source="media/public-ip-usage/secure-hubs-with-azure-firewall-polcy.png" alt-text="Screenshot that shows the selected hubs that will be converted to Secured Virtual Hubs." border="true" lightbox="media/public-ip-usage/secure-hubs-with-azure-firewall-polcy.png":::
-1. Select **Next : Tags**.
+1. Select **Next: Tags**.
1. (Optional) Create name and value pairs to categorize your resources.
-1. Select **Next : Review + create** and then select **Create**.
+1. Select **Next: Review + create** and then select **Create**.
## Limitations
@@ -150,5 +150,7 @@ You can have 100 public IPs per SDDCs.
## Next steps
-Learn more about using public IP addresses using [Azure Virtual WAN](../virtual-wan/virtual-wan-about.md).
+Now that you've covered how to use the public IP functionality in Azure VMware Solution, you may want to learn about:
+- Using public IP addresses with [Azure Virtual WAN](../virtual-wan/virtual-wan-about.md).
+- [Creating an IPSec tunnel into Azure VMware Solution](create-ipsec-tunnel.md).
azure-vmware https://docs.microsoft.com/en-us/azure/azure-vmware/set-up-backup-server-for-azure-vmware-solution https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/set-up-backup-server-for-azure-vmware-solution.md
@@ -2,7 +2,7 @@
Title: Set up Azure Backup Server for Azure VMware Solution description: Set up your Azure VMware Solution environment to back up virtual machines using Azure Backup Server. Previously updated : 10/23/2020 Last updated : 02/04/2021 # Set up Azure Backup Server for Azure VMware Solution
@@ -384,7 +384,7 @@ Azure Backup Server v3 only accepts storage volumes. When you add a volume, Azur
## Next steps
-Continue to the next tutorial to learn how to configure a backup of VMware VMs running on Azure VMware Solution using Azure Backup Server.
+Now that you've covered how to set up Azure Backup Server for Azure VMware Solution, you may want to learn about:
-> [!div class="nextstepaction"]
-> [Configure backup of Azure VMware Solution VMs](backup-azure-vmware-solution-virtual-machines.md)
+- [Configuring backups for your Azure VMware Solution VMs](backup-azure-vmware-solution-virtual-machines.md).
+- [Protecting your Azure VMware Solution VMs with Azure Security Center integration](azure-security-integration.md).
bastion https://docs.microsoft.com/en-us/azure/bastion/bastion-faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/bastion/bastion-faq.md
@@ -6,13 +6,11 @@
Previously updated : 11/05/2020 Last updated : 02/05/2021 # Azure Bastion FAQ
-This is the FAQ for Azure Bastion.
- [!INCLUDE [Bastion FAQ](../../includes/bastion-faq-include.md)] [!INCLUDE [FAQ for VNet peering](../../includes/bastion-faq-peering-include.md)]
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/Concepts/azure-resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/Concepts/azure-resources.md
@@ -129,6 +129,94 @@ Get the latest runtime updates by [updating your App Service in the Azure portal
+## Keys in QnA Maker
+
+# [QnA Maker GA (stable release)](#tab/v1)
+
+Your QnA Maker service deals with two kinds of keys: **authoring keys** and **query endpoint keys** used with the runtime hosted in the App service.
+
+Use these keys when making requests to the service through APIs.
+
+![Key management](../media/qnamaker-how-to-key-management/key-management.png)
+
+|Name|Location|Purpose|
+|--|--|--|
+|Authoring/Subscription key|[Azure portal](https://azure.microsoft.com/free/cognitive-services/)|These keys are used to access the [QnA Maker management service APIs](/rest/api/cognitiveservices/qnamaker4.0/knowledgebase). These APIs let you edit the questions and answers in your knowledge base, and publish your knowledge base. These keys are created when you create a new QnA Maker service.<br><br>Find these keys on the **Cognitive Services** resource on the **Keys** page.|
+|Query endpoint key|[QnA Maker portal](https://www.qnamaker.ai)|These keys are used to query the published knowledge base endpoint to get a response for a user question. You typically use this query endpoint in your chat bot or in the client application code that connects to the QnA Maker service. These keys are created when you publish your QnA Maker knowledge base.<br><br>Find these keys in the **Service settings** page. Find this page from the user's menu in the upper right of the page on the drop-down menu.|
+
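The two key types in the table are sent in different HTTP headers when calling the service. The following sketch builds the header shapes only; it makes no network call, and the helper names are illustrative, not part of any QnA Maker SDK:

```python
def authoring_headers(subscription_key):
    """Headers for QnA Maker management (authoring) API requests."""
    return {
        "Ocp-Apim-Subscription-Key": subscription_key,  # authoring/subscription key
        "Content-Type": "application/json",
    }

def query_headers(endpoint_key):
    """Headers for generateAnswer requests against the published endpoint."""
    return {
        "Authorization": f"EndpointKey {endpoint_key}",  # query endpoint key
        "Content-Type": "application/json",
    }
```

Using the wrong key type for a call is a common source of 401 responses: the authoring key belongs only in management requests, and the endpoint key only in queries against the published knowledge base.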
+### Find authoring keys in the Azure portal
+
+You can view and reset your authoring keys from the Azure portal, where you created the QnA Maker resource. These keys may be referred to as subscription keys.
+
+1. Go to the QnA Maker resource in the Azure portal and select the resource that has the _Cognitive Services_ type:
+
+ ![QnA Maker resource list](../media/qnamaker-how-to-key-management/qnamaker-resource-list.png)
+
+2. Go to **Keys and Endpoint**:
+
+ ![QnA Maker managed (Preview) Subscription key](../media/qnamaker-how-to-key-management/subscription-key-v2.png)
+
+### Find query endpoint keys in the QnA Maker portal
+
+The endpoint is in the same region as the resource because the endpoint keys are used to make a call to the knowledge base.
+
+Endpoint keys can be managed from the [QnA Maker portal](https://qnamaker.ai).
+
+1. Sign in to the [QnA Maker portal](https://qnamaker.ai), go to your profile, and then select **Service settings**:
+
+ ![Endpoint key](../media/qnamaker-how-to-key-management/Endpoint-keys.png)
+
+2. View or reset your keys:
+
+ > [!div class="mx-imgBorder"]
+ > ![Endpoint key manager](../media/qnamaker-how-to-key-management/Endpoint-keys1.png)
+
+ >[!NOTE]
+ >Refresh your keys if you think they've been compromised. This may require corresponding changes to your client application or bot code.
+
+# [QnA Maker managed (preview release)](#tab/v2)
+
+Your QnA Maker managed (Preview) service deals with two kinds of keys: **authoring keys** and **Azure Cognitive Search keys** used to access the service in the customer's subscription.
+
+Use these keys when making requests to the service through APIs.
+
+![Key management managed preview](../media/qnamaker-how-to-key-management/qnamaker-v2-key-management.png)
+
+|Name|Location|Purpose|
+|--|--|--|
+|Authoring/Subscription key|[Azure portal](https://azure.microsoft.com/free/cognitive-services/)|These keys are used to access the [QnA Maker management service APIs](/rest/api/cognitiveservices/qnamaker4.0/knowledgebase). These APIs let you edit the questions and answers in your knowledge base, and publish your knowledge base. These keys are created when you create a new QnA Maker service.<br><br>Find these keys on the **Cognitive Services** resource on the **Keys** page.|
+|Azure Cognitive Search Admin Key|[Azure portal](../../../search/search-security-api-keys.md)|These keys are used to communicate with the Azure Cognitive Search service deployed in the user's Azure subscription. When you associate an Azure Cognitive Search resource with the QnA Maker managed (Preview) service, the admin key is automatically passed on to the QnA Maker service. <br><br>You can find these keys on the **Azure Cognitive Search** resource on the **Keys** page.|
+
+### Find authoring keys in the Azure portal
+
+You can view and reset your authoring keys from the Azure portal, where you created the QnA Maker managed (Preview) resource. These keys may be referred to as subscription keys.
+
+1. Go to the QnA Maker managed (Preview) resource in the Azure portal and select the resource that has the *Cognitive Services* type:
+
+ ![QnA Maker managed (Preview) resource list](../media/qnamaker-how-to-key-management/qnamaker-v2-resource-list.png)
+
+2. Go to **Keys and Endpoint**:
+
+ ![QnA Maker managed (Preview) Subscription key](../media/qnamaker-how-to-key-management/subscription-key-v2.png)
+
+### Update the resources
+
+Learn how to upgrade the resources used by your knowledge base. QnA Maker managed (Preview) is **free** while in preview.
+++
+## Management service region
+
+# [QnA Maker GA (stable release)](#tab/v1)
+
+The management service of QnA Maker is used only for the QnA Maker portal and for initial data processing. This service is available only in the **West US** region. No customer data is stored in this West US service.
+
+# [QnA Maker managed (preview release)](#tab/v2)
+
+In QnA Maker managed (Preview), both the management and the prediction services are co-located in the same region. Currently, QnA Maker managed (Preview) is available in **South Central US, North Europe, and Australia East**.
+++ ## Resource naming considerations # [QnA Maker GA (stable release)](#tab/v1)
@@ -219,31 +307,6 @@ If you create a QnA service and its dependencies (such as Search) through the po
Learn [how to configure](../How-To/set-up-qnamaker-service-azure.md#configure-qna-maker-to-use-different-cognitive-search-resource) QnA Maker to use a different Cognitive Service resource than the one created as part of the QnA Maker resource creation process.
-## Management service region
-
-The management service of QnA Maker is used only for the QnA Maker portal and for initial data processing. This service is available only in the **West US** region. No customer data is stored in this West US service.
-
-## Keys in QnA Maker
-
-Your QnA Maker service deals with two kinds of keys: **authoring keys** and **query endpoint keys** used with the runtime hosted in the App service.
-
-Use these keys when making requests to the service through APIs.
-
-![Key management](../media/qnamaker-how-to-key-management/key-management.png)
-
-|Name|Location|Purpose|
-|--|--|--|
-|Authoring/Subscription key|[Azure portal](https://azure.microsoft.com/free/cognitive-services/)|These keys are used to access the [QnA Maker management service APIs](/rest/api/cognitiveservices/qnamaker4.0/knowledgebase). These APIs let you edit the questions and answers in your knowledge base, and publish your knowledge base. These keys are created when you create a new QnA Maker service.<br><br>Find these keys on the **Cognitive Services** resource on the **Keys** page.|
-|Query endpoint key|[QnA Maker portal](https://www.qnamaker.ai)|These keys are used to query the published knowledge base endpoint to get a response for a user question. You typically use this query endpoint in your chat bot or in the client application code that connects to the QnA Maker service. These keys are created when you publish your QnA Maker knowledge base.<br><br>Find these keys in the **Service settings** page. Find this page from the user's menu in the upper right of the page on the drop-down menu.|
-
-### Recommended settings for network isolation
-
-* Protect Cognitive Service Resource from public access by [configuring the virtual network](../../cognitive-services-virtual-networks.md?tabs=portal).
-* Protect App Service (QnA Runtime) from public access:
- * Allow traffic only from Cognitive Service IPs. These are already included in Service Tag "CognitiveServicesManagement". This is required for Authoring APIs (Create/Update KB) to invoke the app service and update Azure Search service accordingly.
- * Make sure you also allow other entry points like Bot service, QnA Maker portal (may be your corpnet) etc. for prediction "GenerateAnswer" API access.
- * Check out [more information about service tags.](../../../virtual-network/service-tags-overview.md)
- # [QnA Maker managed (preview release)](#tab/v2) The resource name for the QnA Maker managed (Preview) resource, such as `qna-westus-f0-b`, is also used to name the other resources.
@@ -291,27 +354,6 @@ With QnA Maker managed (Preview) you have a choice to setup your QnA Maker servi
The QnA Maker managed (Preview) resource provides access to the authoring and publishing APIs, hosts the ranking runtime as well as provides telemetry.
-## Region support
-
-In QnA Maker managed (Preview) both the management and the prediction services are co-located in the same region. Currently QnA Maker managed (Preview) is available in **South Central US, North Europe and Australia East**.
-
-### Keys in QnA Maker managed (Preview)
-
-Your QnA Maker managed (Preview) service deals with two kinds of keys: **authoring keys** and **Azure Cognitive Search keys** used to access the service in the customer's subscription.
-
-Use these keys when making requests to the service through APIs.
-
-![Key management managed preview](../media/qnamaker-how-to-key-management/qnamaker-v2-key-management.png)
-
-|Name|Location|Purpose|
-|--|--|--|
-|Authoring/Subscription key|[Azure portal](https://azure.microsoft.com/free/cognitive-services/)|These keys are used to access the [QnA Maker management service APIs](/rest/api/cognitiveservices/qnamaker4.0/knowledgebase). These APIs let you edit the questions and answers in your knowledge base, and publish your knowledge base. These keys are created when you create a new QnA Maker service.<br><br>Find these keys on the **Cognitive Services** resource on the **Keys** page.|
-|Azure Cognitive Search Admin Key|[Azure portal](../../../search/search-security-api-keys.md)|These keys are used to communicate with the Azure cognitive search service deployed in the user's Azure subscription. When you associate an Azure cognitive search with the QnA Maker managed (Preview) service, the admin key is automatically passed on to the QnA Maker service. <br><br>You can find these keys on the **Azure Cognitive Search** resource on the **Keys** page.|
-
-### Recommended settings for network isolation
-
-Protect Cognitive Service Resource from public access by [configuring the virtual network](../../cognitive-services-virtual-networks.md?tabs=portal).
- ## Next steps
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/How-To/add-sharepoint-datasources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/How-To/add-sharepoint-datasources.md
@@ -127,8 +127,8 @@ The Active Directory manager will get a pop-up window requesting permissions to
There is a workaround to add the latest SharePoint content via the API by using Azure Blob storage. The steps are:

1. Download the SharePoint files locally. The user calling the API needs to have access to SharePoint.
-1. Upload them on the Azure blob stoarge. This will create a secure shared access by [using SAS token.](../../../storage/common/storage-sas-overview.md#how-a-shared-access-signature-works)
-1. Pass the blob URL generated with the SAS token to the QnA Maker API. To allow the Question Answers extraction from the files, you need to add the suffix file type as '&ext=pdf' or '&ext=doc' at the end of the URL before passing it to QnA Maker API>
+1. Upload them on the Azure blob storage. This will create a secure shared access by [using SAS token.](../../../storage/common/storage-sas-overview.md#how-a-shared-access-signature-works)
+1. Pass the blob URL generated with the SAS token to the QnA Maker API. To allow the Question Answers extraction from the files, you need to add the suffix file type as '&ext=pdf' or '&ext=doc' at the end of the URL before passing it to QnA Maker API.
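The suffix step above can be sketched as follows (the SAS URL is a hypothetical placeholder; only the `&ext=` suffix rule comes from the text):

```python
def add_file_type_suffix(blob_sas_url: str, extension: str) -> str:
    """Append the file-type hint that lets QnA Maker extract Q&As from the blob."""
    return f"{blob_sas_url}&ext={extension}"

# Hypothetical blob URL with a SAS token already attached.
sas_url = "https://mystorage.blob.core.windows.net/docs/faq.pdf?sv=2020-08-04&sig=abc123"
url_for_qnamaker = add_file_type_suffix(sas_url, "pdf")
print(url_for_qnamaker)
```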
<!--
@@ -187,4 +187,4 @@ Use the **@microsoft.graph.downloadUrl** from the previous section as the `fileu
## Next steps

> [!div class="nextstepaction"]
-> [Collaborate on your knowledge base](../index.yml)
+> [Collaborate on your knowledge base](../index.yml)
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/How-To/improve-knowledge-base https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/How-To/improve-knowledge-base.md
@@ -45,29 +45,56 @@ In order to see suggested questions, you must [turn on active learning](../conce
<a name="#score-proximity-between-knowledge-base-questions"></a>
+## Active learning suggestions are saved in the exported knowledge base
+
+When your app has active learning enabled, and you export the app, the `SuggestedQuestions` column in the tsv file retains the active learning data.
+
+The `SuggestedQuestions` column is a JSON object of information of implicit, `autosuggested`, and explicit, `usersuggested` feedback. An example of this JSON object for a single user-submitted question of `help` is:
+
+```JSON
+[
+ {
+ "clusterHead": "help",
+ "totalAutoSuggestedCount": 1,
+ "totalUserSuggestedCount": 0,
+ "alternateQuestionList": [
+ {
+ "question": "help",
+ "autoSuggestedCount": 1,
+ "userSuggestedCount": 0
+ }
+ ]
+ }
+]
+```
+
+When you reimport this app, the active learning continues to collect information and recommend suggestions for your knowledge base.
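As an illustrative sketch (the tally helper is hypothetical; the JSON shape is the one shown above), a `SuggestedQuestions` cell from the exported tsv can be inspected like this:

```python
import json

# The sample SuggestedQuestions cell from the exported tsv, as shown above.
suggested_questions_cell = """
[
  {
    "clusterHead": "help",
    "totalAutoSuggestedCount": 1,
    "totalUserSuggestedCount": 0,
    "alternateQuestionList": [
      {"question": "help", "autoSuggestedCount": 1, "userSuggestedCount": 0}
    ]
  }
]
"""

clusters = json.loads(suggested_questions_cell)
# Tally implicit (auto-suggested) versus explicit (user-suggested) feedback.
implicit = sum(c["totalAutoSuggestedCount"] for c in clusters)
explicit = sum(c["totalUserSuggestedCount"] for c in clusters)
print(implicit, explicit)  # → 1 0
```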
++
### Architectural flow for using GenerateAnswer and Train APIs from a bot

A bot or other client application should use the following architectural flow to use active learning:

* Bot [gets the answer from the knowledge base](#use-the-top-property-in-the-generateanswer-request-to-get-several-matching-answers) with the GenerateAnswer API, using the `top` property to get a number of answers.
+
+ #### Use the top property in the GenerateAnswer request to get several matching answers
+
+ When submitting a question to QnA Maker for an answer, the `top` property of the JSON body sets the number of answers to return.
+
+ ```json
+ {
+ "question": "wi-fi",
+ "isTest": false,
+ "top": 3
+ }
+ ```
+
* Bot determines explicit feedback:
  * Using your own [custom business logic](#use-the-score-property-along-with-business-logic-to-get-list-of-answers-to-show-user), filter out low scores.
  * In the bot or client-application, display list of possible answers to the user and get user's selected answer.
* Bot [sends selected answer back to QnA Maker](#bot-framework-sample-code) with the [Train API](#train-api).
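The final step, sending the selected answer back, can be sketched as building a Train API body (the `feedbackRecords` field names follow the Train API; the user and QnA IDs here are hypothetical):

```python
import json

def build_feedback_payload(user_id: str, user_question: str, qna_id: int) -> str:
    """Serialize one piece of explicit user feedback for the Train API body."""
    record = {"userId": user_id, "userQuestion": user_question, "qnaId": qna_id}
    return json.dumps({"feedbackRecords": [record]})

payload = build_feedback_payload("user-1", "how do I connect to wi-fi", 42)
```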
-### Use the top property in the GenerateAnswer request to get several matching answers
-
-When submitting a question to QnA Maker for an answer, the `top` property of the JSON body sets the number of answers to return.
-
-```json
-{
- "question": "wi-fi",
- "isTest": false,
- "top": 3
-}
-```
-
### Use the score property along with business logic to get list of answers to show user

When the client application (such as a chat bot) receives the response, the top 3 questions are returned. Use the `score` property to analyze the proximity between scores. This proximity range is determined by your own business logic.
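A minimal sketch of such business logic, assuming a 10-point proximity window and a 50-point floor (both thresholds are illustrative, not from the service):

```python
def close_answers(answers, proximity=10.0, floor=50.0):
    """Keep answers whose score is above a floor and within `proximity`
    points of the top score -- candidates to show the user."""
    scored = [a for a in answers if a["score"] >= floor]
    if not scored:
        return []
    top = max(a["score"] for a in scored)
    return [a for a in scored if top - a["score"] <= proximity]

answers = [
    {"answer": "Turn on wi-fi in Settings.", "score": 82.0},
    {"answer": "Reset the router.", "score": 76.5},
    {"answer": "Contact support.", "score": 31.0},
]
print(len(close_answers(answers)))  # → 2
```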
@@ -306,33 +333,6 @@ async callTrain(stepContext){
} ```
-## Active learning is saved in the exported knowledge base
-
-When your app has active learning enabled, and you export the app, the `SuggestedQuestions` column in the tsv file retains the active learning data.
-
-The `SuggestedQuestions` column is a JSON object of information of implicit, `autosuggested`, and explicit, `usersuggested` feedback. An example of this JSON object for a single user-submitted question of `help` is:
-
-```JSON
-[
- {
- "clusterHead": "help",
- "totalAutoSuggestedCount": 1,
- "totalUserSuggestedCount": 0,
- "alternateQuestionList": [
- {
- "question": "help",
- "autoSuggestedCount": 1,
- "userSuggestedCount": 0
- }
- ]
- }
-]
-```
-
-When you reimport this app, the active learning continues to collect information and recommend suggestions for your knowledge base.
---

## Best practices

For best practices when using active learning, see [Best practices](../Concepts/best-practices.md#active-learning).
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/How-To/metadata-generateanswer-usage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/How-To/metadata-generateanswer-usage.md
@@ -286,3 +286,5 @@ The **Publish** page also provides information to [generate an answer](../Quicks
> [!div class="nextstepaction"]
> [Get analytics on your knowledge base](../how-to/get-analytics-knowledge-base.md)
+> [!div class="nextstepaction"]
+> [Confidence score](../Concepts/confidence-score.md)
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/How-To/set-up-qnamaker-service-azure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/How-To/set-up-qnamaker-service-azure.md
@@ -57,117 +57,6 @@ This procedure creates the Azure resources needed to manage the knowledge base c
The resource with the _Cognitive Services_ type has your _subscription_ keys.
-### Upgrade QnA Maker SKU
-
-When you want to have more questions and answers in your knowledge base, beyond your current tier, upgrade your QnA Maker service pricing tier.
-
-To upgrade the QnA Maker management SKU:
-
-1. Go to your QnA Maker resource in the Azure portal, and select **Pricing tier**.
-
- ![QnA Maker resource](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-resource.png)
-
-1. Choose the appropriate SKU and press **Select**.
-
- ![QnA Maker pricing](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-pricing-page.png)
-
-### Upgrade App Service
-
- When your knowledge base needs to serve more requests from your client app, upgrade your App Service pricing tier.
-
-You can [scale up](../../../app-service/manage-scale-up.md) or scale out App Service.
-
-Go to the App Service resource in the Azure portal, and select the **Scale up** or **Scale out** option as required.
-
-![QnA Maker App Service scale](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-appservice-scale.png)
-
-### Get the latest runtime updates
-
-The QnAMaker runtime is part of the Azure App Service instance that's deployed when you [create a QnAMaker service](./set-up-qnamaker-service-azure.md) in the Azure portal. Updates are made periodically to the runtime. The QnA Maker App Service instance is in auto-update mode after the April 2019 site extension release (version 5+). This update is designed to take care of ZERO downtime during upgrades.
-
-You can check your current version at https://www.qnamaker.ai/UserSettings. If your version is older than version 5.x, you must restart App Service to apply the latest updates:
-
-1. Go to your QnAMaker service (resource group) in the [Azure portal](https://portal.azure.com).
-
- > [!div class="mx-imgBorder"]
- > ![QnAMaker Azure resource group](../media/qnamaker-how-to-troubleshoot/qnamaker-azure-resourcegroup.png)
-
-1. Select the App Service instance and open the **Overview** section.
-
- > [!div class="mx-imgBorder"]
- > ![QnAMaker App Service instance](../media/qnamaker-how-to-troubleshoot/qnamaker-azure-appservice.png)
--
-1. Restart App Service. The update process should finish in a couple of seconds. Any dependent applications or bots that use this QnAMaker service will be unavailable to end users during this restart period.
-
- ![Restart of the QnAMaker App Service instance](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-appservice-restart.png)
-
-### Configure App service idle setting to avoid timeout
-
-The app service, which serves the QnA Maker prediction runtime for a published knowledge base, has an idle timeout configuration, which defaults to automatically time out if the service is idle. For QnA Maker, this means your prediction runtime generateAnswer API occasionally times out after periods of no traffic.
-
-In order to keep the prediction endpoint app loaded even when there is no traffic, set the idle to always on.
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Search for and select your QnA Maker resource's app service. It will have the same name as the QnA Maker resource but it will have a different **type** of App Service.
-1. Find **Settings** then select **Configuration**.
-1. On the Configuration pane, select **General settings**, then find **Always on**, and select **On** as the value.
-
- > [!div class="mx-imgBorder"]
- > ![On the Configuration pane, select **General settings**, then find **Always on**, and select **On** as the value.](../media/qnamaker-how-to-upgrade-qnamaker/configure-app-service-idle-timeout.png)
-
-1. Select **Save** to save the configuration.
-1. You are asked if you want to restart the app to use the new setting. Select **Continue**.
-
-Learn more about how to configure the App Service [General settings](../../../app-service/configure-common.md#configure-general-settings).
-
-### Configure App Service Environment to host QnA Maker App Service
-The App Service Environment(ASE) can be used to host QnA Maker App service. Please follow the steps below:
-
-1. Create an App Service Environment and mark it as "external". Please follow the [tutorial](../../../app-service/environment/create-external-ase.md) for instructions.
-2. Create an App service inside the App Service Environment.
- * Check the configuration for the App service and add 'PrimaryEndpointKey' as an application setting. The value for 'PrimaryEndpointKey' should be set to "\<app-name\>-PrimaryEndpointKey". The App Name is defined in the App service URL. For instance, if the App service URL is "mywebsite.myase.p.azurewebsite.net", then the app-name is "mywebsite". In this case, the value for 'PrimaryEndpointKey' should be set to "mywebsite-PrimaryEndpointKey".
- * Create an Azure search service.
- * Ensure Azure Search and App Settings are appropriately configured.
- Please follow this [tutorial](../reference-app-service.md?tabs=v1#app-service).
-3. Update the Network Security Group associated with the App Service Environment
- * Update pre-created Inbound Security Rules as per your requirements.
- * Add a new Inbound Security Rule with source as 'Service Tag' and source service tag as 'CognitiveServicesManagement'.
-4. Create a QnA Maker cognitive service instance (Microsoft.CognitiveServices/accounts) using Azure Resource Manager, where QnA Maker endpoint should be set to the App Service Endpoint created above (https:// mywebsite.myase.p.azurewebsite.net).
-
-### Network isolation for App Service
-
-QnA Maker Cognitive Service uses the service tag: `CognitiveServicesManagement`. Please follow these steps to add the IP Address ranges to an allowlist:
-
-* Download [IP Ranges for all service tags](https://www.microsoft.com/download/details.aspx?id=56519).
-* Select the IPs of "CognitiveServicesManagement".
-* Navigate to the networking section of your App Service resource, and click on "Configure Access Restriction" option to add the IPs to an allowlist.
-
-We also have an automated script to do the same for your App Service. You can find the [PowerShell script to configure an allowlist](https://github.com/pchoudhari/QnAMakerBackupRestore/blob/master/AddRestrictedIPAzureAppService.ps1) on GitHub. You need to input subscription id, resource group and actual App Service name as script parameters. Running the script will automatically add the IPs to App Service allowlist.
-
-### Business continuity with traffic manager
-
-The primary objective of the business continuity plan is to create a resilient knowledge base endpoint, which would ensure no down time for the Bot or the application consuming it.
-
-> [!div class="mx-imgBorder"]
-> ![QnA Maker bcp plan](../media/qnamaker-how-to-bcp-plan/qnamaker-bcp-plan.png)
-
-The high-level idea as represented above is as follows:
-
-1. Set up two parallel [QnA Maker services](set-up-qnamaker-service-azure.md) in [Azure paired regions](../../../best-practices-availability-paired-regions.md).
-
-1. [Backup](../../../app-service/manage-backup.md) your primary QnA Maker App service and [restore](../../../app-service/web-sites-restore.md) it in the secondary setup. This will ensure that both setups work with the same hostname and keys.
-
-1. Keep the primary and secondary Azure search indexes in sync. Use the GitHub sample [here](https://github.com/pchoudhari/QnAMakerBackupRestore) to see how to backup-restore Azure indexes.
-
-1. Back up the Application Insights using [continuous export](../../../azure-monitor/app/export-telemetry.md).
-
-1. Once the primary and secondary stacks have been set up, use [traffic manager](../../../traffic-manager/traffic-manager-overview.md) to configure the two endpoints and set up a routing method.
-
-1. You would need to create a Transport Layer Security (TLS), previously known as Secure Sockets Layer (SSL), certificate for your traffic manager endpoint. [Bind the TLS/SSL certificate](../../../app-service/configure-ssl-bindings.md) in your App services.
-
-1. Finally, use the traffic manager endpoint in your Bot or App.
-
# [QnA Maker managed (preview release)](#tab/v2)

This procedure creates the Azure resources needed to manage the knowledge base content. After you complete these steps, you'll find the *subscription* keys on the **Keys** page for the resource in the Azure portal.
@@ -197,62 +86,99 @@ This procedure creates the Azure resources needed to manage the knowledge base c
![Resource created a new QnA Maker managed (Preview) service](../media/qnamaker-how-to-setup-service/resources-created-v2.png)

The resource with the _Cognitive Services_ type has your _subscription_ keys.
-
+
-## Find authoring keys in the Azure portal
+## Recommended settings for network isolation
# [QnA Maker GA (stable release)](#tab/v1)
-You can view and reset your authoring keys from the Azure portal, where you created the QnA Maker resource. These keys may be referred to as subscription keys.
+1. Protect Cognitive Service Resource from public access by [configuring the virtual network](../../cognitive-services-virtual-networks.md?tabs=portal).
+2. Protect App Service (QnA Runtime) from public access.
-1. Go to the QnA Maker resource in the Azure portal and select the resource that has the _Cognitive Services_ type:
+ ##### Add IPs to App Service allowlist
- ![QnA Maker resource list](../media/qnamaker-how-to-key-management/qnamaker-resource-list.png)
+ * Allow traffic only from Cognitive Service IPs. These are already included in Service Tag `CognitiveServicesManagement`. This is required for Authoring APIs (Create/Update KB) to invoke the app service and update Azure Search service accordingly. Check out [more information about service tags.](../../../virtual-network/service-tags-overview.md)
+ * Make sure you also allow other entry points, such as the Bot service and the QnA Maker portal (which may be on your corpnet), for prediction "GenerateAnswer" API access.
+ * Please follow these steps to add the IP Address ranges to an allowlist:
-2. Go to **Keys**:
+ * Download [IP Ranges for all service tags](https://www.microsoft.com/download/details.aspx?id=56519).
+ * Select the IPs of "CognitiveServicesManagement".
+ * Navigate to the networking section of your App Service resource, and click on "Configure Access Restriction" option to add the IPs to an allowlist.
- ![Subscription key](../media/qnamaker-how-to-key-management/subscription-key.PNG)
+ We also have an automated script to do the same for your App Service. You can find the [PowerShell script to configure an allowlist](https://github.com/pchoudhari/QnAMakerBackupRestore/blob/master/AddRestrictedIPAzureAppService.ps1) on GitHub. You need to input subscription id, resource group and actual App Service name as script parameters. Running the script will automatically add the IPs to App Service allowlist.
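The manual selection step can be sketched as follows (the structure mirrors the downloadable service-tags JSON file; the sample ranges here are made up):

```python
# Sample shaped like the downloadable service-tags JSON file (values are made up).
service_tags = {
    "values": [
        {"name": "CognitiveServicesManagement",
         "properties": {"addressPrefixes": ["13.66.140.0/24", "20.38.80.0/25"]}},
        {"name": "Storage",
         "properties": {"addressPrefixes": ["52.239.148.0/23"]}},
    ]
}

def prefixes_for_tag(tags: dict, tag_name: str) -> list:
    """Pull the IP ranges for one service tag, ready for an App Service allowlist."""
    return [prefix
            for value in tags["values"] if value["name"] == tag_name
            for prefix in value["properties"]["addressPrefixes"]]

print(prefixes_for_tag(service_tags, "CognitiveServicesManagement"))
```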
-### Find query endpoint keys in the QnA Maker portal
+ ##### Configure App Service Environment to host QnA Maker App Service
+ The App Service Environment (ASE) can be used to host the QnA Maker App Service. Please follow the steps below:
-The endpoint is in the same region as the resource because the endpoint keys are used to make a call to the knowledge base.
+ 1. Create an App Service Environment and mark it as "external". Please follow the [tutorial](../../../app-service/environment/create-external-ase.md) for instructions.
+ 2. Create an App service inside the App Service Environment.
+ * Check the configuration for the App service and add 'PrimaryEndpointKey' as an application setting. The value for 'PrimaryEndpointKey' should be set to "\<app-name\>-PrimaryEndpointKey". The App Name is defined in the App service URL. For instance, if the App service URL is "mywebsite.myase.p.azurewebsite.net", then the app-name is "mywebsite". In this case, the value for 'PrimaryEndpointKey' should be set to "mywebsite-PrimaryEndpointKey".
+ * Create an Azure search service.
+ * Ensure Azure Search and App Settings are appropriately configured.
+ Please follow this [tutorial](../reference-app-service.md?tabs=v1#app-service).
+ 3. Update the Network Security Group associated with the App Service Environment
+ * Update pre-created Inbound Security Rules as per your requirements.
+ * Add a new Inbound Security Rule with source as 'Service Tag' and source service tag as 'CognitiveServicesManagement'.
+ 4. Create a QnA Maker cognitive service instance (Microsoft.CognitiveServices/accounts) using Azure Resource Manager, where the QnA Maker endpoint should be set to the App Service Endpoint created above (https://mywebsite.myase.p.azurewebsite.net).
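The `PrimaryEndpointKey` naming rule from step 2 can be sketched as (the helper is hypothetical; the URL is the example from the text):

```python
def primary_endpoint_key_value(app_service_url: str) -> str:
    """Derive the 'PrimaryEndpointKey' app setting value: the app name is
    the first label of the App Service hostname."""
    host = app_service_url.replace("https://", "").replace("http://", "")
    app_name = host.split(".")[0]
    return f"{app_name}-PrimaryEndpointKey"

print(primary_endpoint_key_value("https://mywebsite.myase.p.azurewebsite.net"))
# → mywebsite-PrimaryEndpointKey
```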
+
+3. Configure Cognitive Search as a private endpoint inside a VNET.
-Endpoint keys can be managed from the [QnA Maker portal](https://qnamaker.ai).
+ When a Search instance is created during the creation of a QnA Maker resource, you can force Cognitive Search to support a private endpoint configuration created entirely within a customer's VNet.
-1. Sign in to the [QnA Maker portal](https://qnamaker.ai), go to your profile, and then select **Service settings**:
+ All resources must be created in the same region to use a private endpoint.
- ![Endpoint key](../media/qnamaker-how-to-key-management/Endpoint-keys.png)
+ * QnA Maker resource
+ * new Cognitive Search resource
+ * new Virtual Network resource
-2. View or reset your keys:
+ Complete the following steps in the [Azure portal](https://portal.azure.com):
- > [!div class="mx-imgBorder"]
- > ![Endpoint key manager](../media/qnamaker-how-to-key-management/Endpoint-keys1.png)
+ 1. Create a [QnA Maker resource](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesQnAMaker).
+ 1. Create a new Cognitive Search resource with Endpoint connectivity (data) set to _Private_. Create the resource in the same region as the QnA Maker resource created in step 1. Learn more about [creating a Cognitive Search resource](../../../search/search-create-service-portal.md), then use this link to go directly to the [creation page of the resource](https://ms.portal.azure.com/#create/Microsoft.Search).
+ 1. Create a new [Virtual Network resource](https://ms.portal.azure.com/#create/Microsoft.VirtualNetwork-ARM).
+ 1. Configure the VNET on the App service resource created in step 1 of this procedure.
+ 1. Create a new DNS entry in the VNET for the new Cognitive Search resource created in step 2, pointing to the Cognitive Search IP address.
+ 1. [Associate the App service to the new Cognitive Search resource](#configure-qna-maker-to-use-different-cognitive-search-resource) created in step 2. Then, you can delete the original Cognitive Search resource created in step 1.
- >[!NOTE]
- >Refresh your keys if you think they've been compromised. This may require corresponding changes to your client application or bot code.
+ In the [QnA Maker portal](https://www.qnamaker.ai/), create your first knowledge base.
# [QnA Maker managed (preview release)](#tab/v2)
-You can view and reset your authoring keys from the Azure portal, where you created the QnA Maker managed (Preview) resource. These keys may be referred to as subscription keys.
+1. Protect Cognitive Service Resource from public access by [configuring the virtual network](../../cognitive-services-virtual-networks.md?tabs=portal).
+2. [Create Private endpoints](../reference-private-endpoint.md) to the Azure Search resource.
-1. Go to the QnA Maker managed (Preview) resource in the Azure portal and select the resource that has the *Cognitive Services* type:
+
- ![QnA Maker managed (Preview) resource list](../media/qnamaker-how-to-key-management/qnamaker-v2-resource-list.png)
+## Upgrade Azure resources
-2. Go to **Keys and Endpoint**:
+# [QnA Maker GA (stable release)](#tab/v1)
- ![QnA Maker managed (Preview) Subscription key](../media/qnamaker-how-to-key-management/subscription-key-v2.png)
+### Upgrade QnA Maker SKU
-### Update the resources
+When you want to have more questions and answers in your knowledge base, beyond your current tier, upgrade your QnA Maker service pricing tier.
-Learn how to upgrade the resources used by your knowledge base. QnA Maker managed (Preview) is **free** while in preview.
+To upgrade the QnA Maker management SKU:
-
+1. Go to your QnA Maker resource in the Azure portal, and select **Pricing tier**.
-## Upgrade the Azure Cognitive Search service
+ ![QnA Maker resource](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-resource.png)
-# [QnA Maker GA (stable release)](#tab/v1)
+1. Choose the appropriate SKU and press **Select**.
+
+ ![QnA Maker pricing](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-pricing-page.png)
+
+### Upgrade App Service
+
+When your knowledge base needs to serve more requests from your client app, upgrade your App Service pricing tier.
+
+You can [scale up](../../../app-service/manage-scale-up.md) or scale out App Service.
+
+Go to the App Service resource in the Azure portal, and select the **Scale up** or **Scale out** option as required.
+
+![QnA Maker App Service scale](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-appservice-scale.png)
+
+### Upgrade the Azure Cognitive Search service
If you plan to have many knowledge bases, upgrade your Azure Cognitive Search service pricing tier.
@@ -279,10 +205,40 @@ Currently, you can't perform an in-place upgrade of the Azure search SKU. Howeve
1. Restart the App Service instance.

   ![Restart of the QnA Maker App Service instance](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-appservice-restart.png)
+
+### Inactivity policy for free Search resources
-### Cognitive Search consideration
+If you are not using a QnA Maker resource, you should remove all of the associated resources. If you don't remove unused resources and you created a free Search resource, your knowledge base will stop working.
+
+Free Search resources are deleted after 90 days without receiving an API call.
+
+# [QnA Maker managed (preview release)](#tab/v2)
-Cognitive Search, as a separate resource, has some different configurations you should be aware of.
+### Upgrade the Azure Cognitive Search service
+
+If you plan to have many knowledge bases, upgrade your Azure Cognitive Search service pricing tier.
+
+Currently, you can't perform an in-place upgrade of the Azure search SKU. However, you can create a new Azure search resource with the desired SKU, restore the data to the new resource, and then link it to the QnA Maker stack. To do this, follow these steps:
+
+1. Create a new Azure search resource in the Azure portal, and select the desired SKU.
+
+ ![QnA Maker Azure search resource](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-azuresearch-new.png)
+
+1. Restore the indexes from your original Azure search resource to the new one. See the [backup restore sample code](https://github.com/pchoudhari/QnAMakerBackupRestore).
+
+1. To link the new Azure search resource to the QnA Maker managed (Preview) service, see the following topic.
+
+### Inactivity policy for free Search resources
+
+If you are not using a QnA Maker resource, you should remove all of the associated resources. If you don't remove unused resources and you created a free Search resource, your knowledge base will stop working.
+
+Free Search resources are deleted after 90 days without receiving an API call.
+++
+## Configure Azure resources
+
+# [QnA Maker GA (stable release)](#tab/v1)
### Configure QnA Maker to use different Cognitive Search resource
@@ -313,47 +269,70 @@ If you create a QnA service through Azure Resource Manager templates, you can cr
Learn more about how to configure the App Service [Application settings](../../../app-service/configure-common.md#configure-app-settings).
-### Configuring Cognitive Search as a private endpoint inside a VNET
+### Get the latest runtime updates
-When a Search instance is created during the creation of a QnA Maker resource, you can force Cognitive Search to support a private endpoint configuration created entirely within a customer's VNet.
+The QnAMaker runtime is part of the Azure App Service instance that's deployed when you [create a QnAMaker service](./set-up-qnamaker-service-azure.md) in the Azure portal. Updates are made periodically to the runtime. The QnA Maker App Service instance is in auto-update mode after the April 2019 site extension release (version 5+). This update is designed to ensure zero downtime during upgrades.
-All resources must be created in the same region to use a private endpoint.
+You can check your current version at https://www.qnamaker.ai/UserSettings. If your version is older than version 5.x, you must restart App Service to apply the latest updates:
-* QnA Maker resource
-* new Cognitive Search resource
-* new Virtual Network resource
+1. Go to your QnAMaker service (resource group) in the [Azure portal](https://portal.azure.com).
-Complete the following steps in the [Azure portal](https://portal.azure.com):
+ > [!div class="mx-imgBorder"]
+ > ![QnAMaker Azure resource group](../media/qnamaker-how-to-troubleshoot/qnamaker-azure-resourcegroup.png)
-1. Create a [QnA Maker resource](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesQnAMaker).
-1. Create a new Cognitive Search resource with Endpoint connectivity (data) set to _Private_. Create the resource in the same region as the QnA Maker resource created in step 1. Learn more about [creating a Cognitive Search resource](../../../search/search-create-service-portal.md), then use this link to go directly to the [creation page of the resource](https://ms.portal.azure.com/#create/Microsoft.Search).
-1. Create a new [Virtual Network resource](https://ms.portal.azure.com/#create/Microsoft.VirtualNetwork-ARM).
-1. Configure the VNET on the App service resource created in step 1 of this procedure.
- 1. Create a new DNS entry in the VNET for new Cognitive Search resource created in step 2. to the Cognitive Search IP address.
-1. [Associate the App service to the new Cognitive Search resource](#configure-qna-maker-to-use-different-cognitive-search-resource) created in step 2. Then, you can delete the original Cognitive Search resource created in step 1.
+1. Select the App Service instance and open the **Overview** section.
-In the [QnA Maker portal](https://www.qnamaker.ai/), create your first knowledge base.
+ > [!div class="mx-imgBorder"]
+ > ![QnAMaker App Service instance](../media/qnamaker-how-to-troubleshoot/qnamaker-azure-appservice.png)
-### Inactivity policy for free Search resources
+1. Restart App Service. The update process should finish in a couple of seconds. Any dependent applications or bots that use this QnAMaker service will be unavailable to end users during this restart period.
-If you are not using a QnA maker resource, you should remove all the resources. If you don't remove unused resources, your Knowledge base will stop working if you created a free Search resource.
+ ![Restart of the QnAMaker App Service instance](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-appservice-restart.png)
-Free Search resources are deleted after 90 days without receiving an API call.
+### Configure App service idle setting to avoid timeout
-# [QnA Maker managed (preview release)](#tab/v2)
+The app service that serves the QnA Maker prediction runtime for a published knowledge base has an idle timeout configuration, which by default times the service out when it's idle. For QnA Maker, this means your prediction runtime's generateAnswer API occasionally times out after periods of no traffic.
-If you plan to have many knowledge bases, upgrade your Azure Cognitive Search service pricing tier.
+To keep the prediction endpoint app loaded even when there is no traffic, set the idle configuration to Always on.
-Currently, you can't perform an in-place upgrade of the Azure search SKU. However, you can create a new Azure search resource with the desired SKU, restore the data to the new resource, and then link it to the QnA Maker stack. To do this, follow these steps:
+1. Sign in to the [Azure portal](https://portal.azure.com).
+1. Search for and select your QnA Maker resource's app service. It will have the same name as the QnA Maker resource but it will have a different **type** of App Service.
+1. Find **Settings** then select **Configuration**.
+1. On the Configuration pane, select **General settings**, then find **Always on**, and select **On** as the value.
-1. Create a new Azure search resource in the Azure portal, and select the desired SKU.
+ > [!div class="mx-imgBorder"]
+ > ![On the Configuration pane, select **General settings**, then find **Always on**, and select **On** as the value.](../media/qnamaker-how-to-upgrade-qnamaker/configure-app-service-idle-timeout.png)
- ![QnA Maker Azure search resource](../media/qnamaker-how-to-upgrade-qnamaker/qnamaker-azuresearch-new.png)
+1. Select **Save** to save the configuration.
+1. You are asked if you want to restart the app to use the new setting. Select **Continue**.
-1. Restore the indexes from your original Azure search resource to the new one. See the [backup restore sample code](https://github.com/pchoudhari/QnAMakerBackupRestore).
+Learn more about how to configure the App Service [General settings](../../../app-service/configure-common.md#configure-general-settings).
-1. To link the new Azure search resource to the QnA Maker managed (Preview) service, see the below topic.
+### Business continuity with traffic manager
+
+The primary objective of the business continuity plan is to create a resilient knowledge base endpoint, which ensures no downtime for the Bot or the application consuming it.
+
+> [!div class="mx-imgBorder"]
+> ![QnA Maker bcp plan](../media/qnamaker-how-to-bcp-plan/qnamaker-bcp-plan.png)
+
+The high-level idea as represented above is as follows:
+
+1. Set up two parallel [QnA Maker services](set-up-qnamaker-service-azure.md) in [Azure paired regions](../../../best-practices-availability-paired-regions.md).
+
+1. [Backup](../../../app-service/manage-backup.md) your primary QnA Maker App service and [restore](../../../app-service/web-sites-restore.md) it in the secondary setup. This will ensure that both setups work with the same hostname and keys.
+
+1. Keep the primary and secondary Azure search indexes in sync. Use the [GitHub sample](https://github.com/pchoudhari/QnAMakerBackupRestore) to see how to back up and restore Azure indexes.
+
+1. Back up the Application Insights using [continuous export](../../../azure-monitor/app/export-telemetry.md).
+
+1. Once the primary and secondary stacks have been set up, use [traffic manager](../../../traffic-manager/traffic-manager-overview.md) to configure the two endpoints and set up a routing method.
+
+1. Create a Transport Layer Security (TLS), previously known as Secure Sockets Layer (SSL), certificate for your traffic manager endpoint. [Bind the TLS/SSL certificate](../../../app-service/configure-ssl-bindings.md) in your App services.
+
+1. Finally, use the traffic manager endpoint in your Bot or App.
+
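Traffic manager performs the primary/secondary routing at the DNS level, so no client changes are required. Purely as an illustration of the failover behavior the plan is designed to provide, here is a minimal Python sketch; the endpoint URLs and the `query_endpoint` callable are hypothetical placeholders, not part of any QnA Maker SDK.

```python
# Hypothetical primary and secondary stack endpoints (placeholders).
PRIMARY = "https://qnamaker-primary.azurewebsites.net/qnamaker"
SECONDARY = "https://qnamaker-secondary.azurewebsites.net/qnamaker"

def query_with_failover(question, query_endpoint):
    # Try the primary stack first; fall back to the secondary on failure.
    # `query_endpoint(url, question)` is any callable that raises
    # ConnectionError when a stack is unreachable.
    last_error = None
    for endpoint in (PRIMARY, SECONDARY):
        try:
            return endpoint, query_endpoint(endpoint, question)
        except ConnectionError as err:
            last_error = err
    raise RuntimeError("Both QnA Maker stacks are unreachable") from last_error
```

In the real setup, the Bot or app calls the single traffic manager endpoint instead, and the routing method (priority, performance, and so on) decides which stack answers.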
+# [QnA Maker managed (preview release)](#tab/v2)
### Configure QnA Maker managed (Preview) service to use different Cognitive Search resource
@@ -369,11 +348,6 @@ If you create a QnA service managed (Preview) and its dependencies (such as Sear
> [!NOTE]
> If you change the Azure Search service associated with QnA Maker, you will lose access to all the knowledge bases already present in it. Make sure you export the existing knowledge bases before you change the Azure Search service.
-### Inactivity policy for free Search resources
-
-If you are not using a QnA maker resource, you should remove all the resources. If you don't remove unused resources, your Knowledge base will stop working if you created a free Search resource.
-
-Free Search resources are deleted after 90 days without receiving an API call.
@@ -386,4 +360,4 @@ If you delete any of the Azure resources used for your QnA Maker knowledge bases
Learn more about the [App service](../../../app-service/index.yml) and [Search service](../../../search/index.yml). > [!div class="nextstepaction"]
-> [Learn how to author with others](../index.yml)
+> [Learn how to author with others](../index.yml)
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/How-To/use-active-learning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/How-To/use-active-learning.md
@@ -7,12 +7,118 @@
Last updated 03/18/2020
-# Use active learning to improve your knowledge base
+# Active learning
-[Active learning](../Concepts/active-learning-suggestions.md) allows you to improve the quality of your knowledge base by suggesting alternative questions. User-submissions are taken into consideration and show up as suggestions in the alternate questions list. You have the flexibility to either add those suggestions as alternate questions or reject them.
+The _Active learning suggestions_ feature allows you to improve the quality of your knowledge base by suggesting alternative questions for your question and answer pairs, based on user submissions. You review those suggestions, either adding them to existing questions or rejecting them.
Your knowledge base doesn't change automatically. In order for any change to take effect, you must accept the suggestions. These suggestions add questions but don't change or remove existing questions.
+## What is active learning?
+
+QnA Maker learns new question variations with implicit and explicit feedback.
+
+* [Implicit feedback](#how-qna-makers-implicit-feedback-works) - The ranker understands when a user question has multiple answers with scores that are very close, and considers this as feedback. You don't need to do anything for this to happen.
+* [Explicit feedback](#how-you-give-explicit-feedback-with-the-train-api) - When multiple answers with little variation in scores are returned from the knowledge base, the client application asks the user which question is the correct one. The user's explicit feedback is sent to QnA Maker with the [Train API](../How-To/improve-knowledge-base.md#train-api).
+
+Both methods provide the ranker with similar queries that are clustered.
+
+## How active learning works
+
+Active learning is triggered based on the scores of the top few answers returned by QnA Maker. If the score differences between the QnA pairs that match the query lie within a small range, the query is considered a possible suggestion (as an alternate question) for each of those QnA pairs. Once you accept the suggested question for a specific QnA pair, it is rejected for the other pairs. Remember to save and train after accepting suggestions.
+
+Active learning gives the best possible suggestions when the endpoints receive a reasonable quantity and variety of usage queries. Every 30 minutes, when five or more similar queries are clustered, QnA Maker suggests the user-based questions to the knowledge base designer to accept or reject. All the suggestions are clustered by similarity, and the top suggestions for alternate questions are displayed based on how frequently end users submitted the particular queries.
+
+Once questions are suggested in the QnA Maker portal, you need to review and accept or reject those suggestions. There isn't an API to manage suggestions.
+
+## How QnA Maker's implicit feedback works
+
+QnA Maker's implicit feedback uses an algorithm to determine score proximity, then makes active learning suggestions. The algorithm to determine proximity is not a simple calculation. The ranges in the following example are not fixed; use them only as a guide to understand the impact of the algorithm.
+
+When a question's score is highly confident, such as 80%, the range of scores considered for active learning is wide, approximately within 10%. As the confidence score decreases, such as 40%, the range of scores narrows as well, approximately to within 4%.
+
+In the following JSON response from a query to QnA Maker's generateAnswer, the scores for answers A1, A2, and A3 are close and would be considered as suggestions.
+
+```json
+{
+ "activeLearningEnabled": true,
+ "answers": [
+ {
+ "questions": [
+ "Q1"
+ ],
+ "answer": "A1",
+ "score": 80,
+ "id": 15,
+ "source": "Editorial",
+ "metadata": [
+ {
+ "name": "topic",
+ "value": "value"
+ }
+ ]
+ },
+ {
+ "questions": [
+ "Q2"
+ ],
+ "answer": "A2",
+ "score": 78,
+ "id": 16,
+ "source": "Editorial",
+ "metadata": [
+ {
+ "name": "topic",
+ "value": "value"
+ }
+ ]
+ },
+ {
+ "questions": [
+ "Q3"
+ ],
+ "answer": "A3",
+ "score": 75,
+ "id": 17,
+ "source": "Editorial",
+ "metadata": [
+ {
+ "name": "topic",
+ "value": "value"
+ }
+ ]
+ },
+ {
+ "questions": [
+ "Q4"
+ ],
+ "answer": "A4",
+ "score": 50,
+ "id": 18,
+ "source": "Editorial",
+ "metadata": [
+ {
+ "name": "topic",
+ "value": "value"
+ }
+ ]
+ }
+ ]
+}
+```
+
+QnA Maker won't know which answer is the best answer. Use the QnA Maker portal's list of suggestions to select the best answer and train again.
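As an illustration of the proximity logic described above, the following Python sketch filters the answers from the JSON response by a score band below the top score. The `suggestion_band` interpolation is a made-up stand-in for the ranker's real calculation, which, as noted, is not a simple formula.

```python
answers = [  # scores taken from the JSON response above
    {"questions": ["Q1"], "answer": "A1", "score": 80},
    {"questions": ["Q2"], "answer": "A2", "score": 78},
    {"questions": ["Q3"], "answer": "A3", "score": 75},
    {"questions": ["Q4"], "answer": "A4", "score": 50},
]

def suggestion_band(top_score):
    # Illustrative stand-in only: ~10 points wide at a top score of 80,
    # narrowing linearly to ~4 points at 40.
    return max(4.0, min(10.0, 4.0 + (top_score - 40) * 6.0 / 40.0))

def candidate_suggestions(answers):
    # Answers whose scores fall within the band below the top score are the
    # QnA pairs the user query could be suggested against.
    top = max(a["score"] for a in answers)
    band = suggestion_band(top)
    return [a for a in answers if top - a["score"] <= band]

close = candidate_suggestions(answers)
# A1 (80), A2 (78), and A3 (75) fall inside the band; A4 (50) does not.
```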
+
+## How you give explicit feedback with the Train API
+
+QnA Maker needs explicit feedback about which of the answers was the best answer. How the best answer is determined is up to you and can include:
+
+* User feedback, selecting one of the answers.
+* Business logic, such as determining an acceptable score range.
+* A combination of both user feedback and business logic.
+
+Use the [Train API](/rest/api/cognitiveservices/qnamaker4.0/runtime/train) to send the correct answer to QnA Maker, after the user selects it.
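A minimal sketch of the request body, assuming the `feedbackRecords` shape documented for the Train API; the user ID, question, and `qnaId` values below are placeholders for your own.

```python
import json

def build_train_payload(user_id, user_question, qna_id):
    # Body for POST {runtime-endpoint}/qnamaker/knowledgebases/{kb-id}/train,
    # following the documented feedbackRecords shape. All values are
    # placeholders for illustration.
    return {
        "feedbackRecords": [
            {"userId": user_id, "userQuestion": user_question, "qnaId": qna_id}
        ]
    }

body = json.dumps(build_train_payload("user-123", "How do I reset my password?", 15))
```

Each feedback record ties the question the user actually asked to the QnA pair they confirmed as correct, which is exactly the explicit signal the ranker needs.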
+
## Upgrade runtime version to use active learning

# [QnA Maker GA (stable release)](#tab/v1)
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/includes/quickstart-sdk-csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/includes/quickstart-sdk-csharp.md
@@ -146,7 +146,9 @@ In the application's `Main` method, add variables and code, shown in the followi
# [QnA Maker GA (stable release)](#tab/version-1)

> [!IMPORTANT]
-> Go to the Azure portal and find the key and endpoint for the QnA Maker resource you created in the prerequisites. They will be located on the resource's **key and endpoint** page, under **resource management**. We use subscription key and authoring key interchangably. For more details on authoring key, follow [Keys in QnA Maker](https://docs.microsoft.com/azure/cognitive-services/qnamaker/concepts/azure-resources?tabs=v1#keys-in-qna-maker).
+> Go to the Azure portal and find the key and endpoint for the QnA Maker resource you created in the prerequisites. They will be located on the resource's **key and endpoint** page, under **resource management**.
+
+We use subscription key and authoring key interchangeably. For more details on the authoring key, see [Keys in QnA Maker](https://docs.microsoft.com/azure/cognitive-services/qnamaker/concepts/azure-resources?tabs=v1#keys-in-qna-maker).
- Create environment variables named QNA_MAKER_SUBSCRIPTION_KEY, QNA_MAKER_ENDPOINT, and QNA_MAKER_RUNTIME_ENDPOINT to store these values.
- The value of QNA_MAKER_ENDPOINT has the format `https://YOUR-RESOURCE-NAME.cognitiveservices.azure.com`.
@@ -158,7 +160,9 @@ In the application's `Main` method, add variables and code, shown in the followi
# [QnA Maker managed (preview release)](#tab/version-2)

> [!IMPORTANT]
-> Go to the Azure portal and find the key and endpoint for the QnA Maker resource you created in the prerequisites. They will be located on the resource's **key and endpoint** page, under **resource management**. We use subscription key and authoring key interchangably. For more details on authoring key, follow [Keys in QnA Maker](https://docs.microsoft.com/azure/cognitive-services/qnamaker/concepts/azure-resources?tabs=v2#keys-in-qna-maker-managed-preview).
+> Go to the Azure portal and find the key and endpoint for the QnA Maker resource you created in the prerequisites. They will be located on the resource's **key and endpoint** page, under **resource management**.
+
+We use subscription key and authoring key interchangeably. For more details on the authoring key, see [Keys in QnA Maker](https://docs.microsoft.com/azure/cognitive-services/qnamaker/concepts/azure-resources?tabs=v2#keys-in-qna-maker).
- Create environment variables named QNA_MAKER_SUBSCRIPTION_KEY and QNA_MAKER_ENDPOINT to store these values.
- The value of QNA_MAKER_ENDPOINT has the format `https://YOUR-RESOURCE-NAME.cognitiveservices.azure.com`.
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/reference-precise-answering https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/reference-precise-answering.md
@@ -9,7 +9,7 @@ Last updated 11/09/2020
# Precise answering
-The precise answering feature allows you to get the precise short answer from the best candidate answer passage present in the knowledge-base for any user query. This feature uses a deep learning model which on runtime, which understands the intent of the user query and
+The precise answering feature, introduced in QnA Maker managed (Preview), allows you to get the precise short answer from the best candidate answer passage present in the knowledge base for any user query. This feature uses a deep learning model that, at runtime, understands the intent of the user query and
detects the precise short answer from the answer passage, if a short answer is present as a fact in the answer passage. This feature is on by default in the test pane, so that you can test the functionality specific to your scenario. This feature is extremely beneficial for both content developers as well as
@@ -30,7 +30,7 @@ The service also returns back the confidence score of the precise answer as an *
## Publishing a QnA Maker bot
-When you publish a bot, you get the precise answer enabled experience by default in your application, where you will see short answer along with the answer passage. User has the flexibility to choose other experiences by updating the template through th eBot app service.
+When you publish a bot, you get the precise answer enabled experience by default in your application, where you see the short answer along with the answer passage. Refer to the API reference for [Generate Answer](https://docs.microsoft.com/rest/api/cognitiveservices/qnamakerv5.0-preview.1/knowledgebase/generateanswer#answerspan) to see how to use the precise answer (called AnswerSpan) in the response. Users have the flexibility to choose other experiences by updating the template through the Bot app service.
## Language support
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/whats-new https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/whats-new.md
@@ -31,7 +31,7 @@ Learn what's new with QnA Maker.
### July 2020

* [Metadata: `OR` logical combination of multiple metadata pairs](how-to/metadata-generateanswer-usage.md#logical-or-using-strictfilterscompoundoperationtype-property)
-* [Steps](how-to/set-up-qnamaker-service-azure.md#configuring-cognitive-search-as-a-private-endpoint-inside-a-vnet) to configure Cognitive Search endpoints to be private, but still accessible to QnA Maker.
+* [Steps](how-to/set-up-qnamaker-service-azure.md#recommended-settings-for-network-isolation) to configure Cognitive Search endpoints to be private, but still accessible to QnA Maker.
* Free Cognitive Search resources are removed after [90 days of inactivity](how-to/set-up-qnamaker-service-azure.md#inactivity-policy-for-free-search-resources).

### June 2020
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/build-training-data-set https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/build-training-data-set.md
@@ -17,9 +17,9 @@
When you use the Form Recognizer custom model, you provide your own training data to the [Train Custom Model](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2/operations/TrainCustomModelAsync) operation, so that the model can train to your industry-specific forms. Follow this guide to learn how to collect and prepare data to train the model effectively.
-If you're training without manual labels, you can use five filled-in forms, or an empty form (you must include the word "empty" in the file name) plus two filled-in forms. Even if you have enough filled-in forms, adding an empty form to your training data set can improve the accuracy of the model.
+You need at least five filled-in forms of the same type.
-If you want to use manually labeled training data, you must start with at least five filled-in forms of the same type. You can still use unlabeled forms and an empty form in addition to the required data set.
+If you want to use manually labeled training data, you must start with at least five filled-in forms of the same type. You can still use unlabeled forms in addition to the required data set.
## Custom model input requirements
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/concept-business-cards https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/concept-business-cards.md
@@ -31,7 +31,7 @@ The prebuilt Business Card API extracts key fields from business cards and retur
|:--|:-|:-|:-|
| ContactNames | array of objects | Contact name extracted from business card | [{ "FirstName": "John", "LastName": "Doe" }] |
| FirstName | string | First (given) name of contact | "John" |
-| LastName | string | Last (family) name of contact | "Doe" |
+| LastName | string | Last (family) name of contact | "Doe" |
| CompanyNames | array of strings | Company name extracted from business card | ["Contoso"] |
| Departments | array of strings | Department or organization of contact | ["R&D"] |
| JobTitles | array of strings | Listed Job title of contact | ["Software Engineer"] |
@@ -41,7 +41,7 @@ The prebuilt Business Card API extracts key fields from business cards and retur
| MobilePhones | array of phone numbers | Mobile phone number extracted from business card | ["+19876543210"] |
| Faxes | array of phone numbers | Fax phone number extracted from business card | ["+19876543211"] |
| WorkPhones | array of phone numbers | Work phone number extracted from business card | ["+19876543231"] |
-| OtherPhones | array of phone numbers | Other phone number extracted from business card | ["+19876543233"] |
+| OtherPhones | array of phone numbers | Other phone number extracted from business card | ["+19876543233"] |
The Business Card API can also return all recognized text from the Business Card. This OCR output is included in the JSON response.
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/concept-invoices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/concept-invoices.md
@@ -104,7 +104,7 @@ The Invoice service will extract the text, tables and 26 invoice fields. Followi
## Next steps

- Try your own invoices and samples in the [Form Recognizer Sample UI](https://fott-preview.azurewebsites.net/).
-- Complete a [Form Recognizer quickstart](quickstarts/client-library.md) to get started writing an invoice processing app with Form Recognizer in the language of your choice.
+- Complete a [Form Recognizer quickstart](quickstarts/client-library.md) to get started writing an invoice processing app with Form Recognizer in the development language of your choice.
## See also
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/concept-layout https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/concept-layout.md
@@ -86,7 +86,7 @@ Layout also extracts selection marks from documents. Selection marks extracted i
## Next steps

- Try your own layout extraction using the [Form Recognizer Sample UI](https://fott-preview.azurewebsites.net/)
-- Complete a [Form Recognizer quickstart](quickstarts/client-library.md) to get started extracting layouts in the language of your choice.
+- Complete a [Form Recognizer quickstart](quickstarts/client-library.md) to get started extracting layouts in the development language of your choice.
## See also
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/concept-receipts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/concept-receipts.md
@@ -464,7 +464,7 @@ The Receipt API also powers the [AI Builder Receipt Processing feature](/ai-buil
## Next steps

-- Complete a [Form Recognizer quickstart](quickstarts/client-library.md) to get started writing a receipt processing app with Form Recognizer in the language of your choice.
+- Complete a [Form Recognizer quickstart](quickstarts/client-library.md) to get started writing a receipt processing app with Form Recognizer in the development language of your choice.
## See also
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/form-recognizer-container-howto https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/form-recognizer-container-howto.md
@@ -18,7 +18,7 @@
Azure Form Recognizer applies machine learning technology to identify and extract key-value pairs and tables from forms. It associates values and table entries with the key-value pairs and then outputs structured data that includes the relationships in the original file.
-To reduce complexity and easily integrate a custom Form Recognizer model into your workflow automation process or other application, you can call the model by using a simple REST API. Only five form documents (or one empty form and two filled-in forms) are needed, so you can get results quickly, accurately, and tailored to your specific content. No heavy manual intervention or extensive data science expertise is necessary. And it doesn't require data labeling or data annotation.
+To reduce complexity and easily integrate a custom Form Recognizer model into your workflow automation process or other application, you can call the model by using a simple REST API. Only five form documents are needed, so you can get results quickly, accurately, and tailored to your specific content. No heavy manual intervention or extensive data science expertise is necessary. And it doesn't require data labeling or data annotation.
| Function | Features |
|-|-|
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/includes/quickstarts/csharp-sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/includes/quickstarts/csharp-sdk.md
@@ -107,7 +107,7 @@ With Form Recognizer, you can create two different client types. The first, `For
`FormRecognizerClient` provides operations for:
+ - Recognizing form fields and content, using custom models trained to analyze your custom forms. These values are returned in a collection of `RecognizedForm` objects. See example [Analyze custom forms](#analyze-forms-with-a-custom-model).
- Recognizing form content, including tables, lines and words, without the need to train a model. Form content is returned in a collection of `FormPage` objects. See example [Analyze layout](#analyze-layout).
- Recognizing common fields from US receipts, using a pre-trained receipt model on the Form Recognizer service. These fields and meta-data are returned in a collection of `RecognizedForm` objects. See example [Analyze receipts](#analyze-receipts).
@@ -115,8 +115,8 @@ With Form Recognizer, you can create two different client types. The first, `For
`FormTrainingClient` provides operations for:

-- Training custom models to recognize all fields and values found in your custom forms. A `CustomFormModel` is returned indicating the form types the model will recognize, and the fields it will extract for each form type.
-- Training custom models to recognize specific fields and values you specify by labeling your custom forms. A `CustomFormModel` is returned indicating the fields the model will extract, as well as the estimated accuracy for each field.
+- Training custom models to analyze all fields and values found in your custom forms. A `CustomFormModel` is returned indicating the form types the model will analyze, and the fields it will extract for each form type.
+- Training custom models to analyze specific fields and values you specify by labeling your custom forms. A `CustomFormModel` is returned indicating the fields the model will extract, as well as the estimated accuracy for each field.
- Managing models created in your account. - Copying a custom model from one Form Recognizer resource to another.
@@ -186,9 +186,9 @@ You'll also need to add references to the URLs for your training and testing dat
## Analyze layout
-You can use Form Recognizer to recognize tables, lines, and words in documents, without needing to train a model. The returned value is a collection of **FormPage** objects: one for each page in the submitted document.
+You can use Form Recognizer to analyze tables, lines, and words in documents, without needing to train a model. The returned value is a collection of **FormPage** objects: one for each page in the submitted document. For more information about layout extraction, see the [Layout conceptual guide](../../concept-layout.md).
-To recognize the content of a file at a given URL, use the `StartRecognizeContentFromUri` method.
+To analyze the content of a file at a given URL, use the `StartRecognizeContentFromUri` method.
[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart.cs?name=snippet_getcontent_call)]
@@ -234,89 +234,6 @@ Table 0 has 2 rows and 6 columns.
 Cell (1, 5) contains text: 'PT'.
```
-## Analyze receipts
-
-This section demonstrates how to recognize and extract common fields from US receipts, using a pre-trained receipt model.
-
-To recognize receipts from a URL, use the `StartRecognizeReceiptsFromUri` method.
-
-[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart.cs?name=snippet_receipt_call)]
-
-> [!TIP]
-> You can also recognize local receipt images. See the [FormRecognizerClient](/dotnet/api/azure.ai.formrecognizer.formrecognizerclient) methods, such as **StartRecognizeReceipts**. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md) for scenarios involving local images.
-
-The returned value is a collection of `RecognizedReceipt` objects: one for each page in the submitted document. The following code processes the receipt at the given URI and prints the major fields and values to the console.
-
-[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart.cs?name=snippet_receipt_print)]
-
-### Output
-
-```console
-Form Page 1 has 18 lines.
- Line 0 has 1 word, and text: 'Contoso'.
- Line 1 has 1 word, and text: 'Address:'.
- Line 2 has 3 words, and text: 'Invoice For: Microsoft'.
- Line 3 has 4 words, and text: '1 Redmond way Suite'.
- Line 4 has 3 words, and text: '1020 Enterprise Way'.
- Line 5 has 3 words, and text: '6000 Redmond, WA'.
- Line 6 has 3 words, and text: 'Sunnayvale, CA 87659'.
- Line 7 has 1 word, and text: '99243'.
- Line 8 has 2 words, and text: 'Invoice Number'.
- Line 9 has 2 words, and text: 'Invoice Date'.
- Line 10 has 3 words, and text: 'Invoice Due Date'.
- Line 11 has 1 word, and text: 'Charges'.
- Line 12 has 2 words, and text: 'VAT ID'.
- Line 13 has 1 word, and text: '34278587'.
- Line 14 has 1 word, and text: '6/18/2017'.
- Line 15 has 1 word, and text: '6/24/2017'.
- Line 16 has 1 word, and text: '$56,651.49'.
- Line 17 has 1 word, and text: 'PT'.
-Table 0 has 2 rows and 6 columns.
- Cell (0, 0) contains text: 'Invoice Number'.
- Cell (0, 1) contains text: 'Invoice Date'.
- Cell (0, 2) contains text: 'Invoice Due Date'.
- Cell (0, 3) contains text: 'Charges'.
- Cell (0, 5) contains text: 'VAT ID'.
- Cell (1, 0) contains text: '34278587'.
- Cell (1, 1) contains text: '6/18/2017'.
- Cell (1, 2) contains text: '6/24/2017'.
- Cell (1, 3) contains text: '$56,651.49'.
- Cell (1, 5) contains text: 'PT'.
-Merchant Name: 'Contoso Contoso', with confidence 0.516
-Transaction Date: '6/10/2019 12:00:00 AM', with confidence 0.985
-Item:
- Name: '8GB RAM (Black)', with confidence 0.916
- Total Price: '999', with confidence 0.559
-Item:
- Name: 'SurfacePen', with confidence 0.858
- Total Price: '99.99', with confidence 0.386
-Total: '1203.39', with confidence '0.774'
-```
-
-## Analyze business cards
-
-#### [version 2.0](#tab/ga)
-
-> [!IMPORTANT]
-> This feature isn't available in the selected API version.
-
-#### [version 2.1 preview](#tab/preview)
--
-This section demonstrates how to recognize and extract common fields from English business cards, using a pre-trained model.
-
-To recognize business cards from a URL, use the `StartRecognizeBusinessCardsFromUriAsync` method.
-
-[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart-preview.cs?name=snippet_bc_call)]
-
-> [!TIP]
-> You can also recognize local receipt images. See the [FormRecognizerClient](/dotnet/api/azure.ai.formrecognizer.formrecognizerclient) methods, such as **StartRecognizeBusinessCards**. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md) for scenarios involving local images.
-
-The returned value is a collection of `RecognizedForm` objects: one for each card in the document. The following code processes the business card at the given URI and prints the major fields and values to the console.
-
-[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart-preview.cs?name=snippet_bc_print)]
-- ## Analyze invoices
@@ -327,14 +244,14 @@ The returned value is a collection of `RecognizedForm` objects: one for each car
#### [version 2.1 preview](#tab/preview)
-This section demonstrates how to recognize and extract common fields from sales invoices, using a pre-trained model.
+This section demonstrates how to analyze and extract common fields from sales invoices, using a pre-trained model. For more information about invoice analysis, see the [Invoice conceptual guide](../../concept-invoices.md).
-To recognize invoices from a URL, use the `StartRecognizeInvoicesFromUriAsync` method.
+To analyze invoices from a URL, use the `StartRecognizeInvoicesFromUriAsync` method.
[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart-preview.cs?name=snippet_invoice_call)] > [!TIP]
-> You can also recognize local invoice images. See the [FormRecognizerClient](/dotnet/api/azure.ai.formrecognizer.formrecognizerclient) methods, such as **StartRecognizeInvoices**. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md) for scenarios involving local images.
+> You can also analyze local invoice images. See the [FormRecognizerClient](/dotnet/api/azure.ai.formrecognizer.formrecognizerclient?view=azure-dotnet) methods, such as **StartRecognizeInvoices**. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md) for scenarios involving local images.
The returned value is a collection of `RecognizedForm` objects: one for each invoice in the submitted document. The following code processes the invoice at the given URI and prints the major fields and values to the console.
@@ -352,13 +269,13 @@ This section demonstrates how to train a model with your own data. A trained mod
### Train a model without labels
-Train custom models to recognize all the fields and values found in your custom forms without manually labeling the training documents. The following method trains a model on a given set of documents and prints the model's status to the console.
+Train custom models to analyze all the fields and values found in your custom forms without manually labeling the training documents. The following method trains a model on a given set of documents and prints the model's status to the console.
[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart.cs?name=snippet_train)]
-The returned `CustomFormModel` object contains information on the form types the model can recognize and the fields it can extract from each form type. The following code block prints this information to the console.
+The returned `CustomFormModel` object contains information on the form types the model can analyze and the fields it can extract from each form type. The following code block prints this information to the console.
[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart.cs?name=snippet_train_response)]
@@ -556,6 +473,90 @@ Field 'Azure.AI.FormRecognizer.Models.FieldValue:
... ```
+## Analyze receipts
+
+This section demonstrates how to analyze and extract common fields from US receipts, using a pre-trained receipt model. For more information about receipt analysis, see the [Receipts conceptual guide](../../concept-receipts.md).
+
+To analyze receipts from a URL, use the `StartRecognizeReceiptsFromUri` method.
+
+[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart.cs?name=snippet_receipt_call)]
+
+> [!TIP]
+> You can also analyze local receipt images. See the [FormRecognizerClient](/dotnet/api/azure.ai.formrecognizer.formrecognizerclient?view=azure-dotnet) methods, such as **StartRecognizeReceipts**. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md) for scenarios involving local images.
+
+The returned value is a collection of `RecognizedReceipt` objects: one for each page in the submitted document. The following code processes the receipt at the given URI and prints the major fields and values to the console.
+
+[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart.cs?name=snippet_receipt_print)]
+
+### Output
+
+```console
+Form Page 1 has 18 lines.
+ Line 0 has 1 word, and text: 'Contoso'.
+ Line 1 has 1 word, and text: 'Address:'.
+ Line 2 has 3 words, and text: 'Invoice For: Microsoft'.
+ Line 3 has 4 words, and text: '1 Redmond way Suite'.
+ Line 4 has 3 words, and text: '1020 Enterprise Way'.
+ Line 5 has 3 words, and text: '6000 Redmond, WA'.
+ Line 6 has 3 words, and text: 'Sunnayvale, CA 87659'.
+ Line 7 has 1 word, and text: '99243'.
+ Line 8 has 2 words, and text: 'Invoice Number'.
+ Line 9 has 2 words, and text: 'Invoice Date'.
+ Line 10 has 3 words, and text: 'Invoice Due Date'.
+ Line 11 has 1 word, and text: 'Charges'.
+ Line 12 has 2 words, and text: 'VAT ID'.
+ Line 13 has 1 word, and text: '34278587'.
+ Line 14 has 1 word, and text: '6/18/2017'.
+ Line 15 has 1 word, and text: '6/24/2017'.
+ Line 16 has 1 word, and text: '$56,651.49'.
+ Line 17 has 1 word, and text: 'PT'.
+Table 0 has 2 rows and 6 columns.
+ Cell (0, 0) contains text: 'Invoice Number'.
+ Cell (0, 1) contains text: 'Invoice Date'.
+ Cell (0, 2) contains text: 'Invoice Due Date'.
+ Cell (0, 3) contains text: 'Charges'.
+ Cell (0, 5) contains text: 'VAT ID'.
+ Cell (1, 0) contains text: '34278587'.
+ Cell (1, 1) contains text: '6/18/2017'.
+ Cell (1, 2) contains text: '6/24/2017'.
+ Cell (1, 3) contains text: '$56,651.49'.
+ Cell (1, 5) contains text: 'PT'.
+Merchant Name: 'Contoso Contoso', with confidence 0.516
+Transaction Date: '6/10/2019 12:00:00 AM', with confidence 0.985
+Item:
+ Name: '8GB RAM (Black)', with confidence 0.916
+ Total Price: '999', with confidence 0.559
+Item:
+ Name: 'SurfacePen', with confidence 0.858
+ Total Price: '99.99', with confidence 0.386
+Total: '1203.39', with confidence '0.774'
+```
+
+## Analyze business cards
+
+#### [version 2.0](#tab/ga)
+
+> [!IMPORTANT]
+> This feature isn't available in the selected API version.
+
+#### [version 2.1 preview](#tab/preview)
++
+This section demonstrates how to analyze and extract common fields from English business cards, using a pre-trained model. For more information about business card analysis, see the [Business cards conceptual guide](../../concept-business-cards.md).
+
+To analyze business cards from a URL, use the `StartRecognizeBusinessCardsFromUriAsync` method.
+
+[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart-preview.cs?name=snippet_bc_call)]
+
+> [!TIP]
+> You can also analyze local business card images. See the [FormRecognizerClient](/dotnet/api/azure.ai.formrecognizer.formrecognizerclient?view=azure-dotnet) methods, such as **StartRecognizeBusinessCards**. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md) for scenarios involving local images.
+
+The returned value is a collection of `RecognizedForm` objects: one for each card in the document. The following code processes the business card at the given URI and prints the major fields and values to the console.
+
+[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/FormRecognizer/FormRecognizerQuickstart-preview.cs?name=snippet_bc_print)]
+++

## Manage custom models

This section demonstrates how to manage the custom models stored in your account. You'll do multiple operations within the following method:
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/includes/quickstarts/java-sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/includes/quickstarts/java-sdk.md
@@ -150,7 +150,7 @@ With Form Recognizer, you can create two different client types. The first, `For
`FormRecognizerClient` provides operations for:

-- Recognizing form fields and content, using custom models trained to recognize your custom forms. These values are returned in a collection of `RecognizedForm` objects. See example [Analyze custom forms](#analyze-forms-with-a-custom-model).
+- Recognizing form fields and content, using custom models trained to analyze your custom forms. These values are returned in a collection of `RecognizedForm` objects. See example [Analyze custom forms](#analyze-forms-with-a-custom-model).
- Recognizing form content, including tables, lines and words, without the need to train a model. Form content is returned in a collection of `FormPage` objects. See example [Analyze layout](#analyze-layout).
- Recognizing common fields from US receipts, using a pre-trained receipt model on the Form Recognizer service. These fields and meta-data are returned in a collection of `RecognizedForm` objects. See example [Analyze receipts](#analyze-receipts).
@@ -158,8 +158,8 @@ With Form Recognizer, you can create two different client types. The first, `For
`FormTrainingClient` provides operations for:

-- Training custom models to recognize all fields and values found in your custom forms. A `CustomFormModel` is returned indicating the form types the model will recognize, and the fields it will extract for each form type.
-- Training custom models to recognize specific fields and values you specify by labeling your custom forms. A `CustomFormModel` is returned indicating the fields the model will extract, as well as the estimated accuracy for each field.
+- Training custom models to analyze all fields and values found in your custom forms. A `CustomFormModel` is returned indicating the form types the model will analyze, and the fields it will extract for each form type.
+- Training custom models to analyze specific fields and values you specify by labeling your custom forms. A `CustomFormModel` is returned indicating the fields the model will extract, as well as the estimated accuracy for each field.
- Managing models created in your account.
- Copying a custom model from one Form Recognizer resource to another.
@@ -197,9 +197,9 @@ At the top of your **main** method, add the following code. Here, you'll authent
## Analyze layout
-You can use Form Recognizer to recognize tables, lines, and words in documents, without needing to train a model.
+You can use Form Recognizer to analyze tables, lines, and words in documents, without needing to train a model. For more information about layout extraction, see the [Layout conceptual guide](../../concept-layout.md).
-To recognize the content of a file at a given URL, use the **beginRecognizeContentFromUrl** method.
+To analyze the content of a file at a given URL, use the **beginRecognizeContentFromUrl** method.
[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_getcontent_call)]
@@ -228,65 +228,6 @@ Cell has text $89,024.34.
Cell has text ET.
```
-## Analyze receipts
-
-This section demonstrates how to recognize and extract common fields from US receipts, using a pre-trained receipt model.
-
-To recognize receipts from a URI, use the **beginRecognizeReceiptsFromUrl** method.
-
-[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_receipts_call)]
-
-> [!TIP]
-> You can also recognize local receipt images. See the [FormRecognizerClient](/jav) for scenarios involving local images.
-
-The returned value is a collection of **RecognizedReceipt** objects: one for each page in the submitted document. The next block of code iterates through the receipts and prints their details to the console.
-
-[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_receipts_print)]
-
-The next block of code iterates through the individual items detected on the receipt and prints their details to the console.
-
-[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_receipts_print_items)]
-
-### Output
-
-```console
-Analyze receipt...
--- Recognized Receipt page 0 --
-Merchant Name: Contoso Contoso, confidence: 0.62
-Merchant Address: 123 Main Street Redmond, WA 98052, confidence: 0.99
-Transaction Date: 2020-06-10, confidence: 0.90
-Receipt Items:
-Name: Cappuccino, confidence: 0.96s
-Quantity: null, confidence: 0.957s]
-Total Price: 2.200000, confidence: 0.95
-Name: BACON & EGGS, confidence: 0.94s
-Quantity: null, confidence: 0.927s]
-Total Price: null, confidence: 0.93
-```
-
-## Analyze business cards
-
-#### [version 2.0](#tab/ga)
-
-> [!IMPORTANT]
-> This feature isn't available in the selected API version.
-
-#### [version 2.1 preview](#tab/preview)
-
-This section demonstrates how to recognize and extract common fields from English business cards, using a pre-trained model.
-
-To recognize business cards from a URL, use the `beginRecognizeBusinessCardsFromUrl` method.
-
-[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer-preview.java?name=snippet_bc_call)]
-
-> [!TIP]
-> You can also recognize local business card images. See the [FormRecognizerClient](/jav) for scenarios involving local images.
-
-The returned value is a collection of **RecognizedForm** objects: one for each card in the document. The following code processes the business card at the given URI and prints the major fields and values to the console.
-
-[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer-preview.java?name=snippet_bc_print)]
--

## Analyze invoices
@@ -297,16 +238,16 @@ The returned value is a collection of **RecognizedForm** objects: one for each c
#### [version 2.1 preview](#tab/preview)
-This section demonstrates how to recognize and extract common fields from sales invoices, using a pre-trained model.
+This section demonstrates how to analyze and extract common fields from sales invoices, using a pre-trained model. For more information about invoice analysis, see the [Invoice conceptual guide](../../concept-invoices.md).
-To recognize business cards from a URL, use the `beginRecognizeInvoicesFromUrl` method.
+To analyze invoices from a URL, use the `beginRecognizeInvoicesFromUrl` method.
[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer-preview.java?name=snippet_invoice_call)]

> [!TIP]
-> You can also recognize local invoices. See the [FormRecognizerClient](/jav) for scenarios involving local images.
+> You can also analyze local invoices. See the [FormRecognizerClient](/jav) for scenarios involving local images.
-The returned value is a collection of **RecognizedForm** objects: one for each invoice in the document. The following code processes the business card at the given URI and prints the major fields and values to the console.
+The returned value is a collection of **RecognizedForm** objects: one for each invoice in the document. The following code processes the invoice at the given URI and prints the major fields and values to the console.
[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer-preview.java?name=snippet_invoice_print)]
@@ -321,14 +262,14 @@ This section demonstrates how to train a model with your own data. A trained mod
### Train a model without labels
-Train custom models to recognize all fields and values found in your custom forms without manually labeling the training documents.
+Train custom models to analyze all fields and values found in your custom forms without manually labeling the training documents.
The following method trains a model on a given set of documents and prints the model's status to the console.

[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_train_call)]
-The returned **CustomFormModel** object contains information on the form types the model can recognize and the fields it can extract from each form type. The following code block prints this information to the console.
+The returned **CustomFormModel** object contains information on the form types the model can analyze and the fields it can extract from each form type. The following code block prints this information to the console.
[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_train_print)]
@@ -423,6 +364,65 @@ Field 'field-5' has label 'Charges' with a confidence score of 1.00.
Field 'field-6' has label 'VAT ID' with a confidence score of 1.00.
```
+## Analyze receipts
+
+This section demonstrates how to analyze and extract common fields from US receipts, using a pre-trained receipt model. For more information about receipt analysis, see the [Receipts conceptual guide](../../concept-receipts.md).
+
+To analyze receipts from a URI, use the **beginRecognizeReceiptsFromUrl** method.
+
+[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_receipts_call)]
+
+> [!TIP]
+> You can also analyze local receipt images. See the [FormRecognizerClient](/jav) for scenarios involving local images.
+
+The returned value is a collection of **RecognizedReceipt** objects: one for each page in the submitted document. The next block of code iterates through the receipts and prints their details to the console.
+
+[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_receipts_print)]
+
+The next block of code iterates through the individual items detected on the receipt and prints their details to the console.
+
+[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer.java?name=snippet_receipts_print_items)]
+
+### Output
+
+```console
+Analyze receipt...
+-- Recognized Receipt page 0 --
+Merchant Name: Contoso Contoso, confidence: 0.62
+Merchant Address: 123 Main Street Redmond, WA 98052, confidence: 0.99
+Transaction Date: 2020-06-10, confidence: 0.90
+Receipt Items:
+Name: Cappuccino, confidence: 0.96
+Quantity: null, confidence: 0.957
+Total Price: 2.200000, confidence: 0.95
+Name: BACON & EGGS, confidence: 0.94
+Quantity: null, confidence: 0.927
+Total Price: null, confidence: 0.93
+```
+
+## Analyze business cards
+
+#### [version 2.0](#tab/ga)
+
+> [!IMPORTANT]
+> This feature isn't available in the selected API version.
+
+#### [version 2.1 preview](#tab/preview)
+
+This section demonstrates how to analyze and extract common fields from English business cards, using a pre-trained model. For more information about business card analysis, see the [Business cards conceptual guide](../../concept-business-cards.md).
+
+To analyze business cards from a URL, use the `beginRecognizeBusinessCardsFromUrl` method.
+
+[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer-preview.java?name=snippet_bc_call)]
+
+> [!TIP]
+> You can also analyze local business card images. See the [FormRecognizerClient](/jav) for scenarios involving local images.
+
+The returned value is a collection of **RecognizedForm** objects: one for each card in the document. The following code processes the business card at the given URI and prints the major fields and values to the console.
+
+[!code-java[](~/cognitive-services-quickstart-code/java/FormRecognizer/FormRecognizer-preview.java?name=snippet_bc_print)]
++

## Manage custom models
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/includes/quickstarts/javascript-sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/includes/quickstarts/javascript-sdk.md
@@ -76,15 +76,15 @@ With Form Recognizer, you can create two different client types. The first, `For
### FormRecognizerClient

`FormRecognizerClient` provides operations for:
- * Recognizing form fields and content using custom models trained to recognize your custom forms. These values are returned in a collection of `RecognizedForm` objects.
+ * Recognizing form fields and content using custom models trained to analyze your custom forms. These values are returned in a collection of `RecognizedForm` objects.
* Recognizing form content, including tables, lines and words, without the need to train a model. Form content is returned in a collection of `FormPage` objects.
* Recognizing common fields from receipts, using a pre-trained receipt model on the Form Recognizer service. These fields and meta-data are returned in a collection of `RecognizedReceipt`.

### FormTrainingClient

`FormTrainingClient` provides operations for:
-* Training custom models to recognize all fields and values found in your custom forms. A `CustomFormModel` is returned indicating the form types the model will recognize, and the fields it will extract for each form type. See the [service's documentation on unlabeled model training](#train-a-model-without-labels) for a more detailed explanation of creating a training data set.
-* Training custom models to recognize specific fields and values you specify by labeling your custom forms. A `CustomFormModel` is returned indicating the fields the model will extract, as well as the estimated accuracy for each field. See the [service's documentation on labeled model training](#train-a-model-with-labels) for a more detailed explanation of applying labels to a training data set.
+* Training custom models to analyze all fields and values found in your custom forms. A `CustomFormModel` is returned indicating the form types the model will analyze, and the fields it will extract for each form type. See the [service's documentation on unlabeled model training](#train-a-model-without-labels) for a more detailed explanation of creating a training data set.
+* Training custom models to analyze specific fields and values you specify by labeling your custom forms. A `CustomFormModel` is returned indicating the fields the model will extract, as well as the estimated accuracy for each field. See the [service's documentation on labeled model training](#train-a-model-with-labels) for a more detailed explanation of applying labels to a training data set.
* Managing models created in your account.
* Copying a custom model from one Form Recognizer resource to another.
@@ -123,7 +123,7 @@ You'll also need to add references to the URLs for your training and testing dat
## Analyze layout
-You can use Form Recognizer to recognize tables, lines, and words in documents, without needing to train a model. To recognize the content of a file at a given URI, use the `beginRecognizeContentFromUrl` method.
+You can use Form Recognizer to analyze tables, lines, and words in documents, without needing to train a model. For more information about layout extraction, see the [Layout conceptual guide](../../concept-layout.md). To analyze the content of a file at a given URI, use the `beginRecognizeContentFromUrl` method.
[!code-javascript[](~/cognitive-services-quickstart-code/javascript/FormRecognizer/FormRecognizerQuickstart.js?name=snippet_getcontent)]
@@ -147,31 +147,7 @@ cell [1,3] has text $56,651.49
cell [1,5] has text PT
```
-## Analyze receipts
-
-This section demonstrates how to recognize and extract common fields from US receipts, using a pre-trained receipt model.
-
-To recognize receipts from a URI, use the `beginRecognizeReceiptsFromUrl` method. The following code processes a receipt at the given URI and prints the major fields and values to the console.
-[!code-javascript[](~/cognitive-services-quickstart-code/javascript/FormRecognizer/FormRecognizerQuickstart.js?name=snippet_receipts)]
-
-> [!TIP]
-> You can also recognize local receipt images. See the [FormRecognizerClient](/javascript/api/@azure/ai-form-recognizer/formrecognizerclient) methods, such as **beginRecognizeReceipts**. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/formrecognizer/ai-form-recognizer/samples) for scenarios involving local images.
-
-### Output
-
-```console
-status: notStarted
-status: running
-status: succeeded
-First receipt:
- Receipt Type: 'Itemized', with confidence of 0.659
- Merchant Name: 'Contoso Contoso', with confidence of 0.516
- Transaction Date: 'Sun Jun 09 2019 17:00:00 GMT-0700 (Pacific Daylight Time)', with confidence of 0.985
- Item Name: '8GB RAM (Black)', with confidence of 0.916
- Item Name: 'SurfacePen', with confidence of 0.858
- Total: '1203.39', with confidence of 0.774
-```
## Train a custom model
@@ -182,7 +158,7 @@ This section demonstrates how to train a model with your own data. A trained mod
### Train a model without labels
-Train custom models to recognize all the fields and values found in your custom forms without manually labeling the training documents.
+Train custom models to analyze all the fields and values found in your custom forms without manually labeling the training documents.
The following function trains a model on a given set of documents and prints the model's status to the console.
@@ -317,6 +293,32 @@ Field Tax has value 'undefined' with a confidence score of undefined
Field Total has value 'undefined' with a confidence score of undefined
```
+## Analyze receipts
+
+This section demonstrates how to analyze and extract common fields from US receipts, using a pre-trained receipt model. For more information about receipt analysis, see the [Receipts conceptual guide](../../concept-receipts.md).
+
+To analyze receipts from a URI, use the `beginRecognizeReceiptsFromUrl` method. The following code processes a receipt at the given URI and prints the major fields and values to the console.
+
+[!code-javascript[](~/cognitive-services-quickstart-code/javascript/FormRecognizer/FormRecognizerQuickstart.js?name=snippet_receipts)]
+
+> [!TIP]
+> You can also analyze local receipt images. See the [FormRecognizerClient](/javascript/api/@azure/ai-form-recognizer/formrecognizerclient?view=azure-node-latest) methods, such as **beginRecognizeReceipts**. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/formrecognizer/ai-form-recognizer/samples) for scenarios involving local images.
+
+### Output
+
+```console
+status: notStarted
+status: running
+status: succeeded
+First receipt:
+ Receipt Type: 'Itemized', with confidence of 0.659
+ Merchant Name: 'Contoso Contoso', with confidence of 0.516
+ Transaction Date: 'Sun Jun 09 2019 17:00:00 GMT-0700 (Pacific Daylight Time)', with confidence of 0.985
+ Item Name: '8GB RAM (Black)', with confidence of 0.916
+ Item Name: 'SurfacePen', with confidence of 0.858
+ Total: '1203.39', with confidence of 0.774
+```
+

## Manage your custom models

This section demonstrates how to manage the custom models stored in your account. The following code does all of the model management tasks in a single function, as an example.
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/includes/quickstarts/python-sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/includes/quickstarts/python-sdk.md
@@ -73,15 +73,15 @@ With Form Recognizer, you can create two different client types. The first, `for
### FormRecognizerClient

`form_recognizer_client` provides operations for:
- * Recognizing form fields and content using custom models trained to recognize your custom forms.
+ * Recognizing form fields and content using custom models trained to analyze your custom forms.
* Recognizing form content, including tables, lines and words, without the need to train a model.
* Recognizing common fields from receipts, using a pre-trained receipt model on the Form Recognizer service.

### FormTrainingClient

`form_training_client` provides operations for:
-* Training custom models to recognize all fields and values found in your custom forms. See the [service's documentation on unlabeled model training](#train-a-model-without-labels) for a more detailed explanation of creating a training data set.
-* Training custom models to recognize specific fields and values you specify by labeling your custom forms. See the [service's documentation on labeled model training](#train-a-model-with-labels) for a more detailed explanation of applying labels to a training data set.
+* Training custom models to analyze all fields and values found in your custom forms. See the [service's documentation on unlabeled model training](#train-a-model-without-labels) for a more detailed explanation of creating a training data set.
+* Training custom models to analyze specific fields and values you specify by labeling your custom forms. See the [service's documentation on labeled model training](#train-a-model-with-labels) for a more detailed explanation of applying labels to a training data set.
* Managing models created in your account.
* Copying a custom model from one Form Recognizer resource to another.
@@ -134,9 +134,9 @@ You'll need to add references to the URLs for your training and testing data.
## Analyze layout
-You can use Form Recognizer to recognize tables, lines, and words in documents, without needing to train a model.
+You can use Form Recognizer to analyze tables, lines, and words in documents, without needing to train a model. For more information about layout extraction, see the [Layout conceptual guide](../../concept-layout.md).
-To recognize the content of a file at a given URL, use the `begin_recognize_content_from_url` method. The returned value is a collection of `FormPage` objects: one for each page in the submitted document. The following code iterates through these objects and prints the extracted key/value pairs and table data.
+To analyze the content of a file at a given URL, use the `begin_recognize_content_from_url` method. The returned value is a collection of `FormPage` objects: one for each page in the submitted document. The following code iterates through these objects and prints the extracted key/value pairs and table data.
[!code-python[](~/cognitive-services-quickstart-code/python/FormRecognizer/FormRecognizerQuickstart.py?name=snippet_getcontent)]
@@ -166,55 +166,6 @@ Confidence score: 1.0
```
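The layout results include table cells indexed by row and column. Once extracted, they can be reassembled into a 2-D grid for downstream processing. A minimal sketch, using hard-coded stand-in `(row, column, text)` tuples rather than the SDK's actual `FormTableCell` objects:

```python
# Stand-in cells mirroring the kind of table output shown in these quickstarts.
cells = [
    (0, 0, "Invoice Number"),
    (0, 1, "Invoice Date"),
    (1, 0, "34278587"),
    (1, 1, "6/18/2017"),
]

# Size the grid from the largest row/column indices, then fill it in.
n_rows = max(r for r, _, _ in cells) + 1
n_cols = max(c for _, c, _ in cells) + 1
grid = [["" for _ in range(n_cols)] for _ in range(n_rows)]
for r, c, text in cells:
    grid[r][c] = text

for row in grid:
    print(" | ".join(row))
```

Cells the service didn't return (merged or empty positions) are left as empty strings, which keeps each row the same length.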
-## Analyze receipts
-
-This section demonstrates how to recognize and extract common fields from US receipts, using a pre-trained receipt model. To recognize receipts from a URL, use the `begin_recognize_receipts_from_url` method.
-
-[!code-python[](~/cognitive-services-quickstart-code/python/FormRecognizer/FormRecognizerQuickstart.py?name=snippet_receipts)]
-
-> [!TIP]
-> You can also recognize local receipt images. See the [FormRecognizerClient](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formrecognizerclient) methods, such as `begin_recognize_receipts`. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/formrecognizer/azure-ai-formrecognizer/samples) for scenarios involving local images.
-
-### Output
-
-```console
-ReceiptType: Itemized has confidence 0.659
-MerchantName: Contoso Contoso has confidence 0.516
-MerchantAddress: 123 Main Street Redmond, WA 98052 has confidence 0.986
-MerchantPhoneNumber: None has confidence 0.99
-TransactionDate: 2019-06-10 has confidence 0.985
-TransactionTime: 13:59:00 has confidence 0.968
-Receipt Items:
-...Item #1
-......Name: 8GB RAM (Black) has confidence 0.916
-......TotalPrice: 999.0 has confidence 0.559
-...Item #2
-......Quantity: None has confidence 0.858
-......Name: SurfacePen has confidence 0.858
-......TotalPrice: 99.99 has confidence 0.386
-Subtotal: 1098.99 has confidence 0.964
-Tax: 104.4 has confidence 0.713
-Total: 1203.39 has confidence 0.774
-```
--
-## Analyze business cards
-
-#### [version 2.0](#tab/ga)
-
-> [!IMPORTANT]
-> This feature isn't available in the selected API version.
-
-#### [version 2.1 preview](#tab/preview)
-
-This section demonstrates how to recognize and extract common fields from English business cards, using a pre-trained model. To recognize business cards from a URL, use the `begin_recognize_business_cards_from_url` method.
-
-[!code-python[](~/cognitive-services-quickstart-code/python/FormRecognizer/FormRecognizerQuickstart-preview.py?name=snippet_bc)]
-
-> [!TIP]
-> You can also recognize local business card images. See the [FormRecognizerClient](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formrecognizerclient) methods, such as `begin_recognize_business_cards`. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/formrecognizer/azure-ai-formrecognizer/samples) for scenarios involving local images.
-- ## Analyze invoices
@@ -225,12 +176,12 @@ This section demonstrates how to recognize and extract common fields from Englis
#### [version 2.1 preview](#tab/preview)
-This section demonstrates how to recognize and extract common fields from sales invoices, using a pre-trained model. To recognize invoices from a URL, use the `begin_recognize_invoices_from_url` method.
+This section demonstrates how to analyze and extract common fields from sales invoices, using a pre-trained model. For more information about invoice analysis, see the [Invoice conceptual guide](../../concept-invoices.md). To analyze invoices from a URL, use the `begin_recognize_invoices_from_url` method.
[!code-python[](~/cognitive-services-quickstart-code/python/FormRecognizer/FormRecognizerQuickstart-preview.py?name=snippet_invoice)] > [!TIP]
-> You can also recognize local invoice images. See the [FormRecognizerClient](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formrecognizerclient) methods, such as `begin_recognize_invoices`. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/formrecognizer/azure-ai-formrecognizer/samples) for scenarios involving local images.
+> You can also analyze local invoice images. See the [FormRecognizerClient](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formrecognizerclient?view=azure-python) methods, such as `begin_recognize_invoices`. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/formrecognizer/azure-ai-formrecognizer/samples) for scenarios involving local images.
@@ -243,9 +194,9 @@ This section demonstrates how to train a model with your own data. A trained mod
### Train a model without labels
-Train custom models to recognize all fields and values found in your custom forms without manually labeling the training documents.
+Train custom models to analyze all fields and values found in your custom forms without manually labeling the training documents.
-The following code uses the training client with the `begin_training` function to train a model on a given set of documents. The returned `CustomFormModel` object contains information on the form types the model can recognize and the fields it can extract from each form type. The following code block prints this information to the console.
+The following code uses the training client with the `begin_training` function to train a model on a given set of documents. The returned `CustomFormModel` object contains information on the form types the model can analyze and the fields it can extract from each form type. The following code block prints this information to the console.
[!code-python[](~/cognitive-services-quickstart-code/python/FormRecognizer/FormRecognizerQuickstart.py?name=snippet_train)]
@@ -366,6 +317,56 @@ Field 'Tax' has label 'Tax' with value 'None' and a confidence score of None
Field 'Total' has label 'Total' with value 'None' and a confidence score of None ```
+## Analyze receipts
+
+This section demonstrates how to analyze and extract common fields from US receipts, using a pre-trained receipt model. For more information about receipt analysis, see the [Receipts conceptual guide](../../concept-receipts.md). To analyze receipts from a URL, use the `begin_recognize_receipts_from_url` method.
+
+[!code-python[](~/cognitive-services-quickstart-code/python/FormRecognizer/FormRecognizerQuickstart.py?name=snippet_receipts)]
+
+> [!TIP]
+> You can also analyze local receipt images. See the [FormRecognizerClient](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formrecognizerclient?view=azure-python) methods, such as `begin_recognize_receipts`. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/formrecognizer/azure-ai-formrecognizer/samples) for scenarios involving local images.
+
+### Output
+
+```console
+ReceiptType: Itemized has confidence 0.659
+MerchantName: Contoso Contoso has confidence 0.516
+MerchantAddress: 123 Main Street Redmond, WA 98052 has confidence 0.986
+MerchantPhoneNumber: None has confidence 0.99
+TransactionDate: 2019-06-10 has confidence 0.985
+TransactionTime: 13:59:00 has confidence 0.968
+Receipt Items:
+...Item #1
+......Name: 8GB RAM (Black) has confidence 0.916
+......TotalPrice: 999.0 has confidence 0.559
+...Item #2
+......Quantity: None has confidence 0.858
+......Name: SurfacePen has confidence 0.858
+......TotalPrice: 99.99 has confidence 0.386
+Subtotal: 1098.99 has confidence 0.964
+Tax: 104.4 has confidence 0.713
+Total: 1203.39 has confidence 0.774
+```
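Prebuilt-model confidence varies per field (note the 0.386 on one item price above), so it can be worth cross-checking the arithmetic of the extracted amounts. A minimal sketch using the sample values from the output above; the dictionary is an illustrative stand-in for the fields the SDK returns:

```python
# Sanity-check the receipt totals printed above: subtotal, tax, and total
# should be mutually consistent. Values come from the sample output;
# the flat dictionary shape is illustrative, not the SDK's object model.
receipt_fields = {
    "Subtotal": 1098.99,
    "Tax": 104.4,
    "Total": 1203.39,
}

def totals_consistent(fields, tolerance=0.01):
    """Return True if Subtotal + Tax matches Total within a tolerance."""
    expected = fields["Subtotal"] + fields["Tax"]
    return abs(expected - fields["Total"]) <= tolerance

print(totals_consistent(receipt_fields))  # True for the sample receipt
```

A failed check is a hint to inspect the low-confidence fields rather than a guarantee of an extraction error.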
++
+## Analyze business cards
+
+#### [version 2.0](#tab/ga)
+
+> [!IMPORTANT]
+> This feature isn't available in the selected API version.
+
+#### [version 2.1 preview](#tab/preview)
+
+This section demonstrates how to analyze and extract common fields from English business cards, using a pre-trained model. For more information about business card analysis, see the [Business cards conceptual guide](../../concept-business-cards.md). To analyze business cards from a URL, use the `begin_recognize_business_cards_from_url` method.
+
+[!code-python[](~/cognitive-services-quickstart-code/python/FormRecognizer/FormRecognizerQuickstart-preview.py?name=snippet_bc)]
+
+> [!TIP]
+> You can also analyze local business card images. See the [FormRecognizerClient](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer.formrecognizerclient?view=azure-python) methods, such as `begin_recognize_business_cards`. Or, see the sample code on [GitHub](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/formrecognizer/azure-ai-formrecognizer/samples) for scenarios involving local images.
+++

## Manage your custom models

This section demonstrates how to manage the custom models stored in your account.
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/includes/quickstarts/rest-api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/includes/quickstarts/rest-api.md
@@ -1,6 +1,6 @@
Title: "Quickstart: Form Recognizer client library for .NET"
-description: Use the Form Recognizer client library for .NET to create a forms processing app that extracts key/value pairs and table data from your custom documents.
+ Title: "Quickstart: Form Recognizer REST API"
+description: Use the Form Recognizer REST API to create a forms processing app that extracts key/value pairs and table data from your custom documents.
@@ -29,7 +29,7 @@
## Analyze layout
-You can use Form Recognizer to recognize and extract tables, lines, and words in documents, without needing to train a model. Before you run the command, make these changes:
+You can use Form Recognizer to analyze and extract tables, selection marks, text, and structure in documents, without needing to train a model. For more information about layout extraction, see the [Layout conceptual guide](../../concept-layout.md). Before you run the command, make these changes:
1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
1. Replace `{subscription key}` with the subscription key you copied from the previous step.
@@ -86,7 +86,7 @@ curl -v -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/layout/analyzeR
You'll receive a `200 (success)` response with JSON content.
-See the following invoice image and its corresponding JSON output. The output has been shortened for simplicity. The `"readResults"` node contains every line of text with its respective bounding box placement on the page. The `"selectionMarks"` node (in v2.1 preview) shows every selection mark (checkbox, radio mark) and whether its status is "selected" or "unselected". The `"pageResults"` field shows every piece of text within tables, each with its row-column coordinate.
+See the following invoice image and its corresponding JSON output. The output has been shortened for simplicity. The `"readResults"` node contains every line of text with its respective bounding box placement on the page. The `"selectionMarks"` node (in v2.1 preview) shows every selection mark (checkbox, radio mark) and whether its status is "selected" or "unselected". The `"pageResults"` section includes the extracted tables. For each table, the cell text, row and column indices, row and column spans, bounding boxes, and more are extracted.
:::image type="content" source="../../media/contoso-invoice.png" alt-text="Contoso project statement document with a table.":::
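Each table cell in `"pageResults"` carries its own `rowIndex` and `columnIndex`, so reassembling the table is a matter of placing cell text into a grid. A minimal sketch with a toy subset of cells (a real response includes many more cells plus bounding boxes):

```python
# Reconstruct a table from the cells of a "pageResults" entry, using the
# rowIndex/columnIndex fields described above. The cells here are a toy
# subset; the cell texts are illustrative.
table = {
    "rows": 2,
    "columns": 2,
    "cells": [
        {"rowIndex": 0, "columnIndex": 0, "text": "QUANTITY"},
        {"rowIndex": 0, "columnIndex": 1, "text": "DESCRIPTION"},
        {"rowIndex": 1, "columnIndex": 0, "text": "2"},
        {"rowIndex": 1, "columnIndex": 1, "text": "Surface Pen"},
    ],
}

def to_grid(table):
    """Place each cell's text into a rows x columns grid."""
    grid = [[""] * table["columns"] for _ in range(table["rows"])]
    for cell in table["cells"]:
        grid[cell["rowIndex"]][cell["columnIndex"]] = cell["text"]
    return grid

print(to_grid(table))  # [['QUANTITY', 'DESCRIPTION'], ['2', 'Surface Pen']]
```

Cells that span multiple rows or columns report `rowSpan`/`columnSpan` values; this sketch ignores spans for brevity.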
@@ -314,1203 +314,1205 @@ See the following invoice image and its corresponding JSON output. The output ha
-## Analyze receipts
-To start analyzing a receipt, call the **[Analyze Receipt](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/AnalyzeReceiptAsync)** API using the cURL command below. Before you run the command, make these changes:
-1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
-1. Replace `{your receipt URL}` with the URL address of a receipt image.
-1. Replace `{subscription key>` with the subscription key you copied from the previous step.
+## Analyze invoices
-# [v2.0](#tab/v2-0)
+# [version 2.0](#tab/v2-0)
-```bash
-curl -i -X POST "https://{Endpoint}/formrecognizer/v2.0/prebuilt/receipt/analyze" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \"{your receipt URL}\"}"
-```
+> [!IMPORTANT]
+> This feature isn't available in the selected API version.
-# [v2.1 preview](#tab/v2-1)
+# [version 2.1 preview](#tab/v2-1)
+
+To start analyzing an invoice, use the cURL command below. For more information about invoice analysis, see the [Invoice conceptual guide](../../concept-invoices.md). Before you run the command, make these changes:
+
+1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
+1. Replace `{your invoice URL}` with the URL address of an invoice document.
+1. Replace `{subscription key}` with your subscription key.
```bash
-curl -i -X POST "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/receipt/analyze" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \"{your receipt URL}\"}"
+curl -v -X POST "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyze"
+-H "Content-Type: application/json"
+-H "Ocp-Apim-Subscription-Key: {subscription key}"
+--data-ascii "{ \"source\": \"{your invoice URL}\"}"
```-
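If you prefer to issue the same request from code rather than cURL, the pieces assemble as follows. A sketch only: the endpoint and invoice URL below are placeholder values, and the actual POST (for example, with the `requests` package) is left as a comment:

```python
import json

# Build the same Analyze Invoice request the cURL command above sends.
# Endpoint and invoice URL are placeholders, not real values.
endpoint = "https://westus2.api.cognitive.microsoft.com"
invoice_url = "https://example.com/sample-invoice.pdf"

analyze_url = f"{endpoint}/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyze"
headers = {
    "Content-Type": "application/json",
    "Ocp-Apim-Subscription-Key": "<subscription key>",
}
body = json.dumps({"source": invoice_url})
# POST analyze_url with these headers and body to start the
# asynchronous analysis; the response carries no result payload yet.
```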
-You'll receive a `202 (Success)` response that includes am **Operation-Location** header. The value of this header contains an operation ID that you can use to query the status of the asynchronous operation and get the results. In the following example, the string after `operations/` is the operation ID.
+You'll receive a `202 (Success)` response that includes an **Operation-Location** header. The value of this header contains an operation ID that you can use to query the status of the asynchronous operation and get the results.
```console
-https://cognitiveservice/formrecognizer/v2.1-preview.2/prebuilt/receipt/operations/54f0b076-4e38-43e5-81bd-b85b8835fdfb
+https://cognitiveservice/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyzeResults/54f0b076-4e38-43e5-81bd-b85b8835fdfb
```
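The operation ID is the final path segment of the **Operation-Location** URL. A small helper to pull it out, using the sample URL shown above:

```python
# The Operation-Location header value ends with the operation (result) ID.
# The URL below is the sample shown above.
operation_location = (
    "https://cognitiveservice/formrecognizer/v2.1-preview.2"
    "/prebuilt/invoice/analyzeResults/54f0b076-4e38-43e5-81bd-b85b8835fdfb"
)

def operation_id(location_header: str) -> str:
    """Return the trailing ID segment of an Operation-Location URL."""
    return location_header.rstrip("/").rsplit("/", 1)[-1]

print(operation_id(operation_location))
```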
-### Get the receipt results
+### Get invoice results
-After you've called the **Analyze Receipt** API, you call the **[Get Analyze Receipt Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/GetAnalyzeReceiptResult)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
+After you've called the **[Analyze Invoice](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/5ed8c9843c2794cbb1a96291)** API, you call the **[Get Analyze Invoice Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/5ed8c9acb78c40a2533aee83)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
-1. Replace `{operationId}` with the operation ID from the previous step.
+1. Replace `{resultId}` with the operation ID from the previous step.
1. Replace `{subscription key}` with your subscription key.
-# [v2.0](#tab/v2-0)
```bash
-curl -X GET "https://{Endpoint}/formrecognizer/v2.0/prebuilt/receipt/analyzeResults/{operationId}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -v -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyzeResults/{resultId}"
+-H "Ocp-Apim-Subscription-Key: {subscription key}"
```
-# [v2.1 preview](#tab/v2-1)
+
+### Examine the response
+
+You'll receive a `200 (Success)` response with JSON output. The `"readResults"` field contains every line of text that was extracted from the invoice, the `"pageResults"` field includes the tables and selection marks extracted from the invoice, and the `"documentResults"` field contains key/value information for the most relevant parts of the invoice.
+
+See the following invoice document and its corresponding JSON output. The JSON content has been shortened for readability.
+
+* [Sample invoice](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/tree/master/curl/form-recognizer/sample-invoice.pdf)
+
+```json
+{
+ "status": "succeeded",
+ "createdDateTime": "2020-11-06T23:32:11Z",
+ "lastUpdatedDateTime": "2020-11-06T23:32:20Z",
+ "analyzeResult": {
+ "version": "2.1.0",
+ "readResults": [{
+ "page": 1,
+ "angle": 0,
+ "width": 8.5,
+ "height": 11,
+ "unit": "inch"
+ }],
+ "pageResults": [{
+ "page": 1,
+ "tables": [{
+ "rows": 3,
+ "columns": 4,
+ "cells": [{
+ "rowIndex": 0,
+ "columnIndex": 0,
+ "text": "QUANTITY",
+ "boundingBox": [0.4953,
+ 5.7306,
+ 1.8097,
+ 5.7306,
+ 1.7942,
+ 6.0122,
+ 0.4953,
+ 6.0122]
+ },
+ {
+ "rowIndex": 0,
+ "columnIndex": 1,
+ "text": "DESCRIPTION",
+ "boundingBox": [1.8097,
+ 5.7306,
+ 5.7529,
+ 5.7306,
+ 5.7452,
+ 6.0122,
+ 1.7942,
+ 6.0122]
+ },
+ ...
+ ],
+ "boundingBox": [0.4794,
+ 5.7132,
+ 8.0158,
+ 5.714,
+ 8.0118,
+ 6.5627,
+ 0.4757,
+ 6.5619]
+ },
+ {
+ "rows": 2,
+ "columns": 6,
+ "cells": [{
+ "rowIndex": 0,
+ "columnIndex": 0,
+ "text": "SALESPERSON",
+ "boundingBox": [0.4979,
+ 4.963,
+ 1.8051,
+ 4.963,
+ 1.7975,
+ 5.2398,
+ 0.5056,
+ 5.2398]
+ },
+ {
+ "rowIndex": 0,
+ "columnIndex": 1,
+ "text": "P.O. NUMBER",
+ "boundingBox": [1.8051,
+ 4.963,
+ 3.3047,
+ 4.963,
+ 3.3124,
+ 5.2398,
+ 1.7975,
+ 5.2398]
+ },
+ ...
+ ],
+ "boundingBox": [0.4976,
+ 4.961,
+ 7.9959,
+ 4.9606,
+ 7.9959,
+ 5.5204,
+ 0.4972,
+ 5.5209]
+ }]
+ }],
+ "documentResults": [{
+ "docType": "prebuilt:invoice",
+ "pageRange": [1,
+ 1],
+ "fields": {
+ "AmountDue": {
+ "type": "number",
+ "valueNumber": 610,
+ "text": "$610.00",
+ "boundingBox": [7.3809,
+ 7.8153,
+ 7.9167,
+ 7.8153,
+ 7.9167,
+ 7.9591,
+ 7.3809,
+ 7.9591],
+ "page": 1,
+ "confidence": 0.875
+ },
+ "BillingAddress": {
+ "type": "string",
+ "valueString": "123 Bill St, Redmond WA, 98052",
+ "text": "123 Bill St, Redmond WA, 98052",
+ "boundingBox": [0.594,
+ 4.3724,
+ 2.0125,
+ 4.3724,
+ 2.0125,
+ 4.7125,
+ 0.594,
+ 4.7125],
+ "page": 1,
+ "confidence": 0.997
+ },
+ "BillingAddressRecipient": {
+ "type": "string",
+ "valueString": "Microsoft Finance",
+ "text": "Microsoft Finance",
+ "boundingBox": [0.594,
+ 4.1684,
+ 1.7907,
+ 4.1684,
+ 1.7907,
+ 4.2837,
+ 0.594,
+ 4.2837],
+ "page": 1,
+ "confidence": 0.998
+ },
+ ...
+ }
+ }]
+ }
+}
+```
+++
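Consuming the response usually means walking `"documentResults"` and reading each field's `text` and `confidence`. A runnable sketch over a dictionary trimmed to two of the fields from the sample JSON above:

```python
# Walk the "documentResults" fields of an invoice response and collect
# each field's text and confidence. `analyze_result` mirrors the shape
# of the JSON above, trimmed to two fields for brevity.
analyze_result = {
    "documentResults": [{
        "fields": {
            "AmountDue": {"text": "$610.00", "confidence": 0.875},
            "BillingAddressRecipient": {"text": "Microsoft Finance", "confidence": 0.998},
        }
    }]
}

def field_summaries(result):
    """Yield (name, text, confidence) for every extracted invoice field."""
    for document in result["documentResults"]:
        for name, field in document["fields"].items():
            yield name, field.get("text"), field.get("confidence")

summaries = sorted(field_summaries(analyze_result))
for name, text, confidence in summaries:
    print(f"{name}: {text} (confidence {confidence})")
```

Using `.get()` keeps the walk tolerant of fields that omit `text` or `confidence`, which can happen for empty values.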
+## Train a custom model
+
+To train a custom model, you'll need a set of training data in an Azure Storage blob. You need a minimum of five filled-in forms (PDF documents and/or images) of the same type/structure. See [Build a training data set for a custom model](../../build-training-data-set.md) for tips and options for putting together your training data.
+
+> [!NOTE]
+> For high-accuracy models, you can train with manually labeled data. See the [Train with labels](../../quickstarts/label-tool.md) getting started guide.
+
+To train a Form Recognizer model with the documents in your Azure blob container, call the **[Train Custom Model](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2/operations/TrainCustomModelAsync)** API by running the following cURL command. Before you run the command, make these changes:
+
+1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
+1. Replace `{subscription key}` with the subscription key you copied from the previous step.
+1. Replace `{SAS URL}` with the Azure Blob storage container's shared access signature (SAS) URL. [!INCLUDE [get SAS URL](../sas-instructions.md)]
+
+ :::image type="content" source="../../media/quickstarts/get-sas-url.png" alt-text="SAS URL retrieval":::
+
+# [v2.0](#tab/v2-0)
```bash
-curl -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/receipt/analyzeResults/{operationId}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -i -X POST "https://{Endpoint}/formrecognizer/v2.0/custom/models" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \""{SAS URL}"\"}"
+```
+# [v2.1 preview](#tab/v2-1)
+```bash
+curl -i -X POST "https://{Endpoint}/formrecognizer/v2.1-preview.2/custom/models" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \""{SAS URL}"\"}"
```+
-### Examine the response
-You'll receive a `200 (Success)` response with JSON output. The first field, `"status"`, indicates the status of the operation. If the operation is complete, the `"readResults"` field contains every line of text that was extracted from the receipt, and the `"documentResults"` field contains key/value information for the most relevant parts of the receipt. If the operation is not complete, the value of `"status"` will be `"running"` or `"notStarted"`, and you should call the API again, either manually or through a script. We recommend an interval of one second or more between calls.
+You'll receive a `201 (Success)` response with a **Location** header. The value of this header is the ID of the new model being trained.
-See the following receipt image and its corresponding JSON output. The output has been shortened for readability.
+### Get training results
-![A receipt from Contoso store](../../media/contoso-allinone.jpg)
+After you've started the train operation, you use the **[Get Custom Model](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2/operations/GetCustomModel)** operation to check the training status. Pass the model ID into this API call:
-The `"readResults"` node contains all of the recognized text (if you set the optional *includeTextDetails* parameter to `true`). Text is organized by page, then by line, then by individual words. The `"documentResults"` node contains the receipt-specific values that the model discovered. This is where you'll find useful key/value pairs like the tax, total, merchant address, and so on.
+1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription key.
+1. Replace `{subscription key}` with your subscription key.
+1. Replace `{model ID}` with the model ID you received in the previous step.
++
+# [v2.0](#tab/v2-0)
+```bash
+curl -X GET "https://{Endpoint}/formrecognizer/v2.0/custom/models/{model ID}" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+```
+# [v2.1 preview](#tab/v2-1)
+```bash
+curl -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/custom/models/{model ID}" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+```
+
++
+You'll receive a `200 (Success)` response with a JSON body in the following format. Notice the `"status"` field. This will have the value `"ready"` once training is complete. If the model is not finished training, you'll need to query the service again by rerunning the command. We recommend an interval of one second or more between calls.
+
+The `"modelId"` field contains the ID of the model you're training. You'll need this for the next step.
```json
{
- "status":"succeeded",
- "createdDateTime":"2019-12-17T04:11:24Z",
- "lastUpdatedDateTime":"2019-12-17T04:11:32Z",
- "analyzeResult":{
- "version":"2.1.0",
- "readResults":[
+ "modelInfo":{
+ "status":"ready",
+ "createdDateTime":"2019-10-08T10:20:31.957784",
+ "lastUpdatedDateTime":"2019-10-08T14:20:41+00:00",
+ "modelId":"1cfb372bab404ba3aa59481ab2c63da5"
+ },
+ "trainResult":{
+ "trainingDocuments":[
{
- "page":1,
- "angle":0.6893,
- "width":1688,
- "height":3000,
- "unit":"pixel",
- "language":"en",
- "lines":[
+ "documentName":"invoices\\Invoice_1.pdf",
+ "pages":1,
+ "errors":[
+
+ ],
+ "status":"succeeded"
+ },
+ {
+ "documentName":"invoices\\Invoice_2.pdf",
+ "pages":1,
+ "errors":[
+
+ ],
+ "status":"succeeded"
+ },
+ {
+ "documentName":"invoices\\Invoice_3.pdf",
+ "pages":1,
+ "errors":[
+
+ ],
+ "status":"succeeded"
+ },
+ {
+ "documentName":"invoices\\Invoice_4.pdf",
+ "pages":1,
+ "errors":[
+
+ ],
+ "status":"succeeded"
+ },
+ {
+ "documentName":"invoices\\Invoice_5.pdf",
+ "pages":1,
+ "errors":[
+
+ ],
+ "status":"succeeded"
+ }
+ ],
+ "errors":[
+
+ ]
+ },
+ "keys":{
+ "0":[
+ "Address:",
+ "Invoice For:",
+ "Microsoft",
+ "Page"
+ ]
+ }
+}
+```
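The polling described above is a simple loop: request the model info, stop on `"ready"` (or `"invalid"`), otherwise wait and retry. A runnable sketch in which `fetch_model_info` is a stand-in for the Get Custom Model call, faked here so the loop runs without a service endpoint; the model ID is the one from the sample JSON:

```python
import time

# Poll until training reports "ready". `fetch_model_info` fakes the
# Get Custom Model call with a canned sequence of responses.
_responses = iter([
    {"status": "creating"},
    {"status": "creating"},
    {"status": "ready", "modelId": "1cfb372bab404ba3aa59481ab2c63da5"},
])

def fetch_model_info():
    return next(_responses)

def wait_until_ready(poll=fetch_model_info, interval=1.0, max_tries=30):
    """Call `poll` until the model status is 'ready' or 'invalid'."""
    for _ in range(max_tries):
        info = poll()
        if info["status"] in ("ready", "invalid"):
            return info
        time.sleep(interval)  # keep at least one second between real calls
    raise TimeoutError("model did not finish training in time")

model_info = wait_until_ready(interval=0.0)  # zero interval only for the fake
print(model_info["status"])
```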
+
+## Analyze forms with a custom model
+
+Next, you'll use your newly trained model to analyze a document and extract key-value pairs and tables from it. Call the **[Analyze Form](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2/operations/AnalyzeWithCustomForm)** API by running the following cURL command. Before you run the command, make these changes:
+
+1. Replace `{Endpoint}` with the endpoint that you obtained from your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
+1. Replace `{model ID}` with the model ID that you received in the previous section.
+1. Replace `{SAS URL}` with a SAS URL to your file in Azure storage. Follow the steps in the Training section, but instead of getting a SAS URL for the whole blob container, get one for the specific file you want to analyze.
+1. Replace `{subscription key}` with your subscription key.
+
+# [v2.0](#tab/v2-0)
+
+```bash
+curl -v "https://{Endpoint}/formrecognizer/v2.0/custom/models/{model ID}/analyze?includeTextDetails=true" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" -d "{ \"source\": \""{SAS URL}"\" } "
+```
+
+# [v2.1 preview](#tab/v2-1)
+```bash
+curl -v "https://{Endpoint}/formrecognizer/v2.1-preview.2/custom/models/{model ID}/analyze?includeTextDetails=true" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" -d "{ \"source\": \""{SAS URL}"\" } "
+```
+
++++
+You'll receive a `202 (Success)` response with an **Operation-Location** header. The value of this header includes a results ID you use to track the results of the Analyze operation. Save this results ID for the next step.
+
+### Get the Analyze results
+
+Call the **[Get Analyze Form Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/GetAnalyzeFormResult)** API to query the results of the Analyze operation. Before you run the command, make these changes:
+
+1. Replace `{Endpoint}` with the endpoint that you obtained from your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
+1. Replace `{result ID}` with the ID that you received in the previous section.
+1. Replace `{subscription key}` with your subscription key.
+
+# [v2.0](#tab/v2-0)
+```bash
+curl -X GET "https://{Endpoint}/formrecognizer/v2.0/custom/models/{model ID}/analyzeResults/{result ID}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+```
+# [v2.1 preview](#tab/v2-1)
+```bash
+curl -X GET "https://{Endpoint}/formrecognizer/v2.1-preview/custom/models/{model ID}/analyzeResults/{result ID}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+```
++
+You'll receive a `200 (Success)` response with a JSON body in the following format. The output has been shortened for simplicity. Notice the `"status"` field near the bottom. This will have the value `"succeeded"` when the Analyze operation is complete. If the Analyze operation hasn't completed, you'll need to query the service again by rerunning the command. We recommend an interval of one second or more between calls.
+
+In custom models trained without labels, the key/value pair associations and tables are in the `"pageResults"` node of the JSON output. In custom models trained with labels, the key/value pair associations are in the `"documentResults"` node. If you also specified plain text extraction through the *includeTextDetails* URL parameter, then the `"readResults"` node will show the content and positions of all the text in the document.
+
+This sample JSON output has been shortened for simplicity.
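For a model trained without labels, the key/value pairs therefore live under `"pageResults"`. A small helper that lists the key text of every pair; `page_results` is a trimmed, illustrative copy of that shape (the value text shown is made up):

```python
# Collect the key text of every key/value pair reported under
# "pageResults" by a custom model trained without labels. The value
# text below is illustrative.
page_results = [{
    "page": 1,
    "keyValuePairs": [
        {"key": {"text": "Date:"}, "value": {"text": "9/10/2020"}, "confidence": 1},
    ],
}]

def extracted_keys(page_results):
    """Collect the key text of every key/value pair on every page."""
    return [pair["key"]["text"]
            for page in page_results
            for pair in page.get("keyValuePairs", [])]

print(extracted_keys(page_results))  # ['Date:']
```

For a model trained with labels, point the same kind of walk at the `"documentResults"` node instead.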
+
+# [v2.0](#tab/v2-0)
+```JSON
+{
+ "status": "succeeded",
+ "createdDateTime": "2020-08-21T00:46:25Z",
+ "lastUpdatedDateTime": "2020-08-21T00:46:32Z",
+ "analyzeResult": {
+ "version": "2.0.0",
+ "readResults": [
+ {
+ "page": 1,
+ "angle": 0,
+ "width": 8.5,
+ "height": 11,
+ "unit": "inch",
+ "lines": [
{
- "text":"Contoso",
- "boundingBox":[
- 635,
- 510,
- 1086,
- 461,
- 1098,
- 558,
- 643,
- 604
+ "text": "Project Statement",
+ "boundingBox": [
+ 5.0153,
+ 0.275,
+ 8.0944,
+ 0.275,
+ 8.0944,
+ 0.7125,
+ 5.0153,
+ 0.7125
],
- "words":[
+ "words": [
{
- "text":"Contoso",
- "boundingBox":[
- 639,
- 510,
- 1087,
- 461,
- 1098,
- 551,
- 646,
- 604
- ],
- "confidence":0.955
+ "text": "Project",
+ "boundingBox": [
+ 5.0153,
+ 0.275,
+ 6.2278,
+ 0.275,
+ 6.2278,
+ 0.7125,
+ 5.0153,
+ 0.7125
+ ]
+ },
+ {
+ "text": "Statement",
+ "boundingBox": [
+ 6.3292,
+ 0.275,
+ 8.0944,
+ 0.275,
+ 8.0944,
+ 0.7125,
+ 6.3292,
+ 0.7125
+ ]
} ]
- },
- ...
+ },
+ ...
] } ],
- "documentResults":[
+ "pageResults": [
{
- "docType":"prebuilt:receipt",
- "pageRange":[
- 1,
- 1
- ],
- "fields":{
- "ReceiptType":{
- "type":"string",
- "valueString":"Itemized",
- "confidence":0.692
- },
- "MerchantName":{
- "type":"string",
- "valueString":"Contoso Contoso",
- "text":"Contoso Contoso",
- "boundingBox":[
- 378.2,
- 292.4,
- 1117.7,
- 468.3,
- 1035.7,
- 812.7,
- 296.3,
- 636.8
- ],
- "page":1,
- "confidence":0.613,
- "elements":[
- "#/readResults/0/lines/0/words/0",
- "#/readResults/0/lines/1/words/0"
- ]
+ "page": 1,
+ "keyValuePairs": [
+ {
+ "key": {
+ "text": "Date:",
+ "boundingBox": [
+ 6.9722,
+ 1.0264,
+ 7.3417,
+ 1.0264,
+ 7.3417,
+ 1.1931,
+ 6.9722,
+ 1.1931
+ ],
+ "elements": [
+ "#/readResults/0/lines/2/words/0"
+ ]
+ },
+ "confidence": 1
},
- "MerchantAddress":{
- "type":"string",
- "valueString":"123 Main Street Redmond, WA 98052",
- "text":"123 Main Street Redmond, WA 98052",
- "boundingBox":[
- 302,
- 675.8,
- 848.1,
- 793.7,
- 809.9,
- 970.4,
- 263.9,
- 852.5
- ],
- "page":1,
- "confidence":0.99,
- "elements":[
- "#/readResults/0/lines/2/words/0",
- "#/readResults/0/lines/2/words/1",
- "#/readResults/0/lines/2/words/2",
- "#/readResults/0/lines/3/words/0",
- "#/readResults/0/lines/3/words/1",
- "#/readResults/0/lines/3/words/2"
- ]
- },
- "MerchantPhoneNumber":{
- "type":"phoneNumber",
- "valuePhoneNumber":"+19876543210",
- "text":"987-654-3210",
- "boundingBox":[
- 278,
- 1004,
- 656.3,
- 1054.7,
- 646.8,
- 1125.3,
- 268.5,
- 1074.7
- ],
- "page":1,
- "confidence":0.99,
- "elements":[
- "#/readResults/0/lines/4/words/0"
- ]
- },
- "TransactionDate":{
- "type":"date",
- "valueDate":"2019-06-10",
- "text":"6/10/2019",
- "boundingBox":[
- 265.1,
- 1228.4,
- 525,
- 1247,
- 518.9,
- 1332.1,
- 259,
- 1313.5
- ],
- "page":1,
- "confidence":0.99,
- "elements":[
- "#/readResults/0/lines/5/words/0"
- ]
- },
- "TransactionTime":{
- "type":"time",
- "valueTime":"13:59:00",
- "text":"13:59",
- "boundingBox":[
- 541,
- 1248,
- 677.3,
- 1261.5,
- 668.9,
- 1346.5,
- 532.6,
- 1333
- ],
- "page":1,
- "confidence":0.977,
- "elements":[
- "#/readResults/0/lines/5/words/1"
- ]
- },
- "Items":{
- "type":"array",
- "valueArray":[
- {
- "type":"object",
- "valueObject":{
- "Quantity":{
- "type":"number",
- "text":"1",
- "boundingBox":[
- 245.1,
- 1581.5,
- 300.9,
- 1585.1,
- 295,
- 1676,
- 239.2,
- 1672.4
- ],
- "page":1,
- "confidence":0.92,
- "elements":[
- "#/readResults/0/lines/7/words/0"
- ]
- },
- "Name":{
- "type":"string",
- "valueString":"Cappuccino",
- "text":"Cappuccino",
- "boundingBox":[
- 322,
- 1586,
- 654.2,
- 1601.1,
- 650,
- 1693,
- 317.8,
- 1678
- ],
- "page":1,
- "confidence":0.923,
- "elements":[
- "#/readResults/0/lines/7/words/1"
- ]
- },
- "TotalPrice":{
- "type":"number",
- "valueNumber":2.2,
- "text":"$2.20",
- "boundingBox":[
- 1107.7,
- 1584,
- 1263,
- 1574,
- 1268.3,
- 1656,
- 1113,
- 1666
- ],
- "page":1,
- "confidence":0.918,
- "elements":[
- "#/readResults/0/lines/8/words/0"
- ]
- }
- }
- },
- ...
- ]
- },
- "Subtotal":{
- "type":"number",
- "valueNumber":11.7,
- "text":"11.70",
- "boundingBox":[
- 1146,
- 2221,
- 1297.3,
- 2223,
- 1296,
- 2319,
- 1144.7,
- 2317
- ],
- "page":1,
- "confidence":0.955,
- "elements":[
- "#/readResults/0/lines/13/words/1"
- ]
- },
- "Tax":{
- "type":"number",
- "valueNumber":1.17,
- "text":"1.17",
- "boundingBox":[
- 1190,
- 2359,
- 1304,
- 2359,
- 1304,
- 2456,
- 1190,
- 2456
- ],
- "page":1,
- "confidence":0.979,
- "elements":[
- "#/readResults/0/lines/15/words/1"
- ]
- },
- "Tip":{
- "type":"number",
- "valueNumber":1.63,
- "text":"1.63",
- "boundingBox":[
- 1094,
- 2479,
- 1267.7,
- 2485,
- 1264,
- 2591,
- 1090.3,
- 2585
- ],
- "page":1,
- "confidence":0.941,
- "elements":[
- "#/readResults/0/lines/17/words/1"
- ]
- },
- "Total":{
- "type":"number",
- "valueNumber":14.5,
- "text":"$14.50",
- "boundingBox":[
- 1034.2,
- 2617,
- 1387.5,
- 2638.2,
- 1380,
- 2763,
- 1026.7,
- 2741.8
- ],
- "page":1,
- "confidence":0.985,
- "elements":[
- "#/readResults/0/lines/19/words/0"
- ]
- }
- }
- }
- ]
- }
-}
-```
-
-## Analyze business cards
-
-# [v2.0](#tab/v2-0)
-
-> [!IMPORTANT]
-> This feature isn't available in the selected API version.
-
-# [v2.1 preview](#tab/v2-1)
-
-To start analyzing a business card, you call the **[Analyze Business Card](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/AnalyzeBusinessCardAsync)** API using the cURL command below. Before you run the command, make these changes:
-
-1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
-1. Replace `{your receipt URL}` with the URL address of a receipt image.
-1. Replace `{subscription key}` with the subscription key you copied from the previous step.
-
-```bash
-curl -i -X POST "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/businessCard/analyze" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \"{your receipt URL}\"}"
-```
-
-You'll receive a `202 (Success)` response that includes am **Operation-Location** header. The value of this header contains an operation ID that you can use to query the status of the asynchronous operation and get the results.
-
-```console
-https://cognitiveservice/formrecognizer/v2.1-preview.2/prebuilt/businessCard/analyzeResults/54f0b076-4e38-43e5-81bd-b85b8835fdfb
-```
-
-### Get business card results
-
-After you've called the **Analyze Business Card** API, you call the **[Get Analyze Business Card Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/GetAnalyzeBusinessCardResult)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
-
-1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
-1. Replace `{resultId}` with the operation ID from the previous step.
-1. Replace `{subscription key}` with your subscription key.
-
-```bash
-curl -v -X GET "https://westcentralus.api.cognitive.microsoft.com/formrecognizer/v2.1-preview.2/prebuilt/businessCard/analyzeResults/{resultId}"
--H "Ocp-Apim-Subscription-Key: {subscription key}"
-```
-
-### Examine the response
-
-You'll receive a `200 (Success)` response with JSON output. The `"readResults"` node contains all of the recognized text. Text is organized by page, then by line, then by individual words. The `"documentResults"` node contains the business-card-specific values that the model discovered. This is where you'll find useful contact information like the company name, first name, last name, phone number, and so on.
-
-![A business card from Contoso company](../../media/business-card-english.jpg)
-
-This sample illustrates the JSON output returned by Form Recognizer. It has been truncated for readability.
-
-```json
-{
- "status": "succeeded",
- "createdDateTime": "2020-06-04T08:19:29Z",
- "lastUpdatedDateTime": "2020-06-04T08:19:35Z",
- "analyzeResult": {
- "version": "2.1.1",
- "readResults": [
- {
- "page": 1,
- "angle": -17.0956,
- "width": 4032,
- "height": 3024,
- "unit": "pixel"
- }
- ],
- "documentResults": [
- {
- "docType": "prebuilt:businesscard",
- "pageRange": [
- 1,
- 1
- ],
- "fields": {
- "ContactNames": {
- "type": "array",
- "valueArray": [
- {
- "type": "object",
- "valueObject": {
- "FirstName": {
- "type": "string",
- "valueString": "Avery",
- "text": "Avery",
- "boundingBox": [
- 703,
- 1096,
- 1134,
- 989,
- 1165,
- 1109,
- 733,
- 1206
- ],
- "page": 1
- },
- "text": "Dr. Avery Smith",
- "boundingBox": [
- 419.3,
- 1154.6,
- 1589.6,
- 877.9,
- 1618.9,
- 1001.7,
- 448.6,
- 1278.4
- ],
- "confidence": 0.993
- }
- ]
- },
- "Emails": {
- "type": "array",
- "valueArray": [
- {
- "type": "string",
- "valueString": "avery.smith@contoso.com",
- "text": "avery.smith@contoso.com",
- "boundingBox": [
- 2107,
- 934,
- 2917,
- 696,
- 2935,
- 764,
- 2126,
- 995
- ],
- "page": 1,
- "confidence": 0.99
- }
- ]
- },
- "Websites": {
- "type": "array",
- "valueArray": [
- {
- "type": "string",
- "valueString": "https://www.contoso.com/",
- "text": "https://www.contoso.com/",
- "boundingBox": [
- 2121,
- 1002,
- 2992,
- 755,
- 3014,
- 826,
- 2143,
- 1077
- ],
- "page": 1,
- "confidence": 0.995
- }
- ]
- }
- }
- }
- ]
- }
-}
-```
-
-The script will print responses to the console until the **Analyze Business Card** operation completes.
---
-## Analyze invoices
-
-# [version 2.0](#tab/v2-0)
-
-> [!IMPORTANT]
-> This feature isn't available in the selected API version.
-
-# [version 2.1 preview](#tab/v2-1)
-
-To start analyzing an invoice, call the **[Analyze Invoice](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/5ed8c9843c2794cbb1a96291)** API using the cURL command below. Before you run the command, make these changes:
-
-1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
-1. Replace `{your invoice URL}` with the URL address of an invoice document.
-1. Replace `{subscription key}` with your subscription key.
-
-```bash
-curl -v -X POST "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyze"
--H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \"{your invoice URL}\"}"
-```
-
-You'll receive a `202 (Success)` response that includes an **Operation-Location** header. The value of this header contains an operation ID that you can use to query the status of the asynchronous operation and get the results.
-
-```console
-https://cognitiveservice/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyzeResults/54f0b076-4e38-43e5-81bd-b85b8835fdfb
-```
-
-### Get invoice results
-
-After you've called the **Analyze Invoice** API, you call the **[Get Analyze Invoice Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/5ed8c9acb78c40a2533aee83)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
-
-1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
-1. Replace `{resultId}` with the operation ID from the previous step.
-1. Replace `{subscription key}` with your subscription key.
-
-```bash
-curl -v -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyzeResults/{resultId}"
--H "Ocp-Apim-Subscription-Key: {subscription key}"
-```
-
-### Examine the response
-
-You'll receive a `200 (Success)` response with JSON output. The `"readResults"` field contains every line of text that was extracted from the invoice, the `"pageResults"` field includes the tables and selection marks extracted from the invoice, and the `"documentResults"` field contains key/value information for the most relevant parts of the invoice.
-
-See the following invoice document and its corresponding JSON output. The JSON content has been shortened for readability.
-
-* [Sample invoice](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/tree/master/curl/form-recognizer/sample-invoice.pdf)
-
-```json
-{
- "status": "succeeded",
- "createdDateTime": "2020-11-06T23:32:11Z",
- "lastUpdatedDateTime": "2020-11-06T23:32:20Z",
- "analyzeResult": {
- "version": "2.1.0",
- "readResults": [{
- "page": 1,
- "angle": 0,
- "width": 8.5,
- "height": 11,
- "unit": "inch"
- }],
- "pageResults": [{
- "page": 1,
- "tables": [{
- "rows": 3,
- "columns": 4,
- "cells": [{
- "rowIndex": 0,
- "columnIndex": 0,
- "text": "QUANTITY",
- "boundingBox": [0.4953,
- 5.7306,
- 1.8097,
- 5.7306,
- 1.7942,
- 6.0122,
- 0.4953,
- 6.0122]
- },
- {
- "rowIndex": 0,
- "columnIndex": 1,
- "text": "DESCRIPTION",
- "boundingBox": [1.8097,
- 5.7306,
- 5.7529,
- 5.7306,
- 5.7452,
- 6.0122,
- 1.7942,
- 6.0122]
- },
- ...
- ],
- "boundingBox": [0.4794,
- 5.7132,
- 8.0158,
- 5.714,
- 8.0118,
- 6.5627,
- 0.4757,
- 6.5619]
- },
- {
- "rows": 2,
- "columns": 6,
- "cells": [{
- "rowIndex": 0,
- "columnIndex": 0,
- "text": "SALESPERSON",
- "boundingBox": [0.4979,
- 4.963,
- 1.8051,
- 4.963,
- 1.7975,
- 5.2398,
- 0.5056,
- 5.2398]
- },
- {
- "rowIndex": 0,
- "columnIndex": 1,
- "text": "P.O. NUMBER",
- "boundingBox": [1.8051,
- 4.963,
- 3.3047,
- 4.963,
- 3.3124,
- 5.2398,
- 1.7975,
- 5.2398]
- },
- ...
- ],
- "boundingBox": [0.4976,
- 4.961,
- 7.9959,
- 4.9606,
- 7.9959,
- 5.5204,
- 0.4972,
- 5.5209]
- }]
- }],
- "documentResults": [{
- "docType": "prebuilt:invoice",
- "pageRange": [1,
- 1],
- "fields": {
- "AmountDue": {
- "type": "number",
- "valueNumber": 610,
- "text": "$610.00",
- "boundingBox": [7.3809,
- 7.8153,
- 7.9167,
- 7.8153,
- 7.9167,
- 7.9591,
- 7.3809,
- 7.9591],
- "page": 1,
- "confidence": 0.875
- },
- "BillingAddress": {
- "type": "string",
- "valueString": "123 Bill St, Redmond WA, 98052",
- "text": "123 Bill St, Redmond WA, 98052",
- "boundingBox": [0.594,
- 4.3724,
- 2.0125,
- 4.3724,
- 2.0125,
- 4.7125,
- 0.594,
- 4.7125],
- "page": 1,
- "confidence": 0.997
- },
- "BillingAddressRecipient": {
- "type": "string",
- "valueString": "Microsoft Finance",
- "text": "Microsoft Finance",
- "boundingBox": [0.594,
- 4.1684,
- 1.7907,
- 4.1684,
- 1.7907,
- 4.2837,
- 0.594,
- 4.2837],
- "page": 1,
- "confidence": 0.998
- },
- ...
- }
- }]
- }
-}
-```
---
-## Train a custom model
-
-To train a custom model, you'll need a set of training data in an Azure Storage blob. You should have a minimum of five filled-in forms (PDF documents and/or images) of the same type/structure as your main input data. Or, you can use a single empty form with two filled-in forms. The empty form's file name needs to include the word "empty." See [Build a training data set for a custom model](../../build-training-data-set.md) for tips and options for putting together your training data.
-
-> [!NOTE]
-> You can use the labeled data feature to manually label some or all of your training data beforehand. This is a more complex process but results in a better trained model. See the [Train with labels](../../overview.md#train-with-labels) section of the overview to learn more about this feature.
-
-To train a Form Recognizer model with the documents in your Azure blob container, call the **[Train Custom Model](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2/operations/TrainCustomModelAsync)** API by running the following cURL command. Before you run the command, make these changes:
-
-1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
-1. Replace `{subscription key}` with the subscription key you copied from the previous step.
-1. Replace `{SAS URL}` with the Azure Blob storage container's shared access signature (SAS) URL. [!INCLUDE [get SAS URL](../sas-instructions.md)]
-
- :::image type="content" source="../../media/quickstarts/get-sas-url.png" alt-text="SAS URL retrieval":::
-
-# [v2.0](#tab/v2-0)
-```bash
-curl -i -X POST "https://{Endpoint}/formrecognizer/v2.0/custom/models" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \""{SAS URL}"\"}"
-```
-# [v2.1 preview](#tab/v2-1)
-```bash
-curl -i -X POST "https://{Endpoint}/formrecognizer/v2.1-preview.2/custom/models" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \""{SAS URL}"\"}"
-```
----
-You'll receive a `201 (Success)` response with a **Location** header. The value of this header is the ID of the new model being trained.
-
-### Get training results
-
-After you've started the train operation, you use a new operation, **[Get Custom Model](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2/operations/GetCustomModel)** to check the training status. Pass the model ID into this API call to check the training status:
-
-1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription key.
-1. Replace `{subscription key}` with your subscription key
-1. Replace `{model ID}` with the model ID you received in the previous step
--
-# [v2.0](#tab/v2-0)
-```bash
-curl -X GET "https://{Endpoint}/formrecognizer/v2.0/custom/models/{model ID}" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}"
-```
+ ...
+ ],
+ "tables": [
+ {
+ "rows": 4,
+ "columns": 5,
+ "cells": [
+ {
+ "text": "Training Date",
+ "rowIndex": 0,
+ "columnIndex": 0,
+ "boundingBox": [
+ 0.6931,
+ 4.2444,
+ 1.5681,
+ 4.2444,
+ 1.5681,
+ 4.4125,
+ 0.6931,
+ 4.4125
+ ],
+ "confidence": 1,
+ "rowSpan": 1,
+ "columnSpan": 1,
+ "elements": [
+ "#/readResults/0/lines/15/words/0",
+ "#/readResults/0/lines/15/words/1"
+ ],
+ "isHeader": true,
+ "isFooter": false
+ },
+ ...
+ ]
+ }
+ ],
+ "clusterId": 0
+ }
+ ],
+ "documentResults": [],
+ "errors": []
+ }
+}
+```
# [v2.1 preview](#tab/v2-1)
-```bash
-curl -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/custom/models/{model ID}" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}"
-```
-
--
-You'll receive a `200 (Success)` response with a JSON body in the following format. Notice the `"status"` field. This will have the value `"ready"` once training is complete. If the model is not finished training, you'll need to query the service again by rerunning the command. We recommend an interval of one second or more between calls.
-
-The `"modelId"` field contains the ID of the model you're training. You'll need this for the next step.
-
-```json
+```JSON
{
- "modelInfo":{
- "status":"ready",
- "createdDateTime":"2019-10-08T10:20:31.957784",
- "lastUpdatedDateTime":"2019-10-08T14:20:41+00:00",
- "modelId":"1cfb372bab404ba3aa59481ab2c63da5"
- },
- "trainResult":{
- "trainingDocuments":[
- {
- "documentName":"invoices\\Invoice_1.pdf",
- "pages":1,
- "errors":[
-
- ],
- "status":"succeeded"
- },
- {
- "documentName":"invoices\\Invoice_2.pdf",
- "pages":1,
- "errors":[
-
- ],
- "status":"succeeded"
- },
- {
- "documentName":"invoices\\Invoice_3.pdf",
- "pages":1,
- "errors":[
-
- ],
- "status":"succeeded"
- },
- {
- "documentName":"invoices\\Invoice_4.pdf",
- "pages":1,
- "errors":[
-
- ],
- "status":"succeeded"
- },
+ "status": "succeeded",
+ "createdDateTime": "2020-08-21T01:13:28Z",
+ "lastUpdatedDateTime": "2020-08-21T01:13:42Z",
+ "analyzeResult": {
+ "version": "2.1.0",
+ "readResults": [
{
- "documentName":"invoices\\Invoice_5.pdf",
- "pages":1,
- "errors":[
-
- ],
- "status":"succeeded"
+ "page": 1,
+ "angle": 0,
+ "width": 8.5,
+ "height": 11,
+ "unit": "inch",
+ "lines": [
+ {
+ "text": "Project Statement",
+ "boundingBox": [
+ 5.0444,
+ 0.3613,
+ 8.0917,
+ 0.3613,
+ 8.0917,
+ 0.6718,
+ 5.0444,
+ 0.6718
+ ],
+ "words": [
+ {
+ "text": "Project",
+ "boundingBox": [
+ 5.0444,
+ 0.3587,
+ 6.2264,
+ 0.3587,
+ 6.2264,
+ 0.708,
+ 5.0444,
+ 0.708
+ ]
+ },
+ {
+ "text": "Statement",
+ "boundingBox": [
+ 6.3361,
+ 0.3635,
+ 8.0917,
+ 0.3635,
+ 8.0917,
+ 0.6396,
+ 6.3361,
+ 0.6396
+ ]
+ }
+ ]
+ },
+ ...
+ ]
+      }
+    ],
- "errors":[
-
- ]
- },
- "keys":{
- "0":[
- "Address:",
- "Invoice For:",
- "Microsoft",
- "Page"
- ]
+ "pageResults": [
+ {
+ "page": 1,
+ "keyValuePairs": [
+ {
+ "key": {
+ "text": "Date:",
+ "boundingBox": [
+ 6.9833,
+ 1.0615,
+ 7.3333,
+ 1.0615,
+ 7.3333,
+ 1.1649,
+ 6.9833,
+ 1.1649
+ ],
+ "elements": [
+ "#/readResults/0/lines/2/words/0"
+ ]
+ },
+ "value": {
+ "text": "9/10/2020",
+ "boundingBox": [
+ 7.3833,
+ 1.0802,
+ 7.925,
+ 1.0802,
+ 7.925,
+ 1.174,
+ 7.3833,
+ 1.174
+ ],
+ "elements": [
+ "#/readResults/0/lines/3/words/0"
+ ]
+ },
+ "confidence": 1
+ },
+ ...
+ ],
+ "tables": [
+ {
+ "rows": 5,
+ "columns": 5,
+ "cells": [
+ {
+ "text": "Training Date",
+ "rowIndex": 0,
+ "columnIndex": 0,
+ "boundingBox": [
+ 0.6944,
+ 4.2779,
+ 1.5625,
+ 4.2779,
+ 1.5625,
+ 4.4005,
+ 0.6944,
+ 4.4005
+ ],
+ "confidence": 1,
+ "rowSpan": 1,
+ "columnSpan": 1,
+ "elements": [
+ "#/readResults/0/lines/15/words/0",
+ "#/readResults/0/lines/15/words/1"
+ ],
+ "isHeader": true,
+ "isFooter": false
+ },
+ ...
+ ]
+ }
+ ],
+ "clusterId": 0
+ }
+ ],
+ "documentResults": [],
+ "errors": []
+   }
+}
-```
+```
+
-## Analyze forms with a custom model
+### Improve results
-Next, you'll use your newly trained model to analyze a document and extract key-value pairs and tables from it. Call the **[Analyze Form](https://westus2.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2/operations/AnalyzeWithCustomForm)** API by running the following cURL command. Before you run the command, make these changes:
-1. Replace `{Endpoint}` with the endpoint that you obtained from your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
-1. Replace `{model ID}` with the model ID that you received in the previous section.
-1. Replace `{SAS URL}` with an SAS URL to your file in Azure storage. Follow the steps in the Training section, but instead of getting a SAS URL for the whole blob container, get one for the specific file you want to analyze.
-1. Replace `{subscription key}` with your subscription key.
+## Analyze receipts
+
+This section demonstrates how to analyze and extract common fields from US receipts, using a pre-trained receipt model. For more information about receipt analysis, see the [Receipts conceptual guide](../../concept-receipts.md). To start analyzing a receipt, call the **[Analyze Receipt](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/AnalyzeReceiptAsync)** API using the cURL command below. Before you run the command, make these changes:
-# [v2.0](#tab/v2-0)
+1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
+1. Replace `{your receipt URL}` with the URL address of a receipt image.
+1. Replace `{subscription key}` with the subscription key you copied from the previous step.
+
+# [v2.0](#tab/v2-0)
```bash
-curl -v "https://{Endpoint}/formrecognizer/v2.0/custom/models/{model ID}/analyze" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" -d "{ \"source\": \""{SAS URL}"\" } "
+curl -i -X POST "https://{Endpoint}/formrecognizer/v2.0/prebuilt/receipt/analyze" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \"{your receipt URL}\"}"
```

# [v2.1 preview](#tab/v2-1)
+
```bash
-curl -v "https://{Endpoint}/formrecognizer/v2.1-preview.2/custom/models/{model ID}/analyze" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" -d "{ \"source\": \""{SAS URL}"\" } "
+curl -i -X POST "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/receipt/analyze" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \"{your receipt URL}\"}"
```
-
+You'll receive a `202 (Success)` response that includes am **Operation-Location** header. The value of this header contains an operation ID that you can use to query the status of the asynchronous operation and get the results. In the following example, the string after `operations/` is the operation ID.
+```console
+https://cognitiveservice/formrecognizer/v2.1-preview.2/prebuilt/receipt/operations/54f0b076-4e38-43e5-81bd-b85b8835fdfb
+```
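The operation ID can also be pulled out programmatically. The following is a minimal Python sketch, assuming the v2.1-preview.2 `analyzeResults` path used by the Get Analyze Receipt Result call in this quickstart; the `result_url` helper name is illustrative, not part of the service API.

```python
# Derive the operation ID from an Operation-Location header value,
# then build the URL used to fetch the analysis results.
from urllib.parse import urlsplit

def result_url(operation_location: str, endpoint: str) -> str:
    # The operation ID is the last path segment of Operation-Location.
    operation_id = urlsplit(operation_location).path.rstrip("/").split("/")[-1]
    return (f"{endpoint}/formrecognizer/v2.1-preview.2/"
            f"prebuilt/receipt/analyzeResults/{operation_id}")

# Sample Operation-Location value from this quickstart:
loc = ("https://cognitiveservice/formrecognizer/v2.1-preview.2/"
       "prebuilt/receipt/operations/54f0b076-4e38-43e5-81bd-b85b8835fdfb")
print(result_url(loc, "https://{Endpoint}"))
```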
-You'll receive a `202 (Success)` response with an **Operation-Location** header. The value of this header includes a results ID you use to track the results of the Analyze operation. Save this results ID for the next step.
-
-### Get the Analyze results
+### Get the receipt results
-Use the following API to query the results of the Analyze operation.
+After you've called the **Analyze Receipt** API, you call the **[Get Analyze Receipt Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/GetAnalyzeReceiptResult)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
-1. Replace `{Endpoint}` with the endpoint that you obtained from your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
-1. Replace `{result ID}` with the ID that you received in the previous section.
+1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
+1. Replace `{operationId}` with the operation ID from the previous step.
1. Replace `{subscription key}` with your subscription key.
-# [v2.0](#tab/v2-0)
+# [v2.0](#tab/v2-0)
```bash
-curl -X GET "https://{Endpoint}/formrecognizer/v2.0/custom/models/{model ID}/analyzeResults/{result ID}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -X GET "https://{Endpoint}/formrecognizer/v2.0/prebuilt/receipt/analyzeResults/{operationId}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
```
-# [v2.1 preview](#tab/v2-1)
+# [v2.1 preview](#tab/v2-1)
```bash
-curl -X GET "https://{Endpoint}/formrecognizer/v2.1-preview/custom/models/{model ID}/analyzeResults/{result ID}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
+curl -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/receipt/analyzeResults/{operationId}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
```
-You'll receive a `200 (Success)` response with a JSON body in the following format. The output has been shortened for simplicity. Notice the `"status"` field near the bottom. This will have the value `"succeeded"` when the Analyze operation is complete. If the Analyze operation hasn't completed, you'll need to query the service again by rerunning the command. We recommend an interval of one second or more between calls.
+### Examine the response
-The main key/value pair associations and tables are in the `"pageResults"` node. If you also specified plain text extraction through the *includeTextDetails* URL parameter, then the `"readResults"` node will show the content and positions of all the text in the document.
+You'll receive a `200 (Success)` response with JSON output. The first field, `"status"`, indicates the status of the operation. If the operation is complete, the `"readResults"` field contains every line of text that was extracted from the receipt, and the `"documentResults"` field contains key/value information for the most relevant parts of the receipt. If the operation is not complete, the value of `"status"` will be `"running"` or `"notStarted"`, and you should call the API again, either manually or through a script. We recommend an interval of one second or more between calls.
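The recommended retry pattern can be sketched as a small Python polling loop. Here `fetch` is a stand-in for whatever issues the HTTP GET and parses the JSON body (cURL in a subprocess, an HTTP library, and so on); the function and parameter names are illustrative, not part of the service API.

```python
import time

def poll_until_done(fetch, interval=1.0, max_tries=30):
    """Call `fetch` until the operation leaves the running/notStarted states.

    `fetch` is any zero-argument callable returning the parsed JSON body of
    the Get Analyze Receipt Result call; the service recommends an interval
    of one second or more between calls.
    """
    for _ in range(max_tries):
        body = fetch()
        if body.get("status") not in ("running", "notStarted"):
            return body  # "succeeded" (or a failure status)
        time.sleep(interval)
    raise TimeoutError("analysis did not finish in time")

# Simulated responses standing in for successive HTTP GETs:
responses = iter([{"status": "running"},
                  {"status": "succeeded", "analyzeResult": {}}])
result = poll_until_done(lambda: next(responses), interval=0)
print(result["status"])
```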
-This sample JSON output has been shortened for simplicity.
+See the following receipt image and its corresponding JSON output. The output has been shortened for readability.
-# [v2.0](#tab/v2-0)
-```JSON
+![A receipt from Contoso store](../../media/contoso-allinone.jpg)
+
+The `"readResults"` node contains all of the recognized text (if you set the optional *includeTextDetails* parameter to `true`). Text is organized by page, then by line, then by individual words. The `"documentResults"` node contains the receipt-specific values that the model discovered. This is where you'll find useful key/value pairs like the tax, total, merchant address, and so on.
+
+```json
{
- "status": "succeeded",
- "createdDateTime": "2020-08-21T00:46:25Z",
- "lastUpdatedDateTime": "2020-08-21T00:46:32Z",
- "analyzeResult": {
- "version": "2.0.0",
- "readResults": [
+ "status":"succeeded",
+ "createdDateTime":"2019-12-17T04:11:24Z",
+ "lastUpdatedDateTime":"2019-12-17T04:11:32Z",
+ "analyzeResult":{
+ "version":"2.1.0",
+ "readResults":[
{
- "page": 1,
- "angle": 0,
- "width": 8.5,
- "height": 11,
- "unit": "inch",
- "lines": [
+ "page":1,
+ "angle":0.6893,
+ "width":1688,
+ "height":3000,
+ "unit":"pixel",
+ "language":"en",
+ "lines":[
{
- "text": "Project Statement",
- "boundingBox": [
- 5.0153,
- 0.275,
- 8.0944,
- 0.275,
- 8.0944,
- 0.7125,
- 5.0153,
- 0.7125
+ "text":"Contoso",
+ "boundingBox":[
+ 635,
+ 510,
+ 1086,
+ 461,
+ 1098,
+ 558,
+ 643,
+ 604
],
- "words": [
- {
- "text": "Project",
- "boundingBox": [
- 5.0153,
- 0.275,
- 6.2278,
- 0.275,
- 6.2278,
- 0.7125,
- 5.0153,
- 0.7125
- ]
- },
+ "words":[
{
- "text": "Statement",
- "boundingBox": [
- 6.3292,
- 0.275,
- 8.0944,
- 0.275,
- 8.0944,
- 0.7125,
- 6.3292,
- 0.7125
- ]
+ "text":"Contoso",
+ "boundingBox":[
+ 639,
+ 510,
+ 1087,
+ 461,
+ 1098,
+ 551,
+ 646,
+ 604
+ ],
+ "confidence":0.955
} ]
- },
- ...
+ },
+ ...
+         ]
+      }
+   ],
- "pageResults": [
+ "documentResults":[
{
- "page": 1,
- "keyValuePairs": [
- {
- "key": {
- "text": "Date:",
- "boundingBox": [
- 6.9722,
- 1.0264,
- 7.3417,
- 1.0264,
- 7.3417,
- 1.1931,
- 6.9722,
- 1.1931
- ],
- "elements": [
- "#/readResults/0/lines/2/words/0"
- ]
- },
- "confidence": 1
+ "docType":"prebuilt:receipt",
+ "pageRange":[
+ 1,
+ 1
+ ],
+ "fields":{
+ "ReceiptType":{
+ "type":"string",
+ "valueString":"Itemized",
+ "confidence":0.692
+ },
+ "MerchantName":{
+ "type":"string",
+ "valueString":"Contoso Contoso",
+ "text":"Contoso Contoso",
+ "boundingBox":[
+ 378.2,
+ 292.4,
+ 1117.7,
+ 468.3,
+ 1035.7,
+ 812.7,
+ 296.3,
+ 636.8
+ ],
+ "page":1,
+ "confidence":0.613,
+ "elements":[
+ "#/readResults/0/lines/0/words/0",
+ "#/readResults/0/lines/1/words/0"
+ ]
+ },
+ "MerchantAddress":{
+ "type":"string",
+ "valueString":"123 Main Street Redmond, WA 98052",
+ "text":"123 Main Street Redmond, WA 98052",
+ "boundingBox":[
+ 302,
+ 675.8,
+ 848.1,
+ 793.7,
+ 809.9,
+ 970.4,
+ 263.9,
+ 852.5
+ ],
+ "page":1,
+ "confidence":0.99,
+ "elements":[
+ "#/readResults/0/lines/2/words/0",
+ "#/readResults/0/lines/2/words/1",
+ "#/readResults/0/lines/2/words/2",
+ "#/readResults/0/lines/3/words/0",
+ "#/readResults/0/lines/3/words/1",
+ "#/readResults/0/lines/3/words/2"
+ ]
+ },
+ "MerchantPhoneNumber":{
+ "type":"phoneNumber",
+ "valuePhoneNumber":"+19876543210",
+ "text":"987-654-3210",
+ "boundingBox":[
+ 278,
+ 1004,
+ 656.3,
+ 1054.7,
+ 646.8,
+ 1125.3,
+ 268.5,
+ 1074.7
+ ],
+ "page":1,
+ "confidence":0.99,
+ "elements":[
+ "#/readResults/0/lines/4/words/0"
+ ]
+ },
+ "TransactionDate":{
+ "type":"date",
+ "valueDate":"2019-06-10",
+ "text":"6/10/2019",
+ "boundingBox":[
+ 265.1,
+ 1228.4,
+ 525,
+ 1247,
+ 518.9,
+ 1332.1,
+ 259,
+ 1313.5
+ ],
+ "page":1,
+ "confidence":0.99,
+ "elements":[
+ "#/readResults/0/lines/5/words/0"
+ ]
+ },
+ "TransactionTime":{
+ "type":"time",
+ "valueTime":"13:59:00",
+ "text":"13:59",
+ "boundingBox":[
+ 541,
+ 1248,
+ 677.3,
+ 1261.5,
+ 668.9,
+ 1346.5,
+ 532.6,
+ 1333
+ ],
+ "page":1,
+ "confidence":0.977,
+ "elements":[
+ "#/readResults/0/lines/5/words/1"
+ ]
},
- ...
- ],
- "tables": [
- {
- "rows": 4,
- "columns": 5,
- "cells": [
+ "Items":{
+ "type":"array",
+ "valueArray":[
{
- "text": "Training Date",
- "rowIndex": 0,
- "columnIndex": 0,
- "boundingBox": [
- 0.6931,
- 4.2444,
- 1.5681,
- 4.2444,
- 1.5681,
- 4.4125,
- 0.6931,
- 4.4125
- ],
- "confidence": 1,
- "rowSpan": 1,
- "columnSpan": 1,
- "elements": [
- "#/readResults/0/lines/15/words/0",
- "#/readResults/0/lines/15/words/1"
- ],
- "isHeader": true,
- "isFooter": false
+ "type":"object",
+ "valueObject":{
+ "Quantity":{
+ "type":"number",
+ "text":"1",
+ "boundingBox":[
+ 245.1,
+ 1581.5,
+ 300.9,
+ 1585.1,
+ 295,
+ 1676,
+ 239.2,
+ 1672.4
+ ],
+ "page":1,
+ "confidence":0.92,
+ "elements":[
+ "#/readResults/0/lines/7/words/0"
+ ]
+ },
+ "Name":{
+ "type":"string",
+ "valueString":"Cappuccino",
+ "text":"Cappuccino",
+ "boundingBox":[
+ 322,
+ 1586,
+ 654.2,
+ 1601.1,
+ 650,
+ 1693,
+ 317.8,
+ 1678
+ ],
+ "page":1,
+ "confidence":0.923,
+ "elements":[
+ "#/readResults/0/lines/7/words/1"
+ ]
+ },
+ "TotalPrice":{
+ "type":"number",
+ "valueNumber":2.2,
+ "text":"$2.20",
+ "boundingBox":[
+ 1107.7,
+ 1584,
+ 1263,
+ 1574,
+ 1268.3,
+ 1656,
+ 1113,
+ 1666
+ ],
+ "page":1,
+ "confidence":0.918,
+ "elements":[
+ "#/readResults/0/lines/8/words/0"
+ ]
+ }
+ }
+               },
+               ...
+            ]
- }
- ],
- "clusterId": 0
- }
- ],
- "documentResults": [],
- "errors": []
- }
-}
-```
-# [v2.1 preview](#tab/v2-1)
-```JSON
-{
- "status": "succeeded",
- "createdDateTime": "2020-08-21T01:13:28Z",
- "lastUpdatedDateTime": "2020-08-21T01:13:42Z",
- "analyzeResult": {
- "version": "2.1.0",
- "readResults": [
- {
- "page": 1,
- "angle": 0,
- "width": 8.5,
- "height": 11,
- "unit": "inch",
- "lines": [
- {
- "text": "Project Statement",
- "boundingBox": [
- 5.0444,
- 0.3613,
- 8.0917,
- 0.3613,
- 8.0917,
- 0.6718,
- 5.0444,
- 0.6718
+ },
+ "Subtotal":{
+ "type":"number",
+ "valueNumber":11.7,
+ "text":"11.70",
+ "boundingBox":[
+ 1146,
+ 2221,
+ 1297.3,
+ 2223,
+ 1296,
+ 2319,
+ 1144.7,
+ 2317
],
- "words": [
- {
- "text": "Project",
- "boundingBox": [
- 5.0444,
- 0.3587,
- 6.2264,
- 0.3587,
- 6.2264,
- 0.708,
- 5.0444,
- 0.708
- ]
- },
- {
- "text": "Statement",
- "boundingBox": [
- 6.3361,
- 0.3635,
- 8.0917,
- 0.3635,
- 8.0917,
- 0.6396,
- 6.3361,
- 0.6396
- ]
- }
+ "page":1,
+ "confidence":0.955,
+ "elements":[
+ "#/readResults/0/lines/13/words/1"
+ ]
+ },
+ "Tax":{
+ "type":"number",
+ "valueNumber":1.17,
+ "text":"1.17",
+ "boundingBox":[
+ 1190,
+ 2359,
+ 1304,
+ 2359,
+ 1304,
+ 2456,
+ 1190,
+ 2456
+ ],
+ "page":1,
+ "confidence":0.979,
+ "elements":[
+ "#/readResults/0/lines/15/words/1"
+ ]
+ },
+ "Tip":{
+ "type":"number",
+ "valueNumber":1.63,
+ "text":"1.63",
+ "boundingBox":[
+ 1094,
+ 2479,
+ 1267.7,
+ 2485,
+ 1264,
+ 2591,
+ 1090.3,
+ 2585
+ ],
+ "page":1,
+ "confidence":0.941,
+ "elements":[
+ "#/readResults/0/lines/17/words/1"
]
- },
- ...
- ]
- }
- ],
- "pageResults": [
- {
- "page": 1,
- "keyValuePairs": [
- {
- "key": {
- "text": "Date:",
- "boundingBox": [
- 6.9833,
- 1.0615,
- 7.3333,
- 1.0615,
- 7.3333,
- 1.1649,
- 6.9833,
- 1.1649
- ],
- "elements": [
- "#/readResults/0/lines/2/words/0"
- ]
- },
- "value": {
- "text": "9/10/2020",
- "boundingBox": [
- 7.3833,
- 1.0802,
- 7.925,
- 1.0802,
- 7.925,
- 1.174,
- 7.3833,
- 1.174
- ],
- "elements": [
- "#/readResults/0/lines/3/words/0"
- ]
- },
- "confidence": 1
},
- ...
- ],
- "tables": [
- {
- "rows": 5,
- "columns": 5,
- "cells": [
- {
- "text": "Training Date",
- "rowIndex": 0,
- "columnIndex": 0,
- "boundingBox": [
- 0.6944,
- 4.2779,
- 1.5625,
- 4.2779,
- 1.5625,
- 4.4005,
- 0.6944,
- 4.4005
- ],
- "confidence": 1,
- "rowSpan": 1,
- "columnSpan": 1,
- "elements": [
- "#/readResults/0/lines/15/words/0",
- "#/readResults/0/lines/15/words/1"
- ],
- "isHeader": true,
- "isFooter": false
- },
- ...
+ "Total":{
+ "type":"number",
+ "valueNumber":14.5,
+ "text":"$14.50",
+ "boundingBox":[
+ 1034.2,
+ 2617,
+ 1387.5,
+ 2638.2,
+ 1380,
+ 2763,
+ 1026.7,
+ 2741.8
+ ],
+ "page":1,
+ "confidence":0.985,
+ "elements":[
+ "#/readResults/0/lines/19/words/0"
] }
- ],
- "clusterId": 0
+ }
}
- ],
- "documentResults": [],
- "errors": []
+ ]
+   }
+}
-```
-
+```
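In a response shaped like the JSON above, each entry under `"fields"` stores its value under a type-specific key (`valueString`, `valueNumber`, `valueDate`, and so on). A hedged Python sketch of flattening the first document's fields into plain values, using numbers taken from the truncated sample; the `receipt_fields` helper is illustrative, not an SDK call.

```python
def receipt_fields(analysis: dict) -> dict:
    """Flatten the typed fields of the first document into plain Python values."""
    fields = analysis["analyzeResult"]["documentResults"][0]["fields"]
    out = {}
    for name, field in fields.items():
        # Each field stores its value under a type-specific key, e.g.
        # valueString, valueNumber, valueDate, valueTime, valuePhoneNumber.
        value_key = next((k for k in field if k.startswith("value")), None)
        out[name] = field[value_key] if value_key else field.get("text")
    return out

# Values from the truncated sample response above:
sample = {
    "analyzeResult": {
        "documentResults": [{
            "fields": {
                "MerchantName": {"type": "string",
                                 "valueString": "Contoso Contoso",
                                 "confidence": 0.613},
                "Total": {"type": "number", "valueNumber": 14.5,
                          "text": "$14.50", "confidence": 0.985},
            }
        }]
    }
}
print(receipt_fields(sample))
```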
-### Improve results
+## Analyze business cards
+# [v2.0](#tab/v2-0)
+
+> [!IMPORTANT]
+> This feature isn't available in the selected API version.
+
+# [v2.1 preview](#tab/v2-1)
+
+This section demonstrates how to analyze and extract common fields from English business cards, using a pre-trained model. For more information about business card analysis, see the [Business cards conceptual guide](../../concept-business-cards.md). To start analyzing a business card, you call the **[Analyze Business Card](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/AnalyzeBusinessCardAsync)** API using the cURL command below. Before you run the command, make these changes:
+
+1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
+1. Replace `{your business card URL}` with the URL address of a business card image.
+1. Replace `{subscription key}` with the subscription key you copied from the previous step.
+
+```bash
+curl -i -X POST "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/businessCard/analyze" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: {subscription key}" --data-ascii "{ \"source\": \"{your business card URL}\"}"
+```
+
+You'll receive a `202 (Success)` response that includes an **Operation-Location** header. The value of this header contains an operation ID that you can use to query the status of the asynchronous operation and get the results.
+
+```console
+https://cognitiveservice/formrecognizer/v2.1-preview.2/prebuilt/businessCard/analyzeResults/54f0b076-4e38-43e5-81bd-b85b8835fdfb
+```
+
+### Get business card results
+
+After you've called the **Analyze Business Card** API, you call the **[Get Analyze Business Card Result](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/GetAnalyzeBusinessCardResult)** API to get the status of the operation and the extracted data. Before you run the command, make these changes:
+
+1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription key. You can find it on your Form Recognizer resource **Overview** tab.
+1. Replace `{resultId}` with the operation ID from the previous step.
+1. Replace `{subscription key}` with your subscription key.
+
+```bash
+curl -v -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/prebuilt/businessCard/analyzeResults/{resultId}"
+-H "Ocp-Apim-Subscription-Key: {subscription key}"
+```
+
+### Examine the response
+
+You'll receive a `200 (Success)` response with JSON output. The `"readResults"` node contains all of the recognized text. Text is organized by page, then by line, then by individual words. The `"documentResults"` node contains the business-card-specific values that the model discovered. This is where you'll find useful contact information like the company name, first name, last name, phone number, and so on.
+
+![A business card from Contoso company](../../media/business-card-english.jpg)
+
+This sample illustrates the JSON output returned by Form Recognizer. It has been truncated for readability.
+
+```json
+{
+ "status": "succeeded",
+ "createdDateTime": "2020-06-04T08:19:29Z",
+ "lastUpdatedDateTime": "2020-06-04T08:19:35Z",
+ "analyzeResult": {
+ "version": "2.1.1",
+ "readResults": [
+ {
+ "page": 1,
+ "angle": -17.0956,
+ "width": 4032,
+ "height": 3024,
+ "unit": "pixel"
+ }
+ ],
+ "documentResults": [
+ {
+ "docType": "prebuilt:businesscard",
+ "pageRange": [
+ 1,
+ 1
+ ],
+ "fields": {
+ "ContactNames": {
+ "type": "array",
+ "valueArray": [
+ {
+ "type": "object",
+ "valueObject": {
+ "FirstName": {
+ "type": "string",
+ "valueString": "Avery",
+ "text": "Avery",
+ "boundingBox": [
+ 703,
+ 1096,
+ 1134,
+ 989,
+ 1165,
+ 1109,
+ 733,
+ 1206
+ ],
+ "page": 1
+ },
+ "text": "Dr. Avery Smith",
+ "boundingBox": [
+ 419.3,
+ 1154.6,
+ 1589.6,
+ 877.9,
+ 1618.9,
+ 1001.7,
+ 448.6,
+ 1278.4
+ ],
+ "confidence": 0.993
+ }
+ ]
+ },
+ "Emails": {
+ "type": "array",
+ "valueArray": [
+ {
+ "type": "string",
+ "valueString": "avery.smith@contoso.com",
+ "text": "avery.smith@contoso.com",
+ "boundingBox": [
+ 2107,
+ 934,
+ 2917,
+ 696,
+ 2935,
+ 764,
+ 2126,
+ 995
+ ],
+ "page": 1,
+ "confidence": 0.99
+ }
+ ]
+ },
+ "Websites": {
+ "type": "array",
+ "valueArray": [
+ {
+ "type": "string",
+ "valueString": "https://www.contoso.com/",
+ "text": "https://www.contoso.com/",
+ "boundingBox": [
+ 2121,
+ 1002,
+ 2992,
+ 755,
+ 3014,
+ 826,
+ 2143,
+ 1077
+ ],
+ "page": 1,
+ "confidence": 0.995
+ }
+ ]
+ }
+ }
+ }
+ ]
+ }
+}
+```
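The `"documentResults"` node above can be reduced to plain contact data in the same way. A minimal Python sketch against the shape of the truncated sample JSON; the `contacts` helper name is illustrative, not an SDK call.

```python
def contacts(analysis: dict) -> dict:
    """Pull first names and email addresses out of a business card result."""
    fields = analysis["analyzeResult"]["documentResults"][0]["fields"]
    names = [item["valueObject"].get("FirstName", {}).get("valueString")
             for item in fields.get("ContactNames", {}).get("valueArray", [])]
    emails = [item["valueString"]
              for item in fields.get("Emails", {}).get("valueArray", [])]
    return {"first_names": names, "emails": emails}

# Values from the truncated sample response above:
sample = {
    "analyzeResult": {
        "documentResults": [{
            "fields": {
                "ContactNames": {"type": "array", "valueArray": [
                    {"type": "object",
                     "valueObject": {"FirstName": {"type": "string",
                                                   "valueString": "Avery"}},
                     "text": "Dr. Avery Smith", "confidence": 0.993}]},
                "Emails": {"type": "array", "valueArray": [
                    {"type": "string",
                     "valueString": "avery.smith@contoso.com",
                     "confidence": 0.99}]},
            }
        }]
    }
}
print(contacts(sample))
```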
+
+The script will print responses to the console until the **Analyze Business Card** operation completes.
+

## Manage custom models

### Get a list of custom models
-Use the following command to return a list of all the custom models that belong to your subscription.
+Use the **[List Custom Models](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/GetCustomModels)** API in the following command to return a list of all the custom models that belong to your subscription.
1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
1. Replace `{subscription key}` with the subscription key you copied from the previous step.
@@ -1552,7 +1554,7 @@ You'll receive a `200` success response, with JSON data like the following. The
### Get a specific model
-To retrieve detailed information about a specific custom model, use the following command.
+To retrieve detailed information about a specific custom model, use the **[Get Custom Model](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/GetCustomModel)** API in the following command.
1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
1. Replace `{subscription key}` with the subscription key you copied from the previous step.
@@ -1563,7 +1565,6 @@ To retrieve detailed information about a specific custom model, use the followin
```bash
curl -v -X GET "https://{Endpoint}/formrecognizer/v2.0/custom/models/{modelId}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
```

# [v2.1 preview](#tab/v2-1)
@@ -1571,7 +1572,6 @@ curl -v -X GET "https://{Endpoint}/formrecognizer/v2.0/custom/models/{modelId}"
```bash
curl -v -X GET "https://{Endpoint}/formrecognizer/v2.1-preview.2/custom/models/{modelId}" -H "Ocp-Apim-Subscription-Key: {subscription key}"
```
@@ -1615,6 +1615,32 @@ You'll receive a `200` success response, with JSON data like the following.
}
```
+### Delete a model from the resource account
+
+You can also delete a model from your account by referencing its ID. This command calls the **[Delete Custom Model](https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/DeleteCustomModel)** API to delete the model used in the previous section.
+
+1. Replace `{Endpoint}` with the endpoint that you obtained with your Form Recognizer subscription.
+1. Replace `{subscription key}` with the subscription key you copied from the previous step.
+1. Replace `{modelId}` with the ID of the custom model you want to look up.
+
+# [v2.0](#tab/v2-0)
+
+```bash
+curl -v -X DELETE "https://{Endpoint}/formrecognizer/v2.0/custom/models/{modelId}"
+-H "Ocp-Apim-Subscription-Key: {subscription key}"
+```
+
+# [v2.1 preview](#tab/v2-1)
+
+```bash
+curl -v -X DELETE "https://{Endpoint}/formrecognizer/v2.1-preview.2/custom/models/{modelId}"
+-H "Ocp-Apim-Subscription-Key: {subscription key}"
+```
++
+You'll receive a `204` success response, indicating that your model is marked for deletion. Model artifacts will be removed within 48 hours.
+
## Next steps

In this quickstart, you used the Form Recognizer REST API to train models and analyze forms in different ways. Next, see the reference documentation to explore the Form Recognizer API in more depth.
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/overview.md
@@ -84,13 +84,13 @@ Form Recognizer uses the [Layout API](#layout-api) to learn the expected sizes a
Form Recognizer also includes Prebuilt models for automated data processing of unique form types.

### Prebuilt Invoice model
-The Prebuilt Invoice model extracts data from invoices in a variety of formats and returns structured data. This model extracts key information such as the invoice ID, customer details, vendor details, ship to, bill to, total, tax, subtotal and more. In addition, the prebuilt invoice model is trained to recognize and return all of the text and tables on the invoice. See the [Invoices](./concept-invoices.md) conceptual guide for more info.
+The Prebuilt Invoice model extracts data from invoices in a variety of formats and returns structured data. This model extracts key information such as the invoice ID, customer details, vendor details, ship to, bill to, total, tax, subtotal and more. In addition, the prebuilt invoice model is trained to analyze and return all of the text and tables on the invoice. See the [Invoices](./concept-invoices.md) conceptual guide for more info.
:::image type="content" source="./media/overview-invoices.jpg" alt-text="sample invoice" lightbox="./media/overview-invoices.jpg":::

### Prebuilt Receipt model
-The Prebuilt Receipt model is used for reading English sales receipts from Australia, Canada, Great Britain, India, and the United States&mdash;the type used by restaurants, gas stations, retail, and so on. This model extracts key information such as the time and date of the transaction, merchant information, amounts of taxes, line items, totals and more. In addition, the prebuilt receipt model is trained to recognize and return all of the text on a receipt. See the [Receipts](./concept-receipts.md) conceptual guide for more info.
+The Prebuilt Receipt model is used for reading English sales receipts from Australia, Canada, Great Britain, India, and the United States&mdash;the type used by restaurants, gas stations, retail, and so on. This model extracts key information such as the time and date of the transaction, merchant information, amounts of taxes, line items, totals and more. In addition, the prebuilt receipt model is trained to analyze and return all of the text on a receipt. See the [Receipts](./concept-receipts.md) conceptual guide for more info.
:::image type="content" source="./media/overview-receipt.jpg" alt-text="sample receipt" lightbox="./media/overview-receipt.jpg":::
@@ -168,4 +168,4 @@ As with all the cognitive services, developers using the Form Recognizer service
## Next steps
-Complete a [quickstart](quickstarts/client-library.md) to get started writing a forms processing app with Form Recognizer in the language of your choice.
+Complete a [quickstart](quickstarts/client-library.md) to get started writing a forms processing app with Form Recognizer in the development language of your choice.
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/quickstarts/client-library https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/quickstarts/client-library.md
@@ -18,16 +18,16 @@ keywords: forms processing, automated data processing
# Quickstart: Use the Form Recognizer client library or REST API
-Get started with the Form Recognizer using the language of your choice. Azure Form Recognizer is a cognitive service that lets you build automated data processing software using machine learning technology. Identify and extract text, key/value pairs, selection marks, table data and more from your form documents&mdash;the service outputs structured data that includes the relationships in the original file. You can use Form Recognizer via the REST API or SDK. Follow these steps to install the SDK package and try out the example code for basic tasks.
+Get started with the Form Recognizer using the development language of your choice. Azure Form Recognizer is a cognitive service that lets you build automated data processing software using machine learning technology. Identify and extract text, key/value pairs, selection marks, table data and more from your form documents&mdash;the service outputs structured data that includes the relationships in the original file. You can use Form Recognizer via the REST API or SDK. Follow these steps to install the SDK package and try out the example code for basic tasks.
Use Form Recognizer to:

* [Analyze Layout](#analyze-layout)
-* [Analyze receipts](#analyze-receipts)
-* [Analyze business cards](#analyze-business-cards)
* [Analyze invoices](#analyze-invoices)
* [Train a custom model](#train-a-custom-model)
* [Analyze forms with a custom model](#analyze-forms-with-a-custom-model)
+* [Analyze receipts](#analyze-receipts)
+* [Analyze business cards](#analyze-business-cards)
* [Manage your custom models](#manage-your-custom-models)

::: zone pivot="programming-language-csharp"
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/quickstarts/label-tool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/quickstarts/label-tool.md
@@ -172,7 +172,7 @@ It will also show which tables have been automatically extracted. Click on the t
### Apply labels to text
-Next, you'll create tags (labels) and apply them to the text elements that you want the model to recognize.
+Next, you'll create tags (labels) and apply them to the text elements that you want the model to analyze.
# [v2.0](#tab/v2-0)

1. First, use the tags editor pane to create the tags you'd like to identify.
cognitive-services https://docs.microsoft.com/en-us/azure/cognitive-services/form-recognizer/whats-new https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/form-recognizer/whats-new.md
@@ -206,7 +206,7 @@ The JSON responses for all API calls have new formats. Some keys and values have
## Next steps
-Complete a [quickstart](quickstarts/client-library.md) to get started writing a forms processing app with Form Recognizer in the language of your choice.
+Complete a [quickstart](quickstarts/client-library.md) to get started writing a forms processing app with Form Recognizer in the development language of your choice.
## See also
communication-services https://docs.microsoft.com/en-us/azure/communication-services/concepts/call-flows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/call-flows.md
@@ -12,7 +12,7 @@
-# Call Flows
+# Call flows
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)]
communication-services https://docs.microsoft.com/en-us/azure/communication-services/concepts/chat/sdk-features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/chat/sdk-features.md
@@ -49,6 +49,17 @@ The following list presents the set of features which are currently available in
| | Monitor the quality and status of API requests made by your app and configure alerts via the portal | ✔️ | ✔️ | ✔️ | ✔️ |
| Additional features | Use [Cognitive Services APIs](../../../cognitive-services/index.yml) along with chat client library to enable intelligent features - *language translation & sentiment analysis of the incoming message on a client, speech to text conversion to compose a message while the member speaks, etc.* | ✔️ | ✔️ | ✔️ | ✔️ |
+## JavaScript chat client library support by OS and browser
+
+The following table lists the browsers and versions that are currently supported.
+
+| | Windows | macOS | Ubuntu | Linux | Android | iOS | iPad OS|
+| --- | --- | --- | --- | --- | --- | --- | --- |
+| **Chat client library** | Firefox*, Chrome*, new Edge | Firefox*, Chrome*, Safari* | Chrome* | Chrome* | Chrome* | Safari* | Safari* |
++
+*Note that the latest version is supported in addition to the previous two releases.<br/>
+
## Next steps

> [!div class="nextstepaction"]
@@ -56,4 +67,4 @@ The following list presents the set of features which are currently available in
The following documents may be interesting to you: -- Familiarize yourself with [chat concepts](../chat/concepts.md)
+- Familiarize yourself with [chat concepts](../chat/concepts.md)
communication-services https://docs.microsoft.com/en-us/azure/communication-services/concepts/detailed-call-flows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/detailed-call-flows.md
@@ -0,0 +1,199 @@
+
+ Title: Call flow topologies in Azure Communication Services
+
+description: Learn about call flow topologies in Azure Communication Services.
+Last updated: 12/11/2020
+# Call flow topologies
+This article describes Azure Communication Services call flow topologies. This is a great article to review if you're an enterprise customer integrating Communication Services within a network that you manage. For an introduction to Communication Services call flows, visit the [call flows conceptual documentation](./call-flows.md).
+
+## Background
+
+### Network concepts
+
+Before reviewing call flow topologies, we'll define some terms that are referred to throughout the document.
+
+A **customer network** contains any network segments that you manage. This might include wired and wireless networks within your office or between offices, data centers, and internet service providers.
+
+A customer network usually has several network perimeters with firewalls and/or proxy servers that enforce your organization's security policies. We recommend performing a [comprehensive network assessment](https://docs.microsoft.com/microsoftteams/3-envision-evaluate-my-environment) to ensure optimal performance and quality of your communication solution.
+
+The **Communication Services network** is the network segment that supports Azure Communication Services. This network is managed by Microsoft and is distributed worldwide with edges close to most customer networks. This network is responsible for transport relay, media processing for group calls, and other components that support rich real-time media communications.
+
+### Types of traffic
+
+Communication Services is built primarily on two types of traffic: **real-time media** and **signaling**.
+
+**Real-time media** is transmitted using the Real-time Transport Protocol (RTP). This protocol supports audio, video, and screen sharing data transmission. This data is sensitive to network latency issues. While it's possible to transmit real-time media using TCP or HTTP, we recommend using UDP as the transport-layer protocol to support high-performance end-user experiences. Media payloads transmitted over RTP are secured using SRTP.
+
+Users of your Communication Services solution will be connecting to your services from their client devices. Communication between these devices and your servers is handled with **signaling**. For example: call initiation and real-time chat are supported by signaling between devices and your service. Most signaling traffic uses HTTPS REST, though in some scenarios, SIP can be used as a signaling traffic protocol. While this type of traffic is less sensitive to latency, low-latency signaling will give the users of your solution a pleasant end-user experience.
+
+### Interoperability restrictions
+
+Media flowing through Communication Services is restricted as follows:
+
+**Third-party media relays.** Interoperability with third-party media relays isn't supported. If one of your media endpoints is Communication Services, your media can only traverse Microsoft-native media relays, including those that support Microsoft Teams and Skype for Business.
+
+A third-party Session Border Controller (SBC) on the boundary with PSTN should terminate the RTP/RTCP stream, secured using SRTP, and not relay it to the next hop. If you relay the flow to the next hop, it might not be understood.
+
+**Third-party SIP proxy servers.** A Communication Services signaling SIP dialog with a third-party SBC and/or gateway may traverse Microsoft native SIP proxies (just like Teams). Interoperability with third-party SIP proxies is not supported.
+
+**Third-party B2BUA (or SBC).** Communication Services media flow to and from the PSTN is terminated by a third-party SBC. Interoperability with a third-party SBC within the Communication Services network (where a third-party SBC mediates two Communication Services endpoints) is not supported.
+
+### Unsupported technologies
+
+**VPN network.** Communication Services does not support media transmission over VPNs. If your users are using VPN clients, the client should split and route media traffic over a non-VPN connection as specified in [Enabling Lync media to bypass a VPN tunnel.](https://techcommunity.microsoft.com/t5/skype-for-business-blog/enabling-lync-media-to-bypass-a-vpn-tunnel/ba-p/620210)
+
+*Note. Although the title indicates Lync, it is applicable to Azure Communication Services and Teams.*
+
+**Packet shapers.** Packet sniffing, packet inspection, or packet shaping devices are not supported and may degrade quality significantly.
+
+### Call flow principles
+
+There are four general principles that underpin Communication Services call flows:
+
+* **The first participant of a Communication Services group call determines the region in which the call is hosted**. There are exceptions to this rule in some topologies, which are described below.
+* **The media endpoint used to support a Communication Services call is selected based on media processing needs**, and is not affected by the number of participants on a call. For example, a point-to-point call may use a media endpoint in the cloud to process media for transcription or recording, while a call with two participants may not use any media endpoints. Group calls will use a media endpoint for mixing and routing purposes. This endpoint is selected based on the region in which the call is hosted. Media traffic sent from a client to the media endpoint may be routed directly or it may use a transport relay in Azure if customer network firewall restrictions require it.
+* **Media traffic for peer-to-peer calls takes the most direct route that's available**, assuming the call doesn't need a media endpoint in the cloud. The preferred route is direct to the remote peer (client). If a direct route isn't available, one or more transport relays will relay traffic. Media traffic shouldn't traverse servers that act like packet shapers, VPN servers, or other functions that might delay processing and degrade the end-user experience.
+* **Signaling traffic always goes to whatever server is closest to the user.**
+
+To learn more about the details on the media path that is chosen, refer to the [call flows conceptual documentation](./call-flows.md).
++
+## Call flows in various topologies
+
+### Communication Services (internet)
+
+This topology is used by customers that use Communication Services from the cloud without any on-premises deployment, such as SIP Interface. In this topology, traffic to and from Communication Services flows over the Internet.
++
+*Figure 1 - Communication Services topology*
+
+The direction of the arrows on the above diagram reflect the initiation direction of the communication that affects connectivity at the enterprise perimeters. In the case of UDP for media, the first packet(s) may flow in the reverse direction, but these packets may be blocked until packets in the other direction are flowing.
+
+Flow descriptions:
+* Flow 2* – Represents a flow initiated by a user on the customer network to the Internet as a part of the user's Communication Services experience. Examples of these flows include DNS and peer-to-peer media transmission.
+* Flow 2 – Represents a flow initiated by a remote mobile Communication Services user, with VPN to the customer network.
+* Flow 3 – Represents a flow initiated by a remote mobile Communication Services user to Communication Services endpoints.
+* Flow 4 – Represents a flow initiated by a user on the customer network to Communication Services.
+* Flow 5 – Represents a peer-to-peer media flow between one Communication Services user and another within the customer network.
+* Flow 6 – Represents a peer-to-peer media flow between a remote mobile Communication Services user and another remote mobile Communication Services user over the Internet.
+
+### Use case: One-to-one
+
+One-to-one calls use a common model in which the caller will obtain a set of candidates consisting of IP addresses/ports, including local, relay, and reflexive (public IP address of client as seen by the relay) candidates. The caller sends these candidates to the called party; the called party also obtains a similar set of candidates and sends them to the caller. STUN connectivity check messages are used to find which caller/called party media paths work, and the best working path is selected. Media (that is, RTP/RTCP packets secured using SRTP) are then sent using the selected candidate pair. The transport relay is deployed as part of Azure Communication Services.
+
+If the local IP address/port candidates or the reflexive candidates have connectivity, then the direct path between the clients (or using a NAT) will be selected for media. If the clients are both on the customer network, then the direct path should be selected. This requires direct UDP connectivity within the customer network. If the clients are both nomadic cloud users, then depending on the NAT/firewall, media may use direct connectivity.
+
+If one client is internal on the customer network and one client is external (for example, a mobile cloud user), then it's unlikely that direct connectivity between the local or reflexive candidates is working. In this case, an option is to use one of the transport relay candidates from either client (for example, the internal client obtained a relay candidate from the transport relay in Azure; the external client needs to be able to send STUN/RTP/RTCP packets to the transport relay). Another option is for the internal client to send to the relay candidate obtained by the mobile cloud client. Although UDP connectivity for media is highly recommended, TCP is supported.
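The preference order described above (direct host or reflexive connectivity first, relay as a fallback) can be sketched roughly as follows. This is a simplification of real ICE, which computes full pair priorities per RFC 8445, and the shape of the candidate-pair objects is assumed for illustration only:

```javascript
// Prefer the most direct working path: host (local) over reflexive over relay.
const TYPE_PREFERENCE = { host: 3, reflexive: 2, relay: 1 };

function selectMediaPath(candidatePairs) {
  // Only pairs that passed STUN connectivity checks are usable.
  const working = candidatePairs.filter((pair) => pair.checkPassed);
  if (working.length === 0) return null; // no usable media path
  const score = (pair) =>
    TYPE_PREFERENCE[pair.localType] + TYPE_PREFERENCE[pair.remoteType];
  return working.reduce((best, pair) => (score(pair) > score(best) ? pair : best));
}
```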
+
+**High-level steps:**
+1. Communication Services User A resolves URL domain name (DNS) using Flow 2.
+2. User A allocates a media relay port on the Teams transport relay using Flow 4.
+3. Communication Services User A sends an "invite" with ICE candidates using Flow 4 to Communication Services.
+4. Communication Services notifies User B using Flow 4.
+5. User B allocates a media relay port on the Teams transport relay using Flow 4.
+6. User B sends "answer" with ICE candidates using Flow 4, which is forwarded back to User A using Flow 4.
+7. User A and User B invoke ICE connectivity tests and the best available media path is selected (see diagrams below for various use cases).
+8. Both users send telemetry to Communication Services using Flow 4.
+
+### Customer network (intranet)
++
+*Figure 2 - Within customer network*
+
+In Step 7, peer-to-peer media Flow 5 is selected.
+
+This media transmission is bidirectional. The direction of Flow 5 indicates that one side initiates the communication from a connectivity perspective. In this case, it doesn't matter which direction is used because both endpoints are within the customer network.
+
+### Customer network to external user (media relayed by Teams Transport Relay)
++
+*Figure 3 - Customer network to external user (media relayed by Azure Transport Relay)*
+
+In Step 7, Flow 4 (from the customer network to Communication Services) and Flow 3 (from a remote mobile Communication Services user to Azure Communication Services) are selected.
+
+These flows are relayed by Teams Transport Relay within Azure.
+
+This media transmission is bidirectional. The direction indicates which side initiates the communication from a connectivity perspective. In this case, these flows are used for signaling and media, using different transport protocols and addresses.
+
+### Customer network to external user (direct media)
++
+*Figure 4 - Customer network to external user (direct media)*
+
+In step 7, Flow 2 (from customer network to the client's peer over the Internet) is selected.
+
+Direct media with a remote mobile user (not relayed through Azure) is optional. In other words, you can block this path to enforce a media path through a transport relay in Azure.
+
+This media transmission is bidirectional. The direction of Flow 2 to remote mobile user indicates that one side initiates the communication from a connectivity perspective.
+
+### VPN user to internal user (media relayed by Teams Transport Relay)
++
+*Figure 5 - VPN user to internal user (media relayed by Azure Relay)*
+
+Signaling between the VPN user and the customer network uses Flow 2*. Signaling between the customer network and Azure uses Flow 4. However, media bypasses the VPN and is routed using Flows 3 and 4 through Azure Media Relay.
+
+### VPN user to internal user (direct media)
++
+*Figure 6 - VPN user to internal user (direct media)*
+
+Signaling between the VPN user and the customer network uses Flow 2'. Signaling between the customer network and Azure uses Flow 4. However, media bypasses the VPN and is routed using Flow 2 from the customer network to the Internet.
+
+This media transmission is bidirectional. The direction of Flow 2 to the remote mobile user indicates that one side initiates the communication from a connectivity perspective.
+
+### VPN user to external user (direct media)
++
+*Figure 7 - VPN user to external user (direct media)*
+
+Signaling between the VPN user and the customer network uses Flow 2* and Flow 4 to Azure. However, media bypasses the VPN and is routed using Flow 6.
+
+This media transmission is bidirectional. The direction of Flow 6 to the remote mobile user indicates that one side initiates the communication from a connectivity perspective.
+
+### Use Case: Communication Services client to PSTN through Communication Services Trunk
+
+Communication Services allows placing and receiving calls from the Public Switched Telephone Network (PSTN). If the PSTN trunk is connected using phone numbers provided by Communication Services, there are no special connectivity requirements for this use case. If you want to connect your own on-premises PSTN trunk to Azure Communication Services, you can use SIP Interface (available in CY2021).
++
+*Figure 8 – Communication Services User to PSTN through Azure Trunk*
+
+In this case, signaling and media from the customer network to Azure use Flow 4.
+
+### Use case: Communication Services group calls
+
+The audio/video/screen sharing (VBSS) service is part of Azure Communication Services. It has a public IP address that must be reachable from the customer network and must be reachable from a Nomadic Cloud client. Each client/endpoint needs to be able to connect to the service.
+
+Internal clients will obtain local, reflexive, and relay candidates in the same manner as described for one-to-one calls. The clients will send these candidates to the service in an invite. The service does not use a relay since it has a publicly reachable IP address, so it responds with its local IP address candidate. The client and the service will check connectivity in the same manner described for one-to-one calls.
++
+*Figure 9 – Communication Services Group Calls*
+
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Get started with calling](../quickstarts/voice-video-calling/getting-started-with-calling.md)
+
+The following documents may be interesting to you:
+
+- Learn more about [call types](../concepts/voice-video-calling/about-call-types.md)
+- Learn about [Client-server architecture](./client-and-server-architecture.md)
+
communication-services https://docs.microsoft.com/en-us/azure/communication-services/concepts/known-issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/known-issues.md
@@ -0,0 +1,38 @@
+
+ Title: Azure Communication Services - Known issues
+description: Learn about known issues with Azure Communication Services
+Last updated: 10/03/2020
+# Known issues: Azure Communication Services
+
+This article provides information about known issues associated with Azure Communication Services.
+
+## Video streaming quality on Chrome/Android
+
+Video streaming performance may be degraded on Chrome Android.
+
+### Possible causes
+The quality of remote streams depends on the resolution of the client-side renderer that was used to initiate that stream. When subscribing to a remote stream, a receiver will determine its own resolution based on the sender's client-side renderer dimensions.
+
+## Bluetooth headset microphones are not detected
+
+You may experience issues connecting your Bluetooth headset to a Communication Services call.
+
+### Possible causes
+There isn't an option to select Bluetooth microphone on iOS.
++
+## Repeatedly switching video devices may cause video streaming to temporarily stop
+
+Switching between video devices may cause your video stream to pause while the stream is acquired from the selected device.
+
+### Possible causes
+Streaming from and switching between media devices is computationally intensive. Switching frequently can cause performance degradation. Developers are encouraged to stop one device stream before starting another.
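A minimal sketch of that recommendation; `stop` and `start` here are hypothetical methods standing in for the client library's actual stream APIs:

```javascript
// Switch video devices sequentially: fully stop the current stream before
// acquiring the next device, so only one camera is streaming at a time.
async function switchVideoDevice(currentStream, nextDevice) {
  if (currentStream) {
    await currentStream.stop(); // release the camera first
  }
  return nextDevice.start(); // assumed to resolve to the new stream
}
```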
communication-services https://docs.microsoft.com/en-us/azure/communication-services/quickstarts/voice-video-calling/includes/calling-sdk-android https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/includes/calling-sdk-android.md
@@ -106,6 +106,7 @@ call oneToOneCall = callAgent.call(appContext, participants, startCallOptions);
### Place a 1:n call with users and PSTN

> [!WARNING]
> Currently PSTN calling is not available
+
To place a 1:n call to a user and a PSTN number, you have to specify the phone number of the callee. Your Communication Services resource must be configured to allow PSTN calling:

```java
communication-services https://docs.microsoft.com/en-us/azure/communication-services/quickstarts/voice-video-calling/includes/calling-sdk-js https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/includes/calling-sdk-js.md
@@ -19,10 +19,7 @@
Use the `npm install` command to install the Azure Communication Services Calling and Common client libraries for JavaScript.

```console
-npm install @azure/communication-common --save
npm install @azure/communication-calling --save
```

## Object model
@@ -33,21 +30,22 @@ The following classes and interfaces handle some of the major features of the Az
| | - |
| CallClient | The CallClient is the main entry point to the Calling client library. |
| CallAgent | The CallAgent is used to start and manage calls. |
-| AzureCommunicationUserCredential | The AzureCommunicationUserCredential class implements the CommunicationUserCredential interface which is used to instantiate the CallAgent. |
+| DeviceManager | The DeviceManager is used to manage media devices |
+| AzureCommunicationTokenCredential | The AzureCommunicationTokenCredential class implements the CommunicationTokenCredential interface which is used to instantiate the CallAgent. |
## Initialize the CallClient, create CallAgent, and access DeviceManager

Instantiate a new `CallClient` instance. You can configure it with custom options like a Logger instance. Once a `CallClient` is instantiated, you can create a `CallAgent` instance by calling the `createCallAgent` method on the `CallClient` instance. This asynchronously returns a `CallAgent` instance object.
-The `createCallAgent` method takes a `CommunicationUserCredential` as an argument, which accepts a [user access token](../../access-tokens.md).
+The `createCallAgent` method takes a `CommunicationTokenCredential` as an argument, which accepts a [user access token](https://docs.microsoft.com/azure/communication-services/quickstarts/access-tokens).
To access the `DeviceManager` a callAgent instance must first be created. You can then use the `getDeviceManager` method on the `CallClient` instance to get the DeviceManager.

```js
const userToken = '<user token>';
callClient = new CallClient(options);
-const tokenCredential = new AzureCommunicationUserCredential(userToken);
-const callAgent = await callClient.createCallAgent(tokenCredential, { displayName: 'optional ACS user name' });
+const tokenCredential = new AzureCommunicationTokenCredential(userToken);
+const callAgent = await callClient.createCallAgent(tokenCredential, {displayName: 'optional ACS user name'});
const deviceManager = await callClient.getDeviceManager()
```
@@ -57,26 +55,31 @@ To create and start a call you need to use one of the APIs on CallAgent and prov
Call creation and start is synchronous. The Call instance allows you to subscribe to call events.
-## Place a 1:1 call to a user or a 1:n call with users and PSTN
+## Place a call
-To place a call to another Communication Services user, invoke the `call` method on `callAgent` and pass the CommunicationUser that you've [created with the Communication Services Administration library](../../access-tokens.md).
+### Place a 1:1 call to a user or PSTN
+To place a call to another Communication Services user, invoke the `call` method on `callAgent` and pass the callee's CommunicationUserIdentifier:
```js
-const oneToOneCall = callAgent.call([CommunicationUser]);
+const userCallee = { communicationUserId: '<ACS_USER_ID>' }
+const oneToOneCall = callAgent.call([userCallee]);
```
-### Place a 1:n call with users and PSTN
-
-To place a 1:n call to a user and a PSTN number you have to specify a CommunicationUser
-and a Phone Number for both callees.
-
+To place a call to a PSTN, invoke the `call` method on `callAgent` and pass the callee's PhoneNumberIdentifier.
Your Communication Services resource must be configured to allow PSTN calling.
+When calling a PSTN number, you must specify your alternate caller ID.
```js
+const pstnCallee = { phoneNumber: '<PHONE_NUMBER>' }
+const alternateCallerId = {alternateCallerId: '<Alternate caller Id>'};
+const oneToOneCall = callAgent.call([pstnCallee], {alternateCallerId});
+```
-const userCallee = { communicationUserId: <ACS_USER_ID> };
+### Place a 1:n call with users and PSTN
+```js
+const userCallee = { communicationUserId: <ACS_USER_ID> }
const pstnCallee = { phoneNumber: <PHONE_NUMBER>};
-const groupCall = callAgent.call([userCallee, pstnCallee], placeCallOptions);
-
+const alternateCallerId = {alternateCallerId: '<Alternate caller Id>'};
+const groupCall = callAgent.call([userCallee, pstnCallee], {alternateCallerId});
``` ### Place a 1:1 call with video camera
@@ -85,9 +88,7 @@ const groupCall = callAgent.call([userCallee, pstnCallee], placeCallOptions);
To place a video call, you have to enumerate local cameras using the deviceManager `getCameraList` API. Once you select the desired camera, use it to construct a `LocalVideoStream` instance and pass it within `videoOptions` as an item within the `localVideoStream` array to the `call` method.
-Once your call connects it'll automatically start sending a video stream from the selected camera to the other participant(s).
-
-This also applies to the Call.accept() video options and CallAgent.join() video options.
+Once your call connects, it'll automatically start sending a video stream from the selected camera to the other participant(s). This also applies to the Call.accept() video options and CallAgent.join() video options.
```js const deviceManager = await callClient.getDeviceManager(); const videoDeviceInfo = deviceManager.getCameraList()[0];
@@ -97,27 +98,15 @@ const call = callAgent.call(['acsUserId'], placeCallOptions);
```
-### Receiving an incoming call
-```js
-callAgent.on('callsUpdated', e => {
- e.added.forEach(addedCall => {
- if(addedCall.isIncoming) {
- addedCall.accept();
- }
- });
-})
-```
- ### Join a group call To start a new group call or join an ongoing group call, use the 'join' method and pass an object with a `groupId` property. The value has to be a GUID. ```js
-const locator = { groupId: <GUID>}
-const call = callAgent.join(locator);
+const context = { groupId: <GUID>}
+const call = callAgent.join(context);
```- ### Join a Teams Meeting To join a Teams meeting, use 'join' method and pass a meeting link or a meeting's coordinates ```js
@@ -135,6 +124,24 @@ const locator = {
const call = callAgent.join(locator); ```
+## Receiving an incoming call
+
+The `CallAgent` instance emits an `incomingCall` event when the logged-in identity receives an incoming call. To listen to this event, subscribe in the following way:
+
+```js
+const incomingCallHandler = async (args: { incomingCall: IncomingCall }) => {
+    const incomingCall = args.incomingCall;
+    //accept the call
+    const call = await incomingCall.accept();
+
+    //reject the call
+    incomingCall.reject();
+};
+callAgentInstance.on('incomingCall', incomingCallHandler);
+```
+
+The `incomingCall` event provides an instance of `IncomingCall`, which you can use to accept or reject the call.
++ ## Call Management You can access call properties and perform various operations during a call to manage settings related to video and audio.
@@ -152,10 +159,10 @@ const callId: string = call.id;
const remoteParticipants = call.remoteParticipants; ```
-* The identity of caller if the call is incoming. Identity is one of the `Identifier` types
+* The identity of caller if the call is incoming. Identity is one of the `CommunicationIdentifier` types
```js
-const callerIdentity = call.callerIdentity;
+const callerIdentity = call.callerInfo.identity;
```
@@ -174,9 +181,8 @@ This returns a string representing the current state of a call:
* 'Connected' - call is connected * 'Hold' - call is put on hold, no media is flowing between local endpoint and remote participant(s) * 'Disconnecting' - transition state before the call goes to 'Disconnected' state
-* 'Disconnected' - final call state.
- * If network connection is lost, state goes to 'Disconnected' after about 2 minutes.
-
+* 'Disconnected' - final call state
+ * If network connection is lost, state goes to 'Disconnected' after about 2 minutes.
* To see why a given call ended, inspect the `callEndReason` property. ```js
@@ -186,14 +192,10 @@ const callEndReason = call.callEndReason;
// callEndReason.subCode (number) subCode associated with the reason ```
-* To learn if the current call is an incoming call, inspect the `isIncoming` property, it returns `Boolean`.
+* To learn if the current call is an incoming or outgoing call, inspect the `direction` property, it returns `CallDirection`.
```js
-const isIncoming = call.isIncoming;
-```
-
-* To check if the call is being recorded, inspect the `isRecordingActive` property, it returns `Boolean`.
-```js
-const isResordingActive = call.isRecordingActive;
+const isIncoming = call.direction == 'Incoming';
+const isOutgoing = call.direction == 'Outgoing';
``` * To check if the current microphone is muted, inspect the `muted` property, it returns `Boolean`.
@@ -217,6 +219,18 @@ const localVideoStreams = call.localVideoStreams;
```
+### Call ended event
+
+The `Call` instance emits a `callEnded` event when the call ends. To listen to this event subscribe in the following way:
+
+```js
+const callEndHandler = async (args: { callEndReason: CallEndReason }) => {
+    console.log(args.callEndReason)
+};
+
+call.on('callEnded', callEndHandler);
+```
+ ### Mute and unmute To mute or unmute the local endpoint you can use the `mute` and `unmute` asynchronous APIs:
@@ -266,9 +280,6 @@ const source callClient.getDeviceManager().getCameraList()[1];
localVideoStream.switchSource(source); ```
-### FAQ
- * If network connectivity is lost, does the call state change to 'Disconnected' ?
- * Yes, if network connection is lost for more than 2 minutes, call will transition to Disconnected state and call will end.
## Remote participants management
@@ -285,17 +296,18 @@ call.remoteParticipants; // [remoteParticipant, remoteParticipant....]
### Remote participant properties Remote participant has a set of properties and collections associated with it-
-* Get the identifier for this remote participant.
-Identity is one of the 'Identifier' types:
+#### CommunicationIdentifier
+Get the identifier for this remote participant.
```js const identifier = remoteParticipant.identifier;
-//It can be one of:
-// { communicationUserId: '<ACS_USER_ID'> } - object representing ACS User
-// { phoneNumber: '<E.164>' } - object representing phone number in E.164 format
```
+It can be one of the 'CommunicationIdentifier' types:
+ * { communicationUserId: '<ACS_USER_ID>' } - object representing ACS User
+ * { phoneNumber: '<E.164>' } - object representing phone number in E.164 format
+ * { microsoftTeamsUserId: '<TEAMS_USER_ID>', isAnonymous?: boolean; cloud?: "public" | "dod" | "gcch" } - object representing Teams user
-* Get state of this remote participant.
+#### State
+Get the state of this remote participant.
```js const state = remoteParticipant.state;
@@ -306,30 +318,29 @@ State can be one of
* 'Connected' - participant is connected to the call * 'Hold' - participant is on hold * 'EarlyMedia' - announcement is played before participant is connected to the call
-* 'Disconnected' - final state - participant is disconnected from the call.
- * If remote participant loses their network connectivity, then remote participant state goes to 'Disconnected' after about 2 minutes.
+* 'Disconnected' - final state - participant is disconnected from the call
+ * If remote participant loses their network connectivity, then remote participant state goes to 'Disconnected' after about 2 minutes.
+#### Call End Reason
To learn why participant left the call, inspect `callEndReason` property: ```js- const callEndReason = remoteParticipant.callEndReason; // callEndReason.code (number) code associated with the reason // callEndReason.subCode (number) subCode associated with the reason ```-
-* To check whether this remote participant is muted or not, inspect `isMuted` property, it returns `Boolean`
+#### Is Muted
+To check whether this remote participant is muted, inspect the `isMuted` property; it returns `Boolean`.
```js const isMuted = remoteParticipant.isMuted; ```-
-* To check whether this remote participant is speaking or not, inspect `isSpeaking` property it returns `Boolean`
+#### Is Speaking
+To check whether this remote participant is speaking, inspect the `isSpeaking` property; it returns `Boolean`.
```js- const isSpeaking = remoteParticipant.isSpeaking;- ```
-* To inspect all video streams that a given participant is sending in this call, check `videoStreams` collection, it contains `RemoteVideoStream` objects
+#### Video Streams
+To inspect all video streams that a given participant is sending in this call, check the `videoStreams` collection; it contains `RemoteVideoStream` objects.
```js const videoStreams = remoteParticipant.videoStreams; // [RemoteVideoStream, ...]
@@ -345,9 +356,9 @@ This will synchronously return the remote participant instance.
```js const userIdentifier = { communicationUserId: <ACS_USER_ID> };
-const pstnIdentifier = { phoneNumber: <PHONE_NUMBER>};
+const pstnIdentifier = { phoneNumber: <PHONE_NUMBER>}
const remoteParticipant = call.addParticipant(userIdentifier);
-const remoteParticipant = call.addParticipant(pstnIdentifier);
+const remoteParticipant = call.addParticipant(pstnIdentifier, {alternateCallerId: '<Alternate Caller ID>'});
``` ### Remove participant from a call
@@ -359,7 +370,7 @@ The participant will also be removed from the `remoteParticipants` collection.
```js const userIdentifier = { communicationUserId: <ACS_USER_ID> };
-const pstnIdentifier = { phoneNumber: <PHONE_NUMBER>};
+const pstnIdentifier = { phoneNumber: <PHONE_NUMBER>}
await call.removeParticipant(userIdentifier); await call.removeParticipant(pstnIdentifier); ```
@@ -381,9 +392,8 @@ Whenever availability of a remote stream changes you can choose to destroy the w
or keep them, but this will result in displaying blank video frame. ```js
-let renderer: Renderer;
+let renderer: Renderer = new Renderer(remoteParticipantStream);
const displayVideo = async () => {
- renderer = new Renderer(remoteParticipantStream);
const view = await renderer.createView(); htmlElement.appendChild(view.target); }
@@ -407,7 +417,7 @@ Remote video streams have the following properties:
const id: number = remoteVideoStream.id; ```
-* `StreamSize` - size (width/height) of a remote video stream
+* `StreamSize` - size ( width/height ) of a remote video stream
```js const size: {width: number; height: number} = remoteVideoStream.size; ```
@@ -451,9 +461,7 @@ You can later update the scaling mode by invoking the `updateScalingMode` method
```js view.updateScalingMode('Crop') ```
-### FAQ
-* If a remote participant loses their network connection, does their state change to 'Disconnected' ?
- * Yes, if a remote participant loses their network connection for more than 2 minutes, their state will transition to Disconnected and they will be removed from the call.
+ ## Device management `DeviceManager` lets you enumerate local devices that can be used in a call to transmit your audio/video streams. It also allows you to request permission from a user to access their microphone and camera using the native browser API.
@@ -496,7 +504,7 @@ If client defaults are not set, Communication Services will fall back to OS defa
const defaultMicrophone = deviceManager.getMicrophone(); // Set the microphone device to use.
-await deviceMicrophone.setMicrophone(AudioDeviceInfo);
+await deviceManager.setMicrophone(AudioDeviceInfo);
// Get the speaker device that is being used. const defaultSpeaker = deviceManager.getSpeaker();
@@ -543,6 +551,92 @@ console.log(result); // 'Granted' | 'Denied' | 'Prompt' | 'Unknown';
```
+## Call recording management
+
+Call recording is an extended feature of the core `Call` API. You first need to obtain the recording feature API object:
+
+```js
+const callRecordingApi = call.api(Features.Recording);
+```
+
+Then, to check whether the call is being recorded, inspect the `isRecordingActive` property of `callRecordingApi`; it returns `Boolean`.
+
+```js
+const isRecordingActive = callRecordingApi.isRecordingActive;
+```
+
+You can also subscribe to recording changes:
+
+```js
+const isRecordingActiveChangedHandler = () => {
+ console.log(callRecordingApi.isRecordingActive);
+};
+
+callRecordingApi.on('isRecordingActiveChanged', isRecordingActiveChangedHandler);
+
+```
+
+## Call Transfer management
+
+Call transfer is an extended feature of the core `Call` API. You first need to obtain the transfer feature API object:
+
+```js
+const callTransferApi = call.api(Features.Transfer);
+```
+
+Call transfer involves three parties: the *transferor*, the *transferee*, and the *transfer target*. The transfer flow works as follows:
+
+1. There is already a connected call between the *transferor* and the *transferee*.
+2. The *transferor* decides to transfer the call (*transferee* -> *transfer target*).
+3. The *transferor* calls the `transfer` API.
+4. The *transferee* decides whether to `accept` or `reject` the transfer request to the *transfer target* via the `transferRequested` event.
+5. The *transfer target* receives an incoming call only if the *transferee* accepted the transfer request.
+
+### Transfer terminology
+
+- Transferor - The one who initiates the transfer request
+- Transferee - The one who is being transferred by the transferor to the transfer target
+- Transfer target - The one to whom the call is transferred
+
+To transfer the current call, use the synchronous `transfer` API. `transfer` takes an optional `TransferCallOptions`, which allows you to set the `disableForwardingAndUnanswered` flag:
+
+- `disableForwardingAndUnanswered` = false - if the *transfer target* doesn't answer the transfer call, the transfer follows the *transfer target*'s forwarding and unanswered settings
+- `disableForwardingAndUnanswered` = true - if the *transfer target* doesn't answer the transfer call, the transfer attempt ends
+
+```js
+// transfer target can be ACS user
+const id = { communicationUserId: <ACS_USER_ID> };
+```
+
+```js
+// call transfer API
+const transfer = callTransferApi.transfer({targetParticipant: id});
+```
+
+Transfer allows you to subscribe to the `transferStateChanged` and `transferRequested` events. The `transferRequested` event comes from the `call` instance; the `transferStateChanged` event and the transfer `state` and `error` come from the `transfer` instance.
+
+```js
+// transfer state
+const transferState = transfer.state; // None | Transferring | Transferred | Failed
+
+// to check the transfer failure reason
+const transferError = transfer.error; // transfer error code that describes the failure if transfer request failed
+```
+
+The transferee can accept or reject the transfer request initiated by the transferor in the `transferRequested` event via `accept()` or `reject()` on `transferRequestedEventArgs`. You can access the `targetParticipant` information and the `accept` and `reject` methods in `transferRequestedEventArgs`.
+
+```js
+// Transferee to accept the transfer request
+callTransferApi.on('transferRequested', args => {
+ args.accept();
+});
+
+// Transferee to reject the transfer request
+callTransferApi.on('transferRequested', args => {
+ args.reject();
+});
+```
+ ## Eventing model You can subscribe to most of the properties and collections to be notified when values change.
communication-services https://docs.microsoft.com/en-us/azure/communication-services/samples/web-calling-sample https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/samples/web-calling-sample.md
@@ -14,34 +14,26 @@
# Get started with the web calling sample -
-> [!IMPORTANT]
-> [This sample is available on Github.](https://github.com/Azure-Samples/communication-services-web-calling-tutorial/).
-
-The Azure Communication Services **web calling sample** demonstrates how the Communication Services calling client library can be used to build a calling experience with JavaScript.
+The web calling sample is a web application that serves as a step-by-step walkthrough of the various capabilities provided by the Communication Services web calling client library.
-In this Sample quickstart, we'll learn how the sample works before we run the sample on your local machine. We'll then deploy the sample to Azure using your own Azure Communication Services resources.
+This sample was built for developers and makes it very easy for you to get started with Communication Services. Its user interface is divided into multiple sections, each featuring a "Show code" button that allows you to copy code directly from your browser into your own Communication Services application.
-## Overview
+## Get started with the web calling sample
-The web calling sample is a web application that serves as a step-by-step walkthrough of the various capabilities provided by the Communication Services web calling client library.
-This sample was built for developers and makes it very easy for you to get started with Communication Services. Its user interface is divided into multiple sections, each featuring a "Show code" button that allows you to copy code directly from your browser into your own Communication Services application.
+> [!IMPORTANT]
+> [This sample is available on Github.](https://github.com/Azure-Samples/communication-services-web-calling-tutorial/).
-When the [web calling sample](https://github.com/Azure-Samples/communication-services-web-calling-tutorial) is running on your machine, you'll see the following landing page:
+Follow the instructions in /Project/readme.md to set up the project and run it locally on your machine.
+Once the [web calling sample](https://github.com/Azure-Samples/communication-services-web-calling-tutorial) is running on your machine, you'll see the following landing page:
:::image type="content" source="./media/web-calling-tutorial-page-1.png" alt-text="Web calling tutorial 1" lightbox="./media/web-calling-tutorial-page-1.png"::: :::image type="content" source="./media/web-calling-tutorial-page-2.png" alt-text="Web calling tutorial 2" lightbox="./media/web-calling-tutorial-page-2.png"::: - ## User provisioning and SDK initialization
-To begin using the demo, enter the connection string from your [Communication Services resource](../quickstarts/create-communication-resource.md) into `config.json`. This will be used to provision a [user access token](../concepts/authentication.md) so that your calling SDK can be initialized.
-
-Enter your own personal identifier in the User Identity input. If nothing is provided here, then a random user identity will be generated.
- Click on the "Provisioning user and initialize SDK" to initialize your SDK using a token provisioned by the backend token provisioning service. This backend service is in `/project/webpack.config.js`. Click on the "Show code" button to see the sample code that you can use in your own solution.
@@ -100,4 +92,4 @@ For more information, see the following articles:
- [Redux](https://redux.js.org/) - Client-side state management - [FluentUI](https://aka.ms/fluent-ui) - Microsoft powered UI library - [React](https://reactjs.org/) - Library for building user interfaces-- [ASP.NET Core](/aspnet/core/introduction-to-aspnet-core?preserve-view=true&view=aspnetcore-3.1) - Framework for building web applications
+- [ASP.NET Core](/aspnet/core/introduction-to-aspnet-core?preserve-view=true&view=aspnetcore-3.1) - Framework for building web applications
communication-services https://docs.microsoft.com/en-us/azure/communication-services/tutorials/hmac-header-tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/tutorials/hmac-header-tutorial.md
@@ -0,0 +1,37 @@
+
+ Title: How to sign an HTTP request C#
+
+description: Learn how to sign an HTTP request for Communication Services via C#
+++++ Last updated : 01/15/2021++++
+# Sign an HTTP request
+
+In this tutorial, you'll learn how to sign an HTTP request with an HMAC signature.
+++
+## Clean up resources
+
+If you want to clean up and remove a Communication Services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it. You can find out more about [cleaning up Azure Communication Service resources](../quickstarts/create-communication-resource.md#clean-up-resources) and [cleaning Azure Function Resources](../../azure-functions/create-first-function-vs-code-csharp.md#clean-up-resources).
+
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Add voice calling to your app](../quickstarts/voice-video-calling/getting-started-with-calling.md)
+
+You may also want to:
+
+- [Add chat to your app](../quickstarts/chat/get-started.md)
+- [Creating user access tokens](../quickstarts/access-tokens.md)
+- [Learn about client and server architecture](../concepts/client-and-server-architecture.md)
+- [Learn about authentication](../concepts/authentication.md)
communication-services https://docs.microsoft.com/en-us/azure/communication-services/tutorials/includes/hmac-header-csharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/tutorials/includes/hmac-header-csharp.md
@@ -0,0 +1,189 @@
+
+ Title: Sign an HTTP request with C#
+description: This is the C# version of signing an HTTP request with an HMAC signature for Communication Services.
+++++ Last updated : 01/15/2021+++
+## Prerequisites
+
+Before you get started, make sure to:
+- Create an Azure account with an active subscription. For details, see [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- Install [Visual Studio](https://visualstudio.microsoft.com/downloads/)
+- Create an Azure Communication Services resource. For details, see [Create an Azure Communication Resource](../../quickstarts/create-communication-resource.md). You'll need to record your **resourceEndpoint** and **resourceAccessKey** for this tutorial.
+++
+## Sign an HTTP request with C#
+Access key authentication uses a shared secret key to generate an HMAC signature for each HTTP request. This signature is generated with the SHA256 algorithm, and is sent in the `Authorization` header using the `HMAC-SHA256` scheme. For example:
+
+```
+Authorization: "HMAC-SHA256 SignedHeaders=date;host;x-ms-content-sha256&Signature=<hmac-sha256-signature>"
+```
+
+The `hmac-sha256-signature` consists of:
+
+- HTTP Verb (e.g. `GET` or `PUT`)
+- HTTP request path
+- Date
+- Host
+- x-ms-content-sha256
+
+## Setting up
+The following steps describe how to construct the Authorization header:
+
+### Create a new C# application
+
+In a console window (such as cmd, PowerShell, or Bash), use the `dotnet new` command to create a new console app with the name `SignHmacTutorial`. This command creates a simple "Hello World" C# project with a single source file: **Program.cs**.
+
+```console
+dotnet new console -o SignHmacTutorial
+```
+
+Change your directory to the newly created app folder and use the `dotnet build` command to compile your application.
+
+```console
+cd SignHmacTutorial
+dotnet build
+```
+
+## Install the package
+
+Install the package `Newtonsoft.Json`, used for body serialization:
+
+```console
+dotnet add package Newtonsoft.Json
+```
+
+Update the `Main` method declaration to support async code. Use the following code to begin:
+
+```csharp
+using System;
+using System.Globalization;
+using System.Net.Http;
+using System.Security.Cryptography;
+using System.Text;
+using System.Threading.Tasks;
+using Newtonsoft.Json;
+namespace SignHmacTutorial
+{
+ class Program
+ {
+ static async Task Main(string[] args)
+ {
+ Console.WriteLine("Azure Communication Services - Sign an HTTP request Tutorial");
+ // Tutorial code goes here
+ }
+ }
+}
+
+```
+## Create a request message
+
+For this example, we'll sign a request to create a new identity by using the Communication Services Authentication API (version `2020-07-20-preview2`).
+
+Add the following code to the `Main` method:
+
+```csharp
+string resourceEndpoint = "resourceEndpoint";
+//Create the URI you are going to call
+var requestUri = new Uri($"{resourceEndpoint}/identities?api-version=2020-07-20-preview2");
+//Endpoint identities?api-version=2020-07-20-preview2 accepts a list of scopes as the body
+var body = new[] { "chat" };
+var serializedBody = JsonConvert.SerializeObject(body);
+var requestMessage = new HttpRequestMessage(HttpMethod.Post, requestUri)
+{
+ Content = new StringContent(serializedBody, Encoding.UTF8)
+};
+```
+
+Replace `resourceEndpoint` with your real resource endpoint value.
+
+## Create content hash
+
+The content hash is a part of your HMAC signature. Use the following code to compute the content hash. You can add this method to `Program.cs` under the `Main` method.
+
+```csharp
+static string ComputeContentHash(string content)
+{
+ using (var sha256 = SHA256.Create())
+ {
+ byte[] hashedBytes = sha256.ComputeHash(Encoding.UTF8.GetBytes(content));
+ return Convert.ToBase64String(hashedBytes);
+ }
+}
+```
+
+## Compute a signature
+Use the following code to create a method for computing your HMAC signature.
+
+```csharp
+ static string ComputeSignature(string stringToSign)
+{
+ string secret = "resourceAccessKey";
+ using (var hmacsha256 = new HMACSHA256(Convert.FromBase64String(secret)))
+ {
+ var bytes = Encoding.ASCII.GetBytes(stringToSign);
+ var hashedBytes = hmacsha256.ComputeHash(bytes);
+ return Convert.ToBase64String(hashedBytes);
+ }
+}
+```
+
+Replace `resourceAccessKey` with the access key of your real Azure Communication Services resource.
+
+## Create an authorization header string
+
+We'll now construct the string we'll add to our authorization header:
+
+1. Compute a content hash
+2. Specify the Coordinated Universal Time (UTC) timestamp
+3. Prepare a string to sign
+4. Compute the signature
+5. Concatenate the string, which will be used in the authorization header
+
+Add the following code to the `Main` method:
+
+```csharp
+// Compute a content hash
+var contentHash = ComputeContentHash(serializedBody);
+//Specify the Coordinated Universal Time (UTC) timestamp
+var date = DateTimeOffset.UtcNow.ToString("r", CultureInfo.InvariantCulture);
+//Prepare a string to sign
+var stringToSign = $"POST\n{requestUri.PathAndQuery}\n{date};{requestUri.Authority};{contentHash}";
+//Compute the signature
+var signature = ComputeSignature(stringToSign);
+//Concatenate the string, which will be used in the authorization header
+var authorizationHeader = $"HMAC-SHA256 SignedHeaders=date;host;x-ms-content-sha256&Signature={signature}";
+```
+
+## Add headers to requestMessage
+
+Use the following code to add the required headers to your `requestMessage`:
+
+```csharp
+//Add content hash header
+requestMessage.Headers.Add("x-ms-content-sha256", contentHash);
+//add date header
+requestMessage.Headers.Add("Date", date);
+//add Authorization header
+requestMessage.Headers.Add("Authorization", authorizationHeader);
+```
+
+## Test the client
+Call the endpoint using `HttpClient` and check the response.
+
+```csharp
+HttpClient httpClient = new HttpClient
+{
+ BaseAddress = requestUri
+};
+var response = await httpClient.SendAsync(requestMessage);
+var responseString = await response.Content.ReadAsStringAsync();
+Console.WriteLine(responseString);
+```
cosmos-db https://docs.microsoft.com/en-us/azure/cosmos-db/account-databases-containers-items https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/account-databases-containers-items.md
@@ -15,13 +15,13 @@
Azure Cosmos DB is a fully managed platform-as-a-service (PaaS). To begin using Azure Cosmos DB, you should initially create an Azure Cosmos account in your Azure subscription and databases, containers, items under it. This article describes the Azure Cosmos DB resource model and different entities in the resource model hierarchy.
-The Azure Cosmos account is the fundamental unit of global distribution and high availability. Your Azure Cosmos account contains a unique DNS name and you can manage an account by using Azure portal, Azure CLI or by using different language-specific SDKs. For more information, see [how to manage your Azure Cosmos account](how-to-manage-database-account.md). For globally distributing your data and throughput across multiple Azure regions, you can add and remove Azure regions to your account at any time. You can configure your account to have either a single region or multiple write regions. For more information, see [how to add and remove Azure regions to your account](how-to-manage-database-account.md). You can configure the [default consistency](consistency-levels.md) level on an account.
+The Azure Cosmos account is the fundamental unit of global distribution and high availability. Your Azure Cosmos account contains a unique DNS name and you can manage an account by using the Azure portal or the Azure CLI, or by using different language-specific SDKs. For more information, see [how to manage your Azure Cosmos account](how-to-manage-database-account.md). For globally distributing your data and throughput across multiple Azure regions, you can add and remove Azure regions to your account at any time. You can configure your account to have either a single region or multiple write regions. For more information, see [how to add and remove Azure regions to your account](how-to-manage-database-account.md). You can configure the [default consistency](consistency-levels.md) level on an account.
## Elements in an Azure Cosmos account
-Azure Cosmos container is the fundamental unit of scalability. You can virtually have an unlimited provisioned throughput (RU/s) and storage on a container. Azure Cosmos DB transparently partitions your container using the logical partition key that you specify in order to elastically scale your provisioned throughput and storage.
+An Azure Cosmos container is the fundamental unit of scalability. You can virtually have an unlimited provisioned throughput (RU/s) and storage on a container. Azure Cosmos DB transparently partitions your container using the logical partition key that you specify in order to elastically scale your provisioned throughput and storage.
-Currently, you can create a maximum of 50 Azure Cosmos accounts under an Azure subscription (this is a soft limit that can be increased via support request). A single Azure Cosmos account can virtually manage unlimited amount of data and provisioned throughput. To manage your data and provisioned throughput, you can create one or more Azure Cosmos databases under your account and within that database, you can create one or more containers. The following image shows the hierarchy of elements in an Azure Cosmos account:
+Currently, you can create a maximum of 50 Azure Cosmos accounts under an Azure subscription (this is a soft limit that can be increased via support request). A single Azure Cosmos account can virtually manage an unlimited amount of data and provisioned throughput. To manage your data and provisioned throughput, you can create one or more Azure Cosmos databases under your account and within that database, you can create one or more containers. The following image shows the hierarchy of elements in an Azure Cosmos account:
:::image type="content" source="./media/account-databases-containers-items/hierarchy.png" alt-text="Hierarchy of an Azure Cosmos account" border="false":::
cosmos-db https://docs.microsoft.com/en-us/azure/cosmos-db/high-availability https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/high-availability.md
@@ -4,7 +4,7 @@ description: This article describes how Azure Cosmos DB provides high availabili
Previously updated : 01/18/2021 Last updated : 02/05/2021
@@ -76,7 +76,7 @@ For the rare cases of regional outage, Azure Cosmos DB makes sure your database
* During a read region outage, Azure Cosmos accounts using any consistency level or strong consistency with three or more read regions will remain highly available for reads and writes.
-* Azure Cosmos accounts using strong consistency with two or fewer read regions (which includes the read & write region) will lose read write availability during a read region outage.
+* Azure Cosmos accounts using strong consistency with three or fewer total regions (one write, two read) will lose write availability during a read region outage. However, customers with four or more total regions can opt-in to using dynamic read quorums by submitting a support ticket. Accounts which maintain at least two read regions in this configuration will maintain write availability.
* The impacted region is automatically disconnected and will be marked offline. The [Azure Cosmos DB SDKs](sql-api-sdk-dotnet.md) will redirect read calls to the next available region in the preferred region list.
cosmos-db https://docs.microsoft.com/en-us/azure/cosmos-db/partitioning-overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/partitioning-overview.md
@@ -38,7 +38,7 @@ The number of physical partitions in your container depends on the following:
* The total data storage (each individual physical partition can store up to 50GB data). > [!NOTE]
-> Physical partitions are an internal implementation of the system and they are entirely managed by Azure Cosmos DB. When developing your solutions, don't focus on physical partitions because you can't control them instead focus on your partition keys. If you choose a partition key that evenly distributes throughput consumption across logical partitions, you will ensure that throughput consumption across physical partitions is balanced.
+> Physical partitions are an internal implementation of the system and they are entirely managed by Azure Cosmos DB. When developing your solutions, don't focus on physical partitions because you can't control them. Instead, focus on your partition keys. If you choose a partition key that evenly distributes throughput consumption across logical partitions, you will ensure that throughput consumption across physical partitions is balanced.
There is no limit to the total number of physical partitions in your container. As your provisioned throughput or data size grows, Azure Cosmos DB will automatically create new physical partitions by splitting existing ones. Physical partition splits do not impact your application's availability. After the physical partition split, all data within a single logical partition will still be stored on the same physical partition. A physical partition split simply creates a new mapping of logical partitions to physical partitions.
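The logical-to-physical mapping described above can be sketched conceptually, a hash of the partition key selecting a physical partition. This is illustrative Python only; Cosmos DB's actual hash and mapping algorithm is internal and not this code:

```python
import hashlib

def physical_partition(partition_key: str, physical_count: int) -> int:
    """Illustrative only: hash the logical partition key to pick a
    physical partition (Cosmos DB's real algorithm is internal)."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).hexdigest()
    return int(digest, 16) % physical_count

# All items with the same partition key land on the same physical
# partition, so a well-distributed key spreads throughput evenly.
buckets = {physical_partition(f"user-{i}", 4) for i in range(1000)}
print(sorted(buckets))  # every physical partition receives data
```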
@@ -66,7 +66,7 @@ You can learn more about [how Azure Cosmos DB manages partitions](partitioning-o
Each physical partition consists of a set of replicas, also referred to as a [*replica set*](global-dist-under-the-hood.md). Each replica set hosts an instance of the database engine. A replica set makes the data stored within the physical partition durable, highly available, and consistent. Each replica that makes up the physical partition inherits the partition's storage quota. All replicas of a physical partition collectively support the throughput that's allocated to the physical partition. Azure Cosmos DB automatically manages replica sets.
-Typically smaller containers only require a single physical partition but they will still have at least 4 replicas.
+Typically, smaller containers only require a single physical partition, but they will still have at least 4 replicas.
The following image shows how logical partitions are mapped to physical partitions that are distributed globally:
cosmos-db https://docs.microsoft.com/en-us/azure/cosmos-db/synapse-link-frequently-asked-questions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/synapse-link-frequently-asked-questions.md
@@ -105,9 +105,9 @@ Currently Terraform doesn't support analytical store containers. Please check
## Analytical Time to live (TTL)
-### Is TTl for analytical data supported at both container and item level?
+### Is TTL for analytical data supported at both container and item level?
-At this time, TTl for analytical data can only be configured at container level and there is no support to set analytical TTL at item level.
+At this time, TTL for analytical data can only be configured at container level and there is no support to set analytical TTL at item level.
### After setting the container level analytical TTL on an Azure Cosmos DB container, can I change to a different value later?
cost-management-billing https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/understand-mca-roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/understand-mca-roles.md
@@ -5,7 +5,7 @@
Previously updated : 12/17/2020 Last updated : 02/05/2021
@@ -95,6 +95,8 @@ The following tables show what role you need to complete tasks in the context of
|Update billing profile properties |✔|✔|✘|✘|✘|✘|✘| |View policies applied on the billing profile like enable Azure reservation purchases, enable Azure marketplace purchases, and more|✔|✔|✔|✔|✔|✔|✔| |Apply policies on the billing profile |✔|✔|✘|✘|✘|✘|✘|
+|Manage reservation orders |✔|✔|✘|✘|✘|✘|✘|
+|View reservation orders |✔|✔|✔|✘|✘|✘|✘|
### Manage invoices for billing profile
data-factory https://docs.microsoft.com/en-us/azure/data-factory/concepts-data-flow-expression-builder https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-expression-builder.md
@@ -6,7 +6,7 @@
Previously updated : 10/30/2020 Last updated : 02/04/2021 # Build expressions in mapping data flow
@@ -102,6 +102,9 @@ Some examples of string interpolation:
* ```"{:playerName} is a {:playerRating} player"```
+> [!NOTE]
+> When using string interpolation syntax in SQL source queries, the query string must be on one single line, without '\n'.
+ ## Commenting expressions Add comments to your expressions by using single-line and multiline comment syntax.
@@ -167,4 +170,4 @@ toLong(
## Next steps
-[Begin building data transformation expressions](data-flow-expression-functions.md)
+[Begin building data transformation expressions](data-flow-expression-functions.md)
data-factory https://docs.microsoft.com/en-us/azure/data-factory/data-flow-expression-functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-expression-functions.md
@@ -6,7 +6,7 @@
Previously updated : 01/06/2021 Last updated : 02/04/2021 # Data transformation expressions in mapping data flow
@@ -72,6 +72,54 @@ ___
Returns the angle in radians between the positive x-axis of a plane and the point given by the coordinates. * ``atan2(0, 0) -> 0.0`` ___
+### <code>between</code>
+<code><b>between(<i>&lt;value1&gt;</i> : any, <i>&lt;value2&gt;</i> : any, <i>&lt;value3&gt;</i> : any) => boolean</b></code><br/><br/>
+Checks if the first value is between the other two values, inclusive. Numeric, string, and datetime values can be compared.
+* ``between(10, 5, 24)``
+* ``true``
+* ``between(currentDate(), currentDate() + 10, currentDate() + 20)``
+* ``false``
+___
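The inclusive-on-both-ends semantics of `between` can be sketched in Python (a hypothetical stand-in, not ADF code; the `between` helper name mirrors the data flow function):

```python
from datetime import date, timedelta

def between(value, low, high):
    # Mirrors the data flow between() semantics: inclusive on both ends.
    return low <= value <= high

today = date.today()
print(between(10, 5, 24))   # True, as in the documented example
print(between(today, today + timedelta(days=10), today + timedelta(days=20)))  # False
```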
+### <code>bitwiseAnd</code>
+<code><b>bitwiseAnd(<i>&lt;value1&gt;</i> : integral, <i>&lt;value2&gt;</i> : integral) => integral</b></code><br/><br/>
+Bitwise And operator across integral types. Same as the & operator.
+* ``bitwiseAnd(0xf4, 0xef)``
+* ``0xe4``
+* ``(0xf4 & 0xef)``
+* ``0xe4``
+___
+### <code>bitwiseOr</code>
+<code><b>bitwiseOr(<i>&lt;value1&gt;</i> : integral, <i>&lt;value2&gt;</i> : integral) => integral</b></code><br/><br/>
+Bitwise Or operator across integral types. Same as the | operator.
+* ``bitwiseOr(0xf4, 0xef)``
+* ``0xff``
+* ``(0xf4 | 0xef)``
+* ``0xff``
+___
+### <code>bitwiseXor</code>
+<code><b>bitwiseXor(<i>&lt;value1&gt;</i> : any, <i>&lt;value2&gt;</i> : any) => any</b></code><br/><br/>
+Bitwise Xor operator across integral types. Same as the ^ operator.
+* ``bitwiseXor(0xf4, 0xef)``
+* ``0x1b``
+* ``(0xf4 ^ 0xef)``
+* ``0x1b``
+* ``(true ^ false)``
+* ``true``
+* ``(true ^ true)``
+* ``false``
+___
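Since these three functions map directly onto the native `&`, `|`, and `^` operators for integral types, the documented results can be checked with Python's bitwise operators:

```python
# Verify the documented results using Python's native bitwise operators,
# which the data flow &, |, and ^ operators correspond to.
assert 0xf4 & 0xef == 0xe4   # bitwiseAnd(0xf4, 0xef)
assert 0xf4 | 0xef == 0xff   # bitwiseOr(0xf4, 0xef)
assert 0xf4 ^ 0xef == 0x1b   # bitwiseXor(0xf4, 0xef)
assert (True ^ False) is True and (True ^ True) is False  # boolean xor
print("all bitwise examples check out")
```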
+### <code>blake2b</code>
+<code><b>blake2b(<i>&lt;value1&gt;</i> : integer, <i>&lt;value2&gt;</i> : any, ...) => string</b></code><br/><br/>
+Calculates the Blake2 digest of a set of columns of varying primitive datatypes, given a bit length that can only be a multiple of 8 between 8 and 512. It can be used to calculate a fingerprint for a row.
+* ``blake2b(256, 'gunchus', 8.2, 'bojjus', true, toDate('2010-4-4'))``
+* ``'c9521a5080d8da30dffb430c50ce253c345cc4c4effc315dab2162dac974711d'``
+___
+### <code>blake2bBinary</code>
+<code><b>blake2bBinary(<i>&lt;value1&gt;</i> : integer, <i>&lt;value2&gt;</i> : any, ...) => binary</b></code><br/><br/>
+Calculates the Blake2 digest of a set of columns of varying primitive datatypes, given a bit length that can only be a multiple of 8 between 8 and 512. It can be used to calculate a fingerprint for a row.
+* ``blake2bBinary(256, 'gunchus', 8.2, 'bojjus', true, toDate('2010-4-4'))``
+* ``unHex('c9521a5080d8da30dffb430c50ce253c345cc4c4effc315dab2162dac974711d')``
+___
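A row fingerprint along these lines can be sketched with Python's `hashlib.blake2b`. Note this is illustrative only: how ADF serializes the column values before hashing is internal, so the delimiter-joined encoding below (and therefore the resulting digest) will not match ADF's output byte for byte.

```python
import hashlib

def row_fingerprint(bits: int, *columns) -> str:
    """Sketch of a Blake2b row fingerprint. The bit length must be a
    multiple of 8 between 8 and 512, as with the data flow blake2b()."""
    if bits % 8 or not 8 <= bits <= 512:
        raise ValueError("bit length must be a multiple of 8 in [8, 512]")
    # Assumed serialization: stringify each column and join with '|'.
    payload = "|".join(str(c) for c in columns).encode("utf-8")
    return hashlib.blake2b(payload, digest_size=bits // 8).hexdigest()

fp = row_fingerprint(256, "gunchus", 8.2, "bojjus", True, "2010-04-04")
print(len(fp))  # 64 hex characters for a 256-bit digest
```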
### <code>case</code> <code><b>case(<i>&lt;condition&gt;</i> : boolean, <i>&lt;true_expression&gt;</i> : any, <i>&lt;false_expression&gt;</i> : any, ...) => any</b></code><br/><br/> Based on alternating conditions applies one value or the other. If the number of inputs are even, the other is defaulted to NULL for last condition.
@@ -227,6 +275,10 @@ Comparison equals operator ignoring case. Same as <=> operator.
* ``'abc'<=>'Abc' -> true`` * ``equalsIgnoreCase('abc', 'Abc') -> true`` ___
+### <code>escape</code>
+<code><b>escape(<i>&lt;string_to_escape&gt;</i> : string, <i>&lt;format&gt;</i> : string) => string</b></code><br/><br/>
+Escapes a string according to a format. Literal values for acceptable format are 'json', 'xml', 'ecmascript', 'html', 'java'.
+___
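Two of the `escape` formats behave like standard-library escaping in Python. The `escape` helper below is a hypothetical stand-in, not the ADF implementation, covering only the 'json' and 'html' formats:

```python
import html
import json

def escape(text: str, fmt: str) -> str:
    # Hypothetical stand-in for the data flow escape() function.
    if fmt == 'json':
        return json.dumps(text)[1:-1]  # drop the surrounding quotes
    if fmt == 'html':
        return html.escape(text)
    raise ValueError(f"unsupported format: {fmt}")

print(escape('He said "hi"', 'json'))  # He said \"hi\"
print(escape('<b>5 & 6</b>', 'html'))  # &lt;b&gt;5 &amp; 6&lt;/b&gt;
```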
### <code>factorial</code> <code><b>factorial(<i>&lt;value1&gt;</i> : number) => long</b></code><br/><br/> Calculates the factorial of a number.
@@ -766,6 +818,12 @@ Matches the type of the column. Can only be used in pattern expressions.number m
* ``typeMatch(type, 'number')`` * ``typeMatch('date', 'datetime')`` ___
+### <code>unescape</code>
+<code><b>unescape(<i>&lt;string_to_escape&gt;</i> : string, <i>&lt;format&gt;</i> : string) => string</b></code><br/><br/>
+Unescapes a string according to a format. Literal values for acceptable format are 'json', 'xml', 'ecmascript', 'html', 'java'.
+* ```unescape('{\\\\\"value\\\\\": 10}', 'json')```
+* ```'{\\\"value\\\": 10}'```
+___
### <code>upper</code> <code><b>upper(<i>&lt;value1&gt;</i> : string) => string</b></code><br/><br/> Uppercases a string.
@@ -1085,6 +1143,30 @@ Sorts the array using the provided predicate function. Sort expects a reference
* ``sort([4, 8, 2, 3], compare(#item1, #item2)) -> [2, 3, 4, 8]`` * ``sort(['a3', 'b2', 'c1'], iif(right(#item1, 1) >= right(#item2, 1), 1, -1)) -> ['c1', 'b2', 'a3']``
+## Cached lookup functions
+The following functions are only available when you've included a cached sink and reference it with a cached lookup.
+___
+### <code>lookup</code>
+<code><b>lookup(key, key2, ...) => complex[]</b></code><br/><br/>
+Looks up the first row from the cached sink using the specified keys that match the keys from the cached sink.
+* ``cacheSink#lookup(movieId)``
+___
+### <code>mlookup</code>
+<code><b>mlookup(key, key2, ...) => complex[]</b></code><br/><br/>
+Looks up all matching rows from the cached sink using the specified keys that match the keys from the cached sink.
+* ``cacheSink#mlookup(movieId)``
+___
+### <code>output</code>
+<code><b>output() => any</b></code><br/><br/>
+Returns the first row of the results of the cache sink
+* ``cacheSink#output()``
+___
+### <code>outputs</code>
+<code><b>outputs() => any</b></code><br/><br/>
+Returns the entire output row set of the results of the cache sink
+* ``cacheSink#outputs()``
+___
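The semantics of these four functions can be modeled conceptually in Python, with the cached sink represented as a plain list of rows (illustrative only; row and key names are invented for the sketch):

```python
# Conceptual model of the cached lookup functions over a cached sink.
cache_sink = [
    {"movieId": 1, "title": "Alien"},
    {"movieId": 2, "title": "Arrival"},
    {"movieId": 1, "title": "Aliens"},
]

def lookup(movie_id):
    # cacheSink#lookup(movieId): first row whose keys match
    return next((r for r in cache_sink if r["movieId"] == movie_id), None)

def mlookup(movie_id):
    # cacheSink#mlookup(movieId): all rows whose keys match
    return [r for r in cache_sink if r["movieId"] == movie_id]

def output():
    # cacheSink#output(): first row of the cache sink
    return cache_sink[0]

def outputs():
    # cacheSink#outputs(): the entire cached row set
    return cache_sink

print(lookup(1)["title"])  # Alien
print(len(mlookup(1)))     # 2
```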
+ ## Conversion functions
@@ -1247,30 +1329,6 @@ Selects a column value by its relative position(1 based) in the stream. If the p
* ``toString(byName($colName))`` * ``toString(byPosition(1234))``
-## Cached lookup functions
-The following functions are only available when using a cached lookup when you've included a cached sink.
-___
-### <code>lookup</code>
-<code><b>lookup(key, key2, ...) => complex[]</b></code><br/><br/>
-Looks up the first row from the cached sink using the specified keys that match the keys from the cached sink.
-* ``cacheSink#lookup(movieId)``
-___
-### <code>mlookup</code>
-<code><b>mlookup(key, key2, ...) => complex[]</b></code><br/><br/>
-Looks up the all matching rows from the cached sink using the specified keys that match the keys from the cached sink.
-* ``cacheSink#mlookup(movieId)``
-___
-### <code>output</code>
-<code><b>output() => any</b></code><br/><br/>
-Returns the first row of the results of the cache sink
-* ``cacheSink#output()``
-___
-### <code>outputs</code>
-<code><b>output() => any</b></code><br/><br/>
-Returns the entire output row set of the results of the cache sink
-* ``cacheSink#outputs()``
-___
- ## Window functions The following functions are only available in window transformations. ___
data-factory https://docs.microsoft.com/en-us/azure/data-factory/format-common-data-model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-common-data-model.md
@@ -5,7 +5,7 @@
Previously updated : 12/07/2020 Last updated : 02/04/2021
@@ -80,6 +80,7 @@ When mapping data flow columns to entity properties in the Sink transformation,
2. Find the partitions.Location property 3. Change "blob.core.windows.net" to "dfs.core.windows.net" 4. Fix any "%2F" encoding in the URL to "/"
+5. If you're using ADF data flows, special characters in the partition file path must be replaced with alphanumeric values, or switch to Synapse data flows.
### CDM source data flow script example
data-factory https://docs.microsoft.com/en-us/azure/data-factory/security-and-access-control-troubleshoot-guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/security-and-access-control-troubleshoot-guide.md
@@ -5,7 +5,7 @@
Previously updated : 01/05/2021 Last updated : 02/04/2021
@@ -83,9 +83,10 @@ To verify whether the Data Factory fully qualified domain name (FQDN) is resolve
#### Resolution To resolve the issue, do the following:-- Refer to the [Azure Private Link for Azure Data Factory](./data-factory-private-link.md#dns-changes-for-private-endpoints) article. The instruction is for configuring the private DNS zone or server to resolve the Data Factory FQDN to a private IP address. -- We recommend using a custom DNS as the long-term solution. However, if you don't want to configure the private DNS zone or server, try the following temporary solution:
+- As an option, we suggest that you manually add a "Virtual Network link" under the Data Factory "Private link DNS Zone". For details, refer to the [Azure Private Link for Azure Data Factory](./data-factory-private-link.md#dns-changes-for-private-endpoints) article. The instructions are for configuring the private DNS zone or custom DNS server to resolve the Data Factory FQDN to a private IP address.
+
+- However, if you don't want to configure the private DNS zone or custom DNS server, try the following temporary solution:
1. Change the host file in Windows, and map the private IP (the Azure Data Factory private endpoint) to the Azure Data Factory FQDN.
databox-online https://docs.microsoft.com/en-us/azure/databox-online/azure-stack-edge-gpu-deploy-configure-network-compute-web-proxy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-deploy-configure-network-compute-web-proxy.md
@@ -7,7 +7,7 @@
Previously updated : 01/27/2021 Last updated : 02/04/2021 Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro so I can use it to transfer data to Azure.
@@ -51,15 +51,13 @@ Follow these steps to configure the network for your device.
![Local web UI "Network settings" page](./media/azure-stack-edge-gpu-deploy-configure-network-compute-web-proxy/network-2a.png) -
-
3. To change the network settings, select a port and in the right pane that appears, modify the IP address, subnet, gateway, primary DNS, and secondary DNS. - If you select Port 1, you can see that it is preconfigured as static. ![Local web UI "Port 1 Network settings"](./media/azure-stack-edge-gpu-deploy-configure-network-compute-web-proxy/network-3.png)
- - If you select Port 2, Port 3, Port 4 or Port 5, all of these ports are configured as DHCP by default.
+ - If you select Port 2, Port 3, Port 4, or Port 5, all of these ports are configured as DHCP by default.
![Local web UI "Port 3 Network settings"](./media/azure-stack-edge-gpu-deploy-configure-network-compute-web-proxy/network-4.png)
@@ -68,7 +66,8 @@ Follow these steps to configure the network for your device.
* If DHCP is enabled in your environment, network interfaces are automatically configured. An IP address, subnet, gateway, and DNS are automatically assigned. * If DHCP isn't enabled, you can assign static IPs if needed. * You can configure your network interface as IPv4.
- * On the 25-Gbps interfaces, you can set the RDMA (Remote Direct Access Memory) mode to iWarp or RoCE (RDMA over Converged Ethernet). Where low latencies are the primary requirement and scalability is not a concern, use RoCE. When latency is a key requirement, but ease-of-use and scalability are also high priorities, iWARP is the best candidate.
+ * On 25-Gbps interfaces, you can set the RDMA (Remote Direct Memory Access) mode to iWARP or RoCE (RDMA over Converged Ethernet). Where low latencies are the primary requirement and scalability is not a concern, use RoCE. When latency is a key requirement, but ease-of-use and scalability are also high priorities, iWARP is the best candidate.
+ * Network Interface Card (NIC) Teaming or link aggregation is not supported with Azure Stack Edge.
* Serial number for any port corresponds to the node serial number. Once the device network is configured, the page updates as shown below.
@@ -76,12 +75,11 @@ Follow these steps to configure the network for your device.
![Local web UI "Network settings" page 2](./media/azure-stack-edge-gpu-deploy-configure-network-compute-web-proxy/network-2.png)
- >[!NOTE]
- >
- > * We recommend that you do not switch the local IP address of the network interface from static to DCHP, unless you have another IP address to connect to the device. If using one network interface and you switch to DHCP, there would be no way to determine the DHCP address. If you want to change to a DHCP address, wait until after the device has activated with the service, and then change. You can then view the IPs of all the adapters in the **Device properties** in the Azure portal for your service.
+ > [!NOTE]
> We recommend that you do not switch the local IP address of the network interface from static to DHCP, unless you have another IP address to connect to the device. If using one network interface and you switch to DHCP, there would be no way to determine the DHCP address. If you want to change to a DHCP address, wait until after the device has activated with the service, and then change. You can then view the IPs of all the adapters in the **Device properties** in the Azure portal for your service.
- After you have configured and applied the network settings, select Next: Compute to configure compute network.
+ After you have configured and applied the network settings, select **Next: Compute** to configure compute network.
## Enable compute network
databox-online https://docs.microsoft.com/en-us/azure/databox-online/azure-stack-edge-mini-r-deploy-configure-network-compute-web-proxy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-mini-r-deploy-configure-network-compute-web-proxy.md
@@ -7,7 +7,7 @@
Previously updated : 10/14/2020 Last updated : 02/04/2021 Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Mini R so I can use it to transfer data to Azure.
@@ -103,7 +103,7 @@ Follow these steps to configure the network for your device.
![Local web UI "Port Wi-Fi Network settings" 4](./media/azure-stack-edge-mini-r-deploy-configure-network-compute-web-proxy/add-wifi-profile-4.png)
-6. Select the Wi-Fi profile that you added in the previous step and select **Apply**.
+6. Select the Wi-Fi profile that you added in the previous step, and select **Apply**.
![Local web UI "Port Wi-Fi Network settings" 5](./media/azure-stack-edge-mini-r-deploy-configure-network-compute-web-proxy/add-wifi-profile-5.png)
@@ -121,6 +121,7 @@ Follow these steps to configure the network for your device.
- If DHCP is enabled in your environment, network interfaces are automatically configured. An IP address, subnet, gateway, and DNS are automatically assigned. - If DHCP isn't enabled, you can assign static IPs if needed. - You can configure your network interface as IPv4.
+ - Network Interface Card (NIC) Teaming or link aggregation is not supported with Azure Stack Edge.
- Serial number for any port corresponds to the node serial number. For a K-series device, only one serial number is displayed. >[!NOTE]
@@ -147,7 +148,7 @@ Follow these steps to enable compute and configure compute network.
> Kubernetes on Azure Stack Edge uses 172.27.0.0/16 subnet for pod and 172.28.0.0/16 subnet for service. Make sure that these are not in use in your network. If these subnets are already in use in your network, you can change these subnets by running the `Set-HcsKubeClusterNetworkInfo` cmdlet from the PowerShell interface of the device. For more information, see [Change Kubernetes pod and service subnets](azure-stack-edge-gpu-connect-powershell-interface.md#change-kubernetes-pod-and-service-subnets).
-1. Assign **Kubernetes external service IPs**. These are also the load balancing IP addresses. These contiguous IP addresses are for services that you want to expose outside of the Kubernetes cluster and you specify the static IP range depending on the number of services exposed.
+1. Assign **Kubernetes external service IPs**. These are also the load-balancing IP addresses. These contiguous IP addresses are for services that you want to expose outside the Kubernetes cluster, and you specify the static IP range depending on the number of services exposed.
> [!IMPORTANT] > We strongly recommend that you specify a minimum of 1 IP address for Azure Stack Edge Mini R Hub service to access compute modules. You can then optionally specify additional IP addresses for other services/IoT Edge modules (1 per service/module) that need to be accessed from outside the cluster. The service IP addresses can be updated later.
@@ -156,7 +157,7 @@ Follow these steps to enable compute and configure compute network.
![Compute page in local UI 3](./media/azure-stack-edge-mini-r-deploy-configure-network-compute-web-proxy/compute-network-3.png)
-1. The configuration is takes a couple minutes to apply and you may need to refresh the browser. You can see that the specified port is enabled for compute.
+1. The configuration takes a couple minutes to apply and you may need to refresh the browser. You can see that the specified port is enabled for compute.
![Compute page in local UI 4](./media/azure-stack-edge-mini-r-deploy-configure-network-compute-web-proxy/compute-network-4.png)
databox-online https://docs.microsoft.com/en-us/azure/databox-online/azure-stack-edge-pro-r-deploy-configure-network-compute-web-proxy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-pro-r-deploy-configure-network-compute-web-proxy.md
@@ -7,7 +7,7 @@
Previously updated : 10/15/2020 Last updated : 02/04/2021 Customer intent: As an IT admin, I need to understand how to connect and activate Azure Stack Edge Pro R so I can use it to transfer data to Azure.
@@ -67,6 +67,7 @@ Follow these steps to configure the network for your device.
* If DHCP is enabled in your environment, network interfaces are automatically configured. An IP address, subnet, gateway, and DNS are automatically assigned. * If DHCP isn't enabled, you can assign static IPs if needed. * You can configure your network interface as IPv4.
+ * Network Interface Card (NIC) Teaming or link aggregation is not supported with Azure Stack Edge.
* Serial number for any port corresponds to the node serial number. <!--* On the 25-Gbps interfaces, you can set the RDMA (Remote Direct Access Memory) mode to iWarp or RoCE (RDMA over Converged Ethernet). Where low latencies are the primary requirement and scalability is not a concern, use RoCE. When latency is a key requirement, but ease-of-use and scalability are also high priorities, iWARP is the best candidate.--> Once the device network is configured, the page updates as shown below.
ddos-protection https://docs.microsoft.com/en-us/azure/ddos-protection/ddos-protection-overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/ddos-protection/ddos-protection-overview.md
@@ -47,6 +47,10 @@ Under a tenant, a single DDoS protection plan can be used across multiple subscr
To learn about Azure DDoS Protection Standard pricing, see [Azure DDoS Protection Standard pricing](https://azure.microsoft.com/pricing/details/ddos-protection/).
+## Reference architectures
+
+DDoS Protection Standard is designed for [services that are deployed in a virtual network](https://docs.microsoft.com/azure/virtual-network/virtual-network-for-azure-services). For other services, the default DDoS Protection Basic service applies. To learn more about supported architectures, see [DDoS Protection reference architectures](https://docs.microsoft.com/azure/ddos-protection/ddos-protection-reference-architectures).
+ ## Next steps > [!div class="nextstepaction"]
governance https://docs.microsoft.com/en-us/azure/governance/azure-management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/azure-management.md
@@ -1,7 +1,7 @@
Title: Azure Management Overview - Azure Governance description: Overview of the areas of management for Azure applications and resources with links to content on Azure management tools. Previously updated : 09/09/2020 Last updated : 02/05/2021 # What are the Azure Management areas?
@@ -47,7 +47,7 @@ Configure refers to the initial deployment and configuration of resources and on
Automation of these tasks allows you to eliminate redundancy, minimizing your time and effort and increasing your accuracy and efficiency. [Azure Automation](../automation/automation-intro.md) provides the bulk of services for automating configuration tasks. While runbooks handle process
-automation, configuration and update management assist in managing configuration.
+automation, configuration and update management help manage configuration.
## Govern
@@ -64,8 +64,8 @@ to track cloud usage and expenditures for your Azure resources and other cloud p
Manage the security of your resources and data. A security program involves assessing threats, collecting and analyzing data, and compliance of your applications and resources. Security monitoring and threat analysis are provided by [Azure Security
-Center](../security-center/security-center-introduction.md), which includes unified security management and
-advanced threat protection across hybrid cloud workloads. See [Introduction to Azure
+Center](../security-center/security-center-introduction.md), which includes unified security
+management and advanced threat protection across hybrid cloud workloads. See [Introduction to Azure
Security](../security/fundamentals/overview.md) for comprehensive information and guidance on securing Azure resources.
@@ -81,10 +81,10 @@ recovery during a disaster.
## Migrate Migration refers to transitioning workloads currently running on-premises to the Azure cloud.
-[Azure Migrate](../migrate/migrate-services-overview.md) is a service that helps you assess the migration
-suitability of on-premises virtual machines to Azure. Azure Site Recovery migrates virtual machines
-[from on-premises](../site-recovery/migrate-tutorial-on-premises-azure.md) or [from Amazon Web
-Services](../site-recovery/migrate-tutorial-aws-azure.md). [Azure Database
+[Azure Migrate](../migrate/migrate-services-overview.md) is a service that helps you assess the
+migration suitability of on-premises virtual machines to Azure. Azure Site Recovery migrates virtual
+machines [from on-premises](../site-recovery/migrate-tutorial-on-premises-azure.md) or [from Amazon
+Web Services](../site-recovery/migrate-tutorial-aws-azure.md). [Azure Database
Migration](../dms/dms-overview.md) assists you in migrating database sources to Azure Data platforms.
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/how-to/configure-for-blueprint-operator https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/how-to/configure-for-blueprint-operator.md
@@ -1,7 +1,7 @@
Title: Set up your environment for Blueprint Operator description: Learn how to configure your Azure environment for use with the Blueprint Operator Azure built-in role. Previously updated : 11/24/2020 Last updated : 02/05/2021 # Configure your environment for a Blueprint Operator
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/canada-federal-pbmm/control-mapping https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/canada-federal-pbmm/control-mapping.md
@@ -1,7 +1,7 @@
Title: Canada Federal PBMM blueprint sample controls description: Control mapping of the Canada Federal PBMM blueprint samples. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 11/05/2020 Last updated : 02/05/2021 # Control mapping of the Canada Federal PBMM blueprint sample
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/canada-federal-pbmm/deploy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/canada-federal-pbmm/deploy.md
@@ -1,7 +1,7 @@
Title: Deploy Canada Federal PBMM blueprint sample description: Deploy steps for the Canada Federal PBMM blueprint sample including blueprint artifact parameter details. Previously updated : 11/02/2020 Last updated : 02/05/2021 # Deploy the Canada Federal PBMM blueprint samples
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/canada-federal-pbmm/index https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/canada-federal-pbmm/index.md
@@ -1,7 +1,7 @@
Title: Canada Federal PBMM blueprint sample overview description: Overview of the Canada Federal PBMM blueprint sample. This blueprint sample helps customers assess specific Canada Federal PBMM controls. Previously updated : 11/02/2020 Last updated : 02/05/2021 # Overview of the Canada Federal PBMM blueprint sample
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/iso27001-ase-sql-workload/control-mapping https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/iso27001-ase-sql-workload/control-mapping.md
@@ -1,7 +1,7 @@
Title: ISO 27001 ASE/SQL workload blueprint sample controls description: Control mapping of the ISO 27001 App Service Environment/SQL Database workload blueprint sample to Azure Policy and Azure RBAC. Previously updated : 11/05/2020 Last updated : 02/05/2021 # Control mapping of the ISO 27001 ASE/SQL workload blueprint sample
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/iso27001-ase-sql-workload/deploy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/iso27001-ase-sql-workload/deploy.md
@@ -1,7 +1,7 @@
Title: Deploy ISO 27001 ASE/SQL workload blueprint sample description: Deploy steps of the ISO 27001 App Service Environment/SQL Database workload blueprint sample including blueprint artifact parameter details. Previously updated : 11/02/2020 Last updated : 02/05/2021 # Deploy the ISO 27001 App Service Environment/SQL Database workload blueprint sample
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/iso27001-ase-sql-workload/index https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/iso27001-ase-sql-workload/index.md
@@ -1,7 +1,7 @@
Title: ISO 27001 ASE/SQL workload blueprint sample overview description: Overview and architecture of the ISO 27001 App Service Environment/SQL Database workload blueprint sample. Previously updated : 11/02/2020 Last updated : 02/05/2021 # Overview of the ISO 27001 App Service Environment/SQL Database workload blueprint sample
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/iso27001-shared/control-mapping https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/iso27001-shared/control-mapping.md
@@ -1,7 +1,7 @@
Title: ISO 27001 Shared Services blueprint sample controls description: Control mapping of the ISO 27001 Shared Services blueprint sample. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 11/05/2020 Last updated : 02/05/2021 # Control mapping of the ISO 27001 Shared Services blueprint sample
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/iso27001-shared/deploy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/iso27001-shared/deploy.md
@@ -1,7 +1,7 @@
Title: Deploy ISO 27001 Shared Services blueprint sample description: Deploy steps for the ISO 27001 Shared Services blueprint sample including blueprint artifact parameter details. Previously updated : 11/02/2020 Last updated : 02/05/2021 # Deploy the ISO 27001 Shared Services blueprint sample
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/iso27001-shared/index https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/iso27001-shared/index.md
@@ -1,7 +1,7 @@
Title: ISO 27001 Shared Services blueprint sample overview description: Overview and architecture of the ISO 27001 Shared Services blueprint sample. This blueprint sample helps customers assess specific ISO 27001 controls. Previously updated : 11/02/2020 Last updated : 02/05/2021 # Overview of the ISO 27001 Shared Services blueprint sample
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/ukofficial/control-mapping https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/ukofficial/control-mapping.md
@@ -1,7 +1,7 @@
Title: UK OFFICIAL & UK NHS blueprint sample controls description: Control mapping of the UK OFFICIAL and UK NHS blueprint samples. Each control is mapped to one or more Azure Policy definitions that assist with assessment. Previously updated : 11/05/2020 Last updated : 02/05/2021 # Control mapping of the UK OFFICIAL and UK NHS blueprint samples
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/ukofficial/deploy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/ukofficial/deploy.md
@@ -1,7 +1,7 @@
Title: Deploy UK OFFICIAL & UK NHS blueprint samples description: Deploy steps for the UK OFFICIAL and UK NHS blueprint samples including blueprint artifact parameter details. Previously updated : 11/02/2020 Last updated : 02/05/2021 # Deploy the UK OFFICIAL and UK NHS blueprint samples
governance https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/ukofficial/index https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/blueprints/samples/ukofficial/index.md
@@ -1,7 +1,7 @@
Title: UK OFFICIAL & UK NHS blueprint sample overview description: Overview and architecture of the UK OFFICIAL and UK NHS blueprint samples. This blueprint sample helps customers assess specific controls. Previously updated : 11/02/2020 Last updated : 02/05/2021 # Overview of the UK OFFICIAL and UK NHS blueprint samples
governance https://docs.microsoft.com/en-us/azure/governance/management-groups/create-management-group-azure-cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/management-groups/create-management-group-azure-cli.md
@@ -1,7 +1,7 @@
Title: "Quickstart: Create a management group with the Azure CLI" description: In this quickstart, you use the Azure CLI to create a management group to organize your resources into a resource hierarchy. Previously updated : 08/31/2020 Last updated : 02/05/2021
governance https://docs.microsoft.com/en-us/azure/governance/management-groups/create-management-group-dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/management-groups/create-management-group-dotnet.md
@@ -1,7 +1,7 @@
Title: "Quickstart: Create a management group with .NET Core" description: In this quickstart, you use .NET Core to create a management group to organize your resources into a resource hierarchy. Previously updated : 09/30/2020 Last updated : 02/05/2021
governance https://docs.microsoft.com/en-us/azure/governance/management-groups/create-management-group-javascript https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/management-groups/create-management-group-javascript.md
@@ -1,7 +1,7 @@
Title: "Quickstart: Create a management group with JavaScript" description: In this quickstart, you use JavaScript to create a management group to organize your resources into a resource hierarchy. Previously updated : 11/18/2020 Last updated : 02/05/2021
@@ -23,7 +23,7 @@ directory. You receive a notification when the process is complete. For more inf
- If you don't have an Azure subscription, create a [free](https://azure.microsoft.com/free/) account before you begin. -- Before you start, make sure that the at least version 12 of [Node.js](https://nodejs.org/) is
+- Before you start, make sure that at least version 12 of [Node.js](https://nodejs.org/) is
installed. - Any Azure AD user in the tenant can create a management group without the management group write
governance https://docs.microsoft.com/en-us/azure/governance/management-groups/create-management-group-portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/management-groups/create-management-group-portal.md
@@ -1,7 +1,7 @@
Title: "Quickstart: Create a management group with portal" description: In this quickstart, you use Azure portal to create a management group to organize your resources into a resource hierarchy. Previously updated : 08/31/2020 Last updated : 02/05/2021 # Quickstart: Create a management group
governance https://docs.microsoft.com/en-us/azure/governance/management-groups/create-management-group-powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/management-groups/create-management-group-powershell.md
@@ -1,7 +1,7 @@
Title: "Quickstart: Create a management group with Azure PowerShell" description: In this quickstart, you use Azure PowerShell to create a management group to organize your resources into a resource hierarchy. Previously updated : 08/31/2020 Last updated : 02/05/2021
governance https://docs.microsoft.com/en-us/azure/governance/management-groups/create-management-group-rest-api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/management-groups/create-management-group-rest-api.md
@@ -1,7 +1,7 @@
Title: "Quickstart: Create a management group with REST API" description: In this quickstart, you use REST API to create a management group to organize your resources into a resource hierarchy. Previously updated : 08/31/2020 Last updated : 02/05/2021 # Quickstart: Create a management group with REST API
@@ -23,8 +23,8 @@ directory. You receive a notification when the process is complete. For more inf
account before you begin. - If you haven't already, install [ARMClient](https://github.com/projectkudu/ARMClient). It's a tool
- that sends HTTP requests to Azure Resource Manager-based REST APIs. Alternatively, you can use the
- "Try It" feature in REST documentation or tooling like PowerShell's
+ that sends HTTP requests to Azure Resource Manager-based REST APIs. Instead, you can use the "Try
+ It" feature in REST documentation or tooling like PowerShell's
[Invoke-RestMethod](/powershell/module/microsoft.powershell.utility/invoke-restmethod) or [Postman](https://www.postman.com).
governance https://docs.microsoft.com/en-us/azure/governance/management-groups/how-to/protect-resource-hierarchy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/management-groups/how-to/protect-resource-hierarchy.md
@@ -1,7 +1,7 @@
Title: How to protect your resource hierarchy - Azure Governance description: Learn how to protect your resource hierarchy with hierarchy settings that include setting the default management group. Previously updated : 09/02/2020 Last updated : 02/05/2021 # How to protect your resource hierarchy
@@ -16,8 +16,8 @@ behaviors. This article covers each of the available hierarchy settings and how
## Azure RBAC permissions for hierarchy settings
-Configuring any of the hierarchy settings requires the following two resource provider operations on the root
-management group:
+Configuring any of the hierarchy settings requires the following two resource provider operations on
+the root management group:
- `Microsoft.Management/managementgroups/settings/write` - `Microsoft.Management/managementgroups/settings/read`
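As an illustrative sketch of what a hierarchy-settings write looks like, the snippet below assembles the REST request that sets the default management group. The settings resource name (`default`), the `api-version` value, and the `defaultManagementGroup` property name are assumptions based on the Management Groups REST API, not taken from this digest; the group IDs are placeholders.

```python
import json

# Hypothetical IDs -- replace with your tenant root and target management group.
root_mg = "contoso-root"
default_mg = "contoso-landing-zone"

# Assumed shape of the hierarchy-settings write; performing it requires the
# Microsoft.Management/managementgroups/settings/write operation on the root group.
url = (
    "https://management.azure.com/providers/Microsoft.Management"
    f"/managementGroups/{root_mg}/settings/default"
    "?api-version=2020-05-01"  # assumed api-version
)
body = json.dumps({
    "properties": {
        "defaultManagementGroup":
            f"/providers/Microsoft.Management/managementGroups/{default_mg}"
    }
})
print(url)
```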
@@ -29,12 +29,12 @@ these operations are available in the Azure built-in role **Hierarchy Settings A
## Setting - Default management group By default, a new subscription added within a tenant is added as a member of the root management
-group. If policy assignments, Azure role-based access control (Azure RBAC), and other governance constructs are
-assigned to the root management group, they immediately effect these new subscriptions. For this
-reason, many organizations don't apply these constructs at the root management group even though
-that is the desired place to assign them. In other cases, a more restrictive set of controls is
-desired for new subscriptions, but shouldn't be assigned to all subscriptions. This setting supports
-both use cases.
+group. If policy assignments, Azure role-based access control (Azure RBAC), and other governance
+constructs are assigned to the root management group, they immediately affect these new
+subscriptions. For this reason, many organizations don't apply these constructs at the root
+management group even though that is the desired place to assign them. In other cases, a more
+restrictive set of controls is desired for new subscriptions, but shouldn't be assigned to all
+subscriptions. This setting supports both use cases.
By allowing the default management group for new subscriptions to be defined, organization-wide governance constructs can be applied at the root management group, and a separate management group
governance https://docs.microsoft.com/en-us/azure/governance/resource-graph/reference/supported-tables-resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/reference/supported-tables-resources.md
@@ -1,7 +1,7 @@
Title: Supported Azure Resource Manager resource types description: Provide a list of the Azure Resource Manager resource types supported by Azure Resource Graph and Change History. Previously updated : 01/06/2021 Last updated : 02/04/2021
@@ -126,6 +126,9 @@ part of a **table** in Resource Graph.
- microsoft.azurestack/linkedsubscriptions - Microsoft.Azurestack/registrations (Azure Stack Hubs) - Microsoft.AzureStackHCI/clusters (Azure Stack HCI)
+- microsoft.azurestackhci/galleryimages
+- microsoft.azurestackhci/networkinterfaces
+- microsoft.azurestackhci/virtualnetworks
- microsoft.baremetal/consoleconnections - Microsoft.BareMetal/crayServers (Cray Servers) - Microsoft.BareMetal/monitoringServers (Monitoring Servers)
@@ -327,6 +330,9 @@ part of a **table** in Resource Graph.
- microsoft.insights/workbooktemplates (Azure Workbook Templates) - Microsoft.IntelligentITDigitalTwin/digitalTwins (Minervas) - microsoft.intelligentitdigitaltwin/digitaltwins/assets
+- microsoft.intelligentitdigitaltwin/digitaltwins/executionplans
+- microsoft.intelligentitdigitaltwin/digitaltwins/testplans
+- microsoft.intelligentitdigitaltwin/digitaltwins/tests
- Microsoft.IoTCentral/IoTApps (IoT Central Applications) - Microsoft.IoTSpaces/Graph (Digital Twins (Deprecated)) - microsoft.keyvault/hsmpools
@@ -490,6 +496,7 @@ part of a **table** in Resource Graph.
- Microsoft.Resources/templateSpecs (Template specs) - microsoft.resources/templatespecs/versions - Microsoft.SaaS/applications (Software as a Service (classic))
+- Microsoft.SaaS/resources (CPX-Placeholder)
- Microsoft.Scheduler/jobCollections (Scheduler Job Collections) - microsoft.scvmm/clouds - Microsoft.scvmm/virtualMachines (SCVMM virtual machine - Azure Arc)
@@ -584,6 +591,7 @@ part of a **table** in Resource Graph.
- Microsoft.Web/StaticSites (Static Web Apps (Preview)) - Microsoft.WindowsESU/multipleActivationKeys (Windows Multiple Activation Keys) - Microsoft.WindowsIoT/DeviceServices (Windows 10 IoT Core Services)
+- microsoft.workloadbuilder/migrationagents
- microsoft.workloadbuilder/workloads - MyGet.PackageManagement/services (MyGet - Hosted NuGet, NPM, Bower and Vsix) - Paraleap.CloudMonix/services (CloudMonix)
governance https://docs.microsoft.com/en-us/azure/governance/resource-graph/shared-query-azure-cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/shared-query-azure-cli.md
@@ -1,7 +1,7 @@
Title: "Quickstart: Create a shared query with Azure CLI" description: In this quickstart, you follow the steps to enable the Resource Graph extension for Azure CLI and create a shared query. Previously updated : 10/14/2020 Last updated : 02/05/2021 # Quickstart: Create a Resource Graph shared query using Azure CLI
governance https://docs.microsoft.com/en-us/azure/governance/resource-graph/shared-query-template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/shared-query-template.md
@@ -1,7 +1,7 @@
Title: "Quickstart: Create a shared query with templates" description: In this quickstart, you use an Azure Resource Manager template (ARM template) to create a Resource Graph shared query that counts virtual machines by OS. Previously updated : 10/14/2020 Last updated : 02/05/2021
hdinsight https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-release-notes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hdinsight-release-notes.md
@@ -45,7 +45,7 @@ HDInsight added network security groups (NSGs) and user-defined routes (UDRs) ch
The following changes will happen in upcoming releases. ### Breaking change for .NET for Apache Spark 1.0.0
-HDInsight will introduce the first major official release of .NET for Apache Spark in the next release. It provides DataFrame API completeness for Spark 2.4.x and Spark 3.0.x along with other features. There will be breaking changes for this major version, refer to [this migration guid](https://github.com/dotnet/spark/blob/master/docs/migration-guide.md#upgrading-from-microsoftspark-0x-to-10) to understand steps needed to update your code and pipelines. Learn more [here](https://docs.microsoft.com/azure/hdinsight/spark/spark-dotnet-version-update#using-net-for-apache-spark-v10-in-hdinsight).
+HDInsight will introduce the first major official release of .NET for Apache Spark in the next release. It provides DataFrame API completeness for Spark 2.4.x and Spark 3.0.x along with other features. There will be breaking changes for this major version, refer to [this migration guide](https://github.com/dotnet/spark/blob/master/docs/migration-guide.md#upgrading-from-microsoftspark-0x-to-10) to understand steps needed to update your code and pipelines. Learn more [here](https://docs.microsoft.com/azure/hdinsight/spark/spark-dotnet-version-update#using-net-for-apache-spark-v10-in-hdinsight).
### Default cluster VM size will be changed to Ev3 family Starting from next release (around end of January), default cluster VM sizes will be changed from D family to Ev3 family. This change applies to head nodes and worker nodes. To avoid this change, specify the VM sizes that you want to use in the ARM template.
@@ -77,4 +77,4 @@ HDInsight is deploying fixes and applying patch for all running clusters for bot
``` https://hdiconfigactions.blob.core.windows.net/linuxospatchingrebootconfigv02/replace_cacert_script.sh https://healingscriptssa.blob.core.windows.net/healingscripts/ChangeOOMPolicyAndApplyLatestConfigForClamav.sh
-```
+```
hdinsight https://docs.microsoft.com/en-us/azure/hdinsight/interactive-query/apache-hive-warehouse-connector https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/interactive-query/apache-hive-warehouse-connector.md
@@ -33,7 +33,11 @@ Some of the operations supported by the Hive Warehouse Connector are:
## Hive Warehouse Connector setup > [!IMPORTANT]
-> The HiveServer2 Interactive instance installed on Spark 2.4 Enterprise Security Package clusters is not supported for use with the Hive Warehouse Connector. Instead, you must configure a separate HiveServer2 Interactive cluster to host your HiveServer2 Interactive workloads. A Hive Warehouse Connector configuration that utilizes a single Spark 2.4 cluster is not supported.
+> - The HiveServer2 Interactive instance installed on Spark 2.4 Enterprise Security Package clusters is not supported for use with the Hive Warehouse Connector. Instead, you must configure a separate HiveServer2 Interactive cluster to host your HiveServer2 Interactive workloads. A Hive Warehouse Connector configuration that utilizes a single Spark 2.4 cluster is not supported.
+> - Hive Warehouse Connector (HWC) Library is not supported for use with Interactive Query Clusters where Workload Management (WLM) feature is enabled. <br>
In a scenario where you only have Spark workloads and want to use the HWC Library, ensure that the Interactive Query cluster doesn't have the Workload Management feature enabled (the `hive.server2.tez.interactive.queue` configuration is not set in Hive configs). <br>
For a scenario where both Spark (HWC) and native LLAP workloads exist, you need to create two separate Interactive Query clusters with a shared metastore database: one cluster for native LLAP workloads, where the WLM feature can be enabled as needed, and another cluster for HWC-only workloads, where the WLM feature shouldn't be configured.
It's important to note that you can view the WLM resource plans from both clusters even if the feature is enabled in only one of them. Don't make any changes to resource plans in the cluster where the WLM feature is disabled, as doing so might impact the WLM functionality in the other cluster.
Hive Warehouse Connector needs separate clusters for Spark and Interactive Query workloads. Follow these steps to set up these clusters in Azure HDInsight.
healthcare-apis https://docs.microsoft.com/en-us/azure/healthcare-apis/de-identified-export https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/de-identified-export.md
@@ -17,6 +17,9 @@ The $export command can also be used to export de-identified data from the FHIR
`https://<<FHIR service base URL>>/$export?_container=<<container_name>>&_anonymizationConfig=<<config file name>>&_anonymizationConfigEtag=<<ETag on storage>>`
+> [!Note]
+> Right now the Azure API for FHIR only supports de-identified export at the system level ($export).
+
|Query parameter | Example |Optionality| Description|
|--|--|--|--|
| _\_anonymizationConfig_ |DemoConfig.json|Required for de-identified export |Name of the configuration file. See the configuration file format [here](https://github.com/microsoft/FHIR-Tools-for-Anonymization#configuration-file-format). This file should be kept inside a container named **anonymization** within the same Azure storage account that is configured as the export location. |
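As a small sketch of assembling the de-identified export request shown above (the service URL and container name below are placeholders, not real endpoints):

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your own FHIR service URL, container,
# and anonymization configuration file name.
fhir_base = "https://myfhirservice.azurehealthcareapis.com"
params = {
    "_container": "exportcontainer",
    "_anonymizationConfig": "DemoConfig.json",  # kept in the "anonymization" container
}
export_url = f"{fhir_base}/$export?{urlencode(params)}"
print(export_url)
```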
healthcare-apis https://docs.microsoft.com/en-us/azure/healthcare-apis/export-data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/export-data.md
@@ -24,6 +24,7 @@ The Azure API For FHIR supports $export at the following levels:
* [Patient](https://hl7.org/Fhir/uv/bulkdata/export/https://docsupdatetracker.net/index.html#endpointall-patients): `GET https://<<FHIR service base URL>>/Patient/$export>>` * [Group of patients*](https://hl7.org/Fhir/uv/bulkdata/export/https://docsupdatetracker.net/index.html#endpointgroup-of-patients) - Azure API for FHIR exports all related resources but does not export the characteristics of the group: `GET https://<<FHIR service base URL>>/Group/[ID]/$export>>`
+When data is exported, a separate file is created for each resource type. To ensure that the exported files don't become too large, we create a new file after the size of a single exported file becomes larger than 64 MB. The result is that you may get multiple files for each resource type, which will be enumerated (for example, Patient-1.ndjson, Patient-2.ndjson).
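The 64-MB split described above can be sketched as follows. This is an illustrative model of the documented behavior, not the service's actual implementation:

```python
def split_ndjson(lines, max_bytes=64 * 1024 * 1024):
    """Group NDJSON lines into files, starting a new file once the current
    one has grown past max_bytes -- so a file may exceed the limit by at
    most one line, mirroring the 'larger than 64 MB' rule."""
    files, current, size = [], [], 0
    for line in lines:
        current.append(line)
        size += len(line.encode("utf-8")) + 1  # +1 for the trailing newline
        if size > max_bytes:
            files.append(current)
            current, size = [], 0
    if current:
        files.append(current)
    return files

# Files are enumerated per resource type: Patient-1.ndjson, Patient-2.ndjson, ...
patients = ['{"resourceType": "Patient"}'] * 5
chunks = split_ndjson(patients, max_bytes=60)  # tiny limit just for illustration
names = [f"Patient-{i}.ndjson" for i, _ in enumerate(chunks, start=1)]
```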
> [!Note]
@@ -31,6 +32,8 @@ The Azure API For FHIR supports $export at the following levels:
In addition, checking the export status through the URL returned by the location header during the queuing is supported along with canceling the actual export job. ++ ## Settings and parameters ### Headers
iot-edge https://docs.microsoft.com/en-us/azure/iot-edge/quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/quickstart.md
@@ -1,5 +1,5 @@
Title: Quickstart create an Azure IoT Edge device on Windows | Microsoft Docs
+ Title: Quickstart to create an Azure IoT Edge device on Windows | Microsoft Docs
description: In this quickstart, learn how to create an IoT Edge device and then deploy prebuilt code remotely from the Azure portal.
@@ -12,20 +12,20 @@
monikerRange: "=iotedge-2018-06"
-# Quickstart: Deploy your first IoT Edge module to a Windows device (Preview)
+# Quickstart: Deploy your first IoT Edge module to a Windows device (preview)
Try out Azure IoT Edge in this quickstart by deploying containerized code to a Linux on Windows IoT Edge device. IoT Edge allows you to remotely manage code on your devices so that you can send more of your workloads to the edge. For this quickstart, we recommend using your own device to see how easy it is to use Azure IoT Edge for Linux on Windows.
-In this quickstart you learn how to:
+In this quickstart, you'll learn how to:
* Create an IoT hub. * Register an IoT Edge device to your IoT hub. * Install and start the IoT Edge for Linux on Windows runtime on your device. * Remotely deploy a module to an IoT Edge device and send telemetry.
-![Diagram - Quickstart architecture for device and cloud](./media/quickstart/install-edge-full.png)
+![Diagram that shows the architecture of this quickstart for your device and cloud.](./media/quickstart/install-edge-full.png)
-This quickstart walks you through how to set up your Azure IoT Edge for Linux on Windows device. Then, you deploy a module from the Azure portal to your device. The module used in this quickstart is a simulated sensor that generates temperature, humidity, and pressure data. The other Azure IoT Edge tutorials build upon the work you do here by deploying additional modules that analyze the simulated data for business insights.
+This quickstart walks you through how to set up your Azure IoT Edge for Linux on Windows device. Then, you'll deploy a module from the Azure portal to your device. The module you'll use is a simulated sensor that generates temperature, humidity, and pressure data. Other Azure IoT Edge tutorials build on the work you do here by deploying modules that analyze the simulated data for business insights.
If you don't have an active Azure subscription, create a [free account](https://azure.microsoft.com/free) before you begin.
@@ -38,219 +38,214 @@ Prepare your environment for the Azure CLI.
[!INCLUDE [azure-cli-prepare-your-environment-no-header.md](../../includes/azure-cli-prepare-your-environment-no-header.md)]
-Cloud resources:
-
-* A resource group to manage all the resources you use in this quickstart.
+Create a cloud resource group to manage all the resources you'll use in this quickstart.
```azurecli-interactive az group create --name IoTEdgeResources --location westus2 ```
-IoT Edge device:
+Make sure your IoT Edge device meets the following requirements:
-* Your device needs to be a Windows PC or server, version 1809 or later
+* Windows PC or server, version 1809 or later
* At least 4 GB of memory, recommended 8 GB of memory * 10 GB of free disk space ## Create an IoT hub
-Start the quickstart by creating an IoT hub with Azure CLI.
+Start by creating an IoT hub with the Azure CLI.
-![Diagram - Create an IoT hub in the cloud](./media/quickstart/create-iot-hub.png)
+![Diagram that shows the step to create an I o T hub.](./media/quickstart/create-iot-hub.png)
-The free level of IoT Hub works for this quickstart. If you've used IoT Hub in the past and already have a hub created, you can use that IoT hub.
+The free level of Azure IoT Hub works for this quickstart. If you've used IoT Hub in the past and already have a hub created, you can use that IoT hub.
-The following code creates a free **F1** hub in the resource group `IoTEdgeResources`. Replace `{hub_name}` with a unique name for your IoT hub. It might take a few minutes to create an IoT Hub.
+The following code creates a free **F1** hub in the resource group `IoTEdgeResources`. Replace `{hub_name}` with a unique name for your IoT hub. It might take a few minutes to create an IoT hub.
- ```azurecli-interactive
- az iot hub create --resource-group IoTEdgeResources --name {hub_name} --sku F1 --partition-count 2
- ```
+```azurecli-interactive
+az iot hub create --resource-group IoTEdgeResources --name {hub_name} --sku F1 --partition-count 2
+```
- If you get an error because there's already one free hub in your subscription, change the SKU to **S1**. If you get an error that the IoT Hub name isn't available, it means that someone else already has a hub with that name. Try a new name.
+If you get an error because you already have one free hub in your subscription, change the SKU to `S1`. If you get an error that the IoT hub name isn't available, someone else already has a hub with that name. Try a new name.
## Register an IoT Edge device Register an IoT Edge device with your newly created IoT hub.
-![Diagram - Register a device with an IoT Hub identity](./media/quickstart/register-device.png)
+![Diagram that shows the step to register a device with an IoT hub identity.](./media/quickstart/register-device.png)
Create a device identity for your simulated device so that it can communicate with your IoT hub. The device identity lives in the cloud, and you use a unique device connection string to associate a physical device to a device identity.
-Since IoT Edge devices behave and can be managed differently than typical IoT devices, declare this identity to be for an IoT Edge device with the `--edge-enabled` flag.
+IoT Edge devices behave and can be managed differently than typical IoT devices. Use the `--edge-enabled` flag to declare that this identity is for an IoT Edge device.
-1. In the Azure Cloud Shell, enter the following command to create a device named **myEdgeDevice** in your hub.
+1. In Azure Cloud Shell, enter the following command to create a device named **myEdgeDevice** in your hub.
- ```azurecli-interactive
- az iot hub device-identity create --device-id myEdgeDevice --edge-enabled --hub-name {hub_name}
- ```
+ ```azurecli-interactive
+ az iot hub device-identity create --device-id myEdgeDevice --edge-enabled --hub-name {hub_name}
+ ```
- If you get an error about iothubowner policy keys, make sure that your Cloud Shell is running the latest version of the azure-iot extension.
+ If you get an error about `iothubowner` policy keys, make sure that Cloud Shell is running the latest version of the Azure IoT extension.
-2. View the connection string for your device, which links your physical device with its identity in IoT Hub. It contains the name of your IoT hub, the name of your device, and then a shared key that authenticates connections between the two.
+1. View the connection string for your device, which links your physical device with its identity in IoT Hub. It contains the name of your IoT hub, the name of your device, and a shared key that authenticates connections between the two.
- ```azurecli-interactive
- az iot hub device-identity connection-string show --device-id myEdgeDevice --hub-name {hub_name}
- ```
+ ```azurecli-interactive
+ az iot hub device-identity connection-string show --device-id myEdgeDevice --hub-name {hub_name}
+ ```
-3. Copy the value of the `connectionString` key from the JSON output and save it. This value is the device connection string. You'll use this connection string to configure the IoT Edge runtime in the next section.
+1. Copy the value of the `connectionString` key from the JSON output and save it. This value is the device connection string. You'll use it to configure the IoT Edge runtime in the next section.
- ![Retrieve connection string from CLI output](./media/quickstart/retrieve-connection-string.png)
+ ![Screenshot that shows the connectionString output in Cloud Shell.](./media/quickstart/retrieve-connection-string.png)
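The connection string you copy in this step has a fixed `key=value;` shape (`HostName`, `DeviceId`, `SharedAccessKey`). A minimal sketch of pulling out its parts, using a placeholder string rather than real credentials:

```python
def parse_connection_string(cs):
    """Split an IoT Hub device connection string into its key/value parts.
    split('=', 1) keeps '=' padding inside the base64 shared access key intact."""
    return dict(part.split("=", 1) for part in cs.split(";"))

# Placeholder connection string in the documented format (not a real key).
cs = "HostName=myhub.azure-devices.net;DeviceId=myEdgeDevice;SharedAccessKey=abc123="
info = parse_connection_string(cs)
```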
## Install and start the IoT Edge runtime
-Install IoT Edge for Linux on Windows on your device, and configure it with a device connection string.
+Install IoT Edge for Linux on Windows on your device, and configure it with the device connection string.
-![Diagram - Start the IoT Edge runtime on device](./media/quickstart/start-runtime.png)
+![Diagram that shows the step to start the IoT Edge runtime.](./media/quickstart/start-runtime.png)
1. [Download Windows Admin Center](https://aka.ms/WACDownloadEFLOW).
-1. Follow the installation wizard to set up Windows Admin Center on your device.
+1. Follow the prompts in the installation wizard to set up Windows Admin Center on your device.
-1. Once you are in Windows Admin Center, on the top right of the screen, select the **Settings Gear Icon**
+1. Open Windows Admin Center.
-1. From the settings menu, under Gateway, select **Extensions**
+1. Select the **Settings gear** icon in the upper-right corner, and then select **Extensions**.
-1. Select the **Feeds** tab and select **Add**.
+1. On the **Feeds** tab, select **Add**.
-1. Enter https://aka.ms/wac-insiders-feed into the text box and select **Add**.
+1. Enter `https://aka.ms/wac-insiders-feed` into the text box, and then select **Add**.
-1. After the feed has been added, navigate to the **Available extensions** tab. It may take a moment to update the extensions list.
+1. After the feed has been added, go to the **Available extensions** tab and wait for the extensions list to update.
-1. From the list of **Available extensions** select **Azure IoT Edge**
+1. From the list of **Available extensions**, select **Azure IoT Edge**.
-1. **Install** the extension
+1. Install the extension.
-1. Once the extension has been installed navigate to the main dashboard page by selecting **Windows Admin Center** on top left hand corner your screen.
+1. When the extension is installed, select **Windows Admin Center** in the upper-left corner to go to the main dashboard page.
-1. You will see the local host connection representing the PC where you are running Windows Admin Center.
+ The **localhost** connection represents the PC where you're running Windows Admin Center.
- :::image type="content" source="media/quickstart/windows-admin-center-start-page.png" alt-text="Screenshot - Windows Admin Start Page":::
+ :::image type="content" source="media/quickstart/windows-admin-center-start-page.png" alt-text="Screenshot of the Windows Admin Start page.":::
1. Select **Add**.
- :::image type="content" source="media/quickstart/windows-admin-center-start-page-add.png" alt-text="Screenshot - Windows Admin Start Page Add Button":::
-
-1. Locate the Azure IoT Edge tile, and select **Create new**. This will start the installation wizard.
+ :::image type="content" source="media/quickstart/windows-admin-center-start-page-add.png" alt-text="Screenshot that shows selecting the Add button in Windows Admin Center.":::
- :::image type="content" source="media/quickstart/select-tile-screen.png" alt-text="Screenshot - Azure IoT Edge For Linux on Windows Tile":::
+1. On the Azure IoT Edge tile, select **Create new** to start the installation wizard.
-1. Proceed through the installation wizard to accept the EULA and choose **Next**
+ :::image type="content" source="media/quickstart/select-tile-screen.png" alt-text="Screenshot that shows creating a new deployment in the Azure IoT Edge tile.":::
- :::image type="content" source="media/quickstart/wizard-welcome-screen.png" alt-text="Screenshot - Wizard Welcome":::
+1. Continue through the installation wizard to accept the Microsoft Software License Terms, and then select **Next**.
-1. Choose the **Optional diagnostic data** to provide extended diagnostics data which helps Microsoft monitor and maintain quality of service, and click **Next: Deploy**
+ :::image type="content" source="media/quickstart/wizard-welcome-screen.png" alt-text="Screenshot that shows selecting Next to continue through the installation wizard.":::
- :::image type="content" source="media/quickstart/diagnostic-data-screen.png" alt-text="Screenshot - Diagnostic Data":::
+1. Select **Optional diagnostic data**, and then select **Next: Deploy**. This selection provides extended diagnostics data that helps Microsoft monitor and maintain quality of service.
-1. On the **Select target device** screen, select your desired target device to validate that it meets the minimum requirements. For this quickstart, we're installing IoT Edge on the local device, so choose the localhost connection. Once confirmed, choose **Next** to continue
+ :::image type="content" source="media/quickstart/diagnostic-data-screen.png" alt-text="Screenshot that shows the Diagnostic data options.":::
- :::image type="content" source="media/quickstart/wizard-select-target-device-screen.png" alt-text="Screenshot - Select Target Device":::
+1. On the **Select target device** screen, select your desired target device to validate that it meets the minimum requirements. For this quickstart, we're installing IoT Edge on the local device, so choose the **localhost** connection. If the target device meets the requirements, select **Next** to continue.
-1. Accept the default settings by choosing **Next**.
+ :::image type="content" source="media/quickstart/wizard-select-target-device-screen.png" alt-text="Screenshot that shows the Target device list.":::
-1. The deployment screen shows the process of downloading the package, installing the package, configuring the host and final setting up the Linux VM. A successful deployment will look as follows:
+1. Select **Next** to accept the default settings. The deployment screen shows the process of downloading the package, installing the package, configuring the host, and finishing setup of the Linux virtual machine (VM). A successful deployment looks like this:
- :::image type="content" source="media/quickstart/wizard-deploy-success-screen.png" alt-text="Screenshot - Wizard Deploy Success":::
+ :::image type="content" source="media/quickstart/wizard-deploy-success-screen.png" alt-text="Screenshot of a successful deployment.":::
-1. Click **Next: Connect** to continue to the final step to provision your Azure IoT Edge device with its device ID from your IoT hub instance.
+1. Select **Next: Connect** to continue to the final step to provision your Azure IoT Edge device with its device ID from your IoT hub instance.
-1. Copy the connection string from your device in Azure IoT Hub and paste it into the device connection string field. Then choose **Provisioning with the selected method**.
+1. Paste the connection string you copied [earlier in this quickstart](#register-an-iot-edge-device) into the **Device connection string** field. Then select **Provisioning with the selected method**.
- > [!NOTE]
- > Refer to step 3 in the previous section, [Register an IoT Edge device](#register-an-iot-edge-device), to retrieve your connection string.
+ :::image type="content" source="media/quickstart/wizard-provision.png" alt-text="Screenshot that shows the connection string in the Device connection string field.":::
- :::image type="content" source="media/quickstart/wizard-provision.png" alt-text="Screenshot - Wizard Provisioning":::
+1. After provisioning is complete, select **Finish** to complete and return to the Windows Admin Center start screen. You should see your device listed as an IoT Edge device.
-1. Once provisioning is complete, select **Finish** to complete and return to the Windows Admin Center start screen. You should now be able to see your device listed as an IoT Edge Device.
+ :::image type="content" source="media/quickstart/windows-admin-center-device-screen.png" alt-text="Screenshot that shows all connections in Windows Admin Center.":::
- :::image type="content" source="media/quickstart/windows-admin-center-device-screen.png" alt-text="Screenshot - Windows Admin Center Azure IoT Edge Device":::
+1. Select your Azure IoT Edge device to view its dashboard. You should see that the workloads from your device twin in Azure IoT Hub have been deployed. The **IoT Edge Module List** should show one module running **edgeAgent**, and the **IoT Edge Status** should be **active (running)**.
-1. Select your Azure IoT Edge device to view its dashboard. You should see that the workloads from your device twin in Azure IoT Hub have been deployed. The **IoT Edge Module List** should show one module running, **edgeAgent**, and the **IoT Edge Status** should show **active (running)**.
Your IoT Edge device is now configured. It's ready to run cloud-deployed modules. ## Deploy a module Manage your Azure IoT Edge device from the cloud to deploy a module that sends telemetry data to IoT Hub.
-![Diagram - deploy module from cloud to device](./media/quickstart/deploy-module.png)
+![Diagram that shows the step to deploy a module.](./media/quickstart/deploy-module.png)
[!INCLUDE [iot-edge-deploy-module](../../includes/iot-edge-deploy-module.md)]
-## View generated data
+## View the generated data
-In this quickstart, you created a new IoT Edge device and installed the IoT Edge runtime on it. Then, you used the Azure portal to deploy an IoT Edge module to run on the device without having to make changes to the device itself.
+In this quickstart, you created a new IoT Edge device and installed the IoT Edge runtime on it. Then you used the Azure portal to deploy an IoT Edge module to run on the device without having to make changes to the device itself.
-In this case, the module that you pushed generates sample environment data that you can use for testing later. The simulated sensor is monitoring both a machine and the environment around the machine. For example, this sensor might be in a server room, on a factory floor, or on a wind turbine. The message includes ambient temperature and humidity, machine temperature and pressure, and a timestamp. The IoT Edge tutorials use the data created by this module as test data for analytics.
+The module that you pushed generates sample environment data that you can use for testing later. The simulated sensor is monitoring both a machine and the environment around the machine. For example, this sensor might be in a server room, on a factory floor, or on a wind turbine. The messages that it sends include ambient temperature and humidity, machine temperature and pressure, and a timestamp. IoT Edge tutorials use the data created by this module as test data for analytics.
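The payload format isn't shown in this quickstart. As an illustration only, a single device-to-cloud message from the simulated sensor has roughly this shape (field names and values here are representative, not an exact schema):

```json
{
  "machine": {
    "temperature": 21.59,
    "pressure": 1.01
  },
  "ambient": {
    "temperature": 20.82,
    "humidity": 25
  },
  "timeCreated": "2021-02-05T14:30:00.0000000Z"
}
```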
-Confirm that the module deployed from the cloud is running on your IoT Edge device by navigating to the Command Shell in Windows Admin Center.
+From the command shell in Windows Admin Center, confirm that the module you deployed from the cloud is running on your IoT Edge device.
-1. Connect to your newly created IoT Edge Device
+1. Connect to your newly created IoT Edge device.
- :::image type="content" source="media/quickstart/connect-edge-screen.png" alt-text="Screenshot - Connect Device":::
+ :::image type="content" source="media/quickstart/connect-edge-screen.png" alt-text="Screenshot that shows selecting Connect in Windows Admin Center.":::
-2. On the **Overview** page you will see the **IoT Edge Module List** and **IoT Edge Status** where you can see the various modules that have been deployed as well as the device status.
+ On the **Overview** page, you'll see the **IoT Edge Module List** and **IoT Edge Status**. You can see the modules that have been deployed and the device status.
-3. Under **Tools** select **Command Shell**. The command shell is a PowerShell terminal that automatically uses ssh (secure shell) to connect to your Azure IoT Edge device's Linux VM on your Windows PC.
+1. Under **Tools**, select **Command Shell**. The command shell is a PowerShell terminal that automatically uses Secure Shell (SSH) to connect to your Azure IoT Edge device's Linux VM on your Windows PC.
- :::image type="content" source="media/quickstart/command-shell-screen.png" alt-text="Screenshot - Command Shell":::
+ :::image type="content" source="media/quickstart/command-shell-screen.png" alt-text="Screenshot that shows opening the command shell.":::
-4. To verify the three modules on your device, run the following **bash command**:
+1. To verify the three modules on your device, run the following Bash command:
- ```bash
- sudo iotedge list
- ```
+ ```bash
+ sudo iotedge list
+ ```
- :::image type="content" source="media/quickstart/iotedge-list-screen.png" alt-text="Screenshot - Command Shell List":::
+ :::image type="content" source="media/quickstart/iotedge-list-screen.png" alt-text="Screenshot that shows the iotedge list output in the command shell.":::
-5. View the messages being sent from the temperature sensor module to the cloud.
+1. View the messages being sent from the temperature sensor module to the cloud.
- ```bash
- iotedge logs SimulatedTemperatureSensor -f
- ```
+ ```bash
+ iotedge logs SimulatedTemperatureSensor -f
+ ```
- >[!TIP]
- >IoT Edge commands are case-sensitive when referring to module names.
+ >[!Important]
+ >IoT Edge commands are case-sensitive when they refer to module names.
- :::image type="content" source="media/quickstart/temperature-sensor-screen.png" alt-text="Screenshot - Temperature Sensor":::
+ :::image type="content" source="media/quickstart/temperature-sensor-screen.png" alt-text="Screenshot that shows the list of messages sent from the module to the cloud.":::
-You can also watch the messages arrive at your IoT hub by using the [Azure IoT Hub extension for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-toolkit).
+You can also use the [Azure IoT Hub extension for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-toolkit) to watch messages arrive at your IoT hub.
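If you prefer the command line, the Azure IoT extension for the Azure CLI can stream the same device-to-cloud messages. A sketch, assuming the `azure-iot` extension is installed and using placeholders for your hub and device names:

```azurecli
az extension add --name azure-iot
az iot hub monitor-events --hub-name {hub_name} --device-id {device_id}
```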
## Clean up resources
-If you want to continue on to the IoT Edge tutorials, you can use the device that you registered and set up in this quickstart. Otherwise, you can delete the Azure resources that you created to avoid charges.
+If you want to continue on to the IoT Edge tutorials, skip this step. You can use the device that you registered and set up in this quickstart. Otherwise, you can delete the Azure resources that you created to avoid charges.
-If you created your virtual machine and IoT hub in a new resource group, you can delete that group and all the associated resources. Double check the contents of the resource group to make sure that there's nothing you want to keep. If you don't want to delete the whole group, you can delete individual resources instead.
+If you created your virtual machine and IoT hub in a new resource group, you can delete that group and all the associated resources. If you don't want to delete the whole group, you can delete individual resources instead.
> [!IMPORTANT]
-> Deleting a resource group is irreversible.
+> Check the contents of the resource group to make sure that there's nothing you want to keep. Deleting a resource group is irreversible.
-Remove the **IoTEdgeResources** group. It might take a few minutes to delete a resource group.
+Use the following command to remove the **IoTEdgeResources** group. Deletion might take a few minutes.
```azurecli-interactive az group delete --name IoTEdgeResources ```
-You can confirm the resource group is removed by viewing the list of resource groups.
+You can confirm that the resource group is removed by using this command to view the list of resource groups.
```azurecli-interactive az group list ```
-### Clean removal of Azure IoT Edge for Linux on Windows
+### Remove Azure IoT Edge for Linux on Windows
+
+Use the dashboard extension in Windows Admin Center to uninstall Azure IoT Edge for Linux on Windows.
-You can uninstall Azure IoT Edge for Linux on Windows from your IoT Edge device through the dashboard extension in Windows Admin Center.
+1. Connect to the IoT Edge device in Windows Admin Center. The Azure dashboard tool extension loads.
-1. Connect to the Azure IoT Edge for Linux on Windows device connection in Windows Admin Center. The Azure dashboard tool extension will load.
-2. Select **Uninstall**. Once Azure IoT Edge for Linux on Windows is removed, Windows Admin Center will navigate to the start page and remove the Azure IoT Edge device connection entry from the list.
+1. Select **Uninstall**. After Azure IoT Edge is removed, Windows Admin Center removes the Azure IoT Edge device connection entry from the **Start** page.
-Another way to remove Azure IoT Edge from your Windows system is to go to **Start** > **Settings** > **Apps** > **Azure IoT Edge** > **Uninstall** on your IoT Edge device. This will remove Azure IoT Edge from your IoT Edge device, but leave the connection behind in Windows Admin Center. Windows Admin Center can be uninstalled from the Settings menu as well.
+>[!Note]
+>Another way to remove Azure IoT Edge from your Windows system is to select **Start** > **Settings** > **Apps** > **Azure IoT Edge** > **Uninstall** on your IoT Edge device. This method removes Azure IoT Edge from your IoT Edge device, but leaves the connection behind in Windows Admin Center. To complete the removal, uninstall Windows Admin Center from the **Settings** menu as well.
## Next steps
-In this quickstart, you created an IoT Edge device and used the Azure IoT Edge cloud interface to deploy code onto the device. Now, you have a test device generating raw data about its environment.
+In this quickstart, you created an IoT Edge device and used the Azure IoT Edge cloud interface to deploy code onto the device. Now you have a test device generating raw data about its environment.
-The next step is to set up your local development environment so that you can start creating IoT Edge modules that run your business logic.
+Next, set up your local development environment so that you can start creating IoT Edge modules that run your business logic.
> [!div class="nextstepaction"] > [Start developing IoT Edge modules](tutorial-develop-for-linux.md)
iot-hub https://docs.microsoft.com/en-us/azure/iot-hub/monitor-iot-hub-reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/monitor-iot-hub-reference.md
@@ -5,6 +5,7 @@
+ Last updated 10/22/2020
@@ -684,4 +685,4 @@ For a reference of all Azure Monitor Logs / Log Analytics tables, see the [Azure
## See Also * See [Monitor Azure IoT Hub](monitor-iot-hub.md) for a description of monitoring Azure IoT Hub.
-* See [Monitoring Azure resources with Azure Monitor](../azure-monitor/insights/monitor-azure-resource.md) for details on monitoring Azure resources.
+* See [Monitoring Azure resources with Azure Monitor](../azure-monitor/insights/monitor-azure-resource.md) for details on monitoring Azure resources.
iot-hub https://docs.microsoft.com/en-us/azure/iot-hub/monitor-iot-hub https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/monitor-iot-hub.md
@@ -5,7 +5,9 @@
+ Last updated 11/06/2020+ # Monitoring Azure IoT Hub
@@ -299,4 +301,4 @@ For more detailed information about monitoring device connections with Event Gri
- See [Monitoring Azure IoT Hub data reference](monitor-iot-hub-reference.md) for a reference of the metrics, logs, and other important values created by [service name]. -- See [Monitoring Azure resources with Azure Monitor](../azure-monitor/insights/monitor-azure-resource.md) for details on monitoring Azure resources.
+- See [Monitoring Azure resources with Azure Monitor](../azure-monitor/insights/monitor-azure-resource.md) for details on monitoring Azure resources.
iot-pnp https://docs.microsoft.com/en-us/azure/iot-pnp/overview-iot-plug-and-play https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-pnp/overview-iot-plug-and-play.md
@@ -42,11 +42,13 @@ IoT Plug and Play is useful for two types of developers:
## Use IoT Plug and Play devices
-As a solution builder, you can develop a cloud-hosted IoT solution that uses IoT Plug and Play devices. Use [IoT Hub](../iot-hub/about-iot-hub.md) - a managed cloud service, that acts as a message hub for secure, bi-directional communication between your IoT application and your devices.
+As a solution builder, you can use [IoT Central](../iot-central/core/overview-iot-central.md) or [IoT Hub](../iot-hub/about-iot-hub.md) to develop a cloud-hosted IoT solution that uses IoT Plug and Play devices.
-When you connect an IoT Plug and Play device to an IoT hub, you can use the [Azure IoT explorer](./howto-use-iot-explorer.md) tool to view the telemetry, properties, and commands defined in the interfaces that compose the model.
+The web UI in IoT Central lets you monitor device conditions, create rules, and manage millions of devices and their data throughout their life cycle. IoT Plug and Play devices connect directly to an IoT Central application where you can use customizable dashboards to monitor and control your devices. You can also use device templates in the IoT Central web UI to create and edit DTDL models.
-If you have existing sensors attached to a Windows or Linux gateway, you can use [IoT Plug and Play bridge](./concepts-iot-pnp-bridge.md), to connect these sensors and create IoT Plug and Play devices without the need to write device software/firmware (for [supported protocols](./concepts-iot-pnp-bridge.md#supported-protocols-and-sensors) ).
+IoT Hub - a managed cloud service - acts as a message hub for secure, bi-directional communication between your IoT application and your devices. When you connect an IoT Plug and Play device to an IoT hub, you can use the [Azure IoT explorer](./howto-use-iot-explorer.md) tool to view the telemetry, properties, and commands defined in the DTDL model.
+
+If you have existing sensors attached to a Windows or Linux gateway, you can use [IoT Plug and Play bridge](./concepts-iot-pnp-bridge.md) to connect these sensors and create IoT Plug and Play devices without the need to write device software/firmware (for [supported protocols](./concepts-iot-pnp-bridge.md#supported-protocols-and-sensors)).
## Develop an IoT device application
key-vault https://docs.microsoft.com/en-us/azure/key-vault/keys/byok-specification https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/keys/byok-specification.md
@@ -9,7 +9,7 @@ tags: azure-resource-manager
Previously updated : 05/29/2020 Last updated : 02/04/2021
@@ -32,7 +32,7 @@ The following are the requirements:
||||| |Key Exchange Key (KEK)|RSA|Azure Key Vault HSM|An HSM backed RSA key pair generated in Azure Key Vault Wrapping Key|AES|Vendor HSM|An [ephemeral] AES key generated by HSM on-prem
-Target Key|RSA, EC, AES|Vendor HSM|The key to be transferred to the Azure Key Vault HSM
+Target Key|RSA, EC, AES (Managed HSM only)|Vendor HSM|The key to be transferred to the Azure Key Vault HSM
**Key Exchange Key**: An HSM-backed key that customer generates in the key vault where the BYOK key will be imported. This KEK must have following properties:
@@ -127,9 +127,16 @@ The JSON blob is stored in a file with a ".byok" extension so that the Azure Pow
Customer will transfer the Key Transfer Blob (".byok" file) to an online workstation and then run a **az keyvault key import** command to import this blob as a new HSM-backed key into Key Vault.
+To import an RSA key, use this command:
```azurecli az keyvault key import --vault-name ContosoKeyVaultHSM --name ContosoFirstHSMkey --byok-file KeyTransferPackage-ContosoFirstHSMkey.byok --ops encrypt decrypt ```
+To import an EC key, you must specify the key type and the curve name.
+
+```azurecli
+az keyvault key import --vault-name ContosoKeyVaultHSM --name ContosoFirstHSMkey --kty EC-HSM --curve-name "P-256" --byok-file KeyTransferPackage-ContosoFirstHSMkey.byok --ops sign verify
+```
+ When the above command is executed, it results in sending a REST API request as follows:
@@ -137,7 +144,7 @@ When the above command is executed, it results in sending a REST API request as
PUT https://contosokeyvaulthsm.vault.azure.net/keys/ContosoFirstHSMKey?api-version=7.0 ```
-Request body:
+Request body when importing an RSA key:
```json { "key": {
@@ -153,6 +160,25 @@ Request body:
} } ```+
+Request body when importing an EC key:
+```json
+{
+ "key": {
+ "kty": "EC-HSM",
+ "crv": "P-256",
+ "key_ops": [
+ "sign",
+ "verify"
+ ],
+ "key_hsm": "<Base64 encoded BYOK_BLOB>"
+ },
+ "attributes": {
+ "enabled": true
+ }
+}
+```
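The `key_hsm` field carries the Base64 encoding of the entire `.byok` transfer package. A minimal sketch of producing that value on Linux, using a hypothetical placeholder file in place of a real package:

```shell
# Hypothetical placeholder standing in for the real key transfer package
printf 'example-blob' > KeyTransferPackage-example.byok

# Encode the whole package as a single Base64 line (no wrapping),
# suitable for pasting into the "key_hsm" field of the request body
base64 -w0 KeyTransferPackage-example.byok
```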
+ The "key_hsm" value is the entire contents of the KeyTransferPackage-ContosoFirstHSMkey.byok file, encoded in Base64 format. ## References
key-vault https://docs.microsoft.com/en-us/azure/key-vault/keys/hsm-protected-keys-byok https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/keys/hsm-protected-keys-byok.md
@@ -9,7 +9,7 @@ tags: azure-resource-manager
Previously updated : 02/01/2021 Last updated : 02/04/2021
@@ -68,10 +68,13 @@ The following table lists prerequisites for using BYOK in Azure Key Vault:
## Supported key types
-|Key name|Key type|Key size|Origin|Description|
+|Key name|Key type|Key size/curve|Origin|Description|
|||||| |Key Exchange Key (KEK)|RSA| 2,048-bit<br />3,072-bit<br />4,096-bit|Azure Key Vault HSM|An HSM-backed RSA key pair generated in Azure Key Vault|
-|Target key|RSA|2,048-bit<br />3,072-bit<br />4,096-bit|Vendor HSM|The key to be transferred to the Azure Key Vault HSM|
+|Target key|
+||RSA|2,048-bit<br />3,072-bit<br />4,096-bit|Vendor HSM|The key to be transferred to the Azure Key Vault HSM|
+||EC|P-256<br />P-384<br />P-521|Vendor HSM|The key to be transferred to the Azure Key Vault HSM|
+||||
## Generate and transfer your key to the Key Vault HSM
@@ -117,7 +120,7 @@ Refer to your HSM vendor's documentation to download and install the BYOK tool.
Transfer the BYOK file to your connected computer. > [!NOTE]
-> Importing RSA 1,024-bit keys is not supported. Currently, importing an Elliptic Curve (EC) key is not supported.
+> Importing RSA 1,024-bit keys is not supported. Importing an Elliptic Curve key with curve P-256K is not supported.
> > **Known issue**: Importing an RSA 4K target key from Luna HSMs is only supported with firmware 7.4.0 or newer.
@@ -125,10 +128,17 @@ Transfer the BYOK file to your connected computer.
To complete the key import, transfer the key transfer package (a BYOK file) from your disconnected computer to the internet-connected computer. Use the [az keyvault key import](/cli/azure/keyvault/key?view=azure-cli-latest#az-keyvault-key-import) command to upload the BYOK file to the Key Vault HSM.
+To import an RSA key, use the following command. The `--kty` parameter is optional and defaults to 'RSA-HSM'.
```azurecli az keyvault key import --vault-name ContosoKeyVaultHSM --name ContosoFirstHSMkey --byok-file KeyTransferPackage-ContosoFirstHSMkey.byok ```
+To import an EC key, you must specify the key type and the curve name.
+
+```azurecli
+az keyvault key import --vault-name ContosoKeyVaultHSM --name ContosoFirstHSMkey --kty EC-HSM --curve-name "P-256" --byok-file KeyTransferPackage-ContosoFirstHSMkey.byok
+```
+ If the upload is successful, Azure CLI displays the properties of the imported key. ## Next steps
key-vault https://docs.microsoft.com/en-us/azure/key-vault/managed-hsm/hsm-protected-keys-byok https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/managed-hsm/hsm-protected-keys-byok.md
@@ -7,7 +7,7 @@ tags: azure-resource-manager
Previously updated : 02/01/2021 Last updated : 02/04/2021
@@ -71,11 +71,14 @@ For more information on login options via the CLI take a look at [sign in with A
## Supported key types
-|Key name|Key type|Key size|Origin|Description|
+|Key name|Key type|Key size/curve|Origin|Description|
|||||| |Key Exchange Key (KEK)|RSA| 2,048-bit<br />3,072-bit<br />4,096-bit|Managed HSM|An HSM-backed RSA key pair generated in Managed HSM|
-|Target key|RSA|2,048-bit<br />3,072-bit<br />4,096-bit|Vendor HSM|The key to be transferred to the Managed HSM|
-
+|Target key|
+||RSA|2,048-bit<br />3,072-bit<br />4,096-bit|Vendor HSM|The key to be transferred to the Managed HSM|
+||EC|P-256<br />P-384<br />P-521|Vendor HSM|The key to be transferred to the Managed HSM|
+||Symmetric key (oct-HSM)|128-bit<br />192-bit<br />256-bit|Vendor HSM|The key to be transferred to the Managed HSM|
+||||
## Generate and transfer your key to the Managed HSM To generate and transfer your key to a Managed HSM:
key-vault https://docs.microsoft.com/en-us/azure/key-vault/secrets/quick-create-github-secret https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/key-vault/secrets/quick-create-github-secret.md
@@ -1,140 +0,0 @@
- Title: Quickstart - Use Azure Key Vault secrets in GitHub Actions workflows
-description: Use Key Vault secrets in GitHub Actions to automate your software development workflows
--- Previously updated : 11/24/2020-----
-# Use Key Vault secrets in GitHub Actions workflows
-
-Use Key Vault secrets in your [GitHub Actions](https://help.github.com/en/articles/about-github-actions) and securely store passwords and other secrets in an Azure key vault. Learn more about [Key Vault](../general/overview.md).
-
-With Key Vault and GitHub Actions, you have the benefits of a centralized secrets management tool and all the advantages of GitHub Actions. GitHub Actions is a suite of features in GitHub to automate your software development workflows. You can deploy workflows in the same place where you store code and collaborate on pull requests and issues.
--
-## Prerequisites
-- A GitHub account. If you don't have one, sign up for [free](https://github.com/join). -- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).-- An Azure App connected to a GitHub repository. This example uses [Deploy containers to Azure App Service](/azure/developer/javascript/tutorial-vscode-docker-node-01). -- An Azure key vault. You can create an Azure Key Vault using the Azure portal, Azure CLI, or Azure PowerShell.-
-## Workflow file overview
-
-A workflow is defined by a YAML (.yml) file in the `/.github/workflows/` path in your repository. This definition contains the various steps and parameters that make up the workflow.
-
-The file has for authenticating with GitHub Actions two sections:
-
-|Section |Tasks |
-|||
-|**Authentication** | 1. Define a service principal. <br /> 2. Create a GitHub secret. <br /> 3. Add a role assignment. |
-|**Key Vault** | 1. Add the key vault action. <br /> 2. Reference key vault secret. |
-
-## Authentication
-
-You can create a [service principal](../../active-directory/develop/app-objects-and-service-principals.md#service-principal-object) with the [az ad sp create-for-rbac](/cli/azure/ad/sp?view=azure-cli-latest#az-ad-sp-create-for-rbac&preserve-view=true) command in the [Azure CLI](/cli/azure/). Run this command with [Azure Cloud Shell](https://shell.azure.com/) in the Azure portal or by selecting the **Try it** button.
-
-```azurecli-interactive
- az ad sp create-for-rbac --name {myApp} --role contributor --scopes /subscriptions/{subscription-id}/resourceGroups/{MyResourceGroup} --sdk-auth
-```
-
-In the example above, replace the placeholders with your subscription ID and resource group name. Replace the placeholder `myApp` with the name of your application. The output is a JSON object with the role assignment credentials that provide access to your App Service app similar to below. Copy this JSON object for later. You will only need the sections with the `clientId`, `clientSecret`, `subscriptionId`, and `tenantId` values.
-
-```output
- {
- "clientId": "<GUID>",
- "clientSecret": "<GUID>",
- "subscriptionId": "<GUID>",
- "tenantId": "<GUID>",
- (...)
- }
-```
-
-### Configure the GitHub secret
-
-Create secrets for your Azure credentials, resource group, and subscriptions.
-
-1. In [GitHub](https://github.com/), browse your repository.
-
-1. Select **Settings > Secrets > New secret**.
-
-1. Paste the entire JSON output from the Azure CLI command into the secret's value field. Give the secret the name `AZURE_CREDENTIALS`.
-
-1. Copy the value of `clientId` to use later.
-
-### Add a role assignment
-
-You need to grant access to the Azure service principal so that you can access your key vault for `get` and `list` operations. If you don't do this, then you will not be able to use the service principal.
-
-Replace `keyVaultName` with the name of your key vault and `clientIdGUID` with the value of your `clientId`.
-
-```azurecli-interactive
- az keyvault set-policy -n {keyVaultName} --secret-permissions get list --spn {clientIdGUID}
-```
-
-## Use the Azure key vault action
-
-With the [Azure key vault action](https://github.com/marketplace/actions/azure-key-vault-get-secrets), you can fetch one or more secrets from an Azure key vault instance and consume it in your GitHub Action workflows.
-
-Secrets fetched are set as outputs and also as environment variables. Variables are automatically masked when they are printed to the console or to logs.
-
-```yaml
- - uses: Azure/get-keyvault-secrets@v1
- with:
- keyvault: "my Vault" # name of key vault in Azure portal
- secrets: 'mySecret' # comma separated list of secret keys to fetch from key vault
- id: myGetSecretAction # ID for secrets that you will reference
-```
-
-To use a key vault in your workflow, you need both the key vault action and to reference that action.
-
-In this example, the key vault is named `containervault`. Two key vault secrets are added to the environment with the key vault action - `containerPassword` and `containerUsername`.
-
-The key vault values are later referenced in the docker login task with the prefix `steps.myGetSecretAction.outputs`.
-
-```yaml
-name: Example key vault flow
-
-on: [push]
-
-jobs:
- build:
- runs-on: ubuntu-latest
- steps:
- # checkout the repo
- - uses: actions/checkout@v2
- - uses: Azure/login@v1
- with:
- creds: ${{ secrets.AZURE_CREDENTIALS }}
- - uses: Azure/get-keyvault-secrets@v1
- with:
- keyvault: "containervault"
- secrets: 'containerPassword, containerUsername'
- id: myGetSecretAction
- - uses: azure/docker-login@v1
- with:
- login-server: myregistry.azurecr.io
- username: ${{ steps.myGetSecretAction.outputs.containerUsername }}
- password: ${{ steps.myGetSecretAction.outputs.containerPassword }}
- - run: |
- docker build . -t myregistry.azurecr.io/myapp:${{ github.sha }}
- docker push myregistry.azurecr.io/myapp:${{ github.sha }}
- - uses: azure/webapps-deploy@v2
- with:
- app-name: 'myapp'
- publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }}
- images: 'myregistry.azurecr.io/myapp:${{ github.sha }}'
-```
-
-## Clean up resources
-
-When your Azure app, GitHub repository, and key vault are no longer needed, clean up the resources you deployed by deleting the resource group for the app, GitHub repository, and key vault.
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Learn more about Azure Key Vault](../general/overview.md)
lab-services https://docs.microsoft.com/en-us/azure/lab-services/class-type-arcgis https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lab-services/class-type-arcgis.md
@@ -0,0 +1,96 @@
+
+ Title: Set up a lab for ArcMap\ArcGIS Desktop with Azure Lab Services | Microsoft Docs
+description: Learn how to set up a lab for classes using ArcGIS.
+Last updated: 02/04/2021
+# Set up a lab for ArcMap\ArcGIS Desktop
+
+[ArcGIS](https://www.esri.com/en-us/arcgis/products/arcgis-solutions/overview) is a type of geographic information system (GIS). ArcGIS is used to make\analyze maps and work with geographic data that is provided by the [Environmental Systems Research Institute](https://www.esri.com/en-us/home) (ESRI). Although ArcGIS Desktop includes several applications, this article shows how to set up labs for using ArcMap. [ArcMap](https://desktop.arcgis.com/en/arcmap/latest/map/main/what-is-arcmap-.htm) is used to make, edit, and analyze 2D maps.
+
+## Lab configuration
+
+To begin setting up a lab for using ArcMap, you need an Azure subscription and lab account. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/) before you begin.
+
+After you get an Azure subscription, you can create a new lab account in Azure Lab Services. For more information about creating a new lab account, see [Set up a lab account](tutorial-setup-lab-account.md). You can also use an existing lab account.
+
+### Lab account settings
+
+Enable your lab account settings as described in the following table. For more information about how to enable Azure Marketplace images, see [Specify the Azure Marketplace images available to lab creators](https://docs.microsoft.com/azure/lab-services/specify-marketplace-images).
+
+| Lab account setting | Instructions |
+| - | |
+|Marketplace image| Enable the Windows 10 Pro or Windows 10 Pro N image for use within your lab account.|
+
+### Licensing server
+
+One type of licensing that ArcGIS Desktop offers is [concurrent use licenses](https://desktop.arcgis.com/en/license-manager/latest/license-manager-basics.htm). This requires that you install ArcGIS License Manager on your license server. The License Manager keeps track of the number of copies of software that can be run at the same time. For more information on how to set up the License Manager on your server, see the [License Manager Guide](https://desktop.arcgis.com/en/license-manager/latest/welcome.htm).
+
+The license server is typically either located in your on-premises network or hosted on an Azure virtual machine within an Azure virtual network. After your license server is set up, you'll need to [peer the virtual network](https://docs.microsoft.com/azure/lab-services/how-to-connect-peer-virtual-network) with your [lab account](https://docs.microsoft.com/azure/lab-services/tutorial-setup-lab-account). You need to do the network peering before you create the lab so that your lab VMs can access the license server and vice versa.
+
+For more information, see [Set up a license server as a shared resource](how-to-create-a-lab-with-shared-resource.md).
+
+### Lab settings
+
+The size of the virtual machine (VM) that we recommend using for ArcGIS Desktop depends on the applications, extensions, and the specific versions that students will use. The VM size also depends on the workloads that students are expected to perform. Refer to [ArcGIS Desktop system requirements](https://desktop.arcgis.com/en/system-requirements/latest/arcgis-desktop-system-requirements.htm) to help with identifying the VM size. Once you've identified the potential VM size, we recommend that you test your students' workloads to ensure adequate performance.
+
+In this article, we recommend using the [**Medium** VM size](administrator-guide.md#vm-sizing) for version [10.7.1 of ArcMap](https://desktop.arcgis.com/en/system-requirements/10.7/arcgis-desktop-system-requirements.htm), assuming that no other ArcGIS Desktop extensions are used. However, depending on the needs of your class, you may require a **Large** or even a **Small\Medium GPU (Visualization)** VM size. For example, the [Spatial Analyst extension](https://desktop.arcgis.com/en/arcmap/latest/tools/spatial-analyst-toolbox/gpu-processing-with-spatial-analyst.htm) that is included with ArcGIS Desktop supports a GPU for enhanced performance, but doesn't require using a GPU.
+
+| Lab setting | Value and description |
+| | |
+|Virtual Machine Size| **Medium**. Best suited for relational databases, in-memory caching, and analytics.|
+
+### Template machine
+
+The steps in this section show how to set up the template VM:
+
+1. Start the template VM and connect to the machine using RDP.
+
+2. Download and install the ArcGIS Desktop components using instructions from ESRI. These steps include assigning the license manager for concurrent use licensing:
+ - [Introduction to installing and configuring ArcGIS Desktop](https://desktop.arcgis.com/arcmap/latest/get-started/installation-guide/introduction.htm)
+
+3. Set up external backup storage for students. Students can save files directly to their assigned VM since all changes that they make are saved across sessions. However, we recommend that students back up their work to storage that is external from their VM for a few reasons:
+ - To enable students to access their work after the class and lab ends.
+ - In case the student gets their VM into a bad state and their image needs to be [reset](how-to-set-virtual-machine-passwords.md#reset-vms).
+
+ With ArcGIS, each student should back up the following files at the end of each work session:
+
+ - mxd file, which stores the layout information for a project.
+ - File geodatabases, which store all data produced by ArcGIS.
+ - Any other data that the student may be using such as raster files, shapefiles, GeoTIFF, etc.
+
+ We recommend using OneDrive for backup storage. To set up OneDrive on the template VM, follow the steps in the article [Install and configure OneDrive](how-to-prepare-windows-template.md#install-and-configure-onedrive).
+
+4. Finally, [publish](how-to-create-manage-template.md#publish-the-template-vm) the template VM to create the students' VMs.
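The backup step above can be sketched as a small script. The paths and file names here are hypothetical; the real locations depend on the student's project and OneDrive setup.

```shell
# Hypothetical locations; adjust to the student's actual folders.
project="${PROJECT_DIR:-$HOME/Documents/ArcGIS/term-project}"
backup="${BACKUP_DIR:-$HOME/OneDrive/arcgis-backup}"

mkdir -p "$backup"

# .mxd files store the project layout; .gdb folders are file geodatabases
# that hold the data produced by ArcGIS.
cp "$project"/*.mxd "$backup"/ 2>/dev/null || true
cp -R "$project"/*.gdb "$backup"/ 2>/dev/null || true

ls "$backup"
```

Because the OneDrive folder is synced, anything copied into it survives a VM reset and remains available after the lab ends.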
+
+### Auto-shutdown and disconnect settings
+
+A lab's [auto-shutdown and disconnect settings](cost-management-guide.md#automatic-shutdown-settings-for-cost-control) help make sure that a student's VM is shut down when it's not being used. These settings should be set according to the types of workloads that your students will perform so that their VM doesn't shut down in the middle of their work. For example, the **Disconnect users when virtual machines are idle** setting disconnects the student from their RDP session after no mouse or keyboard inputs have been detected for a specified amount of time. This setting must allow sufficient time for workloads where the student isn't actively using the mouse or keyboard, such as to run long queries or wait for rendering.
+
+For ArcGIS, we recommend the following values for these settings:
+- Disconnect users when virtual machines are idle
+ - 30 minutes after idle state is detected
+- Shut down virtual machines when users disconnect
+ - 15 minutes after user disconnects
+
+## Cost
+
+Let's cover a possible cost estimate for this class. This estimate doesn't include the cost of running the license server. We'll use a class of 25 students. There are 20 hours of scheduled class time. Also, each student gets 10 hours quota for homework or assignments outside scheduled class time. The virtual machine size we chose was **Medium**, which is 42 lab units.
+
+25 students \* (20 scheduled hours + 10 quota hours) \* 42 Lab Units * 0.01 USD per hour = 315.00 USD
+
+>[!IMPORTANT]
+> Cost estimate is for example purposes only. For current details on pricing, see [Azure Lab Services Pricing](https://azure.microsoft.com/pricing/details/lab-services/).
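The arithmetic above can be rechecked with a quick shell sketch. The 0.01 USD per lab-unit hour is the example rate from this estimate, not a quoted price; working in cents keeps the math in integers.

```shell
# Recompute the example estimate: 25 students, 30 hours each, Medium size.
students=25
hours_per_student=$((20 + 10))   # 20 scheduled hours + 10 quota hours
lab_units=42                     # Medium VM size

# Example rate is 0.01 USD per lab-unit hour, i.e. 1 cent.
total_cents=$((students * hours_per_student * lab_units))
echo "Estimated cost: \$$((total_cents / 100)).00"   # Estimated cost: $315.00
```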
+
+## Next steps
+
+Next steps are common to setting up any lab.
+
+- [Create and manage a template](how-to-create-manage-template.md)
+- [Add users](tutorial-setup-classroom-lab.md#add-users-to-the-lab)
+- [Set quota](how-to-configure-student-usage.md#set-quotas-for-users)
+- [Set a schedule](tutorial-setup-classroom-lab.md#set-a-schedule-for-the-lab)
+- [Email registration links to students](how-to-configure-student-usage.md#send-invitations-to-users)
lab-services https://docs.microsoft.com/en-us/azure/lab-services/class-type-solidworks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lab-services/class-type-solidworks.md
@@ -52,7 +52,7 @@ Use the settings in the table below when setting up a classroom lab. For more in
> The **Small GPU (Visualization)** virtual machine size is configured to enable a high-performing graphics experience. For more information about this virtual machine size, see the article on [how to set up a lab with GPUs](./how-to-setup-lab-gpu.md). > [!WARNING]
-> Don't forget to [peer the virtual network](https://www.mathworks.com/support/requirements/matlab-system-requirements.html) for the lab account to the virtual network for the license server **before** creating the lab.
+> Don't forget to [peer the virtual network](./how-to-connect-peer-virtual-network.md) for the lab account to the virtual network for the license server **before** creating the lab.
## Template virtual machine configuration
lab-services https://docs.microsoft.com/en-us/azure/lab-services/class-types https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lab-services/class-types.md
@@ -9,6 +9,11 @@ Last updated 06/26/2020
Azure Lab Services enables you to quickly set up classroom lab environments in the cloud. Articles in this section provide guidance on how to set up several types of labs using Azure Lab Services.
+## ArcGIS
+[ArcGIS](https://www.esri.com/en-us/arcgis/products/arcgis-solutions/overview) is a type of geographic information system (GIS). You can set up a lab that uses ArcGIS Desktop's various applications, such as [ArcMap](https://desktop.arcgis.com/en/arcmap/latest/map/main/what-is-arcmap-.htm) to make, edit, and analyze 2D maps.
+
+For detailed information on how to set up this type of lab, see [Setup a lab for ArcMap\ArcGIS Desktop](class-type-arcgis.md).
+ ## Big data analytics You can set up a GPU lab to teach a big data analytics class. With this type of class, students learn how to handle large volumes of data, and apply machine and statistical learning algorithms to derive data insights. A key objective for students is to learn to use data analytics tools, such as Apache Hadoop's open-source software package which provides tools for storing, managing, and processing big data.
@@ -36,6 +41,11 @@ For detailed information on how to set up this type of lab, see [Set up a lab to
For detailed information on how to set up this type of lab, see [Setup a lab to teach MATLAB](class-type-matlab.md).
+## Networking with GNS3
+You can set up a lab for a class that focuses on allowing students to emulate, configure, test, and troubleshoot virtual and real networks using [GNS3](https://www.gns3.com/) software.
+
+For detailed information on how to set up this type of lab, see [Setup a lab to teach a networking class](class-type-networking-gns3.md).
+ ## Project Lead the Way (PLTW) [Project Lead the Way (PLTW)](https://www.pltw.org/) is a nonprofit organization that provides PreK-12 curriculum across the United States in computer science, engineering, and biomedical science. In each PLTW class, students use a variety of software applications as part of their hands-on learning experience.
@@ -56,7 +66,12 @@ For detailed information on how to set up this type of lab, see [Shell scripting
## SolidWorks computer-aided design (CAD) You can set up a GPU lab that gives engineering students access to [SolidWorks](https://www.solidworks.com/). SolidWorks provides a 3D CAD environment for modeling solid objects. With SolidWorks, engineers can easily create, visualize, simulate and document their designs.
-For detailed information on how to set up this type of lab, see [Set up a lab for engineering classes using SolidWorks](class-type-solidworks.md)
+For detailed information on how to set up this type of lab, see [Set up a lab for engineering classes using SolidWorks](class-type-solidworks.md).
+
+## SQL database and management
+Structured Query Language (SQL) is the standard language for relational database management including adding, accessing, and managing content in a database. You can set up a lab to teach database concepts using both [MySQL](https://www.mysql.com/) Database server and [SQL Server 2019](https://www.microsoft.com/sql-server/sql-server-2019) server.
+
+For detailed information on how to set up this type of lab, see [Set up a lab to teach database management for relational databases](class-type-database-management.md).
## Next steps See the following articles:
load-balancer https://docs.microsoft.com/en-us/azure/load-balancer/load-balancer-distribution-mode https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/load-balancer/load-balancer-distribution-mode.md
@@ -41,8 +41,8 @@ You can change the configuration of the distribution mode by modifying the load-
The following options are available: * **None (hash-based)** - Specifies that successive requests from the same client may be handled by any virtual machine.
-* **Client IP (source IP affinity 2-tuple)** - Specifies that successive requests from the same client IP address will be handled by the same virtual machine.
-* **Client IP and protocol (source IP affinity 3-tuple)** - Specifies that successive requests from the same client IP address and protocol combination will be handled by the same virtual machine.
+* **Client IP (source IP affinity two-tuple)** - Specifies that successive requests from the same client IP address will be handled by the same virtual machine.
+* **Client IP and protocol (source IP affinity three-tuple)** - Specifies that successive requests from the same client IP address and protocol combination will be handled by the same virtual machine.
5. Choose the distribution mode and then select **Save**.
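As a toy illustration (this is not Azure's actual hashing algorithm), picking a backend by hashing a tuple string shows why hashing fewer tuple fields produces stickier sessions; the IP addresses and VM names are made up.

```shell
# Map a tuple string onto one of three hypothetical backend VMs.
pick_backend() {
  printf '%s' "$1" | cksum | awk '{ print "VM-" $1 % 3 }'
}

# Source IP affinity (two-tuple): the same client IP always hashes to the same VM.
pick_backend "203.0.113.7|10.0.0.4"
pick_backend "203.0.113.7|10.0.0.4"

# Five-tuple hashing also includes ports and protocol, so each new connection
# (new ephemeral source port) may land on a different VM.
pick_backend "203.0.113.7|51000|10.0.0.4|80|tcp"
pick_backend "203.0.113.7|51001|10.0.0.4|80|tcp"
```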
@@ -61,13 +61,36 @@ $lb.LoadBalancingRules[0].LoadDistribution = 'sourceIp'
Set-AzLoadBalancer -LoadBalancer $lb ```
-Set the value of the `LoadDistribution` element for the amount of load balancing required.
+Set the value of the `LoadDistribution` element for the type of load balancing required.
-Specify **sourceIP** for two-tuple (source IP and destination IP) load balancing.
+* Specify **SourceIP** for two-tuple (source IP and destination IP) load balancing.
-Specify **sourceIPProtocol** for three-tuple (source IP, destination IP, and protocol type) load balancing.
+* Specify **SourceIPProtocol** for three-tuple (source IP, destination IP, and protocol type) load balancing.
-Specify **default** for the default behavior of five-tuple load balancing.
+* Specify **Default** for the default behavior of five-tuple load balancing.
+
+# [**CLI**](#tab/azure-cli)
++
+Use Azure CLI to change the load-balancer distribution settings on an existing load-balancing rule. The following command updates the distribution mode:
+
+```azurecli-interactive
+az network lb rule update \
+ --lb-name myLoadBalancer \
+ --load-distribution SourceIP \
+ --name myHTTPRule \
+ --resource-group myResourceGroupLB
+```
+Set the value of `--load-distribution` for the type of load balancing required.
+
+* Specify **SourceIP** for two-tuple (source IP and destination IP) load balancing.
+
+* Specify **SourceIPProtocol** for three-tuple (source IP, destination IP, and protocol type) load balancing.
+
+* Specify **Default** for the default behavior of five-tuple load balancing.
+
+For more information on the command used in this article, see [az network lb rule update](/cli/azure/network/lb/rule#az_network_lb_rule_update).
load-balancer https://docs.microsoft.com/en-us/azure/load-balancer/load-balancer-standard-virtual-machine-scale-sets https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/load-balancer/load-balancer-standard-virtual-machine-scale-sets.md
@@ -1,7 +1,7 @@
Title: Azure Standard Load Balancer and Virtual Machine Scale Sets-
-description: With this learning path, get started with Azure Standard Load Balancer and Virtual Machine Scale Sets.
+ Title: Add rules for Azure Standard Load Balancer and virtual machine scale sets
+
+description: With this learning path, get started with Azure Standard Load Balancer and virtual machine scale sets.
documentationcenter: na
@@ -14,29 +14,33 @@
Last updated 07/17/2020
-# Azure Load Balancer with Azure virtual machine scale sets
+# Add rules for Azure Load Balancer with virtual machine scale sets
-When working with virtual machine scale sets and load balancer, the following guidelines should be considered:
+When you work with virtual machine scale sets and Azure Load Balancer, consider the following guidelines.
-## Port Forwarding and inbound NAT rules:
- * After the scale set has been created, the backend port cannot be modified for a load balancing rule used by a health probe of the load balancer. To change the port, you can remove the health probe by updating the Azure virtual machine scale set, update the port and then configure the health probe again.
- * When using the virtual machine scale set in the backend pool of the load balancer, the default inbound NAT rules get created automatically.
+## Port forwarding and inbound NAT rules
+
+After the scale set has been created, the back-end port can't be modified for a load-balancing rule used by a health probe of the load balancer. To change the port, remove the health probe by updating the virtual machine scale set and updating the port. Then configure the health probe again.
+
+When you use the virtual machine scale set in the back-end pool of the load balancer, the default inbound NAT rules are created automatically.
-## Inbound NAT pool:
- * Each virtual machine scale set must have at least one inbound NAT pool.
- * Inbound NAT pool is a collection of inbound NAT rules. One inbound NAT pool cannot support multiple virtual machine scale sets.
+## Inbound NAT pool
+
+Each virtual machine scale set must have at least one inbound NAT pool. An inbound NAT pool is a collection of inbound NAT rules. One inbound NAT pool can't support multiple virtual machine scale sets.
-## Load balancing rules:
- * When using the virtual machine scale set in the backend pool of the load balancer, the default load balancing rule gets created automatically.
+## Load-balancing rules
+
+When you use the virtual machine scale set in the back-end pool of the load balancer, the default load-balancing rule is created automatically.
-## Outbound rules:
- * To create outbound rule for a backend pool that is already referenced by a load balancing rule, you need to first mark **"Create implicit outbound rules"** as **No** in the portal when the inbound load balancing rule is created.
+## Outbound rules
+
+To create an outbound rule for a back-end pool that's already referenced by a load-balancing rule, select **No** under **Create implicit outbound rules** in the Azure portal when the inbound load-balancing rule is created.
- :::image type="content" source="./media/vm-scale-sets/load-balancer-and-vm-scale-sets.png" alt-text="Load balancing rule creation" border="true":::
+ :::image type="content" source="./media/vm-scale-sets/load-balancer-and-vm-scale-sets.png" alt-text="Screenshot that shows load-balancing rule creation." border="true":::
-The following methods can be used to deploy a virtual machine scale set with an existing Azure load balancer.
+Use the following methods to deploy a virtual machine scale set with an existing instance of Load Balancer:
-* [Configure a virtual machine scale set with an existing Azure Load Balancer using the Azure portal](./configure-vm-scale-set-portal.md).
-* [Configure a virtual machine scale set with an existing Azure Load Balancer using Azure PowerShell](./configure-vm-scale-set-powershell.md).
-* [Configure a virtual machine scale set with an existing Azure Load Balancer using the Azure CLI](./configure-vm-scale-set-cli.md).
-* [Update or delete existing Azure Load Balancer used by Virtual Machine Scale Set](./update-load-balancer-with-vm-scale-set.md)
+* [Configure a virtual machine scale set with an existing instance of Azure Load Balancer using the Azure portal](./configure-vm-scale-set-portal.md)
+* [Configure a virtual machine scale set with an existing instance of Azure Load Balancer using Azure PowerShell](./configure-vm-scale-set-powershell.md)
+* [Configure a virtual machine scale set with an existing instance of Azure Load Balancer using the Azure CLI](./configure-vm-scale-set-cli.md)
+* [Update or delete an existing instance of Azure Load Balancer used by a virtual machine scale set](./update-load-balancer-with-vm-scale-set.md)
load-balancer https://docs.microsoft.com/en-us/azure/load-balancer/quickstart-load-balancer-standard-public-powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/load-balancer/quickstart-load-balancer-standard-public-powershell.md
@@ -59,6 +59,7 @@ $publicip = @{
Location = 'eastus' Sku = 'Standard' AllocationMethod = 'static'
+ Zone = 1,2,3
} New-AzPublicIpAddress @publicip
@@ -73,7 +74,7 @@ $publicip = @{
Location = 'eastus' Sku = 'Standard' AllocationMethod = 'static'
- Zone = '1'
+ Zone = 1
} New-AzPublicIpAddress @publicip
@@ -337,6 +338,7 @@ $publicipout = @{
Location = 'eastus' Sku = 'Standard' AllocationMethod = 'static'
+ Zone = 1,2,3
} New-AzPublicIpAddress @publicipout
@@ -351,7 +353,7 @@ $publicipout = @{
Location = 'eastus' Sku = 'Standard' AllocationMethod = 'static'
- Zone = '1'
+ Zone = 1
} New-AzPublicIpAddress @publicipout
load-balancer https://docs.microsoft.com/en-us/azure/load-balancer/update-load-balancer-with-vm-scale-set https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/load-balancer/update-load-balancer-with-vm-scale-set.md
@@ -1,7 +1,7 @@
Title: Update or delete existing Azure Load Balancer used by Virtual Machine Scale Set-
-description: With this how-to article, get started with Azure Standard Load Balancer and Virtual Machine Scale Sets.
+ Title: Update or delete an existing load balancer used by virtual machine scale sets
+
+description: With this how-to article, get started with Azure Standard Load Balancer and virtual machine scale sets.
documentationcenter: na
@@ -14,45 +14,58 @@
Last updated 12/30/2020
-# How to update/delete Azure Load Balancer used by Virtual Machine Scale Sets
+# Update or delete a load balancer used by virtual machine scale sets
-## How to set up Azure Load Balancer for scaling out Virtual Machine Scale Sets
- * Make sure that the Load Balancer has [inbound NAT pool](/cli/azure/network/lb/inbound-nat-pool?view=azure-cli-latest) set up and that the Virtual Machine Scale Set is put in the backend pool of the Load Balancer. Azure Load Balancer will automatically create new inbound NAT rules in the inbound NAT pool when new Virtual Machine instances are added to the Virtual Machine Scale Set.
- * To check whether inbound NAT pool is properly set up,
- 1. Sign in to the Azure portal at https://portal.azure.com.
-
- 1. Select **All resources** on the left menu, and then select **MyLoadBalancer** from the resource list.
-
- 1. Under **Settings**, select **Inbound NAT Rules**.
-If you see in the right pane a list of rules created for each individual instance in the Virtual Machine Scale Set, then congrats, you are all set to go for scaling up at any time.
+When you work with virtual machine scale sets and an instance of Azure Load Balancer, you can:
+
+- Add, update, and delete rules.
+- Add configurations.
+- Delete the load balancer.
+
+## Set up a load balancer for scaling out virtual machine scale sets
+
+Make sure that the instance of Azure Load Balancer has an [inbound NAT pool](/cli/azure/network/lb/inbound-nat-pool?view=azure-cli-latest) set up and that the virtual machine scale set is put in the backend pool of the load balancer. Load Balancer will automatically create new inbound NAT rules in the inbound NAT pool when new virtual machine instances are added to the virtual machine scale set.
+
+To check whether the inbound NAT pool is properly set up:
+
+1. Sign in to the [Azure portal](https://portal.azure.com).
+1. On the left menu, select **All resources**. Then select **MyLoadBalancer** from the resource list.
+1. Under **Settings**, select **Inbound NAT rules**. In the right pane, if you see a list of rules created for each individual instance in the virtual machine scale set, you're all set to go for scaling up at any time.
-## How to add inbound NAT rules?
- * Individual inbound NAT rule cannot be added. However, you can add a set of inbound NAT rules with defined frontend port range and backend port for all instances in the Virtual Machine Scale Set.
- * In order to add a whole set of inbound NAT rules for the Virtual Machine Scale Sets, you need to first create an inbound NAT pool in the Load Balancer, and then reference the inbound NAT pool from the network profile of Virtual Machine Scale Set. A full example using CLI is shown below.
- * The new inbound NAT pool should not have overlapping frontend port range with existing inbound NAT pools. To view existing inbound NAT pools set up, you can use this [CLI command](/cli/azure/network/lb/inbound-nat-pool?view=azure-cli-latest#az_network_lb_inbound_nat_pool_list)
+## Add inbound NAT rules
+
+Individual inbound NAT rules can't be added. But you can add a set of inbound NAT rules with defined front-end port range and back-end port for all instances in the virtual machine scale set.
+
+To add a whole set of inbound NAT rules for the virtual machine scale sets, first create an inbound NAT pool in the load balancer. Then reference the inbound NAT pool from the network profile of the virtual machine scale set. A full example using the CLI is shown below.
+
+The new inbound NAT pool should not have an overlapping front-end port range with existing inbound NAT pools. To view existing inbound NAT pools that are set up, use this [CLI command](/cli/azure/network/lb/inbound-nat-pool?view=azure-cli-latest#az_network_lb_inbound_nat_pool_list):
+
```azurecli-interactive
-az network lb inbound-nat-pool create
- -g MyResourceGroup
- --lb-name MyLb
- -n MyNatPool
- --protocol Tcp
- --frontend-port-range-start 80
- --frontend-port-range-end 89
- --backend-port 80
- --frontend-ip-name MyFrontendIp
-az vmss update
- -g MyResourceGroup
- -n myVMSS
- --add virtualMachineProfile.networkProfile.networkInterfaceConfigurations[0].ipConfigurations[0].loadBalancerInboundNatPools "{'id':'/subscriptions/mySubscriptionId/resourceGroups/MyResourceGroup/providers/Microsoft.Network/loadBalancers/MyLb/inboundNatPools/MyNatPool'}"
-
-az vmss update-instances
- --instance-ids *
- --resource-group MyResourceGroup
- --name MyVMSS
+ az network lb inbound-nat-pool create \
+   -g MyResourceGroup \
+   --lb-name MyLb \
+   -n MyNatPool \
+   --protocol Tcp \
+   --frontend-port-range-start 80 \
+   --frontend-port-range-end 89 \
+   --backend-port 80 \
+   --frontend-ip-name MyFrontendIp
+
+ az vmss update \
+   -g MyResourceGroup \
+   -n myVMSS \
+   --add virtualMachineProfile.networkProfile.networkInterfaceConfigurations[0].ipConfigurations[0].loadBalancerInboundNatPools "{'id':'/subscriptions/mySubscriptionId/resourceGroups/MyResourceGroup/providers/Microsoft.Network/loadBalancers/MyLb/inboundNatPools/MyNatPool'}"
+
+ az vmss update-instances \
+   --instance-ids '*' \
+   --resource-group MyResourceGroup \
+   --name MyVMSS
```
-## How to update inbound NAT rules?
- * Individual inbound NAT rule cannot be updated. However, you can update a set of inbound NAT rules with defined frontend port range and backend port for all instances in the Virtual Machine Scale Set.
- * In order to update a whole set of inbound NAT rules for the Virtual Machine Scale Sets, you need to update the inbound NAT pool in the Load Balancer.
+## Update inbound NAT rules
+
+Individual inbound NAT rules can't be updated. But you can update a set of inbound NAT rules with a defined front-end port range and a back-end port for all instances in the virtual machine scale set.
+
+To update a whole set of inbound NAT rules for virtual machine scale sets, update the inbound NAT pool in the load balancer.
+
```azurecli-interactive az network lb inbound-nat-pool update -g MyResourceGroup
@@ -62,60 +75,68 @@ az network lb inbound-nat-pool update
--backend-port 8080 ```
-## How to delete inbound NAT rules?
-* Individual inbound NAT rule cannot be deleted. However, you can delete the entire set of inbound NAT rules.
-* In order to delete the whole set of inbound NAT rules used by the Scale Set, you need to first remove the NAT pool from the scale set. A full example using CLI is shown below:
+## Delete inbound NAT rules
+
+Individual inbound NAT rules can't be deleted, but you can delete the entire set of inbound NAT rules.
+
+To delete the whole set of inbound NAT rules used by the scale set, first remove the NAT pool from the scale set. A full example using the CLI is shown here:
+
```azurecli-interactive
- az vmss update
- --resource-group MyResourceGroup
- --name MyVMSS
- az vmss update-instances
- --instance-ids "*"
- --resource-group MyResourceGroup
- --name MyVMSS
- az network lb inbound-nat-pool delete
- --resource-group MyResourceGroup
- --lb-name MyLoadBalancer
- --name MyNatPool
+ az vmss update \
+   --resource-group MyResourceGroup \
+   --name MyVMSS
+
+ az vmss update-instances \
+   --instance-ids "*" \
+   --resource-group MyResourceGroup \
+   --name MyVMSS
+
+ az network lb inbound-nat-pool delete \
+   --resource-group MyResourceGroup \
+   --lb-name MyLoadBalancer \
+   --name MyNatPool
```
-## How to add multiple IP Configurations:
-1. Select **All resources** on the left menu, and then select **MyLoadBalancer** from the resource list.
-
-1. Under **Settings**, select