Updates from: 06/02/2021 03:09:08
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Date Transformations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/date-transformations.md
Last updated 02/16/2020

+ # Date claims transformations
Checks that one date and time claim (string data type) is later than a second date and time claim.
| Item | TransformationClaimType | Data type | Notes |
| --- | --- | --- | --- |
| InputClaim | leftOperand | string | First claim's type, which should be later than the second claim. |
| InputClaim | rightOperand | string | Second claim's type, which should be earlier than the first claim. |
-| InputParameter | AssertIfEqualTo | boolean | Specifies whether this assertion should pass if the left operand is equal to the right operand. |
+| InputParameter | AssertIfEqualTo | boolean | Specifies whether this assertion should throw an error if the left operand is equal to the right operand. An error will be thrown if the left operand is equal to the right operand and the value is set to `true`. Possible values: `true` (default), or `false`. |
| InputParameter | AssertIfRightOperandIsNotPresent | boolean | Specifies whether this assertion should pass if the right operand is missing. |
| InputParameter | TreatAsEqualIfWithinMillseconds | int | Specifies the number of milliseconds to allow between the two date times to consider the times equal (for example, to account for clock skew). |
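The assertion semantics these parameters describe can be sketched in Python. This is an illustrative model only, not how Azure AD B2C evaluates the transformation; the function name and the boolean pass/fail return value are assumptions:

```python
from datetime import datetime, timedelta

def assert_datetime_is_greater_than(left, right, assert_if_equal_to=True,
                                    treat_as_equal_if_within_ms=0):
    """Model of AssertDateTimeIsGreaterThan: True means the assertion
    passes, False means the transformation would throw an error."""
    tolerance = timedelta(milliseconds=treat_as_equal_if_within_ms)
    if left == right or abs(left - right) <= tolerance:
        # Values are considered equal: fail only when AssertIfEqualTo is true.
        return not assert_if_equal_to
    # Otherwise the left operand must be later than the right operand.
    return left > right
```

For example, with the default `AssertIfEqualTo` of `true`, two equal timestamps fail the assertion; setting it to `false`, or widening `TreatAsEqualIfWithinMillseconds`, lets near-equal values pass.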
The **AssertDateTimeIsGreaterThan** claims transformation is always executed from a validation technical profile.
![AssertDateTimeIsGreaterThan execution](./media/date-transformations/assert-execution.png)
-The following example compares the `currentDateTime` claim with the `approvedDateTime` claim. An error is thrown if `currentDateTime` is later than `approvedDateTime`. The transformation treats values as equal if they are within 5 minutes (30000 milliseconds) difference.
+The following example compares the `currentDateTime` claim with the `approvedDateTime` claim. An error is thrown if `currentDateTime` is later than `approvedDateTime`. The transformation treats the values as equal if they are within 5 minutes (300000 milliseconds) of each other. It won't throw an error if the values are equal because `AssertIfEqualTo` is set to `false`.
```xml
<ClaimsTransformation Id="AssertApprovedDateTimeLaterThanCurrentDateTime" TransformationMethod="AssertDateTimeIsGreaterThan">
</ClaimsTransformation>
```
+> [!NOTE]
+> In the example above, if you remove the `AssertIfEqualTo` input parameter and `currentDateTime` is equal to `approvedDateTime`, an error is thrown. The `AssertIfEqualTo` default value is `true`.
+>
+ The `login-NonInteractive` validation technical profile calls the `AssertApprovedDateTimeLaterThanCurrentDateTime` claims transformation:

```xml
<TechnicalProfile Id="login-NonInteractive">
```
active-directory-b2c Deploy Custom Policies Devops https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/deploy-custom-policies-devops.md
Previously updated : 05/28/2021 Last updated : 06/01/2021

# Deploy custom policies with Azure Pipelines
-[Azure Pipelines](/azure/devops/pipelines.md) supports continuous integration (CI) and continuous delivery (CD) to constantly and consistently test, build, and ship a code to any target. This article describes how to automate the deployment process of the Azure Active Directory B2C (Azure AD B2C) [custom policies](user-flow-overview.md) using Azure Pipelines.
+[Azure Pipelines](/azure/devops/pipelines) supports continuous integration (CI) and continuous delivery (CD) to constantly and consistently test, build, and ship code to any target. This article describes how to automate the deployment of Azure Active Directory B2C (Azure AD B2C) [custom policies](user-flow-overview.md) using Azure Pipelines.
> [!IMPORTANT]
> Managing Azure AD B2C custom policies with Azure Pipelines currently uses **preview** operations available on the Microsoft Graph API `/beta` endpoint. Use of these APIs in production applications is not supported. For more information, see the [Microsoft Graph REST API beta endpoint reference](/graph/api/overview?toc=.%2fref%2ftoc.json&view=graph-rest-beta&preserve-view=true).
## Prerequisites

* Complete the steps in the [Get started with custom policies in Active Directory B2C](tutorial-create-user-flows.md).
-* If you haven't created an DevOps organization, create one by following the instructions in [Sign up, sign in to Azure DevOps](/azure/devops/user-guide/sign-up-invite-teammates.md).
+* If you haven't created an Azure DevOps organization, create one by following the instructions in [Sign up, sign in to Azure DevOps](/azure/devops/user-guide/sign-up-invite-teammates).
## Register an application for management tasks
active-directory-b2c Restful Technical Profile https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/restful-technical-profile.md
The technical profile also returns claims that aren't returned by the identity provider.
| ServiceUrl | Yes | The URL of the REST API endpoint. |
| AuthenticationType | Yes | The type of authentication being performed by the RESTful claims provider. Possible values: `None`, `Basic`, `Bearer`, `ClientCertificate`, or `ApiKeyHeader`. <br /><ul><li>The `None` value indicates that the REST API is anonymous. </li><li>The `Basic` value indicates that the REST API is secured with HTTP basic authentication. Only verified users, including Azure AD B2C, can access your API. </li><li>The `ClientCertificate` (recommended) value indicates that the REST API restricts access by using client certificate authentication. Only services that have the appropriate certificates, for example Azure AD B2C, can access your API. </li><li>The `Bearer` value indicates that the REST API restricts access using a client OAuth2 bearer token. </li><li>The `ApiKeyHeader` value indicates that the REST API is secured with an API key HTTP header, such as *x-functions-key*. </li></ul> |
| AllowInsecureAuthInProduction | No | Indicates whether the `AuthenticationType` can be set to `none` in a production environment (`DeploymentMode` of the [TrustFrameworkPolicy](trustframeworkpolicy.md) is set to `Production`, or not specified). Possible values: `true`, or `false` (default). |
-| SendClaimsIn | No | Specifies how the input claims are sent to the RESTful claims provider. Possible values: `Body` (default), `Form`, `Header`, `Url` or `QueryString`. <br /> The `Body` value is the input claim that is sent in the request body in JSON format. <br />The `Form` value is the input claim that is sent in the request body in an ampersand '&' separated key value format. <br />The `Header` value is the input claim that is sent in the request header. <br />The `Url` value is the input claim that is sent in the URL, for example, https://api.example.com/{claim1}/{claim2}?{claim3}={claim4}. The host name part of the URL cannot contain claims. <br />The `QueryString` value is the input claim that is sent in the request query string. <br />The HTTP verbs invoked by each are as follows:<br /><ul><li>`Body`: POST</li><li>`Form`: POST</li><li>`Header`: GET</li><li>`Url`: GET</li><li>`QueryString`: GET</li></ul> |
+| SendClaimsIn | No | Specifies how the input claims are sent to the RESTful claims provider. Possible values: `Body` (default), `Form`, `Header`, `Url` or `QueryString`. <br /> The `Body` value is the input claim that is sent in the request body in JSON format. <br />The `Form` value is the input claim that is sent in the request body in an ampersand '&' separated key value format. <br />The `Header` value is the input claim that is sent in the request header. <br />The `Url` value is the input claim that is sent in the URL, for example, `https://api.example.com/{claim1}/{claim2}?{claim3}={claim4}`. The host name part of the URL cannot contain claims. <br />The `QueryString` value is the input claim that is sent in the request query string. <br />The HTTP verbs invoked by each are as follows:<br /><ul><li>`Body`: POST</li><li>`Form`: POST</li><li>`Header`: GET</li><li>`Url`: GET</li><li>`QueryString`: GET</li></ul> |
| ClaimsFormat | No | Not currently used, can be ignored. |
| ClaimUsedForRequestPayload | No | Name of a string claim that contains the payload to be sent to the REST API. |
| DebugMode | No | Runs the technical profile in debug mode. Possible values: `true`, or `false` (default). In debug mode, the REST API can return more information. See the [Returning error message](#returning-validation-error-message) section. |
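The `SendClaimsIn` mapping above can be illustrated with a small Python sketch that builds the outgoing request for each documented mode. The helper is hypothetical (it models the documented verb/location mapping, not the actual Azure AD B2C RESTful provider):

```python
import json
from urllib.parse import urlencode

def build_rest_request(service_url, claims, send_claims_in="Body"):
    """Return (verb, url, headers, body) for each documented SendClaimsIn mode."""
    if send_claims_in == "Body":
        # POST, claims as a JSON request body.
        return "POST", service_url, {"Content-Type": "application/json"}, json.dumps(claims)
    if send_claims_in == "Form":
        # POST, claims as ampersand-separated key=value pairs.
        return "POST", service_url, {"Content-Type": "application/x-www-form-urlencoded"}, urlencode(claims)
    if send_claims_in == "Header":
        # GET, claims carried as request headers.
        return "GET", service_url, dict(claims), None
    if send_claims_in == "QueryString":
        # GET, claims appended to the query string.
        return "GET", f"{service_url}?{urlencode(claims)}", {}, None
    if send_claims_in == "Url":
        # GET, claims substituted into URL placeholders like {claim1};
        # the host name part of the URL can't contain claims.
        return "GET", service_url.format(**claims), {}, None
    raise ValueError(f"Unknown SendClaimsIn value: {send_claims_in}")
```

For example, `Body` mode produces a `POST` with a JSON payload, while `QueryString` mode produces a `GET` with URL-encoded claims.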
active-directory-domain-services Change Sku https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-domain-services/change-sku.md
You can change SKUs up or down after the managed domain has been deployed. Howev
For example:
-* If you have created two forest trusts on the *Premium* SKU, you can't change down to the *Standard* SKU. The *Standard* SKU doesn't support forest trusts.
+* You can't change down to the *Standard* SKU. An Azure AD DS resource forest doesn't support the *Standard* SKU.
* Or, if you have created seven trusts on the *Premium* SKU, you can't change down to the *Enterprise* SKU. The *Enterprise* SKU supports a maximum of five trusts. For more information on these limits, see [Azure AD DS SKU features and limits][concepts-sku].
active-directory Workday Integration Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/workday-integration-reference.md
Previously updated : 05/11/2021 Last updated : 06/01/2021
The table below provides guidance on mapping configuration to use to retrieve a
Here are some examples on how you can extend the Workday integration to meet specific requirements.
-**Example 1**
+### Example 1: Retrieving cost center and pay group information
Let's say you want to retrieve the following data sets from Workday and use them in your provisioning rules:
To retrieve these data sets:
>| CostCenterCode | wd:Worker/wd:Worker_Data/wd:Organization_Data/wd:Worker_Organization_Data/wd:Organization_Data[wd:Organization_Type_Reference/@wd:Descriptor='Cost Center']/wd:Organization_Code/text() |
>| PayGroup | wd:Worker/wd:Worker_Data/wd:Organization_Data/wd:Worker_Organization_Data/wd:Organization_Data[wd:Organization_Type_Reference/@wd:Descriptor='Pay Group']/wd:Organization_Name/text() |
-**Example 2**
+### Example 2: Retrieving qualification and skills data
Let's say you want to retrieve certifications associated with a user. This information is available as part of the *Qualification Data* set. To get this data set as part of the *Get_Workers* response, use the following XPATH: `wd:Worker/wd:Worker_Data/wd:Qualification_Data/wd:Certification/wd:Certification_Data/wd:Issuer/text()`
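To see what such an XPATH selects, here's a quick Python sketch that evaluates the equivalent path against a trimmed, made-up *Get_Workers* fragment. The sample XML is illustrative, not real Workday output; `xml.etree.ElementTree` has no `text()` step, so the sketch reads `.text` instead:

```python
import xml.etree.ElementTree as ET

# Trimmed, hypothetical Get_Workers response fragment (not real Workday output).
RESPONSE = """
<wd:Worker xmlns:wd="urn:com.workday/bsvc">
  <wd:Worker_Data>
    <wd:Qualification_Data>
      <wd:Certification>
        <wd:Certification_Data>
          <wd:Issuer>Microsoft</wd:Issuer>
        </wd:Certification_Data>
      </wd:Certification>
    </wd:Qualification_Data>
  </wd:Worker_Data>
</wd:Worker>
"""

NS = {"wd": "urn:com.workday/bsvc"}
root = ET.fromstring(RESPONSE)
# The documented path starts at wd:Worker; since that's the root element
# here, the findall path is relative to it.
issuers = [el.text for el in root.findall(
    "wd:Worker_Data/wd:Qualification_Data/wd:Certification/"
    "wd:Certification_Data/wd:Issuer", NS)]
```

This mirrors the selection logic of the documented XPATH; a full XPath 1.0 engine such as `lxml` would be needed to evaluate the expression, including `text()`, verbatim.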
-**Example 3**
+### Example 3: Retrieving provisioning group assignments
Let's say you want to retrieve *Provisioning Groups* assigned to a worker. This information is available as part of the *Account Provisioning Data* set.
-To get this data set as part of the *Get_Workers* response, use the following XPATH:
+To get this data set as part of the *Get_Workers* response, use the following XPATH:
`wd:Worker/wd:Worker_Data/wd:Account_Provisioning_Data/wd:Provisioning_Group_Assignment_Data[wd:Status='Assigned']/wd:Provisioning_Group/text()`

## Handling different HR scenarios
+### Support for worker conversions
+
+When a worker converts from employee to contingent worker or from contingent worker to employee, the Workday connector automatically detects this change and links the AD account to the active worker profile so that all AD attributes are in sync with the active worker profile. No configuration changes are required to enable this functionality. Here is the description of the provisioning behavior when a conversion happens.
+
+* Let's say John Smith joins as a contingent worker in January. As there is no AD account associated with John's *WorkerID* (matching attribute), the provisioning service creates a new AD account for the user and links John's contingent worker *WID (WorkdayID)* to his AD account.
+* Three months later, John converts to a full-time employee. In Workday, a new worker profile is created for John. Though John's *WorkerID* in Workday stays the same, John now has two *WID*s in Workday, one associated with the contingent worker profile and another associated with the employee worker profile.
+* During incremental sync, when the provisioning service detects two worker profiles for the same WorkerID, it automatically transfers ownership of the AD account to the active worker profile. In this case, it de-links the contingent worker profile from the AD account and establishes a new link between John's active employee worker profile and his AD account.
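The linking behavior in the steps above can be modeled with a tiny Python sketch. This is a hypothetical data model; field names like `wid` and `active` are assumptions for illustration, not the connector's schema:

```python
def resolve_linked_profile(worker_profiles):
    """Given all Workday profiles sharing one WorkerID, return the WID the
    AD account should be linked to: the active profile wins over inactive ones."""
    active = [p for p in worker_profiles if p["active"]]
    return active[0]["wid"] if active else None

# John's contingent worker profile is deactivated after his conversion
# to a full-time employee; both profiles share his WorkerID.
profiles = [
    {"wid": "WID-contingent", "worker_type": "Contingent", "active": False},
    {"wid": "WID-employee", "worker_type": "Employee", "active": True},
]
```

Under this model, the AD account is re-linked from `WID-contingent` to `WID-employee` once the employee profile becomes the active one.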
+
+>[!NOTE]
+>During initial full sync, you may notice a behavior where attribute values associated with the previous, inactive worker profile flow to the AD account of converted workers. This behavior is temporary; as full sync progresses, the values are eventually overwritten by attribute values from the active worker profile. Once full sync is complete and the provisioning job reaches steady state, it always picks the active worker profile during incremental sync.
+
### Retrieving international job assignments and secondary job details

By default, the Workday connector retrieves attributes associated with the worker's primary job. The connector also supports retrieving *Additional Job Data* associated with international job assignments or secondary jobs.
active-directory Concept Sspr Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/concept-sspr-policy.md
The following Azure AD password policy options are defined. Unless noted, you ca
| Property | Requirements |
| --- | --- |
-| Characters allowed |<ul><li>A ΓÇô Z</li><li>a - z</li><li>0 ΓÇô 9</li> <li>@ # $ % ^ & * - _ ! + = [ ] { } &#124; \ : ' , . ? / \` ~ " ( ) ;</li> <li>blank space</li></ul> |
+| Characters allowed |<ul><li>A - Z</li><li>a - z</li><li>0 - 9</li> <li>@ # $ % ^ & * - _ ! + = [ ] { } &#124; \ : ' , . ? / \` ~ " ( ) ; < ></li> <li>blank space</li></ul> |
| Characters not allowed | Unicode characters. |
| Password restrictions |<ul><li>A minimum of 8 characters and a maximum of 256 characters.</li><li>Requires three out of four of the following:<ul><li>Lowercase characters.</li><li>Uppercase characters.</li><li>Numbers (0-9).</li><li>Symbols (see the previous password restrictions).</li></ul></li></ul> |
| Password expiry duration (Maximum password age) |<ul><li>Default value: **90** days.</li><li>The value is configurable by using the `Set-MsolPasswordPolicy` cmdlet from the Azure Active Directory Module for Windows PowerShell.</li></ul> |
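The policy rows above amount to a mechanical check. Here's a hedged Python sketch; the helper name is made up, and real Azure AD validation may differ in edge cases (for example, how the blank space is classified):

```python
import string

# Symbols from the documented "Characters allowed" row (space handled separately).
SYMBOLS = set("@#$%^&*-_!+=[]{}|\\:',.?/`~\"();<>")
ALLOWED = set(string.ascii_letters) | set(string.digits) | SYMBOLS | {" "}

def meets_password_policy(password):
    """8-256 characters, only allowed characters, and at least
    three of four character classes (lower, upper, digit, symbol)."""
    if not 8 <= len(password) <= 256:
        return False
    if any(c not in ALLOWED for c in password):
        return False  # Unicode characters, for example, aren't allowed
    classes = [
        any(c in string.ascii_lowercase for c in password),
        any(c in string.ascii_uppercase for c in password),
        any(c in string.digits for c in password),
        any(c in SYMBOLS for c in password),
    ]
    return sum(classes) >= 3
```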
active-directory How To Nudge Authenticator App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/how-to-nudge-authenticator-app.md
Previously updated : 05/27/2021 Last updated : 06/01/2021
Here are a few sample JSONs you can use to get started!
![User object ID](./media/how-to-nudge-authenticator-app/object-id.png)
-### Identify the GUIDs of users to insert in the JSONs
+### Identify the GUIDs of groups to insert in the JSONs
1. Navigate to the Azure portal.
1. Tap **Azure Active Directory**.
active-directory Howto Mfaserver Adfs 2012 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/howto-mfaserver-adfs-2012.md
Previously updated : 07/11/2018 Last updated : 06/01/2021
Follow these steps to edit the MultiFactorAuthenticationAdfsAdapter.config file:
There are two options for configuring the Web Service SDK. The first is with a username and password, the second is with a client certificate. Follow these steps for the first option, or skip ahead for the second.

1. Set the value for **WebServiceSdkUsername** to an account that is a member of the PhoneFactor Admins security group. Use the &lt;domain&gt;&#92;&lt;user name&gt; format.
-2. Set the value for **WebServiceSdkPassword** to the appropriate account password.
+2. Set the value for **WebServiceSdkPassword** to the appropriate account password. The special character `&` can't be used in the **WebServiceSdkPassword**.
### Configure the Web Service SDK with a client certificate
active-directory Tutorial Enable Sspr https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/tutorial-enable-sspr.md
Previously updated : 04/21/2021 Last updated : 06/01/2021
To finish this tutorial, you need the following resources and privileges:
Azure AD lets you enable SSPR for *None*, *Selected*, or *All* users. This granular ability lets you choose a subset of users to test the SSPR registration process and workflow. When you're comfortable with the process and the time is right to communicate the requirements to a broader set of users, you can select a group of users to enable for SSPR. Or, you can enable SSPR for everyone in the Azure AD tenant.

> [!NOTE]
-> Currently, you can only enable one Azure AD group for SSPR using the Azure portal. As part of a wider deployment of SSPR, Azure AD supports nested groups. Make sure that the users in the group(s) you choose have the appropriate licenses assigned. There's currently no validation process of these licensing requirements.
+> Currently, you can only enable one Azure AD group for SSPR using the Azure portal. As part of a wider deployment of SSPR, Azure AD supports nested groups.
In this tutorial, set up SSPR for a set of users in a test group. Use the *SSPR-Test-Group* and provide your own Azure AD group as needed:
active-directory Active Directory Configurable Token Lifetimes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/active-directory-configurable-token-lifetimes.md
Previously updated : 04/08/2021 Last updated : 06/01/2021
ID tokens are passed to websites and native clients. ID tokens contain profile information about a user.
## Token lifetime policies for refresh tokens and session tokens
-You can not set token lifetime policies for refresh tokens and session tokens.
+You can't set token lifetime policies for refresh tokens and session tokens. For lifetime, timeout, and revocation information on refresh tokens, see [Refresh tokens](refresh-tokens.md).
> [!IMPORTANT]
> As of January 30, 2021, you can't configure refresh and session token lifetimes. Azure Active Directory no longer honors refresh and session token configuration in existing policies. New tokens issued after existing tokens have expired are now set to the [default configuration](#configurable-token-lifetime-properties). You can still configure access, SAML, and ID token lifetimes after the refresh and session token configuration retirement.
active-directory Quickstart Register App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/quickstart-register-app.md
Previously updated : 05/04/2021 Last updated : 05/30/2021 -
-# Customer intent: As an enterprise developer or software-as-a-service (SaaS) provider, I want to know how to register my application with the Microsoft identity platform so that the security token service can issue ID and/or access tokens to clients that want to access it.
+# Customer intent: As a developer, I want to know how to register my application with the Microsoft identity platform so that the security token service can issue ID and/or access tokens to client applications that request them.
# Quickstart: Register an application with the Microsoft identity platform
You can add both certificates and client secrets (a string) as credentials to your app registration.
### Add a certificate
-Sometimes called a _public key_, a certificate is the recommended credential type. It provides more assurance than a client secret. For more information about using a certificate as an authentication method in your application, see [Microsoft identity platform application authentication certificate credentials](active-directory-certificate-credentials.md).
+Sometimes called a _public key_, a certificate is the recommended credential type because it's considered more secure than a client secret. For more information about using a certificate as an authentication method in your application, see [Microsoft identity platform application authentication certificate credentials](active-directory-certificate-credentials.md).
1. In the Azure portal, in **App registrations**, select your application.
1. Select **Certificates & secrets** > **Upload certificate**.
Sometimes called a _public key_, a certificate is the recommended credential typ
### Add a client secret
-The client secret is also known as an _application password_. It's a string value your app can use in place of a certificate to identity itself. The client secret is the easier of the two credential types to use. It's often used during development, but it's considered less secure than a certificate. Use certificates in your applications that are running in production.
+Sometimes called an _application password_, a client secret is a string value your app can use in place of a certificate to identify itself.
-For more information about application security recommendations, see [Microsoft identity platform best practices and recommendations](identity-platform-integration-checklist.md#security).
+Client secrets are considered less secure than certificate credentials. Application developers sometimes use client secrets during local app development because of their ease of use. However, you should use certificate credentials for any application you have running in production.
1. In the Azure portal, in **App registrations**, select your application.
1. Select **Certificates & secrets** > **New client secret**.
1. Add a description for your client secret.
-1. Select a duration.
+1. Select an expiration for the secret or specify a custom lifetime.
+ - Client secret lifetime is limited to two years (24 months) or less. You can't specify a custom lifetime longer than 24 months.
+ - Microsoft recommends that you set an expiration value of less than 12 months.
1. Select **Add**.
1. _Record the secret's value_ for use in your client application code. This secret value is _never displayed again_ after you leave this page.
-For security reasons, Microsoft limits creation of client secrets longer than 24 months and strongly recommends that you set this to a value less than 12 months.
+For application security recommendations, see [Microsoft identity platform best practices and recommendations](identity-platform-integration-checklist.md#security).
## Next steps
active-directory Sample V2 Code https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/sample-v2-code.md
These samples show how to write a single-page application secured with the Microsoft identity platform.
> [!div class="mx-tdCol2BreakAll"]
> | Language/<br/>Platform | Code sample | Description | Auth libraries | Auth flow |
> | - | - | - | - | - |
-> |Angular|[GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-angular-spa)| &#8226; Signs in users with AAD <br/>&#8226; Calls Microsoft Graph | MSAL Angular | Auth code flow (with PKCE) |
-> | Angular | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial) | &#8226; [Signs in users](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/1-Authentication/1-sign-in/README.md)<br/>&#8226; [Signs in users (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/1-Authentication/2-sign-in-b2c/README.md) <br/> &#8226; [Calls Microsoft Graph](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/2-Authorization-I/1-call-graph/README.md)<br/>&#8226; [Calls .NET Core web API](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/3-Authorization-II/1-call-api)<br/>&#8226; [Calls .NET Core web API (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/3-Authorization-II/2-call-api-b2c)<br/>&#8226; [Calls Microsoft Graph via OBO](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/7-AdvancedScenarios/1-call-api-obo/README.md)<br/>&#8226; [Calls .NET Core web API using PoP](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/7-AdvancedScenarios/2-call-api-pop/README.md)<br/>&#8226; [Uses App Roles for access control](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/5-AccessControl/1-call-api-roles/README.md)<br/>&#8226; [Uses Security Groups for access control](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/5-AccessControl/2-call-api-groups/README.md)<br/>&#8226; [Deploys to Azure Storage & App Service](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/4-Deployment/README.md)| MSAL Angular | &#8226; Auth code flow (with PKCE)<br/>&#8226; On-behalf-of (OBO) flow<br/>&#8226; Proof of Possession (PoP)|
-> | Blazor WebAssembly | [GitHub repo](https://github.com/Azure-Samples/ms-identity-blazor-wasm) | &#8226; Signs in users<br/>&#8226; Calls Microsoft Graph | MSAL.js | Auth code flow (with PKCE) |
-> | JavaScript | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-v2) | &#8226; Signs in users<br/>&#8226; Calls Microsoft Graph | MSAL.js | Auth code flow (with PKCE) |
-> | JavaScript | [GitHub repo](https://github.com/Azure-Samples/ms-identity-b2c-javascript-spa) | &#8226; Signs in users (B2C)<br/>&#8226; Calls Node.js web API | MSAL.js | Auth code flow (with PKCE) |
-> | JavaScript | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-tutorial) | &#8226; [Signs in users](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/1-Authentication/1-sign-in/README.md)<br/>&#8226; [Signs in users (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/1-Authentication/2-sign-in-b2c/README.md) <br/> &#8226; [Calls Microsoft Graph](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/2-Authorization-I/1-call-graph/README.md)<br/>&#8226; [Calls Node.js web API](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/3-Authorization-II/1-call-api/README.md)<br/>&#8226; [Calls Node.js web API (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/3-Authorization-II/2-call-api-b2c/README.md)<br/>&#8226; [Calls Microsoft Graph via OBO](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/4-AdvancedGrants/1-call-api-graph/README.md)<br/>&#8226; [Calls Node.js web API via OBO & CA](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/4-AdvancedGrants/2-call-api-api-c)| MSAL.js | &#8226; Auth code flow (with PKCE)<br/>&#8226; On-behalf-of (OBO) flow<br/>&#8226; Conditional Access (CA) |
-> | React | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-react-spa) | &#8226; Signs in users<br/>&#8226; Calls Microsoft Graph | MSAL React | Auth code flow (with PKCE) |
-> | React | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial) | &#8226; [Signs in users](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/1-Authentication/1-sign-in/README.md)<br/>&#8226; [Signs in users (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/1-Authentication/2-sign-in-b2c/README.md) <br/> &#8226; [Calls Microsoft Graph](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/2-Authorization-I/1-call-graph/README.md)<br/>&#8226; [Calls Node.js web API](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/3-Authorization-II/1-call-api)<br/>&#8226; [Calls Node.js web API (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/3-Authorization-II/2-call-api-b2c)<br/>&#8226; [Calls Microsoft Graph via OBO](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/6-AdvancedScenarios/1-call-api-obo/README.md)<br/>&#8226; [Calls Node.js web API using PoP](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/6-AdvancedScenarios/2-call-api-pop/README.md)<br/>&#8226; [Uses App Roles for access control](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/5-AccessControl/1-call-api-roles/README.md)<br/>&#8226; [Uses Security Groups for access control](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/5-AccessControl/2-call-api-groups/README.md)<br/>&#8226; [Deploys to Azure Storage & App Service](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/4-Deployment/1-deploy-storage/README.md)<br/>&#8226; [Deploys to Azure Static Web Apps](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/4-Deployment/2-deploy-static/README.md)| MSAL React | &#8226; Auth code flow (with PKCE)<br/>&#8226; On-behalf-of (OBO) flow<br/>&#8226; Conditional Access (CA)<br/>&#8226; Proof of Possession (PoP) |
+> |Angular|[GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-angular-spa)| &#8226; Sign in users with AAD <br/>&#8226; Call Microsoft Graph | MSAL Angular | Auth code flow (with PKCE) |
+> | Angular | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial) | &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/1-Authentication/1-sign-in/README.md)<br/>&#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/1-Authentication/2-sign-in-b2c/README.md) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/2-Authorization-I/1-call-graph/README.md)<br/>&#8226; [Call .NET Core web API](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/3-Authorization-II/1-call-api)<br/>&#8226; [Call .NET Core web API (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/3-Authorization-II/2-call-api-b2c)<br/>&#8226; [Call Microsoft Graph via OBO](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/7-AdvancedScenarios/1-call-api-obo/README.md)<br/>&#8226; [Call .NET Core web API using PoP](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/7-AdvancedScenarios/2-call-api-pop/README.md)<br/>&#8226; [Use App Roles for access control](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/5-AccessControl/1-call-api-roles/README.md)<br/>&#8226; [Use Security Groups for access control](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/5-AccessControl/2-call-api-groups/README.md)<br/>&#8226; [Deploy to Azure Storage & App Service](https://github.com/Azure-Samples/ms-identity-javascript-angular-tutorial/tree/main/4-Deployment/README.md)| MSAL Angular | &#8226; Auth code flow (with PKCE)<br/>&#8226; On-behalf-of (OBO) flow<br/>&#8226; Proof of Possession (PoP)|
+> | Blazor WebAssembly | [GitHub repo](https://github.com/Azure-Samples/ms-identity-blazor-wasm) | &#8226; Sign in users<br/>&#8226; Call Microsoft Graph | MSAL.js | Auth code flow (with PKCE) |
+> | JavaScript | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-v2) | &#8226; Sign in users<br/>&#8226; Call Microsoft Graph | MSAL.js | Auth code flow (with PKCE) |
+> | JavaScript | [GitHub repo](https://github.com/Azure-Samples/ms-identity-b2c-javascript-spa) | &#8226; Sign in users (B2C)<br/>&#8226; Call Node.js web API | MSAL.js | Auth code flow (with PKCE) |
+> | JavaScript | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-tutorial) | &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/1-Authentication/1-sign-in/README.md)<br/>&#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/1-Authentication/2-sign-in-b2c/README.md) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/2-Authorization-I/1-call-graph/README.md)<br/>&#8226; [Call Node.js web API](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/3-Authorization-II/1-call-api/README.md)<br/>&#8226; [Call Node.js web API (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/3-Authorization-II/2-call-api-b2c/README.md)<br/>&#8226; [Call Microsoft Graph via OBO](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/4-AdvancedGrants/1-call-api-graph/README.md)<br/>&#8226; [Call Node.js web API via OBO & CA](https://github.com/Azure-Samples/ms-identity-javascript-tutorial/tree/main/4-AdvancedGrants/2-call-api-api-c)| MSAL.js | &#8226; Auth code flow (with PKCE)<br/>&#8226; On-behalf-of (OBO) flow<br/>&#8226; Conditional Access (CA) |
+> | React | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-react-spa) | &#8226; Sign in users<br/>&#8226; Call Microsoft Graph | MSAL React | Auth code flow (with PKCE) |
+> | React | [GitHub repo](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial) | &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/1-Authentication/1-sign-in/README.md)<br/>&#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/1-Authentication/2-sign-in-b2c/README.md) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/2-Authorization-I/1-call-graph/README.md)<br/>&#8226; [Call Node.js web API](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/3-Authorization-II/1-call-api)<br/>&#8226; [Call Node.js web API (B2C)](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/3-Authorization-II/2-call-api-b2c)<br/>&#8226; [Call Microsoft Graph via OBO](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/6-AdvancedScenarios/1-call-api-obo/README.md)<br/>&#8226; [Call Node.js web API using PoP](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/6-AdvancedScenarios/2-call-api-pop/README.md)<br/>&#8226; [Use App Roles for access control](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/5-AccessControl/1-call-api-roles/README.md)<br/>&#8226; [Use Security Groups for access control](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/5-AccessControl/2-call-api-groups/README.md)<br/>&#8226; [Deploy to Azure Storage & App Service](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/4-Deployment/1-deploy-storage/README.md)<br/>&#8226; [Deploy to Azure Static Web Apps](https://github.com/Azure-Samples/ms-identity-javascript-react-tutorial/tree/main/4-Deployment/2-deploy-static/README.md)| MSAL React | &#8226; Auth code flow (with PKCE)<br/>&#8226; On-behalf-of (OBO) flow<br/>&#8226; Conditional Access (CA)<br/>&#8226; Proof of Possession (PoP) |
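Many of the single-page application samples above use the auth code flow with PKCE. As a rough sketch of what PKCE adds (this is generic RFC 7636 logic, not MSAL's internal implementation), the client derives a one-time `code_challenge` from a random `code_verifier`:

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    # 32 random bytes -> 43-character URL-safe verifier, padding stripped.
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode("ascii")
    # code_challenge = BASE64URL(SHA256(code_verifier)), also without padding.
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return verifier, challenge

verifier, challenge = make_pkce_pair()
```

The challenge travels on the authorization request and the verifier on the token redemption, so an intercepted authorization code is useless without the verifier; the MSAL libraries listed in this table do this automatically.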
## Web applications

The following samples illustrate web applications that sign in users. Some samples also demonstrate the application calling Microsoft Graph, or your own web API with the user's identity.
-| Platform | Only signs in users | Signs in users and calls Microsoft Graph |
-| -- | - | |
-| ![This image shows the ASP.NET Core logo](media/sample-v2-code/logo_NETcore.png)</p>ASP.NET Core | [ASP.NET Core WebApp signs-in users tutorial](https://aka.ms/aspnetcore-webapp-sign-in) | Same sample in the [ASP.NET Core web app calls Microsoft Graph](https://aka.ms/aspnetcore-webapp-call-msgraph) phase</p>Advanced sample [Accessing the logged-in user's token cache from background apps, APIs and services](https://github.com/Azure-Samples/ms-identity-dotnet-advanced-token-cache) |
-| ![This image shows the ASP.NET Framework logo](media/sample-v2-code/logo_NETframework.png)</p>ASP.NET Core | [AD FS to Azure AD application migration playbook for developers](https://github.com/Azure-Samples/ms-identity-dotnet-adfs-to-aad) to learn how to safely and securely migrate your applications integrated with Active Directory Federation Services (AD FS) to Azure Active Directory (Azure AD) | |
-| ![This image shows the ASP.NET Framework logo](media/sample-v2-code/logo_NETframework.png)</p> ASP.NET | [ASP.NET Quickstart](https://github.com/AzureAdQuickstarts/AppModelv2-WebApp-OpenIDConnect-DotNet) </p> [dotnet-webapp-openidconnect-v2](https://github.com/azure-samples/active-directory-dotnet-webapp-openidconnect-v2) | [dotnet-admin-restricted-scopes-v2](https://github.com/azure-samples/active-directory-dotnet-admin-restricted-scopes-v2) </p> [msgraph-training-aspnetmvcapp](https://github.com/microsoftgraph/msgraph-training-aspnetmvcapp) |
-| ![This image shows the Java logo](medi) Sign in with AAD| |
-| ![This image shows the Java logo](medi) Sign in with B2C |
-| ![This image shows the Java logo](medi) Sign in with AAD and call Graph|
-| ![This image shows the Java logo](medi) Sign in with AAD and control access with Roles claim| |
-| ![This image shows the Java logo](medi) Sign in with AAD and control access with Groups claim|
-| ![This image shows the Java logo](medi) Deploy to Azure App Service|
-| ![This image shows the Java logo](media/sample-v2-code/logo_java.png) | | [ms-identity-java-webapp](https://github.com/Azure-Samples/ms-identity-java-webapp) |
-| ![This image shows the Java logo](media/sample-v2-code/logo_java.png) | [ms-identity-b2c-java-servlet-webapp-authentication](https://github.com/Azure-Samples/ms-identity-b2c-java-servlet-webapp-authentication)| |
-| ![This image shows the Node.js logo](media/sample-v2-code/logo_nodejs.png)</p>Node.js (MSAL Node) | [Express web app signs-in users tutorial](https://github.com/Azure-Samples/ms-identity-node) | |
-| ![This image shows the Python logo](medi) Sign in with AAD | |
-| ![This image shows the Python logo](medi) Sign in with B2C | |
-| ![This image shows the Python logo](medi) Sign in with AAD and Call Graph |
-| ![This image shows the Python logo](medi) Deploy to Azure App Service |
-| ![This image shows the Python logo](medi) Sign in with AAD | |
-| ![This image shows the Python logo](medi) Sign in with B2C | |
-| ![This image shows the Python logo](medi) Sign in with AAD and Call Graph|
-| ![This image shows the Python logo](medi) Deploy to Azure App Service |
-| ![This image shows the Python logo](media/sample-v2-code/logo_python.png) | | [Python Flask web app](https://github.com/Azure-Samples/ms-identity-python-webapp) |
-| ![This image shows the Ruby logo](media/sample-v2-code/logo_ruby.png) | | [msgraph-training-rubyrailsapp](https://github.com/microsoftgraph/msgraph-training-rubyrailsapp) |
-| ![This image shows the Blazor logo](media/sample-v2-code/logo-blazor.png)</p>Blazor Server | [Blazor Server app signs-in users tutorial](https://github.com/Azure-Samples/ms-identity-blazor-server/tree/main/WebApp-OIDC) | [Blazor Server app calls Microsoft Graph](https://github.com/Azure-Samples/ms-identity-blazor-server/tree/main/WebApp-graph-user/Call-MSGraph)</p>Chapterwise Tutorial: [Blazor Server app to sign-in users and call APIs with Azure Active Directory](https://github.com/Azure-Samples/ms-identity-blazor-server) |
+> [!div class="mx-tdCol2BreakAll"]
+> | Language/<br/>Platform | Code sample<br/>on GitHub | Description | Authentication libraries used | Authentication flow |
+> | - | -- | -- | - | -- |
+> | ASP.NET Core|[GitHub repo](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2) | ASP.NET Core Series <br/> &#8226; [Sign in users](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/1-WebApp-OIDC/README.md) <br/> &#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/1-WebApp-OIDC/1-5-B2C/README.md) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/2-WebApp-graph-user/2-1-Call-MSGraph/README.md) <br/> &#8226; [Customize token cache](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/2-WebApp-graph-user/2-2-TokenCache/README.md) <br/> &#8226; [Call Graph (multi-tenant)](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/2-WebApp-graph-user/2-3-Multi-Tenant/README.md) <br/> &#8226; [Call Azure REST APIs](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/3-WebApp-multi-APIs/README.md) <br/> &#8226; [Protect web API](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/4-WebApp-your-API/4-1-MyOrg/README.md) <br/> &#8226; [Protect web API (B2C)](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/4-WebApp-your-API/4-2-B2C/README.md) <br/> &#8226; [Protect multi-tenant web API](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/4-WebApp-your-API/4-3-AnyOrg/Readme.md) <br/> &#8226; [Use App Roles for access control](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/5-WebApp-AuthZ/5-1-Roles/README.md) <br/> &#8226; [Use Security Groups for access control](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/5-WebApp-AuthZ/5-2-Groups/README.md) <br/> &#8226; [Deploy to Azure Storage & App Service](https://github.com/Azure-Samples/active-directory-aspnetcore-webapp-openidconnect-v2/blob/master/6-Deploy-to-Azure/README.md) | &#8226; [MSAL.NET](https://aka.ms/msal-net) <br/> &#8226; [Microsoft.Identity.Web](https://aka.ms/microsoft-identity-web) | &#8226; [OIDC flow](https://docs.microsoft.com/azure/active-directory/develop/v2-protocols-oidc) <br/> &#8226; [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) <br/> &#8226; [On-Behalf-Of (OBO) flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-on-behalf-of-flow) |
+> | Blazor | [GitHub repo](https://github.com/Azure-Samples/ms-identity-blazor-server/) | Blazor Server Series <br/> &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-blazor-server/tree/main/WebApp-OIDC/MyOrg) <br/> &#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-blazor-server/tree/main/WebApp-OIDC/B2C) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-blazor-server/tree/main/WebApp-graph-user/Call-MSGraph) <br/> &#8226; [Call web API](https://github.com/Azure-Samples/ms-identity-blazor-server/tree/main/WebApp-your-API/MyOrg) <br/> &#8226; [Call web API (B2C)](https://github.com/Azure-Samples/ms-identity-blazor-server/tree/main/WebApp-your-API/B2C) | MSAL.NET | |
+> | ASP.NET Core|[GitHub repo](https://github.com/Azure-Samples/ms-identity-dotnet-advanced-token-cache) | [Advanced Token Cache Scenarios](https://github.com/Azure-Samples/ms-identity-dotnet-advanced-token-cache) | &#8226; [MSAL.NET](https://aka.ms/msal-net) <br/> &#8226; [Microsoft.Identity.Web](https://aka.ms/microsoft-identity-web) | [On-Behalf-Of (OBO) flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-on-behalf-of-flow) |
+> | ASP.NET Core|[GitHub repo](https://github.com/Azure-Samples/ms-identity-dotnet-adfs-to-aad) | [Active Directory FS to Azure AD migration](https://github.com/Azure-Samples/ms-identity-dotnet-adfs-to-aad) | [MSAL.NET](https://aka.ms/msal-net) | |
+> | ASP.NET |[GitHub repo](https://github.com/AzureAdQuickstarts/AppModelv2-WebApp-OpenIDConnect-DotNet) | [Quickstart: Sign in users](https://github.com/AzureAdQuickstarts/AppModelv2-WebApp-OpenIDConnect-DotNet) | [MSAL.NET](https://aka.ms/msal-net) | |
+> | ASP.NET |[GitHub repo](https://github.com/Azure-Samples/ms-identity-aspnet-webapp-openidconnect) | [Sign in users and call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-aspnet-webapp-openidconnect) | [MSAL.NET](https://aka.ms/msal-net) | |
+> | ASP.NET |[GitHub repo](https://github.com/azure-samples/active-directory-dotnet-admin-restricted-scopes-v2) | [Admin Restricted Scopes <br/> &#8226; Sign in users <br/> &#8226; call Microsoft Graph](https://github.com/azure-samples/active-directory-dotnet-admin-restricted-scopes-v2) | [MSAL.NET](https://aka.ms/msal-net) | |
+> | ASP.NET |[GitHub repo](https://github.com/microsoftgraph/msgraph-training-aspnetmvcapp) | Microsoft Graph Training Sample | [MSAL.NET](https://aka.ms/msal-net) | |
+> | Java </p> Spring |[GitHub repo](https://github.com/Azure-Samples/ms-identity-java-spring-tutorial) | Azure AD Spring Boot Starter Series <br/> &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-java-spring-tutorial/tree/main/1-Authentication/sign-in) <br/> &#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-java-spring-tutorial/tree/main/1-Authentication/sign-in-b2c) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-java-spring-tutorial/tree/main/2-Authorization-I/call-graph) <br/> &#8226; [Uses App Roles for access control](https://github.com/Azure-Samples/ms-identity-java-spring-tutorial/tree/main/3-Authorization-II/roles) <br/> &#8226; [Deploy to Azure App Service](https://github.com/Azure-Samples/ms-identity-java-spring-tutorial/tree/main/4-Deployment/deploy-to-azure-app-service) | MSAL Java <br/> AAD Boot Starter | [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) |
+> | Java </p> Servlets |[GitHub repo](https://github.com/Azure-Samples/ms-identity-java-servlet-webapp-authentication) | Spring-less Servlet Series <br/> &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-java-servlet-webapp-authentication/tree/main/1-Authentication/sign-in) <br/> &#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-java-servlet-webapp-authentication/tree/main/1-Authentication/sign-in-b2c) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-java-servlet-webapp-authentication/tree/main/2-Authorization-I/call-graph) <br/> &#8226; [Use App Roles for access control](https://github.com/Azure-Samples/ms-identity-java-servlet-webapp-authentication/tree/main/3-Authorization-II/roles) <br/> &#8226; [Use Security Groups for access control](https://github.com/Azure-Samples/ms-identity-java-servlet-webapp-authentication/tree/main/3-Authorization-II/groups) <br/> &#8226; [Deploy to Azure App Service](https://github.com/Azure-Samples/ms-identity-java-servlet-webapp-authentication/tree/main/4-Deployment/deploy-to-azure-app-service) | MSAL Java | [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) |
+> | Java |[GitHub repo](https://github.com/Azure-Samples/ms-identity-java-webapp) | Sign in users, call Microsoft Graph | MSAL Java | [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) |
+> | Java </p> Spring|[GitHub repo](https://github.com/Azure-Samples/ms-identity-java-webapi) | Sign in users & call Microsoft Graph via OBO </p> &#8226; web API | MSAL Java | &#8226; [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) <br/> &#8226; [On-Behalf-Of (OBO) flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-on-behalf-of-flow) |
+> | Node.js </p> Express |[GitHub repo](https://github.com/Azure-Samples/ms-identity-node) | Express web app sample <br/> &#8226; Sign in users <br/> &#8226; Call Microsoft Graph | MSAL Node | [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) |
+> | Python </p> Flask |[GitHub repo](https://github.com/Azure-Samples/ms-identity-python-flask-tutorial) | Flask Series <br/> &#8226; Sign in users <br/> &#8226; Sign in users (B2C) <br/> &#8226; Call Microsoft Graph <br/> &#8226; Deploy to Azure App Service | MSAL Python | [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) |
+> | Python </p> Django |[GitHub repo](https://github.com/Azure-Samples/ms-identity-python-django-tutorial) | Django Series <br/> &#8226; [Sign in users](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/1-Authentication/sign-in) <br/> &#8226; [Sign in users (B2C)](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/1-Authentication/sign-in-b2c) <br/> &#8226; [Call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/2-Authorization-I/call-graph) <br/> &#8226; [Deploy to Azure App Service](https://github.com/Azure-Samples/ms-identity-python-django-tutorial/tree/main/3-Deployment/deploy-to-azure-app-service)| MSAL Python | [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) |
+> | Python </p> Flask |[GitHub repo](https://github.com/Azure-Samples/ms-identity-python-webapp) | Flask standalone sample <br/> [Sign in users and call Microsoft Graph](https://github.com/Azure-Samples/ms-identity-python-webapp) | MSAL Python | [Auth code flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-auth-code-flow) |
+> | Ruby |[GitHub repo](https://github.com/microsoftgraph/msgraph-training-rubyrailsapp) | Graph Training <br/> &#8226; [Sign in and Microsoft Graph](https://github.com/microsoftgraph/msgraph-training-rubyrailsapp) | | |
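Most web-app samples in this table start with the OAuth 2.0 auth code flow. A minimal sketch of its first leg, building the `/authorize` request with the Python standard library (the tenant, client ID, and redirect URI below are placeholders; in practice the MSAL libraries in the table construct this URL for you):

```python
from urllib.parse import urlencode

# Placeholder values for illustration only.
TENANT = "common"
CLIENT_ID = "00000000-0000-0000-0000-000000000000"
REDIRECT_URI = "https://localhost:5001/signin-oidc"

def build_authorize_url(scopes, state):
    """Build the /authorize request that starts the auth code flow."""
    params = {
        "client_id": CLIENT_ID,
        "response_type": "code",   # ask for an authorization code
        "redirect_uri": REDIRECT_URI,
        "response_mode": "query",
        "scope": " ".join(scopes),
        "state": state,            # round-tripped to detect forged responses
    }
    return (f"https://login.microsoftonline.com/{TENANT}"
            f"/oauth2/v2.0/authorize?{urlencode(params)}")

url = build_authorize_url(["openid", "profile", "User.Read"], state="xyz")
```

Azure AD then redirects back to the registered redirect URI with `?code=...&state=...`, and the app redeems the code (plus a client secret or certificate, for confidential clients) at the `/token` endpoint.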
## Desktop and mobile public client apps
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/whats-new-docs.md
Welcome to what's new in the Microsoft identity platform documentation. This article lists new docs that have been added and those that have had significant updates in the last three months.
+## May 2021
+
+### New articles
+
+- [Claims challenges, claims requests, and client capabilities](claims-challenge.md)
+- [Developers’ guide to Conditional Access authentication context](developer-guide-conditional-access-authentication-context.md)
+- [Microsoft identity platform refresh tokens](refresh-tokens.md)
+- [Microsoft identity platform and OAuth 2.0 SAML bearer assertion flow](v2-saml-bearer-assertion.md)
+- [Tutorial: Sign in users and call the Microsoft Graph API from a React single-page app (SPA) using auth code flow](tutorial-v2-react.md)
+- [Tutorial: Sign in users and call the Microsoft Graph API from an Angular single-page application (SPA) using auth code flow](tutorial-v2-angular-auth-code.md)
+
+### Updated articles
+
+- [Developers’ guide to Conditional Access authentication context](developer-guide-conditional-access-authentication-context.md)
+- [How to: Add app roles to your application and receive them in the token](howto-add-app-roles-in-azure-ad-apps.md)
+- [How to migrate a Node.js app from ADAL to MSAL](msal-node-migration.md)
+- [Microsoft identity platform ID tokens](id-tokens.md)
+- [Quickstart: Sign in users and call the Microsoft Graph API from an Android app](quickstart-v2-android.md)
+- [Quickstart: Register an application with the Microsoft identity platform](quickstart-register-app.md)
+- [Quickstart: Call an ASP.NET web API that's protected by Microsoft identity platform](quickstart-v2-dotnet-native-aspnet.md)
+- [Tutorial: Sign in users and call the Microsoft Graph API from an Android application](tutorial-v2-android.md)
+## April 2021
+
+### New articles
Welcome to what's new in the Microsoft identity platform documentation. This art
- [Support and help options for developers](developer-support-help-options.md)
- [Web app that signs in users: Code configuration](scenario-web-app-sign-user-app-configuration.md)
- [Web app that signs in users: Sign-in and sign-out](scenario-web-app-sign-user-sign-in.md)
-
-## February 2021
-
-### New articles
-
-- [Quickstart: Acquire an access token and call the Microsoft Graph API from an Electron desktop app](quickstart-v2-nodejs-desktop.md)
-- [Tutorial: Sign in users and call the Microsoft Graph API in an Electron desktop app](tutorial-v2-nodejs-desktop.md)
-- [Quickstart: Acquire a token and call Microsoft Graph API from a Node.js console app using app's identity](quickstart-v2-nodejs-console.md)
-- [Tutorial: Call the Microsoft Graph API in a Node.js console app](tutorial-v2-nodejs-console.md)
-- [Tutorial: Sign-in users in a Node.js & Express web app](tutorial-v2-nodejs-webapp-msal.md)
-- [Support passwordless authentication with FIDO2 keys in apps you develop](support-fido2-authentication.md)
-
-### Updated articles
-
-- [What's new for authentication?](reference-breaking-changes.md)
-- [Use MSAL.NET to sign in users with social identities](msal-net-aad-b2c-considerations.md)
-- [Microsoft identity platform code samples (v2.0 endpoint)](sample-v2-code.md)
-- [Microsoft identity platform videos](identity-videos.md)
-- [Quickstart: Set up a tenant](quickstart-create-new-tenant.md)
-- [Quickstart: Register an application with the Microsoft identity platform](quickstart-register-app.md)
-- [Quickstart: Acquire a token and call Microsoft Graph API from a Java console app using app's identity](quickstart-v2-java-daemon.md)
active-directory Groups Assign Sensitivity Labels https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/groups-assign-sensitivity-labels.md
Previously updated : 12/02/2020 Last updated : 05/28/2021
To apply published labels to groups, you must first enable the feature. These st
1. Run the following commands to prepare to run the cmdlets.

```PowerShell
+ Install-Module AzureADPreview
Import-Module AzureADPreview
Connect-AzureAD
```
active-directory Licensing Group Advanced https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/licensing-group-advanced.md
Previously updated : 04/05/2021 Last updated : 05/28/2021
A user can be a member of multiple groups with licenses. Here are some things to
## Direct licenses coexist with group licenses
-When a user inherits a license from a group, you can't directly remove or modify that license assignment in the user's properties. Changes must be made in the group and then propagated to all users.
+When a user inherits a license from a group, you can't directly remove or modify that license assignment in the user's properties. You can change the license assignment only in the group and the changes are then propagated to all users. It is possible, however, to assign the same product license to the user directly and by group license assignment. In this way, you can enable additional services from the product just for one user, without affecting other users.
-It is possible, however, to assign the same product license directly to the user, in addition to the inherited license. You can enable additional services from the product just for one user, without affecting other users.
-
-Directly assigned licenses can be removed, and don’t affect inherited licenses. Consider the user who inherits an Office 365 Enterprise E3 license from a group.
+Directly assigned licenses can be removed, and don’t affect a user's inherited licenses. Consider the user who inherits an Office 365 Enterprise E3 license from a group.
Initially, the user inherits the license only from the *E3 basic services* group, which enables four service plans.
-1. Select **Assign** to directly assign an E3 license to the user. In this case, you are going to disable all service plans except Yammer Enterprise.
+1. Select **Assign** to directly assign an E3 license to the user. In this example, you disable all service plans except Yammer Enterprise.
- As a result, the user still uses only one license of the E3 product. But the direct assignment enables the Yammer Enterprise service for that user only. You can see which services are enabled by the group membership versus the direct assignment.
+ As a result, the user still uses only one license of the E3 product. But the direct assignment enables the Yammer Enterprise service for that user only. You can see which services are enabled by the group membership versus the direct assignment.
1. When you use direct assignment, the following operations are allowed:
- - Yammer Enterprise can be turned off on the user resource directly. The **On/Off** toggle in the illustration was enabled for this service, as opposed to the other service toggles. Because the service is enabled directly on the user, it can be modified.
+   - Yammer Enterprise can be turned off for an individual user. Because the service is assigned directly to the user, it can be changed.
- Additional services can be enabled as well, as part of the directly assigned license.
- - The **Remove** button can be used to remove the direct license from the user. You can see that the user now only has the inherited group license and only the original services remain enabled:
+ - The **Remove** button can be used to remove the direct license from the user. You can see that the user then has the inherited group license and only the original services remain enabled.
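The coexistence behavior described above boils down to set arithmetic: a user's effective services are the group-inherited plans plus any directly enabled ones, and removing the direct assignment takes away only the direct additions. A small sketch with a made-up data model (plain Python sets, not the Azure AD licensing API):

```python
# Hypothetical service-plan sets for one user and one product (E3).
group_inherited = {"Exchange Online", "SharePoint Online", "Teams", "Office apps"}
direct_assigned = {"Yammer Enterprise"}

def effective_services(inherited, direct):
    """Services the user can use: group-inherited plans plus direct additions."""
    return inherited | direct

# With both assignments, Yammer is enabled on top of the inherited plans...
with_direct = effective_services(group_inherited, direct_assigned)

# ...and removing the direct license restores exactly the inherited set.
after_removal = effective_services(group_inherited, set())
```

Either way, the user consumes a single E3 license; the direct assignment only changes which of the product's services are enabled for that one user.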
## Managing new services added to products
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
- **Service plans included (friendly names)**: A list of service plans (friendly names) in the product that correspond to the string ID and GUID >[!NOTE]
->This information is accurate as of May 2021.
+>This information is accurate as of June 2021.
| Product name | String ID | GUID | Service plans included | Service plans included (friendly names) | | | | | | |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| DYNAMICS 365 UNF OPS PLAN ENT EDITION | Dynamics_365_for_Operations | ccba3cfe-71ef-423a-bd87-b6df3dce59a9 | DDYN365_CDS_DYN_P2 (d1142cfd-872e-4e77-b6ff-d98ec5a51f66)<br/>DYN365_TALENT_ENTERPRISE (65a1ebf4-6732-4f00-9dcb-3d115ffdeecd)<br/>Dynamics_365_for_Operations (95d2cd7b-1007-484b-8595-5e97e63fe189)<br/>Dynamics_365_for_Retail (a9e39199-8369-444b-89c1-5fe65ec45665)<br/>DYNAMICS_365_HIRING_FREE_PLAN (f815ac79-c5dd-4bcc-9b78-d97f7b817d0d)<br/>Dynamics_365_Onboarding_Free_PLAN (300b8114-8555-4313-b861-0c115d820f50)<br/>FLOW_DYN_P2 (b650d915-9886-424b-a08d-633cede56f57)<br/>POWERAPPS_DYN_P2 (0b03f40b-c404-40c3-8651-2aceb74365fa) | COMMON DATA SERVICE (d1142cfd-872e-4e77-b6ff-d98ec5a51f66)<br/>DYNAMICS 365 FOR TALENT (65a1ebf4-6732-4f00-9dcb-3d115ffdeecd)<br/>DYNAMICS 365 FOR_OPERATIONS (95d2cd7b-1007-484b-8595-5e97e63fe189)<br/>DYNAMICS 365 FOR RETAIL (a9e39199-8369-444b-89c1-5fe65ec45665)<br/>DYNAMICS 365 HIRING FREE PLAN (f815ac79-c5dd-4bcc-9b78-d97f7b817d0d)<br/>DYNAMICS 365 FOR TALENT: ONBOARD (300b8114-8555-4313-b861-0c115d820f50)<br/>FLOW FOR DYNAMICS 365(b650d915-9886-424b-a08d-633cede56f57)<br/>POWERAPPS FOR DYNAMICS 365 (0b03f40b-c404-40c3-8651-2aceb74365fa) | | ENTERPRISE MOBILITY + SECURITY E3 | EMS | efccb6f7-5641-4e0e-bd10-b4976e1bf68e | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3) | AZURE ACTIVE DIRECTORY PREMIUM P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>CLOUD APP SECURITY DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>MICROSOFT AZURE MULTI-FACTOR AUTHENTICATION (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFT AZURE ACTIVE DIRECTORY RIGHTS (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>AZURE 
INFORMATION PROTECTION PREMIUM P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3) | | ENTERPRISE MOBILITY + SECURITY E5 | EMSPREMIUM | b05e124f-c7cc-45a0-a6aa-8cf78c946968 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c) | AZURE ACTIVE DIRECTORY PREMIUM P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AZURE ACTIVE DIRECTORY PREMIUM P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>MICROSOFT CLOUD APP SECURITY (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>AZURE ADVANCED THREAT PROTECTION (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>MICROSOFT AZURE MULTI-FACTOR AUTHENTICATION (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFT AZURE ACTIVE DIRECTORY RIGHTS (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>AZURE INFORMATION PROTECTION PREMIUM P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>AZURE INFORMATION PROTECTION PREMIUM P2 (5689bec4-755d-4753-8b61-40975025187c) |
-| EXCHANGE ONLINE ESSENTIALS | EXCHANGE_S_ESSENTIALS | e8f81a67-bd96-4074-b108-cf193eb9433b | EXCHANGE_S_ESSENTIALS (1126bef5-da20-4f07-b45e-ad25d2581aa8)<br/>BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c) | EXCHANGE ESSENTIALS (1126bef5-da20-4f07-b45e-ad25d2581aa8)<br/> TO-DO (PLAN 1) (5e62787c-c316-451f-b873-1d05acd4d12c) |
| EXCHANGE ONLINE (PLAN 1) | EXCHANGESTANDARD | 4b9405b0-7788-4568-add1-99614e613b69 | EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c) | EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)| | EXCHANGE ONLINE (PLAN 2) | EXCHANGEENTERPRISE | 19ec0d23-8335-4cbd-94ac-6050e30712fa | EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0) | EXCHANGE ONLINE (PLAN 2) (efb87545-963c-4e0d-99df-69c6916d9eb0) | | EXCHANGE ONLINE ARCHIVING FOR EXCHANGE ONLINE | EXCHANGEARCHIVE_ADDON | ee02fd1b-340e-4a4b-b355-4a514e4c8943 | EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793) | EXCHANGE ONLINE ARCHIVING FOR EXCHANGE ONLINE (176a09a6-7ec5-4039-ac02-b2791c6ba793) | | EXCHANGE ONLINE ARCHIVING FOR EXCHANGE SERVER | EXCHANGEARCHIVE | 90b5e015-709a-4b8b-b08e-3200f994494c | EXCHANGE_S_ARCHIVE (da040e0a-b393-4bea-bb76-928b3fa1cf5a) | EXCHANGE ONLINE ARCHIVING FOR EXCHANGE SERVER (da040e0a-b393-4bea-bb76-928b3fa1cf5a) |
-| EXCHANGE ONLINE ESSENTIALS | EXCHANGEESSENTIALS | 7fc0182e-d107-4556-8329-7caaa511197b | EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c) | EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)|
+| EXCHANGE ONLINE ESSENTIALS (ExO P1 BASED) | EXCHANGEESSENTIALS | 7fc0182e-d107-4556-8329-7caaa511197b | EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c) | EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)|
| EXCHANGE ONLINE ESSENTIALS | EXCHANGE_S_ESSENTIALS | e8f81a67-bd96-4074-b108-cf193eb9433b | EXCHANGE_S_ESSENTIALS (1126bef5-da20-4f07-b45e-ad25d2581aa8)<br/>BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c) | EXCHANGE ESSENTIALS (1126bef5-da20-4f07-b45e-ad25d2581aa8)<br/>TO-DO (PLAN 1) (5e62787c-c316-451f-b873-1d05acd4d12c) | | EXCHANGE ONLINE KIOSK | EXCHANGEDESKLESS | 80b2d799-d2ba-4d2a-8842-fb0d0f3a4b82 | EXCHANGE_S_DESKLESS (4a82b400-a79f-41a4-b4e2-e94f5787b113) | EXCHANGE ONLINE KIOSK (4a82b400-a79f-41a4-b4e2-e94f5787b113) | | EXCHANGE ONLINE POP | EXCHANGETELCO | cb0a98a8-11bc-494c-83d9-c1b1ac65327e | EXCHANGE_B_STANDARD (90927877-dcff-4af6-b346-2332c0b15bb7) | EXCHANGE ONLINE POP (90927877-dcff-4af6-b346-2332c0b15bb7) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| MICROSOFT 365 PHONE SYSTEM_USGOV_DOD | MCOEV_USGOV_DOD | b0e7de67-e503-4934-b729-53d595ba5cd1 | MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | MICROSOFT 365 PHONE SYSTEM (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | | MICROSOFT 365 PHONE SYSTEM_USGOV_GCCHIGH | MCOEV_USGOV_GCCHIGH | 985fcb26-7b94-475b-b512-89356697be71 | MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | MICROSOFT 365 PHONE SYSTEM (4828c8ec-dc2e-4779-b502-87ac9ce28ab7) | | MICROSOFT 365 PHONE SYSTEM - VIRTUAL USER | PHONESYSTEM_VIRTUALUSER | 440eaaa8-b3e0-484b-a8be-62870b9ba70a | MCOEV_VIRTUALUSER (f47330e9-c134-43b3-9993-e7f004506889) | MICROSOFT 365 PHONE SYSTEM VIRTUAL USER (f47330e9-c134-43b3-9993-e7f004506889)|
-| MICROSOFT 365 SECURITY AND COMPLIANCE FOR FLW | M365_SECURITY_COMPLIANCE_FOR_FLW | 2347355b-4e81-41a4-9c22-55057a399791 | AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PAM_ENTERPRISE b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f) | AZURE ACTIVE DIRECTORY PREMIUM P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>AZURE INFORMATION PROTECTION PREMIUM P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>CUSTOMER LOCKBOX (9f431833-0334-42de-a7dc-70aa40db46db)<br/>DATA CLASSIFICATION IN MICROSOFT 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>DATA LOSS PREVENTION (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE ONLINE ARCHIVING FOR EXCHANGE ONLINE (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>INFORMATION PROTECTION FOR OFFICE 365 ΓÇô PREMIUM (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MICROSOFT 365 DEFENDER (bf28f719-7844-4079-9c78-c1307898e192)<br/>MICROSOFT CLOUD APP SECURITY (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>MICROSOFT DEFENDER FOR ENDPOINT (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFT DEFENDER FOR IDENTITY 
(14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>MICROSOFT DEFENDER FOR OFFICE 365 (PLAN 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>MICROSOFT DEFENDER FOR OFFICE 365 (PLAN 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>OFFICE 365 ADVANCED EDISCOVERY (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>OFFICE 365 PRIVILEGED ACCESS MANAGEMENT (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM ENCRYPTION IN OFFICE 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f |
+| MICROSOFT 365 SECURITY AND COMPLIANCE FOR FLW | M365_SECURITY_COMPLIANCE_FOR_FLW | 2347355b-4e81-41a4-9c22-55057a399791 | AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f) | AZURE ACTIVE DIRECTORY PREMIUM P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>AZURE INFORMATION PROTECTION PREMIUM P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>CUSTOMER LOCKBOX (9f431833-0334-42de-a7dc-70aa40db46db)<br/>DATA CLASSIFICATION IN MICROSOFT 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>DATA LOSS PREVENTION (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE ONLINE ARCHIVING FOR EXCHANGE ONLINE (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>INFORMATION PROTECTION FOR OFFICE 365 ΓÇô PREMIUM (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MICROSOFT 365 DEFENDER (bf28f719-7844-4079-9c78-c1307898e192)<br/>MICROSOFT CLOUD APP SECURITY (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>MICROSOFT DEFENDER FOR ENDPOINT (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFT DEFENDER FOR IDENTITY 
(14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>MICROSOFT DEFENDER FOR OFFICE 365 (PLAN 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>MICROSOFT DEFENDER FOR OFFICE 365 (PLAN 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>OFFICE 365 ADVANCED EDISCOVERY (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>OFFICE 365 PRIVILEGED ACCESS MANAGEMENT (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM ENCRYPTION IN OFFICE 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f) |
| MICROSOFT BUSINESS CENTER | MICROSOFT_BUSINESS_CENTER | 726a0894-2c77-4d65-99da-9775ef05aad1 | MICROSOFT_BUSINESS_CENTER (cca845f9-fd51-4df6-b563-976a37c56ce0) | MICROSOFT BUSINESS CENTER (cca845f9-fd51-4df6-b563-976a37c56ce0) | | MICROSOFT DEFENDER FOR ENDPOINT | WIN_DEF_ATP | 111046dd-295b-4d6d-9724-d52ac90bd1f2 | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef) | Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>MICROSOFT DEFENDER FOR ENDPOINT (871d91ec-ec1a-452b-a83f-bd76c7d770ef) | | MICROSOFT DYNAMICS CRM ONLINE BASIC | CRMPLAN2 | 906af65a-2970-46d5-9b58-4e9aa50f0657 | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>FLOW_DYN_APPS (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>CRMPLAN2 (bf36ca64-95c6-4918-9275-eb9f4ce2c04f)<br/>POWERAPPS_DYN_APPS (874fc546-6efe-4d22-90b8-5c4e7aa59f4b) | EXCHANGE FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>FLOW FOR DYNAMICS 365 (7e6d7d78-73de-46ba-83b1-6d25117334ba)<br/>MICROSOFT DYNAMICS CRM ONLINE BASIC (bf36ca64-95c6-4918-9275-eb9f4ce2c04f)<br/>POWERAPPS FOR DYNAMICS 365 (874fc546-6efe-4d22-90b8-5c4e7aa59f4b) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| MS IMAGINE ACADEMY | IT_ACADEMY_AD | ba9a34de-4489-469d-879c-0f0f145321cd | IT_ACADEMY_AD (d736def0-1fde-43f0-a5be-e3f8b2de6e41) | MS IMAGINE ACADEMY (d736def0-1fde-43f0-a5be-e3f8b2de6e41) | | MICROSOFT INTUNE DEVICE for GOVERNMENT | INTUNE_A_D_GOV | 2c21e77a-e0d6-4570-b38a-7ff2dc17d2ca | EXCHANGE_S_FOUNDATION_GOV (922ba911-5694-4e99-a794-73aed9bfeec8)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5) | EXCHANGE FOUNDATION FOR GOVERNMENT (922ba911-5694-4e99-a794-73aed9bfeec8)<br/>MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5) | | MICROSOFT POWER APPS PLAN 2 TRIAL | POWERAPPS_VIRAL | dcb1a3ae-b33f-4487-846a-a640262fadf4 | DYN365_CDS_VIRAL (17ab22cd-a0b3-4536-910a-cb6eb12696c0)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>FLOW_P2_VIRAL (50e68c76-46c6-4674-81f9-75456511b170)<br/>FLOW_P2_VIRAL_REAL (d20bfa21-e9ae-43fc-93c2-20783f0840c3)<br/>POWERAPPS_P2_VIRAL (d5368ca3-357e-4acb-9c21-8495fb025d1f) | COMMON DATA SERVICE ΓÇô VIRAL (17ab22cd-a0b3-4536-910a-cb6eb12696c0)<br/>EXCHANGE FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>FLOW FREE (50e68c76-46c6-4674-81f9-75456511b170)<br/>FLOW P2 VIRAL (d20bfa21-e9ae-43fc-93c2-20783f0840c3)<br/>POWERAPPS TRIAL (d5368ca3-357e-4acb-9c21-8495fb025d1f) |
-| MICROSOFT INTUNE SMB | INTUNE_SMB | e6025b08-2fa5-4313-bd0a-7e5ffca32958 | AAD_SMB (de377cbc-0019-4ec2-b77c-3f223947e102)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>INTUNE_SMBIZ (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/> | AZURE ACTIVE DIRECTORY (de377cbc-0019-4ec2-b77c-3f223947e102)<br/> EXCHANGE FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/> MICROSOFT INTUNE (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/> MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/> |
+| MICROSOFT INTUNE SMB | INTUNE_SMB | e6025b08-2fa5-4313-bd0a-7e5ffca32958 | AAD_SMB (de377cbc-0019-4ec2-b77c-3f223947e102)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>INTUNE_SMBIZ (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/> | AZURE ACTIVE DIRECTORY (de377cbc-0019-4ec2-b77c-3f223947e102)<br/> EXCHANGE FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/> MICROSOFT INTUNE (8e9ff0ff-aa7a-4b20-83c1-2f636b600ac2)<br/> MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5) |
| MICROSOFT TEAM (FREE) | TEAMS_FREE | 16ddbbfc-09ea-4de2-b1d7-312db6112d70 | EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>MCOFREE (617d9209-3b90-4879-96e6-838c42b2701d)<br/>TEAMS_FREE (4fa4026d-ce74-4962-a151-8e96d57ea8e4)<br/>SHAREPOINTDESKLESS (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>TEAMS_FREE_SERVICE (bd6f2ac2-991a-49f9-b23c-18c96a02c228)<br/>WHITEBOARD_FIRSTLINE1 (36b29273-c6d0-477a-aca6-6fbe24f538e3) | EXCHANGE FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>MCO FREE FOR MICROSOFT TEAMS (FREE) (617d9209-3b90-4879-96e6-838c42b2701d)<br/>MICROSOFT TEAMS (FREE) (4fa4026d-ce74-4962-a151-8e96d57ea8e4)<br/>SHAREPOINT KIOSK (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>TEAMS FREE SERVICE (bd6f2ac2-991a-49f9-b23c-18c96a02c228)<br/>WHITEBOARD (FIRSTLINE) (36b29273-c6d0-477a-aca6-6fbe24f538e3) | | MICROSOFT TEAMS EXPLORATORY | TEAMS_EXPLORATORY | 710779e8-3d4a-4c88-adb9-386c958d1fdf | CDS_O365_P1 (bed136c6-b799-4462-824d-fc045d3a9d25)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>DESKLESS (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E1 (743dd19e-1ce3-4c62-a3ad-49ba8f63a2f6)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MCO_TEAMS_IW (42a3ec34-28ba-46b6-992f-db53a675ac5b)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>OFFICEMOBILE_SUBSCRIPTION (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>POWERAPPS_O365_P1 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>FLOW_O365_P1 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>POWER_VIRTUAL_AGENTS_O365_P1 (0683001c-0492-4d59-9515-d9a6426b5813)<br/>SHAREPOINTSTANDARD (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_1 
(5e62787c-c316-451f-b873-1d05acd4d12c)<br/>WHITEBOARD_PLAN1 (b8afc642-032e-4de5-8c0a-507a7bba7e5d)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | COMMON DATA SERVICE FOR TEAMS_P1 (bed136c6-b799-4462-824d-fc045d3a9d25)<br/>EXCHANGE ONLINE (PLAN 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>INSIGHTS BY MYANALYTICS (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>MICROSOFT PLANNER (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>MICROSOFT STREAM FOR O365 E1 SKU (743dd19e-1ce3-4c62-a3ad-49ba8f63a2f6)<br/>MICROSOFT TEAMS (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MICROSOFT TEAMS (42a3ec34-28ba-46b6-992f-db53a675ac5b)<br/>MOBILE DEVICE MANAGEMENT FOR OFFICE 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>OFFICE FOR THE WEB (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>OFFICE MOBILE APPS FOR OFFICE 365 (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>POWER APPS FOR OFFICE 365 (92f7a6f3-b89b-4bbd-8c30-809e6da5ad1c)<br/>POWER AUTOMATE FOR OFFICE 365 (0f9b09cb-62d1-4ff4-9129-43f4996f83f4)<br/>POWER VIRTUAL AGENTS FOR OFFICE 365 P1(0683001c-0492-4d59-9515-d9a6426b5813)<br/>SHAREPOINT ONLINE (PLAN 1) (c7699d2e-19aa-44de-8edf-1736da088ca1)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TO-DO (PLAN 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>WHITEBOARD (PLAN 1) (b8afc642-032e-4de5-8c0a-507a7bba7e5d)<br/>YAMMER ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | | Office 365 A5 for faculty| ENTERPRISEPREMIUM_FACULTY | a4585165-0533-458a-97e3-c400570268c4 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P3 
(07699545-9485-468e-95b6-2fca3738be01)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>SCHOOL_DATA_SYNC_P2 
(500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management 
(65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PowerApps for Office 365 Plan 3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
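The service plan identifiers in the tables above also surface through the Microsoft Graph `subscribedSkus` endpoint, which returns each SKU's `skuPartNumber`, `skuId`, and `servicePlans`. A minimal sketch of indexing such a response by SKU part number (the sample payload below is hypothetical but mirrors those documented fields; fetching the data and authenticating against Graph are out of scope here):

```python
def index_sku_service_plans(subscribed_skus):
    """Map each SKU part number (e.g. EXCHANGESTANDARD) to a dict of
    its service plan names and GUIDs, as listed in the tables above.

    `subscribed_skus` is the parsed JSON from GET /subscribedSkus.
    """
    index = {}
    for sku in subscribed_skus["value"]:
        index[sku["skuPartNumber"]] = {
            plan["servicePlanName"]: plan["servicePlanId"]
            for plan in sku["servicePlans"]
        }
    return index
```

With such an index you can look up, for example, which GUID the `EXCHANGE_S_STANDARD` plan carries in your tenant and compare it against the tables above.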
active-directory Access Reviews Downloadable Review History https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/governance/access-reviews-downloadable-review-history.md
+
+ Title: Create and manage downloadable access review history report (Preview) - Azure Active Directory
+description: Using Azure Active Directory access reviews, you can download a review history for access reviews in your organization.
+
+documentationcenter: ''
++++
+ na
+ms.devlang: na
++ Last updated : 03/25/2021+++
+# Create and manage downloadable access review history report (Preview) in Azure AD access reviews
+
+With Azure Active Directory (Azure AD) Access Reviews, you can create a downloadable review history to help your organization gain more insight. When a report is created, it pulls the decisions that reviewers had taken up to that point in time. Reports can be constructed to cover specific access reviews and a specific time frame, and can be filtered to include different review types and review results.
+
+## Who can access and request review history
+
+Review history and request review history are available to any user who is authorized to view access reviews. To see which roles can view and create access reviews, see [What resource types can be reviewed?](deploy-access-reviews.md#what-resource-types-can-be-reviewed). Global Administrator and Global Reader can see all access reviews. All other users can see only reports on access reviews that they've generated.
+
+## How to create a review history report
+
+**Prerequisite role:** All users authorized to view access reviews
+
+1. In the Azure portal, select **Azure Active Directory** and then select **Identity Governance**.
+
+1. In the left menu, under **Access Reviews** select **Review history**.
+
+1. Select **New report**.
+
+1. Specify a review start and end date.
+
+1. Select the review types and review results you want to include in the report.
+
+ ![Access Reviews - Access Review History Report - Create](./media/access-reviews-downloadable-review-history/create-review-history.png)
+
+1. Select **Create** to create an access review history report.
+
+## How to download review history reports
+
+Once a review history report is created, you can download it. All reports that are created are available for download for 30 days in CSV format.
+
+1. Select **Review History** under **Identity Governance**. All review history reports that you created will be available.
+1. Select the report you wish to download.
+
+## What is included in a review history report?
+
+The reports provide details on a per-user basis showing the following:
+
+| Element name | Description |
+| | |
+| AccessReviewId | Review object id |
+| ReviewType | Review types include group, application, Azure AD role, Azure role, and access package|
+|ResourceDisplayName | Display Name of the resource being reviewed |
+| ResourceId | Id of the resource being reviewed |
+| ReviewName | Name of the review |
+| CreatedDateTime | Creation datetime of the review |
+| ReviewStartDate | Start date of the review |
+| ReviewEndDate | End date of the review |
+| ReviewStatus | Status of the review. For all review statuses, see the access review status table [here](create-access-review.md) |
+| OwnerId | Reviewer owner ID |
+| OwnerName | Reviewer owner name |
+| OwnerUPN | Reviewer owner User Principal Name |
+| PrincipalId | Id of the principal being reviewed |
+| PrincipalName | Name of the principal being reviewed |
+| PrincipalUPN | Principal Name of the user being reviewed |
+| PrincipalType | Type of the principal. Options include user, group, and service principal |
+| ReviewDate | Date of the review |
+| ReviewResult | Review results include Deny, Approve, and Not reviewed |
+|Justification | Justification for review result provided by reviewer |
+| ReviewerId | Reviewer Id |
+| ReviewerName | Reviewer Name |
+| ReviewerUPN | Reviewer User Principal Name |
+| ReviewerEmailAddress | Reviewer email address |
+| AppliedByName | Name of the user who applied the review result |
+| AppliedByUPN | User Principal Name of the user who applied the review result|
+| AppliedByEmailAddress | Email address of the user who applied the review result |
+| AppliedDate | Date when the review results were applied |
+| AccessRecommendation | System recommendations include Approve, Deny, and No Info |
+| SubmissionResult | Review result submission statuses include Applied and Not applied |
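Because the downloaded report is a flat CSV with the column names listed above, the decisions it contains can be tallied with a short script. A minimal sketch, assuming a file exported with exactly those column headers (the file name is hypothetical):

```python
import csv
from collections import Counter

def summarize_review_history(path):
    """Tally review decisions in a downloaded review history CSV.

    Assumes the column names listed above (ReviewResult, ReviewerUPN, ...);
    adjust the key if your export differs.
    """
    results = Counter()
    # utf-8-sig tolerates a BOM, which exported CSVs sometimes carry
    with open(path, newline="", encoding="utf-8-sig") as f:
        for row in csv.DictReader(f):
            results[row["ReviewResult"]] += 1
    return dict(results)
```

For example, running this over a report would return a mapping such as counts of Approve, Deny, and Not reviewed decisions, which you can use to spot reviews with many unreviewed entries.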
+
+## Next steps
+- [Review access to groups or applications](perform-access-review.md)
+- [Review access for yourself to groups or applications](review-your-access.md)
+- [Complete an access review of groups or applications](complete-access-review.md)
active-directory Concept Identity Protection B2b https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/identity-protection/concept-identity-protection-b2b.md
The user risk for B2B collaboration users is evaluated at their home directory.
There are limitations in the implementation of Identity Protection for B2B collaboration users in a resource directory due to their identity existing in their home directory. The main limitations are as follows: - If a guest user triggers the Identity Protection user risk policy to force password reset, **they will be blocked**. This block is due to the inability to reset passwords in the resource directory.-- **Guest users do not appear in the risky users report**. This loss of visibility is due to the risk evaluation occurring in the B2B user's home directory.-- Administrators **cannot dismiss or remediate a risky B2B collaboration user** in their resource directory. This loss of functionality is due to administrators in the resource directory not having access to the B2B user's home directory.
+- **Guest users do not appear in the risky users report**. This limitation is due to the risk evaluation occurring in the B2B user's home directory.
+- Administrators **cannot dismiss or remediate a risky B2B collaboration user** in their resource directory. This limitation is due to administrators in the resource directory not having access to the B2B user's home directory.
### Why can't I remediate risky B2B collaboration users in my directory?
active-directory Services Support Managed Identities https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/services-support-managed-identities.md
The following Azure services support managed identities for Azure resources:
Managed identity type | All Generally Available<br>Global Azure Regions | Azure Government | Azure Germany | Azure China 21Vianet | | | :-: | :-: | :-: | :-: | | System assigned | ![Available][check] | ![Available][check] | Not available | ![Available][check] |
-| User assigned | Preview | Preview | Not available | Preview |
+| User assigned | ![Available][check] | ![Available][check] | Not available | ![Available][check] |
Refer to the following list to configure managed identity for Azure API Management (in regions where available):
Refer to the following list to configure access to Azure Resource
> Microsoft Power BI also [supports managed identities](../../stream-analytics/powerbi-output-managed-identity.md).
-[check]: media/services-support-managed-identities/check.png "Available"
+[check]: media/services-support-managed-identities/check.png "Available"
active-directory Pim How To Activate Role https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/privileged-identity-management/pim-how-to-activate-role.md
Previously updated : 03/15/2021 Last updated : 05/28/2021
When you activate a role in Privileged Identity Management, your activation migh
## Next steps -- [Activate my Azure AD roles in Privileged Identity Management](pim-how-to-activate-role.md)
+- [View audit history for Azure AD roles](pim-how-to-use-audit-log.md)
active-directory Pim How To Start Security Review https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/privileged-identity-management/pim-how-to-start-security-review.md
Previously updated : 4/27/2021 Last updated : 05/28/2021
This article describes how to create one or more access reviews for privileged A
[!INCLUDE [Azure AD Premium P2 license](../../../includes/active-directory-p2-license.md)] For more information about licenses for PIM, refer to [License requirements to use Privileged Identity Management](subscription-requirements.md). > [!Note]
-> Currently, you can scope an access review to service principals with access to Azure AD and Azure resource roles (Preview) with an Azure Active Directory Premium P2 edition active in your tenant. The licensing model for service principals will be finalized for general availability of this feature and additional licenses may be required.
+> Currently, you can scope an access review to service principals with access to Azure AD and Azure resource roles (Preview) with an Azure Active Directory Premium P2 edition active in your tenant. The licensing model for service principals will be finalized for general availability of this feature and additional licenses may be required.
## Prerequisites
-[Privileged Role Administrator](../roles/permissions-reference.md#privileged-role-administrator)
+[Global Administrator](../roles/permissions-reference.md#global-administrator)
## Open access reviews
-1. Sign in to [Azure portal](https://portal.azure.com/) with a user that is a member of the Privileged role administrator role.
+1. Sign in to [Azure portal](https://portal.azure.com/) as a user that is assigned the Global Administrator role.
-2. Select **Identity Governance**
+2. Select **Identity Governance**.
3. Select **Azure AD roles** under **Azure AD Privileged Identity Management**.
active-directory Admin Units Assign Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/admin-units-assign-roles.md
Role | Description
-- | -- Authentication Administrator | Has access to view, set, and reset authentication method information for any non-admin user in the assigned administrative unit only. Groups Administrator | Can manage all aspects of groups and groups settings, such as naming and expiration policies, in the assigned administrative unit only.
-Helpdesk Administrator | Can reset passwords for non-administrators and Helpdesk administrators in the assigned administrative unit only.
+Helpdesk Administrator | Can reset passwords for non-administrators and Helpdesk Administrators in the assigned administrative unit only.
License Administrator | Can assign, remove, and update license assignments within the administrative unit only. Password Administrator | Can reset passwords for non-administrators and Password Administrators within the assigned administrative unit only. User Administrator | Can manage all aspects of users and groups, including resetting passwords for limited admins within the assigned administrative unit only.
active-directory Admin Units Faq Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/admin-units-faq-troubleshoot.md
For more granular administrative control in Azure Active Directory (Azure AD), y
**A:** Sometimes, the addition or removal of one or more members of an administrative unit might take a few minutes to be reflected on the **Administrative units** pane. Alternatively, you can go directly to the associated resource's properties and see whether the action has been completed. For more information about users and groups in administrative units, see [View a list of administrative units for a user](admin-units-add-manage-users.md) and [View a list of administrative units for a group](admin-units-add-manage-groups.md).
-**Q: I am a delegated password administrator on an administrative unit. Why am I unable to reset a specific user's password?**
+**Q: I am a delegated Password Administrator on an administrative unit. Why am I unable to reset a specific user's password?**
**A:** As an administrator of an administrative unit, you can reset passwords only for users who are assigned to your administrative unit. Make sure that the user whose password reset is failing belongs to the administrative unit to which you've been assigned. If the user belongs to the same administrative unit but you still can't reset the user's password, check the roles that are assigned to the user.
active-directory Concept Understand Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/concept-understand-roles.md
Microsoft 365 has a number of role-based access control systems that developed i
Azure AD built-in roles differ in where they can be used, which fall into the following three broad categories. - **Azure AD-specific roles**: These roles grant permissions to manage resources within Azure AD only. For example, User Administrator, Application Administrator, Groups Administrator all grant permissions to manage resources that live in Azure AD.-- **Service-specific roles**: For major Microsoft 365 services (non-Azure AD), we have built service-specific roles that grant permissions to manage all features within the service. For example, Exchange Admin, Intune Admin, SharePoint Admin, and Teams Admin roles can manage features with their respective services. Exchange Admin can manage mailboxes, Intune Admin can manage device policies, SharePoint Admin can manage site collections, Teams Admin can manage call qualities and so on.-- **Cross-service roles**: There are some roles that span services. We have two global roles - Global Administrator and Global Reader. All Microsoft 365 services honor these two roles. Also, there are some security-related roles like Security Admin and Security Reader that grant access across multiple security services within Microsoft 365. For example, using Security Admin roles in Azure AD, you can manage Microsoft 365 Security Center, Microsoft Defender Advanced Threat Protection, and Microsoft Cloud App Security. Similarly, in the Compliance Administrator role you can manage Compliance-related settings in Microsoft 365 Compliance Center, Exchange, and so on.
+- **Service-specific roles**: For major Microsoft 365 services (non-Azure AD), we have built service-specific roles that grant permissions to manage all features within the service. For example, Exchange Administrator, Intune Administrator, SharePoint Administrator, and Teams Administrator roles can manage features with their respective services. Exchange Administrator can manage mailboxes, Intune Administrator can manage device policies, SharePoint Administrator can manage site collections, Teams Administrator can manage call qualities and so on.
+- **Cross-service roles**: There are some roles that span services. We have two global roles - Global Administrator and Global Reader. All Microsoft 365 services honor these two roles. Also, there are some security-related roles like Security Administrator and Security Reader that grant access across multiple security services within Microsoft 365. For example, using Security Administrator roles in Azure AD, you can manage Microsoft 365 Security Center, Microsoft Defender Advanced Threat Protection, and Microsoft Cloud App Security. Similarly, in the Compliance Administrator role you can manage Compliance-related settings in Microsoft 365 Compliance Center, Exchange, and so on.
![The three categories of Azure AD built-in roles](./media/concept-understand-roles/role-overlap-diagram.png)
-The following table is offered as an aid to understanding these role categories. The categories are named arbitrarily, and aren't intended to imply any other capabilities beyond the [documented role permissions](permissions-reference.md).
+The following table is offered as an aid to understanding these role categories. The categories are named arbitrarily, and aren't intended to imply any other capabilities beyond the [documented Azure AD role permissions](permissions-reference.md).
Category | Role
- | -
Azure AD-specific roles | Application Administrator<br>Application Developer<br>Authentication Administrator<br>B2C IEF Keyset Administrator<br>B2C IEF Policy Administrator<br>Cloud Application Administrator<br>Cloud Device Administrator<br>Conditional Access Administrator<br>Device Administrators<br>Directory Readers<br>Directory Synchronization Accounts<br>Directory Writers<br>External ID User Flow Administrator<br>External ID User Flow Attribute Administrator<br>External Identity Provider Administrator<br>Groups Administrator<br>Guest Inviter<br>Helpdesk Administrator<br>Hybrid Identity Administrator<br>License Administrator<br>Partner Tier1 Support<br>Partner Tier2 Support<br>Password Administrator<br>Privileged Authentication Administrator<br>Privileged Role Administrator<br>Reports Reader<br>User Account Administrator
Cross-service roles | Global Administrator<br>Compliance Administrator<br>Compliance Data Administrator<br>Global Reader<br>Security Administrator<br>Security Operator<br>Security Reader<br>Service Support Administrator
-Service-specific roles | Azure DevOps Administrator<br>Azure Information Protection Administrator<br>Billing Administrator<br>CRM Service Administrator<br>Customer LockBox Access Approver<br>Desktop Analytics Administrator<br>Exchange Service Administrator<br>Insights Administrator<br>Insights Business Leader<br>Intune Service Administrator<br>Kaizala Administrator<br>Lync Service Administrator<br>Message Center Privacy Reader<br>Message Center Reader<br>Modern Commerce User<br>Network Administrator<br>Office Apps Administrator<br>Power BI Service Administrator<br>Power Platform Administrator<br>Printer Administrator<br>Printer Technician<br>Search Administrator<br>Search Editor<br>SharePoint Service Administrator<br>Teams Communications Administrator<br>Teams Communications Support Engineer<br>Teams Communications Support Specialist<br>Teams Devices Administrator<br>Teams Service Administrator
+Service-specific roles | Azure DevOps Administrator<br>Azure Information Protection Administrator<br>Billing Administrator<br>CRM Service Administrator<br>Customer LockBox Access Approver<br>Desktop Analytics Administrator<br>Exchange Service Administrator<br>Insights Administrator<br>Insights Business Leader<br>Intune Service Administrator<br>Kaizala Administrator<br>Lync Service Administrator<br>Message Center Privacy Reader<br>Message Center Reader<br>Modern Commerce User<br>Network Administrator<br>Office Apps Administrator<br>Power BI Service Administrator<br>Power Platform Administrator<br>Printer Administrator<br>Printer Technician<br>Search Administrator<br>Search Editor<br>SharePoint Service Administrator<br>Teams Communications Administrator<br>Teams Communications Support Engineer<br>Teams Communications Support Specialist<br>Teams Devices Administrator<br>Teams Administrator
## Next steps
active-directory Custom Assign Graph https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-assign-graph.md
You can automate how you assign roles to user accounts using the Microsoft Graph
- Azure AD Premium P1 or P2 license
- Privileged Role Administrator or Global Administrator
-- Admin consent when using Graph explorer for Microsoft Graph API
+- Admin consent when using Graph Explorer for Microsoft Graph API
For more information, see [Prerequisites to use PowerShell or Graph Explorer](prerequisites.md).
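As a sketch of the request this article's automation builds toward, a role assignment at organization-wide scope (`directoryScopeId` of `/`) can be created with a single Microsoft Graph call; both GUIDs below are placeholders, not real IDs:

```http
POST https://graph.microsoft.com/v1.0/roleManagement/directory/roleAssignments
Content-Type: application/json

{
  "@odata.type": "#microsoft.graph.unifiedRoleAssignment",
  "roleDefinitionId": "00000000-0000-0000-0000-000000000000",
  "principalId": "11111111-1111-1111-1111-111111111111",
  "directoryScopeId": "/"
}
```

`roleDefinitionId` is the ID of the built-in or custom role being granted, and `principalId` is the object ID of the user receiving it.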
We prevent users from deleting their own Global Administrator role to avoid a sc
## Next steps
* Feel free to share with us on the [Azure AD administrative roles forum](https://feedback.azure.com/forums/169401-azure-active-directory?category_id=166032)
-* For more about roles and Administrator role assignment, see [Assign administrator roles](permissions-reference.md)
-* For default user permissions, see a [comparison of default guest and member user permissions](../fundamentals/users-default-permissions.md)
+* For more about role permissions, see [Azure AD built-in roles](permissions-reference.md)
+* For default user permissions, see a [comparison of default guest and member user permissions](../fundamentals/users-default-permissions.md)
+
active-directory Custom Assign Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-assign-powershell.md
This article describes how to create a role assignment at organization-wide scope in Azure Active Directory (Azure AD). Assigning a role at organization-wide scope grants access across the Azure AD organization. To create a role assignment with a scope of a single Azure AD resource, see [How to create a custom role and assign it at resource scope](custom-create.md). This article uses the [Azure Active Directory PowerShell Version 2](/powershell/module/azuread/#directory_roles) module.
-For more information about Azure AD admin roles, see [Assigning administrator roles in Azure Active Directory](permissions-reference.md).
+For more information about Azure AD roles, see [Azure AD built-in roles](permissions-reference.md).
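A minimal sketch of the flow this article describes, using the Azure Active Directory PowerShell v2 module; the role name and user principal name are hypothetical examples:

```powershell
# Connect to the Azure AD organization (prompts for credentials)
Connect-AzureAD

# Look up the role definition and the user to assign (example values)
$roleDefinition = Get-AzureADMSRoleDefinition -Filter "displayName eq 'Helpdesk Administrator'"
$user = Get-AzureADUser -Filter "userPrincipalName eq 'alice@contoso.com'"

# Create the assignment at organization-wide scope ('/')
New-AzureADMSRoleAssignment -DirectoryScopeId '/' `
  -RoleDefinitionId $roleDefinition.Id -PrincipalId $user.ObjectId
```

Passing `-DirectoryScopeId '/'` is what makes the assignment organization-wide; a resource-scoped assignment would pass the object ID of a single Azure AD resource instead.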
## Prerequisites
active-directory Custom Consent Permissions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-consent-permissions.md
To delegate the creation, update and deletion of [app consent policies](../manag
## Full list of permissions
-Permission | Description
-- | --
-microsoft.directory/servicePrincipals/managePermissionGrantsForSelf.{id} | Grants the ability to consent to apps on behalf of self (user consent), subject to app consent policy `{id}`.
-microsoft.directory/servicePrincipals/managePermissionGrantsForAll.{id} | Grants the permission to consent to apps on behalf of all (tenant-wide admin consent), subject to app consent policy `{id}`.
-microsoft.directory/permissionGrantPolicies/standard/read | Grants the ability to read app consent policies.
-microsoft.directory/permissionGrantPolicies/basic/update | Grants the ability to update basic properties on existing app consent policies.
-microsoft.directory/permissionGrantPolicies/create | Grants the ability to create app consent policies.
-microsoft.directory/permissionGrantPolicies/delete | Grants the ability to delete app consent policies.
+> [!div class="mx-tableFixed"]
+> | Permission | Description |
+> | - | -- |
+> | microsoft.directory/servicePrincipals/managePermissionGrantsForSelf.{id} | Grants the ability to consent to apps on behalf of self (user consent), subject to app consent policy `{id}`. |
+> | microsoft.directory/servicePrincipals/managePermissionGrantsForAll.{id} | Grants the permission to consent to apps on behalf of all (tenant-wide admin consent), subject to app consent policy `{id}`. |
+> | microsoft.directory/permissionGrantPolicies/standard/read | Grants the ability to read app consent policies. |
+> | microsoft.directory/permissionGrantPolicies/basic/update | Grants the ability to update basic properties on existing app consent policies. |
+> | microsoft.directory/permissionGrantPolicies/create | Grants the ability to create app consent policies. |
+> | microsoft.directory/permissionGrantPolicies/delete | Grants the ability to delete app consent policies. |
## Next steps
active-directory Custom Create https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-create.md
Like built-in roles, custom roles are assigned by default at the default organiz
## Next steps
- Feel free to share with us on the [Azure AD administrative roles forum](https://feedback.azure.com/forums/169401-azure-active-directory?category_id=166032).
-- For more about roles and Administrator role assignment, see [Assign administrator roles](permissions-reference.md).
+- For more about role permissions, see [Azure AD built-in roles](permissions-reference.md).
- For default user permissions, see a [comparison of default guest and member user permissions](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2froles%2fcontext%2fugr-context).
active-directory Custom Enterprise App Permissions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-enterprise-app-permissions.md
To delegate ability to authorize application access for provisioning. Example in
## Full list of permissions
-Permission | Description
-- | --
-microsoft.directory/applicationPolicies/allProperties/read | Read all properties on application policies.
-microsoft.directory/applicationPolicies/allProperties/update | Update all properties on application policies.
-microsoft.directory/applicationPolicies/basic/update | Update standard properties of application policies.
-microsoft.directory/applicationPolicies/create | Create application policies.
-microsoft.directory/applicationPolicies/createAsOwner | Create application policies. Creator is added as the first owner.
-microsoft.directory/applicationPolicies/delete | Delete application policies.
-microsoft.directory/applicationPolicies/owners/read | Read owners on application policies.
-microsoft.directory/applicationPolicies/owners/update | Update the owner property of application policies.
-microsoft.directory/applicationPolicies/policyAppliedTo/read | Read application policies applied to objects list.
-microsoft.directory/applicationPolicies/standard/read | Read standard properties of application policies.
-microsoft.directory/servicePrincipals/allProperties/allTasks | Create and delete servicePrincipals, and read and update all properties in Azure Active Directory.
-microsoft.directory/servicePrincipals/allProperties/read | Read all properties on servicePrincipals.
-microsoft.directory/servicePrincipals/allProperties/update | Update all properties on servicePrincipals.
-microsoft.directory/servicePrincipals/appRoleAssignedTo/read | Read service principal role assignments.
-microsoft.directory/servicePrincipals/appRoleAssignedTo/update | Update service principal role assignments.
-microsoft.directory/servicePrincipals/appRoleAssignments/read | Read role assignments assigned to service principals.
-microsoft.directory/servicePrincipals/audience/update | Update audience properties on service principals.
-microsoft.directory/servicePrincipals/authentication/update | Update authentication properties on service principals.
-microsoft.directory/servicePrincipals/basic/update | Update basic properties on service principals.
-microsoft.directory/servicePrincipals/create | Create service principals.
-microsoft.directory/servicePrincipals/createAsOwner | Create service principals. Creator is added as the first owner.
-microsoft.directory/servicePrincipals/credentials/update | Update credentials properties on service principals.
-microsoft.directory/servicePrincipals/delete | Delete service principals.
-microsoft.directory/servicePrincipals/disable | Disable service principals.
-microsoft.directory/servicePrincipals/enable | Enable service principals.
-microsoft.directory/servicePrincipals/getPasswordSingleSignOnCredentials | Read password single sign-on credentials on service principals.
-microsoft.directory/servicePrincipals/managePasswordSingleSignOnCredentials | Manage password single sign-on credentials on service principals.
-microsoft.directory/servicePrincipals/oAuth2PermissionGrants/read | Read delegated permission grants on service principals.
-microsoft.directory/servicePrincipals/owners/read | Read owners on service principals.
-microsoft.directory/servicePrincipals/owners/update | Update owners on service principals.
-microsoft.directory/servicePrincipals/permissions/update |
-microsoft.directory/servicePrincipals/policies/read | Read policies on service principals.
-microsoft.directory/servicePrincipals/policies/update | Update policies on service principals.
-microsoft.directory/servicePrincipals/standard/read | Read standard properties of service principals.
-microsoft.directory/servicePrincipals/synchronization/standard/read | Read provisioning settings associated with your service principal.
-microsoft.directory/servicePrincipals/tag/update | Update tags property on service principals.
-microsoft.directory/applicationTemplates/instantiate | Instantiate gallery applications from application templates.
-microsoft.directory/auditLogs/allProperties/read | Read audit logs.
-microsoft.directory/signInReports/allProperties/read | Read sign-in reports.
-microsoft.directory/applications/synchronization/standard/read | Read provisioning settings associated with the application object.
-microsoft.directory/servicePrincipals/synchronizationJobs/manage | Manage all aspects of job synchronization for service principal resources
-microsoft.directory/servicePrincipals/synchronization/standard/read | Read provisioning settings associated with service principals
-microsoft.directory/servicePrincipals/synchronizationSchema/manage | Manage all aspects of schema synchronization for service principal resources
-microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs
+> [!div class="mx-tableFixed"]
+> | Permission | Description |
+> | - | -- |
+> | microsoft.directory/applicationPolicies/allProperties/read | Read all properties on application policies. |
+> | microsoft.directory/applicationPolicies/allProperties/update | Update all properties on application policies. |
+> | microsoft.directory/applicationPolicies/basic/update | Update standard properties of application policies. |
+> | microsoft.directory/applicationPolicies/create | Create application policies. |
+> | microsoft.directory/applicationPolicies/createAsOwner | Create application policies. Creator is added as the first owner. |
+> | microsoft.directory/applicationPolicies/delete | Delete application policies. |
+> | microsoft.directory/applicationPolicies/owners/read | Read owners on application policies. |
+> | microsoft.directory/applicationPolicies/owners/update | Update the owner property of application policies. |
+> | microsoft.directory/applicationPolicies/policyAppliedTo/read | Read application policies applied to objects list. |
+> | microsoft.directory/applicationPolicies/standard/read | Read standard properties of application policies. |
+> | microsoft.directory/servicePrincipals/allProperties/allTasks | Create and delete servicePrincipals, and read and update all properties in Azure Active Directory. |
+> | microsoft.directory/servicePrincipals/allProperties/read | Read all properties on servicePrincipals. |
+> | microsoft.directory/servicePrincipals/allProperties/update | Update all properties on servicePrincipals. |
+> | microsoft.directory/servicePrincipals/appRoleAssignedTo/read | Read service principal role assignments. |
+> | microsoft.directory/servicePrincipals/appRoleAssignedTo/update | Update service principal role assignments. |
+> | microsoft.directory/servicePrincipals/appRoleAssignments/read | Read role assignments assigned to service principals. |
+> | microsoft.directory/servicePrincipals/audience/update | Update audience properties on service principals. |
+> | microsoft.directory/servicePrincipals/authentication/update | Update authentication properties on service principals. |
+> | microsoft.directory/servicePrincipals/basic/update | Update basic properties on service principals. |
+> | microsoft.directory/servicePrincipals/create | Create service principals. |
+> | microsoft.directory/servicePrincipals/createAsOwner | Create service principals. Creator is added as the first owner. |
+> | microsoft.directory/servicePrincipals/credentials/update | Update credentials properties on service principals. |
+> | microsoft.directory/servicePrincipals/delete | Delete service principals. |
+> | microsoft.directory/servicePrincipals/disable | Disable service principals. |
+> | microsoft.directory/servicePrincipals/enable | Enable service principals. |
+> | microsoft.directory/servicePrincipals/getPasswordSingleSignOnCredentials | Read password single sign-on credentials on service principals. |
+> | microsoft.directory/servicePrincipals/managePasswordSingleSignOnCredentials | Manage password single sign-on credentials on service principals. |
+> | microsoft.directory/servicePrincipals/oAuth2PermissionGrants/read | Read delegated permission grants on service principals. |
+> | microsoft.directory/servicePrincipals/owners/read | Read owners on service principals. |
+> | microsoft.directory/servicePrincipals/owners/update | Update owners on service principals. |
+> | microsoft.directory/servicePrincipals/permissions/update | |
+> | microsoft.directory/servicePrincipals/policies/read | Read policies on service principals. |
+> | microsoft.directory/servicePrincipals/policies/update | Update policies on service principals. |
+> | microsoft.directory/servicePrincipals/standard/read | Read standard properties of service principals. |
+> | microsoft.directory/servicePrincipals/synchronization/standard/read | Read provisioning settings associated with your service principal. |
+> | microsoft.directory/servicePrincipals/tag/update | Update tags property on service principals. |
+> | microsoft.directory/applicationTemplates/instantiate | Instantiate gallery applications from application templates. |
+> | microsoft.directory/auditLogs/allProperties/read | Read audit logs. |
+> | microsoft.directory/signInReports/allProperties/read | Read sign-in reports. |
+> | microsoft.directory/applications/synchronization/standard/read | Read provisioning settings associated with the application object. |
+> | microsoft.directory/servicePrincipals/synchronizationJobs/manage | Manage all aspects of job synchronization for service principal resources |
+> | microsoft.directory/servicePrincipals/synchronization/standard/read | Read provisioning settings associated with service principals |
+> | microsoft.directory/servicePrincipals/synchronizationSchema/manage | Manage all aspects of schema synchronization for service principal resources |
+> | microsoft.directory/provisioningLogs/allProperties/read | Read all properties of provisioning logs |
## Next steps
active-directory Delegate App Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/delegate-app-roles.md
This article describes how to use permissions granted by custom roles in Azure A
- [Assigning a built-in administrative role](#assign-built-in-application-admin-roles) that grants access to manage configuration in Azure AD for all applications. This is the recommended way to grant IT experts access to manage broad application configuration permissions without granting access to manage other parts of Azure AD not related to application configuration.
- [Creating a custom role](#create-and-assign-a-custom-role-preview) defining very specific permissions and assigning it to someone either to the scope of a single application as a limited owner, or at the directory scope (all applications) as a limited administrator.
-It's important to consider granting access using one of the above methods for two reasons. First, delegating the ability to perform administrative tasks reduces global administrator overhead. Second, using limited permissions improves your security posture and reduces the potential for unauthorized access. For guidelines about role security planning, see [Securing privileged access for hybrid and cloud deployments in Azure AD](security-planning.md).
+It's important to consider granting access using one of the above methods for two reasons. First, delegating the ability to perform administrative tasks reduces Global Administrator overhead. Second, using limited permissions improves your security posture and reduces the potential for unauthorized access. For guidelines about role security planning, see [Securing privileged access for hybrid and cloud deployments in Azure AD](security-planning.md).
## Restrict who can create applications
By default in Azure AD, all users can register applications and manage all aspec
### To disable the default ability to create application registrations or consent to applications
-1. Sign in to your Azure AD organization with an account that eligible for the Global administrator role in your Azure AD organization.
+1. Sign in to your Azure AD organization with an account that is eligible for the Global Administrator role.
1. Set one or both of the following:
   - On the [User settings page for your organization](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/UserSettings), set the **Users can register applications** setting to No. This will disable the default ability for users to create application registrations.
By default in Azure AD, all users can register applications and manage all aspec
### Grant individual permissions to create and consent to applications when the default ability is disabled
-Assign the Application developer role to grant the ability to create application registrations when the **Users can register applications** setting is set to No. This role also grants permission to consent on one's own behalf when the **Users can consent to apps accessing company data on their behalf** setting is set to No. As a system behavior, when a user creates a new application registration, they are automatically added as the first owner. Ownership permissions give the user the ability to manage all aspects of an application registration or enterprise application that they own.
+Assign the Application Developer role to grant the ability to create application registrations when the **Users can register applications** setting is set to No. This role also grants permission to consent on one's own behalf when the **Users can consent to apps accessing company data on their behalf** setting is set to No. As a system behavior, when a user creates a new application registration, they are automatically added as the first owner. Ownership permissions give the user the ability to manage all aspects of an application registration or enterprise application that they own.
## Assign application owners
Assigning owners is a simple way to grant the ability to manage all aspects of A
### Enterprise application owners
-As an owner, a user can manage the organization-specific configuration of the enterprise application, such as the single sign-on configuration, provisioning, and user assignments. An owner can also add or remove other owners. Unlike Global administrators, owners can manage only the enterprise applications they own.
+As an owner, a user can manage the organization-specific configuration of the enterprise application, such as the single sign-on configuration, provisioning, and user assignments. An owner can also add or remove other owners. Unlike Global Administrators, owners can manage only the enterprise applications they own.
In some cases, enterprise applications created from the application gallery include both an enterprise application and an application registration. When this is true, adding an owner to the enterprise application automatically adds the owner to the corresponding application registration as an owner.

### To assign an owner to an enterprise application
-1. Sign in to [your Azure AD organization](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview) with an account that eligible for the Application administrator or Cloud application administrator for the organization.
+1. Sign in to [your Azure AD organization](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Overview) with an account that is eligible for the Application Administrator or Cloud Application Administrator role for the organization.
1. On the [App registrations page](https://portal.azure.com/#blade/Microsoft_AAD_IAM/StartboardApplicationsMenuBlade/AllApps/menuId/) for the organization, select an app to open the Overview page for the app.
1. Select **Owners** to see the list of the owners for the app.
1. Select **Add** to select one or more owners to add to the app.
For more information on the basics of custom roles, see the [custom roles overvi
## Next steps
- [Application registration subtypes and permissions](custom-available-permissions.md)
-- [Azure AD administrator role reference](permissions-reference.md)
+- [Azure AD built-in roles](permissions-reference.md)
active-directory Delegate By Task https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/delegate-by-task.md
In this article, you can find the information needed to restrict a user's admini
## Application proxy
-Task | Least privileged role | Additional roles
-- | | -
-Configure application proxy app | Application administrator |
-Configure connector group properties | Application administrator |
-Create application registration when ability is disabled for all users | Application developer | Cloud Application administrator, Application Administrator
-Create connector group | Application administrator |
-Delete connector group | Application administrator |
-Disable application proxy | Application administrator |
-Download connector service | Application administrator |
-Read all configuration | Application administrator |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | | - |
+> | Configure application proxy app | Application Administrator | |
+> | Configure connector group properties | Application Administrator | |
+> | Create application registration when ability is disabled for all users | Application Developer | Cloud Application Administrator, Application Administrator |
+> | Create connector group | Application Administrator | |
+> | Delete connector group | Application Administrator | |
+> | Disable application proxy | Application Administrator | |
+> | Download connector service | Application Administrator | |
+> | Read all configuration | Application Administrator | |
## External Identities/B2C
-Task | Least privileged role | Additional roles
-- | | -
-Create Azure AD B2C directories | All non-guest users ([see documentation](../fundamentals/users-default-permissions.md)) |
-Create B2C applications | Global Administrator |
-Create enterprise applications | Cloud Application Administrator | Application Administrator
-Create, read, update, and delete B2C policies | B2C IEF Policy Administrator |
-Create, read, update, and delete identity providers | External Identity Provider Administrator |
-Create, read, update, and delete password reset user flows | External ID User Flow Administrator |
-Create, read, update, and delete profile editing user flows | External ID User Flow Administrator |
-Create, read, update, and delete sign-in user flows | External ID User Flow Administrator |
-Create, read, update, and delete sign-up user flow |External ID User Flow Administrator |
-Create, read, update, and delete user attributes | External ID User Flow Attribute Administrator |
-Create, read, update, and delete users | User Administrator
-Read all configuration | Global reader |
-Read B2C audit logs | Global reader ([see documentation](../../active-directory-b2c/faq.yml)) |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | | - |
+> | Create Azure AD B2C directories | All non-guest users ([see documentation](../fundamentals/users-default-permissions.md)) | |
+> | Create B2C applications | Global Administrator | |
+> | Create enterprise applications | Cloud Application Administrator | Application Administrator |
+> | Create, read, update, and delete B2C policies | B2C IEF Policy Administrator | |
+> | Create, read, update, and delete identity providers | External Identity Provider Administrator | |
+> | Create, read, update, and delete password reset user flows | External ID User Flow Administrator | |
+> | Create, read, update, and delete profile editing user flows | External ID User Flow Administrator | |
+> | Create, read, update, and delete sign-in user flows | External ID User Flow Administrator | |
+> | Create, read, update, and delete sign-up user flow |External ID User Flow Administrator | |
+> | Create, read, update, and delete user attributes | External ID User Flow Attribute Administrator | |
+> | Create, read, update, and delete users | User Administrator | |
+> | Read all configuration | Global Reader | |
+> | Read B2C audit logs | Global Reader ([see documentation](../../active-directory-b2c/faq.yml)) | |
> [!NOTE]
-> Azure AD B2C Global administrators do not have the same permissions as Azure AD global administrators. If you have Azure AD B2C global administrator privileges, make sure that you are in an Azure AD B2C directory and not an Azure AD directory.
+> Azure AD B2C Global Administrators do not have the same permissions as Azure AD Global Administrators. If you have Azure AD B2C Global Administrator privileges, make sure that you are in an Azure AD B2C directory and not an Azure AD directory.
## Company branding
-Task | Least privileged role | Additional roles
-- | | -
-Configure company branding | Global Administrator |
-Read all configuration | Directory readers | Default user role ([see documentation](../fundamentals/users-default-permissions.md))
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | | - |
+> | Configure company branding | Global Administrator | |
> | Read all configuration | Directory Readers | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) |
## Company properties
-Task | Least privileged role | Additional roles
-- | | -
-Configure company properties | Global Administrator |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | | - |
+> | Configure company properties | Global Administrator | |
## Connect
-Task | Least privileged role | Additional roles
-- | | -
-Passthrough authentication | Global Administrator |
-Read all configuration | Global reader | Global Administrator |
-Seamless single sign-on | Global Administrator |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Passthrough authentication | Global Administrator | |
+> | Read all configuration | Global Reader | Global Administrator |
+> | Seamless single sign-on | Global Administrator | |
## Cloud Provisioning
-Task | Least privileged role | Additional roles
-- | | -
-Passthrough authentication | Hybrid Identity Administrator |
-Read all configuration | Global reader | Hybrid Identity Administrator |
-Seamless single sign-on | Hybrid Identity Administrator |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Passthrough authentication | Hybrid Identity Administrator | |
+> | Read all configuration | Global Reader | Hybrid Identity Administrator |
+> | Seamless single sign-on | Hybrid Identity Administrator | |
## Connect Health
-Task | Least privileged role | Additional roles
-- | | -
-Add or delete services | Owner ([see documentation](../hybrid/how-to-connect-health-operations.md)) |
-Apply fixes to sync error | Contributor ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Owner
-Configure notifications | Contributor ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Owner
-Configure settings | Owner ([see documentation](../hybrid/how-to-connect-health-operations.md)) |
-Configure sync notifications | Contributor ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Owner
-Read ADFS security reports | Security Reader | Contributor, Owner
-Read all configuration | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner
-Read sync errors | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner
-Read sync services | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner
-View metrics and alerts | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner
-View metrics and alerts | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner
-View sync service metrics and alerts | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Add or delete services | Owner ([see documentation](../hybrid/how-to-connect-health-operations.md)) | |
+> | Apply fixes to sync error | Contributor ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Owner |
+> | Configure notifications | Contributor ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Owner |
+> | Configure settings | Owner ([see documentation](../hybrid/how-to-connect-health-operations.md)) | |
+> | Configure sync notifications | Contributor ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Owner |
+> | Read ADFS security reports | Security Reader | Contributor, Owner |
+> | Read all configuration | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner |
+> | Read sync errors | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner |
+> | Read sync services | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner |
+> | View metrics and alerts | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner |
+> | View sync service metrics and alerts | Reader ([see documentation](../fundamentals/users-default-permissions.md?context=azure%2factive-directory%2fusers-groups-roles%2fcontext%2fugr-context)) | Contributor, Owner |
## Custom domain names
-Task | Least privileged role | Additional roles
-- | | -
-Manage domains | Domain Name Administrator |
-Read all configuration | Directory readers | Default user role ([see documentation](../fundamentals/users-default-permissions.md))
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Manage domains | Domain Name Administrator | |
+> | Read all configuration | Directory Readers | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) |
## Domain Services
-Task | Least privileged role | Additional roles
-- | | -
-Create Azure AD Domain Services instance | Global Administrator |
-Perform all Azure AD Domain Services tasks | Azure AD DC Administrators group ([see documentation](../../active-directory-domain-services/tutorial-create-management-vm.md#administrative-tasks-you-can-perform-on-a-managed-domain)) |
-Read all configuration | Reader on Azure subscription containing AD DS service |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Create Azure AD Domain Services instance | Global Administrator | |
+> | Perform all Azure AD Domain Services tasks | Azure AD DC Administrators group ([see documentation](../../active-directory-domain-services/tutorial-create-management-vm.md#administrative-tasks-you-can-perform-on-a-managed-domain)) | |
+> | Read all configuration | Reader on Azure subscription containing AD DS service | |
## Devices
-Task | Least privileged role | Additional roles
-- | | -
-Disable device | Cloud device administrator |
-Enable device | Cloud device administrator |
-Read basic configuration | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) |
-Read BitLocker keys | Security Reader | Password administrator, Security administrator
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Disable device | Cloud Device Administrator | |
+> | Enable device | Cloud Device Administrator | |
+> | Read basic configuration | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) | |
+> | Read BitLocker keys | Security Reader | Password Administrator, Security Administrator |
## Enterprise applications
-Task | Least privileged role | Additional roles
-- | | -
-Consent to any delegated permissions | Cloud application administrator | Application administrator
-Consent to application permissions not including Microsoft Graph | Cloud application administrator | Application administrator
-Consent to application permissions to Microsoft Graph | Privileged Role Administrator |
-Consent to applications accessing own data | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) |
-Create enterprise application | Cloud application administrator | Application administrator
-Manage Application Proxy | Application administrator |
-Manage user settings | Global Administrator |
-Read access review of a group or of an app | Security Reader | Security Administrator, User Administrator
-Read all configuration | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) |
-Update enterprise application assignments | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud application administrator, Application administrator
-Update enterprise application owners | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud application administrator, Application administrator
-Update enterprise application properties | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud application administrator, Application administrator
-Update enterprise application provisioning | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud application administrator, Application administrator
-Update enterprise application self-service | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud application administrator, Application administrator
-Update single sign-on properties | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud application administrator, Application administrator
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Consent to any delegated permissions | Cloud Application Administrator | Application Administrator |
+> | Consent to application permissions not including Microsoft Graph | Cloud Application Administrator | Application Administrator |
+> | Consent to application permissions to Microsoft Graph | Privileged Role Administrator | |
+> | Consent to applications accessing own data | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) | |
+> | Create enterprise application | Cloud Application Administrator | Application Administrator |
+> | Manage Application Proxy | Application Administrator | |
+> | Manage user settings | Global Administrator | |
+> | Read access review of a group or of an app | Security Reader | Security Administrator, User Administrator |
+> | Read all configuration | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) | |
+> | Update enterprise application assignments | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud Application Administrator, Application Administrator |
+> | Update enterprise application owners | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud Application Administrator, Application Administrator |
+> | Update enterprise application properties | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud Application Administrator, Application Administrator |
+> | Update enterprise application provisioning | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud Application Administrator, Application Administrator |
+> | Update enterprise application self-service | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud Application Administrator, Application Administrator |
+> | Update single sign-on properties | Enterprise application owner ([see documentation](../fundamentals/users-default-permissions.md)) | Cloud Application Administrator, Application Administrator |
## Entitlement management
-Task | Least privileged role | Additional roles
-- | | -
-Add resources to a catalog | User administrator | With entitlement management, you can delegate this task to the catalog owner ([see documentation](../governance/entitlement-management-catalog-create.md#add-additional-catalog-owners))
-Add SharePoint Online sites to catalog | Global administrator
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Add resources to a catalog | User Administrator | With entitlement management, you can delegate this task to the catalog owner ([see documentation](../governance/entitlement-management-catalog-create.md#add-additional-catalog-owners)) |
+> | Add SharePoint Online sites to catalog | Global Administrator | |
## Groups
-Task | Least privileged role | Additional roles
-- | | -
-Assign license | User administrator |
-Create group | Groups administrator | User administrator
-Create, update, or delete access review of a group or of an app | User administrator |
-Manage group expiration | User administrator |
-Manage group settings | Groups Administrator | User Administrator |
-Read all configuration (except hidden membership) | Directory readers | Default user role ([see documentation](../fundamentals/users-default-permissions.md))
-Read hidden membership | Group member | Group owner, Password administrator, Exchange administrator, SharePoint administrator, Teams administrator, User administrator
-Read membership of groups with hidden membership | Helpdesk Administrator | User administrator, Teams administrator
-Revoke license | License administrator | User administrator
-Update group membership | Group owner ([see documentation](../fundamentals/users-default-permissions.md)) | User administrator
-Update group owners | Group owner ([see documentation](../fundamentals/users-default-permissions.md)) | User administrator
-Update group properties | Group owner ([see documentation](../fundamentals/users-default-permissions.md)) | User administrator
-Delete group | Groups administrator | User administrator
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Assign license | User Administrator | |
+> | Create group | Groups Administrator | User Administrator |
+> | Create, update, or delete access review of a group or of an app | User Administrator | |
+> | Manage group expiration | User Administrator | |
+> | Manage group settings | Groups Administrator | User Administrator |
+> | Read all configuration (except hidden membership) | Directory Readers | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) |
+> | Read hidden membership | Group member | Group owner, Password Administrator, Exchange Administrator, SharePoint Administrator, Teams Administrator, User Administrator |
+> | Read membership of groups with hidden membership | Helpdesk Administrator | User Administrator, Teams Administrator |
+> | Revoke license | License Administrator | User Administrator |
+> | Update group membership | Group owner ([see documentation](../fundamentals/users-default-permissions.md)) | User Administrator |
+> | Update group owners | Group owner ([see documentation](../fundamentals/users-default-permissions.md)) | User Administrator |
+> | Update group properties | Group owner ([see documentation](../fundamentals/users-default-permissions.md)) | User Administrator |
+> | Delete group | Groups Administrator | User Administrator |
## Identity Protection
-Task | Least privileged role | Additional roles
-- | | -
-Configure alert notifications| Security Administrator |
-Configure and enable or disable MFA policy| Security Administrator |
-Configure and enable or disable sign-in risk policy| Security Administrator |
-Configure and enable or disable user risk policy | Security Administrator |
-Configure weekly digests | Security Administrator|
-Dismiss all risk detections | Security Administrator |
-Fix or dismiss vulnerability | Security Administrator |
-Read all configuration | Security Reader |
-Read all risk detections | Security Reader |
-Read vulnerabilities | Security Reader |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Configure alert notifications| Security Administrator | |
+> | Configure and enable or disable MFA policy| Security Administrator | |
+> | Configure and enable or disable sign-in risk policy| Security Administrator | |
+> | Configure and enable or disable user risk policy | Security Administrator | |
+> | Configure weekly digests | Security Administrator | |
+> | Dismiss all risk detections | Security Administrator | |
+> | Fix or dismiss vulnerability | Security Administrator | |
+> | Read all configuration | Security Reader | |
+> | Read all risk detections | Security Reader | |
+> | Read vulnerabilities | Security Reader | |
## Licenses
-Task | Least privileged role | Additional roles
-- | | -
-Assign license | License administrator | User administrator
-Read all configuration | Directory readers | Default user role ([see documentation](../fundamentals/users-default-permissions.md))
-Revoke license | License administrator | User administrator
-Try or buy subscription | Billing administrator |
-
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Assign license | License Administrator | User Administrator |
+> | Read all configuration | Directory Readers | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) |
+> | Revoke license | License Administrator | User Administrator |
+> | Try or buy subscription | Billing Administrator | |
## Monitoring - Audit logs
-Task | Least privileged role | Additional roles
-- | | -
-Read audit logs | Reports reader | Security Reader, Security administrator
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Read audit logs | Reports Reader | Security Reader, Security Administrator |
## Monitoring - Sign-ins
-Task | Least privileged role | Additional roles
-- | | -
-Read sign-in logs | Reports reader | Security Reader, Security administrator
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Read sign-in logs | Reports Reader | Security Reader, Security Administrator |
## Multi-factor authentication
-Task | Least privileged role | Additional roles
-- | | -
-Delete all existing app passwords generated by the selected users | Global Administrator |
-Disable MFA | Authentication Administrator (via PowerShell) | Privileged Authentication Administrator (via PowerShell)
-Enable MFA | Authentication Administrator (via PowerShell) | Privileged Authentication Administrator (via PowerShell)
-Manage MFA service settings | Authentication Policy Administrator |
-Require selected users to provide contact methods again | Authentication Administrator |
-Restore multi-factor authentication on all remembered devices  | Authentication Administrator |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Delete all existing app passwords generated by the selected users | Global Administrator | |
+> | Disable MFA | Authentication Administrator (via PowerShell) | Privileged Authentication Administrator (via PowerShell) |
+> | Enable MFA | Authentication Administrator (via PowerShell) | Privileged Authentication Administrator (via PowerShell) |
+> | Manage MFA service settings | Authentication Policy Administrator | |
+> | Require selected users to provide contact methods again | Authentication Administrator | |
+> | Restore multi-factor authentication on all remembered devices | Authentication Administrator | |
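Several rows above note that per-user MFA is managed "via PowerShell" or through a user's registered authentication methods. As an illustrative sketch (not from this article), the snippet below composes, but does not send, the Microsoft Graph call that lists a user's registered authentication methods; the user ID and token are placeholders, and the caller needs a sufficiently privileged role such as Authentication Administrator per the table above.

```python
# Hypothetical sketch: build the request for
# GET /users/{id}/authentication/methods (Microsoft Graph v1.0).
# "alice@contoso.com" and the token are placeholders.
BASE = "https://graph.microsoft.com/v1.0"

def auth_methods_request(user_id: str, token: str) -> dict:
    """Return the URL and headers for the authentication-methods call."""
    return {
        "method": "GET",
        "url": f"{BASE}/users/{user_id}/authentication/methods",
        "headers": {"Authorization": f"Bearer {token}"},
    }

req = auth_methods_request("alice@contoso.com", "<access-token>")
print(req["url"])
```

Sending the request (with any HTTP client) returns the user's registered methods, which is the programmatic counterpart of the "Require selected users to provide contact methods again" task above.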
## MFA Server
-Task | Least privileged role | Additional roles
-- | | -
-Block/unblock users | Authentication Policy Administrator |
-Configure account lockout | Authentication Policy Administrator |
-Configure caching rules | Authentication Policy Administrator |
-Configure fraud alert | Authentication Policy Administrator
-Configure notifications | Authentication Policy Administrator |
-Configure one-time bypass | Authentication Policy Administrator |
-Configure phone call settings | Authentication Policy Administrator |
-Configure providers | Authentication Policy Administrator |
-Configure server settings | Authentication Policy Administrator |
-Read activity report | Global reader |
-Read all configuration | Global reader |
-Read server status | Global reader |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Block/unblock users | Authentication Policy Administrator | |
+> | Configure account lockout | Authentication Policy Administrator | |
+> | Configure caching rules | Authentication Policy Administrator | |
+> | Configure fraud alert | Authentication Policy Administrator | |
+> | Configure notifications | Authentication Policy Administrator | |
+> | Configure one-time bypass | Authentication Policy Administrator | |
+> | Configure phone call settings | Authentication Policy Administrator | |
+> | Configure providers | Authentication Policy Administrator | |
+> | Configure server settings | Authentication Policy Administrator | |
+> | Read activity report | Global Reader | |
+> | Read all configuration | Global Reader | |
+> | Read server status | Global Reader | |
## Organizational relationships
-Task | Least privileged role | Additional roles
-- | | -
-Manage identity providers | External Identity Provider Administrator |
-Manage settings | Global Administrator |
-Manage terms of use | Global Administrator |
-Read all configuration | Global reader |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Manage identity providers | External Identity Provider Administrator | |
+> | Manage settings | Global Administrator | |
+> | Manage terms of use | Global Administrator | |
+> | Read all configuration | Global Reader | |
## Password reset
-Task | Least privileged role | Additional roles
-- | | -
-Configure authentication methods | Global Administrator |
-Configure customization | Global Administrator |
-Configure notification | Global Administrator |
-Configure on-premises integration | Global Administrator |
-Configure password reset properties | User Administrator | Global Administrator
-Configure registration | Global Administrator |
-Read all configuration | Security Administrator | User Administrator |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Configure authentication methods | Global Administrator | |
+> | Configure customization | Global Administrator | |
+> | Configure notification | Global Administrator | |
+> | Configure on-premises integration | Global Administrator | |
+> | Configure password reset properties | User Administrator | Global Administrator |
+> | Configure registration | Global Administrator | |
+> | Read all configuration | Security Administrator | User Administrator |
## Privileged identity management
-Task | Least privileged role | Additional roles
-- | | -
-Assign users to roles | Privileged role administrator |
-Configure role settings | Privileged role administrator |
-View audit activity | Security reader |
-View role memberships | Security reader |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Assign users to roles | Privileged Role Administrator | |
+> | Configure role settings | Privileged Role Administrator | |
+> | View audit activity | Security Reader | |
+> | View role memberships | Security Reader | |
## Roles and administrators
-Task | Least privileged role | Additional roles
-- | | -
-Manage role assignments | Privileged role administrator |
-Read access review of an Azure AD role | Security Reader | Security administrator, Privileged role administrator
-Read all configuration | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Manage role assignments | Privileged Role Administrator | |
+> | Read access review of an Azure AD role | Security Reader | Security Administrator, Privileged Role Administrator |
+> | Read all configuration | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) | |
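The "Manage role assignments" task above maps, programmatically, to creating a `unifiedRoleAssignment` via Microsoft Graph (`POST /v1.0/roleManagement/directory/roleAssignments`), an operation that itself requires Privileged Role Administrator. A minimal sketch of the request body, with placeholder IDs:

```python
# Hypothetical sketch: serialize the body for a tenant-wide Microsoft Graph
# unifiedRoleAssignment. Both IDs below are placeholders.
import json

def role_assignment_body(principal_id: str, role_definition_id: str) -> str:
    """Build the JSON body assigning a directory role across the whole tenant."""
    return json.dumps({
        "@odata.type": "#microsoft.graph.unifiedRoleAssignment",
        "principalId": principal_id,          # object ID of the user or group
        "roleDefinitionId": role_definition_id,
        "directoryScopeId": "/",              # "/" scopes the assignment tenant-wide
    })

body = role_assignment_body("<user-object-id>", "<role-definition-id>")
```

Using `directoryScopeId` values narrower than `"/"` (for example, an administrative unit) is how assignments are limited below tenant scope.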
## Security - Authentication methods
-Task | Least privileged role | Additional roles
-- | | -
-Configure authentication methods | Global Administrator |
-Configure password protection | Security administrator
-Configure smart lockout | Security administrator
-Read all configuration | Global reader |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Configure authentication methods | Global Administrator | |
+> | Configure password protection | Security Administrator | |
+> | Configure smart lockout | Security Administrator | |
+> | Read all configuration | Global Reader | |
## Security - Conditional Access
-Task | Least privileged role | Additional roles
-- | | -
-Configure MFA trusted IP addresses | Conditional Access administrator |
-Create custom controls | Conditional Access administrator | Security administrator
-Create named locations | Conditional Access administrator | Security administrator
-Create policies | Conditional Access administrator | Security administrator
-Create terms of use | Conditional Access administrator | Security administrator
-Create VPN connectivity certificate | Conditional Access administrator | Security administrator
-Delete classic policy | Conditional Access administrator | Security administrator
-Delete terms of use | Conditional Access administrator | Security administrator
-Delete VPN connectivity certificate | Conditional Access administrator | Security administrator
-Disable classic policy | Conditional Access administrator | Security administrator
-Manage custom controls | Conditional Access administrator | Security administrator
-Manage named locations | Conditional Access administrator | Security administrator
-Manage terms of use | Conditional Access administrator | Security administrator
-Read all configuration | Security reader | Security administrator
-Read named locations | Security reader | Conditional Access administrator, security administrator
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Configure MFA trusted IP addresses | Conditional Access Administrator | |
+> | Create custom controls | Conditional Access Administrator | Security Administrator |
+> | Create named locations | Conditional Access Administrator | Security Administrator |
+> | Create policies | Conditional Access Administrator | Security Administrator |
+> | Create terms of use | Conditional Access Administrator | Security Administrator |
+> | Create VPN connectivity certificate | Conditional Access Administrator | Security Administrator |
+> | Delete classic policy | Conditional Access Administrator | Security Administrator |
+> | Delete terms of use | Conditional Access Administrator | Security Administrator |
+> | Delete VPN connectivity certificate | Conditional Access Administrator | Security Administrator |
+> | Disable classic policy | Conditional Access Administrator | Security Administrator |
+> | Manage custom controls | Conditional Access Administrator | Security Administrator |
+> | Manage named locations | Conditional Access Administrator | Security Administrator |
+> | Manage terms of use | Conditional Access Administrator | Security Administrator |
+> | Read all configuration | Security Reader | Security Administrator |
+> | Read named locations | Security Reader | Conditional Access Administrator, Security Administrator |
## Security - Identity security score
-Task | Least privileged role | Additional roles |
-- | | -
-Read all configuration | Security reader | Security administrator
-Read security score | Security reader | Security administrator
-Update event status | Security administrator |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Read all configuration | Security Reader | Security Administrator |
+> | Read security score | Security Reader | Security Administrator |
+> | Update event status | Security Administrator | |
## Security - Risky sign-ins
-Task | Least privileged role | Additional roles
-- | | -
-Read all configuration | Security Reader |
-Read risky sign-ins | Security Reader |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Read all configuration | Security Reader | |
+> | Read risky sign-ins | Security Reader | |
## Security - Users flagged for risk
-Task | Least privileged role | Additional roles
-- | | -
-Dismiss all events | Security Administrator |
-Read all configuration | Security Reader |
-Read users flagged for risk | Security Reader |
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Dismiss all events | Security Administrator | |
+> | Read all configuration | Security Reader | |
+> | Read users flagged for risk | Security Reader | |
## Users
-Task | Least privileged role | Additional roles
-- | | -
-Add user to directory role | Privileged role administrator |
-Add user to group | User administrator |
-Assign license | License administrator | User administrator
-Create guest user | Guest inviter | User administrator
-Create user | User administrator |
-Delete users | User administrator |
-Invalidate refresh tokens of limited admins (see documentation) | User administrator |
-Invalidate refresh tokens of non-admins (see documentation) | Password administrator | User administrator
-Invalidate refresh tokens of privileged admins (see documentation) | Privileged Authentication Administrator |
-Read basic configuration | Default User role ([see documentation](../fundamentals/users-default-permissions.md) |
-Reset password for limited admins (see documentation) | User administrator |
-Reset password of non-admins (see documentation) | Password administrator | User administrator
-Reset password of privileged admins | Privileged Authentication Administrator |
-Revoke license | License administrator | User administrator
-Update all properties except User Principal Name | User administrator |
-Update User Principal Name for limited admins (see documentation) | User administrator |
-Update User Principal Name property on privileged admins (see documentation) | Global Administrator |
-Update user settings | Global Administrator |
-Update Authentication methods | Authentication Administrator | Privileged Authentication Administrator, Global Administrator
-
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Add user to directory role | Privileged Role Administrator | |
+> | Add user to group | User Administrator | |
+> | Assign license | License Administrator | User Administrator |
+> | Create guest user | Guest Inviter | User Administrator |
+> | Create user | User Administrator | |
+> | Delete users | User Administrator | |
+> | Invalidate refresh tokens of limited admins (see documentation) | User Administrator | |
+> | Invalidate refresh tokens of non-admins (see documentation) | Password Administrator | User Administrator |
+> | Invalidate refresh tokens of privileged admins (see documentation) | Privileged Authentication Administrator | |
+> | Read basic configuration | Default user role ([see documentation](../fundamentals/users-default-permissions.md)) | |
+> | Reset password for limited admins (see documentation) | User Administrator | |
+> | Reset password of non-admins (see documentation) | Password Administrator | User Administrator |
+> | Reset password of privileged admins | Privileged Authentication Administrator | |
+> | Revoke license | License Administrator | User Administrator |
+> | Update all properties except User Principal Name | User Administrator | |
+> | Update User Principal Name for limited admins (see documentation) | User Administrator | |
+> | Update User Principal Name property on privileged admins (see documentation) | Global Administrator | |
+> | Update user settings | Global Administrator | |
+> | Update Authentication methods | Authentication Administrator | Privileged Authentication Administrator, Global Administrator |
## Support
-Task | Least privileged role | Additional roles
-- | | -
-Submit support ticket | Service Administrator | Application Administrator, Azure Information Protection Administrator, Billing Administrator, Cloud Application Administrator, Compliance Administrator, Dynamics 365 Administrator, Desktop Analytics Administrator, Exchange Administrator, Password Administrator, Intune Administrator, Skype for Business Administrator, Power BI Administrator, Privileged Authentication Administrator, SharePoint Administrator, Teams Communications Administrator, Teams Administrator, User Administrator, Workplace Analytics Administrator
+> [!div class="mx-tableFixed"]
+> | Task | Least privileged role | Additional roles |
+> | - | - | - |
+> | Submit support ticket | Service Administrator | Application Administrator, Azure Information Protection Administrator, Billing Administrator, Cloud Application Administrator, Compliance Administrator, Dynamics 365 Administrator, Desktop Analytics Administrator, Exchange Administrator, Password Administrator, Intune Administrator, Skype for Business Administrator, Power BI Administrator, Privileged Authentication Administrator, SharePoint Administrator, Teams Communications Administrator, Teams Administrator, User Administrator, Workplace Analytics Administrator |
## Next steps

* [How to assign or remove Azure AD administrator roles](manage-roles-portal.md)
-* [Azure AD administrator roles reference](permissions-reference.md)
+* [Azure AD built-in roles](permissions-reference.md)
active-directory Groups Concept https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/groups-concept.md
# Use cloud groups to manage role assignments in Azure Active Directory (preview)
-Azure Active Directory (Azure AD) is introducing a public preview in which you can assign a cloud group to Azure AD built-in roles. With this feature, you can use groups to grant admin access in Azure AD with minimal effort from your Global and Privileged role admins.
+Azure Active Directory (Azure AD) is introducing a public preview in which you can assign a cloud group to Azure AD built-in roles. With this feature, you can use groups to grant admin access in Azure AD with minimal effort from your Global Administrators and Privileged Role Administrators.
-Consider this example: Contoso has hired people across geographies to manage and reset passwords for employees in its Azure AD organization. Instead of asking a Privileged role admin or Global admin to assign the Helpdesk admin role to each person individually, they can create a Contoso_Helpdesk_Administrators group and assign it to the role. When people join the group, they are assigned the role indirectly. Your existing governance workflow can then take care of the approval process and auditing of the group's membership to ensure that only legitimate users are members of the group and are thus assigned to the Helpdesk admin role.
+Consider this example: Contoso has hired people across geographies to manage and reset passwords for employees in its Azure AD organization. Instead of asking a Privileged Role Administrator or Global Administrator to assign the Helpdesk Administrator role to each person individually, they can create a Contoso_Helpdesk_Administrators group and assign it to the role. When people join the group, they are assigned the role indirectly. Your existing governance workflow can then take care of the approval process and auditing of the group's membership to ensure that only legitimate users are members of the group and are thus assigned to the Helpdesk Administrator role.
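A group like Contoso_Helpdesk_Administrators can also be created programmatically. The following is a minimal sketch of the request body for Microsoft Graph's `POST /groups` endpoint; the display name and mail nickname are illustrative placeholders, and authentication handling is omitted:

```python
import json

# Sketch: request body for creating a role-assignable group via
# Microsoft Graph (POST https://graph.microsoft.com/v1.0/groups).
# Assumes the caller holds a role permitted to create such groups.
def role_assignable_group_body(display_name: str, mail_nickname: str) -> dict:
    return {
        "displayName": display_name,
        "mailNickname": mail_nickname,
        "mailEnabled": False,
        "securityEnabled": True,
        # Must be set to true at creation time; the property is immutable.
        "isAssignableToRole": True,
        # Role-assignable groups must use assigned (not dynamic) membership.
        "groupTypes": [],
    }

body = role_assignable_group_body("Contoso_Helpdesk_Administrators",
                                  "contosohelpdesk")
print(json.dumps(body, indent=2))
```

Because `isAssignableToRole` can't be changed after creation, setting it in the creation body is the only opportunity to make the group role-assignable.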
## How this feature works
If you do not want members of the group to have standing access to the role, you
## Why we enforce creation of a special group for assigning it to a role
-If a group is assigned a role, any IT admin who can manage group membership could also indirectly manage the membership of that role. For example, assume that a group Contoso_User_Administrators is assigned to User account admin role. An Exchange admin who can modify group membership could add themselves to the Contoso_User_Administrators group and in that way become a User account admin. As you can see, an admin could elevate their privilege in a way you did not intend.
+If a group is assigned a role, any IT admin who can manage group membership could also indirectly manage the membership of that role. For example, assume that a group Contoso_User_Administrators is assigned to the User account admin role. An Exchange Administrator who can modify group membership could add themselves to the Contoso_User_Administrators group and in that way become a User account admin. As you can see, an admin could elevate their privilege in a way you did not intend.
Azure AD allows you to protect a group assigned to a role by using a new property called isAssignableToRole for groups. Only cloud groups that had the isAssignableToRole property set to 'true' at creation time can be assigned to a role. This property is immutable; once a group is created with this property set to 'true', it can't be changed. You can't set the property on an existing group. We designed how groups are assigned to roles to help prevent potential breaches from happening:
-- Only Global admins and Privileged role admins can create a role-assignable group (with the "isAssignableToRole" property enabled).
+- Only Global Administrators and Privileged Role Administrators can create a role-assignable group (with the "isAssignableToRole" property enabled).
- It can't be an Azure AD dynamic group; that is, it must have a membership type of "Assigned." Automated population of dynamic groups could lead to an unwanted account being added to the group and thus assigned to the role.
-- By default, only Global admins and Privileged role admins can manage the membership of a role-assignable group, but you can delegate the management of role-assignable groups by adding group owners.
-- To prevent elevation of privilege, the credentials of members and owners of a role-assignable group can be changed only by a Privileged Authentication administrator or a Global administrator.
+- By default, only Global Administrators and Privileged Role Administrators can manage the membership of a role-assignable group, but you can delegate the management of role-assignable groups by adding group owners.
+- To prevent elevation of privilege, the credentials of members and owners of a role-assignable group can be changed only by a Privileged Authentication Administrator or a Global Administrator.
- No nesting. A group can't be added as a member of a role-assignable group.

## Limitations
active-directory Groups Faq Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/groups-faq-troubleshooting.md
Here are some common questions and troubleshooting tips for assigning roles to g
**Q:** I'm a Groups Administrator but I can't see the **Azure AD roles can be assigned to the group** switch.
-**A:** Only Privileged Role administrators or Global administrators can create a group that's eligible for role assignment. Only users in those roles see this control.
+**A:** Only Privileged Role Administrators or Global Administrators can create a group that's eligible for role assignment. Only users in those roles see this control.
**Q:** Who can modify the membership of groups that are assigned to Azure AD roles?

**A:** By default, only Privileged Role Administrators and Global Administrators can manage the membership of a role-assignable group, but you can delegate the management of role-assignable groups by adding group owners.
-**Q**: I am a Helpdesk Administrator in my organization but I can't update password of a user who is a Directory Reader. Why does that happen?
+**Q**: I am a Helpdesk Administrator in my organization but I can't update the password of a user who has the Directory Readers role. Why does that happen?
-**A**: The user might have gotten Directory Reader by way of a role-assignable group. All members and owners of a role-assignable groups are protected. Only users in the Privileged Authentication Administrator or Global Administrator roles can reset credentials for a protected user.
+**A**: The user might have gotten the Directory Readers role by way of a role-assignable group. All members and owners of role-assignable groups are protected. Only users in the Privileged Authentication Administrator or Global Administrator roles can reset credentials for a protected user.
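The protection check described above can be sketched in code: inspect the user's owned objects for any role-assignable group. A minimal sketch in Python, assuming the object shapes returned in the `value` array of Microsoft Graph's `GET /users/{id}/ownedObjects`; the sample payload is illustrative, not real directory data:

```python
# Sketch: given the "value" array from GET /users/{id}/ownedObjects,
# report whether the user owns any role-assignable group
# (isAssignableToRole = true), which would make the user protected.
def owns_role_assignable_group(owned_objects):
    return any(
        obj.get("@odata.type") == "#microsoft.graph.group"
        and obj.get("isAssignableToRole") is True
        for obj in owned_objects
    )

# Illustrative sample data only.
sample = [
    {"@odata.type": "#microsoft.graph.application", "displayName": "HR App"},
    {"@odata.type": "#microsoft.graph.group",
     "displayName": "Contoso_Security_Admins",
     "isAssignableToRole": True},
]
print(owns_role_assignable_group(sample))  # True: the user is protected
```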
**Q:** I can't update the password of a user. They don't have any higher-privileged role assigned. Why is this happening?
-**A:** The user could be an owner of a role-assignable group. We protect owners of role-assignable groups to avoid elevation of privilege. An example might be if a group Contoso_Security_Admins is assigned to Security administrator role, where Bob is the group owner and Alice is Password administrator in the organization. If this protection weren't present, Alice could reset Bob's credentials and take over his identity. After that, Alice could add herself or anyone to the group Contoso_Security_Admins group to become a Security administrator in the organization. To find out if a user is a group owner, get the list of owned objects of that user and see if any of the groups have isAssignableToRole set to true. If yes, then that user is protected and the behavior is by design. Refer to these documentations for getting owned objects:
+**A:** The user could be an owner of a role-assignable group. We protect owners of role-assignable groups to avoid elevation of privilege. An example might be if a group Contoso_Security_Admins is assigned to the Security Administrator role, where Bob is the group owner and Alice is a Password Administrator in the organization. If this protection weren't present, Alice could reset Bob's credentials and take over his identity. After that, Alice could add herself or anyone else to the Contoso_Security_Admins group to become a Security Administrator in the organization. To find out if a user is a group owner, get the list of owned objects of that user and see if any of the groups have isAssignableToRole set to true. If yes, then that user is protected and the behavior is by design. Refer to these articles for getting owned objects:
- [Get-AzureADUserOwnedObject](/powershell/module/azuread/get-azureaduserownedobject)
- [List ownedObjects](/graph/api/user-list-ownedobjects?tabs=http)

**Q:** Can I create an access review on groups that can be assigned to Azure AD roles (specifically, groups with the isAssignableToRole property set to true)?
-**A:** Yes, you can. If you are on newest version of Access Review, then your reviewers are directed to My Access by default, and only Global administrators can create access reviews on role-assignable groups. However, if you are on the older version of Access Review, then your reviewers are directed to the Access Panel by default, and both Global administrators and User administrator can create access reviews on role-assignable groups. The new experience will be rolled out to all customers on July 28, 2020 but if youΓÇÖd like to upgrade sooner, make a request to [Azure AD Access Reviews - Updated reviewer experience in My Access Signup](https://forms.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR5dv-S62099HtxdeKIcgO-NUOFJaRDFDWUpHRk8zQ1BWVU1MMTcyQ1FFUi4u).
+**A:** Yes, you can. If you are on the newest version of Access Review, then your reviewers are directed to My Access by default, and only Global Administrators can create access reviews on role-assignable groups. However, if you are on the older version of Access Review, then your reviewers are directed to the Access Panel by default, and both Global Administrators and User Administrators can create access reviews on role-assignable groups. The new experience will be rolled out to all customers on July 28, 2020, but if you'd like to upgrade sooner, make a request to [Azure AD Access Reviews - Updated reviewer experience in My Access Signup](https://forms.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR5dv-S62099HtxdeKIcgO-NUOFJaRDFDWUpHRk8zQ1BWVU1MMTcyQ1FFUi4u).
**Q:** Can I create an access package and put groups that can be assigned to Azure AD roles in it?
-**A:** Yes, you can. Global Administrator and User Administrator have the power to put any group in an access package. Nothing changes for Global Administrator, but there's a slight change in User administrator role permissions. To put a role-assignable group into an access package, you must be a User Administrator and also owner of the role-assignable group. Here's the full table showing who can create access package in Enterprise License Management:
+**A:** Yes, you can. Global Administrator and User Administrator have the power to put any group in an access package. Nothing changes for Global Administrator, but there's a slight change in User Administrator role permissions. To put a role-assignable group into an access package, you must be a User Administrator and also an owner of the role-assignable group. Here's the full table showing who can create an access package in Enterprise License Management:
Azure AD directory role | Entitlement management role | Can add security group\* | Can add Microsoft 365 group\* | Can add app | Can add SharePoint Online site
-- | -- | -- | -- | -- | --
-Global administrator | n/a | ✔️ | ✔️ | ✔️ | ✔️
-User administrator | n/a | ✔️ | ✔️ | ✔️
-Intune administrator | Catalog owner | ✔️ | ✔️ | &nbsp; | &nbsp;
-Exchange administrator | Catalog owner | &nbsp; | ✔️ | &nbsp; | &nbsp;
-Teams service administrator | Catalog owner | &nbsp; | ✔️ | &nbsp; | &nbsp;
-SharePoint administrator | Catalog owner | &nbsp; | ✔️ | &nbsp; | ✔️
-Application administrator | Catalog owner | &nbsp; | &nbsp; | ✔️ | &nbsp;
-Cloud application administrator | Catalog owner | &nbsp; | &nbsp; | ✔️ | &nbsp;
+Global Administrator | n/a | ✔️ | ✔️ | ✔️ | ✔️
+User Administrator | n/a | ✔️ | ✔️ | ✔️ | &nbsp;
+Intune Administrator | Catalog owner | ✔️ | ✔️ | &nbsp; | &nbsp;
+Exchange Administrator | Catalog owner | &nbsp; | ✔️ | &nbsp; | &nbsp;
+Teams Administrator | Catalog owner | &nbsp; | ✔️ | &nbsp; | &nbsp;
+SharePoint Administrator | Catalog owner | &nbsp; | ✔️ | &nbsp; | ✔️
+Application Administrator | Catalog owner | &nbsp; | &nbsp; | ✔️ | &nbsp;
+Cloud Application Administrator | Catalog owner | &nbsp; | &nbsp; | ✔️ | &nbsp;
User | Catalog owner | Only if group owner | Only if group owner | Only if app owner | &nbsp;

\*Group isn't role-assignable; that is, isAssignableToRole = false. If a group is role-assignable, then the person creating the access package must also be owner of the role-assignable group.
active-directory M365 Workload Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/m365-workload-docs.md
All products in Microsoft 365 can be managed with administrative roles in Azure
## Where to find content
-Microsoft 365 service | Role content | API content
-- | | --
-Admin roles in Office 365 and Microsoft 365 business plans | [Microsoft 365 admin roles](/office365/admin/add-users/about-admin-roles) | Not available
-Azure Active Directory (Azure AD) and Azure AD Identity Protection| [Azure AD admin roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list)
-Exchange Online| [Exchange role-based access control](/exchange/understanding-role-based-access-control-exchange-2013-help) | [PowerShell for Exchange](/powershell/module/exchange/role-based-access-control/add-managementroleentry)<br>[Fetch role assignments](/powershell/module/exchange/role-based-access-control/get-rolegroup)
-SharePoint Online | [Azure AD admin roles](permissions-reference.md)<br>Also [About the SharePoint admin role in Microsoft 365](/sharepoint/sharepoint-admin-role) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list)
-Teams/Skype for Business | [Azure AD admin roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list)
-Security & Compliance Center (Office 365 Advanced Threat Protection, Exchange Online Protection, Information Protection) | [Office 365 admin roles](/office365/SecurityCompliance/permissions-in-the-security-and-compliance-center) | [Exchange PowerShell](/powershell/module/exchange/role-based-access-control/add-managementroleentry)<br>[Fetch role assignments](/powershell/module/exchange/role-based-access-control/get-rolegroup)
-Secure Score | [Azure AD admin roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list)
-Compliance Manager | [Compliance Manager roles](/office365/securitycompliance/meet-data-protection-and-regulatory-reqs-using-microsoft-cloud#permissions-and-role-based-access-control) | Not available
-Azure Information Protection | [Azure AD admin roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list)
-Microsoft Cloud App Security | [Role-based access control](/cloud-app-security/manage-admins) | [API reference](/cloud-app-security/api-tokens)
-Azure Advanced Threat Protection | [Azure ATP role groups](/azure-advanced-threat-protection/atp-role-groups) | Not available
-Windows Defender Advanced Threat Protection | [Windows Defender ATP role-based access control](/windows/security/threat-protection/windows-defender-atp/rbac-windows-defender-advanced-threat-protection) | Not available
-Privileged Identity Management | [Azure AD admin roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list)
-Intune | [Intune role-based access control](/intune/role-based-access-control) | [Graph API](/graph/api/resources/intune-rbac-conceptual?view=graph-rest-beta&preserve-view=true)<br>[Fetch role assignments](/graph/api/intune-rbac-roledefinition-list?view=graph-rest-beta&preserve-view=true)
-Managed Desktop | [Azure AD admin roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list)
+> [!div class="mx-tableFixed"]
+> | Microsoft 365 service | Role content | API content |
+> | - | - | -- |
+> | Admin roles in Office 365 and Microsoft 365 business plans | [Microsoft 365 admin roles](/office365/admin/add-users/about-admin-roles) | Not available |
+> | Azure Active Directory (Azure AD) and Azure AD Identity Protection| [Azure AD built-in roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
+> | Exchange Online| [Exchange role-based access control](/exchange/understanding-role-based-access-control-exchange-2013-help) | [PowerShell for Exchange](/powershell/module/exchange/role-based-access-control/add-managementroleentry)<br>[Fetch role assignments](/powershell/module/exchange/role-based-access-control/get-rolegroup) |
+> | SharePoint Online | [Azure AD built-in roles](permissions-reference.md)<br>Also [About the SharePoint admin role in Microsoft 365](/sharepoint/sharepoint-admin-role) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
+> | Teams/Skype for Business | [Azure AD built-in roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
+> | Security & Compliance Center (Office 365 Advanced Threat Protection, Exchange Online Protection, Information Protection) | [Office 365 admin roles](/office365/SecurityCompliance/permissions-in-the-security-and-compliance-center) | [Exchange PowerShell](/powershell/module/exchange/role-based-access-control/add-managementroleentry)<br>[Fetch role assignments](/powershell/module/exchange/role-based-access-control/get-rolegroup) |
+> | Secure Score | [Azure AD built-in roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
+> | Compliance Manager | [Compliance Manager roles](/office365/securitycompliance/meet-data-protection-and-regulatory-reqs-using-microsoft-cloud#permissions-and-role-based-access-control) | Not available |
+> | Azure Information Protection | [Azure AD built-in roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
+> | Microsoft Cloud App Security | [Role-based access control](/cloud-app-security/manage-admins) | [API reference](/cloud-app-security/api-tokens) |
+> | Azure Advanced Threat Protection | [Azure ATP role groups](/azure-advanced-threat-protection/atp-role-groups) | Not available |
+> | Windows Defender Advanced Threat Protection | [Windows Defender ATP role-based access control](/windows/security/threat-protection/windows-defender-atp/rbac-windows-defender-advanced-threat-protection) | Not available |
+> | Privileged Identity Management | [Azure AD built-in roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
+> | Intune | [Intune role-based access control](/intune/role-based-access-control) | [Graph API](/graph/api/resources/intune-rbac-conceptual?view=graph-rest-beta&preserve-view=true)<br>[Fetch role assignments](/graph/api/intune-rbac-roledefinition-list?view=graph-rest-beta&preserve-view=true) |
+> | Managed Desktop | [Azure AD built-in roles](permissions-reference.md) | [Graph API](/graph/api/overview)<br>[Fetch role assignments](/graph/api/directoryrole-list) |
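As an illustration of the "Fetch role assignments" pattern linked in the table above, the following sketch extracts role names from the response shape of Microsoft Graph's `GET /directoryRoles`; the payload and IDs are illustrative placeholders, not real tenant data:

```python
# Sketch: listing activated directory roles from the response shape of
# Microsoft Graph's GET https://graph.microsoft.com/v1.0/directoryRoles.
# The sample payload below is illustrative only.
def role_names(response):
    return [role["displayName"] for role in response.get("value", [])]

sample_response = {
    "value": [
        {"id": "00000000-0000-0000-0000-000000000001",
         "displayName": "Global Administrator"},
        {"id": "00000000-0000-0000-0000-000000000002",
         "displayName": "User Administrator"},
    ]
}
print(role_names(sample_response))
```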
## Next steps

* [How to assign or remove Azure AD administrator roles](manage-roles-portal.md)
-* [Azure AD administrator roles reference](permissions-reference.md)
+* [Azure AD built-in roles](permissions-reference.md)
active-directory Manage Roles Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/manage-roles-portal.md
You can now see and manage all the members of the administrator roles in the Azu
You can select **Manage in PIM** for additional management capabilities using [Azure AD Privileged Identity Management (PIM)](../privileged-identity-management/pim-configure.md). Privileged Role Administrators can change "Permanent" (always active in the role) assignments to "Eligible" (in the role only when elevated). If you don't have Privileged Identity Management, you can still select **Manage in PIM** to sign up for a trial. Privileged Identity Management requires an [Azure AD Premium P2 license](../privileged-identity-management/subscription-requirements.md).
-![Screenshot that shows the "User administrator - Assignments" page with the "Manage in PIM" action selected](./media/manage-roles-portal/member-list-pim.png)
+![Screenshot that shows the "User Administrator - Assignments" page with the "Manage in PIM" action selected](./media/manage-roles-portal/member-list-pim.png)
If you are a Global Administrator or a Privileged Role Administrator, you can easily add or remove members, filter the list, or select a member to see their active assigned roles.
active-directory My Staff Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/my-staff-configure.md
To complete this article, you need the following resources and privileges:
* An Azure Active Directory tenant associated with your subscription.
* If needed, [create an Azure Active Directory tenant](../fundamentals/sign-up-organization.md) or [associate an Azure subscription with your account](../fundamentals/active-directory-how-subscriptions-associated-directory.md).
-* You need *Global administrator* privileges in your Azure AD tenant to enable SMS-based authentication.
+* You need *Global Administrator* privileges in your Azure AD tenant to enable SMS-based authentication.
* Each user who's enabled in the text message authentication method policy must be licensed, even if they don't use it. Each enabled user must have one of the following Azure AD or Microsoft 365 licenses: * [Azure AD Premium P1 or P2](https://azure.microsoft.com/pricing/details/active-directory/)
To complete this article, you need the following resources and privileges:
Once you have configured administrative units, you can apply this scope to your users who access My Staff. Only users who are assigned an administrative role can access My Staff. To enable My Staff, complete the following steps:
-1. Sign into the Azure portal as a User administrator.
+1. Sign into the Azure portal as a User Administrator.
2. Browse to **Azure Active Directory** > **User settings** > **User feature previews** > **Manage user feature preview settings**.
3. Under **Administrators can access My Staff**, you can choose to enable for all users, selected users, or no user access.
Before you can reset passwords for on-premises users, you must fulfill the follo
The following roles have permission to reset a user's password:
-* [Authentication administrator](permissions-reference.md#authentication-administrator)
-* [Privileged authentication administrator](permissions-reference.md#privileged-authentication-administrator)
-* [Global administrator](permissions-reference.md#global-administrator)
-* [Helpdesk administrator](permissions-reference.md#helpdesk-administrator)
-* [User administrator](permissions-reference.md#user-administrator)
-* [Password administrator](permissions-reference.md#password-administrator)
+* [Authentication Administrator](permissions-reference.md#authentication-administrator)
+* [Privileged Authentication Administrator](permissions-reference.md#privileged-authentication-administrator)
+* [Global Administrator](permissions-reference.md#global-administrator)
+* [Helpdesk Administrator](permissions-reference.md#helpdesk-administrator)
+* [User Administrator](permissions-reference.md#user-administrator)
+* [Password Administrator](permissions-reference.md#password-administrator)
From **My Staff**, open a user's profile. Select **Reset password**.
Depending on your settings, the user can then use the phone number you set up to
To manage a user's phone number, you must be assigned one of the following roles:
-* [Authentication administrator](permissions-reference.md#authentication-administrator)
-* [Privileged authentication administrator](permissions-reference.md#privileged-authentication-administrator)
-* [Global administrator](permissions-reference.md#global-administrator)
+* [Authentication Administrator](permissions-reference.md#authentication-administrator)
+* [Privileged Authentication Administrator](permissions-reference.md#privileged-authentication-administrator)
+* [Global Administrator](permissions-reference.md#global-administrator)
## Search
active-directory Permissions Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/permissions-reference.md
Users in this role can create and manage all aspects of attack simulation creati
## Authentication Administrator
-Users with this role can set or reset any authentication method (including passwords) for non-administrators and some roles. Authentication administrators can require users who are non-administrators or assigned to some roles to re-register against existing non-password credentials (for example, MFA or FIDO), and can also revoke **remember MFA on the device**, which prompts for MFA on the next sign-in. For a list of the roles that an Authentication Administrator can read or update authentication methods, see [Password reset permissions](#password-reset-permissions).
+Users with this role can set or reset any authentication method (including passwords) for non-administrators and some roles. Authentication Administrators can require users who are non-administrators or assigned to some roles to re-register against existing non-password credentials (for example, MFA or FIDO), and can also revoke **remember MFA on the device**, which prompts for MFA on the next sign-in. For a list of the roles that an Authentication Administrator can read or update authentication methods, see [Password reset permissions](#password-reset-permissions).
-The [Privileged authentication administrator](#privileged-authentication-administrator) role has permission to force re-registration and multi-factor authentication for all users.
+The [Privileged Authentication Administrator](#privileged-authentication-administrator) role has permission to force re-registration and multi-factor authentication for all users.
-The [Authentication policy administrator](#authentication-policy-administrator) role has permissions to set the tenant's authentication method policy that determines which methods each user can register and use.
+The [Authentication Policy Administrator](#authentication-policy-administrator) role has permissions to set the tenant's authentication method policy that determines which methods each user can register and use.
| Role | Manage user's auth methods | Manage per-user MFA | Manage MFA settings | Manage auth method policy | Manage password protection policy |
| - | - | - | - | - | - |
-| Authentication administrator | Yes for some users (see above) | Yes for some users (see above) | No | No | No |
-| Privileged authentication administrator| Yes for all users | Yes for all users | No | No | No |
-| Authentication policy administrator | No |No | Yes | Yes | Yes |
+| Authentication Administrator | Yes for some users (see above) | Yes for some users (see above) | No | No | No |
+| Privileged Authentication Administrator | Yes for all users | Yes for all users | No | No | No |
+| Authentication Policy Administrator | No | No | Yes | Yes | Yes |
> [!IMPORTANT]
> Users with this role can change credentials for people who may have access to sensitive or private information or critical configuration inside and outside of Azure Active Directory. Changing the credentials of a user may mean the ability to assume that user's identity and permissions. For example:
The [Authentication policy administrator](#authentication-policy-administrator)
Users with this role can configure the authentication methods policy, tenant-wide MFA settings, and password protection policy. This role grants permission to manage Password Protection settings: smart lockout configurations and updating the custom banned passwords list.
-The [Authentication administrator](#authentication-administrator) and [Privileged authentication administrator](#privileged-authentication-administrator) roles have permission to manage registered authentication methods on users and can force re-registration and multi-factor authentication for all users.
+The [Authentication Administrator](#authentication-administrator) and [Privileged Authentication Administrator](#privileged-authentication-administrator) roles have permission to manage registered authentication methods on users and can force re-registration and multi-factor authentication for all users.
| Role | Manage user's auth methods | Manage per-user MFA | Manage MFA settings | Manage auth method policy | Manage password protection policy |
| - | - | - | - | - | - |
-| Authentication administrator | Yes for some users (see above) | Yes for some users (see above) | No | No | No |
-| Privileged authentication administrator| Yes for all users | Yes for all users | No | No | No |
-| Authentication policy administrator | No | No | Yes | Yes | Yes |
+| Authentication Administrator | Yes for some users (see above) | Yes for some users (see above) | No | No | No |
+| Privileged Authentication Administrator| Yes for all users | Yes for all users | No | No | No |
+| Authentication Policy Administrator | No | No | Yes | Yes | Yes |
> [!IMPORTANT]
> This role can't manage MFA settings in the legacy MFA management portal or Hardware OATH tokens.
Users with this role have access to all administrative features in Azure Active
## Global Reader
-Users in this role can read settings and administrative information across Microsoft 365 services but can't take management actions. Global reader is the read-only counterpart to Global Administrator. Assign Global reader instead of Global Administrator for planning, audits, or investigations. Use Global reader in combination with other limited admin roles like Exchange Administrator to make it easier to get work done without the assigning the Global Administrator role. Global reader works with Microsoft 365 admin center, Exchange admin center, SharePoint admin center, Teams admin center, Security center, Compliance center, Azure AD admin center, and Device Management admin center.
+Users in this role can read settings and administrative information across Microsoft 365 services but can't take management actions. Global Reader is the read-only counterpart to Global Administrator. Assign Global Reader instead of Global Administrator for planning, audits, or investigations. Use Global Reader in combination with other limited admin roles like Exchange Administrator to make it easier to get work done without assigning the Global Administrator role. Global Reader works with Microsoft 365 admin center, Exchange admin center, SharePoint admin center, Teams admin center, Security center, Compliance center, Azure AD admin center, and Device Management admin center.
> [!NOTE]
-> Global reader role has a few limitations right now -
+> Global Reader role has a few limitations right now -
>
->- [OneDrive admin center](https://admin.onedrive.com/) - OneDrive admin center does not support the Global reader role
->- [Microsoft 365 admin center](https://admin.microsoft.com/Adminportal/Home#/homepage) - Global reader can't read integrated apps. You won't find the **Integrated apps** tab under **Settings** in the left pane of Microsoft 365 admin center.
->- [Office Security & Compliance Center](https://sip.protection.office.com/homepage) - Global reader can't read SCC audit logs, do content search, or see Secure Score.
->- [Teams admin center](https://admin.teams.microsoft.com) - Global reader cannot read **Teams lifecycle**, **Analytics & reports**, **IP phone device management** and **App catalog**.
->- [Privileged Access Management (PAM)](/office365/securitycompliance/privileged-access-management-overview) doesn't support the Global reader role.
->- [Azure Information Protection](/azure/information-protection/what-is-information-protection) - Global reader is supported [for central reporting](/azure/information-protection/reports-aip) only, and when your Azure AD organization isn't on the [unified labeling platform](/azure/information-protection/faqs#how-can-i-determine-if-my-tenant-is-on-the-unified-labeling-platform).
+>- [OneDrive admin center](https://admin.onedrive.com/) - OneDrive admin center does not support the Global Reader role
+>- [Microsoft 365 admin center](https://admin.microsoft.com/Adminportal/Home#/homepage) - Global Reader can't read integrated apps. You won't find the **Integrated apps** tab under **Settings** in the left pane of Microsoft 365 admin center.
+>- [Office Security & Compliance Center](https://sip.protection.office.com/homepage) - Global Reader can't read SCC audit logs, do content search, or see Secure Score.
+>- [Teams admin center](https://admin.teams.microsoft.com) - Global Reader cannot read **Teams lifecycle**, **Analytics & reports**, **IP phone device management** and **App catalog**.
+>- [Privileged Access Management (PAM)](/office365/securitycompliance/privileged-access-management-overview) doesn't support the Global Reader role.
+>- [Azure Information Protection](/azure/information-protection/what-is-information-protection) - Global Reader is supported [for central reporting](/azure/information-protection/reports-aip) only, and when your Azure AD organization isn't on the [unified labeling platform](/azure/information-protection/faqs#how-can-i-determine-if-my-tenant-is-on-the-unified-labeling-platform).
>
> These features are currently in development.
>
Users in this role can read settings and administrative information across Micro
## Groups Administrator
-Users in this role can create/manage groups and its settings like naming and expiration policies. It is important to understand that assigning a user to this role gives them the ability to manage all groups in the organization across various workloads like Teams, SharePoint, Yammer in addition to Outlook. Also the user will be able to manage the various groups settings across various admin portals like Microsoft Admin Center, Azure portal, as well as workload specific ones like Teams and SharePoint Admin Centers.
+Users in this role can create and manage groups and their settings, like naming and expiration policies. It is important to understand that assigning a user to this role gives them the ability to manage all groups in the organization across various workloads like Teams, SharePoint, and Yammer, in addition to Outlook. The user will also be able to manage group settings across admin portals like the Microsoft admin center and Azure portal, as well as workload-specific ones like the Teams and SharePoint admin centers.
> [!div class="mx-tableFixed"]
> | Actions | Description |
Users with this role can manage Azure AD identity governance configuration, incl
## Insights Administrator
-Users in this role can access the full set of administrative capabilities in the [M365 Insights application](https://go.microsoft.com/fwlink/?linkid=2129521). This role has the ability to read directory information, monitor service health, file support tickets, and access the Insights admin settings aspects.
+Users in this role can access the full set of administrative capabilities in the [M365 Insights application](https://go.microsoft.com/fwlink/?linkid=2129521). This role has the ability to read directory information, monitor service health, file support tickets, and access the Insights Administrator settings.
> [!div class="mx-tableFixed"]
> | Actions | Description |
Users in this role can access the full set of administrative capabilities in the
## Insights Business Leader
-Users in this role can access a set of dashboards and insights via the [M365 Insights application](https://go.microsoft.com/fwlink/?linkid=2129521). This includes full access to all dashboards and presented insights and data exploration functionality. Users in this role do not have access to product configuration settings, which is the responsibility of the Insights Admin role.
+Users in this role can access a set of dashboards and insights via the [M365 Insights application](https://go.microsoft.com/fwlink/?linkid=2129521). This includes full access to all dashboards and presented insights and data exploration functionality. Users in this role do not have access to product configuration settings, which is the responsibility of the Insights Administrator role.
> [!div class="mx-tableFixed"]
> | Actions | Description |
Users in this role can access a set of dashboards and insights via the [M365 Ins
Users with this role have global permissions within Microsoft Intune Online, when the service is present. Additionally, this role contains the ability to manage users and devices in order to associate policy, as well as create and manage groups. More information at [Role-based administration control (RBAC) with Microsoft Intune](/intune/role-based-access-control).
-This role can create and manage all security groups. However, Intune Admin does not have admin rights over Office groups. That means the admin cannot update owners or memberships of all Office groups in the organization. However, he/she can manage the Office group that he creates which comes as a part of his/her end-user privileges. So, any Office group (not security group) that he/she creates should be counted against his/her quota of 250.
+This role can create and manage all security groups. However, Intune Administrator does not have admin rights over Office groups. That means the admin cannot update owners or memberships of all Office groups in the organization. However, they can manage Office groups that they create, which comes as part of their end-user privileges. So, any Office group (not security group) that they create counts against their quota of 250.
> [!NOTE]
> In the Microsoft Graph API and Azure AD PowerShell, this role is identified as "Intune Service Administrator." It is "Intune Administrator" in the [Azure portal](https://portal.azure.com).
Users with this role have global permissions to manage settings within Microsoft
## Knowledge Administrator
-Users in this role have full access to all knowledge, learning and intelligent features settings in the Microsoft 365 admin center. They have a general understanding of the suite of products, licensing details and has responsibility to control access. Knowledge administrator can create and manage content, like topics, acronyms and learning resources. Additionally, these users can create content centers, monitor service health, and create service requests.
+Users in this role have full access to all knowledge, learning, and intelligent features settings in the Microsoft 365 admin center. They have a general understanding of the suite of products and licensing details, and have responsibility to control access. The Knowledge Administrator can create and manage content, like topics, acronyms, and learning resources. Additionally, these users can create content centers, monitor service health, and create service requests.
> [!div class="mx-tableFixed"]
> | Actions | Description |
Do not use. This role is automatically assigned from Commerce, and is not intend
The Modern Commerce User role gives certain users permission to access Microsoft 365 admin center and see the left navigation entries for **Home**, **Billing**, and **Support**. The content available in these areas is controlled by [commerce-specific roles](../../cost-management-billing/manage/understand-mca-roles.md) assigned to users to manage products that they bought for themselves or your organization. This might include tasks like paying bills, or access to billing accounts and billing profiles.
-Users with the Modern Commerce User role typically have administrative permissions in other Microsoft purchasing systems, but do not have Global Administrator or Billing administrator roles used to access the admin center.
+Users with the Modern Commerce User role typically have administrative permissions in other Microsoft purchasing systems, but do not have Global Administrator or Billing Administrator roles used to access the admin center.
**When is the Modern Commerce User role assigned?**
* **Self-service purchase in Microsoft 365 admin center** – Self-service purchase gives users a chance to try out new products by buying or signing up for them on their own. These products are managed in the admin center. Users who make a self-service purchase are assigned a role in the commerce system, and the Modern Commerce User role so they can manage their purchases in admin center. Admins can block self-service purchases (for Power BI, Power Apps, Power Automate) through [PowerShell](/microsoft-365/commerce/subscriptions/allowselfservicepurchase-powershell). For more information, see [Self-service purchase FAQ](/microsoft-365/commerce/subscriptions/self-service-purchase-faq).
-* **Purchases from Microsoft commercial marketplace** – Similar to self-service purchase, when a user buys a product or service from Microsoft AppSource or Azure Marketplace, the Modern Commerce User role is assigned if they don't have the Global Administrator or Billing admin role. In some cases, users might be blocked from making these purchases. For more information, see [Microsoft commercial marketplace](../../marketplace/marketplace-faq-publisher-guide.md#what-could-block-a-customer-from-completing-a-purchase).
-* **Proposals from Microsoft** – A proposal is a formal offer from Microsoft for your organization to buy Microsoft products and services. When the person who is accepting the proposal doesn't have a Global Administrator or Billing admin role in Azure AD, they are assigned both a commerce-specific role to complete the proposal and the Modern Commerce User role to access admin center. When they access the admin center they can only use features that are authorized by their commerce-specific role.
-* **Commerce-specific roles** – Some users are assigned commerce-specific roles. If a user isn't a Global or Billing admin, they get the Modern Commerce User role so they can access the admin center.
+* **Purchases from Microsoft commercial marketplace** – Similar to self-service purchase, when a user buys a product or service from Microsoft AppSource or Azure Marketplace, the Modern Commerce User role is assigned if they don't have the Global Administrator or Billing Administrator role. In some cases, users might be blocked from making these purchases. For more information, see [Microsoft commercial marketplace](../../marketplace/marketplace-faq-publisher-guide.md#what-could-block-a-customer-from-completing-a-purchase).
+* **Proposals from Microsoft** – A proposal is a formal offer from Microsoft for your organization to buy Microsoft products and services. When the person who is accepting the proposal doesn't have a Global Administrator or Billing Administrator role in Azure AD, they are assigned both a commerce-specific role to complete the proposal and the Modern Commerce User role to access admin center. When they access the admin center they can only use features that are authorized by their commerce-specific role.
+* **Commerce-specific roles** – Some users are assigned commerce-specific roles. If a user isn't a Global Administrator or Billing Administrator, they get the Modern Commerce User role so they can access the admin center.
If the Modern Commerce User role is unassigned from a user, they lose access to Microsoft 365 admin center. If they were managing any products, either for themselves or for your organization, they won't be able to manage them. This might include assigning licenses, changing payment methods, paying bills, or other tasks for managing subscriptions.
Users with this role have limited ability to manage passwords. This role does no
## Power BI Administrator
-Users with this role have global permissions within Microsoft Power BI, when the service is present, as well as the ability to manage support tickets and monitor service health. More information at [Understanding the Power BI admin role](/power-bi/service-admin-role).
+Users with this role have global permissions within Microsoft Power BI, when the service is present, as well as the ability to manage support tickets and monitor service health. More information at [Understanding the Power BI Administrator role](/power-bi/service-admin-role).
> [!NOTE]
> In the Microsoft Graph API and Azure AD PowerShell, this role is identified as "Power BI Service Administrator". It is "Power BI Administrator" in the [Azure portal](https://portal.azure.com).
Users with this role can register printers and manage printer status in the Micr
Users with this role can set or reset any authentication method (including passwords) for any user, including Global Administrators. Privileged Authentication Administrators can force users to re-register against existing non-password credential (such as MFA or FIDO) and revoke 'remember MFA on the device', prompting for MFA on the next sign-in of all users.
-The [Authentication administrator](#authentication-administrator) role has permission to force re-registration and multi-factor authentication for standard users and users with some admin roles.
+The [Authentication Administrator](#authentication-administrator) role has permission to force re-registration and multi-factor authentication for standard users and users with some admin roles.
-The [Authentication policy administrator](#authentication-policy-administrator) role has permissions to set the tenant's authentication method policy that determines which methods each user can register and use.
+The [Authentication Policy Administrator](#authentication-policy-administrator) role has permissions to set the tenant's authentication method policy that determines which methods each user can register and use.
| Role | Manage user's auth methods | Manage per-user MFA | Manage MFA settings | Manage auth method policy | Manage password protection policy |
| - | - | - | - | - | - |
-| Authentication administrator | Yes for some users (see above) | Yes for some users (see above) | No | No | No |
-| Privileged authentication administrator| Yes for all users | Yes for all users | No | No | No |
-| Authentication policy administrator | No | No | Yes | Yes | Yes |
+| Authentication Administrator | Yes for some users (see above) | Yes for some users (see above) | No | No | No |
+| Privileged Authentication Administrator| Yes for all users | Yes for all users | No | No | No |
+| Authentication Policy Administrator | No | No | Yes | Yes | Yes |
> [!IMPORTANT]
> Users with this role can change credentials for people who may have access to sensitive or private information or critical configuration inside and outside of Azure Active Directory. Changing the credentials of a user may mean the ability to assume that user's identity and permissions. For example:
In | Can do
Identity Protection Center | Read all security reports and settings information for security features<br><ul><li>Anti-spam<li>Encryption<li>Data loss prevention<li>Anti-malware<li>Advanced threat protection<li>Anti-phishing<li>Mail flow rules
[Privileged Identity Management](../privileged-identity-management/pim-configure.md) | Has read-only access to all information surfaced in Azure AD Privileged Identity Management: Policies and reports for Azure AD role assignments and security reviews.<br>**Cannot** sign up for Azure AD Privileged Identity Management or make any changes to it. In the Privileged Identity Management portal or via PowerShell, someone in this role can activate additional roles (for example, Global Administrator or Privileged Role Administrator), if the user is eligible for them.
[Office 365 Security & Compliance Center](https://support.office.com/article/About-Office-365-admin-roles-da585eea-f576-4f55-a1e0-87090b6aaa9d) | View security policies<br>View and investigate security threats<br>View reports
-Windows Defender ATP and EDR | View and investigate alerts. When you turn on role-based access control in Windows Defender ATP, users with read-only permissions such as the Azure AD Security reader role lose access until they are assigned to a Windows Defender ATP role.
+Windows Defender ATP and EDR | View and investigate alerts. When you turn on role-based access control in Windows Defender ATP, users with read-only permissions such as the Azure AD Security Reader role lose access until they are assigned to a Windows Defender ATP role.
[Intune](/intune/role-based-access-control) | Views user, device, enrollment, configuration, and application information. Cannot make changes to Intune.
[Cloud App Security](/cloud-app-security/manage-admins) | Has read-only permissions and can manage alerts
[Azure Security Center](../../key-vault/managed-hsm/built-in-roles.md) | Can view recommendations and alerts, view security policies, view security states, but cannot make changes
Users with this role have global permissions within Microsoft SharePoint Online,
## Skype for Business Administrator
-Users with this role have global permissions within Microsoft Skype for Business, when the service is present, as well as manage Skype-specific user attributes in Azure Active Directory. Additionally, this role grants the ability to manage support tickets and monitor service health, and to access the Teams and Skype for Business Admin Center. The account must also be licensed for Teams or it can't run Teams PowerShell cmdlets. More information at [About the Skype for Business admin role](https://support.office.com/article/about-the-skype-for-business-admin-role-aeb35bda-93fc-49b1-ac2c-c74fbeb737b5) and Teams licensing information at [Skype for Business and Microsoft Teams add-on licensing](/skypeforbusiness/skype-for-business-and-microsoft-teams-add-on-licensing/skype-for-business-and-microsoft-teams-add-on-licensing)
+Users with this role have global permissions within Microsoft Skype for Business, when the service is present, as well as the ability to manage Skype-specific user attributes in Azure Active Directory. Additionally, this role grants the ability to manage support tickets and monitor service health, and to access the Teams and Skype for Business admin center. The account must also be licensed for Teams or it can't run Teams PowerShell cmdlets. More information at [About the Skype for Business admin role](https://support.office.com/article/about-the-skype-for-business-admin-role-aeb35bda-93fc-49b1-ac2c-c74fbeb737b5) and Teams licensing information at [Skype for Business and Microsoft Teams add-on licensing](/skypeforbusiness/skype-for-business-and-microsoft-teams-add-on-licensing/skype-for-business-and-microsoft-teams-add-on-licensing)
> [!NOTE]
> In the Microsoft Graph API and Azure AD PowerShell, this role is identified as "Lync Service Administrator." It is "Skype for Business Administrator" in the [Azure portal](https://portal.azure.com/).
Users in this role can troubleshoot communication issues within Microsoft Teams
## Teams Devices Administrator
-Users with this role can manage [Teams-certified devices](https://www.microsoft.com/microsoft-365/microsoft-teams/across-devices/devices) from the Teams Admin Center. This role allows viewing all devices at single glance, with ability to search and filter devices. The user can check details of each device including logged-in account, make and model of the device. The user can change the settings on the device and update the software versions. This role does not grant permissions to check Teams activity and call quality of the device.
+Users with this role can manage [Teams-certified devices](https://www.microsoft.com/microsoft-365/microsoft-teams/across-devices/devices) from the Teams admin center. This role allows viewing all devices at a single glance, with the ability to search and filter devices. The user can check details of each device, including logged-in account, and make and model of the device. The user can change the settings on the device and update the software versions. This role does not grant permissions to check Teams activity and call quality of the device.
> [!div class="mx-tableFixed"]
> | Actions | Description |
Users with this role can manage [Teams-certified devices](https://www.microsoft.
## Usage Summary Reports Reader
-Users with this role can access tenant level aggregated data and associated insights in Microsoft 365 Admin Center for Usage and Productivity Score but cannot access any user level details or insights. In Microsoft 365 Admin Center for the two reports, we differentiate between tenant level aggregated data and user level details. This role gives an extra layer of protection on individual user identifiable data, which was requested by both customers and legal teams.
+Users with this role can access tenant-level aggregated data and associated insights in Microsoft 365 admin center for Usage and Productivity Score but cannot access any user-level details or insights. In Microsoft 365 admin center for the two reports, we differentiate between tenant-level aggregated data and user-level details. This role gives an extra layer of protection on individual user identifiable data, which was requested by both customers and legal teams.
> [!div class="mx-tableFixed"]
> | Actions | Description |
Users with this role can access tenant level aggregated data and associated insi
## User Administrator
-Users with this role can create users, and manage all aspects of users with some restrictions (see the table), and can update password expiration policies. Additionally, users with this role can create and manage all groups. This role also includes the ability to create and manage user views, manage support tickets, and monitor service health. User administrators don't have permission to manage some user properties for users in most administrator roles. User with this role do not have permissions to manage MFA. The roles that are exceptions to this restriction are listed in the following table.
+Users with this role can create users, manage all aspects of users with some restrictions (see the table), and update password expiration policies. Additionally, users with this role can create and manage all groups. This role also includes the ability to create and manage user views, manage support tickets, and monitor service health. User Administrators don't have permission to manage some user properties for users in most administrator roles. Users with this role do not have permissions to manage MFA. The roles that are exceptions to this restriction are listed in the following table.
| User Administrator permission | Notes |
| - | - |
active-directory Quickstart App Registration Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/quickstart-app-registration-limits.md
# Quickstart: Grant permission to create unlimited app registrations
-In this quick start guide, you will create a custom role with permission to create an unlimited number of app registrations, and then assign that role to a user. The assigned user can then use the Azure portal, Azure AD PowerShell, or Microsoft Graph API to create application registrations. Unlike the built-in Application Developer role, this custom role grants the ability to create an unlimited number of application registrations. The Application Developer role grants the ability, but the total number of created objects is limited to 250 to prevent hitting [the directory-wide object quota](../enterprise-users/directory-service-limits-restrictions.md). The least privileged role required to create and assign Azure AD custom roles is the Privileged Role administrator.
+In this quick start guide, you will create a custom role with permission to create an unlimited number of app registrations, and then assign that role to a user. The assigned user can then use the Azure portal, Azure AD PowerShell, or Microsoft Graph API to create application registrations. Unlike the built-in Application Developer role, this custom role grants the ability to create an unlimited number of application registrations. The Application Developer role grants the ability, but the total number of created objects is limited to 250 to prevent hitting [the directory-wide object quota](../enterprise-users/directory-service-limits-restrictions.md). The least privileged role required to create and assign Azure AD custom roles is the Privileged Role Administrator.
If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
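Under the hood, creating a custom role like the one in this quickstart amounts to submitting a role definition to the Microsoft Graph `roleManagement/directory/roleDefinitions` endpoint. The sketch below only assembles the request body; the display name and the exact set of `allowedResourceActions` are illustrative assumptions, so verify the permission names against the current Graph reference before using them:

```python
import json

# Sketch of a request body for POST /v1.0/roleManagement/directory/roleDefinitions.
# The actions listed here (applications/create, applications/createAsOwner) are
# illustrative; confirm the exact allowedResourceActions for your scenario in
# the Microsoft Graph documentation.
role_definition = {
    "description": "Can create an unlimited number of application registrations.",
    "displayName": "Application Registration Creator",
    "isEnabled": True,
    "rolePermissions": [
        {
            "allowedResourceActions": [
                "microsoft.directory/applications/create",
                "microsoft.directory/applications/createAsOwner",
            ]
        }
    ],
}

# Serialize to the JSON payload that would be sent with the POST request.
body = json.dumps(role_definition, indent=2)
print(body)
```

Because the role is defined by its `rolePermissions` rather than a fixed quota, a user assigned this role is not subject to the 250-object limit that applies to the Application Developer role.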
Body
## Next steps
- Feel free to share with us on the [Azure AD administrative roles forum](https://feedback.azure.com/forums/169401-azure-active-directory?category_id=166032).
-- For more about Azure AD role assignments, see [Assign administrator roles](permissions-reference.md).
+- For more about Azure AD roles, see [Azure AD built-in roles](permissions-reference.md).
- For more about default user permissions, see [comparison of default guest and member user permissions](../fundamentals/users-default-permissions.md).
active-directory Role Definitions List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/role-definitions-list.md
This article describes how to list the Azure AD built-in and custom roles along
The page includes links to relevant documentation to help guide you through managing roles.
- ![Screenshot that shows the "Global administrator - Description" page.](./media/role-definitions-list/role-description.png)
+ ![Screenshot that shows the "Global Administrator - Description" page.](./media/role-definitions-list/role-description.png)
## Next steps
* Feel free to share with us on the [Azure AD administrative roles forum](https://feedback.azure.com/forums/169401-azure-active-directory?category_id=166032).
-* For more about roles and Administrator role assignment, see [Assign administrator roles](permissions-reference.md).
+* For more about role permissions, see [Azure AD built-in roles](permissions-reference.md).
* For default user permissions, see a [comparison of default guest and member user permissions](../fundamentals/users-default-permissions.md).
active-directory Security Emergency Access https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/security-emergency-access.md
Organizations should monitor sign-in and audit log activity from the emergency a
### Obtain Object IDs of the break glass accounts
-1. Sign in to the [Azure portal](https://portal.azure.com) with an account assigned to the User administrator role.
+1. Sign in to the [Azure portal](https://portal.azure.com) with an account assigned to the User Administrator role.
1. Select **Azure Active Directory** > **Users**.
1. Search for the break-glass account and select the user's name.
1. Copy and save the Object ID attribute so that you can use it later.
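The portal lookup above can also be scripted against Microsoft Graph, which is useful when the object IDs feed into monitoring or alerting automation. This is a minimal sketch that only builds the request URL (assuming the `/v1.0/users` endpoint with a `$filter` on display name; the account name is a placeholder, and authentication is out of scope here):

```python
from urllib.parse import quote

GRAPH = "https://graph.microsoft.com/v1.0"

def user_lookup_url(display_name: str) -> str:
    """Build a Graph request URL that returns the object ID (`id`)
    of users whose display name starts with the given string."""
    # $filter narrows the result to the break-glass account;
    # $select limits the response to the fields we need.
    filter_expr = quote(f"startswith(displayName,'{display_name}')")
    return f"{GRAPH}/users?$filter={filter_expr}&$select=id,displayName"

# 'Emergency Access Account 1' is a placeholder break-glass account name.
url = user_lookup_url("Emergency Access Account 1")
print(url)
```

The `id` field in the response is the Object ID attribute that the alerting steps later in this article reference.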
active-directory Security Planning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/security-planning.md
Title: Secure access practices for administrators in Azure AD | Microsoft Docs
-description: Ensure that your organization's administrative access and admin accounts are secure. For system architects and IT pros who configure Azure AD, Azure, and Microsoft Online Services.
+description: Ensure that your organization's administrative access and administrator accounts are secure. For system architects and IT pros who configure Azure AD, Azure, and Microsoft Online Services.
keywords:
# Securing privileged access for hybrid and cloud deployments in Azure AD
-The security of business assets depends on the integrity of the privileged accounts that administer your IT systems. Cyber-attackers use credential theft attacks to target admin accounts and other privileged access to try to gain access to sensitive data.
+The security of business assets depends on the integrity of the privileged accounts that administer your IT systems. Cyber-attackers use credential theft attacks to target administrator accounts and other privileged access to try to gain access to sensitive data.
For cloud services, prevention and response are the joint responsibilities of the cloud service provider and the customer. For more information about the latest threats to endpoints and the cloud, see the [Microsoft Security Intelligence Report](https://www.microsoft.com/security/operations/security-intelligence-report). This article can help you develop a roadmap toward closing the gaps between your current plans and the guidance described here.
Securing privileged access requires changes to:
* Processes, administrative practices, and knowledge management
* Technical components such as host defenses, account protections, and identity management
-Secure your privileged access in a way that is managed and reported in the Microsoft services you care about. If you have on-premises admin accounts, see the guidance for on-premises and hybrid privileged access in Active Directory at [Securing Privileged Access](/windows-server/identity/securing-privileged-access/securing-privileged-access).
+Secure your privileged access in a way that is managed and reported in the Microsoft services you care about. If you have on-premises administrator accounts, see the guidance for on-premises and hybrid privileged access in Active Directory at [Securing Privileged Access](/windows-server/identity/securing-privileged-access/securing-privileged-access).
> [!NOTE]
> The guidance in this article refers primarily to features of Azure Active Directory that are included in Azure AD Premium P1 and P2. Azure AD Premium P2 is included in the EMS E5 suite and Microsoft 365 E5 suite. This guidance assumes your organization already has Azure AD Premium P2 licenses purchased for your users. If you do not have these licenses, some of the guidance might not apply to your organization. Also, throughout this article, the term Global Administrator means the same thing as "company administrator" or "tenant administrator."
Microsoft recommends that you develop and follow a roadmap to secure privileged
* Stage 2 (2-4 weeks): Mitigate the most frequently used attack techniques
-* Stage 3 (1-3 months): Build visibility and build full control of admin activity
+* Stage 3 (1-3 months): Build visibility and build full control of administrator activity
* Stage 4 (six months and beyond): Continue building defenses to further harden your security platform
After you turn on Azure AD Privileged Identity Management:
4. Open Privileged Identity Management from the **All services** list and pin it to your dashboard.
-Make sure the first person to use PIM in your organization is assigned to the **Security administrator** and **Privileged role administrator** roles. Only privileged role administrators can manage the Azure AD directory role assignments of users. The PIM security wizard walks you through the initial discovery and assignment experience. You can exit the wizard without making any additional changes at this time.
+Make sure the first person to use PIM in your organization is assigned to the **Security Administrator** and **Privileged Role Administrator** roles. Only Privileged Role Administrators can manage the Azure AD directory role assignments of users. The PIM security wizard walks you through the initial discovery and assignment experience. You can exit the wizard without making any additional changes at this time.
#### Identify and categorize accounts that are in highly privileged roles

After turning on Azure AD Privileged Identity Management, view the users who are in the following Azure AD roles:
-* Global administrator
-* Privileged role administrator
-* Exchange administrator
-* SharePoint administrator
+* Global Administrator
+* Privileged Role Administrator
+* Exchange Administrator
+* SharePoint Administrator
If you don't have Azure AD Privileged Identity Management in your organization, you can use the [PowerShell API](/powershell/module/azuread/get-azureaddirectoryrolemember). Start with the Global Administrator role because a Global Administrator has the same permissions across all cloud services to which your organization has subscribed. These permissions are granted no matter where they were assigned: in the Microsoft 365 admin center, the Azure portal, or by the Azure AD module for Microsoft PowerShell.
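As a sketch of that PowerShell approach, the following enumerates the members of the Global Administrator role with the AzureAD module. Note that older versions of the module report the role's display name as "Company Administrator", so both names are matched here.

```powershell
# Sketch using the AzureAD PowerShell module (Install-Module AzureAD).
Connect-AzureAD

# Find the Global Administrator directory role. Older module versions
# report it as "Company Administrator", so match either name.
$role = Get-AzureADDirectoryRole |
    Where-Object { $_.DisplayName -in "Global Administrator", "Company Administrator" }

# List the accounts assigned to the role so they can be categorized.
Get-AzureADDirectoryRoleMember -ObjectId $role.ObjectId |
    Select-Object DisplayName, UserPrincipalName, ObjectType
```

The output gives you the starting list of accounts to keep, remove, or recategorize in the steps that follow.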
-Remove any accounts that are no longer needed in those roles. Then, categorize the remaining accounts that are assigned to admin roles:
+Remove any accounts that are no longer needed in those roles. Then, categorize the remaining accounts that are assigned to administrator roles:
* Assigned to administrative users, but also used for non-administrative purposes (for example, personal email)
* Assigned to administrative users and used for administrative purposes only
Emergency access accounts help restrict privileged access within an Azure AD org
Evaluate the accounts that are assigned or eligible for the Global Administrator role. If you don't see any cloud-only accounts using the \*.onmicrosoft.com domain (for "break glass" emergency access), create them. For more information, see [Managing emergency access administrative accounts in Azure AD](security-emergency-access.md).
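As an illustrative sketch only (the account name, UPN, and password value below are placeholders, not a recommendation), a cloud-only break-glass account can be created with the AzureAD PowerShell module:

```powershell
# Illustrative sketch (AzureAD module): create a cloud-only break-glass account.
# The display name, UPN, and password value are placeholders; generate a long
# random password and store it securely per your emergency-access procedures.
$profile = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile
$profile.Password = "<long-random-password>"
$profile.ForceChangePasswordNextLogin = $false

New-AzureADUser -DisplayName "Emergency Access 01" `
    -UserPrincipalName "breakglass01@contoso.onmicrosoft.com" `
    -MailNickName "breakglass01" `
    -AccountEnabled $true `
    -PasswordProfile $profile
```

Using the \*.onmicrosoft.com domain keeps the account independent of federation and on-premises infrastructure, which is the point of a break-glass account.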
-#### Turn on multi-factor authentication and register all other highly privileged single-user non-federated admin accounts
+#### Turn on multi-factor authentication and register all other highly privileged single-user non-federated administrator accounts
-Require Azure AD Multi-Factor Authentication (MFA) at sign-in for all individual users who are permanently assigned to one or more of the Azure AD admin roles: Global administrator, Privileged Role administrator, Exchange administrator, and SharePoint administrator. Use the guide to enable [Multi-factor Authentication (MFA) for your admin accounts](../authentication/howto-mfa-userstates.md) and ensure that all those users have registered at [https://aka.ms/mfasetup](https://aka.ms/mfasetup). More information can be found under step 2 and step 3 of the guide [Protect access to data and services in Microsoft 365](https://support.office.com/article/Protect-access-to-data-and-services-in-Office-365-a6ef28a4-2447-4b43-aae2-f5af6d53c68e).
+Require Azure AD Multi-Factor Authentication (MFA) at sign-in for all individual users who are permanently assigned to one or more of the Azure AD administrator roles: Global Administrator, Privileged Role Administrator, Exchange Administrator, and SharePoint Administrator. Use the guide to enable [Multi-factor Authentication (MFA) for your administrator accounts](../authentication/howto-mfa-userstates.md) and ensure that all those users have registered at [https://aka.ms/mfasetup](https://aka.ms/mfasetup). More information can be found under step 2 and step 3 of the guide [Protect access to data and services in Microsoft 365](https://support.office.com/article/Protect-access-to-data-and-services-in-Office-365-a6ef28a4-2447-4b43-aae2-f5af6d53c68e).
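One way the per-user MFA state has been set from PowerShell uses the MSOnline module, as sketched below; the linked guide is authoritative, and the UPN is a placeholder:

```powershell
# Sketch using the MSOnline module (run Connect-MsolService first).
# Sets the per-user MFA state to "Enabled" for one privileged account.
$mfa = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationRequirement
$mfa.RelyingParty = "*"
$mfa.State = "Enabled"

Set-MsolUser -UserPrincipalName "admin@contoso.com" `
    -StrongAuthenticationRequirements @($mfa)
```

After the state is set, each affected user still needs to complete registration at [https://aka.ms/mfasetup](https://aka.ms/mfasetup).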
## Stage 2: Mitigate frequently used attacks
Stage 2 of the roadmap focuses on mitigating the most frequently used attack tec
### General preparation
-#### Conduct an inventory of services, owners, and admins
+#### Conduct an inventory of services, owners, and administrators
The increase in "bring your own device" and work-from-home policies and the growth of wireless connectivity make it critical to monitor who is connecting to your network. A security audit can reveal devices, applications, and programs on your network that your organization doesn't support and that represent high risk. For more information, see [Azure security management and monitoring overview](../../security/fundamentals/management-monitoring-overview.md). Ensure that you include all of the following tasks in your inventory process.

* Identify the users who have administrative roles and the services they manage.
-* Use Azure AD PIM to find out which users in your organization have admin access to Azure AD.
-* Beyond the roles defined in Azure AD, Microsoft 365 comes with a set of admin roles that you can assign to users in your organization. Each admin role maps to common business functions, and gives people in your organization permissions to do specific tasks in the [Microsoft 365 admin center](https://admin.microsoft.com). Use the Microsoft 365 admin center to find out which users in your organization have admin access to Microsoft 365, including via roles not managed in Azure AD. For more information, see [About Microsoft 365 admin roles](https://support.office.com/article/About-Office-365-admin-roles-da585eea-f576-4f55-a1e0-87090b6aaa9d) and [Security practices for Office 365](/office365/servicedescriptions/office-365-platform-service-description/office-365-securitycompliance-center).
+* Use Azure AD PIM to find out which users in your organization have administrator access to Azure AD.
+* Beyond the roles defined in Azure AD, Microsoft 365 comes with a set of administrator roles that you can assign to users in your organization. Each administrator role maps to common business functions, and gives people in your organization permissions to do specific tasks in the [Microsoft 365 admin center](https://admin.microsoft.com). Use the Microsoft 365 admin center to find out which users in your organization have administrator access to Microsoft 365, including via roles not managed in Azure AD. For more information, see [About Microsoft 365 administrator roles](https://support.office.com/article/About-Office-365-admin-roles-da585eea-f576-4f55-a1e0-87090b6aaa9d) and [Security practices for Office 365](/office365/servicedescriptions/office-365-platform-service-description/office-365-securitycompliance-center).
* Do the inventory in services your organization relies on, such as Azure, Intune, or Dynamics 365.
* Ensure that the accounts used for administrative purposes:
  * Have working email addresses attached to them
  * Have registered for Azure AD Multi-Factor Authentication or use MFA on-premises
* Ask users for their business justification for administrative access.
-* Remove admin access for those individuals and services that don't need it.
+* Remove administrator access for those individuals and services that don't need it.
#### Identify Microsoft accounts in administrative roles that need to be switched to work or school accounts
If your initial Global Administrators reuse their existing Microsoft account cre
Personal email accounts are regularly phished by cyber attackers, a risk that makes personal email addresses unacceptable for Global Administrator accounts. To help separate internet risks from administrative privileges, create dedicated accounts for each user with administrative privileges.

* Be sure to create separate accounts for users to do Global Administrator tasks.
-* Make sure that your Global Administrators don't accidentally open emails or run programs with their admin accounts.
+* Make sure that your Global Administrators don't accidentally open emails or run programs with their administrator accounts.
* Be sure those accounts have their email forwarded to a working mailbox.
* Global Administrator (and other privileged groups) accounts should be cloud-only accounts with no ties to on-premises Active Directory.
Azure AD Identity Protection is an algorithm-based monitoring and reporting tool
#### Obtain your Microsoft 365 Secure Score (if using Microsoft 365)
-Secure Score looks at your settings and activities for the Microsoft 365 services you're using and compares them to a baseline established by Microsoft. You'll get a score based on how aligned you are with security practices. Anyone who has the admin permissions for a Microsoft 365 Business Standard or Enterprise subscription can access the Secure Score at [https://securescore.office.com](https://securescore.office.com/).
+Secure Score looks at your settings and activities for the Microsoft 365 services you're using and compares them to a baseline established by Microsoft. You'll get a score based on how aligned you are with security practices. Anyone who has the administrator permissions for a Microsoft 365 Business Standard or Enterprise subscription can access the Secure Score at [https://securescore.office.com](https://securescore.office.com/).
#### Review the Microsoft 365 security and compliance guidance (if using Microsoft 365)
The [plan for security and compliance](https://support.office.com/article/Plan-f
#### Configure Microsoft 365 Activity Monitoring (if using Microsoft 365)
-Monitor your organization for users who are using Microsoft 365 to identify staff who have an admin account but might not need Microsoft 365 access because they don't sign in to those portals. For more information, see [Activity reports in the Microsoft 365 admin center](https://support.office.com/article/Activity-Reports-in-the-Office-365-admin-center-0d6dfb17-8582-4172-a9a9-aed798150263).
+Monitor your organization for users who are using Microsoft 365 to identify staff who have an administrator account but might not need Microsoft 365 access because they don't sign in to those portals. For more information, see [Activity reports in the Microsoft 365 admin center](https://support.office.com/article/Activity-Reports-in-the-Office-365-admin-center-0d6dfb17-8582-4172-a9a9-aed798150263).
#### Establish incident/emergency response plan owners
Establishing a successful incident response capability requires considerable pla
If your Azure Active Directory organization is synchronized with on-premises Active Directory, then follow the guidance in the [Securing Privileged Access](/windows-server/identity/securing-privileged-access/securing-privileged-access) roadmap. This stage includes:
-* Creating separate admin accounts for users who need to conduct on-premises administrative tasks
+* Creating separate administrator accounts for users who need to conduct on-premises administrative tasks
* Deploying Privileged Access Workstations for Active Directory administrators
-* Creating unique local admin passwords for workstations and servers
+* Creating unique local administrator passwords for workstations and servers
### Additional steps for organizations managing access to Azure
If your Azure Active Directory organization is synchronized with on-premises Act
Use the Enterprise portal and the Azure portal to identify the subscriptions in your organization that host production applications.
-#### Remove Microsoft accounts from admin roles
+#### Remove Microsoft accounts from administrator roles
-Microsoft accounts from other programs, such as Xbox, Live, and Outlook, shouldn't be used as administrator accounts for your organization's subscriptions. Remove admin status from all Microsoft accounts, and replace with Azure AD (for example, chris@contoso.com) work or school accounts. For admin purposes, depend on accounts that are authenticated in Azure AD and not in other services.
+Microsoft accounts from other programs, such as Xbox, Live, and Outlook, shouldn't be used as administrator accounts for your organization's subscriptions. Remove administrator status from all Microsoft accounts, and replace with Azure AD (for example, chris@contoso.com) work or school accounts. For administrator purposes, depend on accounts that are authenticated in Azure AD and not in other services.
#### Monitor Azure activity
The Azure Activity Log provides a history of subscription-level events in Azure.
Prepare Conditional Access policies for on-premises and cloud-hosted applications. If your users have workplace-joined devices, get more information from [Setting up on-premises Conditional Access by using Azure Active Directory device registration](../../active-directory-b2c/overview.md).
-## Stage 3: Take control of admin activity
+## Stage 3: Take control of administrator activity
-![Stage 3: take control of admin activity](./media/security-planning/stage-three.png)
+![Stage 3: take control of administrator activity](./media/security-planning/stage-three.png)
Stage 3 builds on the mitigations from Stage 2 and should be implemented in approximately 1-3 months. This stage of the Secured Privileged Access roadmap includes the following components.
Stage 3 builds on the mitigations from Stage 2 and should be implemented in appr
#### Complete an access review of users in administrator roles
-More corporate users are gaining privileged access through cloud services, which can lead to un-managed access. Users today can become Global Administrators for Microsoft 365, Azure subscription administrators, or have admin access to VMs or via SaaS apps.
+More corporate users are gaining privileged access through cloud services, which can lead to unmanaged access. Users today can become Global Administrators for Microsoft 365 or Azure subscription administrators, or gain administrator access to VMs or SaaS apps.
-Your organization should have all employees handle ordinary business transactions as unprivileged users, and then grant admin rights only as needed. Complete access reviews to identify and confirm the users who are eligible to activate admin privileges.
+Your organization should have all employees handle ordinary business transactions as unprivileged users, and then grant administrator rights only as needed. Complete access reviews to identify and confirm the users who are eligible to activate administrator privileges.
We recommend that you:
-1. Determine which users are Azure AD admins, enable on-demand, just-in-time admin access, and role-based security controls.
-2. Convert users who have no clear justification for admin privileged access to a different role (if no eligible role, remove them).
+1. Determine which users are Azure AD administrators, enable on-demand, just-in-time administrator access, and role-based security controls.
+2. Convert users who have no clear justification for administrator privileged access to a different role (if no eligible role, remove them).
#### Continue rollout of stronger authentication for all users
Require highly exposed users to have modern, strong authentication such as Azure
#### Use dedicated workstations for administration for Azure AD
-Attackers might try to target privileged accounts so that they can disrupt the integrity and authenticity of data. They often use malicious code that alters the program logic or snoops the admin entering a credential. Privileged Access Workstations (PAWs) provide a dedicated operating system for sensitive tasks that is protected from Internet attacks and threat vectors. Separating these sensitive tasks and accounts from the daily use workstations and devices provides strong protection from:
+Attackers might try to target privileged accounts so that they can disrupt the integrity and authenticity of data. They often use malicious code that alters the program logic or snoops the administrator entering a credential. Privileged Access Workstations (PAWs) provide a dedicated operating system for sensitive tasks that is protected from Internet attacks and threat vectors. Separating these sensitive tasks and accounts from the daily use workstations and devices provides strong protection from:
* Phishing attacks
* Application and operating system vulnerabilities
* Impersonation attacks
* Credential theft attacks such as keystroke logging, Pass-the-Hash, and Pass-The-Ticket
-By deploying privileged access workstations, you can reduce the risk that admins enter their credentials in a desktop environment that hasn't been hardened. For more information, see [Privileged Access Workstations](https://4sysops.com/archives/understand-the-microsoft-privileged-access-workstation-paw-security-model/).
+By deploying privileged access workstations, you can reduce the risk that administrators enter their credentials in a desktop environment that hasn't been hardened. For more information, see [Privileged Access Workstations](https://4sysops.com/archives/understand-the-microsoft-privileged-access-workstation-paw-security-model/).
#### Review National Institute of Standards and Technology recommendations for handling incidents
The National Institute of Standards and Technology (NIST) provides guidelines
For Azure Active Directory, use [Azure AD Privileged Identity Management](../privileged-identity-management/pim-configure.md) capability. Time-limited activation of privileged roles works by enabling you to:
-* Activate admin privileges to do a specific task
+* Activate administrator privileges to do a specific task
* Enforce MFA during the activation process
-* Use alerts to inform admins about out-of-band changes
+* Use alerts to inform administrators about out-of-band changes
* Enable users to keep their privileged access for a pre-configured amount of time
-* Allow security admins to:
+* Allow security administrators to:
  * Discover all privileged identities
  * View audit reports
- * Create access reviews to identify every user who is eligible to activate admin privileges
+ * Create access reviews to identify every user who is eligible to activate administrator privileges
If you're already using Azure AD Privileged Identity Management, adjust timeframes for time-bound privileges as necessary (for example, maintenance windows).
We recommend you identify every potential user who could be catastrophic to the
#### Complete a roles review assessment for Microsoft 365 roles (if using Microsoft 365)
-Assess whether all admins users are in the correct roles (delete and reassign according to this assessment).
+Assess whether all administrator users are in the correct roles (delete and reassign according to this assessment).
#### Review the security incident management approach used in Microsoft 365 and compare with your own organization
The [Azure Security Center](../../security-center/security-center-introduction.m
#### Inventory your privileged accounts within hosted Virtual Machines
-You don't usually need to give users unrestricted permissions to all your Azure subscriptions or resources. Use Azure AD admin roles to grant only the access that your users who need to do their jobs. You can use Azure AD administrator roles to let one admin manage only VMs in a subscription, while another can manage SQL databases within the same subscription. For more information, see [What is Azure role-based access control](../../active-directory-b2c/overview.md).
+You don't usually need to give users unrestricted permissions to all your Azure subscriptions or resources. Use Azure roles (Azure RBAC) to grant only the access that your users need to do their jobs. For example, you can let one administrator manage only VMs in a subscription, while another manages SQL databases within the same subscription. For more information, see [What is Azure role-based access control](../../active-directory-b2c/overview.md).
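For instance, scoping an administrator to virtual machine management only is an Azure RBAC role assignment. A sketch with the Az PowerShell module follows; the sign-in name and subscription ID are placeholders:

```powershell
# Sketch using the Az PowerShell module (run Connect-AzAccount first).
# Grants VM management rights only, scoped to a single subscription.
New-AzRoleAssignment `
    -SignInName "vmadmin@contoso.com" `
    -RoleDefinitionName "Virtual Machine Contributor" `
    -Scope "/subscriptions/<subscription-id>"
```

Narrower scopes (a resource group or a single VM) follow the same pattern with a longer `-Scope` path.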
#### Implement PIM for Azure AD administrator roles
Configure Conditional Access based on a group, location, and application sensiti
#### Monitor activity in connected cloud apps
-We recommend using [Microsoft Cloud App Security](/cloud-app-security/what-is-cloud-app-security) to ensure that user access is also protected in connected applications. This feature secures the enterprise access to cloud apps and secures your admin accounts, allowing you to:
+We recommend using [Microsoft Cloud App Security](/cloud-app-security/what-is-cloud-app-security) to ensure that user access is also protected in connected applications. This feature secures the enterprise access to cloud apps and secures your administrator accounts, allowing you to:
* Extend visibility and control to cloud apps
* Create policies for access, activities, and data sharing
Securing privileged access is important to establish security assurances for you
We recommend the following practices when you're managing privileged access accounts:
-* Ensure that admins are doing their day-to-day business as unprivileged users
+* Ensure that administrators are doing their day-to-day business as unprivileged users
* Grant privileged access only when needed, and remove it afterward (just-in-time)
* Keep audit activity logs relating to privileged accounts
This final ongoing stage of the Secured Privileged Access roadmap includes the f
### General preparation
-#### Review admin roles in Azure AD
+#### Review administrator roles in Azure AD
-Determine if current built-in Azure AD admin roles are still up to date and ensure that users are in only the roles they need. With Azure AD, you can assign separate administrators to serve different functions. For more information, see [Assigning administrator roles in Azure Active Directory](permissions-reference.md).
+Determine if current built-in Azure AD administrator roles are still up to date and ensure that users are in only the roles they need. With Azure AD, you can assign separate administrators to serve different functions. For more information, see [Azure AD built-in roles](permissions-reference.md).
#### Review users who have administration of Azure AD joined devices
For more information about how Microsoft Office 365 handles security incidents,
**Q:** What do I do if I haven't implemented any secure access components yet?
-**Answer:** Define at least two break-glass account, assign MFA to your privileged admin accounts, and separate user accounts from Global Administrator accounts.
+**Answer:** Define at least two break-glass accounts, assign MFA to your privileged administrator accounts, and separate user accounts from Global Administrator accounts.
**Q:** After a breach, what is the top issue that needs to be addressed first?

**Answer:** Be sure you're requiring the strongest authentication for highly exposed individuals.
-**Q:** What happens if our privileged admins have been deactivated?
+**Q:** What happens if our privileged administrators have been deactivated?
**Answer:** Create a Global Administrator account that is always kept up to date.
For more information about how Microsoft Office 365 handles security incidents,
**Answer:** Use one of your break-glass accounts to gain immediate privileged access.
-**Q:** How can I protect admins within my organization?
+**Q:** How can I protect administrators within my organization?
-**Answer:** Have admins always do their day-to-day business as standard "unprivileged" users.
+**Answer:** Have administrators always do their day-to-day business as standard "unprivileged" users.
-**Q:** What are the best practices for creating admin accounts within Azure AD?
+**Q:** What are the best practices for creating administrator accounts within Azure AD?
-**Answer:** Reserve privileged access for specific admin tasks.
+**Answer:** Reserve privileged access for specific administrator tasks.
-**Q:** What tools exist for reducing persistent admin access?
+**Q:** What tools exist for reducing persistent administrator access?
-**Answer:** Privileged Identity Management (PIM) and Azure AD admin roles.
+**Answer:** Privileged Identity Management (PIM) and Azure AD administrator roles.
-**Q:** What is the Microsoft position on synchronizing admin accounts to Azure AD?
+**Q:** What is the Microsoft position on synchronizing administrator accounts to Azure AD?
-**Answer:** Tier 0 admin accounts are used only for on-premises AD accounts. Such accounts aren't typically synchronized with Azure AD in the cloud. Tier 0 admin accounts include accounts, groups, and other assets that have direct or indirect administrative control of the on-premises Active Directory forest, domains, domain controllers, and assets.
+**Answer:** Tier 0 administrator accounts are used only for on-premises AD accounts. Such accounts aren't typically synchronized with Azure AD in the cloud. Tier 0 administrator accounts include accounts, groups, and other assets that have direct or indirect administrative control of the on-premises Active Directory forest, domains, domain controllers, and assets.
-**Q:** How do we keep admins from assigning random admin access in the portal?
+**Q:** How do we keep administrators from assigning random administrator access in the portal?
-**Answer:** Use non-privileged accounts for all users and most admins. Start by developing a footprint of the organization to determine which few admin accounts should be privileged. And monitor for newly created administrative users.
+**Answer:** Use non-privileged accounts for all users and most administrators. Start by developing a footprint of the organization to determine which few administrator accounts should be privileged. And monitor for newly created administrative users.
## Next steps
active-directory View Assignments https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/view-assignments.md
This section describes how to list role assignments with single-application scop
## Next steps

* Feel free to share with us on the [Azure AD administrative roles forum](https://feedback.azure.com/forums/169401-azure-active-directory?category_id=166032).
-* For more about roles and Administrator role assignment, see [Assign administrator roles](permissions-reference.md).
+* For more about role permissions, see [Azure AD built-in roles](permissions-reference.md).
* For default user permissions, see a [comparison of default guest and member user permissions](../fundamentals/users-default-permissions.md).
aks Gpu Cluster https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/gpu-cluster.md
To run Apache Spark jobs, see [Run Apache Spark jobs on AKS][aks-spark].
For more information about running machine learning (ML) workloads on Kubernetes, see [Kubeflow Labs][kubeflow-labs].
+For information on using Azure Kubernetes Service with Azure Machine Learning, see the following articles:
+
+* [Deploy a model to Azure Kubernetes Service][azureml-aks].
+* [Deploy a deep learning model for inference with GPU][azureml-gpu].
+* [High-performance serving with Triton Inference Server][azureml-triton].
+
<!-- LINKS - external -->
[kubectl-apply]: https://kubernetes.io/docs/reference/generated/kubectl/kubectl-commands#apply
[kubectl-get]: https://kubernetes.io/docs/reference/generated/kubectl/kubectl-commands#get
For more information about running machine learning (ML) workloads on Kubernetes
[aks-spark]: spark-job.md
[gpu-skus]: ../virtual-machines/sizes-gpu.md
[install-azure-cli]: /cli/azure/install-azure-cli
+[azureml-aks]: ../machine-learning/how-to-deploy-azure-kubernetes-service.md
+[azureml-gpu]: ../machine-learning/how-to-deploy-inferencing-gpus.md
+[azureml-triton]: ../machine-learning/how-to-deploy-with-triton.md
api-management Api Management Using With Vnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/api-management/api-management-using-with-vnet.md
Previously updated : 04/12/2021 Last updated : 05/28/2021
When an API Management service instance is hosted in a VNET, the ports in the fo
| Azure Environment | Endpoints |
|-|-|
- | Azure Public | <ul><li>gcs.prod.monitoring.core.windows.net(**new**)</li><li>global.prod.microsoftmetrics.com(**new**)</li><li>shoebox2-red.prod.microsoftmetrics.com</li><li>shoebox2-black.prod.microsoftmetrics.com</li><li>shoebox2-red.shoebox2.metrics.nsatc.net</li><li>shoebox2-black.shoebox2.metrics.nsatc.net</li><li>prod3.prod.microsoftmetrics.com(**new**)</li><li>prod3-black.prod.microsoftmetrics.com(**new**)</li><li>prod3-red.prod.microsoftmetrics.com(**new**)</li><li>gcs.prod.warm.ingestion.monitoring.azure.com</li></ul> |
- | Azure Government | <ul><li>fairfax.warmpath.usgovcloudapi.net</li><li>global.prod.microsoftmetrics.com(**new**)</li><li>shoebox2.prod.microsoftmetrics.com(**new**)</li><li>shoebox2-red.prod.microsoftmetrics.com</li><li>shoebox2-black.prod.microsoftmetrics.com</li><li>shoebox2-red.shoebox2.metrics.nsatc.net</li><li>shoebox2-black.shoebox2.metrics.nsatc.net</li><li>prod3.prod.microsoftmetrics.com(**new**)</li><li>prod3-black.prod.microsoftmetrics.com</li><li>prod3-red.prod.microsoftmetrics.com</li><li>prod5.prod.microsoftmetrics.com</li><li>prod5-black.prod.microsoftmetrics.com</li><li>prod5-red.prod.microsoftmetrics.com</li><li>gcs.prod.warm.ingestion.monitoring.azure.us</li></ul> |
- | Azure China 21Vianet | <ul><li>mooncake.warmpath.chinacloudapi.cn</li><li>global.prod.microsoftmetrics.com(**new**)</li><li>shoebox2.prod.microsoftmetrics.com(**new**)</li><li>shoebox2-red.prod.microsoftmetrics.com</li><li>shoebox2-black.prod.microsoftmetrics.com</li><li>shoebox2-red.shoebox2.metrics.nsatc.net</li><li>shoebox2-black.shoebox2.metrics.nsatc.net</li><li>prod3.prod.microsoftmetrics.com(**new**)</li><li>prod3-red.prod.microsoftmetrics.com</li><li>prod5.prod.microsoftmetrics.com</li><li>prod5-black.prod.microsoftmetrics.com</li><li>prod5-red.prod.microsoftmetrics.com</li><li>gcs.prod.warm.ingestion.monitoring.azure.cn</li></ul> |
-
- >[!IMPORTANT]
- > The change of clusters above with DNS zone **.nsatc.net** to **.microsoftmetrics.com** is mostly a DNS Change. IP Address of cluster will not change.
+ | Azure Public | <ul><li>gcs.prod.monitoring.core.windows.net</li><li>global.prod.microsoftmetrics.com</li><li>shoebox2.prod.microsoftmetrics.com</li><li>shoebox2-red.prod.microsoftmetrics.com</li><li>shoebox2-black.prod.microsoftmetrics.com</li><li>prod3.prod.microsoftmetrics.com</li><li>prod3-black.prod.microsoftmetrics.com</li><li>prod3-red.prod.microsoftmetrics.com</li><li>gcs.prod.warm.ingestion.monitoring.azure.com</li></ul> |
+ | Azure Government | <ul><li>fairfax.warmpath.usgovcloudapi.net</li><li>global.prod.microsoftmetrics.com</li><li>shoebox2.prod.microsoftmetrics.com</li><li>shoebox2-red.prod.microsoftmetrics.com</li><li>shoebox2-black.prod.microsoftmetrics.com</li><li>prod3.prod.microsoftmetrics.com</li><li>prod3-black.prod.microsoftmetrics.com</li><li>prod3-red.prod.microsoftmetrics.com</li><li>prod5.prod.microsoftmetrics.com</li><li>prod5-black.prod.microsoftmetrics.com</li><li>prod5-red.prod.microsoftmetrics.com</li><li>gcs.prod.warm.ingestion.monitoring.azure.us</li></ul> |
+ | Azure China 21Vianet | <ul><li>mooncake.warmpath.chinacloudapi.cn</li><li>global.prod.microsoftmetrics.com</li><li>shoebox2.prod.microsoftmetrics.com</li><li>shoebox2-red.prod.microsoftmetrics.com</li><li>shoebox2-black.prod.microsoftmetrics.com</li><li>prod3.prod.microsoftmetrics.com</li><li>prod3-red.prod.microsoftmetrics.com</li><li>prod5.prod.microsoftmetrics.com</li><li>prod5-black.prod.microsoftmetrics.com</li><li>prod5-red.prod.microsoftmetrics.com</li><li>gcs.prod.warm.ingestion.monitoring.azure.cn</li></ul> |
+
+ **Regional Service Tags**: NSG rules allowing outbound connectivity to Storage, SQL, and Event Hubs service tags may use the regional versions of those tags corresponding to the region containing the API Management instance (for example, Storage.WestUS for an API Management instance in the West US region). In multi-region deployments, the NSG in each region should allow traffic to the service tags for that region and the primary region.

> [!IMPORTANT]
The IP Addresses are divided by **Azure Environment**. When allowing inbound req
| Azure Public| Central US| 13.86.102.66|
| Azure Public| Australia East| 20.40.125.155|
| Azure Public| West US 2| 51.143.127.203|
+| Azure Public| West US 3| 20.150.167.160|
| Azure Public| East US 2 EUAP| 52.253.229.253|
| Azure Public| Central US EUAP| 52.253.159.160|
| Azure Public| South Central US| 20.188.77.119|
app-service App Service Web Tutorial Custom Domain Uiex https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/app-service-web-tutorial-custom-domain-uiex.md
Browse to the DNS names that you configured earlier.
<details> <summary>I get an HTTP 404 (Not Found) error.</summary> <ul>
-<li>The custom domain configured is missing an A record or a CNAME record.</li>
+<li>The custom domain configured is missing an A record or a CNAME record. Check if the DNS records are exposed using an <a href="https://www.nslookup.io/">online DNS lookup</a> tool.</li>
<li>The browser client has cached the old IP address of your domain. Clear the cache, and test DNS resolution again. On a Windows machine, you clear the cache with <code>ipconfig /flushdns</code>.</li> </ul> </details>
app-service App Service Web Tutorial Custom Domain https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/app-service-web-tutorial-custom-domain.md
Browse to the DNS names that you configured earlier.
If you receive an HTTP 404 (Not Found) error when you browse to the URL of your custom domain, the two most common causes are:
-* The custom domain configured is missing an A record or a CNAME record. You may have deleted the DNS record after you've enabled the mapping in your app.
+* The custom domain configured is missing an A record or a CNAME record. You may have deleted the DNS record after you've enabled the mapping in your app. Check if the DNS records are properly configured using an <a href="https://www.nslookup.io/">online DNS lookup</a> tool.
* The browser client has cached the old IP address of your domain. Clear the cache, and test DNS resolution again. On a Windows machine, you clear the cache with `ipconfig /flushdns`.
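When troubleshooting the first cause, the record check can also be scripted. This is a minimal, hypothetical sketch (not part of the tutorial) using only Python's standard library; substitute your own custom domain for the placeholder:

```python
import socket

def has_dns_record(hostname: str) -> bool:
    """Return True if the hostname resolves to at least one address
    (either directly via an A/AAAA record or through a CNAME chain)."""
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:
        return False

# Replace "localhost" with your custom domain, e.g. www.contoso.com.
print(has_dns_record("localhost"))             # resolves via the local hosts file
print(has_dns_record("no-such-host.invalid"))  # the .invalid TLD never resolves (RFC 2606)
```

Note that a positive result only proves the name resolves somewhere; it doesn't confirm the record points at your App Service, so the online DNS lookup tool remains useful for inspecting the actual record values.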
app-service Deploy Ci Cd Custom Container https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/deploy-ci-cd-custom-container.md
Follow the next steps by selecting the tab that matches your choice.
The **Registry** dropdown displays the registries in the same subscription as your app. **Select** the registry you want.

> [!NOTE]
-> To deploy from a registry in a different subscription, **select** **Private Registry** in **Registry source** instead.
+> - If you want to use Managed Identities to lock down ACR access, follow one of these guides:
+> - [How to use system-assigned Managed Identities with App Service and Azure Container Registry](https://github.com/Azure/app-service-linux-docs/blob/master/HowTo/use_system-assigned_managed_identities.md)
+> - [How to use user-assigned Managed Identities with App Service and Azure Container Registry](https://github.com/Azure/app-service-linux-docs/blob/master/HowTo/use_user-assigned_managed_identities.md)
+> - To deploy from a registry in a different subscription, **select** **Private Registry** in **Registry source** instead.
+>
::: zone pivot="container-windows"

**Select** the **Image** and **Tag** to deploy. If you want, **type** the startup command in **Startup File**.
attestation Azure Diagnostic Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/attestation/azure-diagnostic-monitoring.md
The Trusted Platform Module (TPM) endpoint service is enabled in the diagnostic
$storageAccount=New-AzStorageAccount -ResourceGroupName $attestationProvider.ResourceGroupName -Name <Storage Account Name> -SkuName Standard_LRS -Location <Location>
- Set-AzDiagnosticSetting -ResourceId $ attestationProvider.Id -StorageAccountId $ storageAccount.Id -Enabled $true
+ Set-AzDiagnosticSetting -ResourceId $attestationProvider.Id -StorageAccountId $storageAccount.Id -Enabled $true
```
-Activity logs are in the **Containers** section of the storage account. For more information, see [Collect and analyze resource logs from an Azure resource](../azure-monitor/essentials/tutorial-resource-logs.md).
+Activity logs are in the **Containers** section of the storage account. For more information, see [Collect and analyze resource logs from an Azure resource](../azure-monitor/essentials/tutorial-resource-logs.md).
automation Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/troubleshoot/managed-identity.md
description: This article tells how to troubleshoot and resolve issues when usin
Last updated 04/28/2021-
azure-arc Custom Locations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/custom-locations.md
A conceptual overview of this feature is available in [Custom locations - Azure
## Enable custom locations on cluster
-To enable this feature on your cluster, execute the following command:
+If you are logged into Azure CLI as an Azure AD user, execute the following command to enable this feature on your cluster:
```console
az connectedk8s enable-features -n <clusterName> -g <resourceGroupName> --features cluster-connect custom-locations
```
+If you are logged into Azure CLI using a service principal, execute the following steps to enable this feature on your cluster:
+
+1. Fetch the Object ID of the Azure AD application used by Azure Arc service:
+
+ ```console
+ az ad sp show --id 'bc313c14-388c-4e7d-a58e-70017303ee3b' --query objectId -o tsv
+ ```
+
+1. Use the `<objectId>` value from the above step to enable the custom locations feature on the cluster:
+
+ ```console
+ az connectedk8s enable-features -n <cluster-name> -g <resource-group-name> --custom-locations-oid <objectId> --features cluster-connect custom-locations
+ ```
+
> [!NOTE]
> 1. Custom Locations feature is dependent on the Cluster Connect feature. So both features have to be enabled for custom locations to work.
> 2. `az connectedk8s enable-features` needs to be run on a machine where the `kubeconfig` file is pointing to the cluster on which the features are to be enabled.
-> 3. If you are logged into Azure CLI using a service principal, [additional permissions](troubleshooting.md#enable-custom-locations-using-service-principal) have to be granted to the service principal before enabling the custom location feature.
## Create custom location
azure-arc Quickstart Connect Cluster https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/quickstart-connect-cluster.md
In this quickstart, you'll learn the benefits of Azure Arc enabled Kubernetes an
| Endpoint (DNS) | Description |
| -- | - |
-| `https://management.azure.com` | Required for the agent to connect to Azure and register the cluster. |
-| `https://<region>.dp.kubernetesconfiguration.azure.com` | Data plane endpoint for the agent to push status and fetch configuration information. |
-| `https://login.microsoftonline.com` | Required to fetch and update Azure Resource Manager tokens. |
-| `https://mcr.microsoft.com` | Required to pull container images for Azure Arc agents. |
-| `https://eus.his.arc.azure.com`, `https://weu.his.arc.azure.com`, `https://wcus.his.arc.azure.com`, `https://scus.his.arc.azure.com`, `https://sea.his.arc.azure.com`, `https://uks.his.arc.azure.com`, `https://wus2.his.arc.azure.com`, `https://ae.his.arc.azure.com`, `https://eus2.his.arc.azure.com`, `https://ne.his.arc.azure.com` | Required to pull system-assigned Managed Service Identity (MSI) certificates. |
+| `https://management.azure.com` (for Azure Cloud), `https://management.usgovcloudapi.net` (for Azure US Government) | Required for the agent to connect to Azure and register the cluster. |
+| `https://<region>.dp.kubernetesconfiguration.azure.com` (for Azure Cloud), `https://<region>.dp.kubernetesconfiguration.azure.us` (for Azure US Government) | Data plane endpoint for the agent to push status and fetch configuration information. |
+| `https://login.microsoftonline.com` (for Azure Cloud), `https://login.microsoftonline.us` (for Azure US Government) | Required to fetch and update Azure Resource Manager tokens. |
+| `https://mcr.microsoft.com` | Required to pull container images for Azure Arc agents. |
+| `https://<region-code>.his.arc.azure.com` (for Azure Cloud), `https://usgv.his.arc.azure.us` (for Azure US Government) | Required to pull system-assigned Managed Service Identity (MSI) certificates. `<region-code>` mapping for Azure cloud regions: `eus` (East US), `weu` (West Europe), `wcus` (West Central US), `scus` (South Central US), `sea` (South East Asia), `uks` (UK South), `wus2` (West US 2), `ae` (Australia East), `eus2` (East US 2), `ne` (North Europe), `fc` (France Central). |
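The `<region-code>` substitution in the last row can be expressed programmatically when building an allow list. A small illustrative sketch (the mapping is copied verbatim from the table above; the function name is hypothetical):

```python
# Region-to-code mapping for the *.his.arc.azure.com MSI certificate
# endpoints, taken from the table above.
REGION_CODES = {
    "East US": "eus",
    "West Europe": "weu",
    "West Central US": "wcus",
    "South Central US": "scus",
    "South East Asia": "sea",
    "UK South": "uks",
    "West US 2": "wus2",
    "Australia East": "ae",
    "East US 2": "eus2",
    "North Europe": "ne",
    "France Central": "fc",
}

def msi_endpoint(region, us_government=False):
    """Return the endpoint used to pull system-assigned MSI certificates."""
    if us_government:
        return "https://usgv.his.arc.azure.us"
    return f"https://{REGION_CODES[region]}.his.arc.azure.com"

print(msi_endpoint("East US"))  # https://eus.his.arc.azure.com
```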
## 1. Register providers for Azure Arc enabled Kubernetes
Helm release deployment succeeded
> The above command without the location parameter specified creates the Azure Arc enabled Kubernetes resource in the same location as the resource group. To create the Azure Arc enabled Kubernetes resource in a different location, specify either `--location <region>` or `-l <region>` when running the `az connectedk8s connect` command.

> [!NOTE]
-> If you are logged into Azure CLI using a service principal, [additional permissions](troubleshooting.md#enable-custom-locations-using-service-principal) are required on the service principal for enabling the custom location feature when connecting the cluster to Azure Arc.
+> If you are logged into Azure CLI using a service principal, an [additional parameter](troubleshooting.md#enable-custom-locations-using-service-principal) needs to be set for enabling the custom location feature on the cluster.
## 4. Verify cluster connection
azure-arc Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/troubleshooting.md
When you are connecting your cluster to Azure Arc or when you are enabling custo
Unable to fetch oid of 'custom-locations' app. Proceeding without enabling the feature. Insufficient privileges to complete the operation.
```
-The above warning is observed when you have used a service principal to log into Azure and this service principal doesn't have permissions to get information of the application used by Azure Arc service. Run the following commands to grant the required permissions:
+The above warning is observed when you have used a service principal to log into Azure and this service principal doesn't have permissions to get information of the application used by Azure Arc service. To avoid this error, execute the following steps:
-```console
-az ad app permission add --id <service-principal-app-id> --api 00000002-0000-0000-c000-000000000000 --api-permissions 3afa6a7d-9b1a-42eb-948e-1650a849e176=Role
-az ad app permission admin-consent --id <service-principal-app-id>
-```
+1. Fetch the Object ID of the Azure AD application used by Azure Arc service:
+
+ ```console
+ az ad sp show --id 'bc313c14-388c-4e7d-a58e-70017303ee3b' --query objectId -o tsv
+ ```
+
+1. Use the `<objectId>` value from the above step to enable the custom locations feature on the cluster:
+ - If you are enabling custom locations feature as part of connecting the cluster to Arc, run the following command:
+
+ ```console
+ az connectedk8s connect -n <cluster-name> -g <resource-group-name> --custom-locations-oid <objectId>
+ ```
+
+ - If you are enabling custom locations feature on an existing Arc enabled Kubernetes cluster, run the following command:
+
+ ```console
+ az connectedk8s enable-features -n <cluster-name> -g <resource-group-name> --custom-locations-oid <objectId> --features cluster-connect custom-locations
+ ```
Once the above steps are complete, you can proceed to [enabling the custom location feature](custom-locations.md#enable-custom-locations-on-cluster) on the cluster.
azure-cache-for-redis Cache How To Import Export Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-how-to-import-export-data.md
# Import and Export data in Azure Cache for Redis
-Import/Export is an Azure Cache for Redis data management operation, which allows you to import data into Azure Cache for Redis or export data from Azure Cache for Redis by importing and exporting an Azure Cache for Redis Database (RDB) snapshot from a premium cache to a blob in an Azure Storage Account.
+
+Import/Export is an Azure Cache for Redis data management operation. It allows you to import data into Azure Cache for Redis or export data from Azure Cache for Redis by importing and exporting an Azure Cache for Redis Database (RDB) snapshot from a premium cache to a blob in an Azure Storage Account.
- **Export** - you can export your Azure Cache for Redis RDB snapshots to a Page Blob. - **Import** - you can import your Azure Cache for Redis RDB snapshots from either a Page Blob or a Block Blob.
This article provides a guide for importing and exporting data with Azure Cache
## Import
-Import can be used to bring Redis compatible RDB files from any Redis server running in any cloud or environment, including Redis running on Linux, Windows, or any cloud provider such as Amazon Web Services and others. Importing data is an easy way to create a cache with pre-populated data. During the import process, Azure Cache for Redis loads the RDB files from Azure storage into memory and then inserts the keys into the cache.
+
+Use import to bring Redis compatible RDB files from any Redis server running in any cloud or environment, including Redis running on Linux, Windows, or any cloud provider such as Amazon Web Services and others. Importing data is an easy way to create a cache with pre-populated data. During the import process, Azure Cache for Redis loads the RDB files from Azure storage into memory and then inserts the keys into the cache.
> [!NOTE]
> Before beginning the import operation, ensure that your Redis Database (RDB) file or files are uploaded into page or block blobs in Azure storage, in the same region and subscription as your Azure Cache for Redis instance. For more information, see [Get started with Azure Blob storage](../storage/blobs/storage-quickstart-blobs-dotnet.md). If you exported your RDB file using the [Azure Cache for Redis Export](#export) feature, your RDB file is already stored in a page blob and is ready for importing.
-1. To import one or more exported cache blobs, [browse to your cache](cache-configure.md#configure-azure-cache-for-redis-settings) in the Azure portal and click **Import data** from the **Resource menu**.
+1. To import one or more exported cache blobs, [browse to your cache](cache-configure.md#configure-azure-cache-for-redis-settings) in the Azure portal and select **Import data** from the **Resource menu**.
![Import data](./media/cache-how-to-import-export-data/cache-import-data.png)
-2. Click **Choose Blob(s)** and select the storage account that contains the data to import.
+2. Select **Choose Blob(s)** and select the storage account that contains the data to import.
![Choose storage account](./media/cache-how-to-import-export-data/cache-import-choose-storage-account.png)
-3. Click the container that contains the data to import.
+3. Select the container that contains the data to import.
![Choose container](./media/cache-how-to-import-export-data/cache-import-choose-container.png)
-4. Select one or more blobs to import by clicking the area to the left of the blob name, and then click **Select**.
+4. Select one or more blobs to import by selecting the area to the left of the blob name, and then select **Select**.
![Choose blobs](./media/cache-how-to-import-export-data/cache-import-choose-blobs.png)
-5. Click **Import** to begin the import process.
+5. Select **Import** to begin the import process.
> [!IMPORTANT]
> The cache is not accessible by cache clients during the import process, and any existing data in the cache is deleted.
Import can be used to bring Redis compatible RDB files from any Redis server run
![Import progress](./media/cache-how-to-import-export-data/cache-import-data-import-complete.png)

## Export
-Export allows you to export the data stored in Azure Cache for Redis to Redis compatible RDB file(s). You can use this feature to move data from one Azure Cache for Redis instance to another or to another Redis server. During the export process, a temporary file is created on the VM that hosts the Azure Cache for Redis server instance, and the file is uploaded to the designated storage account. When the export operation completes with either a status of success or failure, the temporary file is deleted.
-1. To export the current contents of the cache to storage, [browse to your cache](cache-configure.md#configure-azure-cache-for-redis-settings) in the Azure portal and click **Export data** from the **Resource menu**.
+Export allows you to export the data stored in Azure Cache for Redis to Redis compatible RDB file(s). You can use this feature to move data from one Azure Cache for Redis instance to another or to another Redis server. During the export process, a temporary file is created on the VM that hosts the Azure Cache for Redis server instance. Then, the file is uploaded to the chosen storage account. When the export operation completes with either a status of success or failure, the temporary file is deleted.
+
+1. To export the current contents of the cache to storage, [browse to your cache](cache-configure.md#configure-azure-cache-for-redis-settings) in the Azure portal and select **Export data** from the **Resource menu**.
![On the navigation pane for contoso5premium, the Export data option on the Administration list is highlighted.](./media/cache-how-to-import-export-data/cache-export-data-choose-storage-container.png)
-2. Click **Choose Storage Container** and select the desired storage account. The storage account must be in the same subscription and region as your cache.
+2. Select **Choose Storage Container** and select the storage account you want. The storage account must be in the same subscription and region as your cache.
> [!IMPORTANT]
- > Export works with page blobs, which are supported by both classic and Resource Manager storage accounts, but are not supported by Blob storage accounts at this time. For more information, see [Azure storage account overview](../storage/common/storage-account-overview.md).
+ > Export works with page blobs, which are supported by both classic and Resource Manager storage accounts. Export is not supported by Blob storage accounts at this time. For more information, see [Azure storage account overview](../storage/common/storage-account-overview.md).
> ![Storage account](./media/cache-how-to-import-export-data/cache-export-data-choose-account.png)
-3. Choose the desired blob container and click **Select**. To use new a container, click **Add Container** to add it first and then select it from the list.
+3. Choose the blob container you want, then select **Select**. To use a new container, select **Add Container** to add it first and then select it from the list.
![On Containers for contoso55, the + Container option is highlighted. There is one container in the list, cachesaves, and it is selected and highlighted. The Selection option is selected and highlighted.](./media/cache-how-to-import-export-data/cache-export-data-container.png)
-4. Type a **Blob name prefix** and click **Export** to start the export process. The blob name prefix is used to prefix the names of files generated by this export operation.
+4. Type a **Blob name prefix** and select **Export** to start the export process. The blob name prefix is used to prefix the names of files generated by this export operation.
![Export](./media/cache-how-to-import-export-data/cache-export-data.png)
Export allows you to export the data stored in Azure Cache for Redis to Redis co
Caches remain available for use during the export process.

## Import/Export FAQ

This section contains frequently asked questions about the Import/Export feature.
-* [What pricing tiers can use Import/Export?](#what-pricing-tiers-can-use-importexport)
-* [Can I import data from any Redis server?](#can-i-import-data-from-any-redis-server)
-* [What RDB versions can I import?](#what-rdb-versions-can-i-import)
-* [Is my cache available during an Import/Export operation?](#is-my-cache-available-during-an-importexport-operation)
-* [Can I use Import/Export with Redis cluster?](#can-i-use-importexport-with-redis-cluster)
-* [How does Import/Export work with a custom databases setting?](#how-does-importexport-work-with-a-custom-databases-setting)
-* [How is Import/Export different from Redis persistence?](#how-is-importexport-different-from-redis-persistence)
-* [Can I automate Import/Export using PowerShell, CLI, or other management clients?](#can-i-automate-importexport-using-powershell-cli-or-other-management-clients)
-* [I received a timeout error during my Import/Export operation. What does it mean?](#i-received-a-timeout-error-during-my-importexport-operation-what-does-it-mean)
-* [I got an error when exporting my data to Azure Blob Storage. What happened?](#i-got-an-error-when-exporting-my-data-to-azure-blob-storage-what-happened)
+- [What pricing tiers can use Import/Export?](#what-pricing-tiers-can-use-importexport)
+- [Can I import data from any Redis server?](#can-i-import-data-from-any-redis-server)
+- [What RDB versions can I import?](#what-rdb-versions-can-i-import)
+- [Is my cache available during an Import/Export operation?](#is-my-cache-available-during-an-importexport-operation)
+- [Can I use Import/Export with Redis cluster?](#can-i-use-importexport-with-redis-cluster)
+- [How does Import/Export work with a custom databases setting?](#how-does-importexport-work-with-a-custom-databases-setting)
+- [How is Import/Export different from Redis persistence?](#how-is-importexport-different-from-redis-persistence)
+- [Can I automate Import/Export using PowerShell, CLI, or other management clients?](#can-i-automate-importexport-using-powershell-cli-or-other-management-clients)
+- [I received a timeout error during my Import/Export operation. What does it mean?](#i-received-a-timeout-error-during-my-importexport-operation-what-does-it-mean)
+- [I got an error when exporting my data to Azure Blob Storage. What happened?](#i-got-an-error-when-exporting-my-data-to-azure-blob-storage-what-happened)
### What pricing tiers can use Import/Export?

Import/Export is available only in the premium pricing tier.

### Can I import data from any Redis server?
-Yes, in addition to importing data exported from Azure Cache for Redis instances, you can import RDB files from any Redis server running in any cloud or environment, such as Linux, Windows, or cloud providers such as Amazon Web Services. To do this, upload the RDB file from the desired Redis server into a page or block blob in an Azure Storage Account, and then import it into your premium Azure Cache for Redis instance. For example, you may want to export the data from your production cache and import it into a cache used as part of a staging environment for testing or migration.
+
+Yes, you can import data exported from Azure Cache for Redis instances, and you can import RDB files from any Redis server running in any cloud or environment. The environments include Linux, Windows, or cloud providers such as Amazon Web Services. To import this data, upload the RDB file from the Redis server you want into a page or block blob in an Azure Storage Account. Then, import it into your premium Azure Cache for Redis instance. For example, you might want to export the data from your production cache and import it into a cache that is used as part of a staging environment for testing or migration.
> [!IMPORTANT] > To successfully import data exported from Redis servers other than Azure Cache for Redis when using a page blob, the page blob size must be aligned on a 512 byte boundary. For sample code to perform any required byte padding, see [Sample page blob upload](https://github.com/JimRoberts-MS/SamplePageBlobUpload).
Yes, in addition to importing data exported from Azure Cache for Redis instances
Azure Cache for Redis supports RDB import up through RDB version 7.

### Is my cache available during an Import/Export operation?
-* **Export** - Caches remain available and you can continue to use your cache during an export operation.
-* **Import** - Caches become unavailable when an import operation starts, and become available for use when the import operation completes.
+
+- **Export** - Caches remain available and you can continue to use your cache during an export operation.
+- **Import** - Caches become unavailable when an import operation starts, and become available for use when the import operation completes.
### Can I use Import/Export with Redis cluster?

Yes, and you can import/export between a clustered cache and a non-clustered cache. Since Redis cluster [only supports database 0](cache-how-to-premium-clustering.md#do-i-need-to-make-any-changes-to-my-client-application-to-use-clustering), any data in databases other than 0 isn't imported. When clustered cache data is imported, the keys are redistributed among the shards of the cluster.

### How does Import/Export work with a custom databases setting?

Some pricing tiers have different [databases limits](cache-configure.md#databases), so there are some considerations when importing if you configured a custom value for the `databases` setting during cache creation.
-* When importing to a pricing tier with a lower `databases` limit than the tier from which you exported:
- * If you are using the default number of `databases`, which is 16 for all pricing tiers, no data is lost.
- * If you are using a custom number of `databases` that falls within the limits for the tier to which you are importing, no data is lost.
- * If your exported data contained data in a database that exceeds the limits of the new tier, the data from those higher databases is not imported.
+- When importing to a pricing tier with a lower `databases` limit than the tier from which you exported:
+ - If you're using the default number of `databases`, which is 16 for all pricing tiers, no data is lost.
+ - If you're using a custom number of `databases` that falls within the limits for the tier to which you're importing, no data is lost.
+ - If your exported data contained data in a database that exceeds the limits of the new tier, the data from those higher databases isn't imported.
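The bullet points above amount to a simple filter: only databases whose index falls within the destination tier's limit are imported. A hypothetical sketch of that rule (the function is illustrative only, not an Azure API; actual limits per tier are in the linked `databases` documentation):

```python
def databases_kept_on_import(exported_dbs, target_limit):
    """Split exported database indexes into (imported, dropped) given the
    number of databases supported by the destination tier."""
    imported = [db for db in exported_dbs if db < target_limit]
    dropped = [db for db in exported_dbs if db >= target_limit]
    return imported, dropped

# Default of 16 databases fits every tier, so no data is lost:
print(databases_kept_on_import([0, 5, 15], 16))  # ([0, 5, 15], [])
# Data in databases beyond the new tier's limit is not imported:
print(databases_kept_on_import([0, 5, 15], 8))   # ([0, 5], [15])
```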
### How is Import/Export different from Redis persistence?
-Azure Cache for Redis persistence allows you to persist data stored in Redis to Azure Storage. When persistence is configured, Azure Cache for Redis persists a snapshot of the Azure Cache for Redis in a Redis binary format to disk based on a configurable backup frequency. If a catastrophic event occurs that disables both the primary and replica cache, the cache data is restored automatically using the most recent snapshot. For more information, see [How to configure data persistence for a Premium Azure Cache for Redis](cache-how-to-premium-persistence.md).
-Import/ Export allows you to bring data into or export from Azure Cache for Redis. It does not configure backup and restore using Redis persistence.
+Azure Cache for Redis persistence allows you to persist data stored in Redis to Azure Storage. When persistence is configured, Azure Cache for Redis persists a snapshot of the cache data in a Redis binary format to disk, based on a configurable backup frequency. If a catastrophic event occurs that disables both the primary and replica cache, the cache data is restored automatically using the most recent snapshot. For more information, see [How to configure data persistence for a Premium Azure Cache for Redis](cache-how-to-premium-persistence.md).
+
+Import/Export allows you to bring data into or export data from Azure Cache for Redis. It doesn't configure backup and restore using Redis persistence.
### Can I automate Import/Export using PowerShell, CLI, or other management clients?

Yes, for PowerShell instructions see [To import an Azure Cache for Redis](cache-how-to-manage-redis-cache-powershell.md#to-import-an-azure-cache-for-redis) and [To export an Azure Cache for Redis](cache-how-to-manage-redis-cache-powershell.md#to-export-an-azure-cache-for-redis).

### I received a timeout error during my Import/Export operation. What does it mean?
-If you remain on the **Import data** or **Export data** blade for longer than 15 minutes before initiating the operation, you receive an error with an error message similar to the following example:
+
+If you remain on the **Import data** or **Export data** pane on the left for longer than 15 minutes before starting the operation, you receive an error message similar to the following example:
```output
The request to import data into cache 'contoso55' failed with status 'error' and error 'One of the SAS URIs provided could not be used for the following reason: The SAS token end time (se) must be at least 1 hour from now and the start time (st), if given, must be at least 15 minutes in the past.
```
-To resolve this, initiate the import or export operation before 15 minutes has elapsed.
+To resolve this error, start the import or export operation before 15 minutes has elapsed.
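If you generate SAS URIs yourself, the timing rule quoted in the error message can be checked up front. A hedged sketch that merely encodes the rule as stated (the function name and shape are mine, not part of any Azure SDK):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

def sas_window_ok(end_time: datetime, start_time: Optional[datetime] = None,
                  now: Optional[datetime] = None) -> bool:
    """Check the SAS validity-window rule from the error message above."""
    now = now or datetime.now(timezone.utc)
    if end_time < now + timedelta(hours=1):
        return False  # end time (se) must be at least 1 hour from now
    if start_time is not None and start_time > now - timedelta(minutes=15):
        return False  # start time (st), if given, must be at least 15 minutes in the past
    return True

now = datetime.now(timezone.utc)
print(sas_window_ok(now + timedelta(hours=2), now - timedelta(minutes=20)))  # True
print(sas_window_ok(now + timedelta(minutes=30)))                            # False
```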
### I got an error when exporting my data to Azure Blob Storage. What happened?
-Export works only with RDB files stored as page blobs. Other blob types are not currently supported, including Blob storage accounts with hot and cool tiers. For more information, see [Azure storage account overview](../storage/common/storage-account-overview.md).
+
+Export works only with RDB files stored as page blobs. Other blob types aren't currently supported, including Blob storage accounts with hot and cool tiers. For more information, see [Azure storage account overview](../storage/common/storage-account-overview.md).
## Next steps

Learn more about Azure Cache for Redis features.
-* [Azure Cache for Redis service tiers](cache-overview.md#service-tiers)
+- [Azure Cache for Redis service tiers](cache-overview.md#service-tiers)
azure-cache-for-redis Cache How To Monitor https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-how-to-monitor.md
Cache metrics are reported using several reporting intervals, including **Past h
Each metric includes two versions. One metric measures performance for the entire cache, and for caches that use [clustering](cache-how-to-premium-clustering.md), a second version of the metric that includes `(Shard 0-9)` in the name measures performance for a single shard in a cache. For example if a cache has four shards, `Cache Hits` is the total number of hits for the entire cache, and `Cache Hits (Shard 3)` is just the hits for that shard of the cache.
+> [!NOTE]
+> When you're looking at the aggregation type:
+>
+> - "Count" shows 2, which indicates the metric received two data points for your time granularity (1 minute).
+> - "Max" shows the maximum value of a data point in the time granularity.
+> - "Min" shows the minimum value of a data point in the time granularity.
+> - "Average" shows the average value of all data points in the time granularity.
+> - "Sum" shows the sum of all data points in the time granularity and may be misleading depending on the specific metric.
+>
+> Under normal conditions, "Average" and "Max" will be very similar because only one node emits these metrics (the master node). In a scenario where the number of connected clients changes rapidly, "Max," "Average," and "Min" would show very different values, and this is also expected behavior.
+>
+> Generally, "Average" shows you a smooth chart of your desired metric and reacts well to changes in time granularity. "Max" and "Min" may hide large changes in the metric if the time granularity is large, but can be used with a small time granularity to help pinpoint exact times when large changes occur in the metric.
+>
+> "Count" and "Sum" may be misleading for certain metrics (connected clients included).
+>
+> Hence, we suggest you look at the "Average" metrics and not the "Sum" metrics.
> [!NOTE]
> Even when the cache is idle with no connected active client applications, you may see some cache activity, such as connected clients, memory usage, and operations being performed. This activity is normal during the operation of an Azure Cache for Redis instance.
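As a toy illustration of how these aggregation types relate, consider two data points reported within one 1-minute granularity window (plain Python; the values are made up, not taken from a real cache):

```python
from statistics import mean

# Two hypothetical data points in a single 1-minute granularity window.
data_points = [80, 100]

aggregations = {
    "Count": len(data_points),     # number of data points received
    "Max": max(data_points),       # largest single data point
    "Min": min(data_points),       # smallest single data point
    "Average": mean(data_points),  # usually the most representative view
    "Sum": sum(data_points),       # can mislead for gauges like connected clients
}
print(aggregations)
# {'Count': 2, 'Max': 100, 'Min': 80, 'Average': 90, 'Sum': 180}
```

Note how "Sum" (180) overstates a gauge such as connected clients, while "Average" (90) matches the intuitive reading.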
Activity logs provide insight into the operations that were performed on your Az
To view activity logs for your cache, click **Activity logs** from the **Resource menu**.
-For more information about Activity logs, see [Overview of the Azure Activity Log](../azure-monitor/essentials/platform-logs-overview.md).
+For more information about Activity logs, see [Overview of the Azure Activity Log](../azure-monitor/essentials/platform-logs-overview.md).
azure-cache-for-redis Cache How To Premium Vnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-how-to-premium-vnet.md
Virtual network support is configured on the **New Azure Cache for Redis** pane
1. On the **Networking** tab, select **Virtual Networks** as your connectivity method. To use a new virtual network, create it first by following the steps in [Create a virtual network using the Azure portal](../virtual-network/manage-virtual-network.md#create-a-virtual-network) or [Create a virtual network (classic) by using the Azure portal](/previous-versions/azure/virtual-network/virtual-networks-create-vnet-classic-pportal). Then return to the **New Azure Cache for Redis** pane to create and configure your Premium-tier cache.

   > [!IMPORTANT]
- > When you deploy Azure Cache for Redis to a Resource Manager virtual network, the cache must be in a dedicated subnet that contains no other resources except for Azure Cache for Redis instances. If you attempt to deploy an Azure Cache for Redis instance to a Resource Manager virtual network subnet that contains other resources, the deployment fails.
+ > When you deploy Azure Cache for Redis to a Resource Manager virtual network, the cache must be in a dedicated subnet that contains no other resources except for Azure Cache for Redis instances. If you attempt to deploy an Azure Cache for Redis instance to a Resource Manager virtual network subnet that contains other resources, or has a NAT Gateway assigned, the deployment fails.
> >
The following list contains answers to commonly asked questions about Azure Cach
* [Can I use virtual networks with a standard or basic cache?](#can-i-use-virtual-networks-with-a-standard-or-basic-cache)
* Why does creating an Azure Cache for Redis instance fail in some subnets but not others?
* [What are the subnet address space requirements?](#what-are-the-subnet-address-space-requirements)
+* [Can I connect to my cache from a peered virtual network?](#can-i-connect-to-my-cache-from-a-peered-virtual-network)
* [Do all cache features work when a cache is hosted in a virtual network?](#do-all-cache-features-work-when-a-cache-is-hosted-in-a-virtual-network)

### What are some common misconfiguration issues with Azure Cache for Redis and virtual networks?
Azure reserves some IP addresses within each subnet, and these addresses can't b
In addition to the IP addresses used by the Azure virtual network infrastructure, each Azure Cache for Redis instance in the subnet uses two IP addresses per cluster shard, plus additional IP addresses for additional replicas, if any. One additional IP address is used for the load balancer. A nonclustered cache is considered to have one shard.
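The per-cache IP usage described above can be tallied with a quick sketch (`redis_subnet_ips` is a hypothetical helper for illustration; it counts only the cache's own addresses, not the ones Azure reserves for the virtual network infrastructure):

```python
def redis_subnet_ips(shards, extra_replicas=0):
    """Per the rule above: two IP addresses per cluster shard, one per
    additional replica (if any), plus one for the load balancer.
    A nonclustered cache counts as one shard."""
    return shards * 2 + extra_replicas + 1

print(redis_subnet_ips(shards=4))  # clustered cache with 4 shards -> 9
print(redis_subnet_ips(shards=1))  # nonclustered cache -> 3
```

Sizing the subnet with headroom beyond this count avoids failed deployments when you later add shards or replicas.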
+### Can I connect to my cache from a peered virtual network?
+
+If the virtual networks are in the same region, you can connect them using virtual network peering or a VPN Gateway VNET-to-VNET connection.
+
+If the peered Azure virtual networks are in *different* regions, a client VM in region 1 can't access the cache in region 2 through its load-balanced IP address, because of a constraint with basic load balancers. The exception is a cache with a standard load balancer, which is currently only a cache that was created with *availability zones*. For more information about virtual network peering constraints, see Virtual Network - Peering - Requirements and constraints. One solution is to use a VPN Gateway VNET-to-VNET connection instead of virtual network peering.
### Do all cache features work when a cache is hosted in a virtual network?

When your cache is part of a virtual network, only clients in the virtual network can access the cache. As a result, the following cache management features don't work at this time:
azure-cache-for-redis Cache How To Scale https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-how-to-scale.md
Last updated 02/08/2021
# Scale an Azure Cache for Redis instance

Azure Cache for Redis has different cache offerings, which provide flexibility in the choice of cache size and features. For a Basic, Standard, or Premium cache, you can change its size and tier after it's been created to keep up with your application needs. This article shows you how to scale your cache using the Azure portal, and tools such as Azure PowerShell and Azure CLI.

## When to scale
-You can use the [monitoring](cache-how-to-monitor.md) features of Azure Cache for Redis to monitor the health and performance of your cache and help determine when to scale the cache.
+
+You can use the [monitoring](cache-how-to-monitor.md) features of Azure Cache for Redis to monitor the health and performance of your cache and help determine when to scale the cache.
You can monitor the following metrics to help determine if you need to scale.
You can monitor the following metrics to help determine if you need to scale.
* Network Bandwidth
* CPU Usage
-If you determine that your cache is no longer meeting your application's requirements, you can scale to a larger or smaller cache pricing tier that is right for your application. For more information on determining which cache pricing tier to use, see [Choosing the right tier](cache-overview.md#choosing-the-right-tier).
+If you determine your cache is no longer meeting your application's requirements, you can scale to a larger or smaller cache pricing tier that is right for your application. For more information on determining which cache pricing tier to use, see [Choosing the right tier](cache-overview.md#choosing-the-right-tier).
## Scale a cache
-To scale your cache, [browse to the cache](cache-configure.md#configure-azure-cache-for-redis-settings) in the [Azure portal](https://portal.azure.com) and click **Scale** from the **Resource menu**.
+
+To scale your cache, [browse to the cache](cache-configure.md#configure-azure-cache-for-redis-settings) in the [Azure portal](https://portal.azure.com) and select **Scale** from the **Resource menu**.
![Scale](./media/cache-how-to-scale/redis-cache-scale-menu.png)
-Select the desired pricing tier from the **Select pricing tier** blade and click **Select**.
+On the left, select the pricing tier you want from **Select pricing tier**, and then select **Select**.
![Pricing tier][redis-cache-pricing-tier-blade]

You can scale to a different pricing tier with the following restrictions:

* You can't scale from a higher pricing tier to a lower pricing tier.
* You can't scale from a **Premium** cache down to a **Standard** or a **Basic** cache.
* You can't scale from a **Standard** cache down to a **Basic** cache.
-* You can scale from a **Basic** cache to a **Standard** cache but you can't change the size at the same time. If you need a different size, you can do a subsequent scaling operation to the desired size.
-* You can't scale from a **Basic** cache directly to a **Premium** cache. First, scale from **Basic** to **Standard** in one scaling operation, and then from **Standard** to **Premium** in a subsequent scaling operation.
+* You can scale from a **Basic** cache to a **Standard** cache but you can't change the size at the same time. If you need a different size, you can later do a scaling operation to the wanted size.
+* You can't scale from a **Basic** cache directly to a **Premium** cache. First, scale from **Basic** to **Standard** in one scaling operation, and then from **Standard** to **Premium** in the next scaling operation.
* You can't scale from a larger size down to the **C0 (250 MB)** size. However, you can scale down to any other size within the same pricing tier. For example, you can scale down from C5 Standard to C1 Standard.
-
-While the cache is scaling to the new pricing tier, a **Scaling** status is displayed in the **Azure Cache for Redis** blade.
+
+While the cache is scaling to the new pricing tier, a **Scaling** status is displayed on the left in **Azure Cache for Redis**.
![Scaling][redis-cache-scaling]

When scaling is complete, the status changes from **Scaling** to **Running**.

## How to automate a scaling operation
-In addition to scaling your cache instances in the Azure portal, you can scale using PowerShell cmdlets, Azure CLI, and by using the Microsoft Azure Management Libraries (MAML).
+
+You can scale your cache instances in the Azure portal. You can also scale using PowerShell cmdlets, Azure CLI, or the Microsoft Azure Management Libraries (MAML).
* [Scale using PowerShell](#scale-using-powershell) * [Scale using Azure CLI](#scale-using-azure-cli)
In addition to scaling your cache instances in the Azure portal, you can scale u
[!INCLUDE [updated-for-az](../../includes/updated-for-az.md)]
-You can scale your Azure Cache for Redis instances with PowerShell by using the [Set-AzRedisCache](/powershell/module/az.rediscache/set-azrediscache) cmdlet when the `Size`, `Sku`, or `ShardCount` properties are modified. The following example shows how to scale a cache named `myCache` to a 2.5 GB cache.
+You can scale your Azure Cache for Redis instances with PowerShell by using the [Set-AzRedisCache](/powershell/module/az.rediscache/set-azrediscache) cmdlet when the `Size`, `Sku`, or `ShardCount` properties are modified. The following example shows how to scale a cache named `myCache` to a 2.5-GB cache.
```powershell
Set-AzRedisCache -ResourceGroupName myGroup -Name myCache -Size 2.5GB
You can scale your Azure Cache for Redis instances with PowerShell by using the
For more information on scaling with PowerShell, see [To scale an Azure Cache for Redis using PowerShell](cache-how-to-manage-redis-cache-powershell.md#scale). ### Scale using Azure CLI
-To scale your Azure Cache for Redis instances using Azure CLI, call the `azure rediscache set` command and pass in the desired configuration changes that include a new size, sku, or cluster size, depending on the desired scaling operation.
+
+To scale your Azure Cache for Redis instances using Azure CLI, call the `azure rediscache set` command and pass in the configuration changes you want, such as a new size, sku, or cluster size, depending on the scaling operation.
For more information on scaling with Azure CLI, see [Change settings of an existing Azure Cache for Redis](cache-manage-cli.md#scale).

### Scale using MAML

To scale your Azure Cache for Redis instances using the [Microsoft Azure Management Libraries (MAML)](https://azure.microsoft.com/updates/management-libraries-for-net-release-announcement/), call the `IRedisOperations.CreateOrUpdate` method and pass in the new size for the `RedisProperties.SKU.Capacity`.

```csharp
To scale your Azure Cache for Redis instances using the [Microsoft Azure Managem
For more information, see the [Manage Azure Cache for Redis using MAML](https://github.com/rustd/RedisSamples/tree/master/ManageCacheUsingMAML) sample.

## Scaling FAQ

The following list contains answers to commonly asked questions about Azure Cache for Redis scaling.

* [Can I scale to, from, or within a Premium cache?](#can-i-scale-to-from-or-within-a-premium-cache)
The following list contains answers to commonly asked questions about Azure Cach
* [How can I tell when scaling is complete?](#how-can-i-tell-when-scaling-is-complete)

### Can I scale to, from, or within a Premium cache?

* You can't scale from a **Premium** cache down to a **Basic** or **Standard** pricing tier.
* You can scale from one **Premium** cache pricing tier to another.
-* You can't scale from a **Basic** cache directly to a **Premium** cache. First, scale from **Basic** to **Standard** in one scaling operation, and then from **Standard** to **Premium** in a subsequent scaling operation.
+* You can't scale from a **Basic** cache directly to a **Premium** cache. First, scale from **Basic** to **Standard** in one scaling operation, and then from **Standard** to **Premium** in a later scaling operation.
* If you enabled clustering when you created your **Premium** cache, you can [change the cluster size](cache-how-to-premium-clustering.md#cluster-size). If your cache was created without clustering enabled, you can configure clustering at a later time. For more information, see [How to configure clustering for a Premium Azure Cache for Redis](cache-how-to-premium-clustering.md).

### After scaling, do I have to change my cache name or access keys?

No, your cache name and keys are unchanged during a scaling operation.

### How does scaling work?

* When a **Basic** cache is scaled to a different size, it is shut down and a new cache is provisioned using the new size. During this time, the cache is unavailable and all data in the cache is lost.
* When a **Basic** cache is scaled to a **Standard** cache, a replica cache is provisioned and the data is copied from the primary cache to the replica cache. The cache remains available during the scaling process.
* When a **Standard** cache is scaled to a different size or to a **Premium** cache, one of the replicas is shut down and reprovisioned to the new size and the data transferred over, and then the other replica performs a failover before it is reprovisioned, similar to the process that occurs during a failure of one of the cache nodes.

### Will I lose data from my cache during scaling?

* When a **Basic** cache is scaled to a new size, all data is lost and the cache is unavailable during the scaling operation.
* When a **Basic** cache is scaled to a **Standard** cache, the data in the cache is typically preserved.
-* When a **Standard** cache is scaled to a larger size or tier, or a **Premium** cache is scaled to a larger size, all data is typically preserved. When scaling a **Standard** or **Premium** cache down to a smaller size, data may be lost depending on how much data is in the cache related to the new size when it is scaled. If data is lost when scaling down, keys are evicted using the [allkeys-lru](https://redis.io/topics/lru-cache) eviction policy.
+* When a **Standard** cache is scaled to a larger size or tier, or a **Premium** cache is scaled to a larger size, all data is typically preserved. When scaling down a Standard or Premium cache to a smaller size, data may be lost depending on how much data is in the cache related to the new size when it is scaled. If data is lost when scaling down, keys are evicted using the [allkeys-lru](https://redis.io/topics/lru-cache) eviction policy.
### Is my custom databases setting affected during scaling?

If you configured a custom value for the `databases` setting during cache creation, keep in mind that some pricing tiers have different [databases limits](cache-configure.md#databases). Here are some considerations when scaling in this scenario:

* When scaling to a pricing tier with a lower `databases` limit than the current tier:
- * If you are using the default number of `databases`, which is 16 for all pricing tiers, no data is lost.
- * If you are using a custom number of `databases` that falls within the limits for the tier to which you are scaling, this `databases` setting is retained and no data is lost.
- * If you are using a custom number of `databases` that exceeds the limits of the new tier, the `databases` setting is lowered to the limits of the new tier and all data in the removed databases is lost.
-* When scaling to a pricing tier with the same or higher `databases` limit than the current tier, your `databases` setting is retained and no data is lost.
+ * If you're using the default number of `databases`, which is 16 for all pricing tiers, no data is lost.
+ * If you're using a custom number of `databases` that falls within the limits for the tier to which you're scaling, this `databases` setting is kept and no data is lost.
+ * If you're using a custom number of `databases` that exceeds the limits of the new tier, the `databases` setting is lowered to the limits of the new tier and all data in the removed databases is lost.
+* When scaling to a pricing tier with the same or higher `databases` limit than the current tier, your `databases` setting is kept and no data is lost.
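The effect on a custom `databases` setting can be sketched in a few lines (`databases_after_scaling` is a hypothetical helper that mirrors the rules above, not part of any SDK):

```python
def databases_after_scaling(current_setting, new_tier_limit):
    """Per the rules above: the setting is kept if it fits within the new
    tier's limit; otherwise it is lowered to that limit, and data in the
    removed databases is lost. Illustrative helper only."""
    new_setting = min(current_setting, new_tier_limit)
    data_lost = current_setting > new_tier_limit
    return new_setting, data_lost

print(databases_after_scaling(16, 64))  # default setting, higher limit -> (16, False)
print(databases_after_scaling(64, 16))  # custom setting exceeds new limit -> (16, True)
```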
-While Standard and Premium caches have a 99.9% SLA for availability, there is no SLA for data loss.
+While Standard and Premium caches have a 99.9% SLA for availability, there's no SLA for data loss.
### Will my cache be available during scaling?
-* **Standard** and **Premium** caches remain available during the scaling operation. However, connection blips can occur while scaling Standard and Premium caches, and also while scaling from Basic to Standard caches. These connection blips are expected to be small and redis clients should be able to re-establish their connection instantly.
-* **Basic** caches are offline during scaling operations to a different size. Basic caches remain available when scaling from **Basic** to **Standard** but, may experience a small connection blip. If a connection blip occurs, redis clients should be able to re-establish their connection instantly.
+* **Standard** and **Premium** caches remain available during the scaling operation. However, connection blips can occur while scaling Standard and Premium caches, and also while scaling from Basic to Standard caches. These connection blips are expected to be small and redis clients can generally re-establish their connection instantly.
+* **Basic** caches are offline during scaling operations to a different size. Basic caches remain available when scaling from **Basic** to **Standard** but, may experience a small connection blip. If a connection blip occurs, redis clients can generally re-establish their connection instantly.
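A client can ride out these connection blips with simple retry logic. The sketch below is generic Python with a stand-in function rather than a real Redis client; `call_with_retry` is a hypothetical wrapper standing in for whatever reconnect support your Redis client library provides:

```python
import time

def call_with_retry(operation, attempts=3, delay=0.01):
    """Retry an operation across transient connection blips, pausing
    briefly before re-establishing the connection. Illustrative only;
    not Azure- or client-library-specific."""
    last_error = None
    for _ in range(attempts):
        try:
            return operation()
        except ConnectionError as err:
            last_error = err
            time.sleep(delay)  # brief pause before retrying
    raise last_error

# Simulate a client call that fails once during a blip, then recovers.
state = {"calls": 0}
def flaky_get():
    state["calls"] += 1
    if state["calls"] == 1:
        raise ConnectionError("connection reset during scaling")
    return "cached-value"

print(call_with_retry(flaky_get))  # "cached-value" after one retry
```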
### Scaling limitations with Geo-replication
-Once you have added a Geo-replication link between two caches, you will no longer be able to initiate a scaling operation or change the number of shards in a cluster. You must unlink the cache to issue these commands. For more information, see [Configure Geo-replication](cache-how-to-geo-replication.md).
-
+Once you have added a Geo-replication link between two caches, you can no longer start a scaling operation or change the number of shards in a cluster. You must unlink the cache to issue these commands. For more information, see [Configure Geo-replication](cache-how-to-geo-replication.md).
### Operations that are not supported

* You can't scale from a higher pricing tier to a lower pricing tier.
* You can't scale from a **Premium** cache down to a **Standard** or a **Basic** cache.
* You can't scale from a **Standard** cache down to a **Basic** cache.
-* You can scale from a **Basic** cache to a **Standard** cache but you can't change the size at the same time. If you need a different size, you can do a subsequent scaling operation to the desired size.
-* You can't scale from a **Basic** cache directly to a **Premium** cache. First scale from **Basic** to **Standard** in one scaling operation, and then scale from **Standard** to **Premium** in a subsequent operation.
+* You can scale from a **Basic** cache to a **Standard** cache but you can't change the size at the same time. If you need a different size, you can do a scaling operation to the size you want at a later time.
+* You can't scale from a **Basic** cache directly to a **Premium** cache. First scale from **Basic** to **Standard** in one scaling operation, and then scale from **Standard** to **Premium** in a later operation.
* You can't scale from a larger size down to the **C0 (250 MB)** size.

If a scaling operation fails, the service tries to revert the operation, and the cache will revert to the original size.

### How long does scaling take?

Scaling time depends on how much data is in the cache, with larger amounts of data taking a longer time to complete. Scaling takes approximately 20 minutes. For clustered caches, scaling takes approximately 20 minutes per shard.

### How can I tell when scaling is complete?

In the Azure portal, you can see the scaling operation in progress. When scaling is complete, the status of the cache changes to **Running**.

<!-- IMAGES -->
azure-cache-for-redis Cache Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-overview.md
Last updated 02/08/2021
# About Azure Cache for Redis
-Azure Cache for Redis provides an in-memory data store based on the [Redis](https://redis.io/) software. Redis improves the performance and scalability of an application that uses on backend data stores heavily. It is able to process large volumes of application request by keeping frequently accessed data in the server memory that can be written to and read from quickly. Redis brings a critical low-latency and high-throughput data storage solution to modern applications.
-Azure Cache for Redis offers both the Redis open-source (OSS Redis) and a commercial product from Redis Labs (Redis Enterprise) as a managed service. It provides secure and dedicated Redis server instances and full Redis API compatibility. The service is operated by Microsoft, hosted on Azure, and accessible to any application within or outside of Azure.
+Azure Cache for Redis provides an in-memory data store based on the [Redis](https://redis.io/) software. Redis improves the performance and scalability of an application that uses backend data stores heavily. It's able to process large volumes of application requests by keeping frequently accessed data in the server memory, which can be written to and read from quickly. Redis brings a critical low-latency and high-throughput data storage solution to modern applications.
-Azure Cache for Redis can be used as a distributed data or content cache, a session store, a message broker, and more. It can be deployed as a standalone or along side with other Azure database services, such as Azure SQL or Cosmos DB.
+Azure Cache for Redis offers both the Redis open-source (OSS Redis) and a commercial product from Redis Labs (Redis Enterprise) as a managed service. It provides secure and dedicated Redis server instances and full Redis API compatibility. The service is operated by Microsoft, hosted on Azure, and usable by any application within or outside of Azure.
+
+Azure Cache for Redis can be used as a distributed data or content cache, a session store, a message broker, and more. It can be deployed standalone, or along with other Azure database services, such as Azure SQL or Cosmos DB.
## Key scenarios
-Azure Cache for Redis improves application performance by supporting common application architecture patterns. Some of the most common include the following:
+
+Azure Cache for Redis improves application performance by supporting common application architecture patterns. Some of the most common include the following patterns:
| Pattern | Description |
| --- | --- |
-| [Data cache](cache-web-app-cache-aside-leaderboard.md) | Databases are often too large to load directly into a cache. It is common to use the [cache-aside](/azure/architecture/patterns/cache-aside) pattern to load data into the cache only as needed. When the system makes changes to the data, the system can also update the cache, which is then distributed to other clients. Additionally, the system can set an expiration on data, or use an eviction policy to trigger data updates into the cache.|
+| [Data cache](cache-web-app-cache-aside-leaderboard.md) | Databases are often too large to load directly into a cache. It's common to use the [cache-aside](/azure/architecture/patterns/cache-aside) pattern to load data into the cache only as needed. When the system makes changes to the data, the system can also update the cache, which is then distributed to other clients. Additionally, the system can set an expiration on data, or use an eviction policy to trigger data updates into the cache.|
| [Content cache](cache-aspnet-output-cache-provider.md) | Many web pages are generated from templates that use static content such as headers, footers, banners. These static items shouldn't change often. Using an in-memory cache provides quick access to static content compared to backend datastores. This pattern reduces processing time and server load, allowing web servers to be more responsive. It can allow you to reduce the number of servers needed to handle loads. Azure Cache for Redis provides the Redis Output Cache Provider to support this pattern with ASP.NET.|
-| [Session store](cache-aspnet-session-state-provider.md) | This pattern is commonly used with shopping carts and other user history data that a web application may want to associate with user cookies. Storing too much in a cookie can have a negative impact on performance as the cookie size grows and is passed and validated with every request. A typical solution uses the cookie as a key to query the data in a database. Using an in-memory cache, like Azure Cache for Redis, to associate information with a user is much faster than interacting with a full relational database. |
+| [Session store](cache-aspnet-session-state-provider.md) | This pattern is commonly used with shopping carts and other user history data that a web application might associate with user cookies. Storing too much in a cookie can have a negative effect on performance as the cookie size grows and is passed and validated with every request. A typical solution uses the cookie as a key to query the data in a database. Using an in-memory cache, like Azure Cache for Redis, to associate information with a user is much faster than interacting with a full relational database. |
| Job and message queuing | Applications often add tasks to a queue when the operations associated with the request take time to execute. Longer running operations are queued to be processed in sequence, often by another server. This method of deferring work is called task queuing. Azure Cache for Redis provides a distributed queue to enable this pattern in your application.|
| Distributed transactions | Applications sometimes require a series of commands against a backend data-store to execute as a single atomic operation. All commands must succeed, or all must be rolled back to the initial state. Azure Cache for Redis supports executing a batch of commands as a single [transaction](https://redis.io/topics/transactions). |

## Redis versions
-Azure Cache for Redis supports OSS Redis version 4.x and, as a preview, 6.0. We've made the decision to skip Redis 5.0 to bring you the latest version. Previously, Azure Cache for Redis only maintained a single Redis version. It will provide a newer major release upgrade and at least one older stable version going forward. You can [choose which version](cache-how-to-version.md) works the best for your application.
+Azure Cache for Redis supports OSS Redis version 4.x and, as a preview, 6.0. We've made the decision to skip Redis 5.0 to bring you the latest version. Previously, Azure Cache for Redis maintained a single Redis version. In the future, it will provide a newer major release upgrade and at least one older stable version. You can [choose which version](cache-how-to-version.md) works the best for your application.
## Service tiers
-Azure Cache for Redis is available in the following tiers:
+
+Azure Cache for Redis is available in these tiers:
| Tier | Description |
| --- | --- |
| Basic | An OSS Redis cache running on a single VM. This tier has no service-level agreement (SLA) and is ideal for development/test and non-critical workloads. |
| Standard | An OSS Redis cache running on two VMs in a replicated configuration. |
-| Premium | High-performance OSS Redis caches. This tier offers higher throughput, lower latency, better availability, and more features. Premium caches are deployed on more powerful VMs compared to those for Basic or Standard caches. |
-| Enterprise | High-performance caches powered by Redis Labs' Redis Enterprise software. This tier supports Redis modules including RediSearch, RedisBloom, and RedisTimeSeries. In addition, it offers even higher availability than the Premium tier. |
+| Premium | High-performance OSS Redis caches. This tier offers higher throughput, lower latency, better availability, and more features. Premium caches are deployed on more powerful VMs compared to the VMs for Basic or Standard caches. |
+| Enterprise | High-performance caches powered by Redis Labs' Redis Enterprise software. This tier supports Redis modules including RediSearch, RedisBloom, and RedisTimeSeries. Also, it offers even higher availability than the Premium tier. |
| Enterprise Flash | Cost-effective large caches powered by Redis Labs' Redis Enterprise software. This tier extends Redis data storage to non-volatile memory, which is cheaper than DRAM, on a VM. It reduces the overall per-GB memory cost. |

### Feature comparison

The [Azure Cache for Redis Pricing](https://azure.microsoft.com/pricing/details/cache/) provides a detailed comparison of each tier. The following table helps describe some of the features supported by tier:

| Feature Description | Basic | Standard | Premium | Enterprise | Enterprise Flash |
The [Azure Cache for Redis Pricing](https://azure.microsoft.com/pricing/details/
| [Scheduled updates](cache-administration.md#schedule-updates) |✔|✔|✔|-|-|

### Choosing the right tier
-You should consider the following when choosing an Azure Cache for Redis tier:
+
+Consider the following options when choosing an Azure Cache for Redis tier:
* **Memory**: The Basic and Standard tiers offer 250 MB ΓÇô 53 GB; the Premium tier 6 GB - 1.2 TB; the Enterprise tiers 12 GB - 14 TB. To create a Premium tier cache larger than 120 GB, you can use Redis OSS clustering. For more information, see [Azure Cache for Redis Pricing](https://azure.microsoft.com/pricing/details/cache/). For more information, see [How to configure clustering for a Premium Azure Cache for Redis](cache-how-to-premium-clustering.md). * **Performance**: Caches in the Premium and Enterprise tiers are deployed on hardware that has faster processors, giving better performance compared to the Basic or Standard tier. Premium tier Caches have higher throughput and lower latencies. For more information, see [Azure Cache for Redis performance](cache-planning-faq.md#azure-cache-for-redis-performance).
-* **Dedicated core for Redis server**: All caches except C0 run dedicated VM cores. Redis, by design, uses only one thread for command processing. Azure Cache for Redis utilizes additional cores for I/O processing. Having more cores improves throughput performance even though it may not produce linear scaling. Furthermore, larger VM sizes typically come with higher bandwidth limits than smaller ones. That helps you avoid network saturation, which will cause timeouts in your application.
+* **Dedicated core for Redis server**: All caches except C0 run dedicated VM cores. Redis, by design, uses only one thread for command processing. Azure Cache for Redis uses other cores for I/O processing. Having more cores improves throughput performance even though it may not produce linear scaling. Furthermore, larger VM sizes typically come with higher bandwidth limits than smaller ones. That helps you avoid network saturation, which can cause timeouts in your application.
* **Network performance**: If you have a workload that requires high throughput, the Premium or Enterprise tier offers more bandwidth compared to Basic or Standard. Also within each tier, larger size caches have more bandwidth because of the underlying VM that hosts the cache. For more information, see [Azure Cache for Redis performance](cache-planning-faq.md#azure-cache-for-redis-performance).
-* **Maximum number of client connections**: The Premium and Enterprise tiers offer the maximum numbers of clients that can connect to Redis, with higher numbers of connections for larger sized caches. Clustering increases the total amount of network bandwidth available for a clustered cache.
-* **High availability**: Azure Cache for Redis provides multiple [high availability](cache-high-availability.md) options. It guarantees that a Standard, Premium, or Enterprise cache is available according to our [SLA](https://azure.microsoft.com/support/legal/sla/cache/v1_0/). The SLA only covers connectivity to the cache endpoints. The SLA does not cover protection from data loss. We recommend using the Redis data persistence feature in the Premium and Enterprise tiers to increase resiliency against data loss.
+* **Maximum number of client connections**: The Premium and Enterprise tiers support the highest numbers of clients that can connect to Redis, with more connections allowed for larger sized caches. Clustering increases the total amount of network bandwidth available for a clustered cache.
+* **High availability**: Azure Cache for Redis provides multiple [high availability](cache-high-availability.md) options. It guarantees that a Standard, Premium, or Enterprise cache is available according to our [SLA](https://azure.microsoft.com/support/legal/sla/cache/v1_0/). The SLA only covers connectivity to the cache endpoints. The SLA doesn't cover protection from data loss. We recommend using the Redis data persistence feature in the Premium and Enterprise tiers to increase resiliency against data loss.
* **Data persistence**: The Premium and Enterprise tiers allow you to persist the cache data to an Azure Storage account and a Managed Disk respectively. Underlying infrastructure issues might result in potential data loss. We recommend using the Redis data persistence feature in these tiers to increase resiliency against data loss. Azure Cache for Redis offers both RDB and AOF (preview) options. Data persistence can be enabled through Azure portal and CLI. For the Premium tier, see [How to configure persistence for a Premium Azure Cache for Redis](cache-how-to-premium-persistence.md).
* **Network isolation**: Azure Private Link and Virtual Network (VNET) deployments provide enhanced security and traffic isolation for your Azure Cache for Redis. VNET allows you to further restrict access through network access control policies. For more information, see [Azure Cache for Redis with Azure Private Link](cache-private-link.md) and [How to configure Virtual Network support for a Premium Azure Cache for Redis](cache-how-to-premium-vnet.md).
* **Redis Modules**: Enterprise tiers support [RediSearch](https://docs.redislabs.com/latest/modules/redisearch/), [RedisBloom](https://docs.redislabs.com/latest/modules/redisbloom/), and [RedisTimeSeries](https://docs.redislabs.com/latest/modules/redistimeseries/). These modules add new data types and functionality to Redis.
-You can scale your cache from the Basic tier up to Premium after it has been created. Scaling down to a lower tier is not supported currently. For step-by-step scaling instructions, see [How to Scale Azure Cache for Redis](cache-how-to-scale.md) and [How to automate a scaling operation](cache-how-to-scale.md#how-to-automate-a-scaling-operation).
+You can scale your cache from the Basic tier up to Premium after it has been created. Scaling down to a lower tier isn't supported currently. For step-by-step scaling instructions, see [How to Scale Azure Cache for Redis](cache-how-to-scale.md) and [How to automate a scaling operation](cache-how-to-scale.md#how-to-automate-a-scaling-operation).
### Special considerations for Enterprise tiers
-The Enterprise tiers rely on Redis Enterprise, a commercial variant of Redis from Redis Labs. Customers will obtain and pay for a license to this software through an Azure Marketplace offer. Azure Cache for Redis will facilitate the license acquisition so that you won't have to do it separately. To purchase in the Azure Marketplace, you must have the following prerequisites:
-* Your Azure subscription has a valid payment instrument. Azure credits or free MSDN subscriptions are not supported.
+The Enterprise tiers rely on Redis Enterprise, a commercial variant of Redis from Redis Labs. Customers obtain and pay for a license to this software through an Azure Marketplace offer. Azure Cache for Redis manages the license acquisition so that you won't have to do it separately. To purchase in the Azure Marketplace, you must have the following prerequisites:
+* Your Azure subscription has a valid payment instrument. Azure credits or free MSDN subscriptions aren't supported.
* Your organization allows [Azure Marketplace purchases](../cost-management-billing/manage/ea-azure-marketplace.md#enabling-azure-marketplace-purchases).
* If you use a private Marketplace, it must contain the Redis Labs Enterprise offer.

> [!IMPORTANT]
-> Azure Cache for Redis Enterprise cache require standard network Load Balancers that are charged
-> separately from cache instances themselves. Refer to the [Load Balancer pricing](https://azure.microsoft.com/pricing/details/load-balancer/)
-> for more details. If an Enterprise cache is configured for multiple Availability Zones, data
+> Azure Cache for Redis Enterprise requires standard network Load Balancers that are charged
+> separately from cache instances themselves. For more information, see [Load Balancer pricing](https://azure.microsoft.com/pricing/details/load-balancer/).
+> If an Enterprise cache is configured for multiple Availability Zones, data
> transfer will be billed at the [standard network bandwidth rates](https://azure.microsoft.com/pricing/details/bandwidth/)
> starting from July 1, 2021.
>
The Enterprise tiers rely on Redis Enterprise, a commercial variant of Redis fro
>

## Next steps

* [Create an open-source Redis cache](quickstart-create-redis.md)
* [Create a Redis Enterprise cache](quickstart-create-redis-enterprise.md)
* [Use Azure Cache for Redis in an ASP.NET web app](cache-web-app-howto.md)
azure-cache-for-redis Cache Reserved Pricing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-reserved-pricing.md
Last updated 02/20/2020
# Prepay for Azure Cache for Redis compute resources with reserved capacity
-Azure Cache for Redis now helps you save money by prepaying for compute resources compared to pay-as-you-go prices. With Azure Cache for Redis reserved capacity, you make an upfront commitment on cache for a one or three year period to get a significant discount on the compute costs. To purchase Azure Cache for Redis reserved capacity, you need to specify the Azure region, service tier, and term.
+Azure Cache for Redis now helps you save money, compared to pay-as-you-go prices, by prepaying for compute resources. With Azure Cache for Redis reserved capacity, you make an upfront commitment on cache for one or three years to get a significant discount on the compute costs. To purchase Azure Cache for Redis reserved capacity, you need to specify the Azure region, service tier, and term.
-You do not need to assign the reservation to specific Azure Cache for Redis instances. An already running Azure Cache for Redis or ones that are newly deployed will automatically get the benefit of reserved pricing, up to the reserved cache size. By purchasing a reservation, you are pre-paying for the compute costs for a period of one or three years. As soon as you buy a reservation, the Azure Cache for Redis compute charges that match the reservation attributes are no longer charged at the pay-as-you go rates. A reservation does not cover networking or storage charges associated with the cache. At the end of the reservation term, the billing benefit expires and the Azure Cache for Redis is billed at the pay-as-you go price. Reservations do not auto-renew. For pricing information, see the [Azure Cache for Redis reserved capacity offering](https://azure.microsoft.com/pricing/details/cache).
+You do not need to assign the reservation to specific Azure Cache for Redis instances. Already running Azure Cache for Redis instances, or newly deployed ones, automatically get the benefit of reserved pricing, up to the reserved cache size. By purchasing a reservation, you are pre-paying for the compute costs for one or three years. As soon as you buy a reservation, the Azure Cache for Redis compute charges that match the reservation attributes are no longer charged at the pay-as-you-go rates. A reservation does not cover networking or storage charges associated with the cache. At the end of the reservation term, the billing benefit expires and the Azure Cache for Redis is billed at the pay-as-you-go price. Reservations do not autorenew. For pricing information, see the [Azure Cache for Redis reserved capacity offering](https://azure.microsoft.com/pricing/details/cache).
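The reserved-capacity trade-off described above (pay up front, get a discount on compute for the term) can be sketched with a small calculation. The rates below are hypothetical placeholders, not real Azure prices:

```python
# Sketch of the reserved-capacity trade-off. The rates are HYPOTHETICAL;
# real prices are published on the Azure pricing page.

HOURS_PER_YEAR = 8760

def compare_costs(payg_hourly_rate, reserved_upfront, term_years):
    """Return (pay-as-you-go total, savings from the reservation) for the term."""
    payg_total = payg_hourly_rate * HOURS_PER_YEAR * term_years
    return payg_total, payg_total - reserved_upfront

# Hypothetical: $0.50/hour pay-as-you-go vs. $3,600 upfront for one year.
payg, saved = compare_costs(0.50, 3600, 1)
print(f"pay-as-you-go: ${payg:,.2f}, reservation saves: ${saved:,.2f}")
# pay-as-you-go: $4,380.00, reservation saves: $780.00
```

Note that this only covers compute; networking and storage charges stay at their usual rates either way.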
You can buy Azure Cache for Redis reserved capacity in the [Azure portal](https://portal.azure.com/). To buy the reserved capacity:
For the details on how enterprise customers and Pay-As-You-Go customers are char
## Determine the right cache size before purchase
-The size of reservation should be based on the total amount of memory size used by the existing or soon-to-be-deployed cache within a specific region and using the same service tier.
+The size of the reservation should be based on the total memory used by the existing or soon-to-be-deployed caches within a specific region that use the same service tier.
-For example, let's suppose that you are running two caches - one at 13 GB and the other at 26 GB. You'll need both for at least one year. Further, let's suppose that you plan to scale the existing 13 GB caches to 26 GB for a month to meet your seasonal demand, and then scale back. In this case, you can purchase either 1 P2 cache and 1 P3 cache or 3 P2 caches on a one-year reservation to maximize savings. You'll receive discount on the total amount of cache memory you reserve, independent of how that amount is allocated across your caches.
+For example, let's suppose that you're running two caches - one at 13 GB and the other at 26 GB. You'll need both for at least one year. Further, let's suppose that you plan to scale the existing 13-GB cache to 26 GB for a month to meet your seasonal demand, and then scale back. In this case, you can purchase either one P2 cache and one P3 cache, or three P2 caches, on a one-year reservation to maximize savings. You'll receive a discount on the total amount of cache memory you reserve, independent of how that amount is allocated across your caches.
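The sizing arithmetic in this example can be sketched as follows. The Premium cache sizes used here (P1 = 6 GB through P4 = 53 GB) are assumptions taken from the published pricing page:

```python
# Assumed Premium tier cache sizes in GB (per the Azure pricing page).
PREMIUM_SIZES_GB = {"P1": 6, "P2": 13, "P3": 26, "P4": 53}

def reserved_capacity_gb(quantities):
    """Total reserved memory for a mapping like {"P2": 3} or {"P2": 1, "P3": 1}."""
    return sum(PREMIUM_SIZES_GB[sku] * count for sku, count in quantities.items())

# Both reservations cover the 13 GB + 26 GB = 39 GB used in the example:
print(reserved_capacity_gb({"P2": 1, "P3": 1}))  # 39
print(reserved_capacity_gb({"P2": 3}))           # 39
```

Because the discount applies to the total reserved memory rather than to specific caches, either combination gives the same benefit.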
## Buy Azure Cache for Redis reserved capacity
You can buy a reserved VM instance in the [Azure portal](https://portal.azure.co
1. Sign in to the [Azure portal](https://portal.azure.com/).
2. Select **All services** > **Reservations**.
3. Select **Add** and then in the Purchase reservations pane, select **Azure Cache for Redis** to purchase a new reservation for your caches.
-4. Fill-in the required fields. Existing or new databases that match the attributes you select qualify to get the reserved capacity discount. The actual number of your Azure Cache for Redis instances that get the discount depend on the scope and quantity selected.
+4. Fill in the required fields. Existing or new databases that match the attributes you select qualify to get the reserved capacity discount. The actual number of your Azure Cache for Redis instances that get the discount depends on the scope and quantity selected.
![Overview of reserved pricing](media/cache-reserved-pricing/cache-reserved-price.png)
The following table describes required fields.
| Region | The Azure region that's covered by the Azure Cache for Redis reserved capacity reservation. |
| Pricing tier | The service tier for the Azure Cache for Redis servers. |
| Term | One year or three years |
-| Quantity | The amount of compute resources being purchased within the Azure Cache for Redis reserved capacity reservation. The quantity is a number of caches in the selected Azure region and service tier that are being reserved and will get the billing discount. For example, if you are running or planning to run an Azure Cache for Redis servers with the total cache capacity of 26 GB in the East US region, then you would specify quantity that gives you the equivalent of 26 GB to maximize the benefit for all caches. This could be 1 P3 or 2 P2 caches.
+| Quantity | The amount of compute resources being purchased within the Azure Cache for Redis reserved capacity reservation. The quantity is a number of caches in the selected Azure region and service tier that are being reserved and will get the billing discount. For example, if you are running or planning to run Azure Cache for Redis servers with a total cache capacity of 26 GB in the East US region, then you would specify a quantity that gives you the equivalent of 26 GB to maximize the benefit for all caches. The quantity could be one P3 cache or two P2 caches. |
## Cancel, exchange, or refund reservations
azure-functions Dotnet Isolated Process Developer Howtos https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/dotnet-isolated-process-developer-howtos.md
Title: Develop and publish .NET 5 functions using Azure Functions description: Learn how to create and debug C# functions using .NET 5.0, then deploy the local project to serverless hosting in Azure Functions. Previously updated : 03/03/2021 Last updated : 05/03/2021 recommendations: false #Customer intent: As a developer, I need to know how to create functions that run in an isolated process so that I can run my function code on current (not LTS) releases of .NET.
This article shows you how to work with C# functions using .NET 5.0, which run o
If you don't need to support .NET 5.0 or run your functions out-of-process, you might want to instead [create a C# class library function](functions-create-your-first-function-visual-studio.md).
->[!NOTE]
->Developing .NET isolated process functions in the Azure portal isn't currently supported. You must use either the Azure CLI or Visual Studio Code publishing to create a function app in Azure that supports running .NET 5.0 apps out-of-process.
## Prerequisites

+ An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
+ [.NET 5.0 SDK](https://dotnet.microsoft.com/download)
+ [Azure Functions Core Tools](functions-run-local.md#v2) version 3.0.3381, or a later version.
+ [Azure CLI](/cli/azure/install-azure-cli) version 2.20, or a later version.
+ [Visual Studio Code](https://code.visualstudio.com/) on one of the [supported platforms](https://code.visualstudio.com/docs/supporting/requirements#_platforms).
+ The [C# extension](https://marketplace.visualstudio.com/items?itemName=ms-dotnettools.csharp) for Visual Studio Code.
+ The [Azure Functions extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.vscode-azurefunctions) for Visual Studio Code, version 1.3.0 or newer.
-+ [Visual Studio 2019](https://azure.microsoft.com/downloads/), including the **Azure development** workload.
-.NET isolated function project templates and publishing aren't currently available in Visual Studio.
++ [Visual Studio 2019](https://azure.microsoft.com/downloads/) version 16.10 or later. Your install must include either the **Azure development** or the **ASP.NET and web development** workload.

## Create a local function project

In Azure Functions, a function project is a container for one or more individual functions that each responds to a specific trigger. All functions in a project share the same local and hosting configurations. In this section, you create a function project that contains a single function.
-
->[!NOTE]
-> At this time, there are no Visual Studio project templates that support creating .NET isolated function projects. This article shows you how to use Core Tools to create your C# project, which you can then run locally and debug in Visual Studio.
-
-
::: zone pivot="development-environment-vscode"

1. Choose the Azure icon in the Activity bar, then in the **Azure: Functions** area, select the **Create new project...** icon.
In Azure Functions, a function project is a container for one or more individual
1. Using this information, Visual Studio Code generates an Azure Functions project with an HTTP trigger. You can view the local project files in the Explorer. To learn more about files that are created, see [Generated project files](functions-develop-vs-code.md#generated-project-files).

::: zone-end

1. Run the `func init` command, as follows, to create a functions project in a folder named *LocalFunctionProj*:
In Azure Functions, a function project is a container for one or more individual
`func new` creates an HttpExample.cs code file.

::: zone-end
+1. From the Visual Studio menu, select **File** > **New** > **Project**.
+1. In **Create a new project**, enter *functions* in the search box, choose the **Azure Functions** template, and then select **Next**.
+1. In **Configure your new project**, enter a **Project name** for your project, and then select **Create**. The function app name must be valid as a C# namespace, so don't use underscores, hyphens, or any other nonalphanumeric characters.
+1. For the **Create a new Azure Functions application** settings, use the values in the following table:
+ | Setting | Value | Description |
+ | | - |-- |
+ | **.NET version** | **.NET 5 (Isolated)** | This value creates a function project that runs on .NET 5.0 in an isolated process. |
+ | **Function template** | **HTTP trigger** | This value creates a function triggered by an HTTP request. |
+ | **Storage account (AzureWebJobsStorage)** | **Storage emulator** | Because a function app in Azure requires a storage account, one is assigned or created when you publish your project to Azure. An HTTP trigger doesn't use an Azure Storage account connection string; all other trigger types require a valid Azure Storage account connection string. |
+ | **Authorization level** | **Anonymous** | The created function can be triggered by any client without providing a key. This authorization setting makes it easy to test your new function. For more information about keys and authorization, see [Authorization keys](functions-bindings-http-webhook-trigger.md#authorization-keys) and [HTTP and webhook bindings](functions-bindings-http-webhook.md). |
+
+
+ ![Azure Functions project settings](./media/dotnet-isolated-process-developer-howtos/functions-project-settings.png)
-## Run the function locally
+ Make sure you set the **Authorization level** to **Anonymous**. If you choose the default level of **Function**, you're required to present the [function key](functions-bindings-http-webhook-trigger.md#authorization-keys) in requests to access your function endpoint.
-At this point, you can run the `func start` command from the root of your project folder to compile and run the C# isolated functions project. Currently, if you want to debug your out-of-process function code in Visual Studio, you need to manually attach a debugger to the running Functions runtime process by using the following steps:
+1. Select **Create** to create the function project and HTTP trigger function.
-1. Open the project file (.csproj) in Visual Studio. You can review and modify your project code and set any desired break points in the code.
+Visual Studio creates a project and class that contains boilerplate code for the HTTP trigger function type. The boilerplate code sends a "Welcome to Azure Functions!" HTTP response. The `HttpTrigger` attribute specifies that the function is triggered by an HTTP request.
-1. From the root project folder, use the following command from the terminal or a command prompt to start the runtime host:
+## Rename the function
- ```console
- func start --dotnet-isolated-debug
- ```
+The `FunctionName` method attribute sets the name of the function, which by default is generated as `Function1`. Since the tooling doesn't let you override the default function name when you create your project, take a minute to create a better name for the function class, file, and metadata.
- The `--dotnet-isolated-debug` option tells the process to wait for a debugger to attach before continuing. Towards the end of the output, you should see something like the following lines:
-
- <pre>
- ...
-
- Functions:
+1. In **File Explorer**, right-click the Function1.cs file and rename it to `HttpExample.cs`.
- HttpExample: [GET,POST] http://localhost:7071/api/HttpExample
+1. In the code, rename the Function1 class to `HttpExample`.
- For detailed output, run func with --verbose flag.
- [2021-03-09T08:41:41.904Z] Azure Functions .NET Worker (PID: 81720) initialized in debug mode. Waiting for debugger to attach...
- ...
-
- </pre>
+1. In the `HttpTrigger` method named `Run`, change the value of the `FunctionName` method attribute to `HttpExample`, and also update the value passed to the `GetLogger` method.
+
+Your function definition should now look like the following code:
- The `PID: XXXXXX` indicates the process ID (PID) of the dotnet.exe process that is the running Functions host.
-1. In the Azure Functions runtime output, make a note of the process ID of the host process, to which you'll attach a debugger. Also note the URL of your local function.
+Now that you've renamed the function, you can test it on your local computer.
-1. From the **Debug** menu in Visual Studio, select **Attach to Process...**, locate the process that matches the process ID, and select **Attach**.
-
- :::image type="content" source="media/dotnet-isolated-process-developer-howtos/attach-to-process.png" alt-text="Attach the debugger to the Functions host process":::
+## Run the function locally
- With the debugger attached you can debug your function code as normal.
+Visual Studio integrates with Azure Functions Core Tools so that you can test your functions locally using the full Azure Functions runtime.
-1. Into your browser's address bar, type your local function URL, which looks like the following, and run the request.
+1. To run your function, press <kbd>F5</kbd> in Visual Studio. You might need to enable a firewall exception so that the tools can handle HTTP requests. Authorization levels are never enforced when you run a function locally.
- `http://localhost:7071/api/HttpExample`
+1. Copy the URL of your function from the Azure Functions runtime output and run the request. A "Welcome to Azure Functions!" message is displayed when the function runs successfully, and logs are written to the runtime output.
- You should see trace output from the request written to the running terminal. Code execution stops at any break points you set in your function code.
+1. To stop debugging, press <kbd>Shift</kbd>+<kbd>F5</kbd> in Visual Studio.
-1. When you're done, go to the terminal and press Ctrl + C to stop the host process.
-
After you've verified that the function runs correctly on your local computer, it's time to publish the project to Azure.
-> [!NOTE]
-> Visual Studio publishing isn't currently available for .NET isolated process apps. After you've finished developing your project in Visual Studio, you must use the Azure CLI to create the remote Azure resources. Then, you can again use Azure Functions Core Tools from the command line to publish your project to Azure.
::: zone-end

## Create supporting Azure resources for your function

Before you can deploy your function code to Azure, you need to create three resources:
Use the following commands to create these items.
::: zone-end +
+## Publish the project to Azure
+
+Before you can publish your project, you must have a function app in your Azure subscription. Visual Studio publishing creates a function app for you the first time you publish your project.
++
+## Verify your function in Azure
+
+1. In Cloud Explorer, your new function app should be selected. If not, expand your subscription > **App Services**, and select your new function app.
+
+1. Right-click the function app and choose **Open in Browser**. This opens the root of your function app in your default web browser and displays the page that indicates your function app is running.
+
+ :::image type="content" source="media/functions-create-your-first-function-visual-studio/function-app-running-azure.png" alt-text="Function app running":::
+
+1. In the address bar in the browser, append the path `/api/HttpExample` to the base URL and run the request.
+
+1. Go to this URL and you see the same response in the browser that you saw when running locally.
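As a rough illustration of the URL built in the verification steps above, here is a small Python sketch. The app name `myfunctionapp` is a placeholder, and actually issuing the request of course requires the deployed function app:

```python
# Build the function endpoint URL described in the steps above.
# "myfunctionapp" is a PLACEHOLDER app name, not a real deployment.
from urllib.parse import urlencode

def function_url(app_name, function_name, params=None):
    """Compose https://<APP_NAME>.azurewebsites.net/api/<FUNCTION> with optional query."""
    url = f"https://{app_name}.azurewebsites.net/api/{function_name}"
    if params:
        url += "?" + urlencode(params)
    return url

print(function_url("myfunctionapp", "HttpExample"))
# https://myfunctionapp.azurewebsites.net/api/HttpExample

# To actually call the deployed function:
# import urllib.request
# print(urllib.request.urlopen(function_url("myfunctionapp", "HttpExample")).read())
```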
::: zone pivot="development-environment-vscode"

[!INCLUDE [functions-sign-in-vs-code](../../includes/functions-sign-in-vs-code.md)]
Use the following steps to delete the function app and its related resources to
[!INCLUDE [functions-cleanup-resources-vs-code-inner.md](../../includes/functions-cleanup-resources-vs-code-inner.md)]
::: zone-end
::: zone pivot="development-environment-vs"
-Use the following steps to delete the function app and its related resources to avoid incurring any further costs.
-
-1. In the Cloud Explorer, expand your subscription > **App Services**, right-click your function app, and choose **Open in Portal**.
-
-1. In the function app page, select the **Overview** tab and then select the link under **Resource group**.
-
- :::image type="content" source="media/functions-create-your-first-function-visual-studio/functions-app-delete-resource-group.png" alt-text="Select the resource group to delete from the function app page":::
-
-2. In the **Resource group** page, review the list of included resources, and verify that they're the ones you want to delete.
-
-3. Select **Delete resource group**, and follow the instructions.
-
- Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can also select the bell icon at the top of the page to view the notification.
::: zone-end ## Next steps
azure-functions Functions Add Output Binding Storage Queue Vs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-add-output-binding-storage-queue-vs.md
Title: Connect functions to Azure Storage using Visual Studio description: Learn how to add an output binding to connect your C# class library functions to an Azure Storage queue using Visual Studio. Previously updated : 07/22/2019 Last updated : 05/30/2021 #Customer intent: As an Azure Functions developer, I want to connect my C# class library function to Azure Storage so that I can easily write data to a storage queue.
In the [previous quickstart article](./create-first-function-vs-code-csharp.md),
1. In **Solution Explorer**, right-click the project and select **Publish**.
-1. Under **Actions**, select **Edit Azure App Service Settings**.
+1. In the **Publish** tab under **Hosting**, expand the three dots (**...**) and select **Manage Azure App Service settings**.
![Edit the application settings](media/functions-add-output-binding-storage-queue-vs/edit-app-settings.png)
Next, you should enable Application Insights monitoring for your function app:
> [Enable Application Insights integration](configure-monitoring.md#add-to-an-existing-function-app) [Azure Storage Explorer]: https://storageexplorer.com/
-[previous quickstart article]: functions-create-your-first-function-visual-studio.md
+[previous quickstart article]: functions-create-your-first-function-visual-studio.md
azure-functions Functions Create Your First Function Visual Studio Uiex https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-create-your-first-function-visual-studio-uiex.md
Completing this quickstart incurs a small cost of a few USD cents or less in you
+ Install [Visual Studio 2019](https://azure.microsoft.com/downloads/) and select the **Azure development** workload during installation.
-![Install Visual Studio with the Azure development workload](media/functions-create-your-first-function-visual-studio/functions-vs-workloads.png)
- <br/> <details> <summary><strong>Use an Azure Functions project instead</strong></summary>
The `FunctionName` method attribute sets the name of the function, which by defa
## 7. Clean up resources
-Delete the function app and its resources to avoid incurring any further costs.
-
-1. In the Cloud Explorer, expand your subscription, expand **App Services**, right-click your function app, and choose **Open in Portal**.
-
-1. In the function app page, select the **Overview** tab and then select the link under **Resource group**.
-
- :::image type="content" source="media/functions-create-your-first-function-visual-studio/functions-app-delete-resource-group.png" alt-text="Select the resource group to delete from the function app page":::
-
-1. In the **Resource group** page, review the list of included resources, and verify that they're the ones you want to delete.
-
-1. Select **Delete resource group**, and follow the instructions.
-
- Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can also select the bell icon at the top of the page to view the notification.
## Next steps
azure-functions Functions Create Your First Function Visual Studio https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-create-your-first-function-visual-studio.md
The `FunctionName` method attribute sets the name of the function, which by defa
1. In the code, rename the Function1 class to `HttpExample`.
-1. In the `HttpTrigger` method named `Run`, rename the `FunctionName` method attribute to `HttpExample`.
+1. In the `HttpTrigger` method named `Run`, rename the `FunctionName` method attribute to `HttpExample`.
Your function definition should now look like the following code:
Before you can publish your project, you must have a function app in your Azure
[!INCLUDE [Publish the project to Azure](../../includes/functions-vstools-publish.md)]
-## Test your function in Azure
+## Verify your function in Azure
1. In Cloud Explorer, your new function app should be selected. If not, expand your subscription > **App Services**, and select your new function app.
Before you can publish your project, you must have a function app in your Azure
`http://<APP_NAME>.azurewebsites.net/api/HttpExample?name=Functions`
-2. Go to this URL and you see a response in the browser to the remote GET request returned by the function, which looks like the following example:
+1. Go to this URL and you see a response in the browser to the remote GET request returned by the function, which looks like the following example:
:::image type="content" source="media/functions-create-your-first-function-visual-studio/functions-create-your-first-function-visual-studio-browser-azure.png" alt-text="Function response in the browser":::
Other quickstarts in this collection build upon this quickstart. If you plan to
*Resources* in Azure refer to function apps, functions, storage accounts, and so forth. They're grouped into *resource groups*, and you can delete everything in a group by deleting the group.
-You created resources to complete these quickstarts. You may be billed for these resources, depending on your [account status](https://azure.microsoft.com/account/) and [service pricing](https://azure.microsoft.com/pricing/). If you don't need the resources anymore, here's how to delete them:
+You created resources to complete these quickstarts. You may be billed for these resources, depending on your [account status](https://azure.microsoft.com/account/) and [service pricing](https://azure.microsoft.com/pricing/).
-1. In the Cloud Explorer, expand your subscription > **App Services**, right-click your function app, and choose **Open in Portal**.
-
-1. In the function app page, select the **Overview** tab and then select the link under **Resource group**.
-
- :::image type="content" source="media/functions-create-your-first-function-visual-studio/functions-app-delete-resource-group.png" alt-text="Select the resource group to delete from the function app page":::
-
-2. In the **Resource group** page, review the list of included resources, and verify that they're the ones you want to delete.
-
-3. Select **Delete resource group**, and follow the instructions.
-
- Deletion may take a couple of minutes. When it's done, a notification appears for a few seconds. You can also select the bell icon at the top of the page to view the notification.
## Next steps
azure-functions Functions Dotnet Dependency Injection https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-dotnet-dependency-injection.md
namespace MyNamespace
private readonly HttpClient _client; private readonly IMyService _service;
- public MyHttpTrigger(HttpClient httpClient, IMyService service)
+ public MyHttpTrigger(IHttpClientFactory httpClientFactory, IMyService service)
{
- this._client = httpClient;
+ this._client = httpClientFactory.CreateClient();
this._service = service; }
azure-functions Functions Scale https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-scale.md
The following table compares the scaling behaviors of the various hosting plans.
| | | | **[Consumption plan](consumption-plan.md)** | Pay only for the time your functions run. Billing is based on number of executions, execution time, and memory used. | | **[Premium plan](functions-premium-plan.md)** | Premium plan is based on the number of core seconds and memory used across needed and pre-warmed instances. At least one instance per plan must be kept warm at all times. This plan provides the most predictable pricing. |
-| **[Dedicated plan](dedicated-plan.md)* | You pay the same for function apps in an App Service Plan as you would for other App Service resources, like web apps.|
+| **[Dedicated plan](dedicated-plan.md)** | You pay the same for function apps in an App Service Plan as you would for other App Service resources, like web apps.|
| **[App Service Environment (ASE)](dedicated-plan.md)** | There's a flat monthly rate for an ASE that pays for the infrastructure and doesn't change with the size of the ASE. There's also a cost per App Service plan vCPU. All apps hosted in an ASE are in the Isolated pricing SKU. | | **[Kubernetes](functions-kubernetes-keda.md)**| You pay only the costs of your Kubernetes cluster; no additional billing for Functions. Your function app runs as an application workload on top of your cluster, just like a regular app. | ## Next steps + [Deployment technologies in Azure Functions](functions-deployment-technologies.md)
-+ [Azure Functions developer guide](functions-reference.md)
++ [Azure Functions developer guide](functions-reference.md)
azure-functions Functions Versions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-versions.md
Title: Azure Functions runtime versions overview
description: Azure Functions supports multiple versions of the runtime. Learn the differences between them and how to choose the one that's right for you. Previously updated : 12/09/2019 Last updated : 05/19/2021 # Azure Functions runtime versions overview
The main differences between versions when running .NET class library functions
## Migrating from 1.x to later versions
-You may choose to migrate an existing app written to use the version 1.x runtime to instead use a newer version. Most of the changes you need to make are related to changes in the language runtime, such as C# API changes between .NET Framework 4.7 and .NET Core. You'll also need to make sure your code and libraries are compatible with the language runtime you choose. Finally, be sure to note any changes in trigger, bindings, and features highlighted below. For the best migration results, you should create a new function app in a new version and port your existing version 1.x function code to the new app.
+You may choose to migrate an existing app written to use the version 1.x runtime to instead use a newer version. Most of the changes you need to make are related to changes in the language runtime, such as C# API changes between .NET Framework 4.8 and .NET Core. You'll also need to make sure your code and libraries are compatible with the language runtime you choose. Finally, be sure to note any changes in trigger, bindings, and features highlighted below. For the best migration results, you should create a new function app in a new version and port your existing version 1.x function code to the new app.
While it's possible to do an "in-place" upgrade by manually updating the app configuration, going from 1.x to a higher version includes some breaking changes. For example, in C#, the debugging object is changed from `TraceWriter` to `ILogger`. By creating a new version 3.x project, you start off with updated functions based on the latest version 3.x templates.
azure-functions Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/start-stop-vms/troubleshoot.md
Depending on which Logic Apps you have enabled to support your start/stop scenar
You can review the details for the operations performed on the VMs that are written to the table **requestsstoretable** in the Azure storage account used for Start/Stop VMs v2 (preview). Perform the following steps to view those records.
-1. Navigate to the storage account in the Azure portal and in the account select **Storage Explorer (preview) from the left-hand pane.
+1. Navigate to the storage account in the Azure portal and in the account select **Storage Explorer (preview)** from the left-hand pane.
1. Select **TABLES** and then select **requeststoretable**. 1. Each record in the table represents the start/stop action performed against an Azure VM based on the target scope defined in the logic app scenario. You can filter the results by any one of the record properties (for example, TIMESTAMP, ACTION, or TARGETTOPLEVELRESOURCENAME).
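Filtering by a record property can be sketched as building an OData-style query string, similar to what Storage Explorer applies when you filter the table. This is a minimal illustration; the property names follow the table described above, but the helper function itself is hypothetical, and the exact query syntax depends on your tooling.

```python
# Sketch: build an OData-style filter for requeststoretable records.
# Property names (ACTION, TARGETTOPLEVELRESOURCENAME) come from the table
# schema described above; escaping and syntax details may vary by client.
def build_filter(action=None, target=None):
    clauses = []
    if action:
        clauses.append(f"ACTION eq '{action}'")
    if target:
        clauses.append(f"TARGETTOPLEVELRESOURCENAME eq '{target}'")
    return " and ".join(clauses)

print(build_filter(action="start", target="rg-demo"))
# ACTION eq 'start' and TARGETTOPLEVELRESOURCENAME eq 'rg-demo'
```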
Learn more about monitoring Azure Functions and logic apps:
* [How to configure monitoring for Azure Functions](../../azure-functions/configure-monitoring.md).
-* [Monitor logic apps](../../logic-apps/monitor-logic-apps.md).
+* [Monitor logic apps](../../logic-apps/monitor-logic-apps.md).
azure-government Documentation Government Csp List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/documentation-government-csp-list.md
Below you can find a list of all the authorized Cloud Solution Providers, AOS-G
|[Inforeliance LLC](https://www.inforeliance.com/)| |[Infosys Public Services, Inc.](https://www.infosyspublicservices.com/)| |[InnovaSystems International](https://www.innovasi.com/)|
+|[Innovia Consulting](https://www.innovia.com/)|
|[Inquisit, LLC](https://www.inquisitllc.com)| |[InsITe Business Solutions Inc.](https://trustedinsite.com/)| |[Inspired Technologies](https://www.inspired-tech.net)|
Below you can find a list of all the authorized Cloud Solution Providers, AOS-G
|[Neovera Inc.](https://www.neovera.com)| |[Netwize](https://www.netwize.com)| |[NewWave Telecom & Technologies, Inc](https://www.newwave.io)|
-|[NexustTek](https://www.nexustek.com/)|
+|[NexusTek](https://www.nexustek.com/)|
|[Nihilent Inc](https://nihilent.com)| |[Nimbus Logic LLC](https://www.nimbus-logic.com)| |[Norseman, Inc](https://www.norseman.com)|
azure-maps About Azure Maps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/about-azure-maps.md
Azure Maps is a collection of geospatial services and SDKs that use fresh mapping data to provide geographic context to web and mobile applications. Azure Maps provides: * REST APIs to render vector and raster maps in multiple styles and satellite imagery.
-* Creator services (Preview) to create and render maps based on private indoor map data.
+* Creator services to create and render maps based on private indoor map data.
* Search services to locate addresses, places, and points of interest around the world. * Various routing options; such as point-to-point, multipoint, multipoint optimization, isochrone, electric vehicle, commercial vehicle, traffic influenced, and matrix routing. * Traffic flow view and incidents view, for applications that require real-time traffic information. * Mobility services (Preview) to request public transit information, plan routes by blending different travel modes and real-time arrivals. * Time zone and Geolocation (Preview) services.
-* Elevation services (Preview) with Digital Elevation Model
+* Elevation services with Digital Elevation Model
* Geofencing service and mapping data storage, with location information hosted in Azure. * Location intelligence through geospatial analytics.
Use the Azure Maps Android SDK to create mobile mapping applications.
Azure Maps consists of the following services that can provide geographic context to your Azure applications.
-### Data service (Preview)
+### Data service
-Data is imperative for maps. Use the Data service to upload and store geospatial data for use with spatial operations or image composition. Bringing customer data closer to the Azure Maps service will reduce latency, increase productivity, and create new scenarios in your applications. For details on this service, see the [Data service documentation](/rest/api/maps/data).
+Data is imperative for maps. Use the Data service to upload and store geospatial data for use with spatial operations or image composition. Bringing customer data closer to the Azure Maps service will reduce latency, increase productivity, and create new scenarios in your applications. For details on this service, see the [Data service documentation](/rest/api/maps/data-v2).
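A typical Data service upload payload is GeoJSON. The following is a minimal sketch of such a payload; the coordinates and properties are illustrative, not a required schema.

```python
import json

# Sketch: a minimal GeoJSON FeatureCollection of the kind you might upload
# to the Data service for later use in spatial operations. The geometry and
# property values here are purely illustrative.
feature_collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [-122.126986, 47.639754]},
            "properties": {"name": "example-location"},
        }
    ],
}

# Serialize to the JSON body an upload request would carry.
payload = json.dumps(feature_collection)
```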
### Geolocation service (Preview)
The [Get Map Tile V2 API](/rest/api/maps/renderv2/getmaptilepreview) allows you
![Example of map with real-time weather radar tiles](media/about-azure-maps/intro_weather.png)
-### Maps Creator service (Preview)
+### Maps Creator service
Maps Creator service is a suite of web services that developers can use to create applications with map features based on indoor map data. Maps Creator provides three core
-* [Dataset service](/rest/api/maps/dataset). Use the Dataset service to create a dataset from a converted Drawing package data. For information on Drawing package requirements, see Drawing package requirements.
+* [Dataset service](/rest/api/maps/v2/dataset). Use the Dataset service to create a dataset from a converted Drawing package data. For information about Drawing package requirements, see Drawing package requirements.
-* [Conversion service](/rest/api/maps/dataset). Use the Conversion service to convert a DWG design file into Drawing package data for indoor maps.
+* [Conversion service](/rest/api/maps/v2/dataset). Use the Conversion service to convert a DWG design file into Drawing package data for indoor maps.
-* [Tileset service](/rest/api/maps/tileset). Use the Tileset service to create a vector-based representation of a dataset. Applications can use a tileset to present a visual tile-based view of the dataset.
+* [Tileset service](/rest/api/maps/v2/tileset). Use the Tileset service to create a vector-based representation of a dataset. Applications can use a tileset to present a visual tile-based view of the dataset.
-* [Feature State service](/rest/api/maps/featurestate). Use the Feature State service to support dynamic map styling. Dynamic map styling allows applications to reflect real-time events on spaces provided by IoT systems.
+* [Feature State service](/rest/api/maps/v2/featurestate). Use the Feature State service to support dynamic map styling. Dynamic map styling allows applications to reflect real-time events on spaces provided by IoT systems.
-* [WFS service](/rest/api/maps/featurestate). Use the WFS service to query your indoor map data. The WFS service follows the [Open Geospatial Consortium API](http://docs.opengeospatial.org/is/17-069r3/17-069r3.html) standards for querying a single dataset.
+* [WFS service](/rest/api/maps/v2/featurestate). Use the WFS service to query your indoor map data. The WFS service follows the [Open Geospatial Consortium API](http://docs.opengeospatial.org/is/17-069r3/17-069r3.html) standards for querying a single dataset.
-### Elevation service (Preview)
+### Elevation service
The Azure Maps Elevation service is a web service that developers can use to retrieve elevation data from anywhere on the Earth's surface. The Elevation service allows you to retrieve elevation data in two formats:
-* **GeoTIFF raster format**. Use the [Render V2 - Get Map Tile API](/rest/api/maps/renderv2) to retrieve elevation data in tile format.
+* **GeoTIFF raster format**. Use the [Render V2-Get Map Tile API](/rest/api/maps/renderv2) to retrieve elevation data in tile format.
* **GeoJSON format**. Use the [Elevation APIs](/rest/api/maps/elevation) to request sampled elevation data along paths, within a defined bounding box, or at specific coordinates.
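A GeoJSON-format request can be sketched as a URL against the points endpoint. The host, path, and parameter names below are assumptions for illustration only; consult the Elevation API reference for the exact contract.

```python
from urllib.parse import urlencode

# Sketch (hypothetical helper): form an Elevation points request URL.
# Endpoint path and parameter names are assumptions, not the documented API.
def elevation_points_url(points, subscription_key, api_version="1.0"):
    query = urlencode({
        "api-version": api_version,
        # Points are lon,lat pairs separated by "|" in this sketch.
        "points": "|".join(f"{lon},{lat}" for lon, lat in points),
        "subscription-key": subscription_key,
    })
    return f"https://atlas.microsoft.com/elevation/point/json?{query}"

url = elevation_points_url([(-121.66, 46.84)], "<your-key>")
```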
azure-maps Choose Pricing Tier https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/choose-pricing-tier.md
Azure Maps now offers two pricing tiers: Gen 1 and Gen 2. The Gen 2 new pricing
## Pricing tier targeted customers
-See the **pricing tier targeted customers** table below for a better understanding of Gen 1 and Gen 2 pricing tiers. For more information, see [Azure Maps pricing](https://azure.microsoft.com/pricing/details/azure-maps/). If you're a current Azure Maps customer, you can learn how to change from Gen 1 to Gen 2 pricing [here](how-to-manage-pricing-tier.md)
+See the **pricing tier targeted customers** table below for a better understanding of Gen 1 and Gen 2 pricing tiers. For more information, see [Azure Maps pricing](https://azure.microsoft.com/pricing/details/azure-maps/). If you're a current Azure Maps customer, you can learn how to change from Gen 1 to Gen 2 pricing [here](how-to-manage-pricing-tier.md).
| Pricing tier | SKU | Targeted Customers| |--|-| --|
azure-maps Create Data Source Android Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/create-data-source-android-sdk.md
Azure Maps adheres to the [Mapbox Vector Tile Specification](https://github.com/
- Road tiles [documentation](/rest/api/maps/renderv2/getmaptilepreview) | [data format details](https://developer.tomtom.com/maps-api/maps-api-documentation-vector/tile) - Traffic incidents [documentation](/rest/api/maps/traffic/gettrafficincidenttile) | [data format details](https://developer.tomtom.com/traffic-api/traffic-api-documentation-traffic-incidents/vector-incident-tiles) - Traffic flow [documentation](/rest/api/maps/traffic/gettrafficflowtile) | [data format details](https://developer.tomtom.com/traffic-api/traffic-api-documentation-traffic-flow/vector-flow-tiles)-- Azure Maps Creator also allows custom vector tiles to be created and accessed through the [Get Tile Render V2](/rest/api/maps/renderv2/getmaptilepreview)
+- Azure Maps Creator also allows custom vector tiles to be created and accessed through the [Render V2-Get Map Tile API](/rest/api/maps/renderv2/getmaptilepreview)
> [!TIP] > When using vector or raster image tiles from the Azure Maps render service with the web SDK, you can replace `atlas.microsoft.com` with the placeholder `azmapsdomain.invalid`. This placeholder will be replaced with the same domain used by the map and will automatically append the same authentication details as well. This greatly simplifies authentication with the render service when using Azure Active Directory authentication.
azure-maps Create Data Source Web Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/create-data-source-web-sdk.md
Azure Maps adheres to the [Mapbox Vector Tile Specification](https://github.com/
- Road tiles [documentation](/rest/api/maps/renderv2/getmaptilepreview) | [data format details](https://developer.tomtom.com/maps-api/maps-api-documentation-vector/tile) - Traffic incidents [documentation](/rest/api/maps/traffic/gettrafficincidenttile) | [data format details](https://developer.tomtom.com/traffic-api/traffic-api-documentation-traffic-incidents/vector-incident-tiles) - Traffic flow [documentation](/rest/api/maps/traffic/gettrafficflowtile) | [data format details](https://developer.tomtom.com/traffic-api/traffic-api-documentation-traffic-flow/vector-flow-tiles)-- Azure Maps Creator (Preview) also allows custom vector tiles to be created and accessed through the [Get Tile Render V2](/rest/api/maps/renderv2/getmaptilepreview)
+- Azure Maps Creator also allows custom vector tiles to be created and accessed through the [Render V2-Get Map Tile API](/rest/api/maps/renderv2/getmaptilepreview)
> [!TIP] > When using vector or raster image tiles from the Azure Maps render service with the web SDK, you can replace `atlas.microsoft.com` with the placeholder `{azMapsDomain}`. This placeholder will be replaced with the same domain used by the map and will automatically append the same authentication details as well. This greatly simplifies authentication with the render service when using Azure Active Directory authentication.
azure-maps Creator Facility Ontology https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/creator-facility-ontology.md
+
+ Title: Facility Ontology in Microsoft Azure Maps Creator
+description: Facility Ontology that describes the feature class definitions for Azure Maps Creator
+Last updated : 05/21/2021
+zone_pivot_groups: facility-ontology-schema
++
+# Facility Ontology
+
+Facility ontology defines how Azure Maps Creator internally stores facility data in a Creator dataset. In addition to defining internal facility data structure, facility ontology is also exposed externally through the WFS API. When WFS API is used to query facility data in a dataset, the response format is defined by the ontology supplied to that dataset.
+
+At a high level, facility ontology divides the dataset into feature classes. All feature classes share a common set of properties, such as `ID` and `Geometry`. In addition to the common property set, each feature class defines a set of properties. Each property is defined by its data type and constraints. Some feature classes have properties that are dependent on other feature classes. Dependent properties evaluate to the `ID` of another feature class.
+
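The dependent-property rule above can be sketched as a reference check: a dependent property on one feature must resolve to the `ID` of a feature in the referenced class. The dataset shape and checker below are hypothetical simplifications (Creator stores the real dataset server-side), used only to illustrate the constraint.

```python
# Sketch: a tiny in-memory "dataset" keyed by feature class. Each dependent
# property (e.g. unit.categoryId) must match the id of a feature in the
# target class. Names and shapes are illustrative, not the stored format.
dataset = {
    "category": [{"id": "cat1"}],
    "level": [{"id": "lvl0"}],
    "unit": [{"id": "unit1", "categoryId": "cat1", "levelId": "lvl0"}],
}

# Which properties of which class depend on which other class.
DEPENDENT = {"unit": {"categoryId": "category", "levelId": "level"}}

def validate(data):
    errors = []
    for cls, refs in DEPENDENT.items():
        for feature in data.get(cls, []):
            for prop, target_cls in refs.items():
                ids = {f["id"] for f in data.get(target_cls, [])}
                if feature.get(prop) not in ids:
                    errors.append(f"{cls}/{feature['id']}: {prop} does not resolve")
    return errors
```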
+## Changes and Revisions
++
+The Facility 1.0 contains revisions for the Facility feature class definitions for [Azure Maps Services](https://aka.ms/AzureMaps).
+++
+The Facility 2.0 contains revisions for the Facility feature class definitions for [Azure Maps Services](https://aka.ms/AzureMaps).
++
+### Major Changes
++
+Fixed the following constraint validation checks:
+
+* Constraint validation check for exclusivity of `isObstruction = true` *or* the presence of `obstructionArea` for `lineElement` and `areaElement` feature classes.
+
+* Constraint validation check for exclusivity of `isRoutable = true` *or* the presence of `routeThroughBehavior` for the `category` feature class.
++
+* Added a structure feature class to hold walls, columns, and so on.
+* Cleaned up the attributes designed to enrich routing scenarios. The current routing engine doesn't support them.
++
+## unit
+
+The `unit` feature class defines a physical and non-overlapping area that can be occupied and traversed by a navigating agent. A `unit` can be a hallway, a room, a courtyard, and so on.
+
+**Geometry Type**: Polygon
++
+| Property | Type | Required | Description |
+|--|--|-|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` | [category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+|`isOpenArea` | boolean (Default value is `null`.) |false | Represents whether the unit is an open area. If set to `true`, [structures](#structure) don't surround the unit boundary, and a navigating agent can enter the `unit` without the need of an [`opening`](#opening). By default, units are surrounded by physical barriers and are open only where an opening feature is placed on the boundary of the unit. If walls are needed in an open area unit, they can be represented as a [`lineElement`](#lineelement) or [`areaElement`](#areaelement) with an `isObstruction` property equal to `true`.|
+|`navigableBy` | enum ["pedestrian", "wheelchair", "machine", "bicycle", "automobile", "hiredAuto", "bus", "railcar", "emergency", "ferry", "boat"] | false |Indicates the types of navigating agents that can traverse the unit. If unspecified, the unit is assumed to be traversable by any navigating agent. |
+|`isRoutable` | boolean (Default value is `null`.) | false | Determines if the unit is part of the routing graph. If set to `true`, the unit can be used as source/destination or intermediate node in the routing experience. |
+|`routeThroughBehavior` | enum ["disallowed", "allowed", "preferred"] | false | Determines if navigating through the unit is allowed. If unspecified, it inherits its value from the category feature referred to in the `categoryId` property. If specified, it overrides the value given in its category feature. |
+|`nonPublic` | boolean| false | If `true`, the unit is navigable only by privileged users. Default value is `false`. |
+| `levelId` | [level.Id](#level) | true | The ID of a level feature. |
+|`occupants` | array of [directoryInfo.Id](#directoryinfo) | false | The IDs of [directoryInfo](#directoryinfo) features. Used to represent one or many occupants in the feature. |
+|`addressId` | [directoryInfo.Id](#directoryinfo) | true | The ID of a [directoryInfo](#directoryinfo) feature. Used to represent the address of the feature.|
+|`addressRoomNumber` | [directoryInfo.Id](#directoryinfo) | true | Room/Unit/Apartment/Suite number of the unit.|
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000. |
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000.|
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000. |
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
+++
+| Property | Type | Required | Description |
+|--|--|-|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` | [category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+|`isOpenArea` | boolean (Default value is `null`.) |false | Represents whether the unit is an open area. If set to `true`, [structures](#structure) don't surround the unit boundary, and a navigating agent can enter the `unit` without the need of an [`opening`](#opening). By default, units are surrounded by physical barriers and are open only where an opening feature is placed on the boundary of the unit. If walls are needed in an open area unit, they can be represented as a [`lineElement`](#lineelement) or [`areaElement`](#areaelement) with an `isObstruction` property equal to `true`.|
+|`isRoutable` | boolean (Default value is `null`.) | false | Determines if the unit is part of the routing graph. If set to `true`, the unit can be used as source/destination or intermediate node in the routing experience. |
+| `levelId` | [level.Id](#level) | true | The ID of a level feature. |
+|`occupants` | array of [directoryInfo.Id](#directoryinfo) | false | The IDs of [directoryInfo](#directoryinfo) features. Used to represent one or many occupants in the feature. |
+|`addressId` | [directoryInfo.Id](#directoryinfo) | true | The ID of a [directoryInfo](#directoryinfo) feature. Used to represent the address of the feature.|
+|`addressRoomNumber` | [directoryInfo.Id](#directoryinfo) | true | Room/Unit/Apartment/Suite number of the unit.|
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000.|
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000.|
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000.|
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
+++
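The `routeThroughBehavior` override rule in the unit table can be sketched as a lookup: a value set on the unit wins; otherwise the category's value applies. The feature dicts and the fallback default below are simplifications for illustration.

```python
# Sketch: resolve the effective routeThroughBehavior for a unit.
# A value on the unit overrides its category; the "allowed" fallback when
# neither specifies a value is an assumption of this sketch.
def effective_route_through(unit, categories):
    if unit.get("routeThroughBehavior") is not None:
        return unit["routeThroughBehavior"]
    return categories[unit["categoryId"]].get("routeThroughBehavior", "allowed")

categories = {"hallway": {"routeThroughBehavior": "preferred"}}
```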
+## structure
+
+The `structure` feature class defines a physical and non-overlapping area that cannot be navigated through. It can be a wall, column, and so on.
+
+**Geometry Type**: Polygon
+
+| Property | Type | Required | Description |
+|--||-|-|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` | [category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+| `levelId` | [level.Id](#level) | true | The ID of a [`level`](#level) feature. |
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000. |
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000. |
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000.|
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
++
+## zone
+
+The `zone` feature class defines a virtual area, like a WiFi zone or emergency assembly area. Zones can be used as destinations but are not meant for through traffic.
+
+**Geometry Type**: Polygon
+
+| Property | Type | Required | Description |
+|--||-|-|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` | [category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+| `setId` | string | true |Required for zone features that represent multi-level zones. The `setId` is the unique ID for a zone that spans multiple levels. The `setId` enables a zone with varying coverage on different floors to be represented with different geometry on different levels. The `setId` can be any string and is case-sensitive. It is recommended that the `setId` is a GUID. Maximum length allowed is 1000.|
+| `levelId` | [level.Id](#level) | true | The ID of a [`level`](#level) feature. |
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000.|
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000.|
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000. |
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
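The `setId` mechanism above can be sketched by grouping: per-level zone features that share a `setId` together make up one multi-level zone. The feature dicts are simplified stand-ins for the zone class.

```python
from collections import defaultdict

# Sketch: reassemble multi-level zones from per-level zone features that
# share a setId. Feature shapes are illustrative only.
zones = [
    {"id": "z1", "setId": "assembly-A", "levelId": "lvl0"},
    {"id": "z2", "setId": "assembly-A", "levelId": "lvl1"},
    {"id": "z3", "setId": "wifi-B", "levelId": "lvl0"},
]

by_set = defaultdict(list)
for zone in zones:
    by_set[zone["setId"]].append(zone["levelId"])
# by_set now maps each setId to the levels the zone covers.
```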
+
+## level
+
+The `level` feature class defines an area of a building at a set elevation. For example, the floor of a building, which contains a set of features, such as [`units`](#unit).
+
+**Geometry Type**: MultiPolygon
+
+| Property | Type | Required | Description |
+|--||-|-|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` | [category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+| `ordinal` | integer | true | The level number. Used by the [`verticalPenetration`](#verticalpenetration) feature to determine the relative order of the floors to help with travel direction. The general practice is to start with 0 for the ground floor. Add +1 for every floor upwards, and -1 for every floor going down. It can be modeled with any numbers, as long as the higher physical floors are represented by higher ordinal values. |
+| `abbreviatedName` | string | false | A four-character abbreviated level name, like what would be found on an elevator button. Maximum length allowed is 1000.|
+| `heightAboveFacilityAnchor` | double | false | Vertical distance of the level's floor above [`facility.anchorHeightAboveSeaLevel`](#facility), in meters. |
+| `verticalExtent` | double | false | Vertical extent of the level, in meters. If not provided, defaults to [`facility.defaultLevelVerticalExtent`](#facility).|
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000.|
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000.|
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000.|
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
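The `ordinal` convention above (ground floor 0, higher floors higher values) means sorting by ordinal recovers the bottom-to-top order that vertical-penetration routing relies on. The level dicts are illustrative.

```python
# Sketch: order levels bottom-to-top by ordinal. Higher ordinal values
# represent physically higher floors, per the level table above.
levels = [
    {"name": "Ground", "ordinal": 0},
    {"name": "Basement", "ordinal": -1},
    {"name": "Mezzanine", "ordinal": 1},
]

bottom_to_top = sorted(levels, key=lambda lvl: lvl["ordinal"])
names = [lvl["name"] for lvl in bottom_to_top]
```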
+
+## facility
+
+The `facility` feature class defines the area of the site, building footprint, and so on.
+
+**Geometry Type**: MultiPolygon
+
+| Property | Type | Required | Description |
+|--|--|--|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` | [category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+|`occupants` | array of [directoryInfo.Id](#directoryinfo) | false | The IDs of [directoryInfo](#directoryinfo) features. Used to represent one or many occupants in the feature. |
+|`addressId` | [directoryInfo.Id](#directoryinfo) | true | The ID of a [directoryInfo](#directoryinfo) feature. Used to represent the address of the feature.|
+|`addressRoomNumber` | [directoryInfo.Id](#directoryinfo)| true | Room/Unit/Apartment/Suite number of the unit.|
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000. |
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000. |
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000.|
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
+|`anchorHeightAboveSeaLevel` | double | false | Height of anchor point above sea level, in meters. Sea level is defined by EGM 2008.|
+|`defaultLevelVerticalExtent` | double| false | Default value for vertical extent of levels, in meters.|
+
+## verticalPenetration
+
+The `verticalPenetration` feature class defines an area that, when used in a set, represents a method of navigating vertically between levels. It can be used to model stairs, elevators, and so on. Geometry can overlap units and other vertical penetration features.
+
+**Geometry Type**: Polygon
+
+| Property | Type | Required | Description |
+|--|--|--|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` | [category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+| `setId` | string | true | Vertical penetration features must be used in sets to connect multiple levels. Vertical penetration features in the same set are considered to be the same. The `setId` can be any string, and is case-sensitive. Using a GUID as a `setId` is recommended. Maximum length allowed is 1000.|
+| `levelId` | [level.Id](#level) | true | The ID of a level feature. |
+|`direction` | string enum [ "both", "lowToHigh", "highToLow", "closed" ]| true | Travel direction allowed on this feature. The ordinal attribute on the [`level`](#level) feature is used to determine the low and high order.|
+|`navigableBy` | enum ["pedestrian", "wheelchair", "machine", "bicycle", "automobile", "hiredAuto", "bus", "railcar", "emergency", "ferry", "boat"] | false |Indicates the types of navigating agents that can traverse the unit. If unspecified, the unit is traversable by any navigating agent. |
+|`nonPublic` | boolean| false | If `true`, the unit is navigable only by privileged users. Default value is `false`. |
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000.|
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000.|
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000. |
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
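The interplay between the `direction` enum and the level `ordinal` values can be sketched as follows. `can_travel` is a hypothetical helper, not part of any API; only the enum values and the low/high ordering rule come from the tables above.

```python
# Illustrative sketch: how "direction" and the level "ordinal" values decide
# whether a vertical penetration permits travel between two levels.
def can_travel(direction, from_ordinal, to_ordinal):
    if direction == "closed":
        return False
    if direction == "both":
        return True
    if direction == "lowToHigh":
        return to_ordinal > from_ordinal
    if direction == "highToLow":
        return to_ordinal < from_ordinal
    raise ValueError(f"unknown direction: {direction}")

# An up-only escalator between the ground floor (ordinal 0) and floor 1:
assert can_travel("lowToHigh", 0, 1)      # going up is allowed
assert not can_travel("lowToHigh", 1, 0)  # going down is not
```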
+
+## opening
+
+The `opening` feature class defines a traversable boundary between two units, or between a `unit` and a `verticalPenetration`.
+
+**Geometry Type**: LineString
+
+| Property | Type | Required | Description |
+|--|--|--|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` |[category.Id](#category) |true | The ID of a category feature.|
+| `levelId` | [level.Id](#level) | true | The ID of a level feature. |
+| `isConnectedToVerticalPenetration` | boolean | false | Whether or not this feature is connected to a `verticalPenetration` feature on one of its sides. Default value is `false`. |
+|`navigableBy` | enum ["pedestrian", "wheelchair", "machine", "bicycle", "automobile", "hiredAuto", "bus", "railcar", "emergency", "ferry", "boat"] | false |Indicates the types of navigating agents that can traverse the unit. If unspecified, the unit is traversable by any navigating agent. |
+| `accessRightToLeft`| enum [ "prohibited", "digitalKey", "physicalKey", "keyPad", "guard", "ticket", "fingerprint", "retina", "voice", "face", "palm", "iris", "signature", "handGeometry", "time", "ticketChecker", "other"] | false | Method of access when passing through the opening from right to left. Left and right are determined by the vertices in the feature geometry, standing at the first vertex and facing the second vertex. Omitting this property means there are no access restrictions.|
+| `accessLeftToRight`| enum [ "prohibited", "digitalKey", "physicalKey", "keyPad", "guard", "ticket", "fingerprint", "retina", "voice", "face", "palm", "iris", "signature", "handGeometry", "time", "ticketChecker", "other"] | false | Method of access when passing through the opening from left to right. Left and right are determined by the vertices in the feature geometry, standing at the first vertex and facing the second vertex. Omitting this property means there are no access restrictions.|
+| `isEmergency` | boolean | false | If `true`, the opening is navigable only during emergencies. Default value is `false` |
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
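The left/right convention used by `accessRightToLeft` and `accessLeftToRight` can be made concrete with a small sketch: standing at the first vertex of the LineString and facing the second, the sign of the 2D cross product tells which side a point lies on. `side_of_opening` is a hypothetical helper; a planar x-east/y-north coordinate frame is assumed for illustration.

```python
# Illustrative sketch: which side of an opening a point lies on, relative to
# the first vertex facing the second (x-east/y-north frame assumed).
def side_of_opening(v1, v2, point):
    dx, dy = v2[0] - v1[0], v2[1] - v1[1]        # facing direction
    px, py = point[0] - v1[0], point[1] - v1[1]  # first vertex -> point
    cross = dx * py - dy * px
    if cross > 0:
        return "left"
    if cross < 0:
        return "right"
    return "on the line"

# Facing north (+y): a point to the west is on the left, to the east on the right.
assert side_of_opening((0, 0), (0, 1), (-1, 0.5)) == "left"
assert side_of_opening((0, 0), (0, 1), (1, 0.5)) == "right"
```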
+
+## directoryInfo
+
+The `directoryInfo` feature class defines the name, address, phone number, website, and hours of operation for a unit, facility, or an occupant of a unit or facility.
+
+**Geometry Type**: None
+
+| Property | Type | Required | Description |
+|--|--|--|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`streetAddress` |string |false |Street address part of the address. Maximum length allowed is 1000. |
+|`unit` |string |false |Unit number part of the address. Maximum length allowed is 1000. |
+|`locality`| string| false |The locality of the address (for example: city, municipality, village). Maximum length allowed is 1000.|
+|`adminDivisions`| string| false |Administrative division part of the address, from smallest to largest (County, State, Country). For example: ["King", "Washington", "USA" ] or ["West Godavari", "Andhra Pradesh", "IND" ]. Maximum length allowed is 1000.|
+|`postalCode`| string | false |Postal code part of the address. Maximum length allowed is 1000.|
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000.|
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000. |
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000. |
+|`phoneNumber` | string | false | Phone number. |
+|`website` | string | false | Website URL. Maximum length allowed is 1000. |
+|`hoursOfOperation` | string | false | Hours of operation as text, following the [Open Street Map specification](https://wiki.openstreetmap.org/wiki/Key:openingHours/specification). Maximum length allowed is 1000. |
+
+## pointElement
+
+The `pointElement` feature class defines a point feature in a unit, such as a first aid kit or a sprinkler head.
+
+**Geometry Type**: MultiPoint
+
+| Property | Type | Required | Description |
+|--|--|--|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` |[category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+| `unitId` | string | true | The ID of a [`unit`](#unit) feature containing this feature. Maximum length allowed is 1000.|
+| `isObstruction` | boolean (Default value is `null`.) | false | If `true`, this feature represents an obstruction to be avoided while routing through the containing unit feature. |
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000.|
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000. |
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000.|
+
+## lineElement
+
+The `lineElement` feature class defines a line feature in a unit, such as a dividing wall or window.
+
+**Geometry Type**: MultiLineString
+
+| Property | Type | Required | Description |
+|--|--|--|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` |[category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+| `unitId` | string | true | The ID of a [`unit`](#unit) feature containing this feature. Maximum length allowed is 1000. |
+| `isObstruction` | boolean (Default value is `null`.)| false | If `true`, this feature represents an obstruction to be avoided while routing through the containing unit feature. |
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000. |
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000. |
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000. |
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
+|`obstructionArea` | [Polygon](/rest/api/maps/wfs/getfeaturepreview#featuregeojson)| false | A simplified geometry (when the line geometry is complicated) of the feature that is to be avoided during routing. Requires `isObstruction` set to true.|
+
+## areaElement
+
+The `areaElement` feature class defines a polygon feature in a unit, such as an area that's open to below, or an obstruction such as an island in a unit.
+
+**Geometry Type**: MultiPolygon
+
+| Property | Type | Required | Description |
+|--|--|--|--|
+|`originalId` | string |true | The ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the feature with another feature in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`categoryId` |[category.Id](#category) |true | The ID of a [`category`](#category) feature.|
+| `unitId` | string | true | The ID of a [`unit`](#unit) feature containing this feature. Maximum length allowed is 1000. |
+| `isObstruction` | boolean | false | If `true`, this feature represents an obstruction to be avoided while routing through the containing unit feature. |
+|`obstructionArea` | geometry: ["Polygon","MultiPolygon" ]| false | A simplified geometry (when the line geometry is complicated) of the feature that is to be avoided during routing. Requires `isObstruction` set to true.|
+|`name` | string | false | Name of the feature in local language. Maximum length allowed is 1000. |
+|`nameSubtitle` | string | false | Subtitle that shows up under the `name` of the feature. Can be used to display the name in a different language, and so on. Maximum length allowed is 1000.|
+|`nameAlt` | string | false | Alternate name used for the feature. Maximum length allowed is 1000.|
+|`anchorPoint` | [Point](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) | false | [GeoJSON Point geometry](/rest/api/maps/wfs/getfeaturepreview#featuregeojson) that represents the feature as a point. Can be used to position the label of the feature.|
+
+## category
+
+The `category` class feature defines category names. For example: "room.conference".
+
+**Geometry Type**: None
+
+| Property | Type | Required | Description |
+|--|--|--|--|
+|`originalId` | string |true | The category's original ID derived from client data. Maximum length allowed is 1000.|
+|`externalId` | string |true | An ID used by the client to associate the category with another category in a different dataset, such as in an internal database. Maximum length allowed is 1000.|
+|`name` | string | true | Name of the category. Suggested to use "." to represent hierarchy of categories. For example: "room.conference", "room.privateoffice". Maximum length allowed is 1000. |
+| `routeThroughBehavior` | boolean | false | Determines whether a feature can be used for through traffic.|
+|`isRoutable` | boolean (Default value is `null`.) | false | Determines if a feature should be part of the routing graph. If set to `true`, the unit can be used as source/destination or intermediate node in the routing experience. |
+
azure-maps Creator Geographic Scope https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/creator-geographic-scope.md
+
+ Title: Azure Maps Creator service geographic scope
+description: Learn about Azure Maps Creator service's geographic mappings in Azure Maps
+ Last updated: 05/18/2021
+
+
+# Creator service geographic scope
+
+Azure Maps Creator is a geographically scoped service. Creator offers a resource provider API that, given an Azure region, creates an instance of Creator data deployed at the geographical level. The mapping from an Azure region to geography happens behind the scenes as described in the table below. For more details on Azure regions and geographies, see [Azure geographies](https://azure.microsoft.com/global-infrastructure/geographies).
+
+## Data locations
+
+For disaster recovery and high availability, Microsoft may replicate customer data to other regions only within the same geographic area. For example, data in West Europe may be replicated to North Europe, but not to the United States. However, no matter which geography the customer selected, Microsoft doesn't control or limit the locations from which customers, or their end users, may access customer data via the Azure Maps APIs.
+
+## Geographic and regional mapping
+
+The following table describes the mapping between geography and supported Azure regions, and the respective geographic API endpoint. For example, if a Creator account is provisioned in the West US 2 region that falls within the United States geography, all API calls to the Conversion service must be made to `us.atlas.microsoft.com/conversion/convert`.
++
+| Azure Geographic areas (geos) | Azure datacenters (regions) | API geographic endpoint |
+|--|--|--|
+| Europe| West Europe, North Europe | eu.atlas.microsoft.com |
+|United States | West US 2, East US 2 | us.atlas.microsoft.com |
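The mapping in the table can be expressed directly in code. The path for the Conversion service follows the `us.atlas.microsoft.com/conversion/convert` example in the text; the helper below is an illustration, not an SDK function.

```python
# Sketch of the geography-to-endpoint mapping from the table above.
REGION_TO_ENDPOINT = {
    "West Europe": "eu.atlas.microsoft.com",
    "North Europe": "eu.atlas.microsoft.com",
    "West US 2": "us.atlas.microsoft.com",
    "East US 2": "us.atlas.microsoft.com",
}

def conversion_url(region):
    host = REGION_TO_ENDPOINT[region]  # KeyError for unsupported regions
    return f"https://{host}/conversion/convert"

print(conversion_url("West US 2"))  # https://us.atlas.microsoft.com/conversion/convert
```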
azure-maps Creator Indoor Maps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/creator-indoor-maps.md
Title: Work with indoor maps in Azure Maps Creator (Preview)
-description: This article introduces concepts that apply to Azure Maps Creator services (Preview)
+ Title: Work with indoor maps in Azure Maps Creator
+description: This article introduces concepts that apply to Azure Maps Creator services
Previously updated : 12/07/2020 Last updated : 05/26/2021
-# Creator (Preview) for indoor maps
+# Creator for indoor maps
+This article introduces concepts and tools that apply to Azure Maps Creator. We recommend that you read this article before you begin to use the Azure Maps Creator API and SDK.
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
+You can use Creator to develop applications with map features that are based on indoor map data. This article describes the process of uploading, converting, creating, and using your map data. Typically, the workflow is completed by two different personas with distinct areas of expertise and responsibility:
-This article introduces concepts and tools that apply to Azure Maps Creator. We recommend that you read this article before you begin to use the Azure Maps Creator API and SDK.
+- Map maker: responsible for curating and preparing the map data.
+- Creator map data user: uses the customer map data in applications.
-You can use Creator to develop applications with map features based on indoor map data. This article describes the process of uploading, converting, creating, and using your map data. The entire workflow is illustrated in the diagram below.
+The following diagram illustrates the entire workflow.
![Creator map data workflow](./media/creator-indoor-maps/workflow.png)
-## Create Azure Maps Creator (Preview)
+## Create Azure Maps Creator
+
+To use Creator services, Azure Maps Creator must be created in an Azure Maps account. For information about how to create Azure Maps Creator in Azure Maps, see [Manage Azure Maps Creator](how-to-manage-creator.md).
+
+## Creator authentication
+
+Creator inherits Azure Maps Access Control (IAM) settings. All API calls for data access must be sent with authentication and authorization rules.
+
+Creator usage data is incorporated in your Azure Maps usage charts and activity log. For more information, see [Manage authentication in Azure Maps](./how-to-manage-authentication.md).
+
+>[!Important]
+>We recommend using:
+>
+> - Azure Active Directory (Azure AD) in all solutions that are built with an Azure Maps account using Creator services. For more information about Azure AD, see [Azure AD authentication](azure-maps-authentication.md#azure-ad-authentication).
+>
+>- Role-based access control settings. Using these settings, map makers can act as the Azure Maps Data Contributor role, and Creator map data users can act as the Azure Maps Data Reader role. For more information, see [Authorization with role-based access control](azure-maps-authentication.md#authorization-with-role-based-access-control).
-To use Creator services (Preview), Azure Maps Creator must be created in an Azure Maps account. For information on how to create Azure Maps Creator in Azure Maps, see [Manage Azure Maps Creator](how-to-manage-creator.md).
+## Creator data item types
+
+Creator services create, store, and use various data types that are defined and discussed in the following sections. A Creator data item can be one of the following types:
+
+- Converted data
+- Dataset
+- Tileset
+- Feature stateset
## Upload a Drawing package
-Creator (Preview) collects indoor map data by converting an uploaded Drawing package. The Drawing package represents a constructed or remodeled facility. For information on Drawing package requirements, see [Drawing package requirements](drawing-requirements.md).
+Creator collects indoor map data by converting an uploaded Drawing package. The Drawing package represents a constructed or remodeled facility. For information about Drawing package requirements, see [Drawing package requirements](drawing-requirements.md).
-Use the [Azure Maps Data (Preview) Upload API](/rest/api/maps/data/uploadpreview) to upload a Drawing package. Upon a successful upload, the Data Upload API will return a user data identifier (`udid`). The `udid` will be used in the next step to convert the uploaded package into indoor map data.
+Use the [Azure Maps Data Upload API](/rest/api/maps/data%20v2/uploadpreview) to upload a Drawing package. After the Drawing package is uploaded, the Data Upload API returns a user data identifier (`udid`). The `udid` can then be used to convert the uploaded package into indoor map data.
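The handoff can be sketched as follows. Both helpers are hypothetical placeholders for the Data Upload and Conversion REST calls; only the flow itself (upload returns a `udid`, and the `udid` feeds the conversion) comes from the text.

```python
# Hedged sketch of the upload -> convert handoff; no real endpoints or
# request shapes are shown.
import uuid

def upload_drawing_package(package_bytes):
    """Placeholder for the Data Upload API: returns a user data ID (udid)."""
    return str(uuid.uuid4())

def start_conversion(udid):
    """Placeholder for the Conversion API: starts converting the uploaded package."""
    return {"udid": udid, "status": "Running"}

udid = upload_drawing_package(b"...drawing package zip...")
job = start_conversion(udid)
```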
## Convert a Drawing package
-The [Azure Maps Conversion service](/rest/api/maps/conversion) converts an uploaded Drawing package into indoor map data. The Conversion service also validates the package. Validation issues are classified into two types: errors and warnings. If any errors are detected, the conversion process fails. Should warnings be detected, the conversion will succeed. However, it's recommended that you review and resolve all warnings. A warning means that part of the conversion was ignored or automatically fixed. Failing to resolve the warnings could result in errors in latter processes. For more information, see [Drawing package warnings and errors](drawing-conversion-error-codes.md).
+The [Azure Maps Conversion service](/rest/api/maps/v2/conversion) converts an uploaded Drawing package into indoor map data. The Conversion service also validates the package. Validation issues are classified into two types:
-When an error occurs, the Conversion service provides a link to the [Azure Maps Drawing Error Visualizer](drawing-error-visualizer.md) stand-alone web application. You can use the Drawing Error Visualizer to inspect [Drawing package warnings and errors](drawing-conversion-error-codes.md) that occurred during the conversion process. Once you have fixed the errors, you can then attempt to upload and convert the package.
+- Errors: If any errors are detected, the conversion process fails. When an error occurs, the Conversion service provides a link to the [Azure Maps Drawing Error Visualizer](drawing-error-visualizer.md) stand-alone web application. You can use the Drawing Error Visualizer to inspect [Drawing package warnings and errors](drawing-conversion-error-codes.md) that occurred during the conversion process. After you fix the errors, you can attempt to upload and convert the package.
+- Warnings: If any warnings are detected, the conversion succeeds. However, we recommend that you review and resolve all warnings. A warning means that part of the conversion was ignored or automatically fixed. Failing to resolve the warnings could result in errors in later processes.
+For more information, see [Drawing package warnings and errors](drawing-conversion-error-codes.md).
## Create indoor map data
-Azure Maps Creator (Preview) provides three
+Azure Maps Creator provides the following services that support map creation:
-* [Dataset service](/rest/api/maps/dataset/createpreview).
-Use the Dataset service to create a dataset from a converted Drawing package data.
-* [Tileset service](/rest/api/maps/tileset/createpreview).
+- [Dataset service](/rest/api/maps/v2/dataset/createpreview). Use the Dataset service to create a dataset from converted Drawing package data.
+- [Tileset service](/rest/api/maps/v2/tileset/createpreview).
Use the Tileset service to create a vector-based representation of a dataset. Applications can use a tileset to present a visual tile-based view of the dataset.
-* [Feature State service](/rest/api/maps/featurestate).Use the Feature State service to support dynamic map styling. Dynamic map styling allows applications to reflect real-time events on spaces provided by IoT system.
+- [Feature State service](/rest/api/maps/v2/featurestate). Use the Feature State service to support dynamic map styling. Applications can use dynamic map styling to reflect real-time events on spaces provided by the IoT system.
### Datasets
-A dataset is a collection of indoor map features. The indoor map features represent facilities defined in a converted Drawing package. After creating a dataset with the [Dataset service](/rest/api/maps/dataset/createpreview), you can create any number of [tilesets](#tilesets) or [feature statesets](#feature-statesets).
+A dataset is a collection of indoor map features. The indoor map features represent facilities that are defined in a converted Drawing package. After you create a dataset with the [Dataset service](/rest/api/maps/v2/dataset/createpreview), you can create any number of [tilesets](#tilesets) or [feature statesets](#feature-statesets).
-The [Dataset service](/rest/api/maps/dataset/createpreview) allows developers, at any time, to add or remove facilities to an existing dataset. For more information on how to update an existing dataset using the API, see the append options in [Dataset service](/rest/api/maps/dataset/createpreview). For an example of how to update a dataset, see [Data Maintenance](#data-maintenance).
+At any time, developers can use the [Dataset service](/rest/api/maps/v2/dataset/createpreview) to add or remove facilities to an existing dataset. For more information about how to update an existing dataset using the API, see the append options in [Dataset service](/rest/api/maps/v2/dataset/createpreview). For an example of how to update a dataset, see [Data maintenance](#data-maintenance).
### Tilesets
-A tileset is a collection of vector data that represents a set of uniform grid tiles. Developers can use the [Tileset service](/rest/api/maps/tileset/createpreview) to create tilesets from a dataset.
+A tileset is a collection of vector data that represents a set of uniform grid tiles. Developers can use the [Tileset service](/rest/api/maps/v2/tileset/createpreview) to create tilesets from a dataset.
-To reflect different content stages, you can create multiple tilesets from the same dataset. For example, you could make one tileset with furniture and equipment, and another tileset without furniture and equipment. You might choose to generate one tileset with the most recent data updates, and one without the most recent data updates.
+To reflect different content stages, you can create multiple tilesets from the same dataset. For example, you can make one tileset with furniture and equipment, and another tileset without furniture and equipment. You might choose to generate one tileset with the most recent data updates, and another tileset without the most recent data updates.
-In addition to the vector data, the tileset provides metadata for map rendering optimization. For example, tileset metadata contains a min and max zoom level for the tileset. The metadata also provides a bounding box defining the geographic extent of the tileset. The bounding box allows an application to programmatically set the correct center point. For more information about tileset metadata, see [Tileset List API](/rest/api/maps/tileset/listpreview).
+In addition to the vector data, the tileset provides metadata for map rendering optimization. For example, tileset metadata contains a minimum and maximum zoom level for the tileset. The metadata also provides a bounding box that defines the geographic extent of the tileset. An application can use a bounding box to programmatically set the correct center point. For more information about tileset metadata, see [Tileset List API](/rest/api/maps/v2/tileset/listpreview).
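As a sketch of that last point, an application can derive a center point from the tileset's bounding box metadata. The `[west, south, east, north]` ordering is an assumption for illustration; check the Tileset List response for the actual layout.

```python
# Illustrative sketch: derive a map center point from a bounding box.
def center_from_bbox(bbox):
    west, south, east, north = bbox
    return ((west + east) / 2.0, (south + north) / 2.0)

print(center_from_bbox([0.0, 10.0, 4.0, 20.0]))  # (2.0, 15.0)
```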
-Once a tileset has been created, it can be retrieved by the [Render V2 service](#render-v2-service).
+After a tileset is created, it can be retrieved by the [Render V2 service](#render-v2-get-map-tile-api).
-If a tileset becomes outdated and is no longer useful, you can delete the tileset. For more information on how to delete tilesets, see [Data Maintenance](#data-maintenance).
+If a tileset becomes outdated and is no longer useful, you can delete the tileset. For information about how to delete tilesets, see [Data maintenance](#data-maintenance).
>[!NOTE]
->A tileset is independent of the dataset from which it was created. If you create tilesets from a dataset, and then subsequently update that dataset, the tilesets will not be updated. To reflect changes in a dataset, you must create new tilesets. Similarly, if you delete a tileset, the dataset will not be affected.
+>A tileset is independent of the dataset from which it was created. If you create tilesets from a dataset, and then subsequently update that dataset, the tilesets aren't updated.
+>
+>To reflect changes in a dataset, you must create new tilesets. Similarly, if you delete a tileset, the dataset isn't affected.
### Feature statesets
-Feature statesets are collections of dynamic properties (*states*) assigned to dataset features such as rooms or equipment. An example of a *state* could be temperature or occupancy. Each *state* is a key/value pair containing the name of the property, the value, and the timestamp of the last update.
+Feature statesets are collections of dynamic properties (*states*) that are assigned to dataset features, such as rooms or equipment. An example of a *state* might be temperature or occupancy. Each *state* is a key/value pair that contains the name of the property, the value, and the timestamp of the last update.
-The [Feature State service](/rest/api/maps/featurestate/createstatesetpreview) lets you create and manage a feature stateset for a dataset. The stateset is defined by one or more *states*. Each feature, such as a room, can have one *state* attached to it.
+You can use the [Feature State service](/rest/api/maps/v2/featurestate/createstatesetpreview) to create and manage a feature stateset for a dataset. The stateset is defined by one or more *states*. Each feature, such as a room, can have one *state* attached to it.
-The value of each *state* in a stateset can be updated or retrieved by IoT devices or other applications. For example, using the [Feature State Update API](/rest/api/maps/featurestate/updatestatespreview), devices measuring space occupancy can systematically post the state change of a room.
+The value of each *state* in a stateset can be updated or retrieved by IoT devices or other applications. For example, using the [Feature State Update API](/rest/api/maps/v2/featurestate/updatestatespreview), devices measuring space occupancy can systematically post the state change of a room.
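To make the update flow concrete, here's a minimal sketch of how a device might assemble a state-update payload. The field names (`keyName`, `value`, `eventTimestamp`) and the payload shape are illustrative assumptions based on the description above, not the documented request schema.

```python
import json

def build_state_update(state_name, value, timestamp):
    # Hypothetical payload shape: each state is a key/value pair carrying
    # the property name, its value, and the timestamp of the last update.
    return {
        "states": [
            {
                "keyName": state_name,       # name of the dynamic property
                "value": value,              # current value, e.g. occupancy
                "eventTimestamp": timestamp  # when the reading was taken
            }
        ]
    }

payload = build_state_update("occupied", True, "2021-05-26T12:00:00Z")
print(json.dumps(payload))
```

A device measuring space occupancy would post such a payload against the stateset each time a room's occupancy changes.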
-An application can use a feature stateset to dynamically render features in a facility according to their current state and respective map style. For more information on using feature statesets to style features in a rendering map, see [Indoor Web SDK Module](#indoor-maps-module).
+An application can use a feature stateset to dynamically render features in a facility according to their current state and respective map style. For more information about using feature statesets to style features in a rendering map, see [Indoor Maps module](#indoor-maps-module).
>[!NOTE]
->Like tilesets, changing a dataset does not affect the existing feature stateset and deleting a feature stateset will have no effect on the dataset to which it is attached.
+>Like tilesets, changing a dataset doesn't affect the existing feature stateset, and deleting a feature stateset doesn't affect the dataset to which it's attached.
## Using indoor maps
-### Render V2 service
+### Render V2-Get Map Tile API
-The Azure Maps [Render V2 service-Get Map Tile API (Preview)](/rest/api/maps/renderv2/getmaptilepreview) has been extended to support Creator (Preview) tilesets.
+The Azure Maps [Render V2-Get Map Tile API](/rest/api/maps/renderv2/getmaptilepreview) has been extended to support Creator tilesets.
-Render V2 service-Get Map State Tile API allows applications to request tilesets. The tilesets can then be integrated into a map control or SDK. For an example of a map control that uses the Render V2 service, see [Indoor Maps Module](#indoor-maps-module).
+Applications can use the Render V2-Get Map Tile API to request tilesets. The tilesets can then be integrated into a map control or SDK. For an example of a map control that uses the Render V2 service, see [Indoor Maps Module](#indoor-maps-module).
### Web Feature Service API
-Datasets can be queried using the [Web Feature Service (WFS) API](/rest/api/maps/wfs). WFS follows the [Open Geospatial Consortium API Features](http://docs.opengeospatial.org/DRAFTS/17-069r1.html). The WFS API lets you query features within the dataset itself. For example, you can use WFS to find all mid-size meeting rooms of a given facility and floor level.
+You can use the [Web Feature Service (WFS) API](/rest/api/maps/v2/wfs) to query datasets. WFS follows the [Open Geospatial Consortium API Features](http://docs.opengeospatial.org/DRAFTS/17-069r1.html). You can use the WFS API to query features within the dataset itself. For example, you can use WFS to find all mid-size meeting rooms of a specific facility and floor level.
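As a sketch of what a WFS query might look like from client code, the snippet below composes a request URL with Python's standard library. The host, path segments, collection name, and parameter names are placeholders for illustration, not the exact API surface.

```python
from urllib.parse import urlencode

base_url = "https://us.atlas.microsoft.com/wfs/datasets"  # assumed host/path
dataset_id = "my-dataset-id"                              # placeholder

params = {
    "api-version": "2.0",
    "subscription-key": "<your-subscription-key>",        # placeholder
}

# Query a hypothetical "unit" collection of the dataset.
url = f"{base_url}/{dataset_id}/collections/unit/items?{urlencode(params)}"
print(url)
```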
+
+### Alias API
+
+Creator services such as Conversion, Dataset, Tileset, and Feature State return an identifier for each resource that's created from the APIs. The [Alias API](/rest/api/maps/v2/alias) allows you to assign an alias to reference a resource identifier.
### Indoor Maps module
-The [Azure Maps Web SDK](./index.yml) includes the Indoor Maps module. This module offers extended functionalities to the Azure Maps *Map Control* library. The Indoor Maps module renders indoor maps created in Creator (Preview). It integrates widgets such as *floor picker*, which helps users visualize the different floors.
+The [Azure Maps Web SDK](./index.yml) includes the Indoor Maps module. This module offers extended functionalities to the Azure Maps *Map Control* library. The Indoor Maps module renders indoor maps created in Creator. It integrates widgets, such as *floor picker*, that help users to visualize the different floors.
-The Indoor Maps module allows you to create web applications that integrate indoor map data with other [Azure Maps services](./index.yml). The most common application setups could include adding knowledge to indoor maps from other maps such as road, imagery, weather, and transit.
+You can use the Indoor Maps module to create web applications that integrate indoor map data with other [Azure Maps services](./index.yml). The most common application setups include adding knowledge from other maps - such as road, imagery, weather, and transit - to indoor maps.
-The Indoor Maps module also supports dynamic map styling. For a step-by-step walk-through on how to implement feature stateset dynamic styling in an application, see [How to Use the Indoor Map Module](how-to-use-indoor-module.md)
+The Indoor Maps module also supports dynamic map styling. For a step-by-step walkthrough to implement feature stateset dynamic styling in an application, see [Use the Indoor Map module](how-to-use-indoor-module.md).
### Azure Maps integration
-As you begin to develop solutions for indoor maps, you can discover ways to integrate existing Azure Maps capabilities. For example, asset tracking or safety scenarios could be implemented by using the [Azure Maps Geofence API](/rest/api/maps/spatial/postgeofence) with Creator indoor maps. The Geofence API could be used to determine, for example, whether a worker enters or leaves specific indoor areas. For more information on how to connect Azure Maps with IoT telemetry is available [here](tutorial-iot-hub-maps.md).
+As you begin to develop solutions for indoor maps, you can discover ways to integrate existing Azure Maps capabilities. For example, you can implement asset tracking or safety scenarios by using the [Azure Maps Geofence API](/rest/api/maps/spatial/postgeofence) with Creator indoor maps. You can use the Geofence API to determine, for instance, whether a worker enters or leaves specific indoor areas. For more information about how to connect Azure Maps with IoT telemetry, see [this IoT spatial analytics tutorial](tutorial-iot-hub-maps.md).
-### Data Maintenance
+### Data maintenance
- Azure Maps Creator (Preview) List, Update, and Delete API allows you to list, update, and delete your datasets, tilesets, and feature statesets.
+ You can use the Azure Maps Creator List, Update, and Delete API to list, update, and delete your datasets, tilesets, and feature statesets.
>[!NOTE]
->Whenever you review a list of items and decide to delete them, you must consider the impact of that deletion on all dependent API or applications. For example, if you should delete a tileset that is currently being used by an application by means of the [Render V2 - Get Map Tile API](/rest/api/maps/renderv2/getmaptilepreview), deleting that tileset would result in an application failure to render that tileset.
+>When you review a list of items to determine whether to delete them, consider the impact of that deletion on all dependent API or applications. For example, if you delete a tileset that's being used by an application by means of the [Render V2-Get Map Tile API](/rest/api/maps/renderv2/getmaptilepreview), the application fails to render that tileset.
### Example: Updating a dataset
-The following example shows you how to update a dataset, create a new tileset, and delete an old tileset.
+The following example shows how to update a dataset, create a new tileset, and delete an old tileset:
1. Follow the steps in the [Upload a Drawing package](#upload-a-drawing-package) and [Convert a Drawing package](#convert-a-drawing-package) sections to upload and convert the new Drawing package.
-2. Use the [Dataset Create API](/rest/api/maps/dataset/createpreview) to append the converted data to the existing dataset.
-
-3. Use the [Tileset Create API](/rest/api/maps/tileset/createpreview) to generate a new tileset out of the updated dataset. Save the new tilesetId for step 4.
-
-4. Update the tileset identifier in your application to enable the visualization of the updated campus dataset. If the old tileset is no longer in use, you can delete it.
+2. Use the [Dataset Create API](/rest/api/maps/v2/dataset/createpreview) to append the converted data to the existing dataset.
+3. Use the [Tileset Create API](/rest/api/maps/v2/tileset/createpreview) to generate a new tileset out of the updated dataset.
+4. Save the new **tilesetId** for the next step.
+5. To enable the visualization of the updated campus dataset, update the tileset identifier in your application. If the old tileset is no longer used, you can delete it.
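The numbered steps above can be sketched as a simple pipeline. The helper functions below are stand-ins for the real HTTP calls (Conversion, Dataset Create, Tileset Create); all identifiers and names are hypothetical.

```python
def append_to_dataset(dataset_id, conversion_id):
    # Stand-in for Dataset Create called with the existing datasetId,
    # which appends the newly converted data to that dataset.
    return dataset_id

def create_tileset(dataset_id):
    # Stand-in for Tileset Create; returns the new tilesetId.
    return f"tileset-for-{dataset_id}"

dataset_id = "dataset-1"          # existing dataset
old_tileset_id = "tileset-old"    # tileset currently used by the app

dataset_id = append_to_dataset(dataset_id, "conversion-123")
new_tileset_id = create_tileset(dataset_id)

# Point the application at new_tileset_id; once the old tileset is no
# longer used anywhere, it can safely be deleted.
print(new_tileset_id)
```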
## Next steps

> [!div class="nextstepaction"]
-> [Tutorial: Creating a Creator (Preview) indoor map](tutorial-creator-indoor-maps.md)
+> [Tutorial: Creating a Creator indoor map](tutorial-creator-indoor-maps.md)
azure-maps Creator Long Running Operation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/creator-long-running-operation.md
-# Creator (Preview) Long-Running Operation API
-
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+# Creator Long-Running Operation API
Some APIs in Azure Maps use an [Asynchronous Request-Reply pattern](/azure/architecture/patterns/async-request-reply). This pattern allows Azure Maps to provide highly available and responsive services. This article explains Azure Maps' specific implementation of long-running asynchronous background processing.
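The request-reply pattern described above boils down to submitting an operation and then polling a status endpoint until it reports a terminal state. The sketch below models that loop; `get_status` is a stand-in for an HTTP GET against the operation's status URL, and the status strings are assumptions for illustration.

```python
import time

def poll_until_done(get_status, interval_seconds=0.0):
    # Poll the operation's status until it reaches a terminal state.
    while True:
        status = get_status()
        if status in ("Succeeded", "Failed"):
            return status
        time.sleep(interval_seconds)

# Simulate a service that reports "Running" twice before succeeding.
responses = ["Running", "Running", "Succeeded"]
result = poll_until_done(lambda: responses.pop(0))
print(result)  # Succeeded
```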
azure-maps Drawing Conversion Error Codes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/drawing-conversion-error-codes.md
Title: Azure Maps Drawing Conversion errors and warnings
description: Learn about the Conversion errors and warnings you may encounter while you're using the Azure Maps Conversion service. Read the recommendations on how to resolve the errors and the warnings, with some examples. Previously updated : 12/07/2020 Last updated : 05/21/2021
# Drawing conversion errors and warnings
-The [Azure Maps Conversion service](/rest/api/maps/conversion) lets you convert uploaded Drawing packages into map data. Drawing packages must adhere to the [Drawing package requirements](drawing-requirements.md). If one or more requirements aren't met, then the Conversion service will return errors or warnings. This article lists the conversion error and warning codes, with recommendations on how to resolve them. It also provides some examples of drawings that can cause the Conversion service to return these codes.
+The [Azure Maps Conversion service](/rest/api/maps/v2/conversion) lets you convert uploaded Drawing packages into map data. Drawing packages must adhere to the [Drawing package requirements](drawing-requirements.md). If one or more requirements aren't met, then the Conversion service will return errors or warnings. This article lists the conversion error and warning codes, with recommendations on how to resolve them. It also provides some examples of drawings that can cause the Conversion service to return these codes.
The Conversion service will succeed even if there are conversion warnings. However, it's recommended that you review and resolve all warnings. A warning means part of the conversion was ignored or automatically fixed. Failing to resolve the warnings could result in errors in later processes.
The **automaticRepairPerformed** warning occurs when the Conversion service auto
![Example of a snapped PolyLine](./media/drawing-conversion-error-codes/automatic-repair-2.png)
-* The image below shows how, in a layer that supports only closed PolyLines, the Conversion service repaired multiple non-closed PolyLines. In order to avoid discarding the non-closed PolyLines, the service combined them into a single closed PolyLine.
+* The image below shows how, in a layer that supports only closed PolyLines, the Conversion service repaired multiple non-closed PolyLines. To avoid discarding the non-closed PolyLines, the service combined them into a single closed PolyLine.
![Example of non-closed Polylines combined into a single closed PolyLine](./media/drawing-conversion-error-codes/automatic-repair-3.png)
In the following image, the door geometry, highlighted in red, overlaps the yell
#### *How to fix doorOutsideLevel*
-To fix a **doorOutsideLevel** warning, redraw your door geometry so that it is inside the level boundaries.
+To fix a **doorOutsideLevel** warning, redraw your door geometry so that it's inside the level boundaries.
## Zone warnings
You attempted to upload a Drawing package with an incorrect `udid` parameter.
To fix an **invalidUserData** error, verify that:

* You've provided a correct `udid` for the uploaded package.
-* Azure Maps Creator (Preview) has been enabled for the Azure Maps account you used for uploading the Drawing package.
+* Azure Maps Creator has been enabled for the Azure Maps account you used for uploading the Drawing package.
* The API request to the Conversion service contains the subscription key to the Azure Maps account you used for uploading the Drawing package.

### **dwgError**
To fix an **invalidUserData** error, verify that:
A **dwgError** occurs when the drawing package contains an issue with one or more DWG files in the uploaded ZIP archive.
-The **dwgError** occurs when the drawing package contains a DWG file that can't be opened because it is invalid or corrupt.
+The **dwgError** occurs when the drawing package contains a DWG file that can't be opened because it's invalid or corrupt.
* A DWG file isn't a valid AutoCAD DWG file format drawing.
* A DWG file is corrupt.
To fix a **verticalPenetrationError** error, read about how to use a vertical pe
> [How to use Azure Maps Drawing error visualizer](drawing-error-visualizer.md)

> [!div class="nextstepaction"]
-> [Creator (Preview) for indoor mapping](creator-indoor-maps.md)
+> [Drawing Package Guide](drawing-package-guide.md)
+
+> [!div class="nextstepaction"]
+> [Creator for indoor mapping](creator-indoor-maps.md)
azure-maps Drawing Error Visualizer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/drawing-error-visualizer.md
Title: Use Azure Maps Drawing Error Visualizer
-description: In this article, you'll learn about how to visualize warnings and errors returned by the Creator (Preview) Conversion API.
+description: In this article, you'll learn about how to visualize warnings and errors returned by the Creator Conversion API.
Previously updated : 12/07/2020 Last updated : 05/26/2021
-# Using the Azure Maps Drawing Error Visualizer with Creator (Preview)
+# Using the Azure Maps Drawing Error Visualizer with Creator
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
-The Drawing Error Visualizer is a stand-alone web application that displays [Drawing package warnings and errors](drawing-conversion-error-codes.md) detected during the conversion process. The Error Visualizer web application consists of a static page that you can use without connecting to the internet. You can use the Error Visualizer to fix errors and warnings in accordance with [Drawing package requirements](drawing-requirements.md). The [Azure Maps Conversion API](/rest/api/maps/conversion) only returns a response with a link to the Error Visualizer only when an error is detected.
+The Drawing Error Visualizer is a stand-alone web application that displays [Drawing package warnings and errors](drawing-conversion-error-codes.md) detected during the conversion process. The Error Visualizer web application consists of a static page that you can use without connecting to the internet. You can use the Error Visualizer to fix errors and warnings in accordance with [Drawing package requirements](drawing-requirements.md). The [Azure Maps Conversion API](/rest/api/maps/v2/conversion) returns a response with a link to the Error Visualizer only when an error is detected.
## Prerequisites
Before you can download the Drawing Error Visualizer, you'll need to:
1. [Create an Azure Maps account](quick-demo-map-app.md#create-an-azure-maps-account)
2. [Obtain a primary subscription key](quick-demo-map-app.md#get-the-primary-key-for-your-account), also known as the primary key or the subscription key.
-3. [Create a Creator (Preview) resource](how-to-manage-creator.md)
+3. [Create a Creator resource](how-to-manage-creator.md)
This tutorial uses the [Postman](https://www.postman.com/) application, but you may choose a different API development environment.

## Download
-1. Upload your Drawing package to the Azure Maps Creator service (Preview) to obtain a `udid` for the uploaded package. For steps on how to upload a package, see [Upload a drawing package](tutorial-creator-indoor-maps.md#upload-a-drawing-package).
+1. Upload your Drawing package to the Azure Maps Creator service to obtain a `udid` for the uploaded package. For steps on how to upload a package, see [Upload a drawing package](tutorial-creator-indoor-maps.md#upload-a-drawing-package).
2. Now that the Drawing package is uploaded, we'll use `udid` for the uploaded package to convert the package into map data. For steps on how to convert a package, see [Convert a drawing package](tutorial-creator-indoor-maps.md#convert-a-drawing-package).
This tutorial uses the [Postman](https://www.postman.com/) application, but you
"operationId": "77dc9262-d3b8-4e32-b65d-74d785b53504", "created": "2020-04-22T19:39:54.9518496+00:00", "status": "Failed",
- "resourceLocation": "https://atlas.microsoft.com/conversion/{conversionId}?api-version=1.0",
"properties": {
- "diagnosticPackageLocation": "https://atlas.microsoft.com/mapData/ce61c3c1-faa8-75b7-349f-d863f6523748?api-version=1.0"
+ "diagnosticPackageLocation": "https://us.atlas.microsoft.com/mapData/ce61c3c1-faa8-75b7-349f-d863f6523748?api-version=2.0"
  }
}
```
This tutorial uses the [Postman](https://www.postman.com/) application, but you
Inside the downloaded zipped package from the `diagnosticPackageLocation` link, you'll find two files. * _VisualizationTool.zip_: Contains the source code, media, and web page for the Drawing Error Visualizer.
-* _ConversionWarningsAndErrors.json_: Contains a formatted list of warnings, errors, and additional details that are used by the Drawing Error Visualizer.
+* _ConversionWarningsAndErrors.json_: Contains a formatted list of warnings, errors, and other details that are used by the Drawing Error Visualizer.
Unzip the _VisualizationTool.zip_ folder. It contains the following items:
After launching the Drawing Error Visualizer tool, you'll be presented with the
:::image type="content" source="./media/drawing-errors-visualizer/start-page.png" alt-text="Drawing Error Visualizer App - Start Page":::
-The _ConversionWarningsAndErrors.json_ file has been placed at the root of the downloaded directory. To load the _ConversionWarningsAndErrors.json_ you can either drag & drop the file onto the box or click on the box, find the file in the File Explorer dialogue, and then upload the file.
+The _ConversionWarningsAndErrors.json_ file has been placed at the root of the downloaded directory. To load the _ConversionWarningsAndErrors.json_, drag & drop the file onto the box. Or, click on the box, find the file in the File Explorer dialog, and upload the file.
:::image type="content" source="./media/drawing-errors-visualizer/loading-data.gif" alt-text="Drawing Error Visualizer App - Drag and drop to load data":::
Once the _ConversionWarningsAndErrors.json_ file loads, you'll see a list of you
## Next steps
-Once your [Drawing package meets the requirements](drawing-requirements.md), you can use the [Azure Maps Dataset service](/rest/api/maps/conversion) to convert the Drawing package to a dataset. Then, you can use the Indoor Maps web module to develop your application. Learn more by reading the following articles:
+Once your [Drawing package meets the requirements](drawing-requirements.md), you can use the [Azure Maps Dataset service](/rest/api/maps/v2/conversion) to convert the Drawing package to a dataset. Then, you can use the Indoor Maps web module to develop your application. Learn more by reading the following articles:
> [!div class="nextstepaction"]
> [Drawing Conversion error codes](drawing-conversion-error-codes.md)

> [!div class="nextstepaction"]
-> [Creator (Preview) for indoor maps](creator-indoor-maps.md)
+> [Drawing Package Guide](drawing-package-guide.md)
+
+> [!div class="nextstepaction"]
+> [Creator for indoor maps](creator-indoor-maps.md)
> [!div class="nextstepaction"]
> [Use the Indoor Maps module](how-to-use-indoor-module.md)
azure-maps Drawing Package Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/drawing-package-guide.md
+
+ Title: Drawing package guide for Microsoft Azure Maps Creator (Preview)
+description: Learn how to prepare a Drawing package for the Azure Maps Conversion service
+ Last updated : 05/18/2021
+# Conversion Drawing package guide
+
+This guide shows you how to prepare your Drawing Package for the [Azure Maps Conversion service](/rest/api/maps/v2/conversion) using specific CAD commands to correctly prepare your DWG files and manifest file for the Conversion service.
+
+To start with, make sure your Drawing Package is in .zip format, and contains the following files:
+
+* One or more drawing files in DWG format.
+* A Manifest file describing DWG files and facility metadata.
+
+If you don't have your own package to reference along with this guide, you may download the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples).
+
+You may choose any CAD software to open and prepare your facility drawing files. However, this guide is created using Autodesk's AutoCAD® software. Any commands referenced in this guide are meant to be executed using Autodesk's AutoCAD® software.
+
+>[!TIP]
+>For more information about drawing package requirements that aren't covered in this guide, see [Drawing Package Requirements](drawing-requirements.md).
+
+## Glossary of terms
+
+For easy reference, here are some terms and definitions that are important as you read this guide.
+
+| Term | Definition |
+|:-|:|
+| Layer | An AutoCAD DWG layer from the drawing file.|
+| Entity | An AutoCAD DWG entity from the drawing file. |
+| Level | An area of a building at a set elevation. For example, the floor of a building. |
+| Feature | An object that combines a geometry with additional metadata information. |
+| Feature classes | A common blueprint for features. For example, a *unit* is a feature class, and an *office* is a feature. |
+
+## Step 1: DWG file requirements
+
+When preparing your facility drawing files for the Conversion service, make sure to follow these preliminary requirements and recommendations:
+
+* Facility drawing files must be saved in DWG format, which is the native file format for Autodesk's AutoCAD® software.
+
+* The Conversion service works with the AutoCAD DWG file format. AC1032 is the internal format version for DWG files, so it's a good idea to save your drawings using the AC1032 internal DWG file format version.
+
+* A DWG file can only contain a single floor. A floor of a facility must be provided in its own separate DWG file. So, if you have five floors in a facility, you must create five separate DWG files.
+
+## Step 2: Prepare the DWG files
+
+This part of the guide will show you how to use CAD commands to ensure that your DWG files meet the requirements of the Conversion service.
+
+You may choose any CAD software to open and prepare your facility drawing files. However, this guide is created using Autodesk's AutoCAD® software. Any commands referenced in this guide are meant to be executed using Autodesk's AutoCAD® software.
+
+### Bind External References
+
+Each floor of a facility must be provided as one DWG file. If there are no external references, then nothing more needs to be done. However, if there are any external references, they must be bound to a single drawing. To bind an external reference, you may use the `XREF` command. After binding, each external reference drawing will be added as a block reference. If you need to make changes to any of these layers, remember to explode the block references by using the `XPLODE` command.
+
+### Unit of measurement
+
+The drawings can be created using any unit of measurement. However, all drawings must use the same unit of measurement. So, if one floor of the facility is using millimeters, then all other floors (drawings) must also be in millimeters. You can verify or modify the measurement unit by using the `UNITS` command.
+
+The following image shows the Drawing Units window within Autodesk's AutoCAD® software that you can use to verify the unit of measurement.
+### Alignment
+
+Each floor of a facility is provided as an individual DWG file. As a result, it's possible that the floors aren't perfectly aligned when stacked on top of each other. The Azure Maps Conversion service requires that all drawings be aligned with the physical space. To verify alignment, use a reference point that can span across floors, such as an elevator or column that spans multiple floors. You can view all the floors by opening a new drawing, and then using the `XATTACH` command to load all floor drawings. If you need to fix any alignment issues, you can use the reference points and the `MOVE` command to realign the floors that require it.
+
+### Layers
+
+Ensure that each layer of a drawing contains entities of one feature class. If a layer contains entities for walls, then it can't have other features such as units or doors. However, a feature class can be split up over multiple layers. For example, you can have three layers in the drawing that contain wall entities.
+
+Furthermore, each layer has a list of supported entity types and any other types are ignored. For example, if the Unit Label layer only supports single-line text, a multiline text or Polyline on the same layer is ignored.
+
+For a better understanding of layers and feature classes, see [Drawing Package Requirements](drawing-requirements.md).
+
+### Exterior layer
+
+A single level feature is created from each exterior layer or layers. This level feature defines the level's perimeter. It's important to ensure that the entities in the exterior layer meet the requirements of the layer. For example, a closed Polyline is supported; but an open Polyline isn't. If your exterior layer is made of multiple line segments, they must be provided as one closed Polyline. To join multiple line segments together, select all line segments and use the `JOIN` command.
+
+The following image is taken from the sample package, and shows the exterior layer of the facility in red. The unit layer is turned off to help with visualization.
+### Unit layer
+
+Units are navigable spaces in the building, such as offices, hallways, stairs, and elevators. A closed entity type such as Polygon, closed Polyline, Circle, or closed Ellipse is required to represent each unit. So, walls and doors alone won't create a unit because there isn’t an entity that represents the unit.
+
+The following image is taken from the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples) and shows the unit label layer and unit layer in red. All other layers are turned off to help with visualization. Also, one unit is selected to help show that each unit is a closed Polyline.
+### Unit label layer
+
+If you'd like to add a name property to a unit, you'll need to add a separate layer for unit labels. Labels must be provided as single-line text entities that fall inside the bounds of a unit. A corresponding unit property must be added to the manifest file, where the `unitName` matches the contents of the text entity. To learn about all supported unit properties, see [`unitProperties`](#unitproperties).
+
+### Door layer
+
+Doors are optional. However, you may use doors to specify the entry points for a unit. Doors can be drawn in any way, as long as the entity type is supported by the door layer. The door must overlap the boundary of a unit, and the overlapping edge of the unit is then treated as an opening to the unit.
+
+The following image is taken from the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples) and shows a unit with a door (in red) drawn on the unit boundary.
+### Wall layer
+
+The wall layer is meant to represent the physical extents of a facility, such as walls and columns. The Azure Maps Conversion service perceives walls as physical structures that are an obstruction to routing. With that in mind, a wall should be thought of as a physical structure that one can see, but not walk through. Anything that can't be seen won't be captured in this layer. If a wall has inner walls or columns inside, then only the exterior should be captured.
+
+## Step 3: Prepare the manifest
+
+The Drawing package Manifest is a JSON file. The Manifest tells the Azure Maps Conversion service how to read the facility DWG files and metadata. Examples of this information include what each DWG layer contains and the geographical location of the facility.
+
+To achieve a successful conversion, all "required" properties must be defined. A sample manifest file can be found inside the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples). This guide doesn't cover all properties supported by the manifest. For more information about manifest properties, see [Manifest File Properties](drawing-requirements.md#manifest-file-requirements).
+
+### Building levels
+
+The building level specifies which DWG file to use for which level. A level must have a level name and an ordinal that describes the vertical order of each level. Every facility must have an ordinal 0, which is the ground floor of a facility. An ordinal 0 must be provided even if the drawings occupy only a few floors of a facility. For example, floors 15-17 can be defined as ordinals 0-2, respectively.
+
+The following example is taken from the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples). The facility has three levels: basement, ground, and level 2. The `filename` contains the file name and path of the file, relative to the manifest file, within the .zip Drawing package.
+
+```json
+    "buildingLevels": {
+        "levels": [
+            {
+                "levelName": "Basement",
+                "ordinal": -1,
+                "filename": "./Basement.dwg"
+            },
+            {
+                "levelName": "Ground",
+                "ordinal": 0,
+                "filename": "./Ground.dwg"
+            },
+            {
+                "levelName": "Level 2",
+                "ordinal": 1,
+                "filename": "./Level_2.dwg"
+            }
+        ]
+    },
+```
+
+### georeference
+
+The `georeference` object is used to specify where the facility is located geographically and how much to rotate the facility. The origin point of the drawing should match the latitude and longitude provided with the `georeference` object. The rotation is specified as the clockwise angle, in degrees, between true north and the drawing's vertical (Y) axis.
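As an illustration, a minimal `georeference` object might look like the following. The `lat`, `lon`, and `angle` property names and sample coordinates here are an assumption based on the sample Drawing package; confirm them against the manifest reference for your Conversion service version.

```json
"georeference": {
    "lat": 47.63529,
    "lon": -122.13355,
    "angle": 0
},
```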
+
+### dwgLayers
+
+The `dwgLayers` object is used to specify the DWG layer names where feature classes can be found. To receive a properly converted facility, it's important to provide the correct layer names. For example, a DWG wall layer must be provided as a wall layer and not as a unit layer. The drawing can have other layers, such as furniture or plumbing, but they'll be ignored by the Azure Maps Conversion service if they're not specified in the manifest.
+
+The following is an example of the `dwgLayers` object in the manifest.
+
+```json
+"dwgLayers": {
+        "exterior": [
+            "OUTLINE"
+        ],
+        "unit": [
+            "UNITS"
+        ],
+        "wall": [
+            "WALLS"
+        ],
+        "door": [
+            "DOORS"
+        ],
+        "unitLabel": [
+            "UNITLABELS"
+        ],
+        "zone": [
+            "ZONES"
+        ],
+        "zoneLabel": [
+            "ZONELABELS"
+        ]
+    }
+```
+
+The following image shows the layers from the corresponding DWG drawing viewed in Autodesk's AutoCAD® software.
+### unitProperties
+
+The `unitProperties` object allows you to define other properties for a unit that you can't define in the DWG file. Examples could be directory information of a unit or the category type of a unit. A unit property is associated with a unit by having the `unitName` object match the label in the `unitLabel` layer.
+
+The following image is taken from the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples). It displays the unit label that's associated with the unit property in the manifest.
+The following snippet shows the unit property object that is associated with the unit.
+
+```json
+"unitProperties": [
+        {
+            "unitName": "B01",
+            "categoryName": "room.office",
+            "navigableBy": ["pedestrian", "wheelchair", "machine"],
+            "routeThroughBehavior": "disallowed",
+            "occupants": [
+                {
+                    "name": "Joe's Office",
+                    "phone": "1 (425) 555-1234"
+                }
+            ],
+            "nameAlt": "Basement01",
+            "nameSubtitle": "01",
+            "addressRoomNumber": "B01",
+            "nonPublic": true,
+            "isRoutable": true,
+            "isOpenArea": true
+        },
+```
+
+## Step 4: Prepare the Drawing Package
+
+You should now have all the DWG drawings prepared to meet Azure Maps Conversion service requirements. A manifest file has also been created to help describe the facility. All files need to be zipped into a single archive file, with the `.zip` extension. It's important that the manifest file is named `manifest.json` and is placed in the root directory of the zipped package. All other files can be in any directory of the zipped package, as long as the filename in the manifest includes the relative path. For an example of a drawing package, see the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples).
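The packaging step can be scripted. The following is a minimal Python sketch, not part of any Azure Maps tooling, that zips a source directory so that `manifest.json` lands at the root of the archive and all other files keep their paths relative to it:

```python
import zipfile
from pathlib import Path

def build_drawing_package(source_dir: str, output_zip: str) -> None:
    """Zip a Drawing package: manifest.json at the archive root, all
    other files stored with paths relative to the package root."""
    source = Path(source_dir)
    if not (source / "manifest.json").is_file():
        raise FileNotFoundError("manifest.json must be at the package root")
    with zipfile.ZipFile(output_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(source.rglob("*")):
            if path.is_file():
                # Relative arcnames keep manifest.json at the archive root.
                zf.write(path, arcname=path.relative_to(source).as_posix())
```

Using relative `arcname` values is what keeps the manifest at the root; zipping the parent folder itself would nest everything one level down and violate the root-manifest requirement.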
+
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Tutorial: Creating a Creator indoor map](tutorial-creator-indoor-maps.md)
azure-maps Drawing Requirements https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/drawing-requirements.md
Title: Drawing package requirements in Microsoft Azure Maps Creator (Preview)
+ Title: Drawing package requirements in Microsoft Azure Maps Creator
description: Learn about the Drawing package requirements to convert your facility design files to map data
Previously updated : 1/08/2021 Last updated : 5/27/2021
# Drawing package requirements
+You can convert uploaded Drawing packages into map data by using the [Azure Maps Conversion service](/rest/api/maps/v2/conversion). This article describes the Drawing package requirements for the Conversion API. To view a sample package, you can download the sample [Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples).
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+For a guide on how to prepare your Drawing package, see [Conversion Drawing Package Guide](drawing-package-guide.md).
-You can convert uploaded Drawing packages into map data by using the [Azure Maps Conversion service](/rest/api/maps/conversion). This article describes the Drawing package requirements for the Conversion API. To view a sample package, you can download the sample [Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples).
## Prerequisites
The Drawing package includes drawings saved in DWG format, which is the native f
You can choose any CAD software to produce the drawings in the Drawing package.
-The [Azure Maps Conversion service](/rest/api/maps/conversion) converts the Drawing package into map data. The Conversion service works with the AutoCAD DWG file format. `AC1032` is the internal format version for the DWG files, and it's a good idea to select `AC1032` for the internal DWG file format version.
+The [Azure Maps Conversion service](/rest/api/maps/v2/conversion) converts the Drawing package into map data. The Conversion service works with the AutoCAD DWG file format `AC1032`.
+ ## Glossary of terms
For easy reference, here are some terms and definitions that are important as yo
| Term | Definition |
|:-|:|
-| Layer | An AutoCAD DWG layer.|
+| Layer | An AutoCAD DWG layer from the drawing file.|
+| Entity | An AutoCAD DWG entity from the drawing file. |
+| Xref | A file in AutoCAD DWG file format, attached to the primary drawing as an external reference. |
| Level | An area of a building at a set elevation. For example, the floor of a building. |
-| Xref |A file in AutoCAD DWG file format (.dwg), attached to the primary drawing as an external reference. |
-| Feature | An object that combines a geometry with more metadata information. |
+| Feature | An instance of an object produced from the Conversion service that combines a geometry with metadata information. |
| Feature classes | A common blueprint for features. For example, a *unit* is a feature class, and an *office* is a feature. |

## Drawing package structure

A Drawing package is a .zip archive that contains the following files:
-* DWG files in AutoCAD DWG file format.
-* A _manifest.json_ file that describes the DWG files in the Drawing package.
-
-The Drawing package must be zipped into a single archive file, with the .zip extension. The DWG files can be organized in any way inside the package, but the manifest file must live at the root directory of the zipped package. The next sections detail the requirements for the DWG files, manifest file, and the content of these files.
+- DWG files in AutoCAD DWG file format.
+- A _manifest.json_ file that describes the DWG files in the Drawing package.
-## DWG files requirements
+The Drawing package must be zipped into a single archive file, with the .zip extension. The DWG files can be organized in any way inside the package, but the manifest file must live at the root directory of the zipped package. The next sections detail the requirements for the DWG files, manifest file, and the content of these files. To view a sample package, you can download the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples).
-A single DWG file is required for each level of the facility. The level's data must be contained in a single DWG file. Any external references (_xrefs_) must be bound to the parent drawing. Additionally, each DWG file:
+## DWG file conversion process
-* Must define the _Exterior_ and _Unit_ layers. It can optionally define the following optional layers: _Wall_, _Door_, _UnitLabel_, _Zone_, and _ZoneLabel_.
-* Must not contain features from multiple levels.
-* Must not contain features from multiple facilities.
-* Must reference the same measurement system and unit of measurement as other DWG files in the Drawing package.
+The [Azure Maps Conversion service](/rest/api/maps/v2/conversion) performs the following on each DWG file:
-The [Azure Maps Conversion service](/rest/api/maps/conversion) can extract the following feature classes from a DWG file:
+- Extracts feature classes:
+ - Levels
+ - Units
+ - Zones
+ - Openings
+ - Walls
+ - Vertical penetrations
+- Produces a *Facility* feature.
+- Produces a minimal set of default Category features to be referenced by other features:
+ - room
+ - structure
+ - wall
+ - opening.door
+ - zone
+ - facility
+
+## DWG file requirements
-* Levels
-* Units
-* Zones
-* Openings
-* Walls
-* Vertical penetrations
+A single DWG file is required for each level of the facility. All data of a single level must be contained in a single DWG file. Any external references (_xrefs_) must be bound to the parent drawing. For example, a facility with three levels will have three DWG files in the Drawing package.
-All conversion jobs result in a minimal set of default categories: room, structure.wall, opening.door, zone, and facility. Additional categories are for each category name referenced by objects.
+Each DWG file must adhere to the following requirements:
-A DWG layer must contain features of a single class. Classes must not share a layer. For example, units and walls can't share a layer.
+- The DWG file must define the _Exterior_ and _Unit_ layers. It can optionally define the following layers: _Wall_, _Door_, _UnitLabel_, _Zone_, and _ZoneLabel_.
+- The DWG file cannot contain features from multiple levels.
+- The DWG file cannot contain features from multiple facilities.
+- The DWG must reference the same measurement system and unit of measurement as other DWG files in the Drawing package.
-DWG layers must also follow the following criteria:
+## DWG layer requirements
-* The origins of drawings for all DWG files must align to the same latitude and longitude.
-* Each level must be in the same orientation as the other levels.
-* Self-intersecting polygons are automatically repaired, and the [Azure Maps Conversion service](/rest/api/maps/conversion) raises a warning. It's advisable to manually inspect the repaired results, because they might not match the expected results.
+Each DWG layer must adhere to the following rules:
-All layer entities must be one of the following types: Line, PolyLine, Polygon, Circular Arc, Circle, Ellipse (closed), or Text (single line). Any other entity types are ignored.
+- A layer must exclusively contain features of a single class. For example, units and walls can't be in the same layer.
+- A single class of features can be represented by multiple layers.
+- Self-intersecting polygons are permitted, but are automatically repaired. When this occurs, the [Azure Maps Conversion service](/rest/api/maps/v2/conversion) raises a warning. It's advisable to manually inspect the repaired results, because they might not match the expected results.
+- Each layer has a supported list of entity types. Any other entity types in a layer will be ignored. For example, text entities are not supported on the wall layer.
-The table below outlines the supported entity types and converted map features for each layer. If a layer contains unsupported entity types, then the [Azure Maps Conversion service](/rest/api/maps/conversion) ignores these entities.
+The table below outlines the supported entity types and converted map features for each layer. If a layer contains unsupported entity types, then the [Azure Maps Conversion service](/rest/api/maps/v2/conversion) ignores those entities.
| Layer | Entity types | Converted Features |
| :-- | :- | :- |
| [Exterior](#exterior-layer) | Polygon, PolyLine (closed), Circle, Ellipse (closed) | Levels |
-| [Unit](#unit-layer) | Polygon, PolyLine (closed), Circle, Ellipse (closed) | Vertical penetrations, Unit
-| [Wall](#wall-layer) | Polygon, PolyLine (closed), Circle, Ellipse (closed) | Not applicable. For more information, see the [Wall layer](#wall-layer).
+| [Unit](#unit-layer) | Polygon, PolyLine (closed), Circle, Ellipse (closed) | Unit and Vertical penetrations
+| [Wall](#wall-layer) | Polygon, PolyLine (closed), Circle, Ellipse (closed) |
| [Door](#door-layer) | Polygon, PolyLine, Line, CircularArc, Circle | Openings |
| [Zone](#zone-layer) | Polygon, PolyLine (closed), Circle, Ellipse (closed) | Zone |
| [UnitLabel](#unitlabel-layer) | Text (single line) | Not applicable. This layer can only add properties to the unit features from the Units layer. For more information, see the [UnitLabel layer](#unitlabel-layer). |
| [ZoneLabel](#zonelabel-layer) | Text (single line) | Not applicable. This layer can only add properties to zone features from the ZonesLayer. For more information, see the [ZoneLabel layer](#zonelabel-layer). |
-The next sections detail the requirements for each layer.
+The sections below describe the requirements for each layer.
### Exterior layer
The DWG file for each level must contain a layer to define that level's perimete
No matter how many entity drawings are in the exterior layer, the [resulting facility dataset](tutorial-creator-indoor-maps.md#create-a-feature-stateset) will contain only one level feature for each DWG file. Additionally:
-* Exteriors must be drawn as Polygon, PolyLine (closed), Circle, or Ellipse (closed).
-* Exteriors may overlap, but are dissolved into one geometry.
-* Resulting level feature must be at least 4 square meters.
-* Resulting level feature must not be greater 400,000 square meters.
+- Exteriors must be drawn as Polygon, PolyLine (closed), Circle, or Ellipse (closed).
+- Exteriors may overlap, but are dissolved into one geometry.
+- Resulting level feature must be at least 4 square meters.
+- Resulting level feature must not be greater than 400,000 square meters.
If the layer contains multiple overlapping PolyLines, the PolyLines are dissolved into a single Level feature. Alternatively, if the layer contains multiple non-overlapping PolyLines, the resulting Level feature has a multi-polygonal representation.
The DWG file for each level defines a layer containing units. Units are navigabl
The Units layer should adhere to the following requirements:
-* Units must be drawn as Polygon, PolyLine (closed), Circle, or Ellipse (closed).
-* Units must fall inside the bounds of the facility exterior perimeter.
-* Units must not partially overlap.
-* Units must not contain any self-intersecting geometry.
+- Units must be drawn as Polygon, PolyLine (closed), Circle, or Ellipse (closed).
+- Units must fall inside the bounds of the facility exterior perimeter.
+- Units must not partially overlap.
+- Units must not contain any self-intersecting geometry.
Name a unit by creating a text object in the UnitLabel layer, and then place the object inside the bounds of the unit. For more information, see the [UnitLabel layer](#unitlabel-layer).
You can see an example of the Units layer in the [sample Drawing package](https:
The DWG file for each level can contain a layer that defines the physical extents of walls, columns, and other building structure.
-* Walls must be drawn as Polygon, PolyLine (closed), Circle, or Ellipse (closed).
-* The wall layer or layers should only contain geometry that's interpreted as building structure.
+- Walls must be drawn as Polygon, PolyLine (closed), Circle, or Ellipse (closed).
+- The wall layer or layers should only contain geometry that's interpreted as building structure.
You can see an example of the Walls layer in the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples).
Door openings in an Azure Maps dataset are represented as a single-line segment
The DWG file for each level can contain a Zone layer that defines the physical extents of zones. A zone is a non-navigable space that can be named and rendered. Zones can span multiple levels and are grouped together using the zoneSetId property.
-* Zones must be drawn as Polygon, PolyLine (closed), or Ellipse (closed).
-* Zones can overlap.
-* Zones can fall inside or outside the facility's exterior perimeter.
+- Zones must be drawn as Polygon, PolyLine (closed), or Ellipse (closed).
+- Zones can overlap.
+- Zones can fall inside or outside the facility's exterior perimeter.
Name a zone by creating a text object in the ZoneLabel layer, and placing the text object inside the bounds of the zone. For more information, see [ZoneLabel layer](#zonelabel-layer).
You can see an example of the Zone layer in the [sample Drawing package](https:/
The DWG file for each level can contain a UnitLabel layer. The UnitLabel layer adds a name property to units extracted from the Unit layer. Units with a name property can have more details specified in the manifest file.
-* Unit labels must be single-line text entities.
-* Unit labels must fall inside the bounds of their unit.
-* Units must not contain multiple text entities in the UnitLabel layer.
+- Unit labels must be single-line text entities.
+- Unit labels must fall entirely inside the bounds of their unit.
+- Units must not contain multiple text entities in the UnitLabel layer.
You can see an example of the UnitLabel layer in the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples).
You can see an example of the UnitLabel layer in the [sample Drawing package](ht
The DWG file for each level can contain a ZoneLabel layer. This layer adds a name property to zones extracted from the Zone layer. Zones with a name property can have more details specified in the manifest file.
-* Zones labels must be single-line text entities.
-* Zones labels must fall inside the bounds of their zone.
-* Zones must not contain multiple text entities in the ZoneLabel layer.
+- Zone labels must be single-line text entities.
+- Zone labels must fall inside the bounds of their zone.
+- Zones must not contain multiple text entities in the ZoneLabel layer.
You can see an example of the ZoneLabel layer in the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples).

## Manifest file requirements
-The zip folder must contain a manifest file at the root level of the directory, and the file must be named **manifest.json**. It describes the DWG files to allow the [Azure Maps Conversion service](/rest/api/maps/conversion) to parse their content. Only the files identified by the manifest are ingested. Files that are in the zip folder, but aren't properly listed in the manifest, are ignored.
+The zip folder must contain a manifest file at the root level of the directory, and the file must be named **manifest.json**. It describes the DWG files to allow the [Azure Maps Conversion service](/rest/api/maps/v2/conversion) to parse their content. Only the files identified by the manifest are ingested. Files that are in the zip folder, but aren't properly listed in the manifest, are ignored.
The file paths in the `buildingLevels` object of the manifest file must be relative to the root of the zip folder. The DWG file name must exactly match the name of the facility level. For example, a DWG file for the "Basement" level is "Basement.dwg." A DWG file for level 2 is named "level_2.dwg." Use an underscore if your level name has a space.
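As a small illustration of the naming rule above, a hypothetical helper (not part of any Azure Maps tooling) could derive a DWG file name from a level name by replacing spaces with underscores. The source examples vary in casing, so this sketch simply preserves the level name's casing:

```python
def level_to_dwg_filename(level_name: str) -> str:
    # Spaces in the level name become underscores, per the rule above.
    return level_name.replace(" ", "_") + ".dwg"

print(level_to_dwg_filename("Level 2"))  # → Level_2.dwg
```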
-Although there are requirements when you use the manifest objects, not all objects are required. The following table shows the required and optional objects for version 1.1 of the [Azure Maps Conversion service](/rest/api/maps/conversion).
+Although there are requirements when you use the manifest objects, not all objects are required. The following table shows the required and optional objects for version 1.1 of the [Azure Maps Conversion service](/rest/api/maps/v2/conversion).
+
+>[!NOTE]
+> Unless otherwise specified, all properties with a string property type allow a maximum of one thousand characters.
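To make that limit concrete, the following hypothetical Python helper (not an Azure Maps tool) walks a parsed manifest and reports the JSON paths of any string values that exceed the one-thousand-character limit:

```python
import json

MAX_STRING_LENGTH = 1000  # limit described in the note above

def find_overlong_strings(node, path="$"):
    """Recursively yield JSON paths whose string values exceed the limit."""
    if isinstance(node, dict):
        for key, value in node.items():
            yield from find_overlong_strings(value, f"{path}.{key}")
    elif isinstance(node, list):
        for i, value in enumerate(node):
            yield from find_overlong_strings(value, f"{path}[{i}]")
    elif isinstance(node, str) and len(node) > MAX_STRING_LENGTH:
        yield path

manifest = json.loads('{"buildingName": "Contoso HQ", "locality": "Overlake"}')
print(list(find_overlong_strings(manifest)))  # → [] when all strings fit
```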
+
| Object | Required | Description |
| :-- | :- | :- |
The next sections detail the requirements for each object.
| `name` | string | true | Name of building. |
| `streetAddress` | string | false | Address of building. |
| `unit` | string | false | Unit in building. |
-| `locality` | string | false | Name of an area, neighborhood, or region. For example, "Overlake" or "Central District." Locality isn't part of the mailing address. |
-| `adminDivisions` | JSON array of strings | false | An array containing address designations (Country, State, City) or (Country, Prefecture, City, Town). Use ISO 3166 country codes and ISO 3166-2 state/territory codes. |
+| `locality` | string | false | Name of a city, town, area, neighborhood, or region. |
+| `adminDivisions` | JSON array of strings | false | An array containing address designations. For example: (Country, State). Use ISO 3166 country codes and ISO 3166-2 state/territory codes. |
| `postalCode` | string | false | The mail sorting code. |
| `hoursOfOperation` | string | false | Adheres to the [OSM Opening Hours](https://wiki.openstreetmap.org/wiki/Key:opening_hours/specification) format. |
-| `phone` | string | false | Phone number associated with the building. Must include the country code. |
-| `website` | string | false | Website associated with the building. Must begin with http or https. |
+| `phone` | string | false | Phone number associated with the building. |
+| `website` | string | false | Website associated with the building. |
| `nonPublic` | bool | false | Flag specifying if the building is open to the public. |
| `anchorLatitude` | numeric | false | Latitude of a facility anchor (pushpin). |
| `anchorLongitude` | numeric | false | Longitude of a facility anchor (pushpin). |
The `unitProperties` object contains a JSON array of unit properties.
| Property | Type | Required | Description |
|--|--|--|--|
|`unitName` |string |true |Name of unit to associate with this `unitProperty` record. This record is only valid when a label matching `unitName` is found in the `unitLabel` layers. |
-|`categoryName`| string| false |Category name. For a complete list of categories, refer to [categories](https://aka.ms/pa-indoor-spacecategories). |
-|`navigableBy`| array of strings | false |Indicates the types of navigating agents that can traverse the unit. This property informs the wayfinding capabilities. The permitted values are: `pedestrian`, `wheelchair`, `machine`, `bicycle`, `automobile`, `hiredAuto`, `bus`, `railcar`, `emergency`, `ferry`, `boat`, and `disallowed`.|
-|`routeThroughBehavior`| string| false |The route through behavior for the unit. The permitted values are `disallowed`, `allowed`, and `preferred`. The default value is `allowed`.|
+|`categoryName`| string| false |Purpose of the unit. A list of values that the provided rendering styles can make use of is available [here](https://atlas.microsoft.com/sdk/javascript/indoor/0.1/categories.json). |
|`occupants` |array of directoryInfo objects |false |List of occupants for the unit. |
|`nameAlt`| string| false| Alternate name of the unit. |
|`nameSubtitle`| string |false| Subtitle of the unit. |
|`addressRoomNumber`| string| false| Room, unit, apartment, or suite number of the unit.|
-|`verticalPenetrationCategory`| string| false| When this property is defined, the resulting feature is a vertical penetration (VRT) rather than a unit. You can use VRTs to go to other VRT features in the levels above or below it. Vertical penetration is a [Category](https://aka.ms/pa-indoor-spacecategories) name. If this property is defined, the `categoryName` property is overridden with `verticalPenetrationCategory`. |
+|`verticalPenetrationCategory`| string| false| When this property is defined, the resulting feature is a vertical penetration (VRT) rather than a unit. You can use vertical penetrations to go to other vertical penetration features in the levels above or below it. Vertical penetration is a [Category](https://aka.ms/pa-indoor-spacecategories) name. If this property is defined, the `categoryName` property is overridden with `verticalPenetrationCategory`. |
|`verticalPenetrationDirection`| string| false |If `verticalPenetrationCategory` is defined, optionally define the valid direction of travel. The permitted values are: `lowToHigh`, `highToLow`, `both`, and `closed`. The default value is `both`.|
| `nonPublic` | bool | false | Indicates if the unit is open to the public. |
| `isRoutable` | bool | false | When this property is set to `false`, you can't go to or through the unit. The default value is `true`. |
The `zoneProperties` object contains a JSON array of zone properties.
| Property | Type | Required | Description |
|--|--|--|--|
|zoneName |string |true |Name of zone to associate with `zoneProperty` record. This record is only valid when a label matching `zoneName` is found in the `zoneLabel` layer of the zone. |
-|categoryName| string| false |Category name. For a complete list of categories, refer to [categories](https://aka.ms/pa-indoor-spacecategories). |
+|categoryName| string| false |Purpose of the zone. A list of values that the provided rendering styles can make use of is available [here](https://atlas.microsoft.com/sdk/javascript/indoor/0.1/categories.json).|
|zoneNameAlt| string| false |Alternate name of the zone. |
|zoneNameSubtitle| string | false |Subtitle of the zone. |
|zoneSetId| string | false | Set ID to establish a relationship among multiple zones so that they can be queried or selected as a group. For example, zones that span multiple levels. |
Below is the manifest file for the sample Drawing package. To download the entir
        {
            "unitName": "B01",
            "categoryName": "room.office",
-            "navigableBy": ["pedestrian", "wheelchair", "machine"],
-            "routeThroughBehavior": "disallowed",
-            "occupants": [
+            "occupants": [
                {
                    "name": "Joe's Office",
                    "phone": "1 (425) 555-1234"
Below is the manifest file for the sample Drawing package. To download the entir
## Next steps
-When your Drawing package meets the requirements, you can use the [Azure Maps Conversion service](/rest/api/maps/conversion) to convert the package to a map dataset. Then, you can use the dataset to generate an indoor map by using the indoor maps module.
+When your Drawing package meets the requirements, you can use the [Azure Maps Conversion service](/rest/api/maps/v2/conversion) to convert the package to a map dataset. Then, you can use the dataset to generate an indoor map by using the indoor maps module.
+
+> [!div class="nextstepaction"]
+> [Creator Facility Ontology](creator-facility-ontology.md)
+
+> [!div class="nextstepaction"]
+> [Creator for indoor maps](creator-indoor-maps.md)
+
+> [!div class="nextstepaction"]
+> [Drawing Package Guide](drawing-package-guide.md)
> [!div class="nextstepaction"]
->[Creator (Preview) for indoor maps](creator-indoor-maps.md)
+>[Creator for indoor maps](creator-indoor-maps.md)
> [!div class="nextstepaction"]
-> [Tutorial: Creating a Creator (Preview) indoor map](tutorial-creator-indoor-maps.md)
+> [Tutorial: Creating a Creator indoor map](tutorial-creator-indoor-maps.md)
> [!div class="nextstepaction"]
> [Indoor maps dynamic styling](indoor-map-dynamic-styling.md)
azure-maps How To Manage Creator https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/how-to-manage-creator.md
Title: Manage Microsoft Azure Maps Creator (Preview)
-description: In this article, you'll learn how to manage Microsoft Azure Maps Creator (Preview).
+ Title: Manage Microsoft Azure Maps Creator
+description: In this article, you'll learn how to manage Microsoft Azure Maps Creator.
Previously updated : 04/26/2021 Last updated : 05/18/2021
-# Manage Azure Maps Creator (Preview)
+# Manage Azure Maps Creator
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
-Azure Maps Creator lets you create private indoor map data. Using the Azure Maps API and the Indoor Maps module, you can develop interactive and dynamic indoor map web applications. Currently, Creator is only available in the United States using Gen 2 or Gen 1 (S1) pricing tiers.
+You can use Azure Maps Creator to create private indoor map data. Using the Azure Maps API and the Indoor Maps module, you can develop interactive and dynamic indoor map web applications. For pricing information, see [Choose the right pricing tier in Azure Maps](choose-pricing-tier.md).
This article takes you through the steps to create and delete a Creator resource in an Azure Maps account.
-## Create Creator (Preview) Resource
+## Create Creator resource
1. Sign in to the [Azure portal](https://portal.azure.com)
-2. Select your Azure Maps account. If you can't see your Azure Maps account under the **Recent resources**, then navigate to the Azure portal menu. Select **All resources**. Find and select your Azure Maps account.
-
- ![Azure Maps Portal home page](./media/how-to-manage-creator/select-maps-account.png)
+2. Navigate to the Azure portal menu. Select **All resources**, and then select your Azure Maps account.
-3. Once you're on the Azure Maps account page, navigate to the **Overview** option under **Creator**. Select **Create** to create an Azure Maps Creator resource.
+ :::image type="content" border="true" source="./media/how-to-manage-creator/select-all-resources.png" alt-text="Select Azure Maps account":::
- ![Create Azure Maps Creator page](./media/how-to-manage-creator/creator-blade-settings.png)
+3. In the navigation pane, select **Creator overview**, and then select **Create**.
-4. Enter the name and location for your Creator resource. Currently, Creator is only supported in the United States. Select **Review + create**.
+ :::image type="content" border="true" source="./media/how-to-manage-creator/creator-blade-settings.png" alt-text="Create Azure Maps Creator page":::
- ![Enter Creator account information page](./media/how-to-manage-creator/creator-creation-dialog.png)
+4. Enter the name, location, and map provisioning storage units for your Creator resource. Currently, Creator is supported only in the United States. Select **Review + create**.
-5. Review your settings and select **Create**.
+ :::image type="content" source="./media/how-to-manage-creator/creator-creation-dialog.png" alt-text="Enter Creator account information page":::
- ![Confirm Creator account settings page](./media/how-to-manage-creator/creator-create-dialog.png)
+5. Review your settings, and then select **Create**.
-6. When the deployment completes, you'll see a page with a success or a failure message.
+ :::image type="content" source="./media/how-to-manage-creator/creator-create-dialog.png" alt-text="Confirm Creator account settings page":::
- ![Resource deployment status page](./media/how-to-manage-creator/creator-resource-created.png)
+ After the deployment completes, you'll see a page with a success or a failure message.
-7. Select **Go to resource**. Your Creator resource view page shows the status of your Creator resource and the chosen demographic region.
+ :::image type="content" source="./media/how-to-manage-creator/creator-resource-created.png" alt-text="Resource deployment status page":::
- ![Creator status page](./media/how-to-manage-creator/creator-resource-view.png)
+6. Select **Go to resource**. Your Creator resource view page shows the status of your Creator resource and the chosen demographic region.
+ :::image type="content" source="./media/how-to-manage-creator/creator-resource-view.png" alt-text="Creator status page":::
>[!NOTE]
- >From the Creator resource page, you can navigate back to the Azure Maps account it belongs to by selecting Azure Maps Account.
+ >To return to the Azure Maps account, select **Azure Maps Account** in the navigation pane.
+
+## Delete Creator resource
-## Delete Creator (Preview) Resource
+To delete the Creator resource:
-To delete the Creator resource, navigate to your Azure Maps account. Select **Overview** under **Creator**. Select the **Delete** button.
+1. In your Azure Maps account, select **Overview** under **Creator**.
->[!WARNING]
->When you delete the Creator resource of your Azure Maps account, you will also delete the datasets, tilesets, and feature statesets created using Creator services.
+2. Select **Delete**.
-![Creator page with delete button](./media/how-to-manage-creator/creator-delete.png)
+ >[!WARNING]
+ >When you delete the Creator resource of your Azure Maps account, you also delete the conversions, datasets, tilesets, and feature statesets that were created using Creator services.
-Select the **Delete** button and type your Creator name to confirm deletion. Once the resource is deleted, you'll see a confirmation page, like in the image below:
+ :::image type="content" source="./media/how-to-manage-creator/creator-delete.png" alt-text="Creator page with delete button":::
-![Creator page with delete confirmation](./media/how-to-manage-creator/creator-confirm-delete.png)
+3. You'll be asked to confirm deletion by typing in the name of your Creator resource. After the resource is deleted, you see a confirmation page that looks like the following:
+
+ :::image type="content" source="./media/how-to-manage-creator/creator-confirm-delete.png" alt-text="Creator page with delete confirmation":::
## Authentication
-Creator (Preview) inherits Azure Maps Access Control (IAM) settings. All API calls for data access must be sent with authentication and authorization rules.
+Creator inherits Azure Maps Access Control (IAM) settings. All API calls for data access must be sent with authentication and authorization rules.
Creator usage data is incorporated in your Azure Maps usage charts and activity log. For more information, see [Manage authentication in Azure Maps](./how-to-manage-authentication.md).
+>[!Important]
+>We recommend using:
+>
> * Azure Active Directory (Azure AD) in all solutions that are built with an Azure Maps account using Creator services. For more information on Azure AD, see [Azure AD authentication](azure-maps-authentication.md#azure-ad-authentication).
+>
+>* Role-based access control settings (RBAC). Using these settings, map makers can act as the Azure Maps Data Contributor role, and Creator map data users can act as the Azure Maps Data Reader role. For more information, see [Authorization with role-based access control](azure-maps-authentication.md#authorization-with-role-based-access-control).
+ ## Access to Creator services
-Creator services (Preview) and services that use data hosted in Creator (for example, Render service), are accessible at a geographical URL. The geographical URL is determined by the location selected during creation. For example, if Creator is created in the United States geographical location, all calls to the Conversion service must be submitted to `us.atlas.microsoft.com/conversion/convert`.
+Creator services and services that use data hosted in Creator (for example, the Render service) are accessible at a geographical URL. The geographical URL is determined by the location selected during creation. For example, if Creator is created in a region in the United States geographical location, all calls to the Conversion service must be submitted to `us.atlas.microsoft.com/conversions`. To view mappings of region to geographical location, see [Creator service geographic scope](creator-geographic-scope.md).
-Also, all data imported into Creator should be uploaded into the same geographical location as the Creator resource. For example, if Creator is provisioned in the United Stated, all raw data should be uploaded via `us.atlas.microsoft.com/mapData/upload`.
+Also, all data imported into Creator should be uploaded into the same geographical location as the Creator resource. For example, if Creator is provisioned in the United States, all raw data should be uploaded via `us.atlas.microsoft.com/mapData/upload`.
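As an illustration of this geographic scoping, a small helper can prefix the geography code onto the service host. This is a minimal sketch; the helper name and the hard-coded host pattern are assumptions for illustration, grounded only in the `us.atlas.microsoft.com` examples above.

```javascript
// Sketch: build a geography-scoped Azure Maps URL. Creator data must be
// accessed through the geography chosen when the resource was created
// (for example, "us" for the United States geographical location).
function geoScopedUrl(geography, path) {
  return `https://${geography}.atlas.microsoft.com/${path}`;
}

// The upload endpoint for a Creator resource provisioned in the United States:
console.log(geoScopedUrl("us", "mapData/upload"));
```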
## Next steps
-Introduction to Creator services (Preview) for indoor mapping:
+Introduction to Creator services for indoor mapping:
> [!div class="nextstepaction"]
-> [Data Upload](creator-indoor-maps.md#upload-a-drawing-package)
+> [Data upload](creator-indoor-maps.md#upload-a-drawing-package)
> [!div class="nextstepaction"]
-> [Data Conversion](creator-indoor-maps.md#convert-a-drawing-package)
+> [Data conversion](creator-indoor-maps.md#convert-a-drawing-package)
> [!div class="nextstepaction"]
> [Dataset](creator-indoor-maps.md#datasets)
Introduction to Creator services (Preview) for indoor mapping:
> [!div class="nextstepaction"]
> [Feature State set](creator-indoor-maps.md#feature-statesets)
-Learn how to use the Creator services (Preview) to render indoor maps in your application:
+Learn how to use the Creator services to render indoor maps in your application:
> [!div class="nextstepaction"]
> [Azure Maps Creator tutorial](tutorial-creator-indoor-maps.md)
Learn how to use the Creator services (Preview) to render indoor maps in your application:
> [Indoor map dynamic styling](indoor-map-dynamic-styling.md)
> [!div class="nextstepaction"]
-> [Use the Indoor Maps module](how-to-use-indoor-module.md)
+> [Use the Indoor Maps module](how-to-use-indoor-module.md)
azure-maps How To Render Custom Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/how-to-render-custom-data.md
Title: Render custom data on a raster map | Microsoft Azure Maps
description: Learn how to add pushpins, labels, and geometric shapes to a raster map. See how to use the static image service in Azure Maps for this purpose.
Previously updated : 04/26/2020
Last updated : 05/26/2021
This article explains how to use the [static image service](/rest/api/maps/rende
To render custom pushpins, labels, and geometry overlays, you can use the Postman application. You can use Azure Maps [Data Service APIs](/rest/api/maps/data) to store and render overlays.

> [!Tip]
-> It is often much more cost effective to use the Azure Maps Web SDK to show a simple map on a web page than to use the static image service. The web SDK uses map tiles and unless the user pans and zooms the map, they will often generate only a fraction of a transaction per map load. Note that the Azure Maps web SDK has options for disabling panning and zooming. Additionally, the Azure Maps web SDK provides a richer set of data visualization options than a static map web service does.
+> To show a simple map on a web page, it's often more cost effective to use the Azure Maps Web SDK rather than the static image service. The web SDK uses map tiles, and unless the user pans and zooms the map, it often generates only a fraction of a transaction per map load. The Azure Maps web SDK has options for disabling panning and zooming. Additionally, the Azure Maps web SDK provides a richer set of data visualization options than a static map web service does.
## Prerequisites
-### Create an Azure Maps account
-
-To complete the procedures in this article, you first need to create an Azure Maps account and get your maps account key. Follow instructions in [Create an account](quick-demo-map-app.md#create-an-azure-maps-account) to create an Azure Maps account subscription and follow the steps in [get primary key](quick-demo-map-app.md#get-the-primary-key-for-your-account) to get the primary key for your account. For more information on authentication in Azure Maps, see [manage authentication in Azure Maps](./how-to-manage-authentication.md).
+1. [Make an Azure Maps account](quick-demo-map-app.md#create-an-azure-maps-account)
+2. [Obtain a primary subscription key](quick-demo-map-app.md#get-the-primary-key-for-your-account), also known as the primary key or the subscription key.
+This tutorial uses the [Postman](https://www.postman.com/) application, but you may use a different API development environment.
## Render pushpins with labels and a custom image

> [!Note]
> The procedure in this section requires an Azure Maps account in Gen 1 or Gen 2 pricing tier.
-The Azure Maps account S0 tier supports only a single instance of the `pins` parameter. It allows you to render up to five pushpins, specified in the URL request, with a custom image.
+The Azure Maps account Gen 1 Standard S0 tier supports only a single instance of the `pins` parameter. It allows you to render up to five pushpins, specified in the URL request, with a custom image.
To render pushpins with labels and a custom image, complete these steps:
To render pushpins with labels and a custom image, complete these steps:
```HTTP
https://atlas.microsoft.com/map/static/png?subscription-key={subscription-key}&api-version=1.0&layer=basic&style=main&zoom=12&center=-73.98,%2040.77&pins=custom%7Cla15+50%7Cls12%7Clc003b61%7C%7C%27CentralPark%27-73.9657974+40.781971%7C%7Chttps%3A%2F%2Fraw.githubusercontent.com%2FAzure-Samples%2FAzureMapsCodeSamples%2Fmaster%2FAzureMapsCodeSamples%2FCommon%2Fimages%2Ficons%2Fylw-pushpin.png
```

Here's the resulting image:

![A custom pushpin with a label](./media/how-to-render-custom-data/render-pins.png)
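Because the `pins` parameter mixes pipes, quote marks, coordinates, and an icon URL, it's easy to get the URL encoding wrong. The following is a minimal sketch of assembling and encoding a pin specification like the one in the request above; the variable names and the exact pin spec are illustrative only.

```javascript
// Sketch: assemble and URL-encode a "pins" parameter value for the static
// image service. Style modifiers are pipe-delimited; style, locations, and
// the custom icon URL are separated by double pipes.
const pinStyle = ["custom", "la15+50", "ls12", "lc003b61"].join("|");
const pinLocation = "'CentralPark'-73.9657974 40.781971";
const iconUrl = "https://raw.githubusercontent.com/Azure-Samples/AzureMapsCodeSamples/master/AzureMapsCodeSamples/Common/images/icons/ylw-pushpin.png";

// Encode the whole value so pipes, plus signs, and slashes survive the query string.
const pins = encodeURIComponent(`${pinStyle}||${pinLocation}||${iconUrl}`);
console.log(`&pins=${pins}`);
```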
To render pushpins with labels and a custom image, complete these steps:
> [!Note]
> The procedure in this section requires an Azure Maps account Gen 1 (S1) or Gen 2 pricing tier.
-You can also obtain the path and pin location information by using the [Data Upload API](/rest/api/maps/data/uploadpreview). Follow the steps below to upload the path and pins data.
+You can also obtain the path and pin location information by using the [Data Upload API](/rest/api/maps/data%20v2/uploadpreview/). Follow the steps below to upload the path and pins data.
1. In the Postman app, open a new tab in the collection you created in the previous section. Select the POST HTTP method on the builder tab and enter the following URL to make a POST request: ```HTTP
- https://atlas.microsoft.com/mapData/upload?subscription-key={subscription-key}&api-version=1.0&dataFormat=geojson
+ https://us.atlas.microsoft.com/mapData?subscription-key={subscription-key}&api-version=2.0&dataFormat=geojson
```

2. On the **Params** tab, enter the following key/value pairs, which are used for the POST request URL. Replace the `subscription-key` value with your Azure Maps subscription key.
-
+ ![Key/value params in Postman](./media/how-to-render-custom-data/postman-key-vals.png)

3. On the **Body** tab, select the raw input format and choose JSON as the input format from the dropdown list. Provide this JSON as data to be uploaded:
You can also obtain the path and pin location information by using the [Data Upl
}
```
-4. Select **Send** and review the response header. Upon a successful request, the Location header will contain the status URI to check the current status of the upload request. The status URI would be of the following format.
+4. Select **Send** and review the response header. Upon a successful request, the *Operation-Location* header will contain the `status URL` to check the current status of the upload request. The `status URL` has the following format:
```HTTP
- https://atlas.microsoft.com/mapData/{uploadStatusId}/status?api-version=1.0
+ https://us.atlas.microsoft.com/mapData/operations/{statusUrl}?api-version=2.0
```

5. Copy your status URI and append the subscription-key parameter to it with the value of your Azure Maps account subscription key. Use the same account subscription key that you used to upload the data. The status URI format should look like the one below:

   ```HTTP
- https://atlas.microsoft.com/mapData/{uploadStatusId}/status?api-version=1.0&subscription-key={Subscription-key}
+ https://us.atlas.microsoft.com/mapData/operations/{statusUrl}?api-version=2.0&subscription-key={Subscription-key}
```
-6. To get the udId, open a new tab in the Postman app. Select GET HTTP method on the builder tab. Make a GET request at the status URI. If your data upload was successful, you'll receive a udId in the response body. Copy the udId.
+6. To get the `udid`, open a new tab in the Postman app. Select GET HTTP method on the builder tab. Make a GET request at the `status URL`. If your data upload was successful, you'll receive a `udid` in the response body. Copy the `udid`.
```JSON
{
You can also obtain the path and pin location information by using the [Data Upl
}
```
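The string handling in steps 5 and 6 can be sketched as two small helpers: one appends the subscription key to the status URL, and one pulls the `udid` out of the `resourceLocation` URL in the status response. The helper names and the `/mapData/metadata/{udid}` path assumption come only from the URL formats shown in this walkthrough.

```javascript
// Sketch (step 5): append the subscription key to the status URL returned
// in the Operation-Location response header.
function withSubscriptionKey(statusUrl, key) {
  const sep = statusUrl.includes("?") ? "&" : "?";
  return `${statusUrl}${sep}subscription-key=${key}`;
}

// Sketch (step 6): extract the udid from a resourceLocation URL of the form
// https://us.atlas.microsoft.com/mapData/metadata/{udid}?api-version=2.0
function extractUdid(resourceLocation) {
  const match = resourceLocation.match(/\/mapData\/metadata\/([^?\/]+)/);
  return match ? match[1] : null;
}

console.log(withSubscriptionKey(
  "https://us.atlas.microsoft.com/mapData/operations/xyz?api-version=2.0",
  "{Subscription-key}"));
console.log(extractUdid(
  "https://us.atlas.microsoft.com/mapData/metadata/25084fb7-307a-4720-8f91-7952a0b91012?api-version=2.0"));
```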
-7. Use the `udId` value received from the Data Upload API to render features on the map. To do so, open a new tab in the collection you created in the preceding section. Select the GET HTTP method on the builder tab, replace the {subscription-key} and {udId} with your values, and enter this URL to make a GET request:
+7. Use the `udid` value received from the Data Upload API to render features on the map. To do so, open a new tab in the collection you created in the preceding section. Select the GET HTTP method on the builder tab, replace the {subscription-key} and {udId} with your values, and enter this URL to make a GET request:
```HTTP
https://atlas.microsoft.com/map/static/png?subscription-key={subscription-key}&api-version=1.0&layer=basic&style=main&zoom=12&center=-73.96682739257812%2C40.78119135317995&pins=default|la-35+50|ls12|lc003C62|co9B2F15||'Times Square'-73.98516297340393 40.758781646381024|'Central Park'-73.96682739257812 40.78119135317995&path=lc0000FF|fc0000FF|lw3|la0.80|fa0.30||udid-{udId}
```
Similarly, you can change, add, and remove other style modifiers.
## Next steps

* Explore the [Azure Maps Get Map Image API](/rest/api/maps/render/getmapimage) documentation.
-* To learn more about Azure Maps Data service (Preview), see the [service documentation](/rest/api/maps/data).
+* To learn more about Azure Maps Data service, see the [service documentation](/rest/api/maps/data).
azure-maps How To Request Elevation Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/how-to-request-elevation-data.md
This article uses the [Postman](https://www.postman.com/) application, but you c
## Request elevation data in raster tile format
-To request elevation data in raster tile format, use the [Render V2 - Get Map Tile API](/rest/api/maps/renderv2). If the tile can be found, the API returns the tile as a GeoTIFF. Otherwise, the API returns 0. All raster DEM tiles use the geoid (sea level) Earth mode. In this example, we'll request elevation data for Mt. Everest.
+To request elevation data in raster tile format, use the [Render V2 - Get Map Tile API](/rest/api/maps/renderv2). If the tile can be found, the API returns the tile as a GeoTIFF. Otherwise, the API returns 0. All raster DEM tiles use the geoid (sea level) Earth mode. In this example, we'll request elevation data for Mt. Everest.
>[!TIP]
>To retrieve a tile at a specific area on the world map, find the correct tile at the appropriate zoom level. Also note that WorldDEM covers the entire global landmass but it doesn't cover oceans. For more information, see [Zoom levels and tile grid](zoom-levels-and-tile-grid.md).
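For the coordinate-based Elevation APIs listed below, requests boil down to building a query URL from longitude/latitude pairs. Here is a minimal sketch; the endpoint path, parameter names, and coordinate order are assumptions for illustration, not confirmed by this excerpt.

```javascript
// Sketch: build a request URL for the Elevation "Get Data for Points" API.
// Points are assumed to be "longitude,latitude" pairs, pipe-separated.
function elevationPointsUrl(points, subscriptionKey) {
  const joined = points.map(([lon, lat]) => `${lon},${lat}`).join("|");
  return "https://atlas.microsoft.com/elevation/point/json" +
    `?api-version=1.0&subscription-key=${subscriptionKey}` +
    `&points=${encodeURIComponent(joined)}`;
}

// Roughly the summit of Mt. Everest, as used in the raster tile example.
console.log(elevationPointsUrl([[86.925, 27.988]], "{subscription-key}"));
```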
The following sample webpage describes how to use the map control to display ele
## Next steps
-To further explore the Azure Maps Elevation (Preview) APIs, see:
+To further explore the Azure Maps Elevation APIs, see:
> [!div class="nextstepaction"]
-> [Elevation (Preview) - Get Data for Lat Long Coordinates](/rest/api/maps/elevation/getdataforpoints)
+> [Elevation - Get Data for Lat Long Coordinates](/rest/api/maps/elevation/getdataforpoints)
> [!div class="nextstepaction"]
-> [Elevation (Preview) - Get Data for Bounding Box](/rest/api/maps/elevation/getdataforboundingbox)
+> [Elevation - Get Data for Bounding Box](/rest/api/maps/elevation/getdataforboundingbox)
> [!div class="nextstepaction"]
-> [Elevation (Preview) - Get Data for Polyline](/rest/api/maps/elevation/getdataforpolyline)
+> [Elevation - Get Data for Polyline](/rest/api/maps/elevation/getdataforpolyline)
> [!div class="nextstepaction"] > [Render V2 ΓÇô Get Map Tile](/rest/api/maps/renderv2)
azure-maps How To Secure Device Code https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/how-to-secure-device-code.md
Create the device based application in Azure AD to enable Azure AD sign in. This
> ![Add app registration details for name and redirect uri](./media/azure-maps-authentication/devicecode-app-registration.png)

3. Navigate to **Authentication** and enable **Treat application as a public client**. This will enable device code authentication with Azure AD.
-
+ > [!div class="mx-imgBorder"]
 > ![Enable app registration as public client](./media/azure-maps-authentication/devicecode-public-client.png)
-4. To assign delegated API permissions to Azure Maps, go to the application. Then select **API permissions** > **Add a permission**. Under **APIs my organization uses**, search for and select **Azure Maps**.
+4. To assign delegated API permissions to Azure Maps, go to the application. Then select **API permissions** > **Add a permission**. Under **APIs my organization uses**, search for and select **Azure Maps**.
> [!div class="mx-imgBorder"]
> ![Add app API permissions](./media/how-to-manage-authentication/app-permissions.png)
Create the device based application in Azure AD to enable Azure AD sign in. This
7. Add code for the token-acquiring flow in the application. For implementation details, see [Device code flow](../active-directory/develop/scenario-desktop-acquire-token.md#device-code-flow). When acquiring tokens, reference the scope `user_impersonation`, which was selected in earlier steps.
-> [!Tip]
-> Use Microsoft Authentication Library (MSAL) to acquire access tokens.
-> See recommendations on [Desktop app that calls web APIs: Code configuration](../active-directory/develop/scenario-desktop-app-configuration.md)
+ > [!Tip]
+ > Use Microsoft Authentication Library (MSAL) to acquire access tokens.
+ > See recommendations on [Desktop app that calls web APIs: Code configuration](../active-directory/develop/scenario-desktop-app-configuration.md)
8. Compose the HTTP request with the acquired token from Azure AD, and send the request with a valid HTTP client.

### Sample request

Here's a sample request body for uploading a simple Geofence represented as a circle geometry using a center point and a radius.

```http
-POST /mapData/upload?api-version=1.0&dataFormat=geojson
-Host: atlas.microsoft.com
+POST /mapData?api-version=2.0&dataFormat=geojson
+Host: us.atlas.microsoft.com
x-ms-client-id: 30d7cc….9f55
Authorization: Bearer eyJ0e….HNIVN
```

The sample request body below is in GeoJSON:

```json
{
    "type": "FeatureCollection",
Authorization: Bearer eyJ0e….HNIVN
}
```
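Step 8 above ("compose the HTTP request with the acquired token") can be sketched by assembling the same pieces shown in the sample request. The request-object shape here is illustrative; any HTTP client can send it. Only the URL, headers, and body format are taken from the sample above.

```javascript
// Sketch: compose the upload request from an Azure AD access token and an
// Azure Maps client ID, mirroring the sample request headers above.
function buildUploadRequest(accessToken, clientId, geojsonBody) {
  return {
    method: "POST",
    url: "https://us.atlas.microsoft.com/mapData?api-version=2.0&dataFormat=geojson",
    headers: {
      "x-ms-client-id": clientId,
      "Authorization": `Bearer ${accessToken}`,
      "Content-Type": "application/json"
    },
    // The body is the GeoJSON FeatureCollection to upload.
    body: JSON.stringify(geojsonBody)
  };
}

const req = buildUploadRequest("eyJ0e...", "30d7cc...",
  { type: "FeatureCollection", features: [] });
console.log(req.url);
```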
-### Sample response:
+### Sample response header
-Headers:
```http
-Location: https://atlas.microsoft.com/mapData/metadata/{udid}?api-version=1.0
-Access-Control-Expose-Headers: Location
+Operation-Location: https://us.atlas.microsoft.com/mapData/operations/{udid}?api-version=2.0
+Access-Control-Expose-Headers: Operation-Location
```
-Body:
-```json
-{
- "operationId": "{operationId}",
- "status": "Succeeded",
- "created": "2020-01-02 1:02:03 AM +00:00",
- "resourceLocation": "https://atlas.microsoft.com/mapData/metadata/{resourceId}?api-version=1.0"
-}
-```
[!INCLUDE [grant role-based access to users](./includes/grant-rbac-users.md)]
azure-maps How To Use Feedback Tool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/how-to-use-feedback-tool.md
Azure Maps has been available since May 2018. Azure Maps has been providing fresh map data, easy-to-use REST APIs, and powerful SDKs to support our enterprise customers with different kinds of business use cases. The real world is changing every second, and it's crucial for us to provide a factual digital representation to our customers. Our customers that are planning to open or close facilities need our maps to update promptly, so they can efficiently plan for delivery, maintenance, or customer service at the right facilities. We have created the Azure Maps data feedback site to empower our customers to provide direct data feedback. Customers' data feedback goes directly to our data providers and their map editors, who can quickly evaluate and incorporate feedback into our mapping products.
-[Azure Maps Data (Preview) feedback site](https://feedback.azuremaps.com) provides an easy way for our customers to provide map data feedback, especially on business points of interest and residential addresses. This article guides you on how to provide different kinds of feedback using the Azure Maps feedback site.
+[Azure Maps Data feedback site](https://feedback.azuremaps.com) provides an easy way for our customers to provide map data feedback, especially on business points of interest and residential addresses. This article guides you on how to provide different kinds of feedback using the Azure Maps feedback site.
## Add a business place or a residential address
azure-maps How To Use Indoor Module https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/how-to-use-indoor-module.md
Title: Use the Azure Maps Indoor Maps module with Microsoft Creator services (Preview)
+ Title: Use the Azure Maps Indoor Maps module with Microsoft Creator services
description: Learn how to use the Microsoft Azure Maps Indoor Maps module to render maps by embedding the module's JavaScript libraries.
# Use the Azure Maps Indoor Maps module
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
-The Azure Maps Web SDK includes the *Azure Maps Indoor* module. The *Azure Maps Indoor* module allows you to render indoor maps created in Azure Maps Creator services (Preview)
+The Azure Maps Web SDK includes the *Azure Maps Indoor* module. The *Azure Maps Indoor* module allows you to render indoor maps created in Azure Maps Creator services.
## Prerequisites

1. [Make an Azure Maps account](quick-demo-map-app.md#create-an-azure-maps-account)
-2. [Create a Creator (Preview) resource](how-to-manage-creator.md)
+2. [Create a Creator resource](how-to-manage-creator.md)
3. [Obtain a primary subscription key](quick-demo-map-app.md#get-the-primary-key-for-your-account), also known as the primary key or the subscription key.
4. Get a `tilesetId` and a `statesetId` by completing the [tutorial for creating Indoor maps](tutorial-creator-indoor-maps.md). You'll need to use these identifiers to render indoor maps with the Azure Maps Indoor Maps module.
This example shows you how to use the *Azure Maps Indoor* module in your web app
4. Initialize a *Map object*. The *Map object* supports the following options:
   - `Subscription key` is your Azure Maps primary subscription key.
   - `center` defines a latitude and longitude for your indoor map center location. Provide a value for `center` if you don't want to provide a value for `bounds`. Format should appear as `center`: [-122.13315, 47.63637].
- - `bounds` is the smallest rectangular shape that encloses the tileset map data. Set a value for `bounds` if you don't want to set a value for `center`. You can find your map bounds by calling the [Tileset List API](/rest/api/maps/tileset/listpreview). The Tileset List API returns the `bbox`, which you can parse and assign to `bounds`. Format should appear as `bounds`: [# west, # south, # east, # north].
+ - `bounds` is the smallest rectangular shape that encloses the tileset map data. Set a value for `bounds` if you don't want to set a value for `center`. You can find your map bounds by calling the [Tileset List API](/rest/api/maps/v2/tileset/listpreview). The Tileset List API returns the `bbox`, which you can parse and assign to `bounds`. Format should appear as `bounds`: [# west, # south, # east, # north].
   - `style` allows you to set the color of the background. To display a white background, define `style` as "blank".
   - `zoom` allows you to specify the min and max zoom levels for your map.
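The `bounds` option described above can be derived from a Tileset List API response. The following sketch assumes a response shape with a `tilesets` array whose entries carry `tilesetId` and `bbox` fields; treat that shape, and the helper name, as assumptions for illustration.

```javascript
// Sketch: derive the indoor map "bounds" option from a (mocked) Tileset
// List API response for a given tilesetId.
function boundsFromTileset(listResponse, tilesetId) {
  // Find the tileset entry that matches the tileset we want to render.
  const tileset = listResponse.tilesets.find(t => t.tilesetId === tilesetId);
  if (!tileset) return null;
  // bbox is [west, south, east, north], the format "bounds" expects.
  return tileset.bbox;
}

// Example with a mocked response body:
const sampleResponse = {
  tilesets: [
    { tilesetId: "abc-123", bbox: [-122.135, 47.635, -122.131, 47.638] }
  ]
};
console.log(boundsFromTileset(sampleResponse, "abc-123"));
```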
Read about the APIs that are related to the *Azure Maps Indoor* module:
> [Drawing package requirements](drawing-requirements.md)

>[!div class="nextstepaction"]
-> [Creator (Preview) for indoor maps](creator-indoor-maps.md)
+> [Creator for indoor maps](creator-indoor-maps.md)
Learn more about how to add more data to your map:
azure-maps Indoor Map Dynamic Styling https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/indoor-map-dynamic-styling.md
Title: Implement dynamic styling for Azure Maps Creator (Preview) indoor maps
-description: Learn how to Implement dynamic styling for Creator (Preview) indoor maps
+ Title: Implement dynamic styling for Azure Maps Creator indoor maps
+description: Learn how to Implement dynamic styling for Creator indoor maps
Previously updated : 12/07/2020
Last updated : 05/20/2021
-# Implement dynamic styling for Creator (Preview) indoor maps
+# Implement dynamic styling for Creator indoor maps
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
-Azure Maps Creator [Feature State service](/rest/api/maps/featurestate) lets you apply styles based on the dynamic properties of indoor map data features. For example, you can render facility meeting rooms with a specific color to reflect occupancy status. In this article, we'll show you how to dynamically render indoor map features with the [Feature State service](/rest/api/maps/featurestate) and the [Indoor Web Module](how-to-use-indoor-module.md).
+You can use Azure Maps Creator [Feature State service](/rest/api/maps/v2/featurestate) to apply styles that are based on the dynamic properties of indoor map data features. For example, you can render facility meeting rooms with a specific color to reflect occupancy status. This article describes how to dynamically render indoor map features with the [Feature State service](/rest/api/maps/v2/featurestate) and the [Indoor Web module](how-to-use-indoor-module.md).
## Prerequisites

1. [Create an Azure Maps account](quick-demo-map-app.md#create-an-azure-maps-account)
2. [Obtain a primary subscription key](quick-demo-map-app.md#get-the-primary-key-for-your-account), also known as the primary key or the subscription key.
-3. [Create a Creator (Preview) resource](how-to-manage-creator.md)
+3. [Create a Creator resource](how-to-manage-creator.md)
4. Download the [sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples). 5. [Create an indoor map](tutorial-creator-indoor-maps.md) to obtain a `tilesetId` and `statesetId`. 6. Build a web application by following the steps in [How to use the Indoor Map module](how-to-use-indoor-module.md).
This tutorial uses the [Postman](https://www.postman.com/) application, but you may use a different API development environment.
## Implement dynamic styling
-Once you complete the prerequisites, you should have a simple web application configured with your subscription key, `tilesetId`, and `statesetId`.
+After you complete the prerequisites, you should have a simple web application configured with your subscription key, `tilesetId`, and `statesetId`.
### Select features
-To implement dynamic styling, a feature, such as a meeting or conference room, must be referenced by its feature `id`. You'll use the feature `id` to update the dynamic property or *state* of that feature. To view the features defined in a dataset, you can use one of the following methods:
+To implement dynamic styling, a feature - such as a meeting or conference room - must be referenced by its feature `id`. You use the feature `id` to update the dynamic property or *state* of that feature. To view the features defined in a dataset, you can use one of the following methods:
-* WFS API (Web Feature Service). Datasets can be queried using the WFS API. WFS follows the Open Geospatial Consortium API Features. The WFS API is helpful for querying features within a dataset. For example, you can use WFS to find all mid-size meeting rooms of a given facility and floor level.
+* WFS API (Web Feature service). You can use the [WFS API](/rest/api/maps/v2/wfs) to query datasets. WFS follows the [Open Geospatial Consortium API Features](http://docs.opengeospatial.org/DRAFTS/17-069r1.html). The WFS API is helpful for querying features within a dataset. For example, you can use WFS to find all mid-size meeting rooms of a specific facility and floor level.
-* Implement customized code that allows a user to select features on a map using your web application. In this article, we'll make use of this option.
+* Implement customized code that a user can use to select features on a map using your web application. We use this option in this article.
-The following script implements the mouse click event. The code retrieves the feature `id` based on the clicked point. In your application, you can insert the code below your Indoor Manager code block. Run your application and check the console to obtain the feature `id` of the clicked point.
+The following script implements the mouse-click event. The code retrieves the feature `id` based on the clicked point. In your application, you can insert the code after your Indoor Manager code block. Run your application, and then check the console to obtain the feature `id` of the clicked point.
```javascript
/* Upon a mouse click, log the feature properties to the browser's console. */
map.events.add("click", function(e){
The [Create an indoor map](tutorial-creator-indoor-maps.md) tutorial configured the feature stateset to accept state updates for `occupancy`.
-In the next section, we'll set the occupancy *state* of office `UNIT26` to `true`. while office `UNIT27` will be set to `false`.
+In the next section, we'll set the occupancy *state* of office `UNIT26` to `true` and office `UNIT27` to `false`.
### Set occupancy status

We'll now update the state of the two offices, `UNIT26` and `UNIT27`:
-1. In the Postman application, select **New**. In the **Create New** window, select **Request**. Enter a **Request name** and select a collection. Click **Save**
+1. In the Postman app, select **New**.
+
+2. In the **Create New** window, select **Collection**.
+
+3. Select **New** again.
+
+4. In the **Create New** window, select **Request**.
-2. Use the [Feature Update States API](/rest/api/maps/featurestate/updatestatespreview) to update the state. Pass the stateset ID, and `UNIT26` for one of the two units. Append your Azure Maps subscription key. Here's the URL of a **POST** request to update the state:
+5. Enter a **Request name** for the request, such as *POST Data Upload*.
+
+6. Select the collection you previously created, and then select **Save**.
+
+7. Enter the following URL to the [Feature Update States API](/rest/api/maps/v2/featurestate/updatestatespreview) (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key and `{statesetId}` with the `statesetId` of your stateset):
```http
- https://atlas.microsoft.com/featureState/state?api-version=1.0&statesetID={statesetId}&featureID=UNIT26&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/featurestatesets/{statesetId}/featureStates/UNIT26?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-3. In the **Headers** of the **POST** request, set `Content-Type` to `application/json`. In the **BODY** of the **POST** request, write the following raw JSON with the feature updates. The update will be saved only if the posted time stamp is after the time stamp used in previous feature state update requests for the same feature `ID`. Pass the "occupied" `keyName` to update its value.
+8. Select the **Headers** tab.
+
+9. In the **KEY** field, select `Content-Type`. In the **VALUE** field, select `application/json`.
+
+    :::image type="content" source="./media/indoor-map-dynamic-styling/stateset-header.png" alt-text="Header tab information for stateset creation.":::
+
+10. Select the **Body** tab.
+
+11. In the dropdown lists, select **raw** and **JSON**.
+
+12. Copy the following JSON style, and then paste it in the **Body** window:
```json
{
    "states": [
        {
            "keyName": "occupied",
            "value": true,
-           "eventTimestamp": "2019-11-14T17:10:20"
+           "eventTimestamp": "2020-11-14T17:10:20"
        }
    ]
}
```
-4. Redo step 2 and 3 using `UNIT27`, with the following JSON.
+ >[!IMPORTANT]
+ >The update will be saved only if the posted time stamp is after the time stamp used in previous feature state update requests for the same feature `ID`.
+
+13. Change the URL you used in step 7 by replacing `UNIT26` with `UNIT27`:
+
+ ```http
+ https://us.atlas.microsoft.com/featurestatesets/{statesetId}/featureStates/UNIT27?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
+ ```
+
+14. Copy the following JSON style, and then paste it in the **Body** window:
```json
{
    "states": [
        {
            "keyName": "occupied",
            "value": false,
-           "eventTimestamp": "2019-11-14T17:10:20"
+           "eventTimestamp": "2020-11-14T17:10:20"
        }
    ]
}
```
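The feature-state requests above can also be scripted outside Postman. The following Python sketch only assembles the URL and JSON body used in this section; it sends nothing, and the helper name is illustrative rather than part of any Azure Maps SDK.

```python
import json

def build_feature_state_request(stateset_id, feature_id, occupied,
                                timestamp, subscription_key):
    """Assemble the Feature Update States URL and body shown in this section."""
    url = (
        f"https://us.atlas.microsoft.com/featurestatesets/{stateset_id}"
        f"/featureStates/{feature_id}"
        f"?api-version=2.0&subscription-key={subscription_key}"
    )
    body = json.dumps({
        "states": [
            {"keyName": "occupied", "value": occupied,
             "eventTimestamp": timestamp}
        ]
    })
    return url, body

# Mirror the UNIT26 request from step 7; placeholders stay placeholders.
url, body = build_feature_state_request(
    "{statesetId}", "UNIT26", True, "2020-11-14T17:10:20",
    "{Azure-Maps-Primary-Subscription-key}"
)
```

Sending the request is then a plain HTTP call with a `Content-Type: application/json` header, exactly as in the Postman steps.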
### Visualize dynamic styles on a map
-The web application you previously opened in a browser should now reflect the updated state of the map features. `UNIT27`(142) should appear green and `UNIT26`(143) should appear red.
+The web application that you previously opened in a browser should now reflect the updated state of the map features:
+- Office `UNIT27`(142) should appear green.
+- Office `UNIT26`(143) should appear red.
![Free room in green and Busy room in red](./media/indoor-map-dynamic-styling/room-state.png)
Learn more by reading:

> [!div class="nextstepaction"]
-> [Creator (Preview) for indoor mapping](creator-indoor-maps.md)
+> [Creator for indoor mapping](creator-indoor-maps.md)
-See to the references for the APIs mentioned in this article:
+See the references for the APIs mentioned in this article:
> [!div class="nextstepaction"]
> [Data Upload](creator-indoor-maps.md#upload-a-drawing-package)

> [!div class="nextstepaction"]
> [Feature State set](creator-indoor-maps.md#feature-statesets)

> [!div class="nextstepaction"]
-> [WFS service](creator-indoor-maps.md#web-feature-service-api)
+> [WFS service](creator-indoor-maps.md#web-feature-service-api)
azure-maps Migrate From Bing Maps Web Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/migrate-from-bing-maps-web-services.md
The following table provides the Azure Maps service APIs that provide similar fu
| Spatial Data Services (SDS) | [Search](/rest/api/maps/search) + [Route](/rest/api/maps/route) + other Azure Services |
| Time Zone | [Time Zone](/rest/api/maps/timezone) |
| Traffic Incidents | [Traffic Incident Details](/rest/api/maps/traffic/gettrafficincidentdetail) |
-| Elevation | [Elevation (Preview)](/rest/api/maps/elevation)
+| Elevation | [Elevation](/rest/api/maps/elevation)
The following service APIs are not currently available in Azure Maps:
Azure Maps has several additional REST web services that may be of interest:

-- [Azure Maps Creator (Preview)](./creator-indoor-maps.md) – Create a custom private digital twin of buildings and spaces.
+- [Azure Maps Creator](./creator-indoor-maps.md) – Create a custom private digital twin of buildings and spaces.
- [Spatial operations](/rest/api/maps/spatial) – Offload complex spatial calculations and operations, such as geofencing, to a service.
- [Map Tiles](/rest/api/maps/render/getmaptile) – Access road and imagery tiles from Azure Maps as raster and vector tiles.
- [Batch routing](/rest/api/maps/route/postroutedirectionsbatchpreview) – Allows up to 1,000 route requests to be made in a single batch over a period of time. Routes are calculated in parallel on the server for faster processing.
azure-maps Migrate From Bing Maps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/migrate-from-bing-maps.md
The following table provides a high-level list of Bing Maps features and the rel
| Autosuggest | ✓ |
| Directions (including truck) | ✓ |
| Distance Matrix | ✓ |
-| Elevations | ✓ (Preview) |
+| Elevations | ✓ |
| Imagery – Static Map | ✓ |
| Imagery Metadata | ✓ |
| Isochrones | ✓ |
azure-maps Migrate From Google Maps Web Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/migrate-from-google-maps-web-services.md
The table shows the Azure Maps service APIs, which have a similar functionality
| Speed Limits | See [Reverse geocode a coordinate](#reverse-geocode-a-coordinate) section. |
| Static Map | [Render](/rest/api/maps/render/getmapimage) |
| Time Zone | [Time Zone](/rest/api/maps/timezone) |
-| Elevation | [Elevation (Preview)](/rest/api/maps/elevation) |
+| Elevation | [Elevation](/rest/api/maps/elevation) |
The following service APIs aren't currently available in Azure Maps:
The Azure Maps routing service provides the following APIs for calculating route
- [**Calculate route**](/rest/api/maps/route/getroutedirections): Calculate a route and have the request processed immediately. This API supports both GET and POST requests. POST requests are recommended when specifying a large number of waypoints or when using lots of the route options to ensure that the URL request doesn't become too long and cause issues. The POST Route Direction in Azure Maps has an option that can take in thousands of [supporting points](/rest/api/maps/route/postroutedirections#supportingpoints) and will use them to recreate a logical route path between them (snap to road).
- [**Batch route**](/rest/api/maps/route/postroutedirectionsbatchpreview): Create a request containing up to 1,000 route requests and have them processed over a period of time. All the data is processed in parallel on the server; when completed, the full result set can be downloaded.
-- [**Mobility services (Preview)**](/rest/api/maps/mobility): Calculate routes and directions using public transit.
+- [**Mobility services (Preview)**](/rest/api/maps/mobility): Calculate routes and directions using public transit.
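As a concrete illustration of the supporting-points option mentioned for **Calculate route**, the POST body carries a GeoJSON `GeometryCollection` of longitude/latitude points. A minimal Python sketch (the coordinates and helper name are illustrative):

```python
import json

def route_body_with_supporting_points(points):
    """Build a POST Route Directions body that asks the service to snap the
    route through the given (longitude, latitude) points."""
    return json.dumps({
        "supportingPoints": {
            "type": "GeometryCollection",
            "geometries": [
                {"type": "Point", "coordinates": [lon, lat]}
                for lon, lat in points
            ],
        }
    })

# Two illustrative points in Berlin.
body = route_body_with_supporting_points(
    [(13.42936, 52.50931), (13.43872, 52.50274)]
)
```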
The table cross-references the Google Maps API parameters with the comparable API parameters in Azure Maps.
azure-maps Open Source Projects https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/open-source-projects.md
The following is a list of open-source projects that extend the capabilities of
| Project Name | Description |
|-|-|
| [Azure Maps Docs](https://github.com/MicrosoftDocs/azure-docs/tree/master/articles/azure-maps) | Source for all Azure Location Based Services documentation. |
-| [Azure Maps Creator (Preview) Tools](https://github.com/Azure-Samples/AzureMapsCreator) | Python tools for Azure Maps Creator (Preview) Tools. |
+| [Azure Maps Creator Tools](https://github.com/Azure-Samples/AzureMapsCreator) | Python tools for Azure Maps Creator Tools. |
A longer list of open-source projects for Azure Maps that includes community-created projects is available [here](https://github.com/microsoft/Maps/blob/master/AzureMaps.md).
azure-maps Schema Stateset Stylesobject https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/schema-stateset-stylesobject.md
# StylesObject Schema reference guide for dynamic Maps
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
- The `StylesObject` is a `StyleObject` array representing stateset styles. Use the Azure Maps Creator (Preview) [Feature State service](/rest/api/maps/featurestate) to apply your stateset styles to indoor map data features. Once you've created your stateset styles and associated them with indoor map features, you can use them to create dynamic indoor maps. For more information on creating dynamic indoor maps, see [Implement dynamic styling for Creator indoor maps](indoor-map-dynamic-styling.md).
+ The `StylesObject` is a `StyleObject` array representing stateset styles. Use the Azure Maps Creator [Feature State service](/rest/api/maps/v2/featurestate) to apply your stateset styles to indoor map data features. Once you've created your stateset styles and associated them with indoor map features, you can use them to create dynamic indoor maps. For more information on creating dynamic indoor maps, see [Implement dynamic styling for Creator indoor maps](indoor-map-dynamic-styling.md).
## StyleObject
azure-maps Tutorial Creator Indoor Maps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/tutorial-creator-indoor-maps.md
Title: 'Tutorial: Use Microsoft Azure Maps Creator (Preview) to create indoor maps'
-description: Tutorial on how to use Microsoft Azure Maps Creator (Preview) to create indoor maps
+ Title: 'Tutorial: Use Microsoft Azure Maps Creator to create indoor maps'
+description: Tutorial on how to use Microsoft Azure Maps Creator to create indoor maps
Previously updated : 12/07/2020 Last updated : 5/19/2021
-# Tutorial: Use Creator (Preview) to create indoor maps
+# Tutorial: Use Creator to create indoor maps
-> [!IMPORTANT]
-> Azure Maps Creator services are currently in public preview.
-> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
-> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-
-This tutorial shows you how to create indoor maps. In this tutorial, you'll learn how to use the API to:
+This tutorial describes how to create indoor maps. In this tutorial, you'll learn how to:
> [!div class="checklist"]
-> * Upload your indoor map Drawing package
-> * Convert your Drawing package into map data
-> * Create a dataset from your map data
-> * Create a tileset from the data in your dataset
-> * Query the Azure Maps Web Feature Service (WFS) API to learn about your map features
-> * Create a feature stateset by using your map features and the data in your dataset
-> * Update your feature stateset
+> * Upload your indoor map Drawing package.
+> * Convert your Drawing package into map data.
+> * Create a dataset from your map data.
+> * Create a tileset from the data in your dataset.
+> * Query the Azure Maps Web Feature Service (WFS) API to learn about your map features.
+> * Create a feature stateset by using your map features and the data in your dataset.
+> * Update your feature stateset.
## Prerequisites
-To create indoor maps:
-
-1. [Make an Azure Maps account](quick-demo-map-app.md#create-an-azure-maps-account)
+1. [Make an Azure Maps account](quick-demo-map-app.md#create-an-azure-maps-account).
2. [Obtain a primary subscription key](quick-demo-map-app.md#get-the-primary-key-for-your-account), also known as the primary key or the subscription key.
-3. [Create a Creator (Preview) resource](how-to-manage-creator.md)
+3. [Create a Creator resource](how-to-manage-creator.md).
4. Download the [Sample Drawing package](https://github.com/Azure-Samples/am-creator-indoor-data-examples/blob/master/Sample%20-%20Contoso%20Drawing%20Package.zip).
-This tutorial uses the [Postman](https://www.postman.com/) application, but you may choose a different API development environment.
+This tutorial uses the [Postman](https://www.postman.com/) application, but you can use a different API development environment.
>[!IMPORTANT]
-> The API urls in this document may have to be adjusted according to the location of your Creator resource. For more details, see [Access to Creator Services](how-to-manage-creator.md#access-to-creator-services).
+> This tutorial uses the `us.atlas.microsoft.com` geographical URL. If your Creator service wasn't created in the United States, you must use a different geographical URL. For more information, see [Access to Creator Services](how-to-manage-creator.md#access-to-creator-services). To view mappings of region to geographical location, [see Creator service geographic scope](creator-geographic-scope.md).
## Upload a Drawing package
-Use the [Data Upload API](/rest/api/maps/data/uploadpreview) to upload the Drawing package to Azure Maps resources.
+Use the [Data Upload API](/rest/api/maps/data%20v2/uploadpreview) to upload the Drawing package to Azure Maps resources.
+
+The Data Upload API is a long running transaction that implements the pattern defined in [Creator Long-Running Operation API V2](creator-long-running-operation-v2.md).
+
+To upload the Drawing package:
+
+1. In the Postman app, select **New**.
+
+2. In the **Create New** window, select **Collection**.
-The Data Upload API is a long running transaction that implements the pattern defined here. Once the operation completes, we'll use the `udid` to access the uploaded package to convert it. Follow the steps below to obtain the `udid`.
+3. Select **New** again.
-1. Open the Postman app. Near the top of the Postman app, select **New**. In the **Create New** window, select **Collection**. Name the collection and select the **Create** button.
+4. In the **Create New** window, select **Request**.
-2. To create the request, select **New** again. In the **Create New** window, select **Request**. Enter a **Request name** for the request. Select the collection you created in the previous step, and then select **Save**.
+5. Enter a **Request name** for the request, such as *POST Data Upload*.
-3. Select the **POST** HTTP method in the builder tab and enter the following URL to upload the Drawing package to the Azure Maps service. For this request, and other requests mentioned in this article, replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key.
+6. Select the collection you previously created, and then select **Save**.
+
+7. Select the **POST** HTTP method.
+
+8. Enter the following URL to the [Data Upload API](/rest/api/maps/data%20v2/uploadpreview):
```http
- https://atlas.microsoft.com/mapData/upload?api-version=1.0&dataFormat=zip&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/mapData?api-version=2.0&dataFormat=dwgzippackage&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-4. In the **Headers** tab, specify a value for the `Content-Type` key. The Drawing package is a zipped folder, so use the `application/octet-stream` value. In the **Body** tab, select **binary**. Click on **Select File** and choose a Drawing package.
+ >[!Important]
+ >For this request, and other requests mentioned in this article, replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key.
+
+9. Select the **Headers** tab.
+
+10. In the **KEY** field, select `Content-Type`.
+
+11. In the **VALUE** field, select `application/octet-stream`.
+
+    :::image type="content" source="./media/tutorial-creator-indoor-maps/data-upload-header.png" alt-text="Header tab information for data upload.":::
+
+12. Select the **Body** tab.
+
+13. In the dropdown list, select **binary**.
+
+14. Select **Select File**, and then select a Drawing package.
+
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/data-upload-body.png" alt-text="Select a Drawing package.":::
+
+15. Select **Send**.
+
+16. In the response window, select the **Headers** tab.
+
+17. Copy the value of the **Operation-Location** key, which is the `status URL`. We'll use the `status URL` to check the status of the Drawing package upload.
+
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/data-upload-response-header.png" alt-text="Copy the status URL in the Location key.":::
+
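If you later script the upload instead of using Postman, the request pieces from steps 8 through 11 above can be assembled as follows. This Python sketch only builds the URL and headers from this section; it doesn't transmit the package, and the helper name and `geography` parameter are assumptions for illustration.

```python
def build_upload_request(subscription_key, geography="us"):
    """Assemble URL and headers for the Data Upload call in this section."""
    url = (
        f"https://{geography}.atlas.microsoft.com/mapData"
        f"?api-version=2.0&dataFormat=dwgzippackage"
        f"&subscription-key={subscription_key}"
    )
    # The Drawing package is a zip file, hence the octet-stream content type.
    headers = {"Content-Type": "application/octet-stream"}
    return url, headers

url, headers = build_upload_request("{Azure-Maps-Primary-Subscription-key}")
```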
+### Check the Drawing package upload status
- ![data-management](./media/tutorial-creator-indoor-maps/enter-content-type-dialog.png)
+To check the status of the Drawing package and retrieve its unique identifier (`udid`):
-5. Click the blue **Send** button and wait for the request to process. Once the request completes, go to the **Headers** tab of the response. Copy the value of the **Location** key, which is the `status URL`.
+1. Select **New**.
-6. To check the status of the API call, create a **GET** HTTP request on the `status URL`. You'll need to append your primary subscription key to the URL for authentication. The **GET** request should look like the following URL:
+2. In the **Create New** window, select **Request**.
+
+3. Enter a **Request name** for the request, such as *GET Data Upload Status*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **GET** HTTP method.
+
+6. Enter the `status URL` you copied in [Upload a Drawing package](#upload-a-drawing-package). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key):
```http
- https://atlas.microsoft.com/mapData/operations/<operationId>?api-version=1.0&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/mapData/operations/<operationId>?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-7. When the **GET** HTTP request completes successfully, it will return a `resourceLocation`. The `resourceLocation` contains the unique `udid` for the uploaded content. Optionally, you can use the `resourceLocation` URL to retrieve metadata from this resource in the next step.
+7. Select **Send**.
- ```json
- {
- "status": "Succeeded",
- "resourceLocation": "https://atlas.microsoft.com/mapData/metadata/{udid}?api-version=1.0"
- }
- ```
+8. In the response window, select the **Headers** tab.
+
+9. Copy the value of the **Resource-Location** key, which is the `resource location URL`. The `resource location URL` contains the unique identifier (`udid`) of the drawing package resource.
+
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/resource-location-url.png" alt-text="Copy the resource location URL.":::
+
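The check above is one round of Creator's long-running operation pattern: poll the status URL until the operation reports `Succeeded`, then read the **Resource-Location** header. A sketch of that loop in Python, with the HTTP call injected as `fetch_status` (an assumed stand-in for whatever HTTP client you use, not an SDK function):

```python
import time

def poll_operation(status_url, fetch_status, interval_seconds=5,
                   max_attempts=30):
    """Poll a Creator status URL until Succeeded or Failed.

    fetch_status(url) must return (json_body, headers) for a GET on the URL.
    """
    for _ in range(max_attempts):
        body, headers = fetch_status(status_url)
        status = body.get("status")
        if status == "Succeeded":
            # Resource-Location carries the resource URL containing the udid.
            return headers.get("Resource-Location")
        if status == "Failed":
            raise RuntimeError(f"Operation failed: {body}")
        time.sleep(interval_seconds)
    raise TimeoutError("Operation did not complete in time")
```

The same loop works for the conversion and dataset operations later in this tutorial, since they follow the same pattern.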
+### (Optional) Retrieve Drawing package metadata
-8. To retrieve content metadata, create a **GET** HTTP request on the `resourceLocation` URL that was retrieved in step 7. Make sure to append your primary subscription key to the URL for authentication. The **GET** request should like the following URL:
+You can retrieve metadata from the Drawing package resource. The metadata contains information like the resource location URL, creation date, updated date, size, and upload status.
+
+To retrieve content metadata:
+
+1. Select **New**.
+
+2. In the **Create New** window, select **Request**.
+
+3. Enter a **Request name** for the request, such as *GET Data Upload Metadata*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **GET** HTTP method.
+
+6. Enter the `resource location URL` you copied in [Check Drawing package upload status](#check-the-drawing-package-upload-status). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key):
```http
- https://atlas.microsoft.com/mapData/metadata/{udid}?api-version=1.0&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/mapData/metadata/{udid}?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-9. When the **GET** HTTP request completes successfully, the response body will contain the `udid` specified in the `resourceLocation` of step 7, the location to access/download the content in the future, and some other metadata about the content like created/updated date, size, and so on. An example of the overall response is:
+7. Select **Send**.
+
+8. In the response window, select the **Body** tab. The metadata should look like the following JSON fragment:
```json
{
    "udid": "{udid}",
- "location": "https://atlas.microsoft.com/mapData/{udid}?api-version=1.0",
- "created": "2020-02-03T02:32:25.0509366+00:00",
- "updated": "2020-02-11T06:12:13.0309351+00:00",
- "sizeInBytes": 766,
+ "location": "https://us.atlas.microsoft.com/mapData/6ebf1ae1-2a66-760b-e28c-b9381fcff335?api-version=2.0",
+ "created": "5/18/2021 8:10:32 PM +00:00",
+ "updated": "5/18/2021 8:10:37 PM +00:00",
+ "sizeInBytes": 946901,
"uploadStatus": "Completed" } ``` ## Convert a Drawing package
- Now that the Drawing package is uploaded, we'll use `udid` for the uploaded package to convert the package into map data. The Conversion API uses a long running transaction that implements the pattern defined [here](creator-long-running-operation.md). Once the operation completes, we'll use the `conversionId` to access the converted data. Follow the steps below to obtain the `conversionId`.
+Now that the Drawing package is uploaded, we'll use the `udid` for the uploaded package to convert the package into map data. The Conversion API uses a long-running transaction that implements the pattern defined [here](creator-long-running-operation-v2.md).
+
+To convert a Drawing package:
+
+1. Select **New**.
-1. Select **New**. In the **Create New** window, select **Request**. Enter a **Request name** and select a collection. Click **Save**.
+2. In the **Create New** window, select **Request**.
-2. Select the **POST** HTTP method in the builder tab and enter the following URL to convert your uploaded Drawing package into map data. Use the `udid` for the uploaded package.
+3. Enter a **Request name** for the request, such as *POST Convert Drawing Package*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **POST** HTTP method.
+
+6. Enter the following URL to the [Conversion Service](/rest/api/maps/v2/conversion/convertpreview) (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key and `udid` with the `udid` of the uploaded package):
```http
- https://atlas.microsoft.com/conversion/convert?subscription-key={Azure-Maps-Primary-Subscription-key}&api-version=1.0&udid={udid}&inputType=DWG
+ https://us.atlas.microsoft.com/conversions?subscription-key={Azure-Maps-Primary-Subscription-key}&api-version=2.0&udid={udid}&inputType=DWG&outputOntology=facility-2.0
```
- >[!IMPORTANT]
- > The API urls in this document may have to be adjusted according to the location of your Creator resource. For more details, see [Access to Creator services (Preview) ](how-to-manage-creator.md#access-to-creator-services).
- > If you receive an error with code `"RequiresCreatorResource"`, make sure that you have [provisioned an Azure Maps Creator resource](how-to-manage-creator.md) in you Azure Maps account.
+7. Select **Send**.
+
+8. In the response window, select the **Headers** tab.
+
+9. Copy the value of the **Operation-Location** key, which is the `status URL`. We'll use the `status URL` to check the status of the conversion.
+
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/data-convert-location-url.png" border="true" alt-text="Copy the value of the location key for drawing package.":::
+
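The conversion URL from step 6 can likewise be assembled in code. This Python sketch only formats the query string shown above; the helper name is illustrative.

```python
def build_conversion_url(udid, subscription_key, geography="us"):
    """Assemble the Conversion Service URL from step 6."""
    return (
        f"https://{geography}.atlas.microsoft.com/conversions"
        f"?subscription-key={subscription_key}&api-version=2.0"
        f"&udid={udid}&inputType=DWG&outputOntology=facility-2.0"
    )

url = build_conversion_url("{udid}", "{Azure-Maps-Primary-Subscription-key}")
```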
+### Check the Drawing package conversion status
+
+After the conversion operation completes, it returns a `conversionId`. We can access the `conversionId` by checking the status of the Drawing package conversion process. The `conversionId` can then be used to access the converted data.
+
+To check the status of the conversion process and retrieve the `conversionId`:
+
+1. Select **New**.
+
+2. In the **Create New** window, select **Request**.
-3. Click the **Send** button and wait for the request to process. Once the request completes, go to the **Headers** tab of the response, and look for the **Location** key. Copy the value of the **Location** key, which is the `status URL` for the conversion request. You will use this in the next step.
+3. Enter a **Request name** for the request, such as *GET Conversion Status*.
- :::image type="content" source="./media/tutorial-creator-indoor-maps/copy-location-uri-dialog.png" border="true" alt-text="Copy the value of the location key":::
+4. Select the collection you previously created, and then select **Save**.
-4. Start a new **GET** HTTP method in the builder tab. Append your Azure Maps primary subscription key to the `status URL`. Make a **GET** request at the `status URL` that you copied in step 3. The `status URL` looks like the following URL:
+5. Select the **GET** HTTP method.
+
+6. Enter the `status URL` you copied in [Convert a Drawing package](#convert-a-drawing-package). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key):
```http
- https://atlas.microsoft.com/conversion/operations/<operationId>?api-version=1.0&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/conversions/operations/<operationId>?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
```
- If the conversion process hasn't yet completed, you may see something like the following JSON response:
+7. Select **Send**.
- ```json
- {
- "operationId": "<operationId>",
- "created": "2020-04-22T19:39:54.9518496+00:00",
- "status": "Running"
- }
- ```
+8. In the response window, select the **Headers** tab.
-5. Once the request completes successfully, you'll see a success status message in the response body. Copy the `conversionId` from the `resourceLocation` URL for the converted package. The `conversionId` is used by other API to access the converted map data.
+9. Copy the value of the **Resource-Location** key, which is the `resource location URL`. The `resource location URL` contains the unique identifier (`conversionId`), which can be used by other APIs to access the converted map data.
- ```json
- {
- "operationId": "<operationId>",
- "created": "2020-04-22T19:39:54.9518496+00:00",
- "status": "Succeeded",
- "resourceLocation": "https://atlas.microsoft.com/conversion/{conversionId}?api-version=1.0",
- "properties": {}
- }
- ```
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/data-conversion-id.png" alt-text="Copy the conversion ID.":::
->[!NOTE]
->The Postman application does not natively support HTTP Long Running Requests. As a result, you may notice a long delay while making a **GET** request at the status URL. Wait about thirty seconds and try clicking the **Send** button again until the response shows success or fail.
+The sample Drawing package should be converted without errors or warnings. However, if you receive errors or warnings from your own Drawing package, the JSON response includes a link to the [Drawing error visualizer](drawing-error-visualizer.md). You can use the Drawing Error visualizer to inspect the details of errors and warnings. To receive recommendations to resolve conversion errors and warnings, see [Drawing conversion errors and warnings](drawing-conversion-error-codes.md).
-The sample Drawing package should be converted without errors or warnings. However, if you receive errors or warnings from your own Drawing package, the JSON response will give you a link to the [Drawing error visualizer](drawing-error-visualizer.md). The Drawing Error visualizer allows you to inspect the details of errors and warnings. To receive recommendations on how to resolve conversion errors and warnings, see the [Drawing conversion errors and warnings](drawing-conversion-error-codes.md).
+The following JSON fragment displays a sample conversion warning:
```json
{
    "operationId": "<operationId>",
- "created": "2020-04-22T19:39:54.9518496+00:00",
- "status": "Failed",
- "resourceLocation": "https://atlas.microsoft.com/conversion/{conversionId}?api-version=1.0",
+ "created": "2021-05-19T18:24:28.7922905+00:00",
+ "status": "Succeeded",
+ "warning": {
+ "code": "dwgConversionProblem",
+ "details": [
+ {
+ "code": "warning",
+ "details": [
+ {
+ "code": "manifestWarning",
+ "message": "Ignoring unexpected JSON property: unitProperties[0].nonWheelchairAccessible with value False"
+ }
+ ]
+ }
+ ]
+ },
"properties": { "diagnosticPackageLocation": "https://atlas.microsoft.com/mapData/ce61c3c1-faa8-75b7-349f-d863f6523748?api-version=1.0" }
The sample Drawing package should be converted without errors or warnings. Howev
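The warning payload shown above nests `details` arrays inside each other. If you want to surface just the human-readable messages, a small recursive walk is enough; this Python sketch assumes only the shape shown in the sample:

```python
def collect_warning_messages(node):
    """Recursively gather 'message' strings from a conversion warning node."""
    messages = []
    if isinstance(node, dict):
        if "message" in node:
            messages.append(node["message"])
        for child in node.get("details", []):
            messages.extend(collect_warning_messages(child))
    return messages

# The warning object from the sample response above.
warning = {
    "code": "dwgConversionProblem",
    "details": [
        {
            "code": "warning",
            "details": [
                {
                    "code": "manifestWarning",
                    "message": "Ignoring unexpected JSON property: "
                               "unitProperties[0].nonWheelchairAccessible "
                               "with value False",
                }
            ],
        }
    ],
}
```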
## Create a dataset
-The dataset is a collection of map features, such as buildings, levels, and rooms. To create a dataset, use the [Dataset Create API](/rest/api/maps/dataset/createpreview). The dataset Create API takes the `conversionId` for the converted Drawing package and returns a `datasetId` of the created dataset. The steps below show you how to create a dataset.
+A dataset is a collection of map features, such as buildings, levels, and rooms. To create a dataset, use the [Dataset Create API](/rest/api/maps/v2/dataset/createpreview). The Dataset Create API takes the `conversionId` for the converted Drawing package and returns a `datasetId` of the created dataset.
+
+To create a dataset:
+
+1. Select **New**.
+
+2. In the **Create New** window, select **Request**.
+
+3. Enter a **Request name** for the request, such as *POST Dataset Create*.
-1. In the Postman application, select **New**. In the **Create New** window, select **Request**. Enter a **Request name** and select a collection. Click **Save**
+4. Select the collection you previously created, and then select **Save**.
-2. Make a **POST** request to the [Dataset Create API](/rest/api/maps/dataset/createpreview) to create a new dataset. Before submitting the request, append both your subscription key and the `conversionId` with the `conversionId` obtained during the Conversion process in step 5. The request should look like the following URL:
+5. Select the **POST** HTTP method.
+
+6. Enter the following URL to the [Dataset API](/rest/api/maps/v2/dataset/createpreview). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key, and `{conversionId}` with the `conversionId` obtained in [Check Drawing package conversion status](#check-the-drawing-package-conversion-status)):
```http
- https://atlas.microsoft.com/dataset/create?api-version=1.0&conversionID={conversionId}&type=facility&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/datasets?api-version=2.0&conversionId={conversionId}&type=facility&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-3. Obtain the `statusURL` in the **Location** key of the response **Headers**.
+7. Select **Send**.
+
+8. In the response window, select the **Headers** tab.
+
+9. Copy the value of the **Operation-Location** key, which is the `status URL`. We'll use the `status URL` to check the status of the dataset.
+
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/data-dataset-location-url.png" border="true" alt-text="Copy the value of the location key for dataset.":::
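The Postman steps above translate directly into code. The following Python sketch (standard library only; the key and conversion ID are placeholder values, and the actual POST is left commented out) builds the same Dataset Create request and reads the status URL from the `Operation-Location` response header:

```python
# Minimal sketch of steps 5-9 above. The subscription key and conversionId
# passed in below are placeholders, not real values.
import urllib.request

def build_dataset_create_url(subscription_key: str, conversion_id: str) -> str:
    """Build the Dataset Create URL shown in step 6."""
    return (
        "https://us.atlas.microsoft.com/datasets"
        f"?api-version=2.0&conversionId={conversion_id}"
        f"&type=facility&subscription-key={subscription_key}"
    )

def status_url_from_headers(headers) -> str:
    """Step 9: the status URL arrives in the Operation-Location response header."""
    return headers["Operation-Location"]

if __name__ == "__main__":
    url = build_dataset_create_url("my-key", "my-conversion-id")
    req = urllib.request.Request(url, method="POST")
    # Uncomment and run only with real credentials:
    # with urllib.request.urlopen(req) as resp:
    #     print(status_url_from_headers(resp.headers))
```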
+
+### Check the dataset creation status
-4. Make a **GET** request at the `statusURL` to obtain the `datasetId`. Append your Azure Maps primary subscription key for authentication. The request should look like the following URL:
+To check the status of the dataset creation process and retrieve the `datasetId`:
+
+1. Select **New**.
+
+2. In the **Create New** window, select **Request**.
+
+3. Enter a **Request name** for the request, such as *GET Dataset Status*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **GET** HTTP method.
+
+6. Enter the `status URL` you copied in [Create a dataset](#create-a-dataset). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key):
```http
- https://atlas.microsoft.com/dataset/operations/<operationId>?api-version=1.0&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/datasets/operations/<operationId>?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-5. When the **GET** HTTP request completes successfully, the response header will contain the `datasetId` for the created dataset. Copy the `datasetId`. You'll need to use the `datasetId` to create a tileset.
+7. Select **Send**.
- ```json
- {
- "operationId": "<operationId>",
- "created": "2020-04-22T19:52:38.9352189+00:00",
- "status": "Succeeded",
- "resourceLocation": "https://azure.microsoft.com/dataset/{datasetiId}?api-version=1.0"
- }
- ```
+8. In the response window, select the **Headers** tab. The value of the **Resource-Location** key is the `resource location URL`. The `resource location URL` contains the unique identifier (`datasetId`) of the dataset.
+
+9. Copy the `datasetId`, because you'll use it in the next sections of this tutorial.
+
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/dataset-id.png" alt-text="Copy the dataset ID.":::
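In code, extracting the `datasetId` from the `resource location URL` amounts to taking the last path segment. A small sketch (the example URL below is illustrative, not a real resource):

```python
# The datasetId is the last path segment of the Resource-Location URL (step 8).
from urllib.parse import urlparse

def dataset_id_from_resource_location(resource_location: str) -> str:
    path = urlparse(resource_location).path
    return path.rstrip("/").rsplit("/", 1)[-1]

example = "https://us.atlas.microsoft.com/datasets/my-dataset-id?api-version=2.0"
print(dataset_id_from_resource_location(example))  # -> my-dataset-id
```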
## Create a tileset
-A tileset is a set of vector tiles that render on the map. Tilesets are created from existing datasets. However, a tileset is independent from the dataset from which it was sourced. If the dataset is deleted, the tileset will continue to exist. To create a tileset, follow the steps below:
+A tileset is a set of vector tiles that render on the map. Tilesets are created from existing datasets. However, a tileset is independent from the dataset from which it was sourced. If the dataset is deleted, the tileset continues to exist.
+
+To create a tileset:
+
+1. Select **New**.
+
+2. In the **Create New** window, select **Request**.
+
+3. Enter a **Request name** for the request, such as *POST Tileset Create*.
+
+4. Select the collection you previously created, and then select **Save**.
-1. In the Postman application, select **New**. In the **Create New** window, select **Request**. Enter a **Request name** and select a collection. Click **Save**
+5. Select the **POST** HTTP method.
-2. Make a **POST** request in the builder tab. The request URL should look like the following URL:
+6. Enter the following URL to the [Tileset API](/rest/api/maps/v2/tileset/createpreview). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key, and `{datasetId}` with the `datasetId` obtained in [Check dataset creation status](#check-the-dataset-creation-status)):
```http
- https://atlas.microsoft.com/tileset/create/vector?api-version=1.0&datasetID={datasetId}&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/tilesets?api-version=2.0&datasetID={datasetId}&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-3. Make a **GET** request at the `statusURL` for the tileset. Append your Azure Maps primary subscription key for authentication. The request should look like the following URL:
+7. Select **Send**.
- ```http
- https://atlas.microsoft.com/tileset/operations/<operationId>?api-version=1.0&subscription-key={Azure-Maps-Primary-Subscription-key}
- ```
+8. In the response window, select the **Headers** tab.
-4. When the **GET** HTTP request completes successfully, the response header will contain the `tilesetId` for the created tileset. Copy the `tilesetId`.
+9. Copy the value of the **Operation-Location** key, which is the `status URL`. We'll use the `status URL` to check the status of the tileset.
- ```json
- {
- "operationId": "<operationId>",
- "createdDateTime": "3/11/2020 8:45:13 PM +00:00",
- "status": "Succeeded",
- "resourceLocation": "https://atlas.microsoft.com/tileset/{tilesetId}?api-version=1.0"
- }
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/data-tileset-location-url.png" border="true" alt-text="Copy the value of the tileset status url.":::
+
+### Check the tileset creation status
+
+To check the status of the tileset creation process and retrieve the `tilesetId`:
+
+1. Select **New**.
+
+2. In the **Create New** window, select **Request**.
+
+3. Enter a **Request name** for the request, such as *GET Tileset Status*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **GET** HTTP method.
+
+6. Enter the `status URL` you copied in [Create a tileset](#create-a-tileset). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key):
+
+ ```http
+ https://us.atlas.microsoft.com/tilesets/operations/<operationId>?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
```
+7. Select **Send**.
+
+8. In the response window, select the **Headers** tab. The value of the **Resource-Location** key is the `resource location URL`. The `resource location URL` contains the unique identifier (`tilesetId`) of the tileset.
+
+ :::image type="content" source="./media/tutorial-creator-indoor-maps/tileset-id.png" alt-text="Copy the tileset ID.":::
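Both the dataset and tileset workflows follow the same create-then-poll pattern. The sketch below captures it; `fetch` is a caller-supplied function (an assumption of this sketch) that issues the GET from step 6 and returns the operation status plus the response headers:

```python
# Generic long-running-operation polling helper for the pattern used above.
import time

def poll_operation(status_url, fetch, interval=1.0, max_tries=30):
    """Poll until the operation reaches a terminal status.

    `fetch(status_url)` must return (status, headers) for one GET request.
    """
    for _ in range(max_tries):
        status, headers = fetch(status_url)
        if status in ("Succeeded", "Failed"):
            # On success, Resource-Location carries the resource URL (step 8).
            return status, headers.get("Resource-Location")
        time.sleep(interval)
    raise TimeoutError("operation did not finish within max_tries polls")
```

In practice you would pass a `fetch` that appends your subscription key and reads the JSON `status` field from the response body.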
+ ## Query datasets with WFS API
- Datasets can be queried using [WFS API](/rest/api/maps/wfs). With the WFS API you can query for feature collections, a specific collection, or a specific feature with a feature **ID**. The feature **ID** uniquely identifies the feature within the dataset. It's used, for example, to identify which feature state should be updated in a given stateset.
+Datasets can be queried using [WFS API](/rest/api/maps/v2/wfs). You can use the WFS API to query for all feature collections or a specific collection. In this section of the tutorial, we'll do both: first we'll query all collections, and then we'll query for the `unit` collection.
+
+### Query for feature collections
+
+To query all the collections in your dataset:
+
+1. Select **New**.
-1. In the Postman application, select **New**. In the **Create New** window, select **Request**. Enter a **Request name** and select a collection. Click **Save**
+2. In the **Create New** window, select **Request**.
-2. Make a **GET** request to view a list of the collections in your dataset. Replace `<dataset-id>` with your `datasetId`. Use your Azure Maps primary key instead of the placeholder. The request should look like the following URL:
+3. Enter a **Request name** for the request, such as *GET Dataset Collections*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **GET** HTTP method.
+
+6. Enter the following URL to [WFS API](/rest/api/maps/v2/wfs). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key, and `{datasetId}` with the `datasetId` obtained in [Check dataset creation status](#check-the-dataset-creation-status)):
```http
- https://atlas.microsoft.com/wfs/datasets/{datasetId}/collections?subscription-key={Azure-Maps-Primary-Subscription-key}&api-version=1.0
+ https://us.atlas.microsoft.com/wfs/datasets/{datasetId}/collections?subscription-key={Azure-Maps-Primary-Subscription-key}&api-version=2.0
```
-3. The response body will be delivered in GeoJSON format and will contain all collections in the dataset. For simplicity, the example here only shows the `unit` collection. To see an example that contains all collections, see [WFS Describe Collections API](/rest/api/maps/wfs/collectiondescriptionpreview). To learn more about any collection, you can click on any of the URLs inside the `link` element.
+7. Select **Send**.
+
+8. The response body is returned in GeoJSON format and contains all collections in the dataset. For simplicity, the example here only shows the `unit` collection. To see an example that contains all collections, see [WFS Describe Collections API](/rest/api/maps/v2/wfs/collectiondescriptionpreview). To learn more about any collection, you can select any of the URLs inside the `link` element.
```json {
}, ```
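The same query can be scripted. This sketch builds the WFS collections URL and lists collection names from a response body; the key and `datasetId` are placeholders, and only the `collections`/`name` shape of the sample response is assumed from the `unit` example above:

```python
# Sketch of the WFS collections query in this section.
import json

def build_wfs_collections_url(subscription_key: str, dataset_id: str) -> str:
    return (
        f"https://us.atlas.microsoft.com/wfs/datasets/{dataset_id}/collections"
        f"?subscription-key={subscription_key}&api-version=2.0"
    )

def collection_names(response_body: str):
    """Pull the `name` of each collection out of a WFS collections response."""
    doc = json.loads(response_body)
    return [c["name"] for c in doc.get("collections", [])]

# Illustrative response fragment:
sample = '{"collections": [{"name": "unit"}, {"name": "zone"}]}'
print(collection_names(sample))  # -> ['unit', 'zone']
```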
-4. Make a **GET** request for the `unit` feature collections. Replace `{datasetId}` with your `datasetId`. Use your Azure Maps primary key instead of the placeholder. The response body will contain all the features of the `unit` collection. The request should look like the following URL:
+### Query for unit feature collection
+
+In this section, we'll query [WFS API](/rest/api/maps/v2/wfs) for the `unit` feature collection.
+
+To query the unit collection in your dataset:
+
+1. Select **New**.
+
+2. In the **Create New** window, select **Request**.
+
+3. Enter a **Request name** for the request, such as *GET Unit Collection*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **GET** HTTP method.
+
+6. Enter the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key, and `{datasetId}` with the `datasetId` obtained in [Check dataset creation status](#check-the-dataset-creation-status)):
```http
- https://atlas.microsoft.com/wfs/datasets/{datasetId}/collections/unit/items?subscription-key={Azure-Maps-Primary-Subscription-key}&api-version=1.0
+ https://us.atlas.microsoft.com/wfs/datasets/{datasetId}/collections/unit/items?subscription-key={Azure-Maps-Primary-Subscription-key}&api-version=2.0
```
-5. Copy the feature `id` for a unit feature that has style properties that can be dynamically modified. Because the unit occupancy status and temperature can be dynamically updated, we'll use this feature `id` in the next section. In the following example, the feature `id` is "UNIT26". we'll refer to the style properties of this feature as states, and we'll use the feature to make a stateset.
+7. Select **Send**.
+
+8. After the response returns, copy the feature `id` for one of the `unit` features. In the following example, the feature `id` is "UNIT26". In this tutorial, we'll use "UNIT26" as our feature `id` in the next section.
```json {
## Create a feature stateset
-1. In the Postman application, select **New**. In the **Create New** window, select **Request**. Enter a **Request name** and select a collection. Click **Save**
+Feature statesets define dynamic properties and values on specific features that support them. In this section, we'll create a stateset that defines boolean values and corresponding styles for the **occupied** property.
+
+To create a stateset:
+
+1. Select **New**.
-2. Make a **POST** request to the [Create Stateset API](/rest/api/maps/featurestate/createstatesetpreview). Use the `datasetId` of the dataset that contains the state you want to modify. The request should look like the following URL:
+2. In the **Create New** window, select **Request**.
+
+3. Enter a **Request name** for the request, such as *POST Create Stateset*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **POST** HTTP method.
+
+6. Enter the following URL to the [Stateset API](/rest/api/maps/v2/featurestate/createstatesetpreview). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key, and `{datasetId}` with the `datasetId` obtained in [Check dataset creation status](#check-the-dataset-creation-status)):
```http
- https://atlas.microsoft.com/featureState/stateset?api-version=1.0&datasetId={datasetId}&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/featurestatesets?api-version=2.0&datasetId={datasetId}&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-3. In the **Headers** of the **POST** request, set `Content-Type` to `application/json`. In the **Body**, provide the raw json styles below to reflect changes to the `occupied` and `temperature` *states*. When you're done, click **Send**.
+7. Select the **Headers** tab.
+
+8. In the **KEY** field, select `Content-Type`.
+
+9. In the **VALUE** field, select `application/json`.
+
    :::image type="content" source="./media/tutorial-creator-indoor-maps/stateset-header.png" alt-text="Header tab information for stateset creation.":::
+
+10. Select the **Body** tab.
+
+11. In the dropdown lists, select **raw** and **JSON**.
+
+12. Copy the following JSON styles, and then paste them in the **Body** window:
```json {
"false":"#00FF00" } ]
- },
- {
- "keyname":"temperature",
- "type":"number",
- "rules":[
- {
- "range":{
- "exclusiveMaximum":66
- },
- "color":"#00204e"
- },
- {
- "range":{
- "minimum":66,
- "exclusiveMaximum":70
- },
- "color":"#0278da"
- },
- {
- "range":{
- "minimum":70,
- "exclusiveMaximum":74
- },
- "color":"#187d1d"
- },
- {
- "range":{
- "minimum":74,
- "exclusiveMaximum":78
- },
- "color":"#fef200"
- },
- {
- "range":{
- "minimum":78,
- "exclusiveMaximum":82
- },
- "color":"#fe8c01"
- },
- {
- "range":{
- "minimum":82
- },
- "color":"#e71123"
- }
- ]
} ] } ```
-4. Copy the `statesetId` from the response body.
+13. Select **Send**.
+
+14. After the response returns successfully, copy the `statesetId` from the response body. In the next section, we'll use the `statesetId` to change the `occupied` property state of the unit with feature `id` "UNIT26".
+
    :::image type="content" source="./media/tutorial-creator-indoor-maps/response-stateset-id.png" alt-text="Stateset ID response.":::
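As a sketch, the stateset creation request can be assembled like this. The body mirrors the `occupied` style from step 12, trimmed to one property; the `true` color is an assumption (only the `false` rule is visible in the fragment above), and the key and `datasetId` are placeholders:

```python
# Sketch of the stateset creation request described in this section.
import json

occupied_style = {
    "styles": [
        {
            "keyname": "occupied",
            "type": "boolean",
            # "#FF0000" for true is illustrative; "#00FF00" matches the
            # visible `false` rule above.
            "rules": [{"true": "#FF0000", "false": "#00FF00"}],
        }
    ]
}

def build_stateset_create_request(subscription_key: str, dataset_id: str):
    url = (
        "https://us.atlas.microsoft.com/featurestatesets"
        f"?api-version=2.0&datasetId={dataset_id}"
        f"&subscription-key={subscription_key}"
    )
    headers = {"Content-Type": "application/json"}
    return url, headers, json.dumps(occupied_style)
```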
+
+### Update a feature state
+
+To update the `occupied` state of the unit with feature `id` "UNIT26":
+
+1. Select **New**.
+
+2. In the **Create New** window, select **Request**.
-5. Create a **POST** request to update the state: Pass the statesetId and feature `ID` with your Azure Maps subscription key. The request should look like the following URL:
+3. Enter a **Request name** for the request, such as *PUT Set Stateset*.
+
+4. Select the collection you previously created, and then select **Save**.
+
+5. Select the **PUT** HTTP method.
+
+6. Enter the following URL to the [Feature Statesets API](/rest/api/maps/v2/featurestate/createstatesetpreview). The request should look like the following URL (replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key, and `{statesetId}` with the `statesetId` obtained in [Create a feature stateset](#create-a-feature-stateset)):
```http
- https://atlas.microsoft.com/featureState/state?api-version=1.0&statesetID={statesetId}&featureID={featureId}&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/featurestatesets/{statesetId}/featureStates/UNIT26?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-6. In the **Headers** of the **POST** request, set `Content-Type` to `application/json`. In the **BODY** of the **POST** request, copy and paste the JSON in the sample below.
+7. Select the **Headers** tab.
+
+8. In the **KEY** field, select `Content-Type`.
+
+9. In the **VALUE** field, select `application/json`.
+
    :::image type="content" source="./media/tutorial-creator-indoor-maps/stateset-header.png" alt-text="Header tab information for stateset creation.":::
+
+10. Select the **Body** tab.
+
+11. In the dropdown lists, select **raw** and **JSON**.
+
+12. Copy the following JSON style, and then paste it in the **Body** window:
```json {
A tileset is a set of vector tiles that render on the map. Tilesets are created
{ "keyName": "occupied", "value": true,
- "eventTimestamp": "2019-11-14T17:10:20"
+ "eventTimestamp": "2020-11-14T17:10:20"
} ] } ```
>[!NOTE]
- > The update will only be saved if the time posted stamp is after the time stamp of the previous request. We can pass any keyname that we've previously configured during creation.
+ > The update will be saved only if the posted time stamp is after the time stamp of the previous request.
+
+13. Select **Send**.
-7. Upon a successful update, you'll receive a `200 OK` HTTP status code. If you have [dynamic styling implemented](indoor-map-dynamic-styling.md) for an indoor map, the update will display in your rendered map at the specified time stamp.
+14. After the update completes, you'll receive a `200 OK` HTTP status code. If you implemented [dynamic styling](indoor-map-dynamic-styling.md) for an indoor map, the update displays at the specified time stamp in your rendered map.
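The update request can be sketched in code as well. The key, `statesetId`, and time stamp below are placeholder values; the body shape follows the JSON from step 12:

```python
# Sketch of the feature-state update: a PUT against the stateset with the new
# `occupied` value for one feature.
import json

def build_feature_state_update(subscription_key: str, stateset_id: str,
                               feature_id: str, occupied: bool,
                               event_timestamp: str):
    url = (
        f"https://us.atlas.microsoft.com/featurestatesets/{stateset_id}"
        f"/featureStates/{feature_id}?api-version=2.0"
        f"&subscription-key={subscription_key}"
    )
    body = {
        "states": [
            {
                "keyName": "occupied",
                "value": occupied,
                # Only updates newer than the previous time stamp are saved.
                "eventTimestamp": event_timestamp,
            }
        ]
    }
    return url, json.dumps(body)

url, body = build_feature_state_update("my-key", "my-stateset-id",
                                       "UNIT26", True, "2020-11-14T17:10:20")
```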
-The [Feature Get States API](/rest/api/maps/featurestate/getstatespreview) allows you to retrieve the state of a feature using its feature `ID`. You can also delete the stateset and its resources by using the [Feature State Delete API](/rest/api/maps/featurestate/deletestatesetpreview).
+You can use the [Feature Get States API](/rest/api/maps/v2/featurestate/getstatespreview) to retrieve the state of a feature using its feature `id`. You can also use the [Feature State Delete API](/rest/api/maps/v2/featurestate/deletestatesetpreview) to delete the stateset and its resources.
-To learn more about the different Azure Maps Creator services (Preview) discussed in this article see, [Creator Indoor Maps](creator-indoor-maps.md).
+To learn more about the different Azure Maps Creator services discussed in this article, see [Creator Indoor Maps](creator-indoor-maps.md).
## Clean up resources
-There are no resources that require cleanup.
+There aren't any resources that require cleanup.
## Next steps
-To learn how to use the indoor maps module, see
+To learn how to use the Indoor Maps module, see
> [!div class="nextstepaction"]
-> [Use the Indoor Maps module](how-to-use-indoor-module.md)
+> [Use the Indoor Maps module](how-to-use-indoor-module.md)
azure-maps Tutorial Ev Routing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/tutorial-ev-routing.md
for loc in range(len(searchPolyResponse["results"])):
reachableLocations.append(location) ```
-## Upload the reachable range and charging points to Azure Maps Data service (Preview)
+## Upload the reachable range and charging points to Azure Maps Data service
-On a map, you'll want to visualize the charging stations and the boundary for the maximum reachable range of the electric vehicle. To do so, upload the boundary data and charging stations data as geojson objects to Azure Maps Data service (Preview). Use the [Data Upload API](/rest/api/maps/data/uploadpreview).
+On a map, you'll want to visualize the charging stations and the boundary for the maximum reachable range of the electric vehicle. To do so, upload the boundary data and charging stations data as GeoJSON objects to Azure Maps Data service. Use the [Data Upload API](/rest/api/maps/data%20v2/uploadpreview).
To upload the boundary and charging point data to Azure Maps Data service, run the following two cells:
rangeData = {
] }
-# Upload the range data to Azure Maps Data service (Preview).
-uploadRangeResponse = await session.post("https://atlas.microsoft.com/mapData/upload?subscription-key={}&api-version=1.0&dataFormat=geojson".format(subscriptionKey), json = rangeData)
+# Upload the range data to Azure Maps Data service.
+uploadRangeResponse = await session.post("https://us.atlas.microsoft.com/mapData?subscription-key={}&api-version=2.0&dataFormat=geojson".format(subscriptionKey), json = rangeData)
rangeUdidRequest = uploadRangeResponse.headers["Location"]+"&subscription-key={}".format(subscriptionKey)
poiData = {
] }
-# Upload the electric vehicle charging station data to Azure Maps Data service (Preview).
-uploadPOIsResponse = await session.post("https://atlas.microsoft.com/mapData/upload?subscription-key={}&api-version=1.0&dataFormat=geojson".format(subscriptionKey), json = poiData)
+# Upload the electric vehicle charging station data to Azure Maps Data service.
+uploadPOIsResponse = await session.post("https://us.atlas.microsoft.com/mapData?subscription-key={}&api-version=2.0&dataFormat=geojson".format(subscriptionKey), json = poiData)
poiUdidRequest = uploadPOIsResponse.headers["Location"]+"&subscription-key={}".format(subscriptionKey)
routeData = {
## Visualize the route
-To help visualize the route, you first upload the route data as a geojson object to Azure Maps Data service (Preview). To do so, use the Azure Maps [Data Upload API](/rest/api/maps/data/uploadpreview). Then, call the rendering service, [Get Map Image API](/rest/api/maps/render/getmapimage), to render the route on the map, and visualize it.
+To help visualize the route, you first upload the route data as a GeoJSON object to Azure Maps Data service. To do so, use the Azure Maps [Data Upload API](/rest/api/maps/data%20v2/uploadpreview). Then, call the rendering service, [Get Map Image API](/rest/api/maps/render/getmapimage), to render the route on the map, and visualize it.
To get an image for the rendered route on the map, run the following script: ```python
-# Upload the route data to Azure Maps Data service (Preview).
-routeUploadRequest = await session.post("https://atlas.microsoft.com/mapData/upload?subscription-key={}&api-version=1.0&dataFormat=geojson".format(subscriptionKey), json = routeData)
+# Upload the route data to Azure Maps Data service.
+routeUploadRequest = await session.post("https://us.atlas.microsoft.com/mapData?subscription-key={}&api-version=2.0&dataFormat=geojson".format(subscriptionKey), json = routeData)
udidRequestURI = routeUploadRequest.headers["Location"]+"&subscription-key={}".format(subscriptionKey)
To explore the Azure Maps APIs that are used in this tutorial, see:
* [Get Route Range](/rest/api/maps/route/getrouterange) * [Post Search Inside Geometry](/rest/api/maps/search/postsearchinsidegeometry)
-* [Data Upload](/rest/api/maps/data/uploadpreview)
+* [Data Upload](/rest/api/maps/data%20v2/uploadpreview)
* [Render - Get Map Image](/rest/api/maps/render/getmapimage) * [Post Route Matrix](/rest/api/maps/route/postroutematrix) * [Get Route Directions](/rest/api/maps/route/getroutedirections)
azure-maps Tutorial Geofence https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/tutorial-geofence.md
This tutorial walks you through the basics of creating and using Azure Maps geof
Azure Maps provides a number of services to support the tracking of equipment entering and exiting the construction area. In this tutorial, you: > [!div class="checklist"]
-> * Upload [Geofencing GeoJSON data](geofence-geojson.md) that defines the construction site areas you want to monitor. You'll use the [Data Upload API](/rest/api/maps/data/uploadpreview) to upload geofences as polygon coordinates to your Azure Maps account.
+> * Upload [Geofencing GeoJSON data](geofence-geojson.md) that defines the construction site areas you want to monitor. You'll use the [Data Upload API](/rest/api/maps/data%20v2/uploadpreview) to upload geofences as polygon coordinates to your Azure Maps account.
> * Set up two [logic apps](../event-grid/handler-webhooks.md#logic-apps) that, when triggered, send email notifications to the construction site operations manager when equipment enters and exits the geofence area. > * Use [Azure Event Grid](../event-grid/overview.md) to subscribe to enter and exit events for your Azure Maps geofence. You set up two webhook event subscriptions that call the HTTP endpoints defined in your two logic apps. The logic apps then send the appropriate email notifications of equipment moving beyond or entering the geofence. > * Use [Search Geofence Get API](/rest/api/maps/spatial/getgeofence) to receive notifications when a piece of equipment exits and enters the geofence areas.
This tutorial uses the [Postman](https://www.postman.com/) application, but you
For this tutorial, you upload geofencing GeoJSON data that contains a `FeatureCollection`. The `FeatureCollection` contains two geofences that define polygonal areas within the construction site. The first geofence has no time expiration or restrictions. The second one can only be queried against during business hours (9:00 AM-5:00 PM in the Pacific Time zone), and will no longer be valid after January 1, 2022. For more information on the GeoJSON format, see [Geofencing GeoJSON data](geofence-geojson.md). >[!TIP]
->You can update your geofencing data at any time. For more information, see [Data Upload API](/rest/api/maps/data/uploadpreview).
+>You can update your geofencing data at any time. For more information, see [Data Upload API](/rest/api/maps/data%20v2/uploadpreview).
1. Open the Postman app. Near the top, select **New**. In the **Create New** window, select **Collection**. Name the collection and select **Create**.
For this tutorial, you upload geofencing GeoJSON data that contains a `FeatureCo
3. Select the **POST** HTTP method in the builder tab, and enter the following URL to upload the geofencing data to Azure Maps. For this request, and other requests mentioned in this article, replace `{Azure-Maps-Primary-Subscription-key}` with your primary subscription key. ```HTTP
- https://atlas.microsoft.com/mapData/upload?subscription-key={Azure-Maps-Primary-Subscription-key}&api-version=1.0&dataFormat=geojson
+ https://us.atlas.microsoft.com/mapData?subscription-key={Azure-Maps-Primary-Subscription-key}&api-version=2.0&dataFormat=geojson
``` The `geojson` parameter in the URL path represents the data format of the data being uploaded.
For this tutorial, you upload geofencing GeoJSON data that contains a `FeatureCo
} ```
-5. Select **Send**, and wait for the request to process. When the request completes, go to the **Headers** tab of the response. Copy the value of the **Location** key, which is the `status URL`.
+5. Select **Send**, and wait for the request to process. When the request completes, go to the **Headers** tab of the response. Copy the value of the **Operation-Location** key, which is the `status URL`.
```http
- https://atlas.microsoft.com/mapData/operations/<operationId>?api-version=1.0
+ https://us.atlas.microsoft.com/mapData/operations/<operationId>?api-version=2.0
```

6. To check the status of the API call, create a **GET** HTTP request on the `status URL`. You'll need to append your primary subscription key to the URL for authentication. The **GET** request should look like the following URL:

    ```HTTP
- https://atlas.microsoft.com/mapData/<operationId>/status?api-version=1.0&subscription-key={Subscription-key}
+ https://us.atlas.microsoft.com/mapData/operations/<operationId>?api-version=2.0&subscription-key={Subscription-key}
```
-7. When the **GET** HTTP request completes successfully, it returns a `resourceLocation`. The `resourceLocation` contains the unique `udid` for the uploaded content. Save this `udid` to query the Get Geofence API in the last section of this tutorial. Optionally, you can use the `resourceLocation` URL to retrieve metadata from this resource in the next step.
+7. When the request completes successfully, select the **Headers** tab in the response window. Copy the value of the **Resource-Location** key, which is the `resource location URL`. The `resource location URL` contains the unique identifier (`udid`) of the uploaded data. Save the `udid` to query the Get Geofence API in the last section of this tutorial. Optionally, you can use the `resource location URL` to retrieve metadata from this resource in the next step.
- ```json
- {
- "status": "Succeeded",
- "resourceLocation": "https://atlas.microsoft.com/mapData/metadata/{udid}?api-version=1.0"
- }
- ```
+ :::image type="content" source="./media/tutorial-geofence/resource-location-url.png" alt-text="Copy the resource location URL.":::
-8. To retrieve content metadata, create a **GET** HTTP request on the `resourceLocation` URL that was retrieved in step 7. Make sure to append your primary subscription key to the URL for authentication. The **GET** request should like the following URL:
+8. To retrieve content metadata, create a **GET** HTTP request on the `resource location URL` that was retrieved in step 7. Make sure to append your primary subscription key to the URL for authentication. The **GET** request should look like the following URL:
```http
- https://atlas.microsoft.com/mapData/metadata/{udid}?api-version=1.0&subscription-key={Azure-Maps-Primary-Subscription-key}
+ https://us.atlas.microsoft.com/mapData/metadata/{udid}?api-version=2.0&subscription-key={Azure-Maps-Primary-Subscription-key}
```
-9. When the **GET** HTTP request completes successfully, the response body will contain the `udid` specified in the `resourceLocation` of step 7. It will also contain the location to access and download the content in the future, and other metadata about the content. An example of the overall response is:
+9. When the request completes successfully, the response body contains the content metadata, which should look like the following JSON fragment:
```json { "udid": "{udid}",
- "location": "https://atlas.microsoft.com/mapData/{udid}?api-version=1.0",
- "created": "7/15/2020 6:11:43 PM +00:00",
- "updated": "7/15/2020 6:11:45 PM +00:00",
- "sizeInBytes": 1962,
+ "location": "https://us.atlas.microsoft.com/mapData/6ebf1ae1-2a66-760b-e28c-b9381fcff335?api-version=2.0",
+ "created": "5/18/2021 8:10:32 PM +00:00",
+ "updated": "5/18/2021 8:10:37 PM +00:00",
+ "sizeInBytes": 946901,
"uploadStatus": "Completed" } ```
azure-maps Tutorial Iot Hub Maps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-maps/tutorial-iot-hub-maps.md
In this tutorial you will:
> [!div class="checklist"] > * Create an Azure storage account to log car tracking data.
-> * Upload a geofence to the Azure Maps Data service (Preview) by using the Data Upload API.
+> * Upload a geofence to the Azure Maps Data service by using the Data Upload API.
> * Create a hub in Azure IoT Hub, and register a device.
> * Create a function in Azure Functions, implementing business logic based on Azure Maps spatial analytics.
> * Subscribe to IoT device telemetry events from the Azure function via Azure Event Grid.
Follow these steps to upload the geofence by using the Azure Maps Data Upload AP
3. Select the **POST** HTTP method in the builder tab, and enter the following URL to upload the geofence to the Data Upload API. Make sure to replace `{subscription-key}` with your primary subscription key.
   ```HTTP
- https://atlas.microsoft.com/mapData/upload?subscription-key={subscription-key}&api-version=1.0&dataFormat=geojson
+ https://us.atlas.microsoft.com/mapData?subscription-key={subscription-key}&api-version=2.0&dataFormat=geojson
   ```
   In the URL path, the `geojson` value against the `dataFormat` parameter represents the format of the data being uploaded.
4. Select **Body** > **raw** for the input format, and choose **JSON** from the drop-down list. [Open the JSON data file](https://raw.githubusercontent.com/Azure-Samples/iothub-to-azure-maps-geofencing/master/src/Data/geofence.json?token=AKD25BYJYKDJBJ55PT62N4C5LRNN4), and copy the JSON into the body section. Select **Send**.
-5. Select **Send** and wait for the request to process. After the request completes, go to the **Headers** tab of the response. Copy the value of the **Location** key, which is the `status URL`.
+5. Select **Send** and wait for the request to process. After the request completes, go to the **Headers** tab of the response. Copy the value of the **Operation-Location** key, which is the `status URL`.
```http
- https://atlas.microsoft.com/mapData/operations/<operationId>?api-version=1.0
+ https://us.atlas.microsoft.com/mapData/operations/<operationId>?api-version=2.0
   ```
6. To check the status of the API call, create a **GET** HTTP request on the `status URL`. You'll need to append your primary subscription key to the URL for authentication. The **GET** request should look like the following URL:
   ```HTTP
- https://atlas.microsoft.com/mapData/<operationId>/status?api-version=1.0&subscription-key={subscription-key}
+ https://us.atlas.microsoft.com/mapData/<operationId>/status?api-version=2.0&subscription-key={subscription-key}
```
-
-7. When the **GET** HTTP request completes successfully, it returns a `resourceLocation`. The `resourceLocation` contains the unique `udid` for the uploaded content. Copy this `udid` for later use in this tutorial.
-
- ```json
- {
- "status": "Succeeded",
- "resourceLocation": "https://atlas.microsoft.com/mapData/metadata/{udid}?api-version=1.0"
- }
- ```
+
+7. When the request completes successfully, select the **Headers** tab in the response window. Copy the value of the **Resource-Location** key, which is the `resource location URL`. The `resource location URL` contains the unique identifier (`udid`) of the uploaded data. Copy the `udid` for later use in this tutorial.
+
+ :::image type="content" source="./media/tutorial-iot-hub-maps/resource-location-url.png" alt-text="Copy the resource location URL.":::
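The `udid` is the last path segment of the `resource location URL`, so it can be pulled out with ordinary URL parsing. An illustrative Python sketch (not an Azure SDK call; the `udid` value is the example one from this tutorial):

```python
from urllib.parse import urlparse

def extract_udid(resource_location: str) -> str:
    """Return the unique data identifier embedded in a Resource-Location URL."""
    path = urlparse(resource_location).path   # e.g. /mapData/metadata/{udid}
    return path.rstrip("/").split("/")[-1]

udid = extract_udid(
    "https://us.atlas.microsoft.com/mapData/metadata/6ebf1ae1-2a66-760b-e28c-b9381fcff335?api-version=2.0"
)
```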
## Create an IoT hub
azure-monitor Alerts Webhooks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/alerts/alerts-webhooks.md
The POST operation contains the following JSON payload and schema for all metric
* Learn how to [execute Azure Automation scripts (runbooks) on Azure alerts](https://go.microsoft.com/fwlink/?LinkId=627081).
* Learn how to [use a logic app to send an SMS message via Twilio from an Azure alert](https://github.com/Azure/azure-quickstart-templates/tree/master/demos/alert-to-text-message-with-logic-app).
* Learn how to [use a logic app to send a Slack message from an Azure alert](https://github.com/Azure/azure-quickstart-templates/tree/master/demos/alert-to-slack-with-logic-app).
-* Learn how to [use a logic app to send a message to an Azure queue from an Azure alert](https://github.com/Azure/azure-quickstart-templates/tree/master/demos/alert-to-queue-with-logic-app).
+* Learn how to [use a logic app to send a message to an Azure Queue from an Azure alert](https://github.com/Azure/azure-quickstart-templates/tree/master/demos/alert-to-queue-with-logic-app).
azure-monitor Logs Data Export https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/logs-data-export.md
If the data export rule includes a table that doesn't exist, it will fail with t
## Supported tables

Supported tables currently are limited to those specified below. All data from the table will be exported unless limitations are specified. This list will be updated as support for additional tables is added.
-| Table | Limitations |
+| Table | Limitations |
|:|:|
+| AACAudit | |
+| AACHttpRequest | |
| AADDomainServicesAccountLogon | |
| AADDomainServicesAccountManagement | |
| AADDomainServicesDirectoryServiceAccess | |
Supported tables currently are limited to those specified below. All data from t
| AADServicePrincipalSignInLogs | |
| ABSBotRequests | |
| ACSBillingUsage | |
+| ACSChatIncomingOperations | |
| ACSSMSIncomingOperations | |
| ADAssessmentRecommendation | |
| ADFActivityRun | |
| ADFPipelineRun | |
+| ADFSSignInLogs | |
| ADFTriggerRun | |
+| ADPAudit | |
+| ADPRequests | |
| ADReplicationResult | |
| ADSecurityAssessmentRecommendation | |
| ADTDigitalTwinsOperation | |
Supported tables currently are limited to those specified below. All data from t
| AegDeliveryFailureLogs | |
| AegPublishFailureLogs | |
| Alert | |
-| AmlOnlineEndpointConsoleLog | |
| ApiManagementGatewayLogs | |
| AppCenterError | |
| AppPlatformSystemLogs | |
Supported tables currently are limited to those specified below. All data from t
| AutoscaleEvaluationsLog | |
| AutoscaleScaleActionsLog | |
| AWSCloudTrail | |
+| AzureActivityV2 | |
| AzureAssessmentRecommendation | |
| AzureDevOpsAuditing | |
| BehaviorAnalytics | |
| BlockchainApplicationLog | |
| BlockchainProxyLog | |
+| CDBControlPlaneRequests | |
+| CDBDataPlaneRequests | |
+| CDBMongoRequests | |
+| CDBPartitionKeyRUConsumption | |
+| CDBPartitionKeyStatistics | |
+| CDBQueryRuntimeStatistics | |
| CommonSecurityLog | |
| ComputerGroup | |
| ConfigurationData | Partial support – some of the data is ingested through internal services that isn't supported for export. This portion is missing in export currently. |
| ContainerImageInventory | |
| ContainerInventory | |
| ContainerLog | |
+| ContainerLogV2 | |
| ContainerNodeInventory | |
| ContainerServiceLog | |
| CoreAzureBackup | |
Supported tables currently are limited to those specified below. All data from t
| DatabricksSQLPermissions | |
| DatabricksSSH | |
| DatabricksWorkspace | |
+| DeviceFileEvents | |
+| DeviceNetworkEvents | |
+| DeviceNetworkInfo | |
+| DeviceProcessEvents | |
+| DeviceRegistryEvents | |
| DnsEvents | |
| DnsInventory | |
+| DummyHydrationFact | |
| Dynamics365Activity | |
-| Event | Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics Extension agent is collected though storage while this path isn't supported in export. |
+| EmailAttachmentInfo | |
+| EmailEvents | |
+| EmailUrlInfo | |
+| Event | Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics Extension agent is collected through storage while this path isn't supported in export.2 |
| ExchangeAssessmentRecommendation | |
| FailedIngestion | |
| FunctionAppLogs | |
+| HDInsightAmbariClusterAlerts | |
+| HDInsightAmbariSystemMetrics | |
+| HDInsightHadoopAndYarnLogs | |
+| HDInsightHadoopAndYarnMetrics | |
+| HDInsightHiveAndLLAPLogs | |
+| HDInsightHiveAndLLAPMetrics | |
+| HDInsightHiveTezAppStats | |
+| HDInsightOozieLogs | |
+| HDInsightSecurityLogs | |
+| HDInsightSparkApplicationEvents | |
+| HDInsightSparkBlockManagerEvents | |
+| HDInsightSparkEnvironmentEvents | |
+| HDInsightSparkExecutorEvents | |
+| HDInsightSparkJobEvents | |
+| HDInsightSparkLogs | |
+| HDInsightSparkSQLExecutionEvents | |
+| HDInsightSparkStageEvents | |
+| HDInsightSparkStageTaskAccumulables | |
+| HDInsightSparkTaskEvents | |
| Heartbeat | |
| HuntingBookmark | |
| InsightsMetrics | Partial support – some of the data is ingested through internal services that isn't supported for export. This portion is missing in export currently. |
Supported tables currently are limited to those specified below. All data from t
| MicrosoftAzureBastionAuditLogs | |
| MicrosoftDataShareReceivedSnapshotLog | |
| MicrosoftDataShareSentSnapshotLog | |
+| MicrosoftDataShareShareLog | |
| MicrosoftHealthcareApisAuditLogs | |
| NWConnectionMonitorPathResult | |
| NWConnectionMonitorTestResult | |
| OfficeActivity | Partial support – some of the data is ingested via webhooks from O365 into LA. This portion is missing in export currently. |
| Operation | Partial support – some of the data is ingested through internal services that isn't supported for export. This portion is missing in export currently. |
-| Perf | Partial support – only Windows perf data currently is supported. The Linux perf data is missing in export currently. |
-| PowerBIDatasetsTenant | |
+| Perf | Partial support – only Windows perf data is currently supported. The Linux perf data is missing in export currently. |
| PowerBIDatasetsWorkspace | |
-| PowerBIDatasetsWorkspacePreview | |
+| PurviewScanStatusLogs | |
| SCCMAssessmentRecommendation | |
| SCOMAssessmentRecommendation | |
| SecurityAlert | |
| SecurityBaseline | |
| SecurityBaselineSummary | |
+| SecurityCef | |
| SecurityDetection | |
-| SecurityEvent | Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics Extension agent is collected though storage while this path isn't supported in export. |
+| SecurityEvent | Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics Extension agent is collected through storage while this path isn't supported in export.2 |
| SecurityIncident | |
| SecurityIoTRawEvent | |
| SecurityNestedRecommendation | |
| SecurityRecommendation | |
+| SentinelHealth | |
| SfBAssessmentRecommendation | |
| SfBOnlineAssessmentRecommendation | |
| SharePointOnlineAssessmentRecommendation | |
Supported tables currently are limited to those specified below. All data from t
| SynapseSqlPoolRequestSteps | |
| SynapseSqlPoolSqlRequests | |
| SynapseSqlPoolWaits | |
-| Syslog | Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics Extension agent is collected though storage while this path isn't supported in export. |
+| Syslog | Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics Extension agent is collected through storage while this path isn't supported in export.2 |
| ThreatIntelligenceIndicator | |
| Update | Partial support – some of the data is ingested through internal services that isn't supported for export. This portion is missing in export currently. |
| UpdateRunProgress | |
| UpdateSummary | |
| Usage | |
+| UserAccessAnalytics | |
+| UserPeerAnalytics | |
| Watchlist | |
| WindowsEvent | |
| WindowsFirewall | |
| WireData | Partial support – some of the data is ingested through internal services that isn't supported for export. This portion is missing in export currently. |
+| WorkloadDiagnosticLogs | |
+| WVDAgentHealthStatus | |
| WVDCheckpoints | |
| WVDConnections | |
| WVDErrors | |
| WVDFeeds | |
| WVDManagement | |

## Next steps
- [Query the exported data from Azure Data Explorer](../logs/azure-data-explorer-query-storage.md).
azure-monitor Manage Cost Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/manage-cost-storage.md
na Previously updated : 05/07/2021 Last updated : 05/27/2021
The default pricing for Log Analytics is a **Pay-As-You-Go** model based on data
- Number of VMs monitored - Type of data collected from each monitored VM
-In addition to the Pay-As-You-Go model, Log Analytics has **Capacity Reservation** tiers which enable you to save as much as 25% compared to the Pay-As-You-Go price. The capacity reservation pricing enables you to buy a reservation starting at 100 GB/day. Any usage above the reservation level (overage) will be billed at that same price per GB as provided by the current capacity reservation level. The Capacity Reservation tiers have a 31-day commitment period. During the commitment period, you can change to a higher level Capacity Reservation tier (which will restart the 31-day commitment period), but you cannot move back to Pay-As-You-Go or to a lower Capacity Reservation tier until after the commitment period is finished. Billing for the Capacity Reservation tiers is done on a daily basis. [Learn more](https://azure.microsoft.com/pricing/details/monitor/) about Log Analytics Pay-As-You-Go and Capacity Reservation pricing.
+In addition to the Pay-As-You-Go model, Log Analytics has **Commitment Tiers** which enable you to save as much as 25% compared to the Pay-As-You-Go price. The commitment tier pricing enables you to make a commitment to buy data ingestion starting at 100 GB/day at a lower price than Pay-As-You-Go pricing. Any usage above the commitment level (overage) will be billed at that same price per GB as provided by the current commitment tier. The commitment tiers have a 31-day commitment period. During the commitment period, you can change to a higher commitment tier (which will restart the 31-day commitment period), but you cannot move back to Pay-As-You-Go or to a lower commitment tier until after the commitment period is finished. Billing for the commitment tiers is done on a daily basis. [Learn more](https://azure.microsoft.com/pricing/details/monitor/) about Log Analytics Pay-As-You-Go and Commitment Tier pricing.
> [!NOTE]
-> Until early May 2021, capacity reservation overage was billed at the Pay-As-You-Go price. The change to bill overage at the same price-per-GB as the current capacity reservation level reduces the need for users with large reservation levels to fine-tune their reservation level.
+> Starting June 2, 2021, **Capacity Reservations** are now called **Commitment Tiers**. Data collected above your commitment tier level (overage) is now billed at the same price-per-GB as the current commitment tier level, lowering costs compared to the old method of billing at the Pay-As-You-Go rate, and reducing the need for users with large data volumes to fine-tune their commitment level. Additionally, three new larger commitment tiers have been added at 1000, 2000 and 5000 GB/day.
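Numerically, the overage rule above can be sketched as follows. A hypothetical Python illustration (the per-GB price here is made up; real rates are on the Azure Monitor pricing page):

```python
def daily_log_analytics_cost(ingested_gb: float,
                             commitment_gb: float,
                             price_per_gb: float) -> float:
    """Daily cost under a commitment tier: the full commitment is billed even
    if ingestion falls below it, and overage above the commitment is billed
    at the same per-GB rate as the tier (the post-June-2021 model)."""
    overage_gb = max(0.0, ingested_gb - commitment_gb)
    return (commitment_gb + overage_gb) * price_per_gb

# Hypothetical 100 GB/day tier at a made-up $2.00/GB effective rate, 120 GB ingested
cost = daily_log_analytics_cost(120.0, 100.0, 2.00)  # (100 + 20) * 2.00 = 240.0
```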
In all pricing tiers, an event's data size is calculated from a string representation of the properties which are stored in Log Analytics for this event, whether the data is sent from an agent or added during the ingestion process. This includes any [custom fields](custom-fields.md) that are added as data is collected and then stored in Log Analytics. Several properties common to all data types, including some [Log Analytics Standard Properties](./log-standard-columns.md), are excluded in the calculation of the event size. This includes `_ResourceId`, `_SubscriptionId`, `_ItemId`, `_IsBillable`, `_BilledSize` and `Type`. All other properties stored in Log Analytics are included in the calculation of the event size. Some data types are free from data ingestion charges altogether, for example the AzureActivity, Heartbeat and Usage types. To determine whether an event was excluded from billing for data ingestion, you can use the `_IsBillable` property as shown [below](#data-volume-for-specific-events). Usage is reported in GB (1.0E9 bytes).
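As a concrete illustration of which properties count toward an event's size (the exact string serialization Azure Monitor uses is internal, so this Python sketch only models the inclusion/exclusion rule; the event fields are hypothetical):

```python
# Standard properties excluded from the event-size calculation, per the text above
EXCLUDED_PROPERTIES = {"_ResourceId", "_SubscriptionId", "_ItemId",
                       "_IsBillable", "_BilledSize", "Type"}

def billable_size_estimate(event: dict) -> int:
    """Rough size estimate: string length of each included property's name and
    value. Illustrative only; the real calculation uses an internal serialization."""
    return sum(len(k) + len(str(v))
               for k, v in event.items() if k not in EXCLUDED_PROPERTIES)

event = {
    "Type": "Perf",                     # excluded
    "_ResourceId": "/subscriptions/x",  # excluded
    "Computer": "vm01",                 # included
    "CounterValue": "42",               # included
}
size = billable_size_estimate(event)  # counts only Computer and CounterValue
```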
Also, note that some solutions, such as [Azure Defender (Security Center)](https
### Log Analytics Dedicated Clusters
-Log Analytics Dedicated Clusters are collections of workspaces into a single managed Azure Data Explorer cluster to support advanced scenarios such as [Customer-Managed Keys](customer-managed-keys.md). Log Analytics Dedicated Clusters use a Capacity Reservation pricing model which must be configured to at least 1000 GB/day. This capacity level has a 25% discount compared to Pay-As-You-Go pricing. Any usage above the reservation level will be billed at the Pay-As-You-Go rate. The cluster Capacity Reservation has a 31-day commitment period after the reservation level is increased. During the commitment period the capacity reservation level cannot be reduced, but it can be increased at any time. When workspaces are associated to a cluster, the data ingestion billing for those workspaces are done at the cluster level using the configured capacity reservation level. Learn more about [creating a Log Analytics Clusters](customer-managed-keys.md#create-cluster) and [associating workspaces to it](customer-managed-keys.md#link-workspace-to-cluster). Capacity Reservation pricing information is available at the [Azure Monitor pricing page]( https://azure.microsoft.com/pricing/details/monitor/).
+Log Analytics Dedicated Clusters are collections of workspaces into a single managed Azure Data Explorer cluster to support advanced scenarios such as [Customer-Managed Keys](customer-managed-keys.md). Log Analytics Dedicated Clusters use a commitment tier pricing model which must be configured to at least 1000 GB/day. The cluster commitment tier has a 31-day commitment period after the commitment level is increased. During the commitment period the commitment tier level cannot be reduced, but it can be increased at any time. When workspaces are associated to a cluster, the data ingestion billing for those workspaces is done at the cluster level using the configured commitment tier level. Learn more about [creating a Log Analytics Cluster](customer-managed-keys.md#create-cluster) and [associating workspaces to it](customer-managed-keys.md#link-workspace-to-cluster). Commitment tier pricing information is available at the [Azure Monitor pricing page]( https://azure.microsoft.com/pricing/details/monitor/).
-The cluster capacity reservation level is configured via programmatically with Azure Resource Manager using the `Capacity` parameter under `Sku`. The `Capacity` is specified in units of GB and can have values of 1000 GB/day or more in increments of 100 GB/day. This is detailed at [Azure Monitor customer-managed key](customer-managed-keys.md#create-cluster). If your cluster needs a reservation above 2000 GB/day contact us at [LAIngestionRate@microsoft.com](mailto:LAIngestionRate@microsoft.com).
+The cluster commitment tier level is configured programmatically with Azure Resource Manager using the `Capacity` parameter under `Sku`. The `Capacity` is specified in units of GB and can have values of 1000 GB/day or more in increments of 100 GB/day. This is detailed at [Azure Monitor customer-managed key](customer-managed-keys.md#create-cluster).
There are two modes of billing for usage on a cluster. These can be specified by the `billingType` parameter when [configuring your cluster](customer-managed-keys.md#customer-managed-key-operations). The two modes are:
1. **Cluster**: in this case (which is the default), billing for ingested data is done at the cluster level. The ingested data quantities from each workspace associated to a cluster are aggregated to calculate the daily bill for the cluster. Note that per-node allocations from [Azure Defender (Security Center)](../../security-center/index.yml) are applied at the workspace level prior to this aggregation of data across all workspaces in the cluster.
-2. **Workspaces**: the Capacity Reservation costs for your Cluster are attributed proportionately to the workspaces in the Cluster (after accounting for per-node allocations from [Azure Defender (Security Center)](../../security-center/index.yml) for each workspace.) If the total data volume ingested into a workspace for a day is less than the Capacity Reservation, then each workspace is billed for its ingested data at the effective per-GB Capacity Reservation rate by billing them a fraction of the Capacity Reservation, and the unused part of the Capacity Reservation is billed to the cluster resource. If the total data volume ingested into a workspace for a day is more than the Capacity Reservation, then each workspace is billed for a fraction of the Capacity Reservation based on itΓÇÖs fraction of the ingested data that day, and each workspace for a fraction of the ingested data above the Capacity Reservation. There is nothing billed to the cluster resource if the total data volume ingested into a workspace for a day is over the Capacity Reservation.
+2. **Workspaces**: the commitment tier costs for your Cluster are attributed proportionately to the workspaces in the Cluster (after accounting for per-node allocations from [Azure Defender (Security Center)](../../security-center/index.yml) for each workspace). If the total data volume ingested into a workspace for a day is less than the commitment tier, then each workspace is billed for its ingested data at the effective per-GB commitment tier rate by billing them a fraction of the commitment tier, and the unused part of the commitment tier is billed to the cluster resource. If the total data volume ingested into a workspace for a day is more than the commitment tier, then each workspace is billed for a fraction of the commitment tier based on its fraction of the ingested data that day, and each workspace is billed for a fraction of the ingested data above the commitment tier. There is nothing billed to the cluster resource if the total data volume ingested into a workspace for a day is over the commitment tier.
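The **Workspaces** attribution rule can be sketched numerically. A simplified, hypothetical Python illustration (not an Azure API; it ignores per-node Azure Defender allocations and assumes overage bills at the tier's per-GB rate, under which the proportional attribution collapses to each workspace paying for its own ingestion):

```python
def attribute_cluster_costs(ingested_gb: dict, commitment_gb: float,
                            price_per_gb: float):
    """Split one day's cluster bill between workspaces and the cluster resource.

    With overage billed at the same per-GB rate as the commitment tier, each
    workspace effectively pays for its own GB at the tier rate, and any unused
    commitment is billed to the cluster resource."""
    total = sum(ingested_gb.values())
    per_workspace = {ws: gb * price_per_gb for ws, gb in ingested_gb.items()}
    unused_commitment = max(0.0, commitment_gb - total) * price_per_gb
    return per_workspace, unused_commitment

# 60 GB total against a 100 GB/day commitment at a made-up $2/GB rate:
# each workspace pays for its share; 40 GB of unused commitment bills to the cluster.
bills, cluster_bill = attribute_cluster_costs({"ws-a": 30.0, "ws-b": 30.0}, 100.0, 2.0)
```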
In cluster billing options, data retention is billed at per-workspace. Note that cluster billing starts when the cluster is created, regardless of whether workspaces have been associated to the cluster. Also, note that workspaces associated to a cluster no longer have a pricing tier.
Log Analytics charges are added to your Azure bill. You can see details of your
Azure provides a great deal of useful functionality in the [Azure Cost Management + Billing](../../cost-management-billing/costs/quick-acm-cost-analysis.md?toc=%2fazure%2fbilling%2fTOC.json) hub. For instance, the "Cost analysis" functionality enables you to view your spends for Azure resources. First, adding a filter by "Resource type" (to microsoft.operationalinsights/workspace for Log Analytics and microsoft.operationalinsights/cluster for Log Analytics Clusters) will allow you to track your Log Analytics spend. Then for "Group by" select "Meter category" or "Meter". Note that other services such as Azure Defender (Security Center) and Azure Sentinel also bill their usage against Log Analytics workspace resources. To see the mapping to Service name, you can select the Table view instead of a chart. More understanding of your usage can be gained by [downloading your usage from the Azure portal](../../cost-management-billing/manage/download-azure-invoice-daily-usage-date.md#download-usage-in-azure-portal).
-In the downloaded spreadsheet you can see usage per Azure resource (e.g. Log Analytics workspace) per day. In this Excel spreadsheet, usage from your Log Analytics workspaces can be found by first filtering on the "Meter Category" column to show "Log Analytics", "Insight and Analytics" (used by some of the legacy pricing tiers) and "Azure Monitor" (used by Capacity Reservation pricing tiers), and then adding a filter on the "Instance ID" column which is "contains workspace" or "contains cluster" (the latter to include Log Analytics Cluster usage). The usage is shown in the "Consumed Quantity" column and the unit for each entry is shown in the "Unit of Measure" column. More details are available to help you [understand your Microsoft Azure bill](../../cost-management-billing/understand/review-individual-bill.md).
+In the downloaded spreadsheet you can see usage per Azure resource (e.g. Log Analytics workspace) per day. In this Excel spreadsheet, usage from your Log Analytics workspaces can be found by first filtering on the "Meter Category" column to show "Log Analytics", "Insight and Analytics" (used by some of the legacy pricing tiers) and "Azure Monitor" (used by commitment tiers), and then adding a filter on the "Instance ID" column which is "contains workspace" or "contains cluster" (the latter to include Log Analytics Cluster usage). The usage is shown in the "Consumed Quantity" column and the unit for each entry is shown in the "Unit of Measure" column. More details are available to help you [understand your Microsoft Azure bill](../../cost-management-billing/understand/review-individual-bill.md).
## Changing pricing tier
To change the Log Analytics pricing tier of your workspace,
1. In the Azure portal, open **Usage and estimated costs** from your workspace where you'll see a list of each of the pricing tiers available to this workspace.
-2. Review the estimated costs for each of the pricing tiers. This estimate is based on the last 31 days of usage, so this cost estimate relies on the last 31 days being representative of your typical usage. In the example below you can see how, based on the data patterns from the last 31 days, this workspace would cost less in the Pay-As-You-Go tier (#1) compared to the 100 GB/day Capacity Reservation tier (#2).
+2. Review the estimated costs for each of the pricing tiers. This estimate is based on the last 31 days of usage, so this cost estimate relies on the last 31 days being representative of your typical usage. In the example below you can see how, based on the data patterns from the last 31 days, this workspace would cost less in the Pay-As-You-Go tier (#1) compared to the 100 GB/day commitment tier (#2).
   :::image type="content" source="media/manage-cost-storage/pricing-tier-estimated-costs.png" alt-text="Pricing tiers":::
3. After reviewing the estimated costs based on the last 31 days of usage, if you decide to change the pricing tier, click **Select**.
-You can also [set the pricing tier via Azure Resource Manager](./resource-manager-workspace.md) using the `sku` parameter (`pricingTier` in the Azure Resource Manager template).
+### Changing pricing tier via ARM
+
+You can also [set the pricing tier via Azure Resource Manager](./resource-manager-workspace.md) using the `sku` object to set the pricing tier, and the `capacityReservationLevel` parameter if the pricing tier is `capacityreservation`. (Learn more about [setting workspace properties via ARM](/azure/templates/microsoft.operationalinsights/2020-08-01/workspaces?tabs=json#workspacesku-object).) Here is a sample ARM template to set your workspace to a 300 GB/day commitment tier (which in ARM is called `capacityreservation`).
+
+```json
+{
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "resources": [
+ {
+ "name": "YourWorkspaceName",
+ "type": "Microsoft.OperationalInsights/workspaces",
+ "apiVersion": "2020-08-01",
+ "location": "yourWorkspaceRegion",
+ "properties": {
+ "sku": {
+ "name": "capacityreservation",
+ "capacityReservationLevel": 300
+ }
+ }
+ }
+ ]
+}
+```
+
+To use this template via PowerShell, after [installing the Azure Az PowerShell module](/powershell/azure/install-az-ps), log into Azure using `Connect-AzAccount`, select the subscription containing your workspace using `Select-AzSubscription -SubscriptionId YourSubscriptionId`, and apply the template (saved in a file named template.json):
+
+```powershell
+New-AzResourceGroupDeployment -ResourceGroupName "YourResourceGroupName" -TemplateFile "template.json"
+```
+
+To set the pricing tier to other values such as Pay-As-You-Go (called `pergb2018` for the sku), omit the `capacityReservationLevel` property. Learn more about [creating ARM templates](/azure/azure-resource-manager/templates/template-tutorial-create-first-template?tabs=azure-powershell), [adding a resource to your template](/azure/azure-resource-manager/templates/template-tutorial-create-first-template?tabs=azure-powershell), and [applying templates](https://docs.microsoft.com/azure/azure-monitor/resource-manager-samples).
## Legacy pricing tiers
-Subscriptions who had a Log Analytics workspace or Application Insights resource in it before April 2, 2018, or are linked to an Enterprise Agreement that started prior to February 1, 2019, will continue to have access to use the legacy pricing tiers: **Free**, **Standalone (Per GB)** and **Per Node (OMS)**. Workspaces in the Free pricing tier will have daily data ingestion limited to 500 MB (except for security data types collected by [Azure Defender (Security Center)](../../security-center/index.yml)) and the data retention is limited to 7 days. The Free pricing tier is intended only for evaluation purposes. Workspaces in the Standalone or Per Node pricing tiers have user-configurable retention from 30 to 730 days.
+Subscriptions that had a Log Analytics workspace or Application Insights resource in them before April 2, 2018, or are linked to an Enterprise Agreement that started prior to February 1, 2019, will continue to have access to use the legacy pricing tiers: **Free Trial**, **Standalone (Per GB)** and **Per Node (OMS)**. Workspaces in the Free Trial pricing tier will have daily data ingestion limited to 500 MB (except for security data types collected by [Azure Defender (Security Center)](../../security-center/index.yml)) and the data retention is limited to 7 days. The Free Trial pricing tier is intended only for evaluation purposes. Workspaces in the Standalone or Per Node pricing tiers have user-configurable retention from 30 to 730 days.
Usage on the Standalone pricing tier is billed by the ingested data volume. It is reported in the **Log Analytics** service and the meter is named "Data Analyzed".
There are also some behaviors between the use of legacy Log Analytics tiers and
1. If the workspace is in the legacy Standard or Premium tier, Azure Defender will be billed only for Log Analytics data ingestion, not per node.
2. If the workspace is in the legacy Per Node tier, Azure Defender will be billed using the current [Azure Defender node-based pricing model](https://azure.microsoft.com/pricing/details/security-center/).
-3. In other pricing tiers (including Capacity Reservations), if Azure Defender was enabled before June 19, 2017, Azure Defender will be billed only for Log Analytics data ingestion. Otherwise Azure Defender will be billed using the current Azure Defender node-based pricing model.
+3. In other pricing tiers (including commitment tiers), if Azure Defender was enabled before June 19, 2017, Azure Defender will be billed only for Log Analytics data ingestion. Otherwise Azure Defender will be billed using the current Azure Defender node-based pricing model.
More details of pricing tier limitations are available at [Azure subscription and service limits, quotas, and constraints](../../azure-resource-manager/management/azure-subscription-service-limits.md#log-analytics-workspaces).
To get you started, here are the recommended settings for the alert querying the
- Target: Select your Log Analytics resource
- Criteria:
  - Signal name: Custom log search
- - Search query: `_LogOperation | where Operation == "Data collection Status" | where Detail contains "OverQuota"`
+ - Search query: `_LogOperation | where Operation == "Data collection Stopped" | where Detail contains "OverQuota"`
  - Based on: Number of results
  - Condition: Greater than
  - Threshold: 0
Once alert is defined and the limit is reached, an alert is triggered and perfor
## Troubleshooting why usage is higher than expected
Higher usage is caused by one, or both of:
-- More nodes than expected sending data to Log Analytics workspace
-- More data than expected being sent to Log Analytics workspace (perhaps due to starting to use a new solution or a configuration change to an existing solution)
+- More nodes than expected sending data to Log Analytics workspace: see [Understanding nodes sending data](#understanding-nodes-sending-data)
+- More data than expected being sent to Log Analytics workspace (perhaps due to starting to use a new solution or a configuration change to an existing solution): see [Understanding ingested data volume](#understanding-ingested-data-volume)
+
+If you observe high data ingestion reported using the `Usage` records (see [below](#data-volume-by-solution)), but you don't observe the same results summing `_BilledSize` directly on the [data type](#data-volume-for-specific-events), it's possible you have significant late arriving data. [Here](#late-arriving-data) is more information on how to diagnose this.
## Understanding nodes sending data
You can also parse the `_ResourceId` more fully if needed as well using
> Some of the fields of the Usage data type, while still in the schema, have been deprecated and their values are no longer populated.
> These are **Computer**, as well as fields related to ingestion (**TotalBatches**, **BatchesWithinSla**, **BatchesOutsideSla**, **BatchesCapped**, and **AverageProcessingTimeMs**).
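
For instance, here is a sketch of splitting `_ResourceId` into its components with the `parse` operator; the `AzureDiagnostics` table is used only as an illustrative source, so substitute the data type you're investigating:

```Kusto
AzureDiagnostics
| where _IsBillable == true
| parse tolower(_ResourceId) with "/subscriptions/" subscriptionId "/resourcegroups/" resourceGroup "/providers/" provider "/" resourceType "/" resourceName
| summarize BillableDataMB = sum(_BilledSize)/1.E6 by subscriptionId, resourceGroup, resourceName
| sort by BillableDataMB desc
```
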
+## Late arriving data
+
+Situations can arise where data is ingested with old timestamps, for instance if an agent cannot communicate with Log Analytics due to a connectivity issue, or when a host has an incorrectly set date/time. This can manifest as an apparent discrepancy between the ingested data reported by the `Usage` data type and a query summing `_BilledSize` over the raw data for a particular day specified by `TimeGenerated`, the timestamp when the event was generated.
+
+To diagnose late arriving data issues, use the `_TimeReceived` column ([learn more](./log-standard-columns.md#_timereceived)) in addition to the `TimeGenerated` column. `_TimeReceived` is the time when the record was received by the Azure Monitor ingestion point in the Azure cloud. For instance, if using the `Usage` records you have observed high ingested data volumes of `W3CIISLog` data on May 2, 2021, here is a query that identifies the timestamps on this ingested data:
-### Querying for common data types
+```Kusto
+W3CIISLog
+| where TimeGenerated > datetime(1970-01-01)
+| where _TimeReceived >= datetime(2021-05-02) and _TimeReceived < datetime(2021-05-03)
+| where _IsBillable == true
+| summarize BillableDataMB = sum(_BilledSize)/1.E6 by bin(TimeGenerated, 1d)
+| sort by TimeGenerated asc
+```
+
+The `where TimeGenerated > datetime(1970-01-01)` clause is present only to hint to the Log Analytics user interface to query over all data.
+
+## Querying for common data types
To dig deeper into the source of data for a particular data type, here are some useful example queries:
To see the number of distinct Automation nodes, use the query:
## Evaluating the legacy Per Node pricing tier
-The decision of whether workspaces with access to the legacy **Per Node** pricing tier are better off in that tier or in a current **Pay-As-You-Go** or **Capacity Reservation** tier is often difficult for customers to assess. This involves understanding the trade-off between the fixed cost per monitored node in the Per Node pricing tier and its included data allocation of 500 MB/node/day and the cost of just paying for ingested data in the Pay-As-You-Go (Per GB) tier.
+The decision of whether workspaces with access to the legacy **Per Node** pricing tier are better off in that tier or in a current **Pay-As-You-Go** or **Commitment Tier** is often difficult for customers to assess. This involves understanding the trade-off between the fixed cost per monitored node in the Per Node pricing tier and its included data allocation of 500 MB/node/day and the cost of just paying for ingested data in the Pay-As-You-Go (Per GB) tier.
To facilitate this assessment, the following query can be used to make a recommendation for the optimal pricing tier based on a workspace's usage patterns. This query looks at the monitored nodes and data ingested into a workspace in the last 7 days, and for each day evaluates which pricing tier would have been optimal. To use the query, you need to specify
let workspaceHasSecurityCenter = false; // Specify if the workspace has Azure S
let PerNodePrice = 15.; // Enter your monthly price per monitored node
let PerNodeOveragePrice = 2.30; // Enter your price per GB for data overage in the Per Node pricing tier
let PerGBPrice = 2.30; // Enter your price per GB in the Pay-as-you-go pricing tier
-let CarRes100Price = 196.; // Enter your price for the 100 GB/day Capacity Reservation
-let CarRes200Price = 368.; // Enter your price for the 200 GB/day Capacity Reservation
-let CarRes300Price = 540.; // Enter your price for the 300 GB/day Capacity Reservation
-let CarRes400Price = 704.; // Enter your price for the 400 GB/day Capacity Reservation
-let CarRes500Price = 865.; // Enter your price for the 500 GB/day Capacity Reservation
+let CarRes100Price = 196.; // Enter your price for the 100 GB/day commitment tier
+let CarRes200Price = 368.; // Enter your price for the 200 GB/day commitment tier
+let CarRes300Price = 540.; // Enter your price for the 300 GB/day commitment tier
+let CarRes400Price = 704.; // Enter your price for the 400 GB/day commitment tier
+let CarRes500Price = 865.; // Enter your price for the 500 GB/day commitment tier
// let SecurityDataTypes=dynamic(["SecurityAlert", "SecurityBaseline", "SecurityBaselineSummary", "SecurityDetection", "SecurityEvent", "WindowsFirewall", "MaliciousIPCommunication", "LinuxAuditLog", "SysmonEvent", "ProtectionStatus", "WindowsEvent", "Update", "UpdateSummary"]);
let StartDate = startofday(datetime_add("Day",-1*daysToEvaluate,now()));
union *
| extend Recommendation = case(
    MinCost == PerNodeDailyCost, "Per node tier",
    MinCost == PerGBDailyCost, "Pay-as-you-go tier",
- MinCost == CapRes100DailyCost, "Capacity Reservation (100 GB/day)",
- MinCost == CapRes200DailyCost, "Capacity Reservation (200 GB/day)",
- MinCost == CapRes300DailyCost, "Capacity Reservation (300 GB/day)",
- MinCost == CapRes400DailyCost, "Capacity Reservation (400 GB/day)",
- MinCost == CapRes500AndAboveDailyCost, strcat("Capacity Reservation (",CapResLevel500AndAbove," GB/day)"),
+ MinCost == CapRes100DailyCost, "Commitment tier (100 GB/day)",
+ MinCost == CapRes200DailyCost, "Commitment tier (200 GB/day)",
+ MinCost == CapRes300DailyCost, "Commitment tier (300 GB/day)",
+ MinCost == CapRes400DailyCost, "Commitment tier (400 GB/day)",
+ MinCost == CapRes500AndAboveDailyCost, strcat("Commitment tier (",CapResLevel500AndAbove," GB/day)"),
"Error" ) | project day, nodesPerDay, ASCnodesPerDay, NonSecurityDataGB, SecurityDataGB, OverageGB, AvgGbPerNode, PerGBDailyCost, PerNodeDailyCost,
When data collection stops, the OperationStatus is **Warning**. When data collec
To be notified when data collection stops, use the steps described in *Create daily data cap* alert. Use the steps described in [create an action group](../alerts/action-groups.md) to configure an e-mail, webhook, or runbook action for the alert rule.
-## Late arriving data
-
-Situations can arise where data is ingested with very old timestamps, for instance if an agent cannot communicate to Log Analytics due to a connectivity issue, or situations when a host has an incorrectly time date/time. To diagnose these issues, use the `_TimeReceived` column ([learn more](./log-standard-columns.md#_timereceived)) in addition to the `TimeGenerated` column. `TimeReceived` is the time when the the record was received by the Azure Monitor ingestion point in the Azure cloud.
-
## Limits summary
There are some additional Log Analytics limits, some of which depend on the Log Analytics pricing tier. These are documented at [Azure subscription and service limits, quotas, and constraints](../../azure-resource-manager/management/azure-subscription-service-limits.md#log-analytics-workspaces).
azure-netapp-files Create Volumes Dual Protocol https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/create-volumes-dual-protocol.md
To create NFS volumes, see [Create an NFS volume](azure-netapp-files-create-volu
* Create a reverse lookup zone on the DNS server and then add a pointer (PTR) record of the AD host machine in that reverse lookup zone. Otherwise, the dual-protocol volume creation will fail.
* The **Allow local NFS users with LDAP** option in Active Directory connections intends to provide occasional and temporary access to local users. When this option is enabled, user authentication and lookup from the LDAP server stop working. As such, you should keep this option *disabled* on Active Directory connections, except for the occasion when a local user needs to access LDAP-enabled volumes. In that case, you should disable this option as soon as local user access is no longer required for the volume. See [Allow local NFS users with LDAP to access a dual-protocol volume](#allow-local-nfs-users-with-ldap-to-access-a-dual-protocol-volume) about managing local user access.
* Ensure that the NFS client is up to date and running the latest updates for the operating system.
-* Dual-protocol volumes support both Acitve Diectory Domain Services (ADDS) and Azure Active Directory Domain Services (AADDS).
+* Dual-protocol volumes support both Active Directory Domain Services (ADDS) and Azure Active Directory Domain Services (AADDS).
* Dual-protocol volumes do not support the use of LDAP over TLS with AADDS. See [LDAP over TLS considerations](configure-ldap-over-tls.md#considerations).
* The NFS version used by a dual-protocol volume is NFSv3. As such, the following considerations apply:
    * Dual protocol does not support the Windows ACLS extended attributes `set/get` from NFS clients.
azure-resource-manager Add Template To Azure Pipelines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/add-template-to-azure-pipelines.md
+
+ Title: CI/CD with Azure Pipelines and Bicep files
+description: Describes how to configure continuous integration in Azure Pipelines by using Bicep files. It shows how to use a PowerShell script, or copy files to a staging location and deploy from there. (Bicep)
+++ Last updated : 06/01/2021+
+# Integrate Bicep with Azure Pipelines
+
+You can integrate Bicep files with Azure Pipelines for continuous integration and continuous deployment (CI/CD). In this article, you learn how to build a Bicep file into a JSON template and then use two advanced ways to deploy templates with Azure Pipelines.
+
+## Select your option
+
+Before proceeding with this article, let's consider the different options for deploying an ARM template from a pipeline.
+
+* **Use Azure CLI task**. Use this task to run `az bicep build` to build your Bicep files before deploying the JSON templates.
+
+* **Use ARM template deployment task**. This option is the easiest option. This approach works when you want to deploy a template directly from a repository. This option isn't covered in this article but instead is covered in the tutorial [Continuous integration of ARM templates with Azure Pipelines](../templates/deployment-tutorial-pipeline.md). It shows how to use the [ARM template deployment task](https://github.com/microsoft/azure-pipelines-tasks/blob/master/Tasks/AzureResourceManagerTemplateDeploymentV3/README.md) to deploy a template from your GitHub repo.
+
+* **Add task that runs an Azure PowerShell script**. This option has the advantage of providing consistency throughout the development life cycle because you can use the same script that you used when running local tests. Your script deploys the template but can also perform other operations such as getting values to use as parameters. This option is shown in this article. See [Azure PowerShell task](#azure-powershell-task).
+
+ Visual Studio provides the [Azure Resource Group project](../templates/create-visual-studio-deployment-project.md) that includes a PowerShell script. The script stages artifacts from your project to a storage account that Resource Manager can access. Artifacts are items in your project such as linked templates, scripts, and application binaries. If you want to continue using the script from the project, use the PowerShell script task shown in this article.
+
+* **Add copy and deploy tasks**. This option offers a convenient alternative to the project script. You configure two tasks in the pipeline. One task stages the artifacts to an accessible location. The other task deploys the template from that location. This option is shown in this article. See [Copy and deploy tasks](#copy-and-deploy-tasks).
+
+## Prepare your project
+
+This article assumes your ARM template and Azure DevOps organization are ready for creating the pipeline. The following steps show how to make sure you're ready:
+
+* You have an Azure DevOps organization. If you don't have one, [create one for free](/azure/devops/pipelines/get-started/pipelines-sign-up). If your team already has an Azure DevOps organization, make sure you're an administrator of the Azure DevOps project that you want to use.
+
+* You've configured a [service connection](/azure/devops/pipelines/library/connect-to-azure) to your Azure subscription. The tasks in the pipeline execute under the identity of the service principal. For steps to create the connection, see [Create a DevOps project](../templates/deployment-tutorial-pipeline.md#create-a-devops-project).
+
+* You have an [ARM template](../templates/quickstart-create-templates-use-visual-studio-code.md) that defines the infrastructure for your project.
+
+## Create pipeline
+
+1. If you haven't added a pipeline previously, you need to create a new pipeline. From your Azure DevOps organization, select **Pipelines** and **New pipeline**.
+
+ ![Add new pipeline](./media/add-template-to-azure-pipelines/new-pipeline.png)
+
+1. Specify where your code is stored. The following image shows selecting **Azure Repos Git**.
+
+ ![Select code source](./media/add-template-to-azure-pipelines/select-source.png)
+
+1. From that source, select the repository that has the code for your project.
+
+ ![Select repository](./media/add-template-to-azure-pipelines/select-repo.png)
+
+1. Select the type of pipeline to create. You can select **Starter pipeline**.
+
+ ![Select pipeline](./media/add-template-to-azure-pipelines/select-pipeline.png)
+
+You're ready to either add an Azure PowerShell task or the copy file and deploy tasks.
+
+## Azure CLI task
+
+This section shows how to build a Bicep file into a JSON template before the JSON template is deployed.
+
+The following YML file builds a Bicep file by using an [Azure CLI task](/azure/devops/pipelines/tasks/deploy/azure-cli):
+
+```yml
+trigger:
+- master
+
+pool:
+ vmImage: 'ubuntu-latest'
+
+steps:
+- task: AzureCLI@2
+ inputs:
+ azureSubscription: 'script-connection'
+ scriptType: bash
+ scriptLocation: inlineScript
+ inlineScript: |
+ az --version
+ az bicep build --file ./azuredeploy.bicep
+```
+
+For `azureSubscription`, provide the name of the service connection you created.
+
+For `scriptType`, use **bash**.
+
+For `scriptLocation`, use **inlineScript** or **scriptPath**. If you specify **scriptPath**, you also need to provide a `scriptPath` parameter.
+
+In `inlineScript`, specify your script lines. The script in the sample builds a Bicep file called *azuredeploy.bicep* that exists in the root of the repo.
+
+## Azure PowerShell task
+
+This section shows how to configure continuous deployment by using a single task that runs the PowerShell script in your project. If you need a PowerShell script that deploys a template, see [Deploy-AzTemplate.ps1](https://github.com/Azure/azure-quickstart-templates/blob/master/Deploy-AzTemplate.ps1) or [Deploy-AzureResourceGroup.ps1](https://github.com/Azure/azure-quickstart-templates/blob/master/Deploy-AzureResourceGroup.ps1).
+
+The following YAML file creates an [Azure PowerShell task](/azure/devops/pipelines/tasks/deploy/azure-powershell):
+
+```yml
+trigger:
+- master
+
+pool:
+ vmImage: 'ubuntu-latest'
+
+steps:
+- task: AzurePowerShell@5
+ inputs:
+ azureSubscription: 'script-connection'
+ ScriptType: 'FilePath'
+ ScriptPath: './Deploy-AzTemplate.ps1'
+ ScriptArguments: -Location 'centralus' -ResourceGroupName 'demogroup' -TemplateFile templates\mainTemplate.json
+ azurePowerShellVersion: 'LatestVersion'
+```
+
+When you set the task to `AzurePowerShell@5`, the pipeline uses the [Az module](/powershell/azure/new-azureps-module-az). If your script uses the AzureRM module instead, set the task to `AzurePowerShell@3`.
+
+```yaml
+steps:
+- task: AzurePowerShell@3
+```
+
+For `azureSubscription`, provide the name of the service connection you created.
+
+```yaml
+inputs:
+ azureSubscription: '<your-connection-name>'
+```
+
+For `scriptPath`, provide the relative path from the pipeline file to your script. You can look in your repository to see the path.
+
+```yaml
+ScriptPath: '<your-relative-path>/<script-file-name>.ps1'
+```
+
+In `ScriptArguments`, provide any parameters needed by your script. The following example shows some parameters for a script, but you'll need to customize the parameters for your script.
+
+```yaml
+ScriptArguments: -Location 'centralus' -ResourceGroupName 'demogroup' -TemplateFile templates\mainTemplate.json
+```
+
+When you select **Save**, the build pipeline is automatically run. Go back to the summary for your build pipeline, and watch the status.
+
+![View results](./media/add-template-to-azure-pipelines/view-results.png)
+
+You can select the currently running pipeline to see details about the tasks. When it finishes, you see the results for each step.
+
+## Copy and deploy tasks
+
+This section shows how to configure continuous deployment by using two tasks. The first task stages the artifacts to a storage account, and the second task deploys the template.
+
+To copy files to a storage account, the service principal for the service connection must be assigned the Storage Blob Data Contributor or Storage Blob Data Owner role. For more information, see [Get started with AzCopy](../../storage/common/storage-use-azcopy-v10.md).
+
+The following YAML shows the [Azure file copy task](/azure/devops/pipelines/tasks/deploy/azure-file-copy).
+
+```yml
+trigger:
+- master
+
+pool:
+ vmImage: 'windows-latest'
+
+steps:
+- task: AzureFileCopy@4
+ inputs:
+ SourcePath: 'templates'
+ azureSubscription: 'copy-connection'
+ Destination: 'AzureBlob'
+ storage: 'demostorage'
+ ContainerName: 'projecttemplates'
+ name: AzureFileCopy
+```
+
+There are several parts of this task to revise for your environment. The `SourcePath` indicates the location of the artifacts relative to the pipeline file.
+
+```yaml
+SourcePath: '<path-to-artifacts>'
+```
+
+For `azureSubscription`, provide the name of the service connection you created.
+
+```yaml
+azureSubscription: '<your-connection-name>'
+```
+
+For storage and container name, provide the names of the storage account and container you want to use for storing the artifacts. The storage account must exist.
+
+```yaml
+storage: '<your-storage-account-name>'
+ContainerName: '<container-name>'
+```
+
+After creating the copy file task, you're ready to add the task to deploy the staged template.
+
+The following YAML shows the [Azure Resource Manager template deployment task](https://github.com/microsoft/azure-pipelines-tasks/blob/master/Tasks/AzureResourceManagerTemplateDeploymentV3/README.md):
+
+```yaml
+- task: AzureResourceManagerTemplateDeployment@3
+ inputs:
+ deploymentScope: 'Resource Group'
+ azureResourceManagerConnection: 'copy-connection'
+ subscriptionId: '00000000-0000-0000-0000-000000000000'
+ action: 'Create Or Update Resource Group'
+ resourceGroupName: 'demogroup'
+ location: 'West US'
+ templateLocation: 'URL of the file'
+ csmFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.json$(AzureFileCopy.StorageContainerSasToken)'
+ csmParametersFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.parameters.json$(AzureFileCopy.StorageContainerSasToken)'
+ deploymentMode: 'Incremental'
+ deploymentName: 'deploy1'
+```
+
+There are several parts of this task to review in greater detail.
+
+* `deploymentScope`: Select the scope of deployment from the options: `Management Group`, `Subscription`, and `Resource Group`.
+
+* `azureResourceManagerConnection`: Provide the name of the service connection you created.
+
+* `subscriptionId`: Provide the target subscription ID. This property only applies to the Resource Group deployment scope and the subscription deployment scope.
+
+* `resourceGroupName` and `location`: provide the name and location of the resource group you want to deploy to. The task creates the resource group if it doesn't exist.
+
+ ```yml
+ resourceGroupName: '<resource-group-name>'
+ location: '<location>'
+ ```
+
+* `csmFileLink`: Provide the link for the staged template. When setting the value, use variables returned from the file copy task. The following example links to a template named mainTemplate.json. The folder named **templates** is included because that's where the file copy task copied the file. In your pipeline, provide the path to your template and the name of your template.
+
+ ```yml
+ csmFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.json$(AzureFileCopy.StorageContainerSasToken)'
+ ```
+
+Your pipeline looks like:
+
+```yml
+trigger:
+- master
+
+pool:
+ vmImage: 'windows-latest'
+
+steps:
+- task: AzureFileCopy@4
+ inputs:
+ SourcePath: 'templates'
+ azureSubscription: 'copy-connection'
+ Destination: 'AzureBlob'
+ storage: 'demostorage'
+ ContainerName: 'projecttemplates'
+ name: AzureFileCopy
+- task: AzureResourceManagerTemplateDeployment@3
+ inputs:
+ deploymentScope: 'Resource Group'
+ azureResourceManagerConnection: 'copy-connection'
+ subscriptionId: '00000000-0000-0000-0000-000000000000'
+ action: 'Create Or Update Resource Group'
+ resourceGroupName: 'demogroup'
+ location: 'West US'
+ templateLocation: 'URL of the file'
+ csmFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.json$(AzureFileCopy.StorageContainerSasToken)'
+ csmParametersFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.parameters.json$(AzureFileCopy.StorageContainerSasToken)'
+ deploymentMode: 'Incremental'
+ deploymentName: 'deploy1'
+```
+
+When you select **Save**, the build pipeline is automatically run. Go back to the summary for your build pipeline, and watch the status.
+
+## Example
+
+The following pipeline shows how to build a Bicep file and how to deploy the compiled template:
++
+## Next steps
+
+* To use the what-if operation in a pipeline, see [Test ARM templates with What-If in a pipeline](https://4bes.nl/2021/03/06/test-arm-templates-with-what-if/).
+* To learn about using Bicep file with GitHub Actions, see [Deploy Bicep files by using GitHub Actions](./deploy-github-actions.md).
azure-resource-manager Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/best-practices.md
+
+ Title: Learn best practices when developing Bicep files
+description: Describes practices to follow when creating your Bicep files so they work well and are easy to maintain.
+++ Last updated : 06/01/2021+
+# Best practices for Bicep
+
+This article recommends practices to follow when developing your Bicep files. These practices make your Bicep file easier to understand and use.
+
+## Parameters
+
+* Use good naming for parameter declarations. Good names make your templates easy to read and understand. Make sure you're using clear, descriptive names, and be consistent in your naming.
+
+* Think carefully about the parameters your template uses. Try to use parameters for settings that change between deployments. Variables and hard-coded values can be used for settings that don't change between deployments.
+
+* Be mindful of the default values you use. Make sure the default values are safe for anyone to deploy. For example, consider using low cost pricing tiers and SKUs so that someone deploying the template to a test environment doesn't incur a large cost unnecessarily.
+
+* Use the `@allowed` decorator sparingly. If you use this decorator too broadly, you might block valid deployments. As Azure services add SKUs and sizes, your allowed list might not be up to date. For example, allowing only Premium v3 SKUs might make sense in production, but it prevents you from using the same template in non-production environments.
+
+* It's a good practice to provide descriptions for your parameters. Try to make the descriptions helpful, and provide any important information about what the template needs the parameter values to be.
+
+ You can also use `//` comments for some information.
+
+* You can put parameter declarations anywhere in the template file, although it's usually a good idea to put them at the top of the file so your Bicep code is easy to read.
+
+* It's a good practice to specify the minimum and maximum character length for parameters that control naming. These limitations help avoid errors later during deployment.
+
+## Naming
+
+* The [uniqueString() function](bicep-functions-string.md#uniquestring) is useful for creating globally unique resource names. When you provide the same parameters, it returns the same string every time. Passing in the resource group ID means the string is the same on every deployment to the same resource group, but different when you deploy to different resource groups or subscriptions.
+
+* Sometimes the uniqueString() function creates strings that start with a number. Some Azure resources, like storage accounts, don't allow their names to start with numbers. This requirement means it's a good idea to use string interpolation to create resource names. You can add a prefix to the unique string.
+
+* It's often a good idea to use template expressions to create resource names. Many Azure resource types have rules about the allowed characters and length of their names. Embedding the creation of resource names in the template means that anyone who uses the template doesn't have to remember to follow these rules themselves.
+
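+A minimal sketch of the prefix pattern described above; the `stg` prefix, symbolic name, SKU, and API version are illustrative choices, not requirements:
+
+```bicep
+// Prefixing the unique string guarantees the storage account name
+// starts with a letter, even when uniqueString() begins with a digit.
+var storageAccountName = 'stg${uniqueString(resourceGroup().id)}'
+
+resource exampleStorage 'Microsoft.Storage/storageAccounts@2021-02-01' = {
+  name: storageAccountName
+  location: resourceGroup().location
+  sku: {
+    name: 'Standard_LRS'
+  }
+  kind: 'StorageV2'
+}
+```
+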
+## Resource definitions
+
+* Instead of embedding complex expressions directly into resource properties, use variables to contain the expressions. This approach makes your Bicep file easier to read and understand. It avoids cluttering your resource definitions with logic.
+
+* Try to use resource properties as outputs, rather than making assumptions about how resources will behave. For example, if you need to output the URL to an App Service app, use the defaultHostname property of the app instead of creating a string for the URL yourself. Sometimes these assumptions aren't correct in different environments, or the resources change the way they work. It's safer to have the resource tell you its own properties.
+
+* It's a good idea to use a recent API version for each resource. New features in Azure services are sometimes available only in newer API versions.
+
+* When possible, avoid using the [reference](./bicep-functions-resource.md#reference) and [resourceId](./bicep-functions-resource.md#resourceid) functions in your Bicep file. You can access any resource in Bicep by using the symbolic name. For example, if you define a storage account with the symbolic name toyDesignDocumentsStorageAccount, you can access its resource ID by using the expression `toyDesignDocumentsStorageAccount.id`. By using the symbolic name, you create an implicit dependency between resources.
+
+* If the resource isn't deployed in the Bicep file, you can still get a symbolic reference to the resource using the **existing** keyword.
+
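+A minimal sketch of both points, assuming a storage account named `toydesigndocs` that was deployed outside this Bicep file:
+
+```bicep
+// 'existing' gives a symbolic reference without redeploying the resource.
+resource toyDesignDocumentsStorageAccount 'Microsoft.Storage/storageAccounts@2021-02-01' existing = {
+  name: 'toydesigndocs'
+}
+
+// The symbolic name exposes the resource ID directly, so no resourceId() call is needed.
+output documentStorageAccountId string = toyDesignDocumentsStorageAccount.id
+```
+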
+## Child resources
+
+* Avoid nesting too many layers deep. Too much nesting makes your Bicep code harder to read and work with.
+
+* It's best to avoid constructing resource names for child resources. You lose the benefits that Bicep provides when it understands the relationships between your resources. Use the `parent` property or nesting instead.
+
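+A minimal sketch of the `parent` property approach; the storage account name and API version are illustrative:
+
+```bicep
+resource exampleStorage 'Microsoft.Storage/storageAccounts@2021-02-01' existing = {
+  name: 'examplestorage'
+}
+
+// Declaring the child with 'parent' lets Bicep construct the full
+// name 'examplestorage/default' and track the dependency for you.
+resource fileService 'Microsoft.Storage/storageAccounts/fileServices@2021-02-01' = {
+  parent: exampleStorage
+  name: 'default'
+}
+```
+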
+## Outputs
+
+* Make sure you don't create outputs for sensitive data. Output values can be accessed by anyone who has access to the deployment history. They're not appropriate for handling secrets.
+
+* Instead of passing property values around through outputs, use the `existing` keyword to look up properties of resources that already exist. It's a best practice to look up keys from other resources in this way instead of passing them around through outputs. You'll always get the most up-to-date data.
+
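+A sketch of the lookup pattern, assuming an App Service app named `toy-website` already exists; the name and API version are illustrative:
+
+```bicep
+resource appService 'Microsoft.Web/sites@2021-01-15' existing = {
+  name: 'toy-website'
+}
+
+// Read the property where it's needed instead of passing it via outputs.
+output siteUrl string = 'https://${appService.properties.defaultHostName}'
+```
+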
+## Tenant scopes
+
+You can't create policies or role assignments at the [tenant scope](deploy-to-tenant.md). However, if you need to grant access or apply policies across your whole organization, you can deploy these resources to the root management group.
+
+## Next steps
+
+* For an introduction to Bicep, see [Bicep quickstart](quickstart-create-bicep-use-visual-studio-code.md).
+* For information about the parts of a Bicep file, see [Understand the structure and syntax of Bicep files](file.md).
azure-resource-manager Bicep Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-cli.md
+
+ Title: Bicep CLI commands and overview
+description: Describes the commands that you can use in the Bicep CLI. These commands include building Azure Resource Manager templates from Bicep.
+ Last updated : 06/01/2021+
+# Bicep CLI commands
+
+This article describes the commands you can use in the Bicep CLI. You must have the [Bicep CLI installed](./install.md) to run the commands.
+
+This article shows how to run the commands in Azure CLI. If you're not using Azure CLI, run the commands without `az` at the start of the command. For example, `az bicep version` becomes `bicep version`.
+
+## build
+
+The **build** command converts a Bicep file to an Azure Resource Manager template (ARM template). Typically, you don't need to run this command because it runs automatically when you deploy a Bicep file. Run it manually when you want to see the ARM template JSON that is created from your Bicep file.
+
+The following example converts a Bicep file named **main.bicep** to an ARM template named **main.json**. The new file is created in the same directory as the Bicep file.
+
+```azurecli
+az bicep build --file main.bicep
+```
+
+The next example saves **main.json** to a different directory.
+
+```azurecli
+az bicep build --file main.bicep --outdir c:\jsontemplates
+```
+
+The next example specifies the name and location of the file to create.
+
+```azurecli
+az bicep build --file main.bicep --outfile c:\jsontemplates\azuredeploy.json
+```
+
+To print the file to **stdout**, use:
+
+```azurecli
+az bicep build --file main.bicep --stdout
+```
+
+## decompile
+
+The **decompile** command converts an ARM template to a Bicep file.
+
+```azurecli
+az bicep decompile --file main.json
+```
+
+For more information about using this command, see [Decompiling ARM template JSON to Bicep](decompile.md).
+
+## install
+
+The **install** command adds the Bicep CLI to your local environment. For more information, see [Install Bicep tools](install.md).
+
+To install the latest version, use:
+
+```azurecli
+az bicep install
+```
+
+To install a specific version:
+
+```azurecli
+az bicep install --version v0.3.255
+```
+
+## list-versions
+
+The **list-versions** command returns all available versions of the Bicep CLI. Use this command to see if you want to [upgrade](#upgrade) or [install](#install) a new version.
+
+```azurecli
+az bicep list-versions
+```
+
+The command returns an array of available versions.
+
+```json
+[
+ "v0.3.539",
+ "v0.3.255",
+ "v0.3.126",
+ "v0.3.1",
+ "v0.2.328",
+ "v0.2.317",
+ "v0.2.212",
+ "v0.2.59",
+ "v0.2.14",
+ "v0.2.3",
+ "v0.1.226-alpha",
+ "v0.1.223-alpha",
+ "v0.1.37-alpha",
+ "v0.1.1-alpha"
+]
+```
+
+## upgrade
+
+The **upgrade** command updates your installed version with the latest version.
+
+```azurecli
+az bicep upgrade
+```
+
+## version
+
+The **version** command returns your installed version.
+
+```azurecli
+az bicep version
+```
+
+The command shows the version number.
+
+```azurecli
+Bicep CLI version 0.3.539 (c8b397dbdd)
+```
+
+If you haven't installed the Bicep CLI, you see an error indicating that the Bicep CLI wasn't found.
+
+## Next steps
+
+To learn about deploying a Bicep file, see:
+
+* [Azure CLI](deploy-cli.md)
+* [Cloud Shell](deploy-cloud-shell.md)
+* [PowerShell](deploy-powershell.md)
azure-resource-manager Bicep Functions Any https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-any.md
+
+ Title: Bicep functions - any
+description: Describes the any function that is available in Bicep to convert types.
+ Last updated: 06/01/2021
+# Any function for Bicep
+
+Bicep supports a function called `any()` to resolve type errors in the Bicep type system. You use this function when the format of the value you provide doesn't match what the type system expects. For example, a property might require a number, but you need to provide the value as a string, like `'0.5'`. Use the `any()` function to suppress the error reported by the type system.
+
+This function doesn't exist in the Azure Resource Manager template runtime. It's only used by Bicep and isn't emitted in the JSON for the built template.
+
+## any
+
+`any(value)`
+
+Returns a value that is compatible with any data type.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| value | Yes | all types | The value to convert to a compatible type. |
+
+### Return value
+
+The value in a form that is compatible with any data type.
+
+### Examples
+
+The following example shows how to use the `any()` function to provide numeric values as strings.
+
+```bicep
+resource wpAci 'microsoft.containerInstance/containerGroups@2019-12-01' = {
+ name: 'wordpress-containerinstance'
+ location: location
+ properties: {
+ containers: [
+ {
+ name: 'wordpress'
+ properties: {
+ ...
+ resources: {
+ requests: {
+ cpu: any('0.5')
+ memoryInGB: any('0.7')
+ }
+ }
+ }
+ }
+ ]
+ }
+}
+```
+
+The function works on any assigned value in Bicep. The following example uses `any()` with a ternary expression as an argument.
+
+```bicep
+publicIPAddress: any((pipId == '') ? null : {
+ id: pipId
+})
+```
+
+## Next steps
+
+For more complex uses of the `any()` function, see the following examples:
+
+* [Child resources that require a specific name](https://github.com/Azure/bicep/blob/main/docs/examples/201/api-management-create-all-resources/main.bicep#L246)
+* [A resource property not defined in the resource's type, even though it exists](https://github.com/Azure/bicep/blob/main/docs/examples/201/log-analytics-with-solutions-and-diagnostics/main.bicep#L26)
+
azure-resource-manager Bicep Functions Array https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-array.md
+
+ Title: Bicep functions - arrays
+description: Describes the functions to use in a Bicep file for working with arrays.
+ Last updated: 06/01/2021
+# Array functions for Bicep
+
+Resource Manager provides several functions for working with arrays in Bicep:
+
+* [array](#array)
+* [concat](#concat)
+* [contains](#contains)
+* [empty](#empty)
+* [first](#first)
+* [intersection](#intersection)
+* [last](#last)
+* [length](#length)
+* [max](#max)
+* [min](#min)
+* [range](#range)
+* [skip](#skip)
+* [take](#take)
+* [union](#union)
+
+To get an array of string values delimited by a value, see [split](./bicep-functions-string.md#split).
+
+## array
+
+`array(convertToArray)`
+
+Converts the value to an array.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| convertToArray |Yes |int, string, array, or object |The value to convert to an array. |
+
+### Return value
+
+An array.
+
+### Example
+
+The following example shows how to use the array function with different types.
+
+```bicep
+param intToConvert int = 1
+param stringToConvert string = 'efgh'
+param objectToConvert object = {
+ 'a': 'b'
+ 'c': 'd'
+}
+
+output intOutput array = array(intToConvert)
+output stringOutput array = array(stringToConvert)
+output objectOutput array = array(objectToConvert)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| intOutput | Array | [1] |
+| stringOutput | Array | ["efgh"] |
+| objectOutput | Array | [{"a": "b", "c": "d"}] |
+
+## concat
+
+`concat(arg1, arg2, arg3, ...)`
+
+Combines multiple arrays and returns the concatenated array.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array |The first array for concatenation. |
+| additional arguments |No |array |Additional arrays in sequential order for concatenation. |
+
+This function takes any number of arrays and combines them.
+
+### Return value
+
+An array of concatenated values.
+
+### Example
+
+The following example shows how to combine two arrays.
+
+```bicep
+param firstArray array = [
+ '1-1'
+ '1-2'
+ '1-3'
+]
+param secondArray array = [
+ '2-1'
+ '2-2'
+ '2-3'
+]
+
+output return array = concat(firstArray, secondArray)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| return | Array | ["1-1", "1-2", "1-3", "2-1", "2-2", "2-3"] |
+
+## contains
+
+`contains(container, itemToFind)`
+
+Checks whether an array contains a value, an object contains a key, or a string contains a substring. The string comparison is case-sensitive. However, when testing if an object contains a key, the comparison is case-insensitive.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| container |Yes |array, object, or string |The value that contains the value to find. |
+| itemToFind |Yes |string or int |The value to find. |
+
+### Return value
+
+**True** if the item is found; otherwise, **False**.
+
+### Example
+
+The following example shows how to use contains with different types:
+
+```bicep
+param stringToTest string = 'OneTwoThree'
+param objectToTest object = {
+ 'one': 'a'
+ 'two': 'b'
+ 'three': 'c'
+}
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+output stringTrue bool = contains(stringToTest, 'e')
+output stringFalse bool = contains(stringToTest, 'z')
+output objectTrue bool = contains(objectToTest, 'one')
+output objectFalse bool = contains(objectToTest, 'a')
+output arrayTrue bool = contains(arrayToTest, 'three')
+output arrayFalse bool = contains(arrayToTest, 'four')
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| stringTrue | Bool | True |
+| stringFalse | Bool | False |
+| objectTrue | Bool | True |
+| objectFalse | Bool | False |
+| arrayTrue | Bool | True |
+| arrayFalse | Bool | False |
+
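+The object-key lookup described above is case-insensitive, while string lookups are case-sensitive. The following minimal sketch (not part of the original article) contrasts the two behaviors:
+
+```bicep
+param objectToTest object = {
+  'One': 'a'
+}
+
+// Object key comparison is case-insensitive, so 'one' matches the 'One' key.
+output keyMatch bool = contains(objectToTest, 'one')
+// String comparison is case-sensitive, so 'ONE' isn't found in 'OneTwoThree'.
+output substringMatch bool = contains('OneTwoThree', 'ONE')
+```
+
+With these values, `keyMatch` returns `True` and `substringMatch` returns `False`.
+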
+## empty
+
+`empty(itemToTest)`
+
+Determines if an array, object, or string is empty.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| itemToTest |Yes |array, object, or string |The value to check if it is empty. |
+
+### Return value
+
+Returns **True** if the value is empty; otherwise, **False**.
+
+### Example
+
+The following example checks whether an array, object, and string are empty.
+
+```bicep
+param testArray array = []
+param testObject object = {}
+param testString string = ''
+
+output arrayEmpty bool = empty(testArray)
+output objectEmpty bool = empty(testObject)
+output stringEmpty bool = empty(testString)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayEmpty | Bool | True |
+| objectEmpty | Bool | True |
+| stringEmpty | Bool | True |
+
+## first
+
+`first(arg1)`
+
+Returns the first element of the array, or first character of the string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array or string |The value to retrieve the first element or character. |
+
+### Return value
+
+The type (string, int, array, or object) of the first element in an array, or the first character of a string.
+
+### Example
+
+The following example shows how to use the first function with an array and string.
+
+```bicep
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+output arrayOutput string = first(arrayToTest)
+output stringOutput string = first('One Two Three')
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | String | one |
+| stringOutput | String | O |
+
+## intersection
+
+`intersection(arg1, arg2, arg3, ...)`
+
+Returns a single array or object with the common elements from the parameters.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array or object |The first value to use for finding common elements. |
+| arg2 |Yes |array or object |The second value to use for finding common elements. |
+| additional arguments |No |array or object |Additional values to use for finding common elements. |
+
+### Return value
+
+An array or object with the common elements.
+
+### Example
+
+The following example shows how to use intersection with arrays and objects:
+
+```bicep
+param firstObject object = {
+ 'one': 'a'
+ 'two': 'b'
+ 'three': 'c'
+}
+
+param secondObject object = {
+ 'one': 'a'
+ 'two': 'z'
+ 'three': 'c'
+}
+
+param firstArray array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+param secondArray array = [
+ 'two'
+ 'three'
+]
+
+output objectOutput object = intersection(firstObject, secondObject)
+output arrayOutput array = intersection(firstArray, secondArray)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| objectOutput | Object | {"one": "a", "three": "c"} |
+| arrayOutput | Array | ["two", "three"] |
+
+## last
+
+`last(arg1)`
+
+Returns the last element of the array, or last character of the string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array or string |The value to retrieve the last element or character. |
+
+### Return value
+
+The type (string, int, array, or object) of the last element in an array, or the last character of a string.
+
+### Example
+
+The following example shows how to use the last function with an array and string.
+
+```bicep
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+output arrayOutput string = last(arrayToTest)
+output stringOutput string = last('One Two three')
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | String | three |
+| stringOutput | String | e |
+
+## length
+
+`length(arg1)`
+
+Returns the number of elements in an array, characters in a string, or root-level properties in an object.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array, string, or object |The array to use for getting the number of elements, the string to use for getting the number of characters, or the object to use for getting the number of root-level properties. |
+
+### Return value
+
+An int.
+
+### Example
+
+The following example shows how to use length with an array and string:
+
+```bicep
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+param stringToTest string = 'One Two Three'
+param objectToTest object = {
+ 'propA': 'one'
+ 'propB': 'two'
+ 'propC': 'three'
+ 'propD': {
+ 'propD-1': 'sub'
+ 'propD-2': 'sub'
+ }
+}
+
+output arrayLength int = length(arrayToTest)
+output stringLength int = length(stringToTest)
+output objectLength int = length(objectToTest)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayLength | Int | 3 |
+| stringLength | Int | 13 |
+| objectLength | Int | 4 |
+
+## max
+
+`max(arg1)`
+
+Returns the maximum value from an array of integers or a comma-separated list of integers.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array of integers, or comma-separated list of integers |The collection to get the maximum value. |
+
+### Return value
+
+An int representing the maximum value.
+
+### Example
+
+The following example shows how to use max with an array and a list of integers:
+
+```bicep
+param arrayToTest array = [
+ 0
+ 3
+ 2
+ 5
+ 4
+]
+
+output arrayOutput int = max(arrayToTest)
+output intOutput int = max(0,3,2,5,4)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | Int | 5 |
+| intOutput | Int | 5 |
+
+## min
+
+`min(arg1)`
+
+Returns the minimum value from an array of integers or a comma-separated list of integers.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array of integers, or comma-separated list of integers |The collection to get the minimum value. |
+
+### Return value
+
+An int representing the minimum value.
+
+### Example
+
+The following example shows how to use min with an array and a list of integers:
+
+```bicep
+param arrayToTest array = [
+ 0
+ 3
+ 2
+ 5
+ 4
+]
+
+output arrayOutput int = min(arrayToTest)
+output intOutput int = min(0,3,2,5,4)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | Int | 0 |
+| intOutput | Int | 0 |
+
+## range
+
+`range(startIndex, count)`
+
+Creates an array of integers from a starting integer and containing a number of items.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| startIndex |Yes |int |The first integer in the array. The sum of startIndex and count must be no greater than 2147483647. |
+| count |Yes |int |The number of integers in the array. Must be a non-negative integer up to 10000. |
+
+### Return value
+
+An array of integers.
+
+### Example
+
+The following example shows how to use the range function:
+
+```bicep
+param startingInt int = 5
+param numberOfElements int = 3
+
+output rangeOutput array = range(startingInt, numberOfElements)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| rangeOutput | Array | [5, 6, 7] |
+
+## skip
+
+`skip(originalValue, numberToSkip)`
+
+Returns an array with all the elements after the specified number in the array, or returns a string with all the characters after the specified number in the string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| originalValue |Yes |array or string |The array or string to use for skipping. |
+| numberToSkip |Yes |int |The number of elements or characters to skip. If this value is 0 or less, all the elements or characters in the value are returned. If it is larger than the length of the array or string, an empty array or string is returned. |
+
+### Return value
+
+An array or string.
+
+### Example
+
+The following example skips the specified number of elements in the array, and the specified number of characters in a string.
+
+```bicep
+param testArray array = [
+ 'one'
+ 'two'
+ 'three'
+]
+param elementsToSkip int = 2
+param testString string = 'one two three'
+param charactersToSkip int = 4
+
+output arrayOutput array = skip(testArray, elementsToSkip)
+output stringOutput string = skip(testString, charactersToSkip)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | Array | ["three"] |
+| stringOutput | String | two three |
+
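+To illustrate the boundary behavior described in the parameters table, the following sketch passes a value of 0 and a value larger than the array length:
+
+```bicep
+param testArray array = [
+  'one'
+  'two'
+]
+
+// 0 or less returns all the elements.
+output skipNone array = skip(testArray, 0)
+// A number larger than the array length returns an empty array.
+output skipAll array = skip(testArray, 5)
+```
+
+With the default values, `skipNone` is `["one", "two"]` and `skipAll` is `[]`.
+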
+## take
+
+`take(originalValue, numberToTake)`
+
+Returns an array with the specified number of elements from the start of the array, or a string with the specified number of characters from the start of the string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| originalValue |Yes |array or string |The array or string to take the elements from. |
+| numberToTake |Yes |int |The number of elements or characters to take. If this value is 0 or less, an empty array or string is returned. If it is larger than the length of the given array or string, all the elements in the array or string are returned. |
+
+### Return value
+
+An array or string.
+
+### Example
+
+The following example takes the specified number of elements from the array, and characters from a string.
+
+```bicep
+param testArray array = [
+ 'one'
+ 'two'
+ 'three'
+]
+param elementsToTake int = 2
+param testString string = 'one two three'
+param charactersToTake int = 2
+
+output arrayOutput array = take(testArray, elementsToTake)
+output stringOutput string = take(testString, charactersToTake)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | Array | ["one", "two"] |
+| stringOutput | String | on |
+
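+The boundary behavior from the parameters table can be sketched the same way, here with a string:
+
+```bicep
+param testString string = 'one two three'
+
+// 0 or less returns an empty string.
+output takeNone string = take(testString, 0)
+// A number larger than the string length returns the whole string.
+output takeAll string = take(testString, 100)
+```
+
+With the default value, `takeNone` is an empty string and `takeAll` is `one two three`.
+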
+## union
+
+`union(arg1, arg2, arg3, ...)`
+
+Returns a single array or object with all elements from the parameters. Duplicate values or keys are only included once.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array or object |The first value to use for joining elements. |
+| arg2 |Yes |array or object |The second value to use for joining elements. |
+| additional arguments |No |array or object |Additional values to use for joining elements. |
+
+### Return value
+
+An array or object.
+
+### Example
+
+The following example shows how to use union with arrays and objects:
+
+```bicep
+param firstObject object = {
+ 'one': 'a'
+ 'two': 'b'
+ 'three': 'c1'
+}
+
+param secondObject object = {
+ 'three': 'c2'
+ 'four': 'd'
+ 'five': 'e'
+}
+
+param firstArray array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+param secondArray array = [
+ 'three'
+ 'four'
+]
+
+output objectOutput object = union(firstObject, secondObject)
+output arrayOutput array = union(firstArray, secondArray)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| objectOutput | Object | {"one": "a", "two": "b", "three": "c2", "four": "d", "five": "e"} |
+| arrayOutput | Array | ["one", "two", "three", "four"] |
+
+## Next steps
+
+* For a description of the sections in a Bicep file, see [Understand the structure and syntax of Bicep files](./file.md).
azure-resource-manager Bicep Functions Date https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-date.md
+
+ Title: Bicep functions - date
+description: Describes the functions to use in a Bicep file to work with dates.
+ Last updated: 06/01/2021
+# Date functions for Bicep
+
+Resource Manager provides the following functions for working with dates in your Bicep file:
+
+* [dateTimeAdd](#datetimeadd)
+* [utcNow](#utcnow)
+
+## dateTimeAdd
+
+`dateTimeAdd(base, duration, [format])`
+
+Adds a time duration to a base value. ISO 8601 format is expected.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| base | Yes | string | The starting datetime value for the addition. Use [ISO 8601 timestamp format](https://en.wikipedia.org/wiki/ISO_8601). |
+| duration | Yes | string | The time value to add to the base. It can be a negative value. Use [ISO 8601 duration format](https://en.wikipedia.org/wiki/ISO_8601#Durations). |
+| format | No | string | The output format for the date time result. If not provided, the format of the base value is used. Use either [standard format strings](/dotnet/standard/base-types/standard-date-and-time-format-strings) or [custom format strings](/dotnet/standard/base-types/custom-date-and-time-format-strings). |
+
+### Return value
+
+The datetime value that results from adding the duration value to the base value.
+
+### Examples
+
+The following example shows different ways of adding time values.
+
+```bicep
+param baseTime string = utcNow('u')
+
+var add3Years = dateTimeAdd(baseTime, 'P3Y')
+var subtract9Days = dateTimeAdd(baseTime, '-P9D')
+var add1Hour = dateTimeAdd(baseTime, 'PT1H')
+
+output add3YearsOutput string = add3Years
+output subtract9DaysOutput string = subtract9Days
+output add1HourOutput string = add1Hour
+```
+
+When the preceding example is deployed with a base time of `2020-04-07 14:53:14Z`, the output is:
+
+| Name | Type | Value |
+| - | - | -- |
+| add3YearsOutput | String | 4/7/2023 2:53:14 PM |
+| subtract9DaysOutput | String | 3/29/2020 2:53:14 PM |
+| add1HourOutput | String | 4/7/2020 3:53:14 PM |
+
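+The optional `format` parameter isn't used in the preceding example. The following sketch shows one way to control the result's format with the `u` standard format string; adjust the format string for your scenario.
+
+```bicep
+param baseTime string = utcNow('u')
+
+// Request the result in the universal sortable ('u') format, such as '2020-04-08 14:53:14Z'.
+output add1Day string = dateTimeAdd(baseTime, 'P1D', 'u')
+```
+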
+The next example shows how to set the start time for an Automation schedule.
+
+```bicep
+param omsAutomationAccountName string = 'demoAutomation'
+param scheduleName string = 'demSchedule1'
+param baseTime string = utcNow('u')
+
+var startTime = dateTimeAdd(baseTime, 'PT1H')
+
+...
+
+resource scheduler 'Microsoft.Automation/automationAccounts/schedules@2015-10-31' = {
+  name: '${omsAutomationAccountName}/${scheduleName}'
+ properties: {
+ description: 'Demo Scheduler'
+ startTime: startTime
+ interval: 1
+ frequency: 'Hour'
+ }
+}
+```
+
+## utcNow
+
+`utcNow(format)`
+
+Returns the current (UTC) datetime value in the specified format. If no format is provided, the ISO 8601 (`yyyyMMddTHHmmssZ`) format is used. **This function can only be used in the default value for a parameter.**
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| format |No |string |The format for the UTC datetime value. If not provided, `yyyyMMddTHHmmssZ` is used. Use either [standard format strings](/dotnet/standard/base-types/standard-date-and-time-format-strings) or [custom format strings](/dotnet/standard/base-types/custom-date-and-time-format-strings). |
+
+### Remarks
+
+You can only use this function within an expression for the default value of a parameter. Using this function anywhere else in a Bicep file returns an error. The function isn't allowed in other parts of the Bicep file because it returns a different value each time it's called. Deploying the same Bicep file with the same parameters wouldn't reliably produce the same results.
+
+If you use the [option to rollback on error](../templates/rollback-on-error.md) to an earlier successful deployment, and the earlier deployment includes a parameter that uses utcNow, the parameter isn't reevaluated. Instead, the parameter value from the earlier deployment is automatically reused in the rollback deployment.
+
+Be careful redeploying a Bicep file that relies on the utcNow function for a default value. When you redeploy and don't provide a value for the parameter, the function is reevaluated. If you want to update an existing resource rather than create a new one, pass in the parameter value from the earlier deployment.
+
+### Return value
+
+The current UTC datetime value.
+
+### Examples
+
+The following example shows different formats for the datetime value.
+
+```bicep
+param utcValue string = utcNow()
+param utcShortValue string = utcNow('d')
+param utcCustomValue string = utcNow('M d')
+
+output utcOutput string = utcValue
+output utcShortOutput string = utcShortValue
+output utcCustomOutput string = utcCustomValue
+```
+
+The output from the preceding example varies for each deployment but will be similar to:
+
+| Name | Type | Value |
+| - | - | -- |
+| utcOutput | string | 20190305T175318Z |
+| utcShortOutput | string | 03/05/2019 |
+| utcCustomOutput | string | 3 5 |
+
+The next example shows how to use a value from the function when setting a tag value.
+
+```bicep
+param utcShort string = utcNow('d')
+param rgName string
+
+resource myRg 'Microsoft.Resources/resourceGroups@2020-10-01' = {
+ name: rgName
+ location: 'westeurope'
+ tags: {
+ createdDate: utcShort
+ }
+}
+
+output utcShortOutput string = utcShort
+```
+
+## Next steps
+
+* For a description of the sections in a Bicep file, see [Understand the structure and syntax of Bicep files](./file.md).
azure-resource-manager Bicep Functions Deployment https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-deployment.md
+
+ Title: Bicep functions - deployment
+description: Describes the functions to use in a Bicep file to retrieve deployment information.
+ Last updated: 06/01/2021
+# Deployment functions for Bicep
+
+Resource Manager provides the following functions for getting values related to the current deployment of your Bicep file:
+
+* [deployment](#deployment)
+* [environment](#environment)
+
+To get values from resources, resource groups, or subscriptions, see [Resource functions](./bicep-functions-resource.md).
+
+## deployment
+
+`deployment()`
+
+Returns information about the current deployment operation.
+
+### Return value
+
+This function returns the object that is passed during deployment. The properties in the returned object differ based on whether you are:
+
+* deploying to a resource group, or
+* deploying to one of the other scopes ([Azure subscription](deploy-to-subscription.md), [management group](deploy-to-management-group.md), or [tenant](deploy-to-tenant.md)).
+
+When you deploy a local Bicep file to a resource group, the function returns the following format:
+
+```json
+{
+ "name": "",
+ "properties": {
+ "template": {
+ "$schema": "",
+ "contentVersion": "",
+ "parameters": {},
+ "variables": {},
+ "resources": [],
+ "outputs": {}
+ },
+ "templateHash": "",
+ "parameters": {},
+ "mode": "",
+ "provisioningState": ""
+ }
+}
+```
+
+When you deploy to an Azure subscription, management group, or tenant, the return object includes a `location` property. The location property isn't included when you deploy to a resource group. The format is:
+
+```json
+{
+ "name": "",
+ "location": "",
+ "properties": {
+ "template": {
+ "$schema": "",
+ "contentVersion": "",
+ "resources": [],
+ "outputs": {}
+ },
+ "templateHash": "",
+ "parameters": {},
+ "mode": "",
+ "provisioningState": ""
+ }
+}
+```
+
+### Example
+
+The following example returns the deployment object:
+
+```bicep
+output deploymentOutput object = deployment()
+```
+
+The preceding example returns the following object:
+
+```json
+{
+ "name": "deployment",
+ "properties": {
+ "template": {
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "resources": [],
+ "outputs": {
+ "deploymentOutput": {
+ "type": "Object",
+ "value": "[deployment()]"
+ }
+ }
+ },
+ "templateHash": "13135986259522608210",
+ "parameters": {},
+ "mode": "Incremental",
+ "provisioningState": "Accepted"
+ }
+}
+```
+
+## environment
+
+`environment()`
+
+Returns information about the Azure environment used for deployment.
+
+### Return value
+
+This function returns properties for the current Azure environment. The following example shows the properties for global Azure. Sovereign clouds may return slightly different properties.
+
+```json
+{
+ "name": "",
+ "gallery": "",
+ "graph": "",
+ "portal": "",
+ "graphAudience": "",
+ "activeDirectoryDataLake": "",
+ "batch": "",
+ "media": "",
+ "sqlManagement": "",
+ "vmImageAliasDoc": "",
+ "resourceManager": "",
+ "authentication": {
+ "loginEndpoint": "",
+ "audiences": [
+ "",
+ ""
+ ],
+ "tenant": "",
+ "identityProvider": ""
+ },
+ "suffixes": {
+ "acrLoginServer": "",
+ "azureDatalakeAnalyticsCatalogAndJob": "",
+ "azureDatalakeStoreFileSystem": "",
+ "azureFrontDoorEndpointSuffix": "",
+ "keyvaultDns": "",
+ "sqlServerHostname": "",
+ "storage": ""
+ }
+}
+```
+
+### Example
+
+The following example Bicep file returns the environment object.
+
+```bicep
+output environmentOutput object = environment()
+```
+
+The preceding example returns the following object when deployed to global Azure:
+
+```json
+{
+ "name": "AzureCloud",
+ "gallery": "https://gallery.azure.com/",
+ "graph": "https://graph.windows.net/",
+ "portal": "https://portal.azure.com",
+ "graphAudience": "https://graph.windows.net/",
+ "activeDirectoryDataLake": "https://datalake.azure.net/",
+ "batch": "https://batch.core.windows.net/",
+ "media": "https://rest.media.azure.net",
+ "sqlManagement": "https://management.core.windows.net:8443/",
+ "vmImageAliasDoc": "https://raw.githubusercontent.com/Azure/azure-rest-api-specs/master/arm-compute/quickstart-templates/aliases.json",
+ "resourceManager": "https://management.azure.com/",
+ "authentication": {
+ "loginEndpoint": "https://login.windows.net/",
+ "audiences": [
+ "https://management.core.windows.net/",
+ "https://management.azure.com/"
+ ],
+ "tenant": "common",
+ "identityProvider": "AAD"
+ },
+ "suffixes": {
+ "acrLoginServer": ".azurecr.io",
+ "azureDatalakeAnalyticsCatalogAndJob": "azuredatalakeanalytics.net",
+ "azureDatalakeStoreFileSystem": "azuredatalakestore.net",
+ "azureFrontDoorEndpointSuffix": "azurefd.net",
+ "keyvaultDns": ".vault.azure.net",
+ "sqlServerHostname": ".database.windows.net",
+ "storage": "core.windows.net"
+ }
+}
+```
+
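+One common use of `environment()` is to avoid hardcoding endpoint suffixes such as `core.windows.net`, which differ in sovereign clouds. The following sketch builds a blob endpoint; `storageAccountName` is a hypothetical parameter for illustration.
+
+```bicep
+param storageAccountName string
+
+// The storage suffix resolves correctly in sovereign clouds as well as global Azure.
+output blobEndpoint string = 'https://${storageAccountName}.blob.${environment().suffixes.storage}/'
+```
+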
+## Next steps
+
+* For a description of the sections in a Bicep file, see [Understand the structure and syntax of Bicep files](./file.md).
azure-resource-manager Bicep Functions Logical https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-logical.md
+
+ Title: Bicep functions - logical
+description: Describes the functions to use in a Bicep file to determine logical values.
+ Last updated: 06/01/2021
+# Logical functions for Bicep
+
+Resource Manager provides a `bool` function for Bicep. Some of the Azure Resource Manager JSON logical functions are replaced with [Bicep logical operators](./operators-logical.md).
+
+## bool
+
+`bool(arg1)`
+
+Converts the parameter to a boolean.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |string or int |The value to convert to a boolean. |
+
+### Return value
+
+A boolean of the converted value.
+
+### Examples
+
+The following [example template](https://github.com/Azure/azure-docs-json-samples/blob/master/azure-resource-manager/functions/bool.json) shows how to use bool with a string or integer.
+
+```bicep
+output trueString bool = bool('true')
+output falseString bool = bool('false')
+output trueInt bool = bool(1)
+output falseInt bool = bool(0)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| trueString | Bool | True |
+| falseString | Bool | False |
+| trueInt | Bool | True |
+| falseInt | Bool | False |
+
+## Next steps
+
+* For a description of the sections in a Bicep file, see [Understand the structure and syntax of Bicep files](./file.md).
azure-resource-manager Bicep Functions Numeric https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-numeric.md
+
+ Title: Bicep functions - numeric
+description: Describes the functions to use in a Bicep file to work with numbers.
+ Last updated: 06/01/2021
+# Numeric functions for Bicep
+
+Resource Manager provides the following functions for working with integers in your Bicep file:
+
+* [int](#int)
+* [max](#max)
+* [min](#min)
+
+Some of the Azure Resource Manager JSON numeric functions are replaced with [Bicep numeric operators](./operators-numeric.md).
+
+## int
+
+`int(valueToConvert)`
+
+Converts the specified value to an integer.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| valueToConvert |Yes |string or int |The value to convert to an integer. |
+
+### Return value
+
+An integer of the converted value.
+
+### Example
+
+The following example converts the user-provided parameter value to an integer.
+
+```bicep
+param stringToConvert string = '4'
+
+output intResult int = int(stringToConvert)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| intResult | Int | 4 |
+
+## max
+
+`max(arg1)`
+
+Returns the maximum value from an array of integers or a comma-separated list of integers.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| arg1 |Yes |array of integers, or comma-separated list of integers |The collection from which to get the maximum value. |
+
+### Return value
+
+An integer representing the maximum value from the collection.
+
+### Example
+
+The following example shows how to use max with an array and a list of integers:
+
+```bicep
+param arrayToTest array = [
+ 0
+ 3
+ 2
+ 5
+ 4
+]
+
+output arrayOutput int = max(arrayToTest)
+output intOutput int = max(0,3,2,5,4)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | Int | 5 |
+| intOutput | Int | 5 |
+
+## min
+
+`min(arg1)`
+
+Returns the minimum value from an array of integers or a comma-separated list of integers.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| arg1 |Yes |array of integers, or comma-separated list of integers |The collection from which to get the minimum value. |
+
+### Return value
+
+An integer representing the minimum value from the collection.
+
+### Example
+
+The following example shows how to use min with an array and a list of integers:
+
+```bicep
+param arrayToTest array = [
+ 0
+ 3
+ 2
+ 5
+ 4
+]
+
+output arrayOutput int = min(arrayToTest)
+output intOutput int = min(0,3,2,5,4)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | Int | 0 |
+| intOutput | Int | 0 |
+
+## Next steps
+
+* For a description of the sections in a Bicep file, see [Understand the structure and syntax of Bicep files](./file.md).
azure-resource-manager Bicep Functions Object https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-object.md
+
+Title: Bicep functions - objects
+description: Describes the functions to use in a Bicep file for working with objects.
+Last updated: 06/01/2021
+# Object functions for Bicep
+
+Resource Manager provides several functions for working with objects in your Bicep file:
+
+* [contains](#contains)
+* [empty](#empty)
+* [intersection](#intersection)
+* [json](#json)
+* [length](#length)
+* [union](#union)
+
+## contains
+
+`contains(container, itemToFind)`
+
+Checks whether an array contains a value, an object contains a key, or a string contains a substring. The string comparison is case-sensitive. However, when testing if an object contains a key, the comparison is case-insensitive.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| container |Yes |array, object, or string |The value that contains the value to find. |
+| itemToFind |Yes |string or int |The value to find. |
+
+### Return value
+
+**True** if the item is found; otherwise, **False**.
+
+### Example
+
+The following example shows how to use contains with different types:
+
+```bicep
+param stringToTest string = 'OneTwoThree'
+param objectToTest object = {
+ 'one': 'a'
+ 'two': 'b'
+ 'three': 'c'
+}
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+output stringTrue bool = contains(stringToTest, 'e')
+output stringFalse bool = contains(stringToTest, 'z')
+output objectTrue bool = contains(objectToTest, 'one')
+output objectFalse bool = contains(objectToTest, 'a')
+output arrayTrue bool = contains(arrayToTest, 'three')
+output arrayFalse bool = contains(arrayToTest, 'four')
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| stringTrue | Bool | True |
+| stringFalse | Bool | False |
+| objectTrue | Bool | True |
+| objectFalse | Bool | False |
+| arrayTrue | Bool | True |
+| arrayFalse | Bool | False |
+
+## empty
+
+`empty(itemToTest)`
+
+Determines if an array, object, or string is empty.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| itemToTest |Yes |array, object, or string |The value to check if it's empty. |
+
+### Return value
+
+Returns **True** if the value is empty; otherwise, **False**.
+
+### Example
+
+The following example checks whether an array, object, and string are empty.
+
+```bicep
+param testArray array = []
+param testObject object = {}
+param testString string = ''
+
+output arrayEmpty bool = empty(testArray)
+output objectEmpty bool = empty(testObject)
+output stringEmpty bool = empty(testString)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayEmpty | Bool | True |
+| objectEmpty | Bool | True |
+| stringEmpty | Bool | True |
+
+## intersection
+
+`intersection(arg1, arg2, arg3, ...)`
+
+Returns a single array or object with the common elements from the parameters.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| arg1 |Yes |array or object |The first value to use for finding common elements. |
+| arg2 |Yes |array or object |The second value to use for finding common elements. |
+| additional arguments |No |array or object |Additional values to use for finding common elements. |
+
+### Return value
+
+An array or object with the common elements.
+
+### Example
+
+The following example shows how to use intersection with arrays and objects:
+
+```bicep
+param firstObject object = {
+ 'one': 'a'
+ 'two': 'b'
+ 'three': 'c'
+}
+param secondObject object = {
+ 'one': 'a'
+ 'two': 'z'
+ 'three': 'c'
+}
+param firstArray array = [
+ 'one'
+ 'two'
+ 'three'
+]
+param secondArray array = [
+ 'two'
+ 'three'
+]
+
+output objectOutput object = intersection(firstObject, secondObject)
+output arrayOutput array = intersection(firstArray, secondArray)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| objectOutput | Object | {"one": "a", "three": "c"} |
+| arrayOutput | Array | ["two", "three"] |
+
+<a id="json"></a>
+
+## json
+
+`json(arg1)`
+
+Converts a valid JSON string into a JSON data type.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| arg1 |Yes |string |The value to convert to JSON. The string must be a properly formatted JSON string. |
+
+### Return value
+
+The JSON data type from the specified string, or an empty value when **null** is specified.
+
+### Remarks
+
+If you need to include a parameter value or variable in the JSON object, use the [concat](./bicep-functions-string.md#concat) function to create the string that you pass to the function.
+
+### Example
+
+The following example shows how to use the json function. Notice that you can pass in **null** for an empty object.
+
+```bicep
+param jsonEmptyObject string = 'null'
+param jsonObject string = '{\'a\': \'b\'}'
+param jsonString string = '\'test\''
+param jsonBoolean string = 'true'
+param jsonInt string = '3'
+param jsonArray string = '[[1,2,3]]'
+param concatValue string = 'demo value'
+
+output emptyObjectOutput bool = empty(json(jsonEmptyObject))
+output objectOutput object = json(jsonObject)
+output stringOutput string = json(jsonString)
+output booleanOutput bool = json(jsonBoolean)
+output intOutput int = json(jsonInt)
+output arrayOutput array = json(jsonArray)
+output concatObjectOutput object = json(concat('{"a": "', concatValue, '"}'))
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| emptyObjectOutput | Boolean | True |
+| objectOutput | Object | {"a": "b"} |
+| stringOutput | String | test |
+| booleanOutput | Boolean | True |
+| intOutput | Integer | 3 |
+| arrayOutput | Array | [ 1, 2, 3 ] |
+| concatObjectOutput | Object | { "a": "demo value" } |
+
+## length
+
+`length(arg1)`
+
+Returns the number of elements in an array, characters in a string, or root-level properties in an object.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| arg1 |Yes |array, string, or object |The array to use for getting the number of elements, the string to use for getting the number of characters, or the object to use for getting the number of root-level properties. |
+
+### Return value
+
+An int.
+
+### Example
+
+The following example shows how to use length with an array and string:
+
+```bicep
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+param stringToTest string = 'One Two Three'
+param objectToTest object = {
+ 'propA': 'one'
+ 'propB': 'two'
+ 'propC': 'three'
+ 'propD': {
+ 'propD-1': 'sub'
+ 'propD-2': 'sub'
+ }
+}
+
+output arrayLength int = length(arrayToTest)
+output stringLength int = length(stringToTest)
+output objectLength int = length(objectToTest)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayLength | Int | 3 |
+| stringLength | Int | 13 |
+| objectLength | Int | 4 |
+
+## union
+
+`union(arg1, arg2, arg3, ...)`
+
+Returns a single array or object with all elements from the parameters. Duplicate values or keys are only included once.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| arg1 |Yes |array or object |The first value to use for joining elements. |
+| arg2 |Yes |array or object |The second value to use for joining elements. |
+| additional arguments |No |array or object |Additional values to use for joining elements. |
+
+### Return value
+
+An array or object.
+
+### Example
+
+The following example shows how to use union with arrays and objects:
+
+```bicep
+param firstObject object = {
+ 'one': 'a'
+ 'two': 'b'
+ 'three': 'c1'
+}
+
+param secondObject object = {
+ 'three': 'c2'
+ 'four': 'd'
+ 'five': 'e'
+}
+
+param firstArray array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+param secondArray array = [
+ 'three'
+ 'four'
+]
+
+output objectOutput object = union(firstObject, secondObject)
+output arrayOutput array = union(firstArray, secondArray)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| objectOutput | Object | {"one": "a", "two": "b", "three": "c2", "four": "d", "five": "e"} |
+| arrayOutput | Array | ["one", "two", "three", "four"] |
+
+## Next steps
+
+* For a description of the sections in a Bicep file, see [Understand the structure and syntax of Bicep files](./file.md).
azure-resource-manager Bicep Functions Resource https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-resource.md
+
+Title: Bicep functions - resources
+description: Describes the functions to use in a Bicep file to retrieve values about resources.
+Last updated: 06/01/2021
+# Resource functions for Bicep
+
+Resource Manager provides the following functions for getting resource values in your Bicep file:
+
+* [extensionResourceId](#extensionresourceid)
+* [getSecret](#getsecret)
+* [list*](#list)
+* [pickZones](#pickzones)
+* [reference](#reference)
+* [resourceId](#resourceid)
+* [subscriptionResourceId](#subscriptionresourceid)
+* [tenantResourceId](#tenantresourceid)
+
+To get values from the current deployment, see [Deployment value functions](./bicep-functions-deployment.md).
+
+## extensionResourceId
+
+`extensionResourceId(resourceId, resourceType, resourceName1, [resourceName2], ...)`
+
+Returns the resource ID for an [extension resource](../management/extension-resource-types.md), which is a resource type that is applied to another resource to add to its capabilities.
+
+The extensionResourceId function is available in Bicep files, but typically you don't need it. Instead, use the symbolic name for the resource and access the `id` property.
+
+The basic format of the resource ID returned by this function is:
+
+```json
+{scope}/providers/{extensionResourceProviderNamespace}/{extensionResourceType}/{extensionResourceName}
+```
+
+The scope segment varies by the resource being extended.
+
+When the extension resource is applied to a **resource**, the resource ID is returned in the following format:
+
+```json
+/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{baseResourceProviderNamespace}/{baseResourceType}/{baseResourceName}/providers/{extensionResourceProviderNamespace}/{extensionResourceType}/{extensionResourceName}
+```
+
+When the extension resource is applied to a **resource group**, the format is:
+
+```json
+/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{extensionResourceProviderNamespace}/{extensionResourceType}/{extensionResourceName}
+```
+
+When the extension resource is applied to a **subscription**, the format is:
+
+```json
+/subscriptions/{subscriptionId}/providers/{extensionResourceProviderNamespace}/{extensionResourceType}/{extensionResourceName}
+```
+
+When the extension resource is applied to a **management group**, the format is:
+
+```json
+/providers/Microsoft.Management/managementGroups/{managementGroupName}/providers/{extensionResourceProviderNamespace}/{extensionResourceType}/{extensionResourceName}
+```
+
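+Putting these formats together, the following minimal sketch shows how a call to `extensionResourceId` composes an ID. The storage account and lock names are placeholder values for illustration:
+
+```bicep
+param storageAccountName string = 'examplestorage'
+
+// Build the ID of a management lock (an extension resource) applied to a
+// storage account in the current resource group.
+output lockId string = extensionResourceId(resourceId('Microsoft.Storage/storageAccounts', storageAccountName), 'Microsoft.Authorization/locks', 'exampleLock')
+```
+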
+A custom policy definition deployed to a management group is implemented as an extension resource. To create and assign a policy, deploy the following Bicep file to a management group.
+
+```bicep
+targetScope = 'managementGroup'
+
+@description('An array of the allowed locations, all other locations will be denied by the created policy.')
+param allowedLocations array = [
+ 'australiaeast'
+ 'australiasoutheast'
+ 'australiacentral'
+]
+
+resource policyDefinition 'Microsoft.Authorization/policyDefinitions@2019-09-01' = {
+ name: 'locationRestriction'
+ properties: {
+ policyType: 'Custom'
+ mode: 'All'
+ parameters: {}
+ policyRule: {
+ if: {
+ not: {
+ field: 'location'
+ in: allowedLocations
+ }
+ }
+ then: {
+ effect: 'deny'
+ }
+ }
+ }
+}
+
+resource policyAssignment 'Microsoft.Authorization/policyAssignments@2019-09-01' = {
+ name: 'locationAssignment'
+ properties: {
+ policyDefinitionId: policyDefinition.id
+ }
+}
+```
+
+Built-in policy definitions are tenant level resources. For an example of deploying a built-in policy definition, see [tenantResourceId](#tenantresourceid).
+
+## getSecret
+
+`getSecret([secretName])`
+
+Returns a secret value stored in Azure Key Vault. Use the getSecret function to obtain a key vault secret and pass the return value to a string parameter of a Bicep module. The getSecret function can be called only on a `Microsoft.KeyVault/vaults` resource, and can be used only with a parameter that has the `@secure()` decorator.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| secretName | Yes | string | The name of the secret stored in a key vault. |
+
+### Return value
+
+The secret value for the secret name.
+
+### Example
+
+The following Bicep file is used as a module. It has an *adminPassword* parameter defined with the `@secure()` decorator.
+
+```bicep
+param sqlServerName string
+param adminLogin string
+
+@secure()
+param adminPassword string
+
+resource sqlServer 'Microsoft.Sql/servers@2020-11-01-preview' = {
+ ...
+}
+```
+
+The following Bicep file consumes the preceding Bicep file as a module. The Bicep file references an existing key vault, and calls the `getSecret` function to retrieve the key vault secret, and then passes the value as a parameter to the module.
+
+```bicep
+param sqlServerName string
+param adminLogin string
+
+param subscriptionId string
+param kvResourceGroup string
+param kvName string
+
+resource kv 'Microsoft.KeyVault/vaults@2019-09-01' existing = {
+ name: kvName
+ scope: resourceGroup(subscriptionId, kvResourceGroup )
+}
+
+module sql './sql.bicep' = {
+ name: 'deploySQL'
+ params: {
+ sqlServerName: sqlServerName
+ adminLogin: adminLogin
+ adminPassword: kv.getSecret('vmAdminPassword')
+ }
+}
+```
+
+<a id="listkeys"></a>
+<a id="list"></a>
+
+## list*
+
+`list{Value}(resourceName or resourceIdentifier, apiVersion, functionValues)`
+
+The syntax for this function varies by the name of the list operation. Each implementation returns values for a resource type that supports a list operation. The operation name must start with `list` and can have a suffix. Some common usages are `list`, `listKeys`, `listKeyValue`, and `listSecrets`.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| resourceName or resourceIdentifier |Yes |string |Unique identifier for the resource. |
+| apiVersion |Yes |string |API version of the resource runtime state. Typically, in the format **yyyy-mm-dd**. |
+| functionValues |No |object | An object that has values for the function. Only provide this object for functions that support receiving an object with parameter values, such as **listAccountSas** on a storage account. An example of passing function values is shown in this article. |
+
+### Valid uses
+
+The list functions can be used in the properties of a resource definition. Don't use a list function that exposes sensitive information in the outputs section of a Bicep file. Output values are stored in the deployment history and could be retrieved by a malicious user.
+
+When used with [property loop](./loop-properties.md), you can use the list functions for `input` because the expression is assigned to the resource property. You can't use them with `count` because the count must be determined before the list function is resolved.
+
+### Implementations
+
+The possible uses of list* are shown in the following table.
+
+| Resource type | Function name |
+| - | - |
+| Microsoft.Addons/supportProviders | listsupportplaninfo |
+| Microsoft.AnalysisServices/servers | [listGatewayStatus](/rest/api/analysisservices/servers/listgatewaystatus) |
+| Microsoft.ApiManagement/service/authorizationServers | [listSecrets](/rest/api/apimanagement/2019-12-01/authorizationserver/listsecrets) |
+| Microsoft.ApiManagement/service/gateways | [listKeys](/rest/api/apimanagement/2019-12-01/gateway/listkeys) |
+| Microsoft.ApiManagement/service/identityProviders | [listSecrets](/rest/api/apimanagement/2019-12-01/identityprovider/listsecrets) |
+| Microsoft.ApiManagement/service/namedValues | [listValue](/rest/api/apimanagement/2019-12-01/namedvalue/listvalue) |
+| Microsoft.ApiManagement/service/openidConnectProviders | [listSecrets](/rest/api/apimanagement/2019-12-01/openidconnectprovider/listsecrets) |
+| Microsoft.ApiManagement/service/subscriptions | [listSecrets](/rest/api/apimanagement/2019-12-01/subscription/listsecrets) |
+| Microsoft.AppConfiguration/configurationStores | [ListKeys](/rest/api/appconfiguration/configurationstores/listkeys) |
+| Microsoft.AppPlatform/Spring | [listTestKeys](/rest/api/azurespringcloud/services/listtestkeys) |
+| Microsoft.Automation/automationAccounts | [listKeys](/rest/api/automation/keys/listbyautomationaccount) |
+| Microsoft.Batch/batchAccounts | [listkeys](/rest/api/batchmanagement/batchaccount/getkeys) |
+| Microsoft.BatchAI/workspaces/experiments/jobs | [listoutputfiles](/rest/api/batchai/jobs/listoutputfiles) |
+| Microsoft.Blockchain/blockchainMembers | [listApiKeys](/rest/api/blockchain/2019-06-01-preview/blockchainmembers/listapikeys) |
+| Microsoft.Blockchain/blockchainMembers/transactionNodes | [listApiKeys](/rest/api/blockchain/2019-06-01-preview/transactionnodes/listapikeys) |
+| Microsoft.BotService/botServices/channels | [listChannelWithKeys](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/botservice/resource-manager/Microsoft.BotService/stable/2020-06-02/botservice.json#L553) |
+| Microsoft.Cache/redis | [listKeys](/rest/api/redis/redis/listkeys) |
+| Microsoft.CognitiveServices/accounts | [listKeys](/rest/api/cognitiveservices/accountmanagement/accounts/listkeys) |
+| Microsoft.ContainerRegistry/registries | [listBuildSourceUploadUrl](/rest/api/containerregistry/registries%20(tasks)/getbuildsourceuploadurl) |
+| Microsoft.ContainerRegistry/registries | [listCredentials](/rest/api/containerregistry/registries/listcredentials) |
+| Microsoft.ContainerRegistry/registries | [listUsages](/rest/api/containerregistry/registries/listusages) |
+| Microsoft.ContainerRegistry/registries/agentpools | listQueueStatus |
+| Microsoft.ContainerRegistry/registries/buildTasks | listSourceRepositoryProperties |
+| Microsoft.ContainerRegistry/registries/buildTasks/steps | listBuildArguments |
+| Microsoft.ContainerRegistry/registries/taskruns | listDetails |
+| Microsoft.ContainerRegistry/registries/webhooks | [listEvents](/rest/api/containerregistry/webhooks/listevents) |
+| Microsoft.ContainerRegistry/registries/runs | [listLogSasUrl](/rest/api/containerregistry/runs/getlogsasurl) |
+| Microsoft.ContainerRegistry/registries/tasks | [listDetails](/rest/api/containerregistry/tasks/getdetails) |
+| Microsoft.ContainerService/managedClusters | [listClusterAdminCredential](/rest/api/aks/managedclusters/listclusteradmincredentials) |
+| Microsoft.ContainerService/managedClusters | [listClusterMonitoringUserCredential](/rest/api/aks/managedclusters/listclustermonitoringusercredentials) |
+| Microsoft.ContainerService/managedClusters | [listClusterUserCredential](/rest/api/aks/managedclusters/listclusterusercredentials) |
+| Microsoft.ContainerService/managedClusters/accessProfiles | [listCredential](/rest/api/aks/managedclusters/getaccessprofile) |
+| Microsoft.DataBox/jobs | listCredentials |
+| Microsoft.DataFactory/datafactories/gateways | listauthkeys |
+| Microsoft.DataFactory/factories/integrationruntimes | [listauthkeys](/rest/api/datafactory/integrationruntimes/listauthkeys) |
+| Microsoft.DataLakeAnalytics/accounts/storageAccounts/Containers | [listSasTokens](/rest/api/datalakeanalytics/storageaccounts/listsastokens) |
+| Microsoft.DataShare/accounts/shares | [listSynchronizations](/rest/api/datashare/2020-09-01/shares/listsynchronizations) |
+| Microsoft.DataShare/accounts/shareSubscriptions | [listSourceShareSynchronizationSettings](/rest/api/datashare/2020-09-01/sharesubscriptions/listsourcesharesynchronizationsettings) |
+| Microsoft.DataShare/accounts/shareSubscriptions | [listSynchronizationDetails](/rest/api/datashare/2020-09-01/sharesubscriptions/listsynchronizationdetails) |
+| Microsoft.DataShare/accounts/shareSubscriptions | [listSynchronizations](/rest/api/datashare/2020-09-01/sharesubscriptions/listsynchronizations) |
+| Microsoft.Devices/iotHubs | [listkeys](/rest/api/iothub/iothubresource/listkeys) |
+| Microsoft.Devices/iotHubs/iotHubKeys | [listkeys](/rest/api/iothub/iothubresource/getkeysforkeyname) |
+| Microsoft.Devices/provisioningServices/keys | [listkeys](/rest/api/iot-dps/iotdpsresource/listkeysforkeyname) |
+| Microsoft.Devices/provisioningServices | [listkeys](/rest/api/iot-dps/iotdpsresource/listkeys) |
+| Microsoft.DevTestLab/labs | [ListVhds](/rest/api/dtl/labs/listvhds) |
+| Microsoft.DevTestLab/labs/schedules | [ListApplicable](/rest/api/dtl/schedules/listapplicable) |
+| Microsoft.DevTestLab/labs/users/serviceFabrics | [ListApplicableSchedules](/rest/api/dtl/servicefabrics/listapplicableschedules) |
+| Microsoft.DevTestLab/labs/virtualMachines | [ListApplicableSchedules](/rest/api/dtl/virtualmachines/listapplicableschedules) |
+| Microsoft.DocumentDB/databaseAccounts | [listConnectionStrings](/rest/api/cosmos-db-resource-provider/2021-03-01-preview/databaseaccounts/listconnectionstrings) |
+| Microsoft.DocumentDB/databaseAccounts | [listKeys](/rest/api/cosmos-db-resource-provider/2021-03-01-preview/databaseaccounts/listkeys) |
+| Microsoft.DocumentDB/databaseAccounts/notebookWorkspaces | [listConnectionInfo](/rest/api/cosmos-db-resource-provider/2021-03-15/notebookworkspaces/listconnectioninfo) |
+| Microsoft.DomainRegistration | [listDomainRecommendations](/rest/api/appservice/domains/listrecommendations) |
+| Microsoft.DomainRegistration/topLevelDomains | [listAgreements](/rest/api/appservice/topleveldomains/listagreements) |
+| Microsoft.EventGrid/domains | [listKeys](/rest/api/eventgrid/version2020-06-01/domains/listsharedaccesskeys) |
+| Microsoft.EventGrid/topics | [listKeys](/rest/api/eventgrid/version2020-06-01/topics/listsharedaccesskeys) |
+| Microsoft.EventHub/namespaces/authorizationRules | [listkeys](/rest/api/eventhub) |
+| Microsoft.EventHub/namespaces/disasterRecoveryConfigs/authorizationRules | [listkeys](/rest/api/eventhub) |
+| Microsoft.EventHub/namespaces/eventhubs/authorizationRules | [listkeys](/rest/api/eventhub) |
+| Microsoft.ImportExport/jobs | [listBitLockerKeys](/rest/api/storageimportexport/bitlockerkeys/list) |
+| Microsoft.Kusto/Clusters/Databases | [ListPrincipals](/rest/api/azurerekusto/databases/listprincipals) |
+| Microsoft.LabServices/users | [ListEnvironments](/rest/api/labservices/globalusers/listenvironments) |
+| Microsoft.LabServices/users | [ListLabs](/rest/api/labservices/globalusers/listlabs) |
+| Microsoft.Logic/integrationAccounts/agreements | [listContentCallbackUrl](/rest/api/logic/agreements/listcontentcallbackurl) |
+| Microsoft.Logic/integrationAccounts/assemblies | [listContentCallbackUrl](/rest/api/logic/integrationaccountassemblies/listcontentcallbackurl) |
+| Microsoft.Logic/integrationAccounts | [listCallbackUrl](/rest/api/logic/integrationaccounts/getcallbackurl) |
+| Microsoft.Logic/integrationAccounts | [listKeyVaultKeys](/rest/api/logic/integrationaccounts/listkeyvaultkeys) |
+| Microsoft.Logic/integrationAccounts/maps | [listContentCallbackUrl](/rest/api/logic/maps/listcontentcallbackurl) |
+| Microsoft.Logic/integrationAccounts/partners | [listContentCallbackUrl](/rest/api/logic/partners/listcontentcallbackurl) |
+| Microsoft.Logic/integrationAccounts/schemas | [listContentCallbackUrl](/rest/api/logic/schemas/listcontentcallbackurl) |
+| Microsoft.Logic/workflows | [listCallbackUrl](/rest/api/logic/workflows/listcallbackurl) |
+| Microsoft.Logic/workflows | [listSwagger](/rest/api/logic/workflows/listswagger) |
+| Microsoft.Logic/workflows/runs/actions | [listExpressionTraces](/rest/api/logic/workflowrunactions/listexpressiontraces) |
+| Microsoft.Logic/workflows/runs/actions/repetitions | [listExpressionTraces](/rest/api/logic/workflowrunactionrepetitions/listexpressiontraces) |
+| Microsoft.Logic/workflows/triggers | [listCallbackUrl](/rest/api/logic/workflowtriggers/listcallbackurl) |
+| Microsoft.Logic/workflows/versions/triggers | [listCallbackUrl](/rest/api/logic/workflowversions/listcallbackurl) |
+| Microsoft.MachineLearning/webServices | [listkeys](/rest/api/machinelearning/webservices/listkeys) |
+| Microsoft.MachineLearning/Workspaces | listworkspacekeys |
+| Microsoft.MachineLearningServices/workspaces/computes | [listKeys](/rest/api/azureml/workspacesandcomputes/machinelearningcompute/listkeys) |
+| Microsoft.MachineLearningServices/workspaces/computes | [listNodes](/rest/api/azureml/workspacesandcomputes/machinelearningcompute/listnodes) |
+| Microsoft.MachineLearningServices/workspaces | [listKeys](/rest/api/azureml/workspacesandcomputes/workspaces/listkeys) |
+| Microsoft.Maps/accounts | [listKeys](/rest/api/maps-management/accounts/listkeys) |
+| Microsoft.Media/mediaservices/assets | [listContainerSas](/rest/api/media/assets/listcontainersas) |
+| Microsoft.Media/mediaservices/assets | [listStreamingLocators](/rest/api/media/assets/liststreaminglocators) |
+| Microsoft.Media/mediaservices/streamingLocators | [listContentKeys](/rest/api/media/streaminglocators/listcontentkeys) |
+| Microsoft.Media/mediaservices/streamingLocators | [listPaths](/rest/api/media/streaminglocators/listpaths) |
+| Microsoft.Network/applicationSecurityGroups | listIpConfigurations |
+| Microsoft.NotificationHubs/Namespaces/authorizationRules | [listkeys](/rest/api/notificationhubs/namespaces/listkeys) |
+| Microsoft.NotificationHubs/Namespaces/NotificationHubs/authorizationRules | [listkeys](/rest/api/notificationhubs/notificationhubs/listkeys) |
+| Microsoft.OperationalInsights/workspaces | [list](/rest/api/loganalytics/workspaces/list) |
+| Microsoft.OperationalInsights/workspaces | listKeys |
+| Microsoft.PolicyInsights/remediations | [listDeployments](/rest/api/policy/remediations/listdeploymentsatresourcegroup) |
+| Microsoft.RedHatOpenShift/openShiftClusters | [listCredentials](/rest/api/openshift/openshiftclusters/listcredentials) |
+| Microsoft.Relay/namespaces/authorizationRules | [listkeys](/rest/api/relay/namespaces/listkeys) |
+| Microsoft.Relay/namespaces/disasterRecoveryConfigs/authorizationRules | listkeys |
+| Microsoft.Relay/namespaces/HybridConnections/authorizationRules | [listkeys](/rest/api/relay/hybridconnections/listkeys) |
+| Microsoft.Relay/namespaces/WcfRelays/authorizationRules | [listkeys](/rest/api/relay/wcfrelays/listkeys) |
+| Microsoft.Search/searchServices | [listAdminKeys](/rest/api/searchmanagement/adminkeys/get) |
+| Microsoft.Search/searchServices | [listQueryKeys](/rest/api/searchmanagement/querykeys/listbysearchservice) |
+| Microsoft.ServiceBus/namespaces/authorizationRules | [listkeys](/rest/api/servicebus/stable/namespaces%20-%20authorization%20rules/listkeys) |
+| Microsoft.ServiceBus/namespaces/disasterRecoveryConfigs/authorizationRules | [listkeys](/rest/api/servicebus/stable/disasterrecoveryconfigs/listkeys) |
+| Microsoft.ServiceBus/namespaces/queues/authorizationRules | [listkeys](/rest/api/servicebus/stable/queues%20-%20authorization%20rules/listkeys) |
+| Microsoft.ServiceBus/namespaces/topics/authorizationRules | [listkeys](/rest/api/servicebus/stable/topics%20-%20authorization%20rules/listkeys) |
+| Microsoft.SignalRService/SignalR | [listkeys](/rest/api/signalr/signalr/listkeys) |
+| Microsoft.Storage/storageAccounts | [listAccountSas](/rest/api/storagerp/storageaccounts/listaccountsas) |
+| Microsoft.Storage/storageAccounts | [listkeys](/rest/api/storagerp/storageaccounts/listkeys) |
+| Microsoft.Storage/storageAccounts | [listServiceSas](/rest/api/storagerp/storageaccounts/listservicesas) |
+| Microsoft.StorSimple/managers/devices | [listFailoverSets](/rest/api/storsimple/devices/listfailoversets) |
+| Microsoft.StorSimple/managers/devices | [listFailoverTargets](/rest/api/storsimple/devices/listfailovertargets) |
+| Microsoft.StorSimple/managers | [listActivationKey](/rest/api/storsimple/managers/getactivationkey) |
+| Microsoft.StorSimple/managers | [listPublicEncryptionKey](/rest/api/storsimple/managers/getpublicencryptionkey) |
+| Microsoft.Synapse/workspaces/integrationRuntimes | [listAuthKeys](/rest/api/synapse/integrationruntimeauthkeys/list) |
+| Microsoft.Web/connectionGateways | ListStatus |
+| microsoft.web/connections | listconsentlinks |
+| Microsoft.Web/customApis | listWsdlInterfaces |
+| microsoft.web/locations | listwsdlinterfaces |
+| microsoft.web/apimanagementaccounts/apis/connections | listconnectionkeys |
+| microsoft.web/apimanagementaccounts/apis/connections | listsecrets |
+| microsoft.web/sites/backups | [list](/rest/api/appservice/webapps/listbackups) |
+| Microsoft.Web/sites/config | [list](/rest/api/appservice/webapps/listconfigurations) |
+| microsoft.web/sites/functions | [listkeys](/rest/api/appservice/webapps/listfunctionkeys) |
+| microsoft.web/sites/functions | [listsecrets](/rest/api/appservice/webapps/listfunctionsecrets) |
+| microsoft.web/sites/hybridconnectionnamespaces/relays | [listkeys](/rest/api/appservice/appserviceplans/listhybridconnectionkeys) |
+| microsoft.web/sites | [listsyncfunctiontriggerstatus](/rest/api/appservice/webapps/listsyncfunctiontriggers) |
+| microsoft.web/sites/slots/functions | [listsecrets](/rest/api/appservice/webapps/listfunctionsecretsslot) |
+| microsoft.web/sites/slots/backups | [list](/rest/api/appservice/webapps/listbackupsslot) |
+| Microsoft.Web/sites/slots/config | [list](/rest/api/appservice/webapps/listconfigurationsslot) |
+
+To determine which resource types have a list operation, you have the following options:
+
+* View the [REST API operations](/rest/api/) for a resource provider, and look for list operations. For example, storage accounts have the [listKeys operation](/rest/api/storagerp/storageaccounts).
+* Use the [Get-AzProviderOperation](/powershell/module/az.resources/get-azprovideroperation) PowerShell cmdlet. The following example gets all list operations for storage accounts:
+
+ ```powershell
+ Get-AzProviderOperation -OperationSearchString "Microsoft.Storage/*" | where {$_.Operation -like "*list*"} | FT Operation
+ ```
+
+* Use the following Azure CLI command to filter only the list operations:
+
+ ```azurecli
+ az provider operation show --namespace Microsoft.Storage --query "resourceTypes[?name=='storageAccounts'].operations[].name | [?contains(@, 'list')]"
+ ```
+
+### Return value
+
+The returned object varies by the list function you use. For example, the listKeys for a storage account returns the following format:
+
+```json
+{
+ "keys": [
+ {
+ "keyName": "key1",
+ "permissions": "Full",
+ "value": "{value}"
+ },
+ {
+ "keyName": "key2",
+ "permissions": "Full",
+ "value": "{value}"
+ }
+ ]
+}
+```
+
+Other list functions have different return formats. To see the format of a function, include it in the outputs section as shown in the example Bicep file.
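+
+For example, the following sketch surfaces the full result of listKeys in an output so you can inspect its shape after deployment (the storage account name is a hypothetical parameter):
+
+```bicep
+param storageAccountName string
+
+// Expose the entire listKeys result to see the format the function returns.
+output listKeysResult object = listKeys(resourceId('Microsoft.Storage/storageAccounts', storageAccountName), '2019-06-01')
+```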
+
+### Remarks
+
+Specify the resource by using either the resource name or the [resourceId function](#resourceid). When using a list function in the same Bicep file that deploys the referenced resource, use the resource name.
+
+If you use a **list** function in a resource that is conditionally deployed, the function is evaluated even if the resource isn't deployed. You get an error if the **list** function refers to a resource that doesn't exist. Use the [conditional expression **?:** operator](./operators-logical.md#conditional-expression--) to make sure the function is only evaluated when the resource is being deployed.
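+
+For example, the following sketch (with hypothetical names) guards a listKeys call so it's only evaluated when the storage account is actually deployed:
+
+```bicep
+param deployStorage bool = true
+param storageAccountName string
+
+resource stg 'Microsoft.Storage/storageAccounts@2019-06-01' = if (deployStorage) {
+  name: storageAccountName
+  location: 'eastus'
+  kind: 'StorageV2'
+  sku: {
+    name: 'Standard_LRS'
+  }
+}
+
+// The conditional expression keeps listKeys from being evaluated when the account isn't deployed.
+output storageKey string = deployStorage ? listKeys(stg.id, '2019-06-01').keys[0].value : ''
+```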
+
+### List example
+
+The following example uses listKeys when setting a value for [deployment scripts](../templates/deployment-script-template.md).
+
+```bicep
+storageAccountSettings: {
+ storageAccountName: storageAccountName
+ storageAccountKey: listKeys(resourceId('Microsoft.Storage/storageAccounts', storageAccountName), '2019-06-01').keys[0].value
+}
+```
+
+The next example shows a list function that takes a parameter. In this case, the function is **listAccountSas**. Pass an object with the SAS properties, including an expiry time that's in the future.
+
+```bicep
+param accountSasProperties object {
+  default: {
+    signedServices: 'b'
+    signedPermission: 'r'
+    signedExpiry: '2020-08-20T11:00:00Z'
+    signedResourceTypes: 's'
+  }
+}
+...
+sasToken: listAccountSas(storagename, '2018-02-01', accountSasProperties).accountSasToken
+```
+
+## pickZones
+
+`pickZones(providerNamespace, resourceType, location, [numberOfZones], [offset])`
+
+Determines whether a resource type supports zones for a region.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| providerNamespace | Yes | string | The resource provider namespace for the resource type to check for zone support. |
+| resourceType | Yes | string | The resource type to check for zone support. |
+| location | Yes | string | The region to check for zone support. |
+| numberOfZones | No | integer | The number of logical zones to return. The default is 1. The number must be a positive integer from 1 to 3. Use 1 for single-zoned resources. For multi-zoned resources, the value must be less than or equal to the number of supported zones. |
+| offset | No | integer | The offset from the starting logical zone. The function returns an error if offset plus numberOfZones exceeds the number of supported zones. |
+
+### Return value
+
+An array with the supported zones. When using the default values for offset and numberOfZones, a resource type and region that supports zones returns the following array:
+
+```json
+[
+ "1"
+]
+```
+
+When the `numberOfZones` parameter is set to 3, it returns:
+
+```json
+[
+ "1",
+ "2",
+ "3"
+]
+```
+
+When the resource type or region doesn't support zones, an empty array is returned.
+
+```json
+[
+]
+```
+
+### pickZones example
+
+The following Bicep file shows three results for using the pickZones function.
+
+```bicep
+output supported array = pickZones('Microsoft.Compute', 'virtualMachines', 'westus2')
+output notSupportedRegion array = pickZones('Microsoft.Compute', 'virtualMachines', 'northcentralus')
+output notSupportedType array = pickZones('Microsoft.Cdn', 'profiles', 'westus2')
+```
+
+The output from the preceding example returns three arrays.
+
+| Name | Type | Value |
+| - | - | -- |
+| supported | array | [ "1" ] |
+| notSupportedRegion | array | [] |
+| notSupportedType | array | [] |
+
+You can use the response from pickZones to determine whether to provide null for zones or assign virtual machines to different zones.
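+
+For example, the following sketch (parameter names are hypothetical) checks the pickZones result so you can decide whether to assign zones at all:
+
+```bicep
+param location string = resourceGroup().location
+
+// An empty result means zones aren't supported for this resource type in this region.
+var vmZones = pickZones('Microsoft.Compute', 'virtualMachines', location)
+
+output zonesToUse array = vmZones
+output zonesSupported bool = !empty(vmZones)
+```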
+
+## reference
+
+`reference(resourceName or resourceIdentifier, [apiVersion], ['Full'])`
+
+Returns an object representing a resource's runtime state.
+
+The reference function is available in Bicep files, but typically you don't need it. Instead, use the symbolic name for the resource.
+
+The following example deploys a storage account. It uses the symbolic name `stg` for the storage account to return a property.
+
+```bicep
+param storageAccountName string
+
+resource stg 'Microsoft.Storage/storageAccounts@2019-06-01' = {
+ name: storageAccountName
+ location: 'eastus'
+ kind: 'Storage'
+ sku: {
+ name: 'Standard_LRS'
+ }
+}
+
+output storageEndpoint object = stg.properties.primaryEndpoints
+```
+
+To get a property from an existing resource that isn't deployed in the template, use the `existing` keyword:
+
+```bicep
+param storageAccountName string
+
+resource stg 'Microsoft.Storage/storageAccounts@2019-06-01' existing = {
+ name: storageAccountName
+}
+
+// use later in template as often as needed
+output blobAddress string = stg.properties.primaryEndpoints.blob
+```
+
+For more information, see [Reference resources](./compare-template-syntax.md#reference-resources) and the [JSON template reference function](../templates/template-functions-resource.md#reference).
+
+## resourceId
+
+`resourceId([subscriptionId], [resourceGroupName], resourceType, resourceName1, [resourceName2], ...)`
+
+Returns the unique identifier of a resource.
+
+The resourceId function is available in Bicep files, but typically you don't need it. Instead, use the symbolic name for the resource and access the `id` property.
+
+You use this function when the resource name is ambiguous or not provisioned within the same Bicep file. The format of the returned identifier varies based on whether the deployment happens at the scope of a resource group, subscription, management group, or tenant.
+
+For example:
+
+```bicep
+param storageAccountName string
+
+resource stg 'Microsoft.Storage/storageAccounts@2019-06-01' = {
+ name: storageAccountName
+ location: 'eastus'
+ kind: 'Storage'
+ sku: {
+ name: 'Standard_LRS'
+ }
+}
+
+output storageID string = stg.id
+```
+
+To get the resource ID for a resource that isn't deployed in the Bicep file, use the existing keyword.
+
+```bicep
+param storageAccountName string
+
+resource stg 'Microsoft.Storage/storageAccounts@2019-06-01' existing = {
+ name: storageAccountName
+}
+
+output storageID string = stg.id
+```
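+
+When the resource is in a different resource group, pass the optional scope arguments shown in the function's signature. A sketch, with hypothetical parameter names:
+
+```bicep
+param otherResourceGroup string
+param storageAccountName string
+
+// Full form: a subscription ID can also be passed as the first argument when needed.
+output remoteStorageID string = resourceId(otherResourceGroup, 'Microsoft.Storage/storageAccounts', storageAccountName)
+```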
+
+For more information, see the [JSON template resourceId function](../templates/template-functions-resource.md#resourceid).
+
+## subscriptionResourceId
+
+`subscriptionResourceId([subscriptionId], resourceType, resourceName1, [resourceName2], ...)`
+
+Returns the unique identifier for a resource deployed at the subscription level.
+
+The subscriptionResourceId function is available in Bicep files, but typically you don't need it. Instead, use the symbolic name for the resource and access the `id` property.
+
+The identifier is returned in the following format:
+
+```json
+/subscriptions/{subscriptionId}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}
+```
+
+### Remarks
+
+You use this function to get the resource ID for resources that are [deployed to the subscription](deploy-to-subscription.md) rather than a resource group. The returned ID differs from the value returned by the [resourceId](#resourceid) function by not including a resource group value.
+
+### subscriptionResourceID example
+
+The following Bicep file assigns a built-in role. You can deploy it to either a resource group or subscription. It uses the subscriptionResourceId function to get the resource ID for built-in roles.
+
+```bicep
+param principalId string {
+ metadata: {
+ 'description': 'principalId'
+ }
+}
+param builtInRoleType string {
+ 'allowed': [
+ 'Owner'
+ 'Contributor'
+ 'Reader'
+ ]
+ 'metadata': {
+ 'description': 'Built-in role to assign'
+ }
+}
+param roleNameGuid string {
+ default: newGuid()
+ metadata: {
+ 'description': 'A new GUID used to identify the role assignment'
+ }
+}
+
+var roleDefinitionId = {
+ Owner: {
+ id: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', '8e3af657-a8ff-443c-a75c-2fe8c4bcb635')
+ }
+ Contributor: {
+ id: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', 'b24988ac-6180-42a0-ab88-20f7382dd24c')
+ }
+ Reader: {
+ id: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', 'acdd72a7-3385-48ef-bd42-f606fba81ae7')
+ }
+}
+
+resource myRoleAssignment 'Microsoft.Authorization/roleAssignments@2018-09-01-preview' = {
+ name: roleNameGuid
+ properties: {
+ roleDefinitionId: roleDefinitionId[builtInRoleType].id
+ principalId: principalId
+ }
+}
+```
+
+## tenantResourceId
+
+`tenantResourceId(resourceType, resourceName1, [resourceName2], ...)`
+
+Returns the unique identifier for a resource deployed at the tenant level.
+
+The tenantResourceId function is available in Bicep files, but typically you don't need it. Instead, use the symbolic name for the resource and access the `id` property.
+
+The identifier is returned in the following format:
+
+```json
+/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}
+```
+
+Built-in policy definitions are tenant level resources. To deploy a policy assignment that references a built-in policy definition, use the tenantResourceId function.
+
+```bicep
+param policyDefinitionID string {
+ default: '0a914e76-4921-4c19-b460-a2d36003525a'
+ metadata: {
+ 'description': 'Specifies the ID of the policy definition or policy set definition being assigned.'
+ }
+}
+
+param policyAssignmentName string {
+ default: guid(policyDefinitionID, resourceGroup().name)
+ metadata: {
+ 'description': 'Specifies the name of the policy assignment. It can be user-defined, or you can use the idempotent name that the defaultValue provides.'
+ }
+}
+
+resource myPolicyAssignment 'Microsoft.Authorization/policyAssignments@2019-09-01' = {
+ name: policyAssignmentName
+ properties: {
+ scope: subscriptionResourceId('Microsoft.Resources/resourceGroups', resourceGroup().name)
+ policyDefinitionId: tenantResourceId('Microsoft.Authorization/policyDefinitions', policyDefinitionID)
+ }
+}
+```
+
+## Next steps
+
+* For a description of the sections in a Bicep file, see [Understand the structure and syntax of Bicep files](./file.md).
+* To iterate a specified number of times when creating a type of resource, see [Deploy multiple instances of resources in Bicep](./loop-resources.md).
+* To see how to deploy the Bicep file you've created, see [Deploy resources with Bicep and Azure PowerShell](./deploy-powershell.md).
azure-resource-manager Bicep Functions Scope https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-scope.md
+
+ Title: Bicep functions - scopes
+description: Describes the functions to use in a Bicep file to retrieve values about deployment scopes.
+ Last updated: 06/01/2021
+# Scope functions for Bicep
+
+Resource Manager provides the following functions for getting scope values in your Bicep file:
+
+* [managementGroup](#managementgroup)
+* [resourceGroup](#resourcegroup)
+* [subscription](#subscription)
+* [tenant](#tenant)
+
+## managementGroup
+
+`managementGroup()`
+
+`managementGroup(name)`
+
+Returns an object used for setting the scope to a management group.
+
+### Remarks
+
+`managementGroup()` can only be used in [management group deployments](deploy-to-management-group.md). It returns the current management group for the deployment operation.
+
+`managementGroup(name)` can be used for any deployment scope.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| name |No |string |The unique identifier for the management group to deploy to. Don't use the display name for the management group. If you don't provide a value, the current management group is returned. |
+
+### Return value
+
+An object used for setting the `scope` property on a [module](modules.md#configure-module-scopes) or [extension resource type](scope-extension-resources.md).
+
+### Management group example
+
+The following example sets the scope for a module to a management group.
+
+```bicep
+param managementGroupName string
+
+module exampleModule 'module.bicep' = {
+ name: 'deployToMG'
+ scope: managementGroup(managementGroupName)
+}
+```
+
+## resourceGroup
+
+`resourceGroup()`
+
+`resourceGroup(resourceGroupName)`
+
+`resourceGroup(subscriptionId, resourceGroupName)`
+
+Returns an object used for setting the scope to a resource group.
+
+Or
+
+Returns an object that represents the current resource group.
+
+### Remarks
+
+The resourceGroup function has two distinct uses. One usage is for setting the scope on a [module](modules.md#configure-module-scopes) or [extension resource type](scope-extension-resources.md). The other usage is for getting details about the current resource group. The placement of the function determines its usage. When used to set the `scope` property, it returns a scope object.
+
+`resourceGroup()` can be used for either setting scope or getting details about the resource group.
+
+`resourceGroup(resourceGroupName)` and `resourceGroup(subscriptionId, resourceGroupName)` can only be used for setting scope.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| resourceGroupName |No |string | The name of the resource group to deploy to. If you don't provide a value, the current resource group is returned. |
+| subscriptionId |No |string |The unique identifier for the subscription to deploy to. If you don't provide a value, the current subscription is returned. |
+
+### Return value
+
+When used for setting scope, the function returns an object that is valid for the `scope` property on a module or extension resource type.
+
+When used for getting details about the resource group, the function returns the following format:
+
+```json
+{
+ "id": "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}",
+ "name": "{resourceGroupName}",
+ "type":"Microsoft.Resources/resourceGroups",
+ "location": "{resourceGroupLocation}",
+ "managedBy": "{identifier-of-managing-resource}",
+ "tags": {
+ },
+ "properties": {
+ "provisioningState": "{status}"
+ }
+}
+```
+
+The **managedBy** property is returned only for resource groups that contain resources that are managed by another service. For Managed Applications, Databricks, and AKS, the value of the property is the resource ID of the managing resource.
+
+### Resource group example
+
+The following example scopes a module to a resource group.
+
+```bicep
+param resourceGroupName string
+
+module exampleModule 'module.bicep' = {
+ name: 'exampleModule'
+ scope: resourceGroup(resourceGroupName)
+}
+```
+
+The next example returns the properties of the resource group.
+
+```bicep
+output resourceGroupOutput object = resourceGroup()
+```
+
+It returns an object in the following format:
+
+```json
+{
+ "id": "/subscriptions/{subscription-id}/resourceGroups/examplegroup",
+ "name": "examplegroup",
+ "type":"Microsoft.Resources/resourceGroups",
+ "location": "southcentralus",
+ "properties": {
+ "provisioningState": "Succeeded"
+ }
+}
+```
+
+A common use of the resourceGroup function is to create resources in the same location as the resource group. The following example uses the resource group location for a default parameter value.
+
+```bicep
+param location string = resourceGroup().location
+```
+
+You can also use the resourceGroup function to apply tags from the resource group to a resource. For more information, see [Apply tags from resource group](../management/tag-resources.md#apply-tags-from-resource-group).
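+
+A minimal sketch of that pattern, assuming a hypothetical storage account:
+
+```bicep
+param storageAccountName string
+
+resource stg 'Microsoft.Storage/storageAccounts@2019-06-01' = {
+  name: storageAccountName
+  location: resourceGroup().location
+  kind: 'StorageV2'
+  sku: {
+    name: 'Standard_LRS'
+  }
+  // Copy the containing resource group's tags onto the resource.
+  tags: resourceGroup().tags
+}
+```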
+
+## subscription
+
+`subscription()`
+
+`subscription(subscriptionId)`
+
+Returns an object used for setting the scope to a subscription.
+
+Or
+
+Returns details about the subscription for the current deployment.
+
+### Remarks
+
+The subscription function has two distinct uses. One usage is for setting the scope on a [module](modules.md#configure-module-scopes) or [extension resource type](scope-extension-resources.md). The other usage is for getting details about the current subscription. The placement of the function determines its usage. When used to set the `scope` property, it returns a scope object.
+
+`subscription(subscriptionId)` can only be used for setting scope.
+
+`subscription()` can be used for setting scope or getting details about the subscription.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| subscriptionId |No |string |The unique identifier for the subscription to deploy to. If you don't provide a value, the current subscription is returned. |
+
+### Return value
+
+When used for setting scope, the function returns an object that is valid for the `scope` property on a module or extension resource type.
+
+When used for getting details about the subscription, the function returns the following format:
+
+```json
+{
+ "id": "/subscriptions/{subscription-id}",
+ "subscriptionId": "{subscription-id}",
+ "tenantId": "{tenant-id}",
+ "displayName": "{name-of-subscription}"
+}
+```
+
+### Subscription example
+
+The following example scopes a module to the subscription.
+
+```bicep
+module exampleModule 'module.bicep' = {
+ name: 'deployToSub'
+ scope: subscription()
+}
+```
+
+The next example returns the details for a subscription.
+
+```bicep
+output subscriptionOutput object = subscription()
+```
+
+## tenant
+
+`tenant()`
+
+Returns an object used for setting the scope to the tenant.
+
+### Remarks
+
+`tenant()` can be used with any deployment scope. It always returns the current tenant.
+
+### Return value
+
+An object used for setting the `scope` property on a [module](modules.md#configure-module-scopes) or [extension resource type](scope-extension-resources.md).
+
+### Tenant example
+
+The following example shows a module deployed to the tenant.
+
+```bicep
+module exampleModule 'module.bicep' = {
+ name: 'deployToTenant'
+ scope: tenant()
+}
+```
+
+## Next steps
+
+To learn more about deployment scopes, see:
+
+* [Resource group deployments](deploy-to-resource-group.md)
+* [Subscription deployments](deploy-to-subscription.md)
+* [Management group deployments](deploy-to-management-group.md)
+* [Tenant deployments](deploy-to-tenant.md)
azure-resource-manager Bicep Functions String https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/bicep-functions-string.md
+
+ Title: Bicep functions - string
+description: Describes the functions to use in a Bicep file to work with strings.
+ Last updated: 06/01/2021
+# String functions for Bicep
+
+Resource Manager provides the following functions for working with strings in your Bicep file:
+
+* [base64](#base64)
+* [base64ToJson](#base64tojson)
+* [base64ToString](#base64tostring)
+* [concat](#concat)
+* [contains](#contains)
+* [dataUri](#datauri)
+* [dataUriToString](#datauritostring)
+* [empty](#empty)
+* [endsWith](#endswith)
+* [first](#first)
+* [format](#format)
+* [guid](#guid)
+* [indexOf](#indexof)
+* [json](#json)
+* [last](#last)
+* [lastIndexOf](#lastindexof)
+* [length](#length)
+* [newGuid](#newguid)
+* [padLeft](#padleft)
+* [replace](#replace)
+* [skip](#skip)
+* [split](#split)
+* [startsWith](#startswith)
+* [string](#string)
+* [substring](#substring)
+* [take](#take)
+* [toLower](#tolower)
+* [toUpper](#toupper)
+* [trim](#trim)
+* [uniqueString](#uniquestring)
+* [uri](#uri)
+* [uriComponent](#uricomponent)
+* [uriComponentToString](#uricomponenttostring)
+
+## base64
+
+`base64(inputString)`
+
+Returns the base64 representation of the input string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| inputString |Yes |string |The value to return as a base64 representation. |
+
+### Return value
+
+A string containing the base64 representation.
+
+### Examples
+
+The following example shows how to use the base64 function.
+
+```bicep
+param stringData string = 'one, two, three'
+param jsonFormattedData string = '{\'one\': \'a\', \'two\': \'b\'}'
+
+var base64String = base64(stringData)
+var base64Object = base64(jsonFormattedData)
+
+output base64Output string = base64String
+output toStringOutput string = base64ToString(base64String)
+output toJsonOutput object = base64ToJson(base64Object)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| base64Output | String | b25lLCB0d28sIHRocmVl |
+| toStringOutput | String | one, two, three |
+| toJsonOutput | Object | {"one": "a", "two": "b"} |
+
+## base64ToJson
+
+`base64ToJson(base64Value)`
+
+Converts a base64 representation to a JSON object.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| base64Value |Yes |string |The base64 representation to convert to a JSON object. |
+
+### Return value
+
+A JSON object.
+
+### Examples
+
+The following example uses the base64ToJson function to convert a base64 value:
+
+```bicep
+param stringData string = 'one, two, three'
+param jsonFormattedData string = '{\'one\': \'a\', \'two\': \'b\'}'
+
+var base64String = base64(stringData)
+var base64Object = base64(jsonFormattedData)
+
+output base64Output string = base64String
+output toStringOutput string = base64ToString(base64String)
+output toJsonOutput object = base64ToJson(base64Object)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| base64Output | String | b25lLCB0d28sIHRocmVl |
+| toStringOutput | String | one, two, three |
+| toJsonOutput | Object | {"one": "a", "two": "b"} |
+
+## base64ToString
+
+`base64ToString(base64Value)`
+
+Converts a base64 representation to a string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| base64Value |Yes |string |The base64 representation to convert to a string. |
+
+### Return value
+
+A string of the converted base64 value.
+
+### Examples
+
+The following example uses the base64ToString function to convert a base64 value:
+
+```bicep
+param stringData string = 'one, two, three'
+param jsonFormattedData string = '{\'one\': \'a\', \'two\': \'b\'}'
+
+var base64String = base64(stringData)
+var base64Object = base64(jsonFormattedData)
+
+output base64Output string = base64String
+output toStringOutput string = base64ToString(base64String)
+output toJsonOutput object = base64ToJson(base64Object)
+```
+++
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| base64Output | String | b25lLCB0d28sIHRocmVl |
+| toStringOutput | String | one, two, three |
+| toJsonOutput | Object | {"one": "a", "two": "b"} |
+
+## concat
+
+Instead of using the concat function, use string interpolation.
+
+```bicep
+param prefix string = 'prefix'
+
+output concatOutput string = '${prefix}And${uniqueString(resourceGroup().id)}'
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| concatOutput | String | prefixAnd5yj4yjf5mbg72 |
+
+## contains
+
+`contains(container, itemToFind)`
+
+Checks whether an array contains a value, an object contains a key, or a string contains a substring. The string comparison is case-sensitive. However, when testing if an object contains a key, the comparison is case-insensitive.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| container |Yes |array, object, or string |The value that contains the value to find. |
+| itemToFind |Yes |string or int |The value to find. |
+
+### Return value
+
+**True** if the item is found; otherwise, **False**.
+
+### Examples
+
+The following example shows how to use contains with different types:
+
+```bicep
+param stringToTest string = 'OneTwoThree'
+param objectToTest object = {
+ 'one': 'a'
+ 'two': 'b'
+ 'three': 'c'
+}
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+output stringTrue bool = contains(stringToTest, 'e')
+output stringFalse bool = contains(stringToTest, 'z')
+output objectTrue bool = contains(objectToTest, 'one')
+output objectFalse bool = contains(objectToTest, 'a')
+output arrayTrue bool = contains(arrayToTest, 'three')
+output arrayFalse bool = contains(arrayToTest, 'four')
+```
+++
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| stringTrue | Bool | True |
+| stringFalse | Bool | False |
+| objectTrue | Bool | True |
+| objectFalse | Bool | False |
+| arrayTrue | Bool | True |
+| arrayFalse | Bool | False |
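+
+To illustrate the case-insensitive key comparison noted above, the following sketch returns true even though the casing of the key differs:
+
+```bicep
+var caseObject = {
+  'one': 'a'
+}
+
+// Object key lookup ignores case, so 'ONE' matches the key 'one'.
+output keyCaseInsensitive bool = contains(caseObject, 'ONE')
+```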
+
+## dataUri
+
+`dataUri(stringToConvert)`
+
+Converts a value to a data URI.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| stringToConvert |Yes |string |The value to convert to a data URI. |
+
+### Return value
+
+A string formatted as a data URI.
+
+### Examples
+
+The following example converts a value to a data URI, and converts a data URI to a string:
+
+```bicep
+param stringToTest string = 'Hello'
+param dataFormattedString string = 'data:;base64,SGVsbG8sIFdvcmxkIQ=='
+
+output dataUriOutput string = dataUri(stringToTest)
+output toStringOutput string = dataUriToString(dataFormattedString)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| dataUriOutput | String | data:text/plain;charset=utf8;base64,SGVsbG8= |
+| toStringOutput | String | Hello, World! |
+
+## dataUriToString
+
+`dataUriToString(dataUriToConvert)`
+
+Converts a data URI formatted value to a string.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| dataUriToConvert |Yes |string |The data URI value to convert. |
+
+### Return value
+
+A string containing the converted value.
+
+### Examples
+
+The following example converts a value to a data URI, and converts a data URI to a string:
+
+```bicep
+param stringToTest string = 'Hello'
+param dataFormattedString string = 'data:;base64,SGVsbG8sIFdvcmxkIQ=='
+
+output dataUriOutput string = dataUri(stringToTest)
+output toStringOutput string = dataUriToString(dataFormattedString)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| dataUriOutput | String | data:text/plain;charset=utf8;base64,SGVsbG8= |
+| toStringOutput | String | Hello, World! |
+
+## empty
+
+`empty(itemToTest)`
+
+Determines if an array, object, or string is empty.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| itemToTest |Yes |array, object, or string |The value to check if it's empty. |
+
+### Return value
+
+Returns **True** if the value is empty; otherwise, **False**.
+
+### Examples
+
+The following example checks whether an array, object, and string are empty.
+
+```bicep
+param testArray array = []
+param testObject object = {}
+param testString string = ''
+
+output arrayEmpty bool = empty(testArray)
+output objectEmpty bool = empty(testObject)
+output stringEmpty bool = empty(testString)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayEmpty | Bool | True |
+| objectEmpty | Bool | True |
+| stringEmpty | Bool | True |
+
+## endsWith
+
+`endsWith(stringToSearch, stringToFind)`
+
+Determines whether a string ends with a value. The comparison is case-insensitive.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| stringToSearch |Yes |string |The value that contains the item to find. |
+| stringToFind |Yes |string |The value to find. |
+
+### Return value
+
+**True** if the last character or characters of the string match the value; otherwise, **False**.
+
+### Examples
+
+The following example shows how to use the startsWith and endsWith functions:
+
+```bicep
+output startsTrue bool = startsWith('abcdef', 'ab')
+output startsCapTrue bool = startsWith('abcdef', 'A')
+output startsFalse bool = startsWith('abcdef', 'e')
+output endsTrue bool = endsWith('abcdef', 'ef')
+output endsCapTrue bool = endsWith('abcdef', 'F')
+output endsFalse bool = endsWith('abcdef', 'e')
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| startsTrue | Bool | True |
+| startsCapTrue | Bool | True |
+| startsFalse | Bool | False |
+| endsTrue | Bool | True |
+| endsCapTrue | Bool | True |
+| endsFalse | Bool | False |
+
+## first
+
+`first(arg1)`
+
+Returns the first character of the string, or first element of the array.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| arg1 |Yes |array or string |The array or string from which to retrieve the first element or character. |
+
+### Return value
+
+A string of the first character, or the type (string, int, array, or object) of the first element in an array.
+
+### Examples
+
+The following example shows how to use the first function with an array and string.
+
+```bicep
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+output arrayOutput string = first(arrayToTest)
+output stringOutput string = first('One Two Three')
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | String | one |
+| stringOutput | String | O |
+
+## format
+
+`format(formatString, arg1, arg2, ...)`
+
+Creates a formatted string from input values.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| formatString | Yes | string | The composite format string. |
+| arg1 | Yes | string, integer, or boolean | The value to include in the formatted string. |
+| additional arguments | No | string, integer, or boolean | Additional values to include in the formatted string. |
+
+### Remarks
+
+Use this function to format a string in your Bicep file. It uses the same formatting options as the [System.String.Format](/dotnet/api/system.string.format) method in .NET.
+
+### Examples
+
+The following example shows how to use the format function.
+
+```bicep
+param greeting string = 'Hello'
+param name string = 'User'
+param numberToFormat int = 8175133
+
+output formatTest string = format('{0}, {1}. Formatted number: {2:N0}', greeting, name, numberToFormat)
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| formatTest | String | Hello, User. Formatted number: 8,175,133 |
+
+## guid
+
+`guid(baseString, ...)`
+
+Creates a value in the format of a globally unique identifier based on the values provided as parameters.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|: |: |: |: |
+| baseString |Yes |string |The value used in the hash function to create the GUID. |
+| additional parameters as needed |No |string |You can add as many strings as needed to create the value that specifies the level of uniqueness. |
+
+### Remarks
+
+This function is helpful when you need to create a value in the format of a globally unique identifier. You provide parameter values that limit the scope of uniqueness for the result. You can specify whether the name is unique down to subscription, resource group, or deployment.
+
+The returned value isn't a random string, but rather the result of a hash function on the parameters. The returned value is 36 characters long. It isn't globally unique. To create a new GUID that isn't based on the hash value of the parameters, use the [newGuid](#newguid) function.
+
+The following examples show how to use guid to create a unique value for commonly used levels.
+
+Unique scoped to subscription
+
+```bicep
+guid(subscription().subscriptionId)
+```
+
+Unique scoped to resource group
+
+```bicep
+guid(resourceGroup().id)
+```
+
+Unique scoped to deployment for a resource group
+
+```bicep
+guid(resourceGroup().id, deployment().name)
+```
+
+### Return value
+
+A string containing 36 characters in the format of a globally unique identifier.
+
+### Examples
+
+The following example returns results from guid:
+
+```bicep
+output guidPerSubscription string = guid(subscription().subscriptionId)
+output guidPerResourceGroup string = guid(resourceGroup().id)
+output guidPerDeployment string = guid(resourceGroup().id, deployment().name)
+```
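+
+One common use of this determinism is giving a resource a stable, repeatable name. The following sketch (the parameter names and API version are illustrative, not taken from this article) names a role assignment from its scope and inputs, so redeploying with the same values reuses the same name:
+
+```bicep
+// Hypothetical parameters for illustration.
+param principalId string
+param roleDefinitionId string
+
+resource roleAssignment 'Microsoft.Authorization/roleAssignments@2020-04-01-preview' = {
+  // The same inputs always hash to the same name, so the deployment is idempotent.
+  name: guid(resourceGroup().id, principalId, roleDefinitionId)
+  properties: {
+    principalId: principalId
+    roleDefinitionId: roleDefinitionId
+  }
+}
+```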
+
+## indexOf
+
+`indexOf(stringToSearch, stringToFind)`
+
+Returns the first position of a value within a string. The comparison is case-insensitive.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| stringToSearch |Yes |string |The value that contains the item to find. |
+| stringToFind |Yes |string |The value to find. |
+
+### Return value
+
+An integer that represents the first position of the item to find. The value is zero-based. If the item isn't found, -1 is returned.
+
+### Examples
+
+The following example shows how to use the indexOf and lastIndexOf functions:
+
+```bicep
+output firstT int = indexOf('test', 't')
+output lastT int = lastIndexOf('test', 't')
+output firstString int = indexOf('abcdef', 'CD')
+output lastString int = lastIndexOf('abcdef', 'AB')
+output notFound int = indexOf('abcdef', 'z')
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| firstT | Int | 0 |
+| lastT | Int | 3 |
+| firstString | Int | 2 |
+| lastString | Int | 0 |
+| notFound | Int | -1 |
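+
+Because the returned position is zero-based, it combines naturally with the `substring` function. The following sketch (the parameter value is illustrative) takes everything after the first delimiter in a string:
+
+```bicep
+param setting string = 'host=example;port=5432'
+
+// Find the delimiter, then take the rest of the string after it.
+var separator = indexOf(setting, ';')
+output afterSeparator string = substring(setting, separator + 1)
+```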
+
+<a id="json"></a>
+
+## json
+
+`json(arg1)`
+
+Converts a valid JSON string into a JSON data type. For more information, see [json function](./bicep-functions-object.md#json).
+
+## last
+
+`last(arg1)`
+
+Returns the last character of the string, or the last element of the array.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| arg1 |Yes |array or string |The array or string from which to retrieve the last element or character. |
+
+### Return value
+
+A string of the last character, or the type (string, int, array, or object) of the last element in an array.
+
+### Examples
+
+The following example shows how to use the last function with an array and string.
+
+```bicep
+param arrayToTest array = [
+ 'one'
+ 'two'
+ 'three'
+]
+
+output arrayOutput string = last(arrayToTest)
+output stringOutput string = last('One Two Three')
+```
+
+The output from the preceding example with the default values is:
+
+| Name | Type | Value |
+| - | - | -- |
+| arrayOutput | String | three |
+| stringOutput | String | e |
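+
+A common pattern (the parameter value here is illustrative) combines `last` with the `split` function to take the final segment of a delimited value, such as the name at the end of a resource path:
+
+```bicep
+param resourcePath string = 'Microsoft.Storage/storageAccounts/mystorage'
+
+// split produces an array of segments; last returns the final one.
+output resourceName string = last(split(resourcePath, '/'))
+```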
+
+## lastIndexOf
+
+`lastIndexOf(stringToSearch, stringToFind)`
+
+Returns the last position of a value within a string. The comparison is case-insensitive.
+
+### Parameters
+
+| Parameter | Required | Type | Description |
+|:--- |:--- |:--- |:--- |
+| stringToSearch |Yes |string |The value that contains the item to find. |
+| stringToFind |Yes |string |The value to find. |
+
+### Return value
+
+An integer that represents the last position of the item to find. The value is zero-based. If the item isn't found, -1 is returned.
+
+### Examples
+
+The following example shows how to use the indexOf and lastIndexOf functions:
+
+```bicep
+output firstT int = indexOf('test', 't')
+output lastT int = lastIndexOf('test', 't')
+output firstString int = indexOf('abcdef', 'CD')
+output lastString int = lastIndexOf('abcdef', 'AB')
+output notFound int = indexOf('abcdef', 'z')
+```
+
+The