Updates from: 10/29/2022 01:10:35
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Custom Email Mailjet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/custom-email-mailjet.md
Add the following technical profiles to the `<ClaimsProviders>` element.
<Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.OneTimePasswordProtocolProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
<Metadata>
  <Item Key="Operation">GenerateCode</Item>
- <Item Key="CodeExpirationInSeconds">1200</Item>
+ <Item Key="CodeExpirationInSeconds">600</Item>
  <Item Key="CodeLength">6</Item>
  <Item Key="CharacterSet">0-9</Item>
- <Item Key="ReuseSameCode">true</Item>
<Item Key="NumRetryAttempts">5</Item>
+ <Item Key="NumCodeGenerationAttempts">10</Item>
+ <Item Key="ReuseSameCode">false</Item>
</Metadata>
<InputClaims>
  <InputClaim ClaimTypeReferenceId="email" PartnerClaimType="identifier" />
active-directory-b2c Custom Email Sendgrid https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/custom-email-sendgrid.md
Add the following technical profiles to the `<ClaimsProviders>` element.
<Protocol Name="Proprietary" Handler="Web.TPEngine.Providers.OneTimePasswordProtocolProvider, Web.TPEngine, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
<Metadata>
  <Item Key="Operation">GenerateCode</Item>
- <Item Key="CodeExpirationInSeconds">1200</Item>
+ <Item Key="CodeExpirationInSeconds">600</Item>
  <Item Key="CodeLength">6</Item>
  <Item Key="CharacterSet">0-9</Item>
- <Item Key="ReuseSameCode">true</Item>
<Item Key="NumRetryAttempts">5</Item>
+ <Item Key="NumCodeGenerationAttempts">10</Item>
+ <Item Key="ReuseSameCode">false</Item>
</Metadata>
<InputClaims>
  <InputClaim ClaimTypeReferenceId="email" PartnerClaimType="identifier" />
active-directory-b2c Error Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/error-codes.md
Previously updated : 07/18/2022 Last updated : 10/28/2022
The following errors can be returned by the Azure Active Directory B2C service.
| `AADB2C90016` | The requested client assertion type '{0}' does not match the expected type '{1}'. | deprecated |
| `AADB2C90017` | The client assertion provided in the request is invalid: {0} | deprecated |
| `AADB2C90018` | The client ID '{0}' specified in the request is not registered in tenant '{1}'. | [Register a web application](tutorial-register-applications.md), [Sending authentication requests](openid-connect.md#send-authentication-requests) |
-| `AADB2C90019` | The key container with ID '{0}' in tenant '{1}' does not has a valid key. Reason: {2}. | |
+| `AADB2C90019` | The key container with ID '{0}' in tenant '{1}' does not have a valid key. Reason: {2}. | |
| `AADB2C90021` | The technical profile '{0}' does not exist in the policy '{1}' of tenant '{2}'. | |
| `AADB2C90022` | Unable to return metadata for the policy '{0}' in tenant '{1}'. | [Share the application's metadata publicly](saml-service-provider.md) |
| `AADB2C90023` | Profile '{0}' does not contain the required metadata key '{1}'. | |
The following errors can be returned by the Azure Active Directory B2C service.
| `AADB2C90031` | Policy '{0}' does not specify a default user journey. Ensure that the policy or its parents specify a default user journey as part of a relying party section. | [Default user journey](relyingparty.md#defaultuserjourney) |
| `AADB2C90035` | The service is temporarily unavailable. Please retry after a few minutes. | |
| `AADB2C90036` | The request does not contain a URI to redirect the user to post logout. Specify a URI in the post_logout_redirect_uri parameter field. | [Send a sign-out request](openid-connect.md#send-a-sign-out-request) |
-| `AADB2C90037` | An error occurred while processing the request. Please contact administrator of the site you are trying to access. | |
+| `AADB2C90037` | An error occurred while processing the request. Please locate the `CorrelationId` from the response. | [Submit a new support request](find-help-open-support-ticket.md), and include the `CorrelationId`. |
| `AADB2C90039` | The request contains a client assertion, but the provided policy '{0}' in tenant '{1}' is missing a client_secret in RelyingPartyPolicy. | deprecated |
| `AADB2C90040` | User journey '{0}' does not contain a send claims step. | [User journey orchestration steps](userjourneys.md#orchestrationsteps) |
| `AADB2C90043` | The prompt included in the request contains invalid values. Expected 'none', 'login', 'consent' or 'select_account'. | |
active-directory-b2c One Time Password Technical Profile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory-b2c/one-time-password-technical-profile.md
The following settings can be used to configure code generation mode:
| Attribute | Required | Description |
| --------- | -------- | ----------- |
+| Operation | Yes | The operation to be performed. Possible value: `GenerateCode`. |
| CodeExpirationInSeconds | No | Time in seconds until code expiration. Minimum: `60`; Maximum: `1200`; Default: `600`. Every time a code is provided (the same code using `ReuseSameCode`, or a new code), the code expiration is extended. This time is also used to set the retry timeout: once max attempts are reached, the user is locked out from obtaining new codes until this time expires. |
| CodeLength | No | Length of the code. The default value is `6`. |
| CharacterSet | No | The character set for the code, formatted for use in a regular expression. For example, `a-z0-9A-Z`. The default value is `0-9`. The character set must include a minimum of 10 different characters. |
| NumRetryAttempts | No | The number of verification attempts before the code is considered invalid. The default value is `5`. For example, if you set NumRetryAttempts to `2`, only two attempts in total are allowed (the first attempt plus one retry). The third attempt returns a max-attempts-reached error regardless of whether the code is correct. |
-| NumCodeGenerationAttempts | No | The number of maximum code generation attempts per identifier. The default value is 10 if not specified. |
-| Operation | Yes | The operation to be performed. Possible value: `GenerateCode`. |
+| NumCodeGenerationAttempts | No | The number of maximum code generation attempts per identifier. The default value is `10` if not specified. |
| ReuseSameCode | No | Whether the same code should be given rather than generating a new code when given code has not expired and is still valid. The default value is `false`. |
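The retry and lockout semantics described in the table can be sketched as follows. This is a minimal, hypothetical Python illustration of the documented behavior (default values from the table), not the B2C implementation:

```python
import secrets
import string
import time

# Defaults from the settings table above (illustrative only).
CHARACTER_SET = string.digits      # CharacterSet: 0-9
CODE_LENGTH = 6                    # CodeLength
CODE_EXPIRATION_SECONDS = 600      # CodeExpirationInSeconds
NUM_RETRY_ATTEMPTS = 5             # NumRetryAttempts


class OneTimeCode:
    def __init__(self):
        # Generate a code from the allowed character set.
        self.value = "".join(secrets.choice(CHARACTER_SET) for _ in range(CODE_LENGTH))
        self.expires_at = time.monotonic() + CODE_EXPIRATION_SECONDS
        self.attempts = 0

    def verify(self, candidate: str) -> bool:
        if time.monotonic() > self.expires_at:
            raise ValueError("code expired")
        self.attempts += 1
        if self.attempts > NUM_RETRY_ATTEMPTS:
            # Once max attempts are reached, even a correct code is rejected.
            raise ValueError("max attempts reached")
        return secrets.compare_digest(candidate, self.value)
```

With `NUM_RETRY_ATTEMPTS = 5`, the first five calls to `verify` are evaluated normally; the sixth raises regardless of whether the candidate matches, mirroring the NumRetryAttempts description above.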
The following example `TechnicalProfile` is used for generating a code:
  <Item Key="CodeLength">6</Item>
  <Item Key="CharacterSet">0-9</Item>
  <Item Key="NumRetryAttempts">5</Item>
- <Item Key="NumCodeGenerationAttempts">15</Item>
+ <Item Key="NumCodeGenerationAttempts">10</Item>
  <Item Key="ReuseSameCode">false</Item>
</Metadata>
<InputClaims>
active-directory How To Mfa Number Match https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/authentication/how-to-mfa-number-match.md
This topic covers how to enable number matching in Microsoft Authenticator push notifications to improve user sign-in security.

>[!NOTE]
->Number matching is a key security upgrade to traditional second factor notifications in Microsoft Authenticator that will begin to be enabled by default for all users starting February 27, 2023.<br>
+>Number matching is a key security upgrade to traditional second factor notifications in Microsoft Authenticator that will begin to be enabled by default for all users starting February 28, 2023.<br>
>We highly recommend enabling number matching in the near-term for improved sign-in security.

## Prerequisites
To enable number matching in the Azure AD portal, complete the following steps:
### When will my tenant see number matching if I don't use the Azure portal or Graph API to roll out the change?
-Number match will be enabled for all users of Microsoft Authenticator app after February 27, 2023. Relevant services will begin deploying these changes after February 27, 2023 and users will start to see number match in approval requests. As services deploy, some may see number match while others don't. To ensure consistent behavior for all your users, we highly recommend you use the Azure portal or Graph API to roll out number match for all Microsoft Authenticator users.
+Number match will be enabled for all users of Microsoft Authenticator app after February 28, 2023. Relevant services will begin deploying these changes after February 28, 2023 and users will start to see number match in approval requests. As services deploy, some may see number match while others don't. To ensure consistent behavior for all your users, we highly recommend you use the Azure portal or Graph API to roll out number match for all Microsoft Authenticator users.
### Can I opt out of number matching?
active-directory Active Directory Acs Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/azuread-dev/active-directory-acs-migration.md
In these cases, you might want to consider migrating your web application to ano
![This image shows the Auth0 logo](./media/active-directory-acs-migration/rsz-auth0.png)
-[Auth0](https://auth0.com/acs) is a flexible cloud identity service that has created [high-level migration guidance for customers of Access Control](https://auth0.com/acs), and supports nearly every feature that ACS does.
+[Auth0](https://auth0.com/access-management) is a flexible cloud identity service that has created [high-level migration guidance for customers of Access Control](https://auth0.com/access-management), and supports nearly every feature that ACS does.
![This image shows the Ping Identity logo](./media/active-directory-acs-migration/rsz-ping.png)
In these cases, you might consider migrating your web application to another clo
![This image shows the Auth0 logo](./media/active-directory-acs-migration/rsz-auth0.png)
-[Auth0](https://auth0.com/acs) is a flexible cloud identity service that has created [high-level migration guidance for customers of Access Control](https://auth0.com/acs), and supports nearly every feature that ACS does.
+[Auth0](https://auth0.com/access-management) is a flexible cloud identity service that has created [high-level migration guidance for customers of Access Control](https://auth0.com/access-management), and supports nearly every feature that ACS does.
![This image shows the Ping Identity logo](./media/active-directory-acs-migration/rsz-ping.png)

[Ping Identity](https://www.pingidentity.com) offers two solutions similar to ACS. PingOne is a cloud identity service that supports many of the same features as ACS, and PingFederate is a similar on-premises identity product that offers more flexibility. Refer to Ping's ACS retirement guidance for more details on using these products.
active-directory Msal National Cloud https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-national-cloud.md
To enable your MSAL Python application for sovereign clouds:
To enable your MSAL for Java application for sovereign clouds:
- Register your application in a specific portal, depending on the cloud. For more information on how to choose the portal, refer to [App registration endpoints](authentication-national-cloud.md#app-registration-endpoints)
-- Use any of the [samples](https://github.com/AzureAD/microsoft-authentication-library-for-java/tree/dev/src/samples) from the repo with a few changes to the configuration, depending on the cloud, which are mentioned next.
+- Use any of the [samples](https://github.com/AzureAD/microsoft-authentication-library-for-java/tree/dev/msal4j-sdk/src/samples) from the repo with a few changes to the configuration, depending on the cloud, which are mentioned next.
- Use a specific authority, depending on the cloud you registered the application in. For more information on authorities for different clouds, refer to [Azure AD Authentication endpoints](authentication-national-cloud.md#azure-ad-authentication-endpoints). Here's an example authority:
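The authorities differ only in host per cloud. As a sketch (an illustrative helper, not part of MSAL for Java), using the endpoint hosts from the national-cloud documentation:

```python
# Azure AD authority hosts per cloud, per the national-cloud endpoint docs.
AUTHORITY_HOSTS = {
    "global": "login.microsoftonline.com",
    "us_government": "login.microsoftonline.us",
    "china": "login.partner.microsoftonline.cn",
}


def authority(cloud: str, tenant: str) -> str:
    """Build the authority URL to configure for a given cloud and tenant."""
    return f"https://{AUTHORITY_HOSTS[cloud]}/{tenant}"
```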
National cloud documentation:
- [Azure Government](../../azure-government/index.yml)
- [Azure China 21Vianet](/azure/china/)
-- [Azure Germany (closes on October 29, 2021)](../../germany/index.yml)
+- [Azure Germany (closes on October 29, 2021)](../../germany/index.yml)
active-directory Msal Net Token Cache Serialization https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-net-token-cache-serialization.md
The recommendation is:
## [ASP.NET Core web apps and web APIs](#tab/aspnetcore)
-The [Microsoft.Identity.Web.TokenCache](https://www.nuget.org/packages/Microsoft.Identity.Web.TokenCache) NuGet package provides token cache serialization within the [Microsoft.Identity.Web](https://github.com/AzureAD/microsoft-identity-web) library.
+The [Microsoft.Identity.Web.TokenCache](https://www.nuget.org/packages/Microsoft.Identity.Web.TokenCache) NuGet package provides token cache serialization within the [Microsoft.Identity.Web](https://github.com/AzureAD/microsoft-identity-web) library.
+
+If you're using the MSAL library directly in an ASP.NET Core app, consider moving to use [Microsoft.Identity.Web](https://github.com/AzureAD/microsoft-identity-web), which provides a simpler, higher-level API. Otherwise, see the [Non-ASP.NET Core web apps and web APIs](/azure/active-directory/develop/msal-net-token-cache-serialization?tabs=aspnet#configuring-the-token-cache), which covers direct MSAL usage.
| Extension method | Description |
| - | - |
namespace CommonCacheMsalV3
} ```
+For more details, see the sample: https://github.com/Azure-Samples/active-directory-dotnet-v1-to-v2/tree/master/TokenCacheMigration/ADAL2MSAL
+
## Monitor cache hit ratios and cache performance
active-directory Msal Net Xamarin Android Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/msal-net-xamarin-android-considerations.md
protected override void OnActivityResult(int requestCode,
To support System WebView, the *AndroidManifest.xml* file should contain the following values: ```xml
-<activity android:name="microsoft.identity.client.BrowserTabActivity" android:configChanges="orientation|screenSize">
+<activity android:name="microsoft.identity.client.BrowserTabActivity" android:configChanges="orientation|screenSize" android:exported="true">
<intent-filter>
  <action android:name="android.intent.action.VIEW" />
  <category android:name="android.intent.category.DEFAULT" />
Alternatively, [create the activity in code](/xamarin/android/platform/android-m
Here's an example of a class that represents the values of the XML file: ```csharp
- [Activity]
+ [Activity(Exported = true)]
[IntentFilter(new[] { Intent.ActionView }, Categories = new[] { Intent.CategoryBrowsable, Intent.CategoryDefault }, DataHost = "auth",
active-directory V2 Protocols Oidc https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/v2-protocols-oidc.md
The value of `{tenant}` varies based on the application's sign-in audience as sh
| `8eaef023-2b34-4da1-9baa-8bc8c9d6a490` or `contoso.onmicrosoft.com` | Only users from a specific Azure AD tenant (directory members with a work or school account or directory guests with a personal Microsoft account) can sign in to the application. <br/><br/>The value can be the domain name of the Azure AD tenant or the tenant ID in GUID format. You can also use the consumer tenant GUID, `9188040d-6c67-4c5b-b112-36a304b66dad`, in place of `consumers`. |

> [!TIP]
-> Note that when using the `common` or `consumers` authority for personal Microsoft accounts, the consuming resource application must be configured to support such type of accounts in accordance with [signInAudience](https://learn.microsoft.com/en-us/azure/active-directory/develop/supported-accounts-validation).
+> Note that when using the `common` or `consumers` authority for personal Microsoft accounts, the consuming resource application must be configured to support such type of accounts in accordance with [signInAudience](/azure/active-directory/develop/supported-accounts-validation).
You can also find your app's OpenID configuration document URI in its app registration in the Azure portal.
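As a quick sketch, the OpenID configuration document URI simply embeds the `{tenant}` value described above (illustrative Python helper):

```python
def openid_config_url(tenant: str) -> str:
    """OpenID Connect metadata document URI for a given sign-in audience.

    `tenant` may be `common`, `organizations`, `consumers`, a verified
    domain name of the Azure AD tenant, or the tenant ID in GUID format.
    """
    return f"https://login.microsoftonline.com/{tenant}/v2.0/.well-known/openid-configuration"
```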
active-directory Workload Identity Federation Considerations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/workload-identity-federation-considerations.md
The following table describes limits on requests to the user-assigned managed id
| Operation | Requests-per-second per Azure AD tenant | Requests-per-second per subscription | Requests-per-second per resource |
|-|-|-|-|
-| [Create or update](/rest/api/managedidentity/user-assigned-identities/create-or-update) requests | 10 | 2 | 0.25 |
-| [Get](/rest/api/managedidentity/user-assigned-identities/get) requests | 30 | 10 | 0.5 |
-| [List by resource group](/rest/api/managedidentity/user-assigned-identities/list-by-resource-group) or [List by subscription](/rest/api/managedidentity/user-assigned-identities/list-by-subscription) requests | 15 | 5 | 0.25 |
-| [Delete](/rest/api/managedidentity/user-assigned-identities/delete) requests | 10 | 2 | 0.25 |
+| [Create or update](/rest/api/managedidentity/2022-01-31-preview/user-assigned-identities/create-or-update) requests | 10 | 2 | 0.25 |
+| [Get](/rest/api/managedidentity/2022-01-31-preview/user-assigned-identities/get) requests | 30 | 10 | 0.5 |
+| [List by resource group](/rest/api/managedidentity/2022-01-31-preview/user-assigned-identities/list-by-resource-group) or [List by subscription](/rest/api/managedidentity/2022-01-31-preview/user-assigned-identities/list-by-subscription) requests | 15 | 5 | 0.25 |
+| [Delete](/rest/api/managedidentity/2022-01-31-preview/user-assigned-identities/delete) requests | 10 | 2 | 0.25 |
## Errors
The following error codes may be returned when creating, updating, getting, list
| 400 | Federated Identity Credential name '{ficName}' is invalid. | Alphanumeric, dash, underscore, no more than 3-120 symbols. First symbol is alphanumeric. |
| 404 | The parent user-assigned identity doesn't exist. | Check user assigned identity name in federated identity credentials resource path. |
| 400 | Issuer and subject combination already exists for this Managed Identity. | This is a constraint. List all federated identity credentials associated with the user-assigned identity to find existing federated identity credential. |
-| 409 | Conflict | Concurrent write request to federated identity credential resources under the same user-assigned identity has been denied.
+| 409 | Conflict | Concurrent write request to federated identity credential resources under the same user-assigned identity has been denied.
active-directory Workload Identity Federation Create Trust https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/develop/workload-identity-federation-create-trust.md
In the **Federated credential scenario** drop-down box, select **GitHub actions
Specify the **Organization** and **Repository** for your GitHub Actions workflow.
-For **Entity type**, select **Environment**, **Branch**, **Pull request**, or **Tag** and specify the value. The values must exactly match the configuration in the [GitHub workflow](https://docs.github.com/actions/using-workflows/workflow-syntax-for-github-actions#on). For more info, read the [examples](#entity-type-examples).
+For **Entity type**, select **Environment**, **Branch**, **Pull request**, or **Tag** and specify the value. The values must exactly match the configuration in the [GitHub workflow](https://docs.github.com/actions/using-workflows/workflow-syntax-for-github-actions#on). Pattern matching is not supported for branches and tags. Specify an environment if your on-push workflow runs against many branches or tags. For more info, read the [examples](#entity-type-examples).
Add a **Name** for the federated credential.
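The entity type determines the `subject` claim GitHub Actions sends, which the federated credential must match exactly. A sketch of the documented subject formats (illustrative Python, not part of any SDK):

```python
def github_subject(org: str, repo: str, entity_type: str, value: str = "") -> str:
    """Subject claim GitHub Actions issues for each entity type."""
    base = f"repo:{org}/{repo}"
    formats = {
        "environment": f"{base}:environment:{value}",
        "branch": f"{base}:ref:refs/heads/{value}",
        "tag": f"{base}:ref:refs/tags/{value}",
        "pull_request": f"{base}:pull_request",  # no value: any pull request
    }
    return formats[entity_type]
```

Because no pattern matching is applied, a workflow running on branch `main` matches only a credential whose subject is exactly `repo:org/repo:ref:refs/heads/main`.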
active-directory Troubleshoot Device Windows Joined https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/troubleshoot-device-windows-joined.md
+
+ Title: Troubleshoot registered, hybrid, and Azure AD joined Windows machines
+description: This article helps you troubleshoot hybrid Azure Active Directory-joined Windows 10 and Windows 11 devices
+ Last updated : 08/29/2022
+# Troubleshooting Windows devices in Azure AD
+
+If you have a Windows 11 or Windows 10 device that isn't working with Azure Active Directory (Azure AD) correctly, start your troubleshooting here.
+
+1. Sign in to the **Azure portal**.
+1. Browse to **Azure Active Directory** > **Devices** > **Diagnose and solve problems**.
+1. Select **Troubleshoot** under the **Windows 10+ related issue** troubleshooter.
+ :::image type="content" source="media/troubleshoot-device-windows-joined/devices-troubleshoot-windows.png" alt-text="A screenshot showing the Windows troubleshooter located in the diagnose and solve pane of the Azure portal." lightbox="media/troubleshoot-device-windows-joined/devices-troubleshoot-windows.png":::
+1. Select **instructions** and follow the steps to download, run, and collect the required logs for the troubleshooter to analyze.
+1. Return to the Azure portal when you've collected and zipped the `authlogs` folder and contents.
+1. Select **Browse** and choose the zip file you wish to upload.
+ :::image type="content" source="media/troubleshoot-device-windows-joined/devices-troubleshoot-windows-upload.png" alt-text="A screenshot showing how to browse to select the logs gathered in the previous step to allow the troubleshooter to make recommendations." lightbox="media/troubleshoot-device-windows-joined/devices-troubleshoot-windows-upload.png":::
+
+The troubleshooter will review the contents of the file you uploaded and provide suggested next steps. These next steps may include links to documentation or contacting support for further assistance.
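Once the logs are collected, zipping the `authlogs` folder before upload can be done with standard tooling; for example, a Python standard-library sketch (the folder path here is illustrative):

```python
import os
import shutil
import tempfile

# Illustrative stand-in for the collected authlogs folder.
workdir = tempfile.mkdtemp()
logs_dir = os.path.join(workdir, "authlogs")
os.makedirs(logs_dir)
with open(os.path.join(logs_dir, "example.log"), "w") as f:
    f.write("collected log output\n")

# Produce authlogs.zip containing the folder's contents, ready to upload.
archive = shutil.make_archive(os.path.join(workdir, "authlogs"), "zip", root_dir=logs_dir)
print(archive)
```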
+
+## Next steps
+
+- [Troubleshoot devices by using the dsregcmd command](troubleshoot-device-dsregcmd.md)
+- [Troubleshoot hybrid Azure AD-joined devices](troubleshoot-hybrid-join-windows-current.md)
+- [Troubleshooting hybrid Azure Active Directory joined down-level devices](troubleshoot-hybrid-join-windows-legacy.md)
+- [Troubleshoot pending device state](/troubleshoot/azure/active-directory/pending-devices)
+- [MDM enrollment of Windows 10-based devices](/windows/client-management/mdm/mdm-enrollment-of-windows-devices)
+- [Troubleshooting Windows device enrollment errors in Intune](/troubleshoot/mem/intune/troubleshoot-windows-enrollment-errors)
active-directory Troubleshoot Hybrid Join Windows Current https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/devices/troubleshoot-hybrid-join-windows-current.md
Previously updated : 02/15/2022 Last updated : 08/29/2022
Use Event Viewer to look for the log entries that are logged by the Azure AD Clo
> [!NOTE]
> When you're collecting network traces, it's important to *not* use Fiddler during repro.
-1. Run `netsh trace start scenario=internetClient_dbg capture=yes persistent=yes`.
+1. Run `netsh trace start scenario=internetClient_dbg capture=yes persistent=yes`.
1. Lock and unlock the device. For hybrid-joined devices, wait a minute or more to allow the PRT acquisition task to finish.
1. Run `netsh trace stop`.
1. Share the *nettrace.cab* file with Support.
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
Previously updated : 10/04/2022 Last updated : 10/28/2022
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
- **Service plans included (friendly names)**: A list of service plans (friendly names) in the product that correspond to the string ID and GUID

>[!NOTE]
->This information last updated on October 4th, 2022.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing.csv).
+>This information last updated on October 28th, 2022.<br/>You can also download a CSV version of this table [here](https://download.microsoft.com/download/e/3/e/e3e9faf2-f28b-490a-9ada-c6089a1fc5b0/Product%20names%20and%20service%20plan%20identifiers%20for%20licensing.csv).
><br/>

| Product name | String ID | GUID | Service plans included | Service plans included (friendly names) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| Microsoft 365 A3 student use benefits | M365EDU_A3_STUUSEBNFT | 18250162-5d87-4436-a834-d795c15c80f3 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>MFA_PREMIUM 
(8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Apps for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Kaizala Pro (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway 
(a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Windows 10/11 Enterprise (e7c91390-7625-45be-94e0-e16907e03118)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Common Data Service (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9) | | Microsoft 365 A3 - Unattended License for students use benefit | M365EDU_A3_STUUSEBNFT_RPA1 | 1aa94593-ca12-4254-a738-81a5972958e8 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>OFFICESUBSCRIPTION_unattended (8d77e2d9-9e28-4450-8431-0def64078fc5)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION 
(4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Apps for Enterprise (Unattended) (8d77e2d9-9e28-4450-8431-0def64078fc5)<br/>Microsoft Forms (Plan 2) 
(9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Kaizala Pro (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Windows 10/11 Enterprise (e7c91390-7625-45be-94e0-e16907e03118)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Common Data Service (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Power Apps for Office 365 
(c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9) | | Microsoft 365 A5 for Faculty | M365EDU_A5_FACULTY | e97c048c-37a4-45fb-ab50-922fbf07a370 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>ML_CLASSIFICATION 
(d2d51368-76c9-4317-ada2-a12c004c432f)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>INSIDER_RISK_MANAGEMENT (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) 
(e7c91390-7625-45be-94e0-e16907e03118)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics - Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Audio Conferencing 
(3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access 
Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (Plan 3) (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>Microsoft Insider Risk Management (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows 10/11 Enterprise (e7c91390-7625-45be-94e0-e16907e03118)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender 
for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Power Automate for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) |
-| Microsoft 365 A5 for Students | M365EDU_A5_STUDENT | 46c119d4-0379-4a9d-85e4-97c66d3f909e | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 
(882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Advanced Threat Protection (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Information 
Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Cloud App Security (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>MICROSOFT DEFENDER FOR ENDPOINT (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 ProPlus 
(43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PowerApps for Office 365 Plan 3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
-| Microsoft 365 A5 for students use benefit | M365EDU_A5_STUUSEBNFT | 31d57bc7-3a05-4867-ab53-97a17835a411 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 
(76846ad7-7776-4c40-a281-a386362dd1b9)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Common Data Service - O365 P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Apps for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Cloud App Security (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft 
Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
+| Microsoft 365 A5 for students | M365EDU_A5_STUDENT | 46c119d4-0379-4a9d-85e4-97c66d3f909e | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT 
(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>INSIDER_RISK_MANAGEMENT (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 
(eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics - Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System 
(4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (Plan 3) (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power 
BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>RETIRED - Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>RETIRED - Microsoft Insider Risk Management (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows 10/11 Enterprise (e7c91390-7625-45be-94e0-e16907e03118)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Power Automate for Office 365 
(07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) |
+| Microsoft 365 A5 student use benefits | M365EDU_A5_STUUSEBNFT | 31d57bc7-3a05-4867-ab53-97a17835a411 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_EDU 
(2078e8df-cff6-4290-98cb-5408261a760a)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search 
(94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Windows 10/11 Enterprise (e7c91390-7625-45be-94e0-e16907e03118)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Cloud Apps Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Power Apps for Office 365 
(c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9) |
| Microsoft 365 A5 without Audio Conferencing for students use benefit | M365EDU_A5_NOPSTNCONF_STUUSEBNFT | 81441ae1-0b31-4185-a6c0-32b6b84d419f| AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 
(76846ad7-7776-4c40-a281-a386362dd1b9)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Common Data Service - O365 P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Apps for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Cloud App Security (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) 
(8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) | | Microsoft 365 Apps for Business | O365_BUSINESS | cdd28e44-67e3-425e-be4c-737fab2899d3 | FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>ONEDRIVESTANDARD 
(13696edf-5a08-49f6-8134-03083ed8ba30)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97) | MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>OFFICE 365 BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>ONEDRIVESTANDARD (13696edf-5a08-49f6-8134-03083ed8ba30)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97) | | Microsoft 365 Apps for Business | SMB_BUSINESS | b214fe43-f5a3-4703-beeb-fa97188220fc | FORMS_PLAN_E1 (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>OFFICE_BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>ONEDRIVESTANDARD (13696edf-5a08-49f6-8134-03083ed8ba30)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97) | MICROSOFT FORMS (PLAN E1) (159f4cd6-e380-449f-a816-af1a9ef76344)<br/>OFFICE 365 BUSINESS (094e7854-93fc-4d55-b2c0-3ab5369ebdc1)<br/>ONEDRIVESTANDARD (13696edf-5a08-49f6-8134-03083ed8ba30)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97) |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
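Because the portals show friendly product names while directory objects carry only the GUIDs listed in this table, a small lookup helper can make audit output readable. The sketch below is illustrative only: the `SERVICE_PLANS` dictionary holds a handful of entries copied from the rows above (extend it with whichever plans you need), and the function name `friendly_name` is a hypothetical choice, not part of any Microsoft SDK.

```python
# Map service plan GUIDs to (internal plan name, friendly display name).
# Entries copied from a few rows of the table above; not exhaustive.
SERVICE_PLANS = {
    "57ff2da0-773e-42df-b2af-ffb7a2317929": ("TEAMS1", "Microsoft Teams"),
    "efb87545-963c-4e0d-99df-69c6916d9eb0": ("EXCHANGE_S_ENTERPRISE", "Exchange Online (Plan 2)"),
    "c1ec4a95-1f05-45b3-a911-aa3fa01094f5": ("INTUNE_A", "Microsoft Intune"),
    "0feaeb32-d00e-4d66-bd5a-43b5b83db82c": ("MCOSTANDARD", "Skype for Business Online (Plan 2)"),
}

def friendly_name(guid: str) -> str:
    """Return the friendly display name for a service plan GUID.

    Falls back to the raw GUID when the plan is not in the local table,
    so unknown plans are still visible in reports.
    """
    entry = SERVICE_PLANS.get(guid.lower())
    return entry[1] if entry else guid

print(friendly_name("57ff2da0-773e-42df-b2af-ffb7a2317929"))  # Microsoft Teams
```

A dictionary like this is only a snapshot of the published table; plan GUIDs are stable, but new plans appear regularly, so unknown GUIDs should be surfaced rather than dropped.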
| Microsoft 365 E3 - Unattended License | SPE_E3_RPA1 | c2ac2ee4-9bb1-47e4-8541-d689c7e83371 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>CDS_O365_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICESUBSCRIPTION_unattended (8d77e2d9-9e28-4450-8431-0def64078fc5)<br/>M365_LIGHTHOUSE_CUSTOMER_PLAN1 (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>M365_LIGHTHOUSE_PARTNER_PLAN1 (d55411c9-cfff-40a9-87c7-240f14df7da5)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>FORMS_PLAN_E3 (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>POWER_VIRTUAL_AGENTS_O365_P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 
(c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>WIN10_PRO_ENT_SUB (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Common Data Service - O365 P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Common Data Service for Teams_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps for Enterprise (Unattended) (8d77e2d9-9e28-4450-8431-0def64078fc5)<br/>Microsoft 365 Lighthouse (Plan 1) (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>Microsoft 365 Lighthouse (Plan 2) (d55411c9-cfff-40a9-87c7-240f14df7da5)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan E3) (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device 
Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Power Virtual Agents for Office 365 P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/> To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (Original) (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653) | | Microsoft 365 E3_USGOV_DOD | SPE_E3_USGOV_DOD | d61d61cc-f992-433f-a577-5bd016037eeb | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS_AR_DOD (fd500458-c24c-478e-856c-a6067a8376cd)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) | Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Exchange Online (Plan 2) 
(efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams for DOD (AR) (fd500458-c24c-478e-856c-a6067a8376cd)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office Online (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SharePoint Online (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) | | Microsoft 365 E3_USGOV_GCCHIGH | SPE_E3_USGOV_GCCHIGH | ca9d1dd9-dfe9-4fef-b97c-9bc1ea3c3658 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS_AR_GCCHIGH (9953b155-8aef-4c56-92f3-72b0487fce41)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) | Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Microsoft Azure Active Directory Rights 
(bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams for GCCHigh (AR) (9953b155-8aef-4c56-92f3-72b0487fce41)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office Online (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SharePoint Online (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) |
-| Microsoft 365 E5 | SPE_E5 | 06ebc4ee-1bb5-47dd-8120-11324bc54e06 | RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>GRAPH_CONNECTORS_SEARCH_INDEX (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>M365_LIGHTHOUSE_CUSTOMER_PLAN1 (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>FORMS_PLAN_E5 (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>EXCHANGE_ANALYTICS 
(34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>INSIDER_RISK_MANAGEMENT (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WIN10_PRO_ENT_SUB (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 
(eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014)<br/>Windows_Autopatch (9a6eeb79-0b4b-4bf0-9808-39d99a2cd5a3) | Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Graph Connectors Search with Index (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics - Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Lighthouse (Plan 1) 
(6f23d6a9-adbf-481c-8538-b4c095654487)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan E5) (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Office 365 (Plan 3) 
(9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>Microsoft Insider Risk Management (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows 10/11 Enterprise (Original) (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Power Automate for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 
(ded3d325-1bdc-453e-8432-5bac26d7a014)<br/>Windows Autopatch (9a6eeb79-0b4b-4bf0-9808-39d99a2cd5a3) |
+| Microsoft 365 E5 | SPE_E5 | 06ebc4ee-1bb5-47dd-8120-11324bc54e06 | RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>GRAPH_CONNECTORS_SEARCH_INDEX (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>M365_LIGHTHOUSE_CUSTOMER_PLAN1 (6f23d6a9-adbf-481c-8538-b4c095654487)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>FORMS_PLAN_E5 (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>EXCHANGE_ANALYTICS 
(34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>INSIDER_RISK_MANAGEMENT (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WIN10_PRO_ENT_SUB (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>Windows_Autopatch (9a6eeb79-0b4b-4bf0-9808-39d99a2cd5a3)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>AAD_PREMIUM 
(41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Graph Connectors Search with Index (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics - Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Lighthouse (Plan 1) 
(6f23d6a9-adbf-481c-8538-b4c095654487)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan E5) (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Office 365 (Plan 3) 
(9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>RETIRED - Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>RETIRED - Microsoft Insider Risk Management (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows 10/11 Enterprise (Original) (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>Windows Autopatch (9a6eeb79-0b4b-4bf0-9808-39d99a2cd5a3)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Power Automate for Office 365 
(07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) |
| Microsoft 365 E5 Developer (without Windows and Audio Conferencing) | DEVELOPERPACK_E5 | c42b9cae-ea4f-4ab7-9717-81576235ccac | RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>GRAPH_CONNECTORS_SEARCH_INDEX (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>FORMS_PLAN_E5 (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 
(57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>INSIDER_RISK_MANAGEMENT (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data 
Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Graph Connectors Search with Index (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan E5) (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub 
(8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Office 365 (Plan 3) (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>Microsoft Insider Risk Management (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps 
(2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Power Automate for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) | | Microsoft 365 E5 Compliance | INFORMATION_PROTECTION_COMPLIANCE | 184efa21-98c3-4e5d-95ab-d07053a96e67 | LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>INSIDER_RISK_MANAGEMENT (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 
(cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics - Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>Microsoft Insider Risk Management (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) | | Microsoft 365 E5 Security | IDENTITY_THREAT_PROTECTION | 26124093-3d78-432b-b5dc-48bf992543d5 | MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>SAFEDOCS 
(bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f) | Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f) |
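The GUIDs in the **GUID** column are the `skuId` values that Microsoft Graph returns when you enumerate a tenant's or user's licenses. As a hedged illustration (the `SKU_NAMES` dictionary below is hand-built from a few rows of this table and is not an official API), a small lookup can translate a `skuId` into a friendly name:

```python
# Hypothetical lookup built from the GUID column of this table; extend as needed.
# Keys are stored lowercase because GUID comparisons should be case-insensitive.
SKU_NAMES = {
    "06ebc4ee-1bb5-47dd-8120-11324bc54e06": "Microsoft 365 E5 (SPE_E5)",
    "c42b9cae-ea4f-4ab7-9717-81576235ccac": "Microsoft 365 E5 Developer (DEVELOPERPACK_E5)",
    "184efa21-98c3-4e5d-95ab-d07053a96e67": "Microsoft 365 E5 Compliance (INFORMATION_PROTECTION_COMPLIANCE)",
    "26124093-3d78-432b-b5dc-48bf992543d5": "Microsoft 365 E5 Security (IDENTITY_THREAT_PROTECTION)",
    "cd2925a3-5076-4233-8931-638a8c94f773": "Microsoft 365 E5 without Audio Conferencing (SPE_E5_NOPSTNCONF)",
}

def sku_display_name(sku_id: str) -> str:
    """Resolve a skuId GUID to a friendly name, falling back to the raw GUID."""
    return SKU_NAMES.get(sku_id.lower(), f"Unknown SKU ({sku_id})")

print(sku_display_name("06EBC4EE-1BB5-47DD-8120-11324BC54E06"))
```

This is useful because license APIs report only the GUID and the string ID (for example `SPE_E5`), never the product display name.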
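The service plans in each row surface through Microsoft Graph as well: `GET /subscribedSkus` returns each SKU with its `skuPartNumber` and a `servicePlans` array whose items carry `servicePlanId`, `servicePlanName`, and `provisioningStatus`. A minimal parsing sketch — the sample payload is abbreviated by hand from the Microsoft 365 E5 Security row above, not a captured API response:

```python
def enabled_service_plans(sku: dict) -> list[str]:
    """Return the servicePlanName of each plan whose provisioningStatus is Success."""
    return [
        plan["servicePlanName"]
        for plan in sku.get("servicePlans", [])
        if plan.get("provisioningStatus") == "Success"
    ]

# Abbreviated sample shaped like one element of a subscribedSkus response,
# using the IDENTITY_THREAT_PROTECTION GUIDs from the table above.
sample_sku = {
    "skuId": "26124093-3d78-432b-b5dc-48bf992543d5",
    "skuPartNumber": "IDENTITY_THREAT_PROTECTION",
    "servicePlans": [
        {"servicePlanId": "bf28f719-7844-4079-9c78-c1307898e192",
         "servicePlanName": "MTP", "provisioningStatus": "Success"},
        {"servicePlanId": "871d91ec-ec1a-452b-a83f-bd76c7d770ef",
         "servicePlanName": "WINDEFATP", "provisioningStatus": "Disabled"},
    ],
}

print(enabled_service_plans(sample_sku))  # ['MTP']
```

Filtering on `provisioningStatus` matters because a SKU can include plans an administrator has turned off for some users.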
| Microsoft 365 E5 without Audio Conferencing | SPE_E5_NOPSTNCONF | cd2925a3-5076-4233-8931-638a8c94f773 | RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>GRAPH_CONNECTORS_SEARCH_INDEX (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>FORMS_PLAN_E5 (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT 
(b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>INSIDER_RISK_MANAGEMENT (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WIN10_PRO_ENT_SUB (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM 
(6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Graph Connectors Search with Index (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics - Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key 
(6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan E5) (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Office 365 (Plan 3) (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>Microsoft Communications 
Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>Microsoft Insider Risk Management (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows 10/11 Enterprise (Original) (21b439ba-a0ca-424f-a6cc-52f954a5b111)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Power Automate for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) | | Microsoft 365 F1 | M365_F1 | 44575883-256e-4a79-9da4-ebe9acabe2b2 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_ENTERPRISE_GOV 
(6a76346d-5d6e-4051-9fe3-ed3f312b5597)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>STREAM_O365_K (3ffba0d2-38e5-4d5e-8ec0-98f2b05c09d9)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SHAREPOINTDESKLESS (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>MCOIMP (afc06cb0-b4f4-4473-8286-d644f70d8faf)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Rights Management (6a76346d-5d6e-4051-9fe3-ed3f312b5597)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Stream for O365 K SKU (3ffba0d2-38e5-4d5e-8ec0-98f2b05c09d9)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SharePoint Online Kiosk (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>Skype for Business Online (Plan 1) (afc06cb0-b4f4-4473-8286-d644f70d8faf)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653) | | Microsoft 365 F3 | SPE_F1 | 66b55226-6b4f-492c-910c-a3b7a3c9d993 | RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_F1 (90db65a7-bf11-4904-a79f-ef657605145b)<br/>EXCHANGE_S_DESKLESS (4a82b400-a79f-41a4-b4e2-e94f5787b113)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>FORMS_PLAN_K 
(f07046bd-2a3c-4b96-b0be-dea79d7cbfb8)<br/>KAIZALA_O365_P1 (73b2a583-6a59-42e3-8e83-54db46bc3278)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>OFFICEMOBILE_SUBSCRIPTION (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>PROJECT_O365_F3 (7f6f28c2-34bb-4d4b-be36-48ca2e77e1ec)<br/>SHAREPOINTDESKLESS (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>MCOIMP (afc06cb0-b4f4-4473-8286-d644f70d8faf)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_FIRSTLINE (80873e7a-cd2a-4e67-b061-1b5381a676a5)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_FIRSTLINE1 (36b29273-c6d0-477a-aca6-6fbe24f538e3)<br/>WIN10_ENT_LOC_F1 (e041597c-9c7f-4ed9-99b0-2663301576f7)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>DYN365_CDS_O365_F1 (ca6e61ec-d4f4-41eb-8b88-d96e0e14323f)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>STREAM_O365_K (3ffba0d2-38e5-4d5e-8ec0-98f2b05c09d9)<br/>POWERAPPS_O365_S1 (e0287f9f-e222-4f98-9a83-f379e249159a)<br/>FLOW_O365_S1 (bd91b1a4-9f94-4ecf-b45b-3a65e5c8128a)<br/>POWER_VIRTUAL_AGENTS_O365_F1 (ba2fdb48-290b-4632-b46a-e4ecc58ac11a) | Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (90db65a7-bf11-4904-a79f-ef657605145b)<br/>Exchange Online Kiosk 
(4a82b400-a79f-41a4-b4e2-e94f5787b113)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan F1) (f07046bd-2a3c-4b96-b0be-dea79d7cbfb8)<br/>Microsoft Kaizala Pro (73b2a583-6a59-42e3-8e83-54db46bc3278)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Office Mobile Apps for Office 365 (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>Project for Office (Plan F) (7f6f28c2-34bb-4d4b-be36-48ca2e77e1ec)<br/>SharePoint Kiosk (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>Skype for Business Online (Plan 1) (afc06cb0-b4f4-4473-8286-d644f70d8faf)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Firstline) (80873e7a-cd2a-4e67-b061-1b5381a676a5)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Firstline) (36b29273-c6d0-477a-aca6-6fbe24f538e3)<br/>Windows 10 Enterprise E3 (Local Only) (e041597c-9c7f-4ed9-99b0-2663301576f7)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Common Data Service (ca6e61ec-d4f4-41eb-8b88-d96e0e14323f)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Defender for Cloud Apps Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Stream for Office 365 F3 
(3ffba0d2-38e5-4d5e-8ec0-98f2b05c09d9)<br/>Power Apps for Office 365 F3 (e0287f9f-e222-4f98-9a83-f379e249159a)<br/>Power Automate for Office 365 F3 (bd91b1a4-9f94-4ecf-b45b-3a65e5c8128a)<br/>Power Virtual Agents for Office 365 (ba2fdb48-290b-4632-b46a-e4ecc58ac11a) |
-| Microsoft 365 F5 Compliance Add-on | SPE_F5_COMP | 91de26be-adfa-4a3d-989e-9131cc23dda7 | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Loss Prevention (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>Exchange Online Archiving (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) | LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP2 
(efb0351d-3b08-4503-993d-383af8de41e3)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) |
-| Microsoft 365 F5 Compliance Add-on AR (DOD)_USGOV_DOD | SPE_F5_COMP_AR_D_USGOV_DOD | 9cfd6bc3-84cd-4274-8a21-8c7c41d6c350 | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps for DOD (6ebdddb7-8e55-4af2-952b-69e77262f96c) | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing 
(2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps for DOD (6ebdddb7-8e55-4af2-952b-69e77262f96c) |
+| Microsoft 365 F5 Compliance Add-on | SPE_F5_COMP | 91de26be-adfa-4a3d-989e-9131cc23dda7 | LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Loss Prevention (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>Exchange Online Archiving (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance 
(a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) |
+| Microsoft 365 F5 Compliance Add-on AR DOD_USGOV_DOD | SPE_F5_COMP_AR_D_USGOV_DOD | 9cfd6bc3-84cd-4274-8a21-8c7c41d6c350 | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps for DOD (6ebdddb7-8e55-4af2-952b-69e77262f96c) | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing 
(2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps for DOD (6ebdddb7-8e55-4af2-952b-69e77262f96c) |
| Microsoft 365 F5 Compliance Add-on AR_USGOV_GCCHIGH | SPE_F5_COMP_AR_USGOV_GCCHIGH | 9f436c0e-fb32-424b-90be-6a9f2919d506 | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Loss Prevention (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>Exchange Online Archiving (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) | LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>CUSTOMER_KEY 
(6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) | | Microsoft 365 F5 Compliance Add-on GCC | SPE_F5_COMP_GCC | 3f17cf90-67a2-4fdb-8587-37c1539507e1 | Customer Lockbox for Government (89b5d3b1-3855-49fe-b46c-87c66dbc1526)<br/>Data Loss Prevention (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>Exchange Online Archiving (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery for Government (d1cbfb67-18a8-4792-b643-630b7f19aad1)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Information Protection 
Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) | LOCKBOX_ENTERPRISE_GOV (89b5d3b1-3855-49fe-b46c-87c66dbc1526)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>EQUIVIO_ANALYTICS_GOV (d1cbfb67-18a8-4792-b643-630b7f19aad1)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2) | | Microsoft 365 F5 Security Add-on | SPE_F5_SEC | 67ffe999-d9ca-49e1-9d2c-03fb28aa7a48 | MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f) | Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) 
(8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f) |
-| Microsoft 365 F5 Security + Compliance Add-on | SPE_F5_SECCOMP | 32b47245-eb31-44fc-b945-a8b1576c439f | LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f) | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Loss Prevention (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>Exchange Online Archiving (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information 
Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f) |
+| Microsoft 365 F5 Security + Compliance Add-on | SPE_F5_SECCOMP | 32b47245-eb31-44fc-b945-a8b1576c439f | LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>BPOS_S_DlpAddOn (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>EXCHANGE_S_ARCHIVE_ADDON (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f) | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Loss Prevention (9bec7e34-c9fa-40b7-a9d1-bd6d1165c7ed)<br/>Exchange Online Archiving (176a09a6-7ec5-4039-ac02-b2791c6ba793)<br/>Information Barriers 
(c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Microsoft Defender for Cloud Apps (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f) |
| Microsoft Flow Free | FLOW_FREE | f30db892-07e9-47e9-837c-80727f46fd3d | DYN365_CDS_VIRAL (17ab22cd-a0b3-4536-910a-cb6eb12696c0)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>FLOW_P2_VIRAL (50e68c76-46c6-4674-81f9-75456511b170) | COMMON DATA SERVICE - VIRAL (17ab22cd-a0b3-4536-910a-cb6eb12696c0)<br/>EXCHANGE FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318)<br/>FLOW FREE (50e68c76-46c6-4674-81f9-75456511b170) | | Microsoft 365 E5 Suite Features | M365_E5_SUITE_COMPONENTS | 99cc8282-2f74-4954-83b7-c6a9a1999067 | Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e) | Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e) | | Microsoft 365 F1 | M365_F1_COMM | 50f60901-3181-4b75-8a2c-4c8e4c1d5a72 | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_F1 (ca6e61ec-d4f4-41eb-8b88-d96e0e14323f)<br/>EXCHANGE_S_DESKLESS (4a82b400-a79f-41a4-b4e2-e94f5787b113)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>STREAM_O365_K 
(3ffba0d2-38e5-4d5e-8ec0-98f2b05c09d9)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SHAREPOINTDESKLESS (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>MCOIMP (afc06cb0-b4f4-4473-8286-d644f70d8faf)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/> RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_F1 (ca6e61ec-d4f4-41eb-8b88-d96e0e14323f)<br/>EXCHANGE_S_DESKLESS (4a82b400-a79f-41a4-b4e2-e94f5787b113)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>STREAM_O365_K (3ffba0d2-38e5-4d5e-8ec0-98f2b05c09d9)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SHAREPOINTDESKLESS (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>MCOIMP (afc06cb0-b4f4-4473-8286-d644f70d8faf)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) |
| Office 365 A1 Plus for Students | STANDARDWOFFPACK_IW_STUDENT | e82ae690-a2d5-4d76-8d30-7c6e01e6022e | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/> DYN365_CDS_O365_P1 (40b010bb-0b69-4654-ac5e-ba161433f4b4)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_STANDARD (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>KAIZALA_O365_P2 (54fc630f-5a40-48ee-8965-af0503c1386e)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>PROJECT_O365_P1 (a55dfd10-0864-46d9-a3cd-da5991a3e0e2)<br/>SCHOOL_DATA_SYNC_P1 (c33802dd-1b50-4b9a-8bb9-f13d2cdeadac)<br/>SHAREPOINTSTANDARD_EDU (0a4983bb-d3e5-4a09-95d8-b2d0127b3df5)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN1 (b8afc642-032e-4de5-8c0a-507a7bba7e5d)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Common Data Service - O365 P1 (40b010bb-0b69-4654-ac5e-ba161433f4b4)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 1) (9aaf7827-d63c-4b61-89c3-182f06f82e5c)<br/>Information Barriers 
(c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Kaizala Pro Plan 2 (54fc630f-5a40-48ee-8965-af0503c1386e)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Project for Office (Plan E1) (a55dfd10-0864-46d9-a3cd-da5991a3e0e2)<br/>School Data Sync (Plan 1) (c33802dd-1b50-4b9a-8bb9-f13d2cdeadac)<br/>SharePoint (Plan 1) for Education (0a4983bb-d3e5-4a09-95d8-b2d0127b3df5)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 1) (b8afc642-032e-4de5-8c0a-507a7bba7e5d)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) | | Office 365 A3 for Faculty | ENTERPRISEPACKPLUS_FACULTY | e578b273-6db4-4691-bba0-8d691f4da603 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>CDS_O365_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 
(33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>POWER_VIRTUAL_AGENTS_O365_P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Common Data Service - O365 P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Common Data Service for Teams_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection for Office 365 – Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps 
for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Power Virtual Agents for Office 365 P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) | | Office 365 A3 for Students | ENTERPRISEPACKPLUS_STUDENT | 98b6e773-24d4-4c0d-a968-6e787a1f8204 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>CDS_O365_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>EducationAnalyticsP1 
(a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>POWER_VIRTUAL_AGENTS_O365_P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Common Data Service - O365 P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Common Data Service for Teams_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) 
(efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection for Office 365 – Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Power Virtual Agents for Office 365 P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
-| Office 365 A5 for Faculty| ENTERPRISEPREMIUM_FACULTY | a4585165-0533-458a-97e3-c400570268c4 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE 
(8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations 
(46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PowerApps for Office 365 Plan 3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
-| Office 365 A5 for Students | ENTERPRISEPREMIUM_STUDENT | ee656612-49fa-43e5-b67e-cb1fdf7699df | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE 
(8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations 
(46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PowerApps for Office 365 Plan 3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
+| Office 365 A5 for faculty| ENTERPRISEPREMIUM_FACULTY | a4585165-0533-458a-97e3-c400570268c4 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH 
(94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) 
(efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub 
(8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (Plan 3) (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>RETIRED - Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Power Automate for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) |
+| Office 365 A5 for students | ENTERPRISEPREMIUM_STUDENT | ee656612-49fa-43e5-b67e-cb1fdf7699df | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH 
(94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance 
Analytics - Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile 
Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (Plan 3) (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>RETIRED - Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Power Automate for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) |
| Office 365 Advanced Compliance | EQUIVIO_ANALYTICS | 1b1b1f7a-8355-43b6-829f-336cfccb744c | LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f) | Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f) |
| Microsoft Defender for Office 365 (Plan 1) | ATP_ENTERPRISE | 4ef96642-f096-40de-a3e9-d83fb2f90211 | ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939) | Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939) |
| Office 365 Extra File Storage for GCC | SHAREPOINTSTORAGE_GOV | e5788282-6381-469f-84f0-3d7d4021d34d | EXCHANGE_S_FOUNDATION_GOV (922ba911-5694-4e99-a794-73aed9bfeec8)<br/>SHAREPOINTSTORAGE_GOV (e5bb877f-6ac9-4461-9e43-ca581543ab16) | EXCHANGE_S_FOUNDATION_GOV (922ba911-5694-4e99-a794-73aed9bfeec8)<br/>SHAREPOINTSTORAGE_GOV (e5bb877f-6ac9-4461-9e43-ca581543ab16) |
| Office 365 E3_USGOV_GCCHIGH | ENTERPRISEPACK_USGOV_GCCHIGH | aea38a85-9bd5-4981-aa00-616b411205bf | EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS_AR_GCCHIGH (9953b155-8aef-4c56-92f3-72b0487fce41)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) | Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams for GCCHigh (AR) (9953b155-8aef-4c56-92f3-72b0487fce41)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office Online (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>SharePoint Online (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c) | | Office 365 E4 | ENTERPRISEWITHSCAL | 1392051d-0cb9-4b7a-88d5-621fee5e8711 | BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>FORMS_PLAN_E3 (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>MCOVOICECONF (27216c54-caf8-4d0d-97e2-517afb5c08f6)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RMS_S_ENTERPRISE 
(bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EXCHANGE ONLINE (PLAN 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW FOR OFFICE 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>MICROSOFT FORMS (PLAN E3) (2789c901-c14e-48ab-a76a-be334d9d793a)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 3) (27216c54-caf8-4d0d-97e2-517afb5c08f6)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>POWERAPPS FOR OFFICE 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>MICROSOFT PLANNER (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT AZURE ACTIVE DIRECTORY RIGHTS (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>MICROSOFT STREAM FOR O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) |
| Office 365 E5 | ENTERPRISEPREMIUM | c7df2760-2c81-4ef7-b578-5b5392b571df | DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>GRAPH_CONNECTORS_SEARCH_INDEX (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>INFORMATION_BARRIERS 
(c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>FORMS_PLAN_E5 (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE 
(b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | Common Data Service - O365 P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Common Data Service for Teams_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Graph Connectors Search with Index (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics – Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 – Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 – Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>M365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 
Apps for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan E5) (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office for the web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Automate for Office 365 
(07699545-9485-468e-95b6-2fca3738be01)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Power Virtual Agents for Office 365 P3 (ded3d325-1bdc-453e-8432-5bac26d7a014)<br/>PowerApps for Office 365 Plan 3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653) |
-| Office 365 E5 WITHOUT AUDIO CONFERENCING | ENTERPRISEPREMIUM_NOPSTNCONF | 26d45bd9-adf1-46cd-a9e1-51e9a5524128 | ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>FORMS_PLAN_E5 (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | OFFICE 365 CLOUD APP SECURITY (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>POWER BI PRO (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>MICROSOFT STAFFHUB (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>OFFICE 365 ADVANCED EDISCOVERY (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>EXCHANGE ONLINE (PLAN 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW FOR OFFICE 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>MICROSOFT FORMS (PLAN E5) 
(e212cbc7-0961-4c40-9825-01117710dcb1)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>PHONE SYSTEM (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>POWERAPPS FOR OFFICE 365 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>MICROSOFT PLANNER (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT AZURE ACTIVE DIRECTORY RIGHTS (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>SHAREPOINT ONLINE (PLAN 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>OFFICE ONLINE (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>MICROSOFT STREAM FOR O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>OFFICE 365 ADVANCED THREAT PROTECTION (PLAN 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) |
+| Office 365 E5 without Audio Conferencing | ENTERPRISEPREMIUM_NOPSTNCONF | 26d45bd9-adf1-46cd-a9e1-51e9a5524128 | RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>GRAPH_CONNECTORS_SEARCH_INDEX (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>FORMS_PLAN_E5 (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH 
(94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>SHAREPOINTENTERPRISE (5dbe027f-2339-4123-9542-606e4d348a72)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>VIVA_LEARNING_SEEDED (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014) | Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Common Data Service for Teams (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Graph Connectors Search with Index (a6520331-d7d4-4276-95f5-15c0933bc757)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics - Premium 
(d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics – Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan E5) (e212cbc7-0961-4c40-9825-01117710dcb1)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Kaizala Pro (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 
365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Nucleus (db4d623d-b514-490b-b7ef-8885eee514de)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Cloud App Security (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Power Apps for Office 365 (Plan 3) (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>RETIRED - Microsoft Communications Compliance (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>SharePoint (Plan 2) (5dbe027f-2339-4123-9542-606e4d348a72)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Viva Learning Seeded (b76fb638-6ba6-402a-b9f9-83d28acb3d86)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653)<br/>Common Data Service (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Power Automate for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Power Virtual Agents for Office 365 (ded3d325-1bdc-453e-8432-5bac26d7a014) |
| Office 365 F3 | DESKLESSPACK | 4b585984-651b-448a-9e53-3b10f069cf7f | DYN365_CDS_O365_F1 (ca6e61ec-d4f4-41eb-8b88-d96e0e14323f)<br/>CDS_O365_F1 (90db65a7-bf11-4904-a79f-ef657605145b)<br/>EXCHANGE_S_DESKLESS (4a82b400-a79f-41a4-b4e2-e94f5787b113)<br/>RMS_S_BASIC (31cf2cfc-6b0d-4adc-a336-88b724ed8122)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>FORMS_PLAN_K (f07046bd-2a3c-4b96-b0be-dea79d7cbfb8)<br/>KAIZALA_O365_P1 (73b2a583-6a59-42e3-8e83-54db46bc3278)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_K (3ffba0d2-38e5-4d5e-8ec0-98f2b05c09d9)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>SHAREPOINTWAC (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>OFFICEMOBILE_SUBSCRIPTION (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>POWERAPPS_O365_S1 (e0287f9f-e222-4f98-9a83-f379e249159a)<br/>FLOW_O365_S1 (bd91b1a4-9f94-4ecf-b45b-3a65e5c8128a)<br/>POWER_VIRTUAL_AGENTS_O365_F1 (ba2fdb48-290b-4632-b46a-e4ecc58ac11a)<br/>PROJECT_O365_F3 (7f6f28c2-34bb-4d4b-be36-48ca2e77e1ec)<br/>SHAREPOINTDESKLESS (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>MCOIMP (afc06cb0-b4f4-4473-8286-d644f70d8faf)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_FIRSTLINE (80873e7a-cd2a-4e67-b061-1b5381a676a5)<br/>WHITEBOARD_FIRSTLINE1 (36b29273-c6d0-477a-aca6-6fbe24f538e3)<br/>YAMMER_ENTERPRISE (7547a3fe-08ee-4ccb-b430-5077c5041653) | Common Data Service - O365 F1 (ca6e61ec-d4f4-41eb-8b88-d96e0e14323f)<br/>Common Data Service for Teams_F1 (90db65a7-bf11-4904-a79f-ef657605145b)<br/>Exchange Online Kiosk (4a82b400-a79f-41a4-b4e2-e94f5787b113)<br/>Microsoft Azure Rights Management Service (31cf2cfc-6b0d-4adc-a336-88b724ed8122)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan F1) (f07046bd-2a3c-4b96-b0be-dea79d7cbfb8)<br/>Microsoft Kaizala Pro Plan 1 (73b2a583-6a59-42e3-8e83-54db46bc3278)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for Office 365 F3 (3ffba0d2-38e5-4d5e-8ec0-98f2b05c09d9)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office for the Web (e95bec33-7c88-4a70-8e19-b10bd9d0c014)<br/>Office Mobile Apps for Office 365 (c63d4d19-e8cb-460e-b37c-4d6c34603745)<br/>Power Apps for Office 365 F3 (e0287f9f-e222-4f98-9a83-f379e249159a)<br/>Power Automate for Office 365 F3 (bd91b1a4-9f94-4ecf-b45b-3a65e5c8128a)<br/>Power Virtual Agents for Office 365 F1 (ba2fdb48-290b-4632-b46a-e4ecc58ac11a)<br/>Project for Office (Plan F) (7f6f28c2-34bb-4d4b-be36-48ca2e77e1ec)<br/>SharePoint Kiosk (902b47e5-dcb2-4fdc-858b-c63a90a2bdb9)<br/>Skype for Business Online (Plan 1) (afc06cb0-b4f4-4473-8286-d644f70d8faf)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Firstline) (80873e7a-cd2a-4e67-b061-1b5381a676a5)<br/>Whiteboard (Firstline) (36b29273-c6d0-477a-aca6-6fbe24f538e3)<br/>Yammer Enterprise (7547a3fe-08ee-4ccb-b430-5077c5041653) |
| Office 365 G1 GCC | STANDARDPACK_GOV | 3f4babde-90ec-47c6-995d-d223749065d1 | DYN365_CDS_O365_P1_GCC (8eb5e9bc-783f-4425-921a-c65f45dd72c6)<br/>CDS_O365_P1_GCC (959e5dec-6522-4d44-8349-132c27c3795a)<br/>EXCHANGE_S_STANDARD_GOV (e9b4930a-925f-45e2-ac2a-3f7788ca6fdd)<br/>FORMS_GOV_E1 (f4cba850-4f34-4fd2-a341-0fddfdce1e8f)<br/>MYANALYTICS_P2_GOV (6e5b7995-bd4f-4cbd-9d19-0e32010c72f0)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>STREAM_O365_E1_GOV (15267263-5986-449d-ac5c-124f3b49b2d6)<br/>TEAMS_GOV (304767db-7d23-49e8-a945-4a7eb65f9f28)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>PROJECTWORKMANAGEMENT_GOV (5b4ef465-7ea1-459a-9f91-033317755a51)<br/>SHAREPOINTWAC_GOV (8f9f0f3b-ca90-406c-a842-95579171f8ec)<br/>OFFICEMOBILE_SUBSCRIPTION_GOV (4ccb60ee-9523-48fd-8f63-4b090f1ad77a)<br/>POWERAPPS_O365_P1_GOV (c42aa49a-f357-45d5-9972-bc29df885fee)<br/>FLOW_O365_P1_GOV (ad6c8870-6356-474c-901c-64d7da8cea48)<br/>SharePoint Plan 1G (f9c43823-deb4-46a8-aa65-8b551f0c4f8a)<br/>MCOSTANDARD_GOV (a31ef4a2-f787-435e-8335-e47eb0cafc94)<br/>BPOS_S_TODO_1 (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>WHITEBOARD_PLAN1 (b8afc642-032e-4de5-8c0a-507a7bba7e5d) | Common Data Service - O365 P1 GCC (8eb5e9bc-783f-4425-921a-c65f45dd72c6)<br/>Common Data Service for Teams_P1 GCC (959e5dec-6522-4d44-8349-132c27c3795a)<br/>Exchange Online (Plan 1) for Government (e9b4930a-925f-45e2-ac2a-3f7788ca6fdd)<br/>Forms for Government (Plan E1) (f4cba850-4f34-4fd2-a341-0fddfdce1e8f)<br/>Insights by MyAnalytics for Government (6e5b7995-bd4f-4cbd-9d19-0e32010c72f0)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft Stream for O365 for Government (E1) (15267263-5986-449d-ac5c-124f3b49b2d6)<br/>Microsoft Teams for Government (304767db-7d23-49e8-a945-4a7eb65f9f28)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Planner for Government (5b4ef465-7ea1-459a-9f91-033317755a51)<br/>Office for the Web for Government (8f9f0f3b-ca90-406c-a842-95579171f8ec)<br/>Office Mobile Apps for Office 365 for GCC (4ccb60ee-9523-48fd-8f63-4b090f1ad77a)<br/>Power Apps for Office 365 for Government (c42aa49a-f357-45d5-9972-bc29df885fee)<br/>Power Automate for Office 365 for Government (ad6c8870-6356-474c-901c-64d7da8cea48)<br/>SharePoint Plan 1G (f9c43823-deb4-46a8-aa65-8b551f0c4f8a)<br/>Skype for Business Online (Plan 2) for Government (a31ef4a2-f787-435e-8335-e47eb0cafc94)<br/>To-Do (Plan 1) (5e62787c-c316-451f-b873-1d05acd4d12c)<br/>Whiteboard (Plan 1) (b8afc642-032e-4de5-8c0a-507a7bba7e5d) |
| Office 365 G3 GCC | ENTERPRISEPACK_GOV | 535a3a29-c5f0-42fe-8215-d3b9e1f38c4a | RMS_S_ENTERPRISE_GOV (6a76346d-5d6e-4051-9fe3-ed3f312b5597)<br/>DYN365_CDS_O365_P2_GCC (06162da2-ebf9-4954-99a0-00fee96f95cc)<br/>CDS_O365_P2_GCC (a70bbf38-cdda-470d-adb8-5804b8770f41)<br/>EXCHANGE_S_ENTERPRISE_GOV (8c3069c0-ccdb-44be-ab77-986203a67df2)<br/>FORMS_GOV_E3 (24af5f65-d0f3-467b-9f78-ea798c4aeffc)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2_GOV (6e5b7995-bd4f-4cbd-9d19-0e32010c72f0)<br/>OFFICESUBSCRIPTION_GOV (de9234ff-6483-44d9-b15e-dca72fdd27af)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>STREAM_O365_E3_GOV (2c1ada27-dbaa-46f9-bda6-ecb94445f758)<br/>TEAMS_GOV (304767db-7d23-49e8-a945-4a7eb65f9f28)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>PROJECTWORKMANAGEMENT_GOV (5b4ef465-7ea1-459a-9f91-033317755a51)<br/>SHAREPOINTWAC_GOV (8f9f0f3b-ca90-406c-a842-95579171f8ec)<br/>POWERAPPS_O365_P2_GOV (0a20c815-5e81-4727-9bdc-2b5a117850c3)<br/>FLOW_O365_P2_GOV (c537f360-6a00-4ace-a7f5-9128d0ac1e4b)<br/>SHAREPOINTENTERPRISE_GOV (153f85dd-d912-4762-af6c-d6e0fb4f6692)<br/>MCOSTANDARD_GOV (a31ef4a2-f787-435e-8335-e47eb0cafc94) | AZURE RIGHTS MANAGEMENT (6a76346d-5d6e-4051-9fe3-ed3f312b5597)<br/>COMMON DATA SERVICE - O365 P2 GCC (06162da2-ebf9-4954-99a0-00fee96f95cc)<br/>COMMON DATA SERVICE FOR TEAMS_P2 GCC (a70bbf38-cdda-470d-adb8-5804b8770f41)<br/>EXCHANGE PLAN 2G (8c3069c0-ccdb-44be-ab77-986203a67df2)<br/>FORMS FOR GOVERNMENT (PLAN E3) (24af5f65-d0f3-467b-9f78-ea798c4aeffc)<br/>INFORMATION PROTECTION AND GOVERNANCE ANALYTICS – PREMIUM (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>INFORMATION PROTECTION AND GOVERNANCE ANALYTICS – STANDARD (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>INFORMATION PROTECTION FOR OFFICE 365 – STANDARD (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INSIGHTS BY MYANALYTICS FOR GOVERNMENT (6e5b7995-bd4f-4cbd-9d19-0e32010c72f0)<br/>MICROSOFT 365 APPS FOR ENTERPRISE G (de9234ff-6483-44d9-b15e-dca72fdd27af)<br/>MICROSOFT BOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>MICROSOFT STREAM FOR O365 FOR GOVERNMENT (E3) (2c1ada27-dbaa-46f9-bda6-ecb94445f758)<br/>MICROSOFT TEAMS FOR GOVERNMENT (304767db-7d23-49e8-a945-4a7eb65f9f28)<br/>MOBILE DEVICE MANAGEMENT FOR OFFICE 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>OFFICE 365 PLANNER FOR GOVERNMENT (5b4ef465-7ea1-459a-9f91-033317755a51)<br/>OFFICE FOR THE WEB (GOVERNMENT) (8f9f0f3b-ca90-406c-a842-95579171f8ec)<br/>POWER APPS FOR OFFICE 365 FOR GOVERNMENT (0a20c815-5e81-4727-9bdc-2b5a117850c3)<br/>POWER AUTOMATE FOR OFFICE 365 FOR GOVERNMENT (c537f360-6a00-4ace-a7f5-9128d0ac1e4b)<br/>SHAREPOINT PLAN 2G (153f85dd-d912-4762-af6c-d6e0fb4f6692)<br/>SKYPE FOR BUSINESS ONLINE (PLAN 2) FOR GOVERNMENT (a31ef4a2-f787-435e-8335-e47eb0cafc94) |
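Scripts that audit license assignments often turn a table like the one above into a GUID-to-name lookup. A minimal sketch (illustrative only; the two entries are copied from the table above, and a real audit would load the full table):

```python
# Map service plan GUIDs (from the table above) to their friendly names.
# Only two entries are shown here as an illustration.
SERVICE_PLANS = {
    "57ff2da0-773e-42df-b2af-ffb7a2317929": "Microsoft Teams",
    "e95bec33-7c88-4a70-8e19-b10bd9d0c014": "Office for the Web",
}

def plan_name(guid: str) -> str:
    """Return the friendly name for a service plan GUID, if known."""
    # GUIDs are compared case-insensitively, since APIs may return either case.
    return SERVICE_PLANS.get(guid.lower(), "unknown service plan")

print(plan_name("57FF2DA0-773E-42DF-B2AF-FFB7A2317929"))  # → Microsoft Teams
```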
active-directory Tutorial Bulk Invite https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/tutorial-bulk-invite.md
Title: Tutorial for bulk inviting B2B collaboration users - Azure AD
-description: In this tutorial, you learn how to use PowerShell and a CSV file to send bulk invitations to external Azure AD B2B collaboration users. You'll use the Microsoft.Graph.Users PowerShell module.
+description: In this tutorial, you learn how to send bulk invitations using a CSV file to external Azure AD B2B collaboration users.
Previously updated : 02/16/2022 Last updated : 10/24/2022
# Customer intent: As a tenant administrator, I want to send B2B invitations to multiple external users at the same time so that I can avoid having to send individual invitations to each user.

+ # Tutorial: Bulk invite Azure AD B2B collaboration users
-If you use Azure Active Directory (Azure AD) B2B collaboration to work with external partners, you can invite multiple guest users to your organization at the same time. In this tutorial, you learn how to use the Azure portal to send bulk invitations to external users. Specifically, you'll follow these steps:
+If you use [Azure Active Directory (Azure AD) B2B collaboration](what-is-b2b.md) to work with external partners, you can invite multiple guest users to your organization at the same time. In this tutorial, you learn how to use the Azure portal to send bulk invitations to external users. Specifically, you'll follow these steps:
> [!div class="checklist"]
+>
> * Use **Bulk invite users** to prepare a comma-separated value (.csv) file with the user information and invitation preferences
> * Upload the .csv file to Azure AD
> * Verify the users were added to the directory

If you don't have Azure Active Directory, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-## Understand the CSV template
-
-Download and fill in the bulk upload CSV template to help you successfully invite Azure AD guest users in bulk. The CSV template you download might look like this example:
-
-![Spreadsheet for upload and call-outs explaining the purpose and values for each row and column](media/tutorial-bulk-invite/understand-template.png)
-
-### CSV template structure
-
-The rows in a downloaded CSV template are as follows:
-
-- **Version number**: The first row containing the version number must be included in the upload CSV.
-- **Column headings**: The format of the column headings is &lt;*Item name*&gt; [PropertyName] &lt;*Required or blank*&gt;. For example, `Email address to invite [inviteeEmail] Required`. Some older versions of the template might have slight variations.
-- **Examples row**: We've included in the template a row of examples of values for each column. You must remove the examples row and replace it with your own entries.
-
-### Additional guidance
-
-- The first two rows of the upload template must not be removed or modified, or the upload can't be processed.
-- The required columns are listed first.
-- We don't recommend adding new columns to the template. Any columns you add are ignored and not processed.
-- We recommend that you download the latest version of the CSV template as often as possible.

## Prerequisites

You need two or more test email accounts that you can send the invitations to. The accounts must be from outside your organization. You can use any type of account, including social accounts such as gmail.com or outlook.com addresses.

+
+## Invite guest users in bulk

1. Sign in to the Azure portal with an account that is a global administrator in the organization.
3. Under **Manage**, select **All Users**.

4. Select **Bulk operations** > **Bulk invite**.
- ![Bulk invite button](media/tutorial-bulk-invite/bulk-invite-button.png)
+ :::image type="content" source="media/tutorial-bulk-invite/bulk-invite-button.png" alt-text="Screenshot of the bulk invite button.":::
-4. On the **Bulk invite users** page, select **Download** to get a valid .csv template with invitation properties.
- ![Download the CSV file](media/tutorial-bulk-invite/download-button.png)
+4. On the **Bulk invite users** page, select **Download** to get a [valid .csv template](tutorial-bulk-invite.md#understand-the-csv-template) with invitation properties.
+
+ :::image type="content" source="media/tutorial-bulk-invite/download-button.png" alt-text="Screenshot of the download the csv file button.":::
1. Open the .csv template and add a line for each guest user. Required values are:
* **Redirection url** - the URL to which the invited user is forwarded after accepting the invitation. If you want to forward the user to the My Apps page, you must change this value to https://myapps.microsoft.com or https://myapplications.microsoft.com.
- ![Example of a CSV file with guest users entered](media/tutorial-bulk-invite/bulk-invite-csv.png)
+ :::image type="content" source="media/tutorial-bulk-invite/bulk-invite-csv.png" alt-text="Screenshot of the example csv file with guest users entered.":::
> [!NOTE] > Don't use commas in the **Customized invitation message** because they'll prevent the message from being parsed successfully.
9. When your file passes validation, select **Submit** to start the Azure bulk operation that adds the invitations.

10. To view the job status, select **Click here to view the status of each operation**. Or, you can select **Bulk operation results** in the **Activity** section. For details about each line item within the bulk operation, select the values under the **# Success**, **# Failure**, or **Total Requests** columns. If failures occurred, the reasons for failure will be listed.
- ![Example of bulk operation results](media/tutorial-bulk-invite/bulk-operation-results.png)
+ :::image type="content" source="media/tutorial-bulk-invite/bulk-operation-results.png" alt-text="Screenshot of the bulk operation results." lightbox="media/tutorial-bulk-invite/bulk-operation-results.png":::
+ 11. When the job completes, you'll see a notification that the bulk operation succeeded. +
+## Understand the CSV template
+
+Download and fill in the bulk upload CSV template to help you successfully invite Azure AD guest users in bulk. The CSV template you download might look like this example:
+
+![Spreadsheet for upload and call-outs explaining the purpose and values for each row and column](media/tutorial-bulk-invite/understand-template.png)
+
+### CSV template structure
+
+The rows in a downloaded CSV template are as follows:
+
+- **Version number**: The first row containing the version number must be included in the upload CSV.
+- **Column headings**: The format of the column headings is &lt;*Item name*&gt; [PropertyName] &lt;*Required or blank*&gt;. For example, `Email address to invite [inviteeEmail] Required`. Some older versions of the template might have slight variations.
+- **Examples row**: We've included in the template a row of examples of values for each column. You must remove the examples row and replace it with your own entries.
+
+### Additional guidance
+
+- The first two rows of the upload template must not be removed or modified, or the upload can't be processed.
+- The required columns are listed first.
+- We don't recommend adding new columns to the template. Any columns you add are ignored and not processed.
+- We recommend that you download the latest version of the CSV template as often as possible.
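The template rules described above lend themselves to a quick automated check before upload. A minimal sketch (illustrative: the headings and sample row below are assumptions based on the format described, not the literal template; always use the headings from the template you actually downloaded):

```python
import csv
import io

# Illustrative template content; real headings come from the downloaded file.
template = """version:v1.0
Email address to invite [inviteeEmail] Required,Redirection url [inviteRedirectURL] Required
user1@contoso.com,https://myapps.microsoft.com
"""

def validate_bulk_invite_csv(text):
    """Check the structural rules described above: a version row first,
    then a heading row that includes the required inviteeEmail column."""
    rows = list(csv.reader(io.StringIO(text)))
    errors = []
    if not rows or not rows[0][0].startswith("version:"):
        errors.append("first row must contain the version number")
    if len(rows) < 2 or "[inviteeEmail]" not in ",".join(rows[1]):
        errors.append("heading row must include the inviteeEmail column")
    return errors

print(validate_bulk_invite_csv(template))  # → []
```

An empty list means the file keeps the two fixed rows the upload requires; anything else is reported before you waste a round trip to the portal.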
++
+## Verify guest users in the directory

Check to see that the guest users you added exist in the directory either in the Azure portal or by using PowerShell.
For example: `Remove-MgUser -UserId "lstokes_fabrikam.com#EXT#@contoso.onmicroso
## Next steps
-In this tutorial, you sent bulk invitations to guest users outside of your organization. Next, learn how the invitation redemption process works.
+In this tutorial, you sent bulk invitations to guest users outside of your organization. Next, learn how the invitation redemption process works, and how to enforce multi-factor authentication for guest users.
+
-> [!div class="nextstepaction"]
-> [Learn about the Azure AD B2B collaboration invitation redemption process](redemption-experience.md)
+- [Learn about the Azure AD B2B collaboration invitation redemption process](redemption-experience.md)
+- [Enforce multi-factor authentication for B2B guest users](b2b-tutorial-require-mfa.md)
active-directory How To Upgrade Previous Version https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/how-to-upgrade-previous-version.md
This method is preferred when you have a single server and less than about 100,0
![In-place upgrade](./media/how-to-upgrade-previous-version/inplaceupgrade.png)
-If you've made changes to the out-of-box synchronization rules, then these rules are set back to the default configuration on upgrade. To make sure that your configuration is kept between upgrades, make sure that you make changes as they're described in [Best practices for changing the default configuration](how-to-connect-sync-best-practices-changing-default-configuration.md). If you already changed the default sync rules, please see how to [Fix modified default rules in Azure AD Connect](/active-directory/hybrid/how-to-connect-sync-best-practices-changing-default-configuration), before starting the upgrade process.
+If you've made changes to the out-of-box synchronization rules, then these rules are set back to the default configuration on upgrade. To make sure that your configuration is kept between upgrades, make sure that you make changes as they're described in [Best practices for changing the default configuration](how-to-connect-sync-best-practices-changing-default-configuration.md). If you already changed the default sync rules, please see how to [Fix modified default rules in Azure AD Connect](/azure/active-directory/hybrid/how-to-connect-sync-best-practices-changing-default-configuration), before starting the upgrade process.
During in-place upgrade, there may be changes introduced that require specific synchronization activities (including Full Import step and Full Synchronization step) to be executed after upgrade completes. To defer such activities, refer to section [How to defer full synchronization after upgrade](#how-to-defer-full-synchronization-after-upgrade).
active-directory Concept Workload Identity Risk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/identity-protection/concept-workload-identity-risk.md
We detect risk on workload identities across sign-in behavior and offline indica
| Leaked Credentials | Offline | This risk detection indicates that the account's valid credentials have been leaked. This leak can occur when someone checks in the credentials in a public code artifact on GitHub, or when the credentials are leaked through a data breach. <br><br> When the Microsoft leaked credentials service acquires credentials from GitHub, the dark web, paste sites, or other sources, they're checked against current valid credentials in Azure AD to find valid matches. |
| Malicious application | Offline | This detection indicates that Microsoft has disabled an application for violating our terms of service. We recommend [conducting an investigation](https://go.microsoft.com/fwlink/?linkid=2208429) of the application.|
| Suspicious application | Offline | This detection indicates that Microsoft has identified an application that may be violating our terms of service, but has not disabled it. We recommend [conducting an investigation](https://go.microsoft.com/fwlink/?linkid=2208429) of the application.|
+| Anomalous service principal activity | Offline | This risk detection indicates that suspicious patterns of activity have been identified for an authenticated service principal. The post-authentication behavior of service principals is assessed for anomalies. This behavior is based on actions occurring for the account, along with any sign-in risk detected. |
## Identify risky workload identities
active-directory Manage Self Service Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/manage-apps/manage-self-service-access.md
To enable self-service application access to an application, follow the steps be
1. Select the **Save** button at the top of the pane to finish.
-Once you complete self-service application configuration, users can navigate to their My Apps portal and select **Add self-service apps** to find the apps that are enabled with self-service access. Business approvers also see a notification in their My Apps portal. You can enable an email notifying them when a user has requested access to an application that requires their approval.
+Once you complete self-service application configuration, users can navigate to their My Apps portal and select **Request new apps** to find the apps that are enabled with self-service access. Business approvers also see a notification in their My Apps portal. You can enable an email notifying them when a user has requested access to an application that requires their approval.
## Next steps
active-directory Howto Verifiable Credentials Partner Au10tix https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/howto-verifiable-credentials-partner-au10tix.md
As a developer you can share these steps with your tenant administrator to obtai
As a developer, now that you have the request URL and body from your tenant admin, follow these steps to update your application or website:
-1. Add the request URL and body to your application or website to request Verified IDs from your users. Note: If you are using [one of the sample apps](https://aka.ms/vcsample), you'll need to replace the contents of the presentation_request_config.json with the request body obtained.
+1. Add the request URL and body to your application or website to request Verified IDs from your users.
+ >[!Note]
+ >If you are using [one of the sample apps](https://aka.ms/vcsample), you'll need to replace the contents of the `presentation_request_config.json` with the request body obtained in [Part 1](#part-1). The sample code overwrites the `trustedIssuers` values with the `IssuerAuthority` value from `appsettings.json`. Copy the `trustedIssuers` value from the payload to `IssuerAuthority` in the `appsettings.json` file.
1. Be sure to replace the values for the "url", "state", and "api-key" with your respective values.
1. [Grant permissions](verifiable-credentials-configure-tenant.md#grant-permissions-to-get-access-tokens) to your app to obtain an access token for the Verified ID service request service principal.
active-directory Howto Verifiable Credentials Partner Lexisnexis https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/howto-verifiable-credentials-partner-lexisnexis.md
As a developer you'll provide the steps below to your tenant administrator. The
As a developer, now that you have the request URL and body from your tenant admin, follow these steps to update your application or website:

1. Add the request URL and body to your application or website to request Verified IDs from your users.
- >[!NOTE]
- > If you are using [one of the sample apps](https://aka.ms/vcsample) you need to replace the contents of the presentation_request_config.json with the request body obtained.
+ >[!Note]
>If you are using [one of the sample apps](https://aka.ms/vcsample), you'll need to replace the contents of the `presentation_request_config.json` with the request body obtained in [Part 1](#part-1). The sample code overwrites the `trustedIssuers` values with the `IssuerAuthority` value from `appsettings.json`. Copy the `trustedIssuers` value from the payload to `IssuerAuthority` in the `appsettings.json` file.
1. Replace the values for the "url", "state", and "api-key" with your respective values.
1. Grant your app [permissions](verifiable-credentials-configure-tenant.md#grant-permissions-to-get-access-tokens) to obtain an access token for the Verified ID service request service principal.
active-directory Partner Gallery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/partner-gallery.md
To be considered into Entra Verified ID partner documentation, submit your appli
|:-|:--|:--|
|![Screenshot of au10tix logo.](media/partner-gallery/au10tix.png) | [AU10TIX](https://www.au10tix.com/solutions/microsoft-azure-active-directory-verifiable-credentials-program) improves Verifiability While Protecting Privacy For Businesses, Employees, Contractors, Vendors, And Customers. | [Configure Verified ID by AU10TIX as your Identity Verification Partner](https://aka.ms/au10tixvc). |
| ![Screenshot of a LexisNexis logo.](media/partner-gallery/lexisnexis.png) | [LexisNexis](https://solutions.risk.lexisnexis.com/did-microsoft) risk solutions Verifiable credentials enables faster onboarding for employees, students, citizens, or others to access services. | [Configure Verified ID by LexisNexis Risk Solutions as your Identity Verification Partner](https://aka.ms/lexisnexisvc). |
+| ![Screenshot of a Vu logo.](media/partner-gallery/vu.png) | [VU Security](https://landings.vusecurity.com/microsoft-verifiable-credentials) Verifiable credentials with just a selfie and your ID. | [Configure Verified ID by VU Identity Card as your Identity Verification Partner](partner-vu.md) |
| ![Screenshot of an Onfido logo.](media/partner-gallery/onfido.jpeg) | [Onfido](https://onfido.com/landing/onfido-microsoft-idv-service/) Start issuing and accepting verifiable credentials in minutes. With verifiable credentials and Onfido you can verify a person's identity while respecting privacy. Digitally validate information on a person's ID or their biometrics.| * |
-| ![Screenshot of a Vu logo.](media/partner-gallery/vu.png) | [Vu Security](https://landings.vusecurity.com/microsoft-verifiable-credentials) Verifiable credentials with just a selfie and your ID.| * |
| ![Screenshot of a Jumio logo.](media/partner-gallery/jumio.jpeg) | [Jumio](https://www.jumio.com/microsoft-verifiable-credentials/) is helping to support a new form of digital identity by Microsoft based on verifiable credentials and decentralized identifiers standards to let consumers verify once and use everywhere.| * |
| ![Screenshot of an Idemia logo.](media/partner-gallery/idemia.png) | [Idemia](https://na.idemia.com/identity/verifiable-credentials/) Integration with Verified ID enables "Verify once, use everywhere" functionality.| * |
| ![Screenshot of an Acuant logo.](media/partner-gallery/acuant.png) | [Acuant](https://www.acuant.com/microsoft-acuant-verifiable-credentials-my-digital-id/) - My Digital ID - Create Your Digital Identity Once, Use It Everywhere.| * |
active-directory Partner Vu https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/verifiable-credentials/partner-vu.md
+
+ Title: Configure Verified ID by VU Identity Card as your Identity Verification Partner
+description: This article shows you the steps you need to follow to configure VU Identity Card as your identity verification partner
++++++ Last updated : 10/26/2022+
+# Customer intent: As a developer, I'm looking for information about the open standards that are supported by Microsoft Entra Verified ID.
++
+# Configure Verified ID by VU Identity Card as your Identity Verification Partner
+
+In this article, we cover the steps needed to integrate Microsoft Entra Verified ID with VU Identity Card, a product of [VU Security](https://www.vusecurity.com/). VU Identity Card creates secure and frictionless digital experiences that enhance
+biometric onboarding and verification scenarios throughout the lifecycle
+of citizens and organizations.
+
+VU Identity Card provides flexible and simple onboarding, authentication and
+verification processes on any device. It focuses on user experience and
+security without impacting the business.
+
+To learn more about VU Security and its complete set of solutions, visit
+<https://www.vusecurity.com>
+
+## Prerequisites
+
+To get started with the VU Identity Card, ensure the following prerequisites are met:
+
+- A tenant [configured](https://learn.microsoft.com/azure/active-directory/verifiable-credentials/verifiable-credentials-configure-tenant)
+ for Entra Verified ID service.
+
+ - If you don't have an existing tenant, you can [create an Azure
+ account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F)
+ for free.
+
+- The tenant must complete the VU Identity Card onboarding process.
+
+ - To create an account, [contact VU Security](https://landings.vusecurity.com/microsoft-verifiable-credentials/).
+
+>[!Important]
+>Before you proceed, you must have received the URL from VU Security for users to be issued Verified IDs. If you haven't received it yet, follow up with VU Security before attempting the steps documented in this article.
+
+## Scenario description
+
+VU Identity Card works as a link between users who need to access an application and applications that require secure access control, regardless of how users access the system.
+
+Verifiable credentials can be used to enable faster and easier user onboarding by replacing some human interactions. For example, a user or employee who wants to create or remotely access an account can use a Verified ID through VU Identity Card to verify their identity without using vulnerable or overly complex passwords or the requirement to be on-site.
+
+Learn more about [account onboarding](https://learn.microsoft.com/azure/active-directory/verifiable-credentials/plan-verification-solution#account-onboarding).
+
+In this account onboarding scenario, VU Security plays the role of the trusted ID proofing issuer.
++
+## Configure your application to use VU Identity Card
+
+Follow these steps to incorporate the VU Identity Card solution into your apps.
+
+### Part 1
+
+As a developer, you can share these steps with your tenant administrator to obtain the verification request URL and body for your application or website to request Verified IDs from your users.
+
+1. Go to Microsoft Entra portal - [**Verified ID**](https://entra.microsoft.com/#view/Microsoft_AAD_DecentralizedIdentity/ResourceOverviewBlade)
+
+ >[!NOTE]
+ >Verify that the tenant configured for Verified ID meets the prerequisites.
+
+2. Go to **Quickstart** > **Verification Request** >
+ [**Start**](https://entra.microsoft.com/#view/Microsoft_AAD_DecentralizedIdentity/QuickStartVerifierBlade)
+
+3. Choose **Select Issuer**.
+
+4. Look for **VUSecurity** in the Search/select issuers
+ drop-down.
+
+ [ ![Screenshot of the portal section used to choose issuers.](./media/partner-vu/select-issuers.png)](./media/partner-vu/select-issuers.png#lightbox)
+
+5. Check the **VUIdentityCard** credential, which includes attributes
+   such as firstname, lastname, number, country, gender, birth-date, and nationality, or any other credential type.
+ >[!NOTE]
+ >The number attribute refers to the national ID. For example, the DNI (National Identification Number) in Argentina.
+
+6. Select **Add** and then select **Review**.
+
+7. Download the request body and copy the POST API request URL.
+
+### Part 2
+
+As a developer, now that you have the request URL and body from your tenant admin, follow these steps to update your application or website:
+
+1. Add the request URL and body to your application or website to request Verified IDs from your users.
+ >[!Note]
+ >If you are using [one of the sample apps](https://aka.ms/vcsample), you'll need to replace the contents of the `presentation_request_config.json` with the request body obtained in [Part 1](#part-1). The sample code overwrites the `trustedIssuers` values with the `IssuerAuthority` value from `appsettings.json`. Copy the `trustedIssuers` value from the payload to `IssuerAuthority` in the `appsettings.json` file.
+
+1. Be sure to replace the values for the **url**, **state**, and **api-key** with your respective values.
+
+1. [Grant permissions](verifiable-credentials-configure-tenant.md#grant-permissions-to-get-access-tokens) to your app to obtain an access token for the Verified ID service request service principal.
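As an illustration of the note in step 1, the fragment below shows where `trustedIssuers` lives in a sample presentation request config. This is a sketch only: the field names follow the public Verified ID sample apps and may differ in your sample version, and the DID is a placeholder, not a real issuer.

```json
{
  "presentation": {
    "requestedCredentials": [
      {
        "type": "VUIdentityCard",
        "trustedIssuers": [ "did:web:issuer.example.com" ]
      }
    ]
  }
}
```

Whatever DID appears under `trustedIssuers` in the request body you downloaded should also be set as the `IssuerAuthority` value in `appsettings.json`, since the sample overwrites the request with that value at runtime.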
+
+## Test the user flow
+
+User flow is specific to your application or website. However, if you are using one of the sample apps, follow the steps outlined in the [sample app's documentation](https://aka.ms/vcsample).
+
+## Next steps
+
+- [Verifiable credentials admin API](admin-api.md)
+- [Request Service REST API issuance specification](issuance-request-api.md)
aks Cluster Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/cluster-configuration.md
description: Learn how to configure a cluster in Azure Kubernetes Service (AKS)
Previously updated : 10/04/2022 Last updated : 10/28/2022 # Configure an AKS cluster
This enables an OIDC Issuer URL of the provider which allows the API server to d
### Prerequisites
-* The Azure CLI version 2.40.0 or higher. Run `az --version` to find your version. If you need to install or upgrade, see [Install Azure CLI][azure-cli-install].
+* The Azure CLI version 2.42.0 or higher. Run `az --version` to find your version. If you need to install or upgrade, see [Install Azure CLI][azure-cli-install].
* AKS version 1.22 and higher. If your cluster is running version 1.21 and the OIDC Issuer preview is enabled, we recommend you upgrade the cluster to the minimum required version supported.

### Create an AKS cluster with OIDC Issuer
aks Cluster Extensions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/cluster-extensions.md
A conceptual overview of this feature is available in [Cluster extensions - Azur
* [Azure CLI](/cli/azure/install-azure-cli) version >= 2.16.0 installed.

> [!NOTE]
-> If you have enabled [AAD-based pod identity][use-azure-ad-pod-identity] on your AKS cluster or are considering implementing it,
-> we recommend you first review [Migrate to workload identity][migrate-workload-identity] to understand our
+> If you have enabled [Azure AD pod-managed identity][use-azure-ad-pod-identity] on your AKS cluster or are considering implementing it,
+> we recommend you first review [Workload identity overview][workload-identity-overview] to understand our
> recommendations and options to set up your cluster to use an Azure AD workload identity (preview). > This authentication method replaces pod-managed identity (preview), which integrates with the Kubernetes native capabilities > to federate with any external identity providers.
+>
+> The open source Azure AD pod-managed identity (preview) in Azure Kubernetes Service has been deprecated as of 10/24/2022.
### Set up the Azure CLI extension for cluster extensions
az k8s-extension delete --name azureml --cluster-name <clusterName> --resource-g
[gitops-overview]: ../azure-arc/kubernetes/conceptual-gitops-flux2.md [k8s-extension-reference]: /cli/azure/k8s-extension [use-managed-identity]: ./use-managed-identity.md
-[migrate-workload-identity]: workload-identity-overview.md
+[workload-identity-overview]: workload-identity-overview.md
+[use-azure-ad-pod-identity]: use-azure-ad-pod-identity.md
<!-- EXTERNAL --> [arc-k8s-regions]: https://azure.microsoft.com/global-infrastructure/services/?products=azure-arc&regions=all
aks Csi Secrets Store Identity Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/csi-secrets-store-identity-access.md
The Secrets Store CSI Driver on Azure Kubernetes Service (AKS) provides a variet
An [Azure AD workload identity][workload-identity] is an identity used by an application running on a pod that can authenticate itself against other Azure services that support it, such as Storage or SQL. It integrates with the capabilities native to Kubernetes to federate with external identity providers. In this security model, the AKS cluster acts as token issuer, where Azure Active Directory uses OpenID Connect to discover public signing keys and verify the authenticity of the service account token before exchanging it for an Azure AD token. Your workload can exchange a service account token projected to its volume for an Azure AD token using the Azure Identity client library from the Azure SDK, or the Microsoft Authentication Library (MSAL). > [!NOTE]
-> This authentication method replaces pod-managed identity (preview).
+> This authentication method replaces Azure AD pod-managed identity (preview). The open source Azure AD pod-managed identity (preview) in Azure Kubernetes Service has been deprecated as of 10/24/2022.
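A minimal sketch of how this model is typically wired up in a cluster (the namespace, service account name, client ID, and image below are placeholders, not values from this article): a Kubernetes service account is annotated with the Azure AD application's client ID, and the pod opts in to the projected service account token with a label.

```yml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: workload-identity-sa            # placeholder name
  namespace: demo                       # placeholder namespace
  annotations:
    azure.workload.identity/client-id: "<USER_ASSIGNED_CLIENT_ID>"
---
apiVersion: v1
kind: Pod
metadata:
  name: sample-workload
  namespace: demo
  labels:
    azure.workload.identity/use: "true" # projects the service account token into the pod
spec:
  serviceAccountName: workload-identity-sa
  containers:
    - name: app
      image: <YOUR_IMAGE>               # placeholder image
```

The application inside the pod can then exchange the projected token for an Azure AD token, for example through the Azure Identity client library's workload identity credential.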
### Prerequisites
aks Csi Secrets Store Nginx Tls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/csi-secrets-store-nginx-tls.md
helm install ingress-nginx/ingress-nginx --generate-name \
The ingress controller's deployment will reference the Secrets Store CSI Driver's Azure Key Vault provider. > [!NOTE]
-> If not using Azure Active Directory (AAD) pod identity as your method of access, remove the line with `--set controller.podLabels.aadpodidbinding=$AAD_POD_IDENTITY_NAME`
+> If not using Azure Active Directory (Azure AD) pod-managed identity as your method of access, remove the line with `--set controller.podLabels.aadpodidbinding=$AAD_POD_IDENTITY_NAME`
```bash helm install ingress-nginx/ingress-nginx --generate-name \
aks Csi Secrets Store Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/csi-secrets-store-troubleshooting.md
Error message in logs/events:
Warning FailedMount 74s kubelet MountVolume.SetUp failed for volume "secrets-store-inline" : kubernetes.io/csi: mounter.SetupAt failed: rpc error: code = Unknown desc = failed to mount secrets store objects for pod default/test, err: rpc error: code = Unknown desc = failed to mount objects, error: failed to get keyvault client: failed to get key vault token: nmi response failed with status code: 404, err: <nil> ```
-Description: The Node Managed Identity (NMI) component in *aad-pod-identity* returned an error for a token request. For more information about the error and to resolve it, check the NMI pod logs and refer to the [Azure AD pod identity troubleshooting guide][aad-troubleshooting].
+Description: The Node Managed Identity (NMI) component in *aad-pod-identity* returned an error for a token request. For more information about the error and to resolve it, check the NMI pod logs and refer to the [Azure AD pod-managed identity troubleshooting guide][aad-troubleshooting].
> [!NOTE] > Azure Active Directory (Azure AD) is abbreviated as *aad* in the *aad-pod-identity* string.
aks Developer Best Practices Pod Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/developer-best-practices-pod-security.md
Title: Developer best practices - Pod security in Azure Kubernetes Services (AKS
description: Learn the developer best practices for how to secure pods in Azure Kubernetes Service (AKS) Previously updated : 07/28/2020 Last updated : 10/27/2022
This best practices article focuses on how to secure pods in AKS. You learn how
> [!div class="checklist"] > * Use pod security context to limit access to processes and services or privilege escalation
-> * Authenticate with other Azure resources using pod managed identities
+> * Authenticate with other Azure resources using Azure Active Directory workload identities
> * Request and retrieve credentials from a digital vault such as Azure Key Vault You can also read the best practices for [cluster security][best-practices-cluster-security] and for [container image management][best-practices-container-image-management].
Work with your cluster operator to determine what security context settings you
To limit the risk of credentials being exposed in your application code, avoid the use of fixed or shared credentials. Credentials or keys shouldn't be included directly in your code. If these credentials are exposed, the application needs to be updated and redeployed. A better approach is to give pods their own identity and way to authenticate themselves, or automatically retrieve credentials from a digital vault.
-### Use Azure Container Compute Upstream projects
-
-> [!IMPORTANT]
-> Associated AKS open source projects are not supported by Azure technical support. They are provided for users to self-install into clusters and gather feedback from our community.
-
-The following [associated AKS open source projects][aks-associated-projects] let you automatically authenticate pods or request credentials and keys from a digital vault. These projects are maintained by the Azure Container Compute Upstream team and are part of a [broader list of projects available for use](https://github.com/Azure/container-compute-upstream/blob/master/README.md#support).
-
- * [Azure Active Directory workload identity][aad-workload-identity] (preview)
- * [Azure Key Vault Provider for Secrets Store CSI Driver](https://github.com/Azure/secrets-store-csi-driver-provider-azure#usage)
- #### Use an Azure AD workload identity (preview) A workload identity is an identity used by an application running on a pod that can authenticate itself against other Azure services that support it, such as Storage or SQL. It integrates with the capabilities native to Kubernetes to federate with external identity providers. In this security model, the AKS cluster acts as token issuer, where Azure Active Directory uses OpenID Connect to discover public signing keys and verify the authenticity of the service account token before exchanging it for an Azure AD token. Your workload can exchange a service account token projected to its volume for an Azure AD token using the Azure Identity client library from the [Azure SDK][azure-sdk-download], or the [Microsoft Authentication Library][microsoft-authentication-library] (MSAL).
-For more information about workload identities, see [Configure an AKS cluster to use Azure AD workload identities with your applications][aad-workload-identity]
+For more information about workload identities, see [Configure an AKS cluster to use Azure AD workload identities with your applications][workload-identity-overview]
#### Use Azure Key Vault with Secrets Store CSI Driver
-Using the pod identity project enables authentication against supporting Azure services. For your own services or applications without managed identities for Azure resources, you can still authenticate using credentials or keys. A digital vault can be used to store these secret contents.
+Using the [Azure AD workload identity][workload-identity-overview] enables authentication against supporting Azure services. For your own services or applications without managed identities for Azure resources, you can still authenticate using credentials or keys. A digital vault can be used to store these secret contents.
When applications need a credential, they communicate with the digital vault, retrieve the latest secret contents, and then connect to the required service. Azure Key Vault can be this digital vault. The simplified workflow for retrieving a credential from Azure Key Vault using pod managed identities is shown in the following diagram: :::image type="content" source="media/developer-best-practices-pod-security/basic-key-vault.svg" alt-text="Simplified workflow for retrieving a credential from Key Vault using a pod managed identity":::
-With Key Vault, you store and regularly rotate secrets such as credentials, storage account keys, or certificates. You can integrate Azure Key Vault with an AKS cluster using the [Azure Key Vault provider for the Secrets Store CSI Driver](https://github.com/Azure/secrets-store-csi-driver-provider-azure#usage). The Secrets Store CSI driver enables the AKS cluster to natively retrieve secret contents from Key Vault and securely provide them only to the requesting pod. Work with your cluster operator to deploy the Secrets Store CSI Driver onto AKS worker nodes. You can use a pod managed identity to request access to Key Vault and retrieve the secret contents needed through the Secrets Store CSI Driver.
+With Key Vault, you store and regularly rotate secrets such as credentials, storage account keys, or certificates. You can integrate Azure Key Vault with an AKS cluster using the [Azure Key Vault provider for the Secrets Store CSI Driver][aks-keyvault-csi-driver]. The Secrets Store CSI driver enables the AKS cluster to natively retrieve secret contents from Key Vault and securely provide them only to the requesting pod. Work with your cluster operator to deploy the Secrets Store CSI Driver onto AKS worker nodes. You can use an Azure AD workload identity to request access to Key Vault and retrieve the secret contents needed through the Secrets Store CSI Driver.
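To make the wiring concrete, here is a hedged sketch of a `SecretProviderClass` for the Azure Key Vault provider (the vault name, tenant ID, client ID, and secret name are placeholders; check the driver's documentation for the parameters your version supports):

```yml
apiVersion: secrets-store.csi.x-k8s.io/v1
kind: SecretProviderClass
metadata:
  name: azure-kv-example                # placeholder name
spec:
  provider: azure
  parameters:
    clientID: "<IDENTITY_CLIENT_ID>"    # identity allowed to read the vault
    keyvaultName: "<KEY_VAULT_NAME>"    # placeholder vault name
    tenantId: "<TENANT_ID>"
    objects: |
      array:
        - |
          objectName: ExampleSecret     # placeholder secret name
          objectType: secret
```

A pod then mounts this class through a `csi` volume with `driver: secrets-store.csi.k8s.io`, and the driver fetches the secret contents from Key Vault at mount time.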
## Next steps This article focused on how to secure your pods. To implement some of these areas, see the following articles:
-* [Use workload managed identities for Azure resources with AKS][aad-workload-identity] (preview)
+* [Use Azure AD workload identities for Azure resources with AKS][workload-identity-overview] (preview)
* [Integrate Azure Key Vault with AKS][aks-keyvault-csi-driver] <!-- EXTERNAL LINKS -->
-[aks-keyvault-csi-driver]: https://github.com/Azure/secrets-store-csi-driver-provider-azure#usage
[linux-capabilities]: http://man7.org/linux/man-pages/man7/capabilities.7.html [selinux-labels]: https://kubernetes.io/docs/reference/generated/kubernetes-api/v1.19/#selinuxoptions-v1-core [aks-associated-projects]: https://awesomeopensource.com/projects/aks?categoryPage=11 [azure-sdk-download]: https://azure.microsoft.com/downloads/ <!-- INTERNAL LINKS -->
-[aad-workload-identity]: workload-identity-overview.md
[best-practices-cluster-security]: operator-best-practices-cluster-security.md [best-practices-container-image-management]: operator-best-practices-container-image-management.md [apparmor-seccomp]: operator-best-practices-cluster-security.md#secure-container-access-to-resources
-[microsoft-authentication-library]: ../active-directory/develop/msal-overview.md
+[microsoft-authentication-library]: ../active-directory/develop/msal-overview.md
+[workload-identity-overview]: workload-identity-overview.md
+[aks-keyvault-csi-driver]: csi-secrets-store-driver.md
aks Operator Best Practices Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/operator-best-practices-identity.md
With pod-managed identities (preview) for Azure resources, you automatically req
> recommendations and options to set up your cluster to use an Azure AD workload identity (preview). > This authentication method replaces pod-managed identity (preview), which integrates with the Kubernetes native capabilities > to federate with any external identity providers.
+>
+> The open source Azure AD pod-managed identity (preview) in Azure Kubernetes Service has been deprecated as of 10/24/2022.
Azure Active Directory pod-managed identity (preview) supports two modes of operation:
For more information about cluster operations in AKS, see the following best pra
[aks-best-practices-cluster-isolation]: operator-best-practices-cluster-isolation.md [azure-ad-rbac]: azure-ad-rbac.md [aad-pod-identity]: ./use-azure-ad-pod-identity.md
-[use-azure-ad-pod-identity]: ./use-azure-ad-pod-identity.md#create-an-identity
[workload-identity-overview]: workload-identity-overview.md
aks Quickstart Helm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/quickstart-helm.md
For more information about using Helm, see the Helm documentation.
[az-aks-get-credentials]: /cli/azure/aks#az_aks_get_credentials [import-azakscredential]: /powershell/module/az.aks/import-azakscredential [az-aks-install-cli]: /cli/azure/aks#az_aks_install_cli
-[install-azakskubectl]: /powershell/module/az.aks/install-azakskubectl
+[install-azakskubectl]: /powershell/module/az.aks/install-azaksclitool
[azure-vote-app]: https://github.com/Azure-Samples/azure-voting-app-redis.git [kubectl]: https://kubernetes.io/docs/user-guide/kubectl/ [helm]: https://helm.sh/
For more information about using Helm, see the Helm documentation.
[helm-existing]: kubernetes-helm.md [helm-install]: https://helm.sh/docs/intro/install/ [sp-delete]: kubernetes-service-principal.md#other-considerations
-[acr-helm]: ../container-registry/container-registry-helm-repos.md
+[acr-helm]: ../container-registry/container-registry-helm-repos.md
aks Use Azure Ad Pod Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-azure-ad-pod-identity.md
Azure Active Directory (Azure AD) pod-managed identities use Kubernetes primitiv
> This authentication method replaces pod-managed identity (preview), which integrates with the > Kubernetes native capabilities to federate with any external identity providers on behalf of the > application.
+>
+> The open source Azure AD pod-managed identity (preview) in Azure Kubernetes Service has been deprecated as of 10/24/2022.
[!INCLUDE [preview features callout](./includes/preview/preview-callout.md)]
You must have the following resource installed:
### Limitations
-* A maximum of 200 pod identities are allowed for a cluster.
-* A maximum of 200 pod identity exceptions are allowed for a cluster.
+* A maximum of 200 pod-managed identities are allowed for a cluster.
+* A maximum of 200 pod-managed identity exceptions are allowed for a cluster.
* Pod-managed identities are available on Linux node pools only. * This feature is only supported for Virtual Machine Scale Sets backed clusters.
az extension update --name aks-preview
### Operation mode options
-Azure AD pod identity supports two modes of operation:
-
+Azure AD pod-managed identity supports two modes of operation:
+ * **Standard Mode**: In this mode, the following two components are deployed to the AKS cluster:
  * [Managed Identity Controller (MIC)](https://azure.github.io/aad-pod-identity/docs/concepts/mic/): An MIC is a Kubernetes controller that watches for changes to pods, [AzureIdentity](https://azure.github.io/aad-pod-identity/docs/concepts/azureidentity/) and [AzureIdentityBinding](https://azure.github.io/aad-pod-identity/docs/concepts/azureidentitybinding/) through the Kubernetes API Server. When it detects a relevant change, the MIC adds or deletes [AzureAssignedIdentity](https://azure.github.io/aad-pod-identity/docs/concepts/azureassignedidentity/) as needed. Specifically, when a pod is scheduled, the MIC assigns the managed identity on Azure to the underlying virtual machine scale set used by the node pool during the creation phase. When all pods using the identity are deleted, it removes the identity from the virtual machine scale set of the node pool, unless the same managed identity is used by other pods. The MIC takes similar actions when AzureIdentity or AzureIdentityBinding are created or deleted.
  * [Node Managed Identity (NMI)](https://azure.github.io/aad-pod-identity/docs/concepts/nmi/): NMI is a pod that runs as a DaemonSet on each node in the AKS cluster. NMI intercepts security token requests to the [Azure Instance Metadata Service](../virtual-machines/linux/instance-metadata-service.md?tabs=linux) on each node, redirects them to itself, validates whether the pod has access to the identity it's requesting a token for, and fetches the token from the Azure AD tenant on behalf of the application.
* **Managed Mode**: This mode offers only NMI. When installed via the AKS cluster add-on, Azure manages creation of Kubernetes primitives (AzureIdentity and AzureIdentityBinding) and identity assignment in response to CLI commands by the user. Otherwise, if installed via Helm chart, the identity needs to be manually assigned and managed by the user.
For more information, see [Pod identity in managed mode](https://azure.github.io/aad-pod-identity/docs/configure/pod_identity_in_managed_mode/).
-When you install the Azure AD pod identity via Helm chart or YAML manifest as shown in the [Installation Guide](https://azure.github.io/aad-pod-identity/docs/getting-started/installation/), you can choose between the `standard` and `managed` mode. If you instead decide to install the Azure AD pod identity using the AKS cluster add-on as shown in this article, the setup will use the `managed` mode.
+When you install the Azure AD pod-managed identity via Helm chart or YAML manifest as shown in the [Installation Guide](https://azure.github.io/aad-pod-identity/docs/getting-started/installation/), you can choose between the `standard` and `managed` mode. If you instead decide to install the Azure AD pod-managed identity using the AKS cluster add-on as shown in this article, the setup will use the `managed` mode.
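In standard mode, the two Kubernetes primitives the MIC watches can be sketched roughly as follows (names, subscription, and resource group are placeholders; in managed mode via the AKS add-on, `az aks pod-identity add` creates equivalent resources for you):

```yml
apiVersion: aadpodidentity.k8s.io/v1
kind: AzureIdentity
metadata:
  name: my-pod-identity                 # placeholder name
spec:
  type: 0                               # 0 = user-assigned managed identity
  resourceID: /subscriptions/<SUBSCRIPTION_ID>/resourcegroups/<RESOURCE_GROUP>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<IDENTITY_NAME>
  clientID: "<IDENTITY_CLIENT_ID>"
---
apiVersion: aadpodidentity.k8s.io/v1
kind: AzureIdentityBinding
metadata:
  name: my-pod-identity-binding
spec:
  azureIdentity: my-pod-identity
  selector: my-pod-identity             # pods labeled aadpodidbinding: my-pod-identity get this identity
```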
## Create an AKS cluster with Azure Container Networking Interface (CNI)
Update an existing AKS cluster with Azure CNI to include pod-managed identity.
az aks update -g $MY_RESOURCE_GROUP -n $MY_CLUSTER --enable-pod-identity ```
-## Using Kubenet network plugin with Azure Active Directory pod-managed identities
+## Using Kubenet network plugin with Azure Active Directory pod-managed identities
> [!IMPORTANT]
-> Running aad-pod-identity in a cluster with Kubenet is not a recommended configuration due to security concerns. Default Kubenet configuration fails to prevent ARP spoofing, which could be utilized by a pod to act as another pod and gain access to an identity it's not intended to have. Please follow the mitigation steps and configure policies before enabling aad-pod-identity in a cluster with Kubenet.
+> Running Azure AD pod-managed identity in a cluster with Kubenet is not a recommended configuration due to security concerns. Default Kubenet configuration fails to prevent ARP spoofing, which could be utilized by a pod to act as another pod and gain access to an identity it's not intended to have. Please follow the mitigation steps and configure policies before enabling Azure AD pod-managed identity in a cluster with Kubenet.
### Mitigation
Add NET_RAW to "Required drop capabilities"
If you are not using Azure Policy, you can use OpenPolicyAgent admission controller together with Gatekeeper validating webhook. Provided you have Gatekeeper already installed in your cluster, add the ConstraintTemplate of type K8sPSPCapabilities:
-```
+```bash
kubectl apply -f https://raw.githubusercontent.com/open-policy-agent/gatekeeper-library/master/library/pod-security-policy/capabilities/template.yaml ```+ Add a template to limit the spawning of Pods with the NET_RAW capability:
-```
+```yml
apiVersion: constraints.gatekeeper.sh/v1beta1 kind: K8sPSPCapabilities metadata:
az role assignment create --role "Virtual Machine Contributor" --assignee "$IDEN
## Create a pod identity
-Create a pod identity for the cluster using `az aks pod-identity add`.
+Create a pod-managed identity for the cluster using `az aks pod-identity add`.
```azurecli-interactive export POD_IDENTITY_NAME="my-pod-identity"
az aks pod-identity add --resource-group myResourceGroup --cluster-name myAKSClu
``` > [!NOTE]
-> The "POD_IDENTITY_NAME" has to be a valid [DNS subdomain name] as defined in [RFC 1123].
+> The "POD_IDENTITY_NAME" has to be a valid [DNS subdomain name] as defined in [RFC 1123].
> [!NOTE]
-> When you assign the pod identity by using `pod-identity add`, the Azure CLI attempts to grant the Managed Identity Operator role over the pod identity (*IDENTITY_RESOURCE_ID*) to the cluster identity.
+> When you assign the pod-managed identity by using `pod-identity add`, the Azure CLI attempts to grant the Managed Identity Operator role over the pod-managed identity (*IDENTITY_RESOURCE_ID*) to the cluster identity.
Azure will create an AzureIdentity resource in your cluster representing the identity in Azure, and an AzureIdentityBinding resource which connects the AzureIdentity to a selector. You can view these resources with
kubectl get azureidentitybinding -n $POD_IDENTITY_NAMESPACE
## Run a sample application
-For a pod to use AAD pod-managed identity, the pod needs an *aadpodidbinding* label with a value that matches a selector from a *AzureIdentityBinding*. By default, the selector will match the name of the pod identity, but it can also be set using the `--binding-selector` option when calling `az aks pod-identity add`.
+For a pod to use Azure AD pod-managed identity, the pod needs an *aadpodidbinding* label with a value that matches a selector from a *AzureIdentityBinding*. By default, the selector will match the name of the pod-managed identity, but it can also be set using the `--binding-selector` option when calling `az aks pod-identity add`.
-To run a sample application using AAD pod-managed identity, create a `demo.yaml` file with the following contents. Replace *POD_IDENTITY_NAME*, *IDENTITY_CLIENT_ID*, and *IDENTITY_RESOURCE_GROUP* with the values from the previous steps. Replace *SUBSCRIPTION_ID* with your subscription ID.
+To run a sample application using Azure AD pod-managed identity, create a `demo.yaml` file with the following contents. Replace *POD_IDENTITY_NAME*, *IDENTITY_CLIENT_ID*, and *IDENTITY_RESOURCE_GROUP* with the values from the previous steps. Replace *SUBSCRIPTION_ID* with your subscription ID.
> [!NOTE] > In the previous steps, you created the *POD_IDENTITY_NAME*, *IDENTITY_CLIENT_ID*, and *IDENTITY_RESOURCE_GROUP* variables. You can use a command such as `echo` to display the value you set for variables, for example `echo $POD_IDENTITY_NAME`.
spec:
kubernetes.io/os: linux ```
-Notice the pod definition has an *aadpodidbinding* label with a value that matches the name of the pod identity you ran `az aks pod-identity add` in the previous step.
+Notice the pod definition has an *aadpodidbinding* label with a value that matches the name of the pod-managed identity you created when you ran `az aks pod-identity add` in the previous step.
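As an abbreviated illustration of that label wiring (this is not the full `demo.yaml`; the identity name and image are placeholders):

```yml
apiVersion: v1
kind: Pod
metadata:
  name: demo
  labels:
    aadpodidbinding: my-pod-identity    # must match the binding selector created by az aks pod-identity add
spec:
  containers:
    - name: demo
      image: <SAMPLE_IMAGE>             # placeholder image
  nodeSelector:
    kubernetes.io/os: linux
```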
-Deploy `demo.yaml` to the same namespace as your pod identity using `kubectl apply`:
+Deploy `demo.yaml` to the same namespace as your pod-managed identity using `kubectl apply`:
-```azurecli-interactive
+```bash
kubectl apply -f demo.yaml --namespace $POD_IDENTITY_NAMESPACE ``` Verify the sample application successfully runs using `kubectl logs`.
-```azurecli-interactive
+```bash
kubectl logs demo --follow --namespace $POD_IDENTITY_NAMESPACE ``` Verify that the logs show a token is successfully acquired and the *GET* operation is successful.
-
+ ```output ... successfully doARMOperations vm count 0
metadata:
## Clean up
-To remove an Azure AD pod-managed identity from your cluster, remove the sample application and the pod identity from the cluster. Then remove the identity.
+To remove an Azure AD pod-managed identity from your cluster, remove the sample application and the pod-managed identity from the cluster. Then remove the identity.
-```azurecli-interactive
+```bash
kubectl delete pod demo --namespace $POD_IDENTITY_NAMESPACE
+```
+
+```azurecli
az aks pod-identity delete --name ${POD_IDENTITY_NAME} --namespace ${POD_IDENTITY_NAMESPACE} --resource-group myResourceGroup --cluster-name myAKSCluster
+```
+
+```azurecli
az identity delete -g ${IDENTITY_RESOURCE_GROUP} -n ${IDENTITY_NAME} ```
For more information on managed identities, see [Managed identities for Azure re
<!-- LINKS - internal --> [workload-identity-overview]: workload-identity-overview.md
-<!-- LINKS - external -->
[az-aks-create]: /cli/azure/aks#az_aks_create [az-aks-get-credentials]: /cli/azure/aks#az_aks_get_credentials [az-extension-add]: /cli/azure/extension#az_extension_add
For more information on managed identities, see [Managed identities for Azure re
[az-group-create]: /cli/azure/group#az_group_create [az-identity-create]: /cli/azure/identity#az_identity_create [az-managed-identities]: ../active-directory/managed-identities-azure-resources/overview.md
-[az-role-assignment-create]: /cli/azure/role/assignment#az_role_assignment_create
+
+<!-- LINKS - external -->
[RFC 1123]: https://tools.ietf.org/html/rfc1123 [DNS subdomain name]: https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#dns-subdomain-names
app-service Migration Alternatives https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/migration-alternatives.md
The [back up and restore](../manage-backup.md) feature allows you to keep your a
:::image type="content" source="./media/migration/configure-custom-backup.png" alt-text="Screenshot that shows how to configure custom backup for an App Service app."::: >
-The step-by-step instructions in the current documentation for [backup and restore](../manage-backup.md) should be sufficient to allow you to use this feature. You can select a custom backup and use that to restore the app to an App Service in your App Service Environment v3.
+You can select a custom backup and restore it to an App Service app in your App Service Environment v3. You must create the target App Service app before restoring the backup. You can choose to restore the backup to the production slot, an existing slot, or a new slot that you create during the restoration process.
:::image type="content" source="./media/migration/back-up-restore-sample.png" alt-text="Screenshot that shows how to use backup to restore App Service app in App Service Environment v3.":::
app-service Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/overview.md
Title: App Service Environment overview
description: This article discusses the Azure App Service Environment feature of Azure App Service. Previously updated : 07/29/2022 Last updated : 10/28/2022
App Service Environment v3 is available in the following regions:
### Azure Public:
-| Region | Normal and dedicated host | Availability zone support |
-| -- | :--: | :-: |
-| Australia East | ✅ | ✅ |
-| Australia Southeast | ✅ | |
-| Brazil South | ✅ | ✅ |
-| Canada Central | ✅ | ✅ |
-| Canada East | ✅ | |
-| Central India | ✅ | ✅ |
-| Central US | ✅ | ✅ |
-| East Asia | ✅ | ✅ |
-| East US | ✅ | ✅ |
-| East US 2 | ✅ | ✅ |
-| France Central | ✅ | ✅ |
-| Germany West Central | ✅ | ✅ |
-| Japan East | ✅ | ✅ |
-| Korea Central | ✅ | ✅ |
-| North Central US | ✅ | |
-| North Europe | ✅ | ✅ |
-| Norway East | ✅ | ✅ |
-| South Africa North | ✅ | ✅ |
-| South Central US | ✅ | ✅ |
-| Southeast Asia | ✅ | ✅ |
-| Sweden Central | ✅ | ✅ |
-| Switzerland North | ✅ | ✅ |
-| UAE North | ✅ | |
-| UK South | ✅ | ✅ |
-| UK West | ✅ | |
-| West Central US | ✅ | |
-| West Europe | ✅ | ✅ |
-| West US | ✅ | |
-| West US 2 | ✅ | ✅ |
-| West US 3 | ✅ | ✅ |
+| Region | Single zone support | Availability zone support | Single zone support |
+| -- | :--: | :-: | :-: |
+| | App Service Environment v3 | App Service Environment v3 | App Service Environment v1/v2 |
+| Australia Central | | | ✅ |
+| Australia Central 2 | | | ✅ |
+| Australia East | ✅ | ✅ | ✅ |
+| Australia Southeast | ✅ | | ✅ |
+| Brazil South | ✅ | ✅ | ✅ |
+| Brazil Southeast | | | ✅ |
+| Canada Central | ✅ | ✅ | ✅ |
+| Canada East | ✅ | | ✅ |
+| Central India | ✅ | ✅ | ✅ |
+| Central US | ✅ | ✅ | ✅ |
+| East Asia | ✅ | ✅ | ✅ |
+| East US | ✅ | ✅ | ✅ |
+| East US 2 | ✅ | ✅ | ✅ |
+| France Central | ✅ | ✅ | ✅ |
+| France South | | | ✅ |
+| Germany North | | | ✅ |
+| Germany West Central | ✅ | ✅ | ✅ |
+| Japan East | ✅ | ✅ | ✅ |
+| Japan West | | | ✅ |
+| Jio India West | | | ✅ |
+| Korea Central | ✅ | ✅ | ✅ |
+| Korea South | | | ✅ |
+| North Central US | ✅ | | ✅ |
+| North Europe | ✅ | ✅ | ✅ |
+| Norway East | ✅ | ✅ | ✅ |
+| Norway West | | | ✅ |
+| South Africa North | ✅ | ✅ | ✅ |
+| South Africa West | | | ✅ |
+| South Central US | ✅ | ✅ | ✅ |
+| South India | | | ✅ |
+| Southeast Asia | ✅ | ✅ | ✅ |
+| Sweden Central | ✅ | ✅ | |
+| Switzerland North | ✅ | ✅ | ✅ |
+| Switzerland West | | | ✅ |
+| UAE Central | | | ✅ |
+| UAE North | ✅ | | ✅ |
+| UK South | ✅ | ✅ | ✅ |
+| UK West | ✅ | | ✅ |
+| West Central US | ✅ | | ✅ |
+| West Europe | ✅ | ✅ | ✅ |
+| West India | | | ✅ |
+| West US | ✅ | | ✅ |
+| West US 2 | ✅ | ✅ | ✅ |
+| West US 3 | ✅ | ✅ | ✅ |
### Azure Government:
-| Region | Normal and dedicated host | Availability zone support |
-| -- | :-: | :-: |
-| US Gov Texas | ✅ | |
-| US Gov Arizona | ✅ | |
-| US Gov Virginia | ✅ | |
+| Region | Single zone support | Availability zone support | Single zone support |
+| -- | :--: | :-: | :-: |
+| | App Service Environment v3 | App Service Environment v3 | App Service Environment v1/v2 |
+| US DoD Central | | | ✅ |
+| US DoD East | | | ✅ |
+| US Gov Arizona | ✅ | | ✅ |
+| US Gov Iowa | | | ✅ |
+| US Gov Texas | ✅ | | ✅ |
+| US Gov Virginia | ✅ | | ✅ |
+
+### Azure China:
+
+| Region | Single zone support | Availability zone support | Single zone support |
+| -- | :--: | :-: | :-: |
+| | App Service Environment v3 (preview) | App Service Environment v3 (preview) | App Service Environment v1/v2 |
+| China East 2 | | | ✅ |
+| China East 3 | ✅ | | |
+| China North 2 | | | ✅ |
+| China North 3 | ✅ | ✅ | |
## App Service Environment v2
app-service Quickstart Wordpress https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/quickstart-wordpress.md
When no longer needed, you can delete the resource group, App service, and all r
## Manage the MySQL flexible server, username, or password -- The MySQL Flexible Server is created behind a private [Virtual Network](../virtual-network/virtual-networks-overview.md) and can't be accessed directly. To access or manage the database, use phpMyAdmin that's deployed with the WordPress site. You can access phpMyAdmin by following these steps:
+- The MySQL Flexible Server is created behind a private [Virtual Network](/azure/virtual-network/virtual-networks-overview) and can't be accessed directly. To access or manage the database, use phpMyAdmin that's deployed with the WordPress site. You can access phpMyAdmin by following these steps:
- Navigate to the URL: https://`<sitename>`.azurewebsites.net/phpmyadmin - Login with the flexible server's username and password
Congratulations, you've successfully completed this quickstart!
> [Tutorial: PHP app with MySQL](tutorial-php-mysql-app.md) > [!div class="nextstepaction"]
-> [Configure PHP app](configure-language-php.md)
+> [Configure PHP app](configure-language-php.md)
app-service Reference Dangling Subdomain Prevention https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/reference-dangling-subdomain-prevention.md
The risks of subdomain takeover include:
- Phishing campaigns - Further risks of classic attacks such as XSS, CSRF, CORS bypass
-Learn more about Subdomain Takeover at [Dangling DNS and subdomain takeover](/azure/security/fundamentals/subdomain-takeover.md).
+Learn more about Subdomain Takeover at [Dangling DNS and subdomain takeover](/azure/security/fundamentals/subdomain-takeover).
Azure App Service provides [Name Reservation Service](#how-app-service-prevents-subdomain-takeovers) and [domain verification tokens](#how-you-can-prevent-subdomain-takeovers) to prevent subdomain takeovers. ## How App Service prevents subdomain takeovers
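As a hedged illustration of the domain verification token mechanism (the zone names and verification ID below are placeholders): the token is published as a TXT record on an `asuid.` subdomain alongside the CNAME, so App Service can validate ownership before binding the hostname.

```
; placeholder DNS zone fragment
www         CNAME   contoso.azurewebsites.net.
asuid.www   TXT     "<CUSTOM_DOMAIN_VERIFICATION_ID>"
```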
applied-ai-services Concept Read https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-read.md
recommendations: false
> [!NOTE] >
-> For general, in-the-wild images like labels, street signs, and posters, use the [Computer Vision v4.0 preview Read](../../cognitive-services/Computer-vision/concept-ocr.md) feature optimized for general, non-document images with a performance-enhanced synchronous API that makes it easier to embed OCR in your user experience scenarios.
+> For extracting text from in-the-wild images like labels, street signs, and posters, use the [Computer Vision v4.0 preview Read](../../cognitive-services/Computer-vision/concept-ocr.md) feature optimized for general, non-document images with a performance-enhanced synchronous API that makes it easier to embed OCR in your user experience scenarios.
> ## What is OCR for documents?
applied-ai-services Sdk Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/sdk-overview.md
Previously updated : 10/20/2022 Last updated : 10/27/2022 recommendations: false
Form Recognizer SDK supports the following languages and platforms:
|<ul><li> Python </li></ul>| <ul><li>3.1.x</li></ul> | <ul><li> v2.1 (default)</li><li>v2.0</li></ul> |<ul><li>**FormRecognizerClient**</li><li>**FormTrainingClient**</li></ul> | |<ul><li> Python</li></ul>| <ul><li>3.0.0</li></ul> | <ul><li>v2.0</li></ul>| <ul><li> **FormRecognizerClient**</li><li>**FormTrainingClient**</li></ul> |
-## Changelog and release history
-
-#### Form Recognizer SDK September 2022 GA release
-
-This release includes the following updates:
-
-> [!IMPORTANT]
-> The `DocumentAnalysisClient` and `DocumentModelAdministrationClient` now target API version v3.0 GA, released 2022-08-31. These clients are no longer supported by API versions 2020-06-30-preview or earlier.
-
-### [**C#**](#tab/csharp)
-
-* **Version 4.0.0 GA (2022-09-08)**
-* **Supports REST API v3.0 and v2.0 clients**
-
-[**Package (NuGet)**](https://www.nuget.org/packages/Azure.AI.FormRecognizer/4.0.0)
-
-[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/CHANGELOG.md)
-
-[**Migration guide**](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0/sdk/formrecognizer/Azure.AI.FormRecognizer/MigrationGuide.md)
-
-[**ReadMe**](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0/sdk/formrecognizer/Azure.AI.FormRecognizer/README.md)
-
-[**Samples**](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md)
-
-### [**Java**](#tab/java)
-
-* **Version 4.0.0 GA (2022-09-08)**
-* **Supports REST API v3.0 and v2.0 clients**
-
-[**Package (Maven)**](https://oss.sonatype.org/#nexus-search;quick~azure-ai-formrecognizer)
-
-[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-jav)
-
-[**Migration guide**](https://github.com/Azure/azure-sdk-for-jav)
-
-[**ReadMe**](https://github.com/Azure/azure-sdk-for-jav)
-
-[**Samples**](https://github.com/Azure/azure-sdk-for-jav)
-
-### [**JavaScript**](#tab/javascript)
-
-* **Version 4.0.0 GA (2022-09-08)**
-* **Supports REST API v3.0 and v2.0 clients**
-
-[**Package (npm)**](https://www.npmjs.com/package/@azure/ai-form-recognizer)
-
-[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0/sdk/formrecognizer/ai-form-recognizer/CHANGELOG.md)
-
-[**Migration guide**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0/sdk/formrecognizer/ai-form-recognizer/MIGRATION-v3_v4.md)
-
-[**ReadMe**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0/sdk/formrecognizer/ai-form-recognizer/README.md)
-
-[**Samples**](https://github.com/witemple-msft/azure-sdk-for-js/blob/7e3196f7e529212a6bc329f5f06b0831bf4cc174/sdk/formrecognizer/ai-form-recognizer/samples/v4/javascript/README.md)
-
-### [Python](#tab/python)
-
-> [!NOTE]
-> Python 3.7 or later is required to use this package.
-
-* **Version 3.2.0 GA (2022-09-08)**
-* **Supports REST API v3.0 and v2.0 clients**
-
-[**Package (PyPi)**](https://pypi.org/project/azure-ai-formrecognizer/3.2.0/)
-
-[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0/sdk/formrecognizer/azure-ai-formrecognizer/CHANGELOG.md)
-
-[**Migration guide**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0/sdk/formrecognizer/azure-ai-formrecognizer/MIGRATION_GUIDE.md)
-
-[**ReadMe**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0/sdk/formrecognizer/azure-ai-formrecognizer/README.md)
-
-[**Samples**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0/sdk/formrecognizer/azure-ai-formrecognizer/samples/README.md)
---
-#### Form Recognizer SDK beta August 2022 preview release
-
-This release includes the following updates:
-
-### [**C#**](#tab/csharp)
-
-**Version 4.0.0-beta.5 (2022-08-09)**
-**Supports REST API 2022-06-30-preview clients**
-
-[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/CHANGELOG.md#400-beta5-2022-08-09)
-
-[**Package (NuGet)**](https://www.nuget.org/packages/Azure.AI.FormRecognizer/4.0.0-beta.5)
-
-[**SDK reference documentation**](/dotnet/api/overview/azure/ai.formrecognizer-readme?view=azure-dotnet-preview&preserve-view=true)
-
-### [**Java**](#tab/java)
-
-**Version 4.0.0-beta.6 (2022-08-10)**
-**Supports REST API 2022-06-30-preview and earlier clients**
-
-[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-jav#400-beta6-2022-08-10)
-
- [**Package (Maven)**](https://oss.sonatype.org/#nexus-search;quick~azure-ai-formrecognizer)
-
- [**SDK reference documentation**](/java/api/overview/azure/ai-formrecognizer-readme?view=azure-java-preview&preserve-view=true)
-
-### [**JavaScript**](#tab/javascript)
-
-**Version 4.0.0-beta.6 (2022-08-09)**
-**Supports REST API 2022-06-30-preview and earlier clients**
-
- [**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0-beta.6/sdk/formrecognizer/ai-form-recognizer/CHANGELOG.md)
-
- [**Package (npm)**](https://www.npmjs.com/package/@azure/ai-form-recognizer/v/4.0.0-beta.6)
-
- [**SDK reference documentation**](/javascript/api/overview/azure/ai-form-recognizer-readme?view=azure-node-preview&preserve-view=true)
-
-### [Python](#tab/python)
-
-> [!IMPORTANT]
-> Python 3.6 is no longer supported in this release. Use Python 3.7 or later.
-
-**Version 3.2.0b6 (2022-08-09)**
-**Supports REST API 2022-06-30-preview and earlier clients**
-
- [**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b6/sdk/formrecognizer/azure-ai-formrecognizer/CHANGELOG.md)
-
- [**Package (PyPi)**](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b6/)
-
- [**SDK reference documentation**](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b6/)
---
-### Form Recognizer SDK beta June 2022 preview release
-
-This release includes the following updates:
-
-### [**C#**](#tab/csharp)
-
-**Version 4.0.0-beta.4 (2022-06-08)**
-
-[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0-beta.4/sdk/formrecognizer/Azure.AI.FormRecognizer/CHANGELOG.md)
-
-[**Package (NuGet)**](https://www.nuget.org/packages/Azure.AI.FormRecognizer/4.0.0-beta.4)
-
-[**SDK reference documentation**](/dotnet/api/azure.ai.formrecognizer?view=azure-dotnet-preview&preserve-view=true)
-
-### [**Java**](#tab/java)
-
-**Version 4.0.0-beta.5 (2022-06-07)**
-
-[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-jav)
-
- [**Package (Maven)**](https://search.maven.org/artifact/com.azure/azure-ai-formrecognizer/4.0.0-beta.5/jar)
-
- [**SDK reference documentation**](/java/api/overview/azure/ai-formrecognizer-readme?view=azure-java-preview&preserve-view=true)
-
-### [**JavaScript**](#tab/javascript)
-
-**Version 4.0.0-beta.4 (2022-06-07)**
-
- [**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0-beta.4/sdk/formrecognizer/ai-form-recognizer/CHANGELOG.md)
-
- [**Package (npm)**](https://www.npmjs.com/package/@azure/ai-form-recognizer/v/4.0.0-beta.4)
-
- [**SDK reference documentation**](/javascript/api/@azure/ai-form-recognizer/?view=azure-node-preview&preserve-view=true)
-
-### [Python](#tab/python)
-
-**Version 3.2.0b5 (2022-06-07**
-
- [**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b5/sdk/formrecognizer/azure-ai-formrecognizer/CHANGELOG.md)
-
- [**Package (PyPi)**](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b5/)
-
- [**SDK reference documentation**](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer?view=azure-python-preview&preserve-view=true)
--- ## Use Form Recognizer SDK in your applications The Form Recognizer SDK enables the use and management of the Form Recognizer service in your application. The SDK builds on the underlying Form Recognizer REST API allowing you to easily use those APIs within your programming language paradigm. Here's how you use the Form Recognizer SDK for your preferred language:
Here's how to acquire and use the [DefaultAzureCredential](/python/api/azure-ide
```console pip install azure-identity ```+ 1. [Register an Azure AD application and create a new service principal](../../cognitive-services/authentication.md?tabs=powershell#assign-a-role-to-a-service-principal). 1. Grant access to Form Recognizer by assigning the **`Cognitive Services User`** role to your service principal.
For more information, *see* [Authenticate the client](https://github.com/Azure/a
### 4. Build your application
-First, you'll create a client object to interact with the Form Recognizer SDK, and then call methods on that client object to interact with the service. The SDKs provide both synchronous and asynchronous methods. For more insight, try a [quickstart](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) in a language of your choice.
+You'll create a client object to interact with the Form Recognizer SDK, and then call methods on that client object to interact with the service. The SDKs provide both synchronous and asynchronous methods. For more insight, try a [quickstart](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) in a language of your choice.
+
+## Changelog and release history
+
+#### Form Recognizer SDK September 2022 GA release
+
+This release includes the following updates:
+
+> [!IMPORTANT]
+> The `DocumentAnalysisClient` and `DocumentModelAdministrationClient` now target API version v3.0 GA, released 2022-08-31. These clients are no longer supported by API versions 2020-06-30-preview or earlier.
+
+### [**C#**](#tab/csharp)
+
+* **Version 4.0.0 GA (2022-09-08)**
+* **Supports REST API v3.0 and v2.0 clients**
+
+[**Package (NuGet)**](https://www.nuget.org/packages/Azure.AI.FormRecognizer/4.0.0)
+
+[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/CHANGELOG.md)
+
+[**Migration guide**](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0/sdk/formrecognizer/Azure.AI.FormRecognizer/MigrationGuide.md)
+
+[**ReadMe**](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0/sdk/formrecognizer/Azure.AI.FormRecognizer/README.md)
+
+[**Samples**](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0/sdk/formrecognizer/Azure.AI.FormRecognizer/samples/README.md)
+
+### [**Java**](#tab/java)
+
+* **Version 4.0.0 GA (2022-09-08)**
+* **Supports REST API v3.0 and v2.0 clients**
+
+[**Package (Maven)**](https://oss.sonatype.org/#nexus-search;quick~azure-ai-formrecognizer)
+
+[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-jav)
+
+[**Migration guide**](https://github.com/Azure/azure-sdk-for-jav)
+
+[**ReadMe**](https://github.com/Azure/azure-sdk-for-jav)
+
+[**Samples**](https://github.com/Azure/azure-sdk-for-jav)
+
+### [**JavaScript**](#tab/javascript)
+
+* **Version 4.0.0 GA (2022-09-08)**
+* **Supports REST API v3.0 and v2.0 clients**
+
+[**Package (npm)**](https://www.npmjs.com/package/@azure/ai-form-recognizer)
+
+[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0/sdk/formrecognizer/ai-form-recognizer/CHANGELOG.md)
+
+[**Migration guide**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0/sdk/formrecognizer/ai-form-recognizer/MIGRATION-v3_v4.md)
+
+[**ReadMe**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0/sdk/formrecognizer/ai-form-recognizer/README.md)
+
+[**Samples**](https://github.com/witemple-msft/azure-sdk-for-js/blob/7e3196f7e529212a6bc329f5f06b0831bf4cc174/sdk/formrecognizer/ai-form-recognizer/samples/v4/javascript/README.md)
+
+### [Python](#tab/python)
+
+> [!NOTE]
+> Python 3.7 or later is required to use this package.
+
+* **Version 3.2.0 GA (2022-09-08)**
+* **Supports REST API v3.0 and v2.0 clients**
+
+[**Package (PyPi)**](https://pypi.org/project/azure-ai-formrecognizer/3.2.0/)
+
+[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0/sdk/formrecognizer/azure-ai-formrecognizer/CHANGELOG.md)
+
+[**Migration guide**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0/sdk/formrecognizer/azure-ai-formrecognizer/MIGRATION_GUIDE.md)
+
+[**ReadMe**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0/sdk/formrecognizer/azure-ai-formrecognizer/README.md)
+
+[**Samples**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0/sdk/formrecognizer/azure-ai-formrecognizer/samples/README.md)
+++
+#### Form Recognizer SDK beta August 2022 preview release
+
+This release includes the following updates:
+
+### [**C#**](#tab/csharp)
+
+**Version 4.0.0-beta.5 (2022-08-09)**
+**Supports REST API 2022-06-30-preview clients**
+
+[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/formrecognizer/Azure.AI.FormRecognizer/CHANGELOG.md#400-beta5-2022-08-09)
+
+[**Package (NuGet)**](https://www.nuget.org/packages/Azure.AI.FormRecognizer/4.0.0-beta.5)
+
+[**SDK reference documentation**](/dotnet/api/overview/azure/ai.formrecognizer-readme?view=azure-dotnet-preview&preserve-view=true)
+
+### [**Java**](#tab/java)
+
+**Version 4.0.0-beta.6 (2022-08-10)**
+**Supports REST API 2022-06-30-preview and earlier clients**
+
+[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-jav#400-beta6-2022-08-10)
+
+ [**Package (Maven)**](https://oss.sonatype.org/#nexus-search;quick~azure-ai-formrecognizer)
+
+ [**SDK reference documentation**](/java/api/overview/azure/ai-formrecognizer-readme?view=azure-java-preview&preserve-view=true)
+
+### [**JavaScript**](#tab/javascript)
+
+**Version 4.0.0-beta.6 (2022-08-09)**
+**Supports REST API 2022-06-30-preview and earlier clients**
+
+ [**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0-beta.6/sdk/formrecognizer/ai-form-recognizer/CHANGELOG.md)
+
+ [**Package (npm)**](https://www.npmjs.com/package/@azure/ai-form-recognizer/v/4.0.0-beta.6)
+
+ [**SDK reference documentation**](/javascript/api/overview/azure/ai-form-recognizer-readme?view=azure-node-preview&preserve-view=true)
+
+### [Python](#tab/python)
+
+> [!IMPORTANT]
+> Python 3.6 is no longer supported in this release. Use Python 3.7 or later.
+
+**Version 3.2.0b6 (2022-08-09)**
+**Supports REST API 2022-06-30-preview and earlier clients**
+
+ [**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b6/sdk/formrecognizer/azure-ai-formrecognizer/CHANGELOG.md)
+
+ [**Package (PyPi)**](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b6/)
+
+ [**SDK reference documentation**](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b6/)
+++
+### Form Recognizer SDK beta June 2022 preview release
+
+This release includes the following updates:
+
+### [**C#**](#tab/csharp)
+
+**Version 4.0.0-beta.4 (2022-06-08)**
+
+[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.FormRecognizer_4.0.0-beta.4/sdk/formrecognizer/Azure.AI.FormRecognizer/CHANGELOG.md)
+
+[**Package (NuGet)**](https://www.nuget.org/packages/Azure.AI.FormRecognizer/4.0.0-beta.4)
+
+[**SDK reference documentation**](/dotnet/api/azure.ai.formrecognizer?view=azure-dotnet-preview&preserve-view=true)
+
+### [**Java**](#tab/java)
+
+**Version 4.0.0-beta.5 (2022-06-07)**
+
+[**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-jav)
+
+ [**Package (Maven)**](https://search.maven.org/artifact/com.azure/azure-ai-formrecognizer/4.0.0-beta.5/jar)
+
+ [**SDK reference documentation**](/java/api/overview/azure/ai-formrecognizer-readme?view=azure-java-preview&preserve-view=true)
+
+### [**JavaScript**](#tab/javascript)
+
+**Version 4.0.0-beta.4 (2022-06-07)**
+
+ [**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-js/blob/%40azure/ai-form-recognizer_4.0.0-beta.4/sdk/formrecognizer/ai-form-recognizer/CHANGELOG.md)
+
+ [**Package (npm)**](https://www.npmjs.com/package/@azure/ai-form-recognizer/v/4.0.0-beta.4)
+
+ [**SDK reference documentation**](/javascript/api/@azure/ai-form-recognizer/?view=azure-node-preview&preserve-view=true)
+
+### [Python](#tab/python)
+
+**Version 3.2.0b5 (2022-06-07)**
+
+ [**Changelog/Release History**](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b5/sdk/formrecognizer/azure-ai-formrecognizer/CHANGELOG.md)
+
+ [**Package (PyPi)**](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b5/)
+
+ [**SDK reference documentation**](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer?view=azure-python-preview&preserve-view=true)
++ ## Help options
applied-ai-services V3 Migration Guide https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/v3-migration-guide.md
POST https://{your-form-recognizer-endpoint}/formrecognizer/documentModels:compo
The call pattern for copy model remains unchanged: * Authorize the copy operation with the target resource calling ```authorizeCopy```. Now a POST request.
-* Submit the authorization to the source resource to copy the model calling ```copy-to```
+* Submit the authorization to the source resource to copy the model calling ```copyTo```
* Poll the returned operation to validate the operation completed successfully The only changes to the copy model function are:
POST https://{targetHost}/formrecognizer/documentModels:authorizeCopy?api-versio
Use the response body from the authorize action to construct the request for the copy. ```json
-POST https://{sourceHost}/formrecognizer/documentModels/{sourceModelId}:copy-to?api-version=2022-08-31
+POST https://{sourceHost}/formrecognizer/documentModels/{sourceModelId}:copyTo?api-version=2022-08-31
{ "targetResourceId": "{targetResourceId}", "targetResourceRegion": "{targetResourceRegion}",
applied-ai-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/whats-new.md
Form Recognizer service is updated on an ongoing basis. Bookmark this page to st
## October 2022
+### Form Recognizer Studio Sample Code
+
+Sample code for the Form Recognizer Studio labeling experience is now available on GitHub - https://github.com/microsoft/Form-Recognizer-Toolkit/tree/main/SampleCode/LabelingUX. Customers can develop and integrate Form Recognizer into their own UX or build a new UX using the Form Recognizer Studio sample code.
+ ### Language expansion With the latest preview release, Form Recognizer's Read (OCR), Layout, and Custom template models support 134 new languages. These language additions include Greek, Latvian, Serbian, Thai, Ukrainian, and Vietnamese, along with several Latin and Cyrillic languages. Form Recognizer now has a total of 299 supported languages across the most recent GA and new preview versions. Refer to the [supported languages](language-support.md) page to see all supported languages.
azure-app-configuration Concept Config File https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-app-configuration/concept-config-file.md
Previously updated : 04/01/2022 Last updated : 10/28/2022 # Azure App Configuration support for configuration files
Key Vault references require a particular content type during importing, so you
```json {
- "Database": {
- "ConnectionString": "{\"uri\":\"https://<your-vault-name>.vault.azure.net/secrets/db-secret\"}"
- }
+ "Database:ConnectionString": {
+ "uri": "https://<your-vault-name>.vault.azure.net/secrets/db-secret"
+ }
} ```
-Run the following CLI command to import it with the `test` label, the colon (`:`) separator, and the Key Vault reference content type.
+Run the following CLI command to import it with the `test` label and the Key Vault reference content type.
```azurecli-interactive
-az appconfig kv import --label test --separator : --content-type application/vnd.microsoft.appconfig.keyvaultref+json;charset=utf-8 --name <your store name> --source file --path keyvault-refs.json --format json
+az appconfig kv import --label test --content-type application/vnd.microsoft.appconfig.keyvaultref+json;charset=utf-8 --name <your store name> --source file --path keyvault-refs.json --format json
``` The following table shows all the imported data in your App Configuration store.
The following table shows all the imported data in your App Configuration store.
||||| | .appconfig.featureflag/Beta | {"id":"Beta","description":"","enabled":false,"conditions":{"client_filters":[]}} | dev | application/vnd.microsoft.appconfig.ff+json;charset=utf-8 | | Logging:LogLevel:Default | Warning | dev | |
-| Database:ConnectionString | "{\"uri\":\"https://\<your-vault-name\>.vault.azure.net/secrets/db-secret\"}" | test | application/vnd.microsoft.appconfig.keyvaultref+json;charset=utf-8 |
+| Database:ConnectionString | {\"uri\":\"https://\<your-vault-name\>.vault.azure.net/secrets/db-secret\"} | test | application/vnd.microsoft.appconfig.keyvaultref+json;charset=utf-8 |
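An application that reads this key back can recognize the Key Vault reference by its content type and resolve the secret itself. A minimal stdlib-only sketch of that parsing step (the helper name and the URI-splitting logic are illustrative, not part of App Configuration's client libraries):

```python
import json
from urllib.parse import urlparse

KEYVAULT_REF_CONTENT_TYPE = "application/vnd.microsoft.appconfig.keyvaultref+json;charset=utf-8"

def parse_keyvault_reference(value, content_type):
    """Split a Key Vault reference value into (vault URL, secret name)."""
    if content_type != KEYVAULT_REF_CONTENT_TYPE:
        raise ValueError("not a Key Vault reference")
    # Value is JSON of the form {"uri": "https://<vault>.vault.azure.net/secrets/<name>"}
    uri = json.loads(value)["uri"]
    parsed = urlparse(uri)
    vault_url = f"{parsed.scheme}://{parsed.netloc}"
    secret_name = parsed.path.rsplit("/", 1)[-1]
    return vault_url, secret_name

vault, secret = parse_keyvault_reference(
    '{"uri":"https://my-vault.vault.azure.net/secrets/db-secret"}',
    KEYVAULT_REF_CONTENT_TYPE,
)
print(vault, secret)  # https://my-vault.vault.azure.net db-secret
```

In practice, the resolved vault URL and secret name would then be handed to a Key Vault client to fetch the actual secret value.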
## File content profile: KVSet
The following table shows all the imported data in your App Configuration store.
||||| | .appconfig.featureflag/Beta | {"id":"Beta","description":"Beta feature","enabled":**true**,"conditions":{"client_filters":[]}} | dev | application/vnd.microsoft.appconfig.ff+json;charset=utf-8 | | Logging:LogLevel:Default | **Debug** | dev | |
-| Database:ConnectionString | "{\"uri\":\"https://\<your-vault-name\>.vault.azure.net/secrets/db-secret\"}" | test | application/vnd.microsoft.appconfig.keyvaultref+json;charset=utf-8 |
+| Database:ConnectionString | {\"uri\":\"https://\<your-vault-name\>.vault.azure.net/secrets/db-secret\"} | test | application/vnd.microsoft.appconfig.keyvaultref+json;charset=utf-8 |
## Next steps
azure-arc Create Complete Managed Instance Indirectly Connected https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/create-complete-managed-instance-indirectly-connected.md
Title: Quickstart - Deploy Azure Arc-enabled data services - indirectly connected mode - Azure CLI
-description: Demonstrates how to deploy Azure Arc-enabled data services in indirectly connected mode from beginning, including a Kubernetes cluster. Uses Azure CLI. Finishes with an instance of Azure SQL Managed Instance.
+ Title: Quickstart - Deploy Azure Arc-enabled data services
+description: Quickstart - deploy Azure Arc-enabled data services in indirectly connected mode. Includes a Kubernetes cluster. Uses Azure CLI.
-+ Previously updated : 12/09/2021 Last updated : 09/20/2022 # Quickstart: Deploy Azure Arc-enabled data services - indirectly connected mode - Azure CLI
-This article demonstrates how to deploy Azure Arc-enabled data services in indirectly connected mode from with the Azure CLI.
-
-To deploy in directly connected mode, see [Quickstart: Deploy Azure Arc-enabled data services - directly connected mode - Azure portal](create-complete-managed-instance-directly-connected.md).
+In this quickstart, you will deploy Azure Arc-enabled data services in indirectly connected mode with the Azure CLI.
When you complete the steps in this article, you will have:
Use these objects to experience Azure Arc-enabled data services.
Azure Arc allows you to run Azure data services on-premises, at the edge, and in public clouds via Kubernetes. Deploy SQL Managed Instance and PostgreSQL server data services (preview) with Azure Arc. The benefits of using Azure Arc include staying current with constant service patches, elastic scale, self-service provisioning, unified management, and support for disconnected mode.
-## Install client tools
+## Prerequisites
+
+If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+
+To complete the task in this article, install the required [client tools](install-client-tools.md). Specifically, you will use the following tools:
-First, install the [client tools](install-client-tools.md) needed on your machine. To complete the steps in this article, you will use the following tools:
* Azure Data Studio * The Azure Arc extension for Azure Data Studio * Kubernetes CLI
The environment variables include passwords for log and metric services. The pas
Run the following command to set the credential.
-#### [Linux](#tab/linux)
+### [Linux](#tab/linux)
```console export AZDATA_LOGSUI_USERNAME=<username for logs>
export AZDATA_METRICSUI_USERNAME=<username for metrics>
export AZDATA_METRICSUI_PASSWORD=<password for metrics> ```
-#### [Windows / PowerShell](#tab/powershell)
+### [Windows / PowerShell](#tab/powershell)
```powershell $ENV:AZDATA_LOGSUI_USERNAME="<username for logs>"
NAME STATE
<namespace> Ready ``` - ## Connect to managed instance on Azure Data Studio To connect with Azure Data Studio, see [Connect to Azure Arc-enabled SQL Managed Instance](connect-managed-instance.md).
+## Upload usage and metrics to Azure portal
+
+If you wish, you can [Upload usage data, metrics, and logs to Azure](upload-metrics-and-logs-to-azure-monitor.md).
+
+## Clean up resources
+
+When you are done with the resources you created in this article, delete them.
+
+Follow the steps in [Delete data controller in indirectly connected mode](uninstall-azure-arc-data-controller.md#delete-data-controller-in-indirectly-connected-mode).
+ ## Next steps
-[Upload usage data, metrics, and logs to Azure](upload-metrics-and-logs-to-azure-monitor.md).
+> [!div class="nextstepaction"]
+> [Quickstart: Deploy Azure Arc-enabled data services - directly connected mode - Azure portal](create-complete-managed-instance-directly-connected.md).
azure-arc Validation Program https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/data/validation-program.md
To see how all Azure Arc-enabled components are validated, see [Validation progr
## Partners
+### DataON
+
+|Solution and version | Kubernetes version | Azure Arc-enabled data services version | SQL engine version | PostgreSQL server version
+|--|--|--|--|--|
+|DataON AZS-6224|1.23.8|v1.12.0_2022-10-11|16.0.537.5223||
+ ### Dell |Solution and version | Kubernetes version | Azure Arc-enabled data services version | SQL engine version | PostgreSQL server version
To see how all Azure Arc-enabled components are validated, see [Validation progr
|Solution and version | Kubernetes version | Azure Arc-enabled data services version | SQL engine version | PostgreSQL server version |--|--|--|--|--|
-| OpenShift 4.10.32 | v1.23.5 | v1.11.0_2022-09-13 | 16.0.312.4243 | postgres 12.3 (Ubuntu 12.3-1)|
+| OpenShift 4.10.16 | v1.23.5 | v1.11.0_2022-09-13 | 16.0.312.4243 | postgres 12.3 (Ubuntu 12.3-1)|
### VMware
azure-monitor Container Insights Prometheus Metrics Addon https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/containers/container-insights-prometheus-metrics-addon.md
Use the following procedure to install the Azure Monitor agent and the metrics a
#### Prerequisites - Register the `AKS-PrometheusAddonPreview` feature flag in the Azure Kubernetes clusters subscription with the following command in Azure CLI: `az feature register --namespace Microsoft.ContainerService --name AKS-PrometheusAddonPreview`.-- The aks-preview extension needs to be installed using the command `az extension add --name aks-preview`. For more information on how to install a CLI extension, see [Use and manage extensions with the Azure CLI](/azure/azure-cli-extensions-overview).
+- The aks-preview extension needs to be installed using the command `az extension add --name aks-preview`. For more information on how to install a CLI extension, see [Use and manage extensions with the Azure CLI](/cli/azure/azure-cli-extensions-overview).
- Azure CLI version 2.41.0 or higher is required for this feature. #### Install metrics addon
azure-monitor Diagnostic Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/diagnostic-settings.md
Each Azure resource requires its own diagnostic setting, which defines the follo
A single diagnostic setting can define no more than one of each of the destinations. If you want to send data to more than one of a particular destination type (for example, two different Log Analytics workspaces), create multiple settings. Each resource can have up to five diagnostic settings.
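These two limits can be checked mechanically before submitting settings. A minimal stdlib-only sketch under assumed data shapes (the dict layout and destination type names are illustrative, not the Azure Monitor API's):

```python
MAX_SETTINGS_PER_RESOURCE = 5

def validate_settings(settings):
    """Enforce: at most five settings per resource, and within one setting
    no destination type used more than once."""
    if len(settings) > MAX_SETTINGS_PER_RESOURCE:
        raise ValueError("a resource can have at most five diagnostic settings")
    for setting in settings:
        kinds = [d["type"] for d in setting["destinations"]]
        if len(kinds) != len(set(kinds)):
            raise ValueError("one setting may define at most one of each destination type")

# Two Log Analytics workspaces -> two separate settings,
# each with a single Log Analytics destination.
validate_settings([
    {"destinations": [{"type": "logAnalytics", "workspace": "ws-1"}]},
    {"destinations": [{"type": "logAnalytics", "workspace": "ws-2"}]},
])
```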
+> [!WARNING]
+> If you need to delete a resource, you should first delete its diagnostic settings. Otherwise, if you recreate this resource using the same name, the previous diagnostic settings will be included with the new resource. This will resume the collection of resource logs for the new resource as defined in a diagnostic setting and send the applicable metric and log data to the previously configured destination.
+ The following video walks you through routing resource platform logs with diagnostic settings. The video was done at an earlier time. Be aware of the following changes: - There are now four destinations. You can send platform metrics and logs to certain Azure Monitor partners.
azure-monitor Prometheus Remote Write Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/prometheus-remote-write-managed-identity.md
This is likely due to misconfiguration of the container. In order to view the co
kubectl get po <Prometheus-Pod-Name> -o json | jq -c '.spec.containers[] | select( .name | contains(" <Azure-Monitor-Side-Car-Container-Name> "))' ``` Output:
-{"env":[{"name":"INGESTION_URL","value":"https://rwtest-eus2-qu4m.eastus2-1.metrics.ingest.monitor.azure.com/dataCollectionRules/dcr-90b2d5e5feac43f486311dff33c3c116/streams/Microsoft-PrometheusMetrics/api/v1/write?api-version=2021-11-01-preview"},{"name":"LISTENING_PORT","value":"8081"},{"name":"IDENTITY_TYPE","value":"userAssigned"},{"name":"AZURE_CLIENT_ID","value":"fe9b242a-1cdb-4d30-86e4-14e432f326de"}],"image":"mcr.microsoft.com/azuremonitor/prometheus/promdev/prom-remotewrite:prom-remotewrite-20221012.2","imagePullPolicy":"Always","name":"prom-remotewrite","ports":[{"containerPort":8081,"name":"rw-port","protocol":"TCP"}],"resources":{},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File","volumeMounts":[{"mountPath":"/var/run/secrets/kubernetes.io/serviceaccount","name":"kube-api-access-vbr9d","readOnly":true}]}
+{"env":[{"name":"INGESTION_URL","value":"https://my-azure-monitor-workspace.eastus2-1.metrics.ingest.monitor.azure.com/dataCollectionRules/dcr-00000000000000000/streams/Microsoft-PrometheusMetrics/api/v1/write?api-version=2021-11-01-preview"},{"name":"LISTENING_PORT","value":"8081"},{"name":"IDENTITY_TYPE","value":"userAssigned"},{"name":"AZURE_CLIENT_ID","value":"00000000-0000-0000-0000-00000000000"}],"image":"mcr.microsoft.com/azuremonitor/prometheus/promdev/prom-remotewrite:prom-remotewrite-20221012.2","imagePullPolicy":"Always","name":"prom-remotewrite","ports":[{"containerPort":8081,"name":"rw-port","protocol":"TCP"}],"resources":{},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File","volumeMounts":[{"mountPath":"/var/run/secrets/kubernetes.io/serviceaccount","name":"kube-api-access-vbr9d","readOnly":true}]}
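That spot check can also be scripted. A minimal stdlib-only sketch, assuming the pod JSON shape shown above (the helper and the trimmed example spec are illustrative):

```python
import json

def sidecar_env(pod_json, container_name):
    """Return the named container's environment variables as a dict."""
    pod = json.loads(pod_json)
    for container in pod["spec"]["containers"]:
        if container_name in container["name"]:
            return {e["name"]: e.get("value") for e in container.get("env", [])}
    raise KeyError(container_name)

# Example pod spec trimmed to just the fields the check needs.
pod = json.dumps({"spec": {"containers": [{
    "name": "prom-remotewrite",
    "env": [{"name": "IDENTITY_TYPE", "value": "userAssigned"},
            {"name": "AZURE_CLIENT_ID", "value": "00000000-0000-0000-0000-00000000000"}],
}]}})

env = sidecar_env(pod, "prom-remotewrite")
assert env["IDENTITY_TYPE"] == "userAssigned"  # fail fast on misconfiguration
print(env["AZURE_CLIENT_ID"])
```

The same function can be fed the output of `kubectl get po <Prometheus-Pod-Name> -o json` to check a live pod.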
Verify the configuration values, especially `AZURE_CLIENT_ID` and `IDENTITY_TYPE` ## Next steps
-= [Setup Grafana to use Managed Prometheus as a data source](prometheus-grafana.md).
+- [Setup Grafana to use Managed Prometheus as a data source](prometheus-grafana.md).
- [Learn more about Azure Monitor managed service for Prometheus](prometheus-metrics-overview.md).
azure-monitor Stream Monitoring Data Event Hubs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/stream-monitoring-data-event-hubs.md
Routing your monitoring data to an event hub with Azure Monitor enables you to e
|:|:| :| | IBM QRadar | No | The Microsoft Azure DSM and Microsoft Azure Event Hubs Protocol are available for download from [the IBM support website](https://www.ibm.com/support). | | Splunk | No | [Splunk Add-on for Microsoft Cloud Services](https://splunkbase.splunk.com/app/3110/) is an open source project available in Splunkbase. <br><br> If you can't install an add-on in your Splunk instance, if for example you're using a proxy or running on Splunk Cloud, you can forward these events to the Splunk HTTP Event Collector using [Azure Function For Splunk](https://github.com/Microsoft/AzureFunctionforSplunkVS), which is triggered by new messages in the event hub. |
-| SumoLogic | No | Instructions for setting up SumoLogic to consume data from an event hub are available at [Collect Logs for the Azure Audit App from Event Hubs](https://help.sumologic.com/Send-Data/Applications-and-Other-Data-Sources/Azure-Audit/02Collect-Logs-for-Azure-Audit-from-Event-Hub). |
+| SumoLogic | No | Instructions for setting up SumoLogic to consume data from an event hub are available at [Collect Logs for the Azure Audit App from Event Hubs](https://help.sumologic.com/docs/integrations/microsoft-azure/audit/#collecting-logs-for-the-azure-audit-app-from-event-hub). |
| ArcSight | No | The ArcSight Azure Event Hubs smart connector is available as part of [the ArcSight smart connector collection](https://community.microfocus.com/cyberres/arcsight/f/arcsight-product-announcements/163662/announcing-general-availability-of-arcsight-smart-connectors-7-10-0-8114-0). | | Syslog server | No | If you want to stream Azure Monitor data directly to a syslog server, you can use a [solution based on an Azure function](https://github.com/miguelangelopereira/azuremonitor2syslog/). | LogRhythm | No| Instructions to set up LogRhythm to collect logs from an event hub are available [here](https://logrhythm.com/six-tips-for-securing-your-azure-cloud-environment/).
azure-monitor Partners https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/partners.md
If you use Azure Monitor to route monitoring data to an event hub, you can easil
|:|:| :| | IBM QRadar | No | The Microsoft Azure DSM and the Microsoft Azure Event Hubs protocol are available for download from [the IBM support website](https://www.ibm.com/support). You can learn more about the integration with Azure at [QRadar DSM configuration](https://www.ibm.com/support/knowledgecenter/SS42VS_DSM/c_dsm_guide_microsoft_azure_overview.html?cp=SS42VS_7.3.0). | | Splunk | No | The [Azure Monitor Add-On for Splunk](https://splunkbase.splunk.com/app/3757/) is an open-source project available in Splunkbase. <br><br> If you can't install an add-on in your Splunk instance (because, for example, you're using a proxy or running on Splunk Cloud), you can forward these events to the Splunk HTTP Event Collector by using [Azure Function For Splunk](https://github.com/Microsoft/AzureFunctionforSplunkVS). Azure Function For Splunk is triggered by new messages in the event hub. |
-| Sumo Logic | No | Instructions for setting up Sumo Logic to consume data from an event hub are available at [Collect Logs for the Azure Audit App from Event Hub](https://help.sumologic.com/Send-Data/Applications-and-Other-Data-Sources/Azure-Audit/02Collect-Logs-for-Azure-Audit-from-Event-Hub). |
+| Sumo Logic | No | Instructions for setting up Sumo Logic to consume data from an event hub are available at [Collect Logs for the Azure Audit App from Event Hub](https://help.sumologic.com/docs/integrations/microsoft-azure/audit/#collecting-logs-for-the-azure-audit-app-from-event-hub). |
| ArcSight | No | The ArcSight smart connector for Azure Event Hubs is available as part of the [ArcSight smart connector collection](https://community.microfocus.com/cyberres/arcsight/f/arcsight-product-announcements/163662/announcing-general-availability-of-arcsight-smart-connectors-7-10-0-8114-0). | | Syslog server | No | If you want to stream Azure Monitor data directly to a syslog server, you can use a [solution based on an Azure function](https://github.com/miguelangelopereira/azuremonitor2syslog/). | LogRhythm | No| Instructions to set up LogRhythm to collect logs from an event hub are available on the [LogRhythm website](https://logrhythm.com/six-tips-for-securing-your-azure-cloud-environment/).
azure-netapp-files Create Active Directory Connections https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/create-active-directory-connections.md
na Previously updated : 09/27/2022 Last updated : 10/27/2022 # Create and manage Active Directory connections for Azure NetApp Files
Several features of Azure NetApp Files require that you have an Active Directory
![Screenshot of the AES description field. The field is a checkbox.](../media/azure-netapp-files/active-directory-aes-encryption.png)
- See [Requirements for Active Directory connections](#requirements-for-active-directory-connections) for requirements.
- ![Active Directory AES encryption](../media/azure-netapp-files/active-directory-aes-encryption.png)
+ See [Requirements for Active Directory connections](#requirements-for-active-directory-connections) for requirements.
* <a name="ldap-signing"></a>**LDAP Signing**
azure-netapp-files Volume Hard Quota Guidelines https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/volume-hard-quota-guidelines.md
There's no change in resource limits for Azure NetApp Files beyond the quota cha
### Is there an example ANFCapacityManager workflow?
-Yes. See the [Volume AutoGrow Workflow Example GitHub page](https://github.com/ANFTechTeam/ANFCapacityManager/blob/main/ResizeWorkflow.md).
+Yes. See the [Volume AutoGrow Workflow Example GitHub page](https://github.com/ANFTechTeam/ANFCapacityManager/blob/master/ResizeWorkflow.md).
### Is ANFCapacityManager Microsoft supported?
azure-portal Azure Portal Safelist Urls https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-portal/azure-portal-safelist-urls.md
aka.ms (Microsoft short URL)
*.aad.azure.com (Azure AD) *.aadconnecthealth.azure.com (Azure AD) ad.azure.com (Azure AD)
+adf.azure.com (Azure Data Factory)
api.aadrm.com (Azure AD) api.loganalytics.io (Log Analytics Service) *.applicationinsights.azure.com (Application Insights Service)
azure-resource-manager Resources Without Resource Group Limit https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/resources-without-resource-group-limit.md
Some resources have a limit on the number of instances per region. This limit is di
## Microsoft.DevTestLab
-* labs/virtualMachines - By default, limited to 800 instances. That limit can be increased by [registering the following features](preview-features.md) - Microsoft.DevTestLab/DisableLabVirtualMachineQuota
* schedules ## Microsoft.EdgeOrder
azure-video-indexer Edit Transcript Lines Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/edit-transcript-lines-portal.md
Title: Insert or remove transcript lines in Azure Video Indexer portal
-description: This article explains how to insert or remove a transcript line in Azure Video Indexer portal.
+ Title: Insert or remove transcript lines in Azure Video Indexer website
+description: This article explains how to insert or remove a transcript line in the Azure Video Indexer website.
Last updated 05/03/2022
-# Insert or remove transcript lines in Video Indexer portal
+# Insert or remove transcript lines in the Azure Video Indexer website
-This article explains how to insert or remove a transcript line in Azure Video Indexer portal.
+This article explains how to insert or remove a transcript line in the [Azure Video Indexer website](https://www.videoindexer.ai/).
## Add new line to the transcript timeline
azure-video-indexer Insights Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/insights-overview.md
Last updated 10/19/2022
-# Azure Video Indexer insights
+# Insights and Azure Video Indexer responsible use of AI
Insights contain an aggregated view of the data: faces, topics, emotions. Azure Video Indexer analyzes the video and audio content by running 30+ AI models, generating rich insights. For more information about available models, see [overview](video-indexer-overview.md).
+## Concepts
-The [Azure Video Indexer](https://www.videoindexer.ai/) website enables you to use your video's deep insights to: find the right media content, locate the parts that you're interested in, and use the results to create an entirely new project. Once created, the project can be rendered and downloaded from Azure Video Indexer and be used in your own editing applications or downstream workflows. For more information, see [Use editor to create projects](use-editor-create-project.md).
+Before you start using the insights, make sure to check [Limited Access features of Azure Video Indexer](limited-access-features.md).
-Once you are [set up](video-indexer-get-started.md) with Azure Video Indexer, start using [insights](video-indexer-output-json-v2.md) and check out other **How to guides**.
+Then, check out Azure Video Indexer insights [transparency notes and use cases](/legal/azure-video-indexer/transparency-note?context=/azure/azure-video-indexer/context/context):
+
+* [Audio effects detection](/legal/azure-video-indexer/audio-effects-detection-transparency-note?context=/azure/azure-video-indexer/context/context)
+* [Faces detection](/legal/azure-video-indexer/face-detection-transparency-note?context=/azure/azure-video-indexer/context/context)
+* [OCR](/legal/azure-video-indexer/ocr-transparency-note?context=/azure/azure-video-indexer/context/context)
+* [Keywords extraction](/legal/azure-video-indexer/keywords-transparency-note?context=/azure/azure-video-indexer/context/context)
+* [Transcription, translation, language](/legal/azure-video-indexer/transcription-translation-lid-transparency-note?context=/azure/azure-video-indexer/context/context)
+* [Labels identification](/legal/azure-video-indexer/labels-identification-transparency-note?context=/azure/azure-video-indexer/context/context)
+* [Named entities](/legal/azure-video-indexer/named-entities-transparency-note?context=/azure/azure-video-indexer/context/context)
+* [Observed people tracking & matched faces](/legal/azure-video-indexer/observed-matched-people-transparency-note?context=/azure/azure-video-indexer/context/context)
+* [Topics inference](/legal/azure-video-indexer/topics-inference-transparency-note?context=/azure/azure-video-indexer/context/context)
+
+## Next steps
+
+Once you are [set up](video-indexer-get-started.md) with Azure Video Indexer, start using [insights](video-indexer-output-json-v2.md) and check out other **How to guides** that demonstrate how to navigate the website.
azure-video-indexer Observed People Featured Clothing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/observed-people-featured-clothing.md
This article discusses how to view the featured clothing insight and how the fea
You can view the following short video that discusses how to view and use the featured clothing insight.
-[An intro video](https://www.youtube.com/watch?v=x33fND286eE).
+> [!VIDEO https://www.microsoft.com/videoplayer/embed//RE5b4JJ]
## Viewing featured clothing
azure-video-indexer Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/release-notes.md
For more information, see [supported languages](language-support.md).
Use the [Patch person model](https://api-portal.videoindexer.ai/api-details#api=Operations&operation=Patch-Person-Model) API to configure the confidence level for face recognition within a person model.
+### View speakers in closed captions
+
+You can now view speakers in closed captions of the Azure Video Indexer media player. For more information, see [View closed captions in the Azure Video Indexer website](view-closed-captions.md).
+
+### Control face and people bounding boxes using parameters
+
+The new `boundingBoxes` URL parameter controls the option to set bounding boxes on/off when embedding a player. For more information, see [Embed widgets](video-indexer-embed-widgets.md#player-widget).
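As an illustration of how such a URL parameter is applied, here's a hedged sketch of appending `boundingBoxes` to a player-widget embed URL. The base URL shape, the account/video IDs, and the on/off value format are assumptions for illustration; see the Embed widgets article for the exact format:

```python
from urllib.parse import urlencode

# Sketch of toggling the boundingBoxes parameter on a player-widget
# embed URL. The base URL shape and IDs are hypothetical placeholders.
def player_embed_url(account_id, video_id, bounding_boxes=False):
    base = f"https://www.videoindexer.ai/embed/player/{account_id}/{video_id}/"
    query = {"boundingBoxes": "true" if bounding_boxes else "false"}
    return base + "?" + urlencode(query)

url = player_embed_url("my-account", "my-video", bounding_boxes=True)
print(url)  # → ...?boundingBoxes=true
```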
+
+### Control autoplay from the account settings
+
+You can control whether a media file autoplays when opened in the webapp through the user settings. Navigate to the [Azure Video Indexer website](https://www.videoindexer.ai/) -> the **Gear** icon (the top-right corner) -> **User settings** -> **Auto-play media files**.
+
+### Copy video ID from the player view
+
+**Copy video ID** is available when you select the video in the [Azure Video Indexer website](https://www.videoindexer.ai/).
+
+### New dark theme in native Azure colors
+
+Select the desired theme in the [Azure Video Indexer website](https://www.videoindexer.ai/). Select the **Gear** icon (the top-right corner) -> **User settings**.
+
+### Search or filter the account list
+
+You can search or filter the account list using the account name or region. Select **User accounts** in the top-right corner of the [Azure Video Indexer website](https://www.videoindexer.ai/).
+ ## September 2022 ### General availability of ARM-based accounts
azure-video-indexer View Closed Captions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-indexer/view-closed-captions.md
+
+ Title: View closed captions
+description: Learn how to view captions using the Azure Video Indexer website.
+ Last updated : 10/24/2022++
+# View closed captions in the Azure Video Indexer website
+
+This article shows how to view closed captions in the [Azure Video Indexer video player](https://www.videoindexer.ai).
+
+## View closed captions
+
+1. Go to the [Azure Video Indexer](https://www.videoindexer.ai/) website and sign in.
+1. Select a video for which you want to view captions.
+1. On the bottom of the Azure Video Indexer video player, select **Closed Captioning** (in some browsers it's located under the **Captions** menu; in others, under the **gear** icon).
+1. Under **Closed Captioning**, select a language in which you want to view captions. For example, **English**. Once checked, you see the captions in English.
+1. To see a speaker in front of the caption, select **Settings** under **Closed Captioning** and check **Show speakers** (under **Configurations**) -> press **Done**.
+
+## Next steps
+
+See how to [Insert or remove transcript lines in the Azure Video Indexer website](edit-transcript-lines-portal.md) and other how-to articles that demonstrate how to navigate in the Azure Video Indexer website.
azure-web-pubsub Howto Create Serviceclient With Net And Azure Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-web-pubsub/howto-create-serviceclient-with-net-and-azure-identity.md
This how-to guide shows you how to create a `WebPubSubServiceClient` using Azure
} ```
- Learn how to use this client, see [Azure Web PubSub service client library for .NET](/dotnet/api/overview/azure/messaging.webpubsub-readme-pre)
+ To learn how to use this client, see [Azure Web PubSub service client library for .NET](/dotnet/api/overview/azure/messaging.webpubsub-readme)
## Complete sample
backup Backup Azure Dataprotection Use Rest Api Create Update Blob Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-dataprotection-use-rest-api-create-update-blob-policy.md
Title: Create backup policies for blobs using data protection REST API
+ Title: Create Azure Backup policies for blobs using data protection REST API
description: In this article, you'll learn how to create and manage backup policies for blobs using REST API.- Previously updated : 07/09/2021+ Last updated : 10/28/2022 ms.assetid: 472d6a4f-7914-454b-b8e4-062e8b556de3++++ # Create Azure Data Protection backup policies for blobs using REST API
+Azure Backup policy typically governs the retention and schedule of your backups. As operational backup for blobs is continuous in nature, you don't need a schedule to perform backups. The policy is essentially needed to specify the retention period. You can reuse the backup policy to configure backup for multiple storage accounts to a vault.
+ > [!IMPORTANT]
-> Read [this section](blob-backup-configure-manage.md#before-you-start) before proceeding to create the policy and configuring backups for Azure blobs.
+> Before you proceed to create the policy and configure backups for Azure blobs, see [this section](blob-backup-configure-manage.md#before-you-start).
-A backup policy typically governs the retention and schedule of your backups. Since operational backup for blobs is continuous in nature, you don't need a schedule to perform backups. The policy is essentially needed to specify the retention period. You can reuse the backup policy to configure backup for multiple storage accounts to a vault.
+This article describes how to create a policy for blobs in a storage account. Learn about [the process to create a backup policy for an Azure Recovery Services vault using REST API](/rest/api/dataprotection/backup-policies/create-or-update).
>[!NOTE]
->Restoring over long durations may lead to restore operations taking longer to complete. Furthermore, the time that it takes to restore a set of data is based on the number of write and delete operations made during the restore period. For example, an account with one million objects with 3,000 objects added per day and 1,000 objects deleted per day will require approximately two hours to restore to a point 30 days in the past. A retention period and restoration more than 90 days in the past would not be recommended for an account with this rate of change.
+>Restoring over long durations may lead to restore operations taking longer to complete. Further, the time that it takes to restore a set of data is based on the number of write and delete operations made during the restore period.
+>For example, an account with one million objects with 3,000 objects added per day and 1,000 objects deleted per day will require approximately two hours to restore to a point 30 days in the past. A retention period and restoration more than 90 days in the past would not be recommended for an account with this rate of change.
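The restore-time example in the note above can be sketched as a back-of-the-envelope estimate, assuming restore time scales linearly with the number of write and delete operations in the restore window. The ~1,000 operations/minute throughput is inferred from the example (120,000 ops ≈ 2 hours), not a documented service figure:

```python
# Rough restore-time estimate: restore time scales with the number of
# write and delete operations made during the restore period. The
# ops_per_minute default is inferred from the article's example, not a
# documented guarantee.
def estimate_restore_hours(writes_per_day, deletes_per_day, days_back,
                           ops_per_minute=1000):
    ops = (writes_per_day + deletes_per_day) * days_back
    return ops / ops_per_minute / 60

# The article's example: 3,000 adds + 1,000 deletes per day, 30 days back.
print(estimate_restore_hours(3000, 1000, 30))  # → 2.0
```

Under the same assumption, restoring 90 days back for this account would take roughly three times as long, which is why long retention windows aren't recommended for accounts with this rate of change.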
-The steps to create a backup policy for an Azure Recovery Services vault are outlined in the policy [REST API document](/rest/api/dataprotection/backup-policies/create-or-update). Let's use this document as a reference to create a policy for blobs in a storage account.
+In this article, you'll learn about:
-## Create a policy
+> [!div class="checklist"]
+> - Create a policy
+> - Create the request body
+> - Responses
-> [!IMPORTANT]
-> Currently, we do not support updating or modifying an existing policy. An alternative is to create a new policy with the required details and assign it to the relevant backup instance.
+## Create a policy
-To create an Azure Backup policy, use the following *PUT* operation
+To create an Azure Backup policy, use the following *PUT* operation:
```http PUT https://management.azure.com/Subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataProtection/backupVaults/{vaultName}/backupPolicies/{policyName}?api-version=2021-01-01 ```
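As a sketch, the *PUT* URI above can be assembled from its path segments. The resource names used here are hypothetical placeholders:

```python
# Minimal sketch of assembling the policy PUT URI from its parts.
# The subscription ID, resource group, vault, and policy names below
# are illustrative placeholders, not real resources.
def policy_put_uri(subscription_id, resource_group, vault_name, policy_name,
                   api_version="2021-01-01"):
    return (
        "https://management.azure.com/Subscriptions/{sub}/resourceGroups/{rg}"
        "/providers/Microsoft.DataProtection/backupVaults/{vault}"
        "/backupPolicies/{policy}?api-version={api}"
    ).format(sub=subscription_id, rg=resource_group, vault=vault_name,
             policy=policy_name, api=api_version)

uri = policy_put_uri("00000000-0000-0000-0000-000000000000",
                     "myResourceGroup", "myVault", "myBlobPolicy")
print(uri)
```

The request body (described in the next section) is sent as JSON with this *PUT*, along with a bearer token for authorization.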
-The `{policyName}` and `{vaultName}` are provided in the URI. Additional information is provided in the request body.
-The `{policyName}` and `{vaultName}` are provided in the URI. Additional information is provided in the request body.
+
+> [!IMPORTANT]
+> Currently, we don't support updating or modifying an existing policy. Instead, you can create a new policy with the required details and assign it to the relevant backup instance.
## Create the request body
-For example, to create a policy for Blob backup, following are the components of the request body.
+For example, to create a policy for Blob backup, the following are the components of the request body:
|Name |Required |Type |Description | |||||
-|properties | True | BaseBackupPolicy:[BackupPolicy](/rest/api/dataprotection/backup-policies/create-or-update#backuppolicy) | BaseBackupPolicyResource properties |
+|`properties` | True | BaseBackupPolicy:[BackupPolicy](/rest/api/dataprotection/backup-policies/create-or-update#backuppolicy) | BaseBackupPolicyResource properties |
-For the complete list of definitions in the request body, refer to the [backup policy REST API document](/rest/api/dataprotection/backup-policies/create-or-update).
+For the complete list of definitions in the request body, see the [backup policy REST API document](/rest/api/dataprotection/backup-policies/create-or-update).
### Example request body
The policy says:
``` > [!IMPORTANT]
-> The time formats for support only DateTime. They don't support Time format alone.
+> The supported time format is *DateTime* only. The *Time* format alone isn't supported.
## Responses
-The backup policy creation/update is a synchronous operation and returns OK once the operation is successful.
+The backup policy creation/update is an asynchronous operation and returns *OK* once the operation is successful.
|Name |Type |Description | ||||
backup Backup Encryption https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-encryption.md
Title: Encryption in Azure Backup description: Learn how encryption features in Azure Backup help you protect your backup data and meet the security needs of your business. Previously updated : 05/25/2021- Last updated : 10/28/2022++++ # Encryption in Azure Backup
-All your backed-up data is automatically encrypted when stored in the cloud using Azure Storage encryption, which helps you meet your security and compliance commitments. This data at rest is encrypted using 256-bit AES encryption, one of the strongest block ciphers available, and is FIPS 140-2 compliant. In addition to encryption at rest, all your backup data in transit is transferred over HTTPS. It always remains on the Azure backbone network.
+Azure Backup automatically encrypts all your backed-up data while storing it in the cloud using Azure Storage encryption, which helps you meet your security and compliance commitments. This data at rest is encrypted using 256-bit AES encryption (one of the strongest block ciphers available, and FIPS 140-2 compliant). Additionally, all your backup data in transit is transferred over HTTPS. It always remains on the Azure backbone network.
-## Levels of encryption in Azure Backup
+This article describes the levels of encryption in Azure Backup that help protect your backed-up data.
+
+## Encryption levels
Azure Backup includes encryption on two levels: -- **Encryption of data in the Recovery Services vault**
- - **Using platform-managed keys**: By default, all your data is encrypted using platform-managed keys. You don't need to take any explicit action from your end to enable this encryption. It applies to all workloads being backed up to your Recovery Services vault.
- - **Using customer-managed keys**: When backing up your Azure Virtual Machines, you can choose to encrypt your data using encryption keys owned and managed by you. Azure Backup lets you use your RSA keys stored in the Azure Key Vault for encrypting your backups. The encryption key used for encrypting backups may be different from the one used for the source. The data is protected using an AES 256 based data encryption key (DEK), which is, in turn, protected using your keys. This gives you full control over the data and the keys. To allow encryption, it's required that you grant the Recovery Services vault access to the encryption key in the Azure Key Vault. You can disable the key or revoke access whenever needed. However, you must enable encryption using your keys before you attempt to protect any items to the vault. [Learn more here](encryption-at-rest-with-cmk.md).
- - **Infrastructure-level encryption**: In addition to encrypting your data in the Recovery Services vault using customer-managed keys, you can also choose to have an additional layer of encryption configured on the storage infrastructure. This infrastructure encryption is managed by the platform. Together with encryption at rest using customer-managed keys, it allows two-layer encryption of your backup data. Infrastructure encryption can only be configured if you first choose to use your own keys for encryption at rest. Infrastructure encryption uses platform-managed keys for encrypting data.
-- **Encryption specific to the workload being backed up**
- - **Azure virtual machine backup**: Azure Backup supports backup of VMs with disks encrypted using [platform-managed keys](../virtual-machines/disk-encryption.md#platform-managed-keys), as well as [customer-managed keys](../virtual-machines/disk-encryption.md#customer-managed-keys) owned and managed by you. In addition, you can also back up your Azure Virtual machines that have their OS or data disks encrypted using [Azure Disk Encryption](backup-azure-vms-encryption.md#encryption-support-using-ade). ADE uses BitLocker for Windows VMs, and DM-Crypt for Linux VMs, to perform in-guest encryption.
- - **TDE - enabled database backup is supported**. To restore a TDE-encrypted database to another SQL Server, you need to first [restore the certificate to the destination server](/sql/relational-databases/security/encryption/move-a-tde-protected-database-to-another-sql-server). The backup compression for TDE-enabled databases for SQL Server 2016 and newer versions is available, but at lower transfer size as explained [here](https://techcommunity.microsoft.com/t5/sql-server/backup-compression-for-tde-enabled-databases-important-fixes-in/ba-p/385593).
+| Encryption level | Description |
+| | |
+| **Encryption of data in the Recovery Services vault** | - **Using platform-managed keys**: By default, all your data is encrypted using platform-managed keys. You don't need to take any explicit action from your end to enable this encryption. It applies to all workloads being backed-up to your Recovery Services vault. <br><br> - **Using customer-managed keys**: When backing up your Azure Virtual Machines, you can choose to encrypt your data using encryption keys owned and managed by you. Azure Backup lets you use your RSA keys stored in the Azure Key Vault for encrypting your backups. The encryption key used for encrypting backups may be different from the one used for the source. The data is protected using an AES 256 based data encryption key (DEK), which is, in turn, protected using your keys. This gives you full control over the data and the keys. To allow encryption, it's required that you grant the Recovery Services vault access to the encryption key in the Azure Key Vault. You can disable the key or revoke access whenever needed. However, you must enable encryption using your keys before you attempt to protect any items to the vault. [Learn more here](encryption-at-rest-with-cmk.md). <br><br> - **Infrastructure-level encryption**: In addition to encrypting your data in the Recovery Services vault using customer-managed keys, you can also choose to have an additional layer of encryption configured on the storage infrastructure. This infrastructure encryption is managed by the platform. Together with encryption at rest using customer-managed keys, it allows two-layer encryption of your backup data. Infrastructure encryption can only be configured if you first choose to use your own keys for encryption at rest. Infrastructure encryption uses platform-managed keys for encrypting data. |
+| **Encryption specific to the workload being backed-up** | - **Azure virtual machine backup**: Azure Backup supports backup of VMs with disks encrypted using [platform-managed keys](../virtual-machines/disk-encryption.md#platform-managed-keys), as well as [customer-managed keys](../virtual-machines/disk-encryption.md#customer-managed-keys) owned and managed by you. In addition, you can also back up your Azure Virtual machines that have their OS or data disks encrypted using [Azure Disk Encryption](backup-azure-vms-encryption.md#encryption-support-using-ade). ADE uses BitLocker for Windows VMs, and DM-Crypt for Linux VMs, to perform in-guest encryption. <br><br> - **TDE - enabled database backup is supported**. To restore a TDE-encrypted database to another SQL Server, you need to first [restore the certificate to the destination server](/sql/relational-databases/security/encryption/move-a-tde-protected-database-to-another-sql-server). The backup compression for TDE-enabled databases for SQL Server 2016 and newer versions is available, but at lower transfer size as explained [here](https://techcommunity.microsoft.com/t5/sql-server/backup-compression-for-tde-enabled-databases-important-fixes-in/ba-p/385593). |
## Next steps
backup Private Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/private-endpoints.md
Title: Create and use private endpoints for Azure Backup description: Understand the process to creating private endpoints for Azure Backup where using private endpoints helps maintain the security of your resources.- Previously updated : 11/09/2021+ Last updated : 10/28/2022
Private endpoints for Backup can be only created for Recovery Services vaults th
See [this section](#create-a-recovery-services-vault-using-the-azure-resource-manager-client) to learn how to create a vault using the Azure Resource Manager client. This creates a vault with its managed identity already enabled.
+## Deny public network access to the vault
+
+You can configure your vaults to deny access from public networks.
+
+Follow these steps:
+
+1. Go to the *vault* > **Networking**.
+
+2. On the **Public access** tab, select **Deny** to prevent access from public networks.
+
+ :::image type="content" source="./media/backup-azure-private-endpoints/deny-public-network.png" alt-text="Screenshot showing how to select the Deny option.":::
+
+ >[!Note]
+ >Once you deny access, you can still access the vault, but you can't move data to/from networks that don't contain private endpoints. For more information, see [Create private endpoints for Azure Backup](#create-private-endpoints-for-azure-backup).
+
+3. Select **Apply** to save the changes.
+ ## Enable Managed Identity for your vault Managed identities allow the vault to create and use private endpoints. This section talks about enabling the managed identity for your vault.
Once the private endpoints created for the vault in your VNet have been approved
In the VM in the locked down network, ensure the following:
-1. The VM should have access to AAD.
+1. The VM should have access to Azure AD.
2. Execute **nslookup** on the backup URL (`xxxxxxxx.privatelink.<geo>.backup.windowsazure.com`) from your VM, to ensure connectivity. This should return the private IP assigned in your virtual network. ### Configure backup
backup Tutorial Restore Disk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/tutorial-restore-disk.md
Title: Tutorial - Restore a VM with Azure CLI description: Learn how to restore a disk and create a recover a VM in Azure with Backup and Recovery Services. Previously updated : 04/25/2022 Last updated : 10/28/2022
az backup restore restore-disks \
--vault-name myRecoveryServicesVault \ --container-name myVM \ --item-name myVM \
- --restore-mode OriginalLocation
+ --restore-mode OriginalLocation \
--storage-account mystorageaccount \
- --rp-name myRecoveryPointName \
+ --target-resource-group "Target_RG" \
+ --rp-name myRecoveryPointName \
+ --target-vm-name "TargetVirtualMachineName" \
+ --target-vnet-name "Target_VNet" \
+ --target-vnet-resource-group "Target_VNet_RG" \
+ --target-subnet-name "targetSubNet"
```

```output
chaos-studio Chaos Studio Permissions Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/chaos-studio/chaos-studio-permissions-security.md
All user interactions with Chaos Studio happen through Azure Resource Manager. I
Azure Chaos Studio doesn't support Private Link for agent-based scenarios.
+## Service tags
+A service tag is a group of IP address prefixes that can be assigned to inbound and outbound NSG rules. Updates to the group of IP address prefixes are handled automatically, so you can use service tags to explicitly allow inbound traffic from Chaos Studio without needing to know the platform's IP addresses. Currently, service tags can be enabled via PowerShell.
+* A limitation of service tags is that they can only be used with resources that have a public IP address. If a resource has only a private IP address, service tags can't be used to allow traffic to route to it.
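Service tags resolve to a published list of IP address prefixes. As an illustration of what an NSG rule based on them does, here is a minimal sketch in Python of checking whether an inbound address falls within a tag's prefixes; the prefix values are hypothetical, not Chaos Studio's real ranges:

```python
import ipaddress

def address_in_service_tag(address: str, prefixes: list[str]) -> bool:
    """Return True if the address falls within any of the tag's prefixes."""
    ip = ipaddress.ip_address(address)
    return any(ip in ipaddress.ip_network(p) for p in prefixes)

# Hypothetical prefixes for illustration only -- not real Chaos Studio ranges.
chaos_studio_prefixes = ["20.38.0.0/16", "40.74.0.0/15"]

print(address_in_service_tag("20.38.1.10", chaos_studio_prefixes))    # True
print(address_in_service_tag("192.168.1.10", chaos_studio_prefixes))  # False
```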
+## Data encryption

Chaos Studio encrypts all data by default. Chaos Studio only accepts input for system properties like managed identity object IDs, experiment/step/branch names, and fault parameters (for example, the network port range to block in a network disconnect fault). These properties shouldn't be used to store sensitive data such as payment information or passwords. For more on how Chaos Studio protects your data, see [the Azure customer data protection article](../security/fundamentals/protection-customer-data.md).
cognitive-services Multivariate How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Anomaly-Detector/How-to/multivariate-how-to.md
- Title: How to use Multivariate Anomaly Detector APIs on your time series data-
-description: Learn how to detect anomalies in your data with multivariate anomaly detector.
------ Previously updated : 06/07/2022---
-# How to: Use Multivariate Anomaly Detector on your time series data
-
-The Multivariate Anomaly Detector (MVAD), in contrast to the Univariate Anomaly Detector (UVAD), provides two primary operations: **training** and **inference**. During the inference process, you can choose an asynchronous API or a synchronous API to trigger inference one time. Both APIs support batch and streaming scenarios.
-
-The following are the basic steps needed to use MVAD:
- 1. Create an Anomaly Detector resource in the Azure portal.
- 1. Prepare data for training and inference.
- 1. Train an MVAD model.
- 1. Get model status.
- 1. Detect anomalies during the inference process with the trained MVAD model.
-
-To test out this feature, try this SDK [Notebook](https://github.com/Azure-Samples/AnomalyDetector/blob/master/ipython-notebook/API%20Sample/Multivariate%20API%20Demo%20Notebook.ipynb). For more instructions on how to run a Jupyter notebook, see [Install and Run a Jupyter Notebook](https://jupyter-notebook-beginner-guide.readthedocs.io/en/latest/install.html#).
-
-## Multivariate Anomaly Detector APIs overview
-
-Generally, Multivariate Anomaly Detector includes a set of APIs covering the whole lifecycle of training and inference. For more information, refer to [Anomaly Detector API Operations](https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector-v1-1-preview-1/operations/DetectAnomaly). Here are the **8 APIs** in MVAD:
-
-| APIs | Description |
-| - | - |
-| `/multivariate/models`| Create and train model using training data. |
-| `/multivariate/models/{modelid}`| Get model info including training status and parameters used in the model.|
-| `/multivariate/models[?$skip][&$top]`|List models in a subscription. |
-| `/multivariate/models/{modelid}/detect`| Submit asynchronous inference task with data. |
-| `/multivariate/models/{modelId}/last/detect`| Submit synchronous inference task with data. |
-| `/multivariate/results/{resultid}` | Get inference result with resultID in asynchronous inference. |
-| `/multivariate/models/{modelId}`| Delete an existing multivariate model according to the modelId. |
-| `/multivariate/models/{modelId}/export`| Export model as a Zip file. |
--
-## Create an Anomaly Detector resource in Azure portal
-
-* Create an Azure subscription if you don't have one - [Create one for free](https://azure.microsoft.com/free/cognitive-services)
-* Once you have your Azure subscription, [create an Anomaly Detector resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesAnomalyDetector) in the Azure portal to get your API key and API endpoint.
-
-> [!NOTE]
-> During preview stage, MVAD is available in limited regions only. Please bookmark [What's new in Anomaly Detector](../whats-new.md) to keep up to date with MVAD region roll-outs. You could also file a GitHub issue or contact us at [AnomalyDetector@microsoft.com](mailto:AnomalyDetector@microsoft.com) to request information regarding the timeline for specific regions being supported.
--
-## Data preparation
-
-Next you need to prepare your training data (and inference data for the asynchronous API).
---
-## Train an MVAD model
-
-In this process, you upload your data to Blob storage and generate a SAS URL to use as the training data source.
-
-For training data size, the maximum number of timestamps is `1000000`, and the recommended minimum is `15000` timestamps.
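As a client-side sanity check before uploading, you might validate your timestamp count against these limits. A minimal sketch (the limits come from the paragraph above; the classification strings are illustrative):

```python
MAX_TIMESTAMPS = 1_000_000
RECOMMENDED_MIN_TIMESTAMPS = 15_000

def check_training_size(num_timestamps: int) -> str:
    """Classify a training set's timestamp count against MVAD's documented limits."""
    if num_timestamps > MAX_TIMESTAMPS:
        return "too large: exceeds the 1,000,000-timestamp maximum"
    if num_timestamps < RECOMMENDED_MIN_TIMESTAMPS:
        return "accepted, but below the recommended 15,000-timestamp minimum"
    return "ok"

print(check_training_size(20_000))  # ok
```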
-
-Here is a sample request body and the sample code in Python to train an MVAD model.
-
-```json
-// Sample Request Body
-{
- "slidingWindow": 200,
- "alignPolicy": {
- "alignMode": "Outer",
- "fillNAMethod": "Linear",
- "paddingValue": 0
- },
- // This could be your own ZIP file of training data stored on Azure Blob and a SAS url could be used here
- "source": "https://aka.ms/AnomalyDetector/MVADSampleData",
- "startTime": "2021-01-01T00:00:00Z",
- "endTime": "2021-01-02T12:00:00Z",
- "displayName": "Contoso model"
-}
-```
-
-```python
-# Sample Code in Python
-########### Python 3.x #############
-import http.client, urllib.request, urllib.parse, urllib.error, base64
-
-headers = {
- # Request headers
- 'Content-Type': 'application/json',
- 'Ocp-Apim-Subscription-Key': '{API key}',
-}
-
-params = urllib.parse.urlencode({})
-
-try:
- conn = http.client.HTTPSConnection('{endpoint}')
- conn.request("POST", "/anomalydetector/v1.1-preview/multivariate/models?%s" % params, "{request body}", headers)
- response = conn.getresponse()
- data = response.read()
- print(data)
- conn.close()
-except OSError as e:
-    print("[Errno {0}] {1}".format(e.errno, e.strerror))
-
-####################################
-```
-
-Response code `201` indicates a successful request.
--
-## Get model status
-As the training API is asynchronous, you won't get the model immediately after calling the training API. However, you can query the status of models either by API key, which will list all the models, or by model ID, which will list information about the specific model.
--
-### List all the models
-
-You may refer to [this page](https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector-v1-1-preview/operations/ListMultivariateModel) for information about the request URL and request headers. Notice that we only return 10 models ordered by update time, but you can access other models by setting the `$skip` and the `$top` parameters in the request URL. For example, if your request URL is `https://{endpoint}/anomalydetector/v1.1-preview/multivariate/models?$skip=10&$top=20`, then we will skip the latest 10 models and return the next 20 models.
-
-A sample response is
-
-```json
-{
- "models": [
- {
- "createdTime":"2020-12-01T09:43:45Z",
- "displayName":"DevOps-Test",
- "lastUpdatedTime":"2020-12-01T09:46:13Z",
- "modelId":"b4c1616c-33b9-11eb-824e-0242ac110002",
- "status":"READY",
- "variablesCount":18
- },
- {
- "createdTime":"2020-12-01T09:43:30Z",
- "displayName":"DevOps-Test",
- "lastUpdatedTime":"2020-12-01T09:45:10Z",
- "modelId":"ab9d3e30-33b9-11eb-a3f4-0242ac110002",
- "status":"READY",
- "variablesCount":18
- }
- ],
- "currentCount": 1,
- "maxCount": 50,
- "nextLink": "<link to more models>"
-}
-```
-
-The response contains four fields, `models`, `currentCount`, `maxCount`, and `nextLink`.
-
-* `models` contains the created time, last updated time, model ID, display name, variable counts, and the status of each model.
-* `currentCount` contains the number of trained multivariate models.
-* `maxCount` is the maximum number of models supported by this Anomaly Detector resource.
-* `nextLink` could be used to fetch more models.
-
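Together, `$skip`, `$top`, and `nextLink` let you page through every model in the resource. A minimal paging sketch, assuming a `fetch_page` callable that returns the parsed JSON for a given URL (simulated here with canned pages rather than real HTTP calls):

```python
def list_all_models(first_url, fetch_page):
    """Follow nextLink until every model has been collected."""
    models, url = [], first_url
    while url:
        page = fetch_page(url)
        models.extend(page["models"])
        url = page.get("nextLink")  # absent or None on the last page
    return models

# Simulated two-page listing for illustration.
pages = {
    "page1": {"models": [{"modelId": "a"}], "nextLink": "page2"},
    "page2": {"models": [{"modelId": "b"}], "nextLink": None},
}
all_models = list_all_models("page1", pages.get)
print([m["modelId"] for m in all_models])  # ['a', 'b']
```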
-### Get models by model ID
-
-[To learn about the request URL query model by model ID.](https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector-v1-1-preview/operations/GetMultivariateModel) A sample response looks like this:
-
-```json
-{
- "modelId": "45aad126-aafd-11ea-b8fb-d89ef3400c5f",
- "createdTime": "2020-06-30T00:00:00Z",
- "lastUpdatedTime": "2020-06-30T00:00:00Z",
- "modelInfo": {
- "slidingWindow": 300,
- "alignPolicy": {
- "alignMode": "Outer",
- "fillNAMethod": "Linear",
- "paddingValue": 0
- },
- "source": "<TRAINING_ZIP_FILE_LOCATED_IN_AZURE_BLOB_STORAGE_WITH_SAS>",
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-02T00:00:00Z",
- "displayName": "Devops-MultiAD",
- "status": "READY",
- "errors": [],
- "diagnosticsInfo": {
- "modelState": {
- "epochIds": [10, 20, 30, 40, 50, 60, 70, 80, 90, 100],
- "trainLosses": [0.6291328072547913, 0.1671326905488968, 0.12354248017072678, 0.1025966405868533,
- 0.0958492755889896, 0.09069952368736267,0.08686016499996185, 0.0860302299260931,
- 0.0828735455870684, 0.08235538005828857],
- "validationLosses": [1.9232804775238037, 1.0645641088485718, 0.6031560301780701, 0.5302737951278687,
- 0.4698025286197664, 0.4395163357257843, 0.4182931482799006, 0.4057914316654053,
- 0.4056498706340729, 0.3849248886108984],
- "latenciesInSeconds": [0.3398594856262207, 0.3659665584564209, 0.37360644340515137,
- 0.3513407707214355, 0.3370304107666056, 0.31876277923583984,
- 0.3283309936523475, 0.3503587245941162, 0.30800247192382812,
- 0.3327946662902832]
- },
- "variableStates": [
- {
- "variable": "ad_input",
- "filledNARatio": 0,
- "effectiveCount": 1441,
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-02T00:00:00Z",
- "errors": []
- },
- {
- "variable": "ad_ontimer_output",
- "filledNARatio": 0,
- "effectiveCount": 1441,
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-02T00:00:00Z",
- "errors": []
- },
- // More variables
- ]
- }
- }
- }
-```
-
-You will receive more detailed information about the queried model. The response contains meta information about the model, its training parameters, and diagnostic information. Diagnostic Information is useful for debugging and tracing training progress.
-
-* `epochIds` indicates how many epochs the model has been trained out of a total of 100 epochs. For example, if the model is still in training status, `epochId` might be `[10, 20, 30, 40, 50]` , which means that it has completed its 50th training epoch, and therefore is halfway complete.
-* `trainLosses` and `validationLosses` are used to check whether the optimization progress converges in which case the two losses should decrease gradually.
-* `latenciesInSeconds` contains the time cost for each epoch and is recorded every 10 epochs. In this example, the 10th epoch takes approximately 0.34 seconds. This is helpful for estimating the completion time of training.
-* `variableStates` summarizes information about each variable. It is a list ranked by `filledNARatio` in descending order. It tells how many data points are used for each variable, and `filledNARatio` tells how many points are missing. Usually we need to reduce `filledNARatio` as much as possible; too many missing data points will deteriorate model accuracy.
-* Errors during data processing will be included in the `errors` field.
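Because `latenciesInSeconds` samples the per-epoch cost every 10 epochs, you can roughly project the remaining training time from `epochIds`. A small sketch using values like the sample above (the 100-epoch total comes from the `epochIds` description):

```python
def estimate_remaining_seconds(epoch_ids, latencies, total_epochs=100):
    """Project remaining training time from sampled per-epoch latencies."""
    epochs_done = epoch_ids[-1]
    avg_epoch_seconds = sum(latencies) / len(latencies)
    return (total_epochs - epochs_done) * avg_epoch_seconds

# Mid-training snapshot: 50 of 100 epochs done, latency sampled every 10 epochs.
epoch_ids = [10, 20, 30, 40, 50]
latencies = [0.34, 0.37, 0.35, 0.34, 0.33]
print(f"~{estimate_remaining_seconds(epoch_ids, latencies):.0f}s left")  # ~17s left
```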
--
-## Inference with asynchronous API
-
-You could choose the asynchronous API, or the synchronous API for inference.
-
-| Asynchronous API | Synchronous API |
-| - | - |
-| More suitable for batch use cases when customers don't need to get inference results immediately and want to detect anomalies and get results over a longer time period.| When customers want to get inference immediately and want to detect multivariate anomalies in real time, this API is recommended. Also suitable for customers having difficulties conducting the previous compressing and uploading process for inference. |
-
-To perform asynchronous inference, provide the blob source path to the zip file containing the inference data, the start time, and end time. The inference data volume must be at least one sliding window in length and at most `20000` timestamps.
-
-This inference is asynchronous, so the results aren't returned immediately. Notice that you need to save the link to the results from the **response header**, which contains the `resultId`, so that you know where to get the results afterwards.
-
-Failures are usually caused by model issues or data issues. You cannot perform inference if the model is not ready or the data link is invalid. Make sure that the training data and inference data are consistent, meaning they should be **exactly** the same variables but with different timestamps. More variables, fewer variables, or inference with a different set of variables will not pass the data verification phase, and errors will occur. Data verification is deferred, so you will get an error message only when you query the results.
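Because that verification is deferred, it can save a round trip to compare the variable sets on the client before submitting. A minimal sketch (the variable names are illustrative):

```python
def check_variable_consistency(training_vars, inference_vars):
    """Return the mismatches that would fail MVAD's data verification."""
    missing = sorted(set(training_vars) - set(inference_vars))
    extra = sorted(set(inference_vars) - set(training_vars))
    return {"missing": missing, "extra": extra, "ok": not missing and not extra}

result = check_variable_consistency(
    ["cpu", "data_in_speed", "data_out_speed"],  # variables used in training
    ["cpu", "data_in_speed"],                    # variables in the inference zip
)
print(result)  # {'missing': ['data_out_speed'], 'extra': [], 'ok': False}
```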
-
-### Get inference results (asynchronous only)
-
-You need the `resultId` to get results. `resultId` is obtained from the response header when you submit the inference request. Consult [this page for instructions to query the inference results](https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector-v1-1-preview/operations/GetDetectionResult).
-
-A sample response looks like this:
-
-```json
- {
- "resultId": "663884e6-b117-11ea-b3de-0242ac130004",
- "summary": {
- "status": "READY",
- "errors": [],
- "variableStates": [
- {
- "variable": "ad_input",
- "filledNARatio": 0,
- "effectiveCount": 26,
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-01T00:25:00Z",
- "errors": []
- },
- {
- "variable": "ad_ontimer_output",
- "filledNARatio": 0,
- "effectiveCount": 26,
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-01T00:25:00Z",
- "errors": []
- },
- // more variables
- ],
- "setupInfo": {
- "source": "https://aka.ms/AnomalyDetector/MVADSampleData",
- "startTime": "2019-04-01T00:15:00Z",
- "endTime": "2019-04-01T00:40:00Z"
- }
- },
- "results": [
- {
- "timestamp": "2019-04-01T00:15:00Z",
- "errors": [
- {
- "code": "InsufficientHistoricalData",
- "message": "historical data is not enough."
- }
- ]
- },
- // more results
- {
- "timestamp": "2019-04-01T00:20:00Z",
- "value": {
- "contributors": [],
- "isAnomaly": false,
- "severity": 0,
- "score": 0.17805261260751692
- }
- },
- // more results
- {
- "timestamp": "2019-04-01T00:27:00Z",
- "value": {
- "contributors": [
- {
- "contributionScore": 0.0007775013367514271,
- "variable": "ad_ontimer_output"
- },
- {
- "contributionScore": 0.0007989604079048129,
- "variable": "ad_series_init"
- },
- {
- "contributionScore": 0.0008900927229851369,
- "variable": "ingestion"
- },
- {
- "contributionScore": 0.008068144477478554,
- "variable": "cpu"
- },
- {
- "contributionScore": 0.008222036467507165,
- "variable": "data_in_speed"
- },
- {
- "contributionScore": 0.008674941549594993,
- "variable": "ad_input"
- },
- {
- "contributionScore": 0.02232242629793674,
- "variable": "ad_output"
- },
- {
- "contributionScore": 0.1583773213660846,
- "variable": "flink_last_ckpt_duration"
- },
- {
- "contributionScore": 0.9816531517495176,
- "variable": "data_out_speed"
- }
- ],
- "isAnomaly": true,
- "severity": 0.42135109874230336,
- "score": 1.213510987423033
- }
- },
- // more results
- ]
- }
-```
-
-The response contains the result status, variable information, inference parameters, and inference results.
-
-* `variableStates` lists the information of each variable in the inference request.
-* `setupInfo` is the request body submitted for this inference.
-* `results` contains the detection results. There are three typical types of detection results.
-
-* Error code `InsufficientHistoricalData`. This usually happens only with the first few timestamps because the model performs inference in a window-based manner and needs historical data to make a decision. For the first few timestamps, there is insufficient historical data, so inference cannot be performed on them. In this case, the error message can be ignored.
-
-* `"isAnomaly": false` indicates the current timestamp is not an anomaly.
- * `severity` indicates the relative severity of the anomaly and for normal data it is always 0.
- * `score` is the raw output of the model on which the model makes a decision, which could be non-zero even for normal data points.
-* `"isAnomaly": true` indicates an anomaly at the current timestamp.
- * `severity` indicates the relative severity of the anomaly and for abnormal data it is always greater than 0.
- * `score` is the raw output of the model on which the model makes a decision. `severity` is a derived value from `score`. Every data point has a `score`.
-* `contributors` is a list containing the contribution score of each variable. Higher contribution scores indicate higher possibility of the root cause. This list is often used for interpreting anomalies and diagnosing the root causes.
-
-> [!NOTE]
-> A common pitfall is taking all data points with `isAnomaly`=`true` as anomalies. That may end up with too many false positives.
-> You should use both `isAnomaly` and `severity` (or `score`) to sift out anomalies that are not severe and (optionally) use grouping to check the duration of the anomalies to suppress random noise.
-> Please refer to the [FAQ](../concepts/best-practices-multivariate.md#faq) in the best practices document for the difference between `severity` and `score`.
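Following that advice, here is a minimal post-processing sketch that keeps only anomalies above a severity threshold; the threshold value is an assumption you should tune for your own data:

```python
def severe_anomalies(results, min_severity=0.3):
    """Filter detection results to anomalous points above a severity threshold."""
    return [
        r for r in results
        if "value" in r
        and r["value"].get("isAnomaly")
        and r["value"].get("severity", 0) >= min_severity
    ]

results = [
    {"timestamp": "t1", "value": {"isAnomaly": False, "severity": 0}},
    {"timestamp": "t2", "value": {"isAnomaly": True, "severity": 0.05}},  # likely noise
    {"timestamp": "t3", "value": {"isAnomaly": True, "severity": 0.42}},
]
print([r["timestamp"] for r in severe_anomalies(results)])  # ['t3']
```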
--
-## (NEW) Inference with synchronous API
-
-> [!NOTE]
-> In v1.1-preview.1, we support the synchronous API and add more fields to the inference results for both the asynchronous and synchronous APIs. You can upgrade the API version to access these features. Once you upgrade, models trained with the previous version can no longer be used; you should retrain a model to fit the new fields. [Learn more about v1.1-preview.1](https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector-v1-1-preview-1/operations/DetectAnomaly).
-
-With the synchronous API, you can get inference results point by point in real time, with no need for the compressing and uploading tasks required by training and asynchronous inference. Here are some requirements for the synchronous API:
-* You need to put the data in **JSON format** in the API request body.
-* The inference results are limited to up to 10 data points, which means you could detect **1 to 10 timestamps** with one synchronous API call.
-* Due to payload limitations, the size of inference data in the request body is limited, supporting at most `2880` timestamps * `300` variables, and requiring at least one sliding window of data.
-
-### Request schema
-
-You submit timestamps and values of multiple variables in JSON format in the request body, with an API call like this:
-
-`https://{endpoint}/anomalydetector/v1.1-preview.1/multivariate/models/{modelId}/last/detect`
-
-A sample request looks like the following. This case detects the last two timestamps (`detectingPoints` is 2) of three variables in one synchronous API call.
-
-```json
-{
- "variables": [
- {
- "variableName": "Variable_1",
- "timestamps": [
- "2021-01-01T00:00:00Z",
- "2021-01-01T00:01:00Z",
- "2021-01-01T00:02:00Z"
- //more timestamps
- ],
- "values": [
- 0.4551378545933972,
- 0.7388603950488748,
- 0.201088255984052
- //more variables
- ]
- },
- {
- "variableName": "Variable_2",
- "timestamps": [
- "2021-01-01T00:00:00Z",
- "2021-01-01T00:01:00Z",
- "2021-01-01T00:02:00Z"
- //more timestamps
- ],
- "values": [
- 0.9617871613964145,
- 0.24903311574778408,
- 0.4920561254118613
- //more variables
- ]
- },
- {
- "variableName": "Variable_3",
- "timestamps": [
- "2021-01-01T00:00:00Z",
- "2021-01-01T00:01:00Z",
- "2021-01-01T00:02:00Z"
- //more timestamps
- ],
- "values": [
- 0.4030756879437628,
- 0.15526889968448554,
- 0.36352226408981103
- //more variables
- ]
- }
- ],
- "detectingPoints": 2
-}
-```
-
-### Response schema
-
-You will get the JSON response of inference results in real time after you call a synchronous API, which contains the following new fields:
-
-| Field | Description |
-| - | - |
-| `interpretation`| Appears only when a timestamp is detected as anomalous. Contains `variables`, `contributionScore`, and `correlationChanges`. |
-| `correlationChanges`| Included in `interpretation` when a timestamp is detected as anomalous. Contains `changedVariables` and `changedValues`, which interpret which correlations between variables changed. |
-| `changedVariables`| The variables that have a significant change in correlation with `variable`. |
-| `changedValues`| A number between 0 and 1 indicating how much the correlation between variables changed. The bigger the number, the greater the change in correlations. |
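When a point is anomalous, you will often want to surface the most likely root-cause variable and its correlation changes. A sketch of parsing the `interpretation` list from a response shaped like the sample in this section (the helper name is illustrative):

```python
def top_contributor(interpretation):
    """Pick the variable with the highest contributionScore from an anomalous point."""
    best = max(interpretation, key=lambda item: item["contributionScore"])
    changes = dict(
        zip(best["correlationChanges"]["changedVariables"],
            best["correlationChanges"]["changedValues"])
    )
    return best["variable"], changes

interpretation = [
    {"variable": "variable_2", "contributionScore": 0.5371576215,
     "correlationChanges": {"changedVariables": ["variable_1", "variable_3"],
                            "changedValues": [0.1741322, 0.1093203]}},
    {"variable": "variable_3", "contributionScore": 0.3324159383,
     "correlationChanges": {"changedVariables": ["variable_2"],
                            "changedValues": [0.1229392]}},
]
name, changes = top_contributor(interpretation)
print(name, changes)  # variable_2 {'variable_1': 0.1741322, 'variable_3': 0.1093203}
```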
--
-See the following example of a JSON response:
-
-```json
-{
- "variableStates": [
- {
- "variable": "variable_1",
- "filledNARatio": 0,
- "effectiveCount": 30,
- "startTime": "2021-01-01T00:00:00Z",
- "endTime": "2021-01-01T00:29:00Z"
- },
- {
- "variable": "variable_2",
- "filledNARatio": 0,
- "effectiveCount": 30,
- "startTime": "2021-01-01T00:00:00Z",
- "endTime": "2021-01-01T00:29:00Z"
- },
- {
- "variable": "variable_3",
- "filledNARatio": 0,
- "effectiveCount": 30,
- "startTime": "2021-01-01T00:00:00Z",
- "endTime": "2021-01-01T00:29:00Z"
- }
- ],
- "results": [
- {
- "timestamp": "2021-01-01T00:28:00Z",
- "value": {
- "isAnomaly": false,
- "severity": 0,
- "score": 0.6928471326828003
- },
- "errors": []
- },
- {
- "timestamp": "2021-01-01T00:29:00Z",
- "value": {
- "isAnomaly": true,
- "severity": 0.5337404608726501,
- "score": 0.9171165823936462,
- "interpretation": [
- {
- "variable": "variable_2",
- "contributionScore": 0.5371576215,
- "correlationChanges": {
- "changedVariables": [
- "variable_1",
- "variable_3"
- ],
- "changedValues": [
- 0.1741322,
- 0.1093203
- ]
- }
- },
- {
- "variable": "variable_3",
- "contributionScore": 0.3324159383,
- "correlationChanges": {
- "changedVariables": [
- "variable_2"
- ],
- "changedValues": [
- 0.1229392
- ]
- }
- },
- {
- "variable": "variable_1",
- "contributionScore": 0.1304264402,
- "correlationChanges": {
- "changedVariables": [],
- "changedValues": []
- }
- }
- ]
- },
- "errors": []
- }
- ]
-}
-```
-
-## Next steps
-
-* [Best practices for using the Multivariate Anomaly Detector API](../concepts/best-practices-multivariate.md)
-* [Join us to get more support!](https://aka.ms/adadvisorsjoin)
cognitive-services Multivariate Architecture https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Anomaly-Detector/concepts/multivariate-architecture.md
Previously updated : 04/01/2021 Last updated : 10/27/2022 keywords: anomaly detection, machine learning, algorithms
-# Predictive maintenance solution with Anomaly Detector (multivariate)
+# Predictive maintenance solution with Multivariate Anomaly Detector
Many different industries need predictive maintenance solutions to reduce risks and gain actionable insights through processing data from their equipment. Predictive maintenance evaluates the condition of equipment by performing online monitoring. The goal is to perform maintenance before the equipment degrades or breaks down.
-Monitoring the health status of equipment can be challenging, as each component inside the equipment can generate dozens of signals, for example vibration, orientation, and rotation. This can be even more complex when those signals have an implicit relationship, and need to be monitored and analyzed together. Defining different rules for those signals and correlating them with each other manually can be costly. Anomaly Detector's multivariate feature allows:
+Monitoring the health status of equipment can be challenging, as each component inside the equipment can generate dozens of signals, for example vibration, orientation, and rotation. This can be even more complex when those signals have an implicit relationship and need to be monitored and analyzed together. Defining different rules for those signals and correlating them with each other manually can be costly. Anomaly Detector's multivariate feature allows:
* Multiple correlated signals to be monitored together, and the inter-correlations between them are accounted for in the model. * In each captured anomaly, the contribution rank of different signals can help with anomaly explanation, and incident root cause analysis. * The multivariate anomaly detection model is built in an unsupervised manner. Models can be trained specifically for different types of equipment.
-Here, we provide a reference architecture for a predictive maintenance solution based on Anomaly Detector multivariate.
+Here, we provide a reference architecture for a predictive maintenance solution based on Multivariate Anomaly Detector.
## Reference architecture
cognitive-services Learn Multivariate Anomaly Detection https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Anomaly-Detector/tutorials/learn-multivariate-anomaly-detection.md
- Title: "Tutorial: Learn Multivariate Anomaly Detection in one hour"-
-description: An end-to-end tutorial of multivariate anomaly detection.
------ Previously updated : 06/27/2021---
-# Tutorial: Learn Multivariate Anomaly Detection in one hour
-
-Anomaly Detector with Multivariate Anomaly Detection (MVAD) is an advanced AI tool for detecting anomalies from a group of metrics in an **unsupervised** manner.
-
-In general, you could take these steps to use MVAD:
-
- 1. Create an Anomaly Detector resource that supports MVAD on Azure.
- 1. Prepare your data.
- 1. Train an MVAD model.
- 1. Query the status of your model.
- 1. Detect anomalies with the trained MVAD model.
- 1. Retrieve and interpret the inference results.
-
-In this tutorial, you'll:
-
-> [!div class="checklist"]
-> * Understand how to prepare your data in a correct format.
-> * Understand how to train and inference with MVAD.
-> * Understand the input parameters and how to interpret the output in inference results.
-
-## 1. Create an Anomaly Detector resource that supports MVAD
-
-* Create an Azure subscription if you don't have one - [Create one for free](https://azure.microsoft.com/free/cognitive-services)
-* Once you have your Azure subscription, [create an Anomaly Detector resource](https://portal.azure.com/#create/Microsoft.CognitiveServicesAnomalyDetector) in the Azure portal to get your API key and API endpoint.
-
-> [!NOTE]
-> During preview stage, MVAD is available in limited regions only. Please bookmark [What's new in Anomaly Detector](../whats-new.md) to keep up to date with MVAD region roll-outs. You could also file a GitHub issue or contact us at [AnomalyDetector@microsoft.com](mailto:AnomalyDetector@microsoft.com) to request specific regions.
-
-## 2. Data preparation
-
-Then you need to prepare your training data (and inference data).
--
-### Tools for zipping and uploading data
-
-In this section, we share some sample code and tools that you can copy and edit to fit into your own application logic for handling MVAD input data.
-
-#### Compressing CSV files in \*nix
-
-```bash
-zip -j series.zip series/*.csv
-```
-
-#### Compressing CSV files in Windows
-
-* Navigate *into* the folder with all the CSV files.
-* Select all the CSV files you need.
-* Right click on one of the CSV files and select `Send to`.
-* Select `Compressed (zipped) folder` from the drop-down.
-* Rename the zip file as needed.
-
-#### Python code zipping & uploading data to Azure Blob Storage
-
-You could refer to [this doc](../../../storage/blobs/storage-quickstart-blobs-portal.md#upload-a-block-blob) to learn how to upload a file to Azure Blob.
-
-Or, you could refer to the sample code below that can do the zipping and uploading for you. You could copy and save the Python code in this section as a .py file (for example, `zipAndUpload.py`) and run it using command lines like these:
-
-* `python zipAndUpload.py -s "foo\bar" -z test123.zip -c {azure blob connection string} -n container_xxx`
-
- This command will compress all the CSV files in `foo\bar` into a single zip file named `test123.zip`. It will upload `test123.zip` to the container `container_xxx` in your blob.
-* `python zipAndUpload.py -s "foo\bar" -z test123.zip -c {azure blob connection string} -n container_xxx -r`
-
- This command will do the same thing as the above, but it will delete the zip file `test123.zip` after uploading successfully.
-
-Arguments:
-
-* `--source-folder`, `-s`, path to the source folder containing CSV files
-* `--zipfile-name`, `-z`, name of the zip file
-* `--connection-string`, `-c`, connection string to your blob
-* `--container-name`, `-n`, name of the container
-* `--remove-zipfile`, `-r`, if on, remove the zip file
-
-```python
-import os
-import argparse
-import shutil
-import sys
-
-from azure.storage.blob import BlobClient
-import zipfile
--
-class ZipError(Exception):
- pass
--
-class UploadError(Exception):
- pass
--
-def zip_file(root, name):
- try:
- z = zipfile.ZipFile(name, "w", zipfile.ZIP_DEFLATED)
- for f in os.listdir(root):
- if f.endswith("csv"):
- z.write(os.path.join(root, f), f)
- z.close()
- print("Compress files success!")
- except Exception as ex:
- raise ZipError(repr(ex))
--
-def upload_to_blob(file, conn_str, cont_name, blob_name):
- try:
- blob_client = BlobClient.from_connection_string(conn_str, container_name=cont_name, blob_name=blob_name)
- with open(file, "rb") as f:
- blob_client.upload_blob(f, overwrite=True)
- print("Upload Success!")
- except Exception as ex:
- raise UploadError(repr(ex))
--
-if __name__ == "__main__":
- parser = argparse.ArgumentParser()
- parser.add_argument("--source-folder", "-s", type=str, required=True, help="path to source folder")
- parser.add_argument("--zipfile-name", "-z", type=str, required=True, help="name of the zip file")
- parser.add_argument("--connection-string", "-c", type=str, help="connection string")
- parser.add_argument("--container-name", "-n", type=str, help="container name")
- parser.add_argument("--remove-zipfile", "-r", action="store_true", help="whether delete the zip file after uploading")
- args = parser.parse_args()
-
- try:
- zip_file(args.source_folder, args.zipfile_name)
- upload_to_blob(args.zipfile_name, args.connection_string, args.container_name, args.zipfile_name)
- except ZipError as ex:
- print(f"Failed to compress files. {repr(ex)}")
- sys.exit(-1)
- except UploadError as ex:
- print(f"Failed to upload files. {repr(ex)}")
- sys.exit(-1)
- except Exception as ex:
- print(f"Exception encountered. {repr(ex)}")
-
- try:
- if args.remove_zipfile:
- os.remove(args.zipfile_name)
- except Exception as ex:
- print(f"Failed to delete the zip file. {repr(ex)}")
-```
-
-## 3. Train an MVAD Model
-
-Here is a sample request body and the sample code in Python to train an MVAD model.
-
-```json
-// Sample Request Body
-{
- "slidingWindow": 200,
- "alignPolicy": {
- "alignMode": "Outer",
- "fillNAMethod": "Linear",
- "paddingValue": 0
- },
- // This could be your own ZIP file of training data stored on Azure Blob and a SAS url could be used here
- "source": "https://aka.ms/AnomalyDetector/MVADSampleData",
- "startTime": "2021-01-01T00:00:00Z",
- "endTime": "2021-01-02T12:00:00Z",
- "displayName": "Contoso model"
-}
-```
-
-```python
-# Sample Code in Python
-########### Python 3.x #############
-import http.client, urllib.request, urllib.parse, urllib.error, base64
-
-headers = {
- # Request headers
- 'Content-Type': 'application/json',
- 'Ocp-Apim-Subscription-Key': '{API key}',
-}
-
-params = urllib.parse.urlencode({})
-
-try:
- conn = http.client.HTTPSConnection('{endpoint}')
- conn.request("POST", "/anomalydetector/v1.1-preview/multivariate/models?%s" % params, "{request body}", headers)
- response = conn.getresponse()
- data = response.read()
- print(data)
- conn.close()
-except Exception as e:
- print(repr(e))  # a generic Exception has no errno/strerror attributes
-
-####################################
-```
-
-Response code `201` indicates a successful request.
-
-> [!IMPORTANT]
-> Remember to remove the key from your code when you're done, and never post it publicly. For production, use a secure way of storing and accessing your credentials like [Azure Key Vault](../../../key-vault/general/overview.md). See the Cognitive Services [security](../../cognitive-services-security.md) article for more information.
--
-## 4. Get model status
-
-Because the training API is asynchronous, you won't get the model immediately after calling it. However, you can query model status either by listing all the models under your resource, or by model ID to get information about a specific model.
-
-### List all the models
-
-You may refer to [this page](https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector-v1-1-preview/operations/ListMultivariateModel) for information about the request URL and request headers. Notice that only 10 models, ordered by update time, are returned per request, but you can retrieve other models by setting the `$skip` and `$top` parameters in the request URL. For example, if your request URL is `https://{endpoint}/anomalydetector/v1.1-preview/multivariate/models?$skip=10&$top=20`, the latest 10 models are skipped and the next 20 are returned.
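Building the paged request path can be sketched in Python; the path segment mirrors the URL above, and `list_models_path` is a hypothetical helper, not part of any SDK:

```python
import urllib.parse

def list_models_path(skip=0, top=10):
    """Build the 'list models' request path, skipping the latest `skip`
    models and returning at most `top` of the ones after them."""
    params = urllib.parse.urlencode({"$skip": skip, "$top": top}, safe="$")
    return f"/anomalydetector/v1.1-preview/multivariate/models?{params}"

print(list_models_path(skip=10, top=20))
# -> /anomalydetector/v1.1-preview/multivariate/models?$skip=10&$top=20
```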
-
-A sample response is:
-
-```json
-{
- "models": [
- {
- "createdTime":"2020-12-01T09:43:45Z",
- "displayName":"DevOps-Test",
- "lastUpdatedTime":"2020-12-01T09:46:13Z",
- "modelId":"b4c1616c-33b9-11eb-824e-0242ac110002",
- "status":"READY",
- "variablesCount":18
- },
- {
- "createdTime":"2020-12-01T09:43:30Z",
- "displayName":"DevOps-Test",
- "lastUpdatedTime":"2020-12-01T09:45:10Z",
- "modelId":"ab9d3e30-33b9-11eb-a3f4-0242ac110002",
- "status":"READY",
- "variablesCount":18
- }
- ],
- "currentCount": 1,
- "maxCount": 50,
- "nextLink": "<link to more models>"
-}
-```
-
-The response contains four fields: `models`, `currentCount`, `maxCount`, and `nextLink`.
-
-* `models` contains the created time, last updated time, model ID, display name, variable count, and status of each model.
-* `currentCount` contains the number of trained multivariate models.
-* `maxCount` is the maximum number of models supported by this Anomaly Detector resource.
-* `nextLink` can be used to fetch more models.
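As a sketch, these fields can be consumed like this in Python; the trimmed response and the `ready_model_ids` helper are illustrative only:

```python
import json

# A trimmed version of the sample "list models" response above.
sample = json.loads("""
{
  "models": [
    {"modelId": "b4c1616c-33b9-11eb-824e-0242ac110002", "status": "READY"},
    {"modelId": "ab9d3e30-33b9-11eb-a3f4-0242ac110002", "status": "CREATED"}
  ],
  "currentCount": 2,
  "maxCount": 50,
  "nextLink": "<link to more models>"
}
""")

def ready_model_ids(response):
    """Return the IDs of models whose training has finished."""
    return [m["modelId"] for m in response["models"] if m["status"] == "READY"]

print(ready_model_ids(sample))  # ['b4c1616c-33b9-11eb-824e-0242ac110002']
print(sample["nextLink"])       # request this URL to fetch the next page
```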
-
-### Get models by model ID
-
-[This page](https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector-v1-1-preview/operations/GetMultivariateModel) describes the request URL to query model information by model ID. A sample response looks like this:
-
-```json
-{
- "modelId": "45aad126-aafd-11ea-b8fb-d89ef3400c5f",
- "createdTime": "2020-06-30T00:00:00Z",
- "lastUpdatedTime": "2020-06-30T00:00:00Z",
- "modelInfo": {
- "slidingWindow": 300,
- "alignPolicy": {
- "alignMode": "Outer",
- "fillNAMethod": "Linear",
- "paddingValue": 0
- },
- "source": "<TRAINING_ZIP_FILE_LOCATED_IN_AZURE_BLOB_STORAGE_WITH_SAS>",
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-02T00:00:00Z",
- "displayName": "Devops-MultiAD",
- "status": "READY",
- "errors": [],
- "diagnosticsInfo": {
- "modelState": {
- "epochIds": [10, 20, 30, 40, 50, 60, 70, 80, 90, 100],
- "trainLosses": [0.6291328072547913, 0.1671326905488968, 0.12354248017072678, 0.1025966405868533,
- 0.0958492755889896, 0.09069952368736267,0.08686016499996185, 0.0860302299260931,
- 0.0828735455870684, 0.08235538005828857],
- "validationLosses": [1.9232804775238037, 1.0645641088485718, 0.6031560301780701, 0.5302737951278687,
- 0.4698025286197664, 0.4395163357257843, 0.4182931482799006, 0.4057914316654053,
- 0.4056498706340729, 0.3849248886108984],
- "latenciesInSeconds": [0.3398594856262207, 0.3659665584564209, 0.37360644340515137,
- 0.3513407707214355, 0.3370304107666056, 0.31876277923583984,
- 0.3283309936523475, 0.3503587245941162, 0.30800247192382812,
- 0.3327946662902832]
- },
- "variableStates": [
- {
- "variable": "ad_input",
- "filledNARatio": 0,
- "effectiveCount": 1441,
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-02T00:00:00Z",
- "errors": []
- },
- {
- "variable": "ad_ontimer_output",
- "filledNARatio": 0,
- "effectiveCount": 1441,
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-02T00:00:00Z",
- "errors": []
- },
- // More variables
- ]
- }
- }
- }
-```
-
-You'll receive more detailed information about the queried model. The response contains meta information about the model, its training parameters, and diagnostic information. Diagnostic information is useful for debugging and for tracing training progress.
-
-* `epochIds` indicates how many of the total 100 training epochs the model has completed. For example, if the model is still in the training status, `epochIds` might be `[10, 20, 30, 40, 50]`, which means that it has completed its 50th epoch and is halfway through training.
-* `trainLosses` and `validationLosses` are used to check whether the optimization is converging; if it is, both losses should decrease gradually.
-* `latenciesInSeconds` contains the time cost of an epoch and is recorded every 10 epochs. In this example, the 10th epoch takes approximately 0.34 seconds. This is helpful for estimating when training will complete.
-* `variableStates` summarizes information about each variable. It's a list ranked by `filledNARatio` in descending order that tells how many data points are used for each variable; `filledNARatio` tells how many points are missing. Usually, `filledNARatio` should be kept as low as possible, because too many missing data points deteriorate model accuracy.
-* Errors during data processing will be included in the `errors` field.
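Because latencies are sampled per epoch, a rough completion-time estimate can be derived from `modelState`. A sketch, assuming 100 total epochs (`estimate_remaining_seconds` is a hypothetical helper, and the numbers are rounded from the sample above):

```python
# Diagnostic values rounded from the sample response above; the model is
# halfway through its 100 training epochs.
model_state = {
    "epochIds": [10, 20, 30, 40, 50],
    "latenciesInSeconds": [0.34, 0.37, 0.37, 0.35, 0.34],  # seconds per epoch
}

def estimate_remaining_seconds(state, total_epochs=100):
    """Estimate remaining training time from the last completed epoch and
    the mean per-epoch latency."""
    done = state["epochIds"][-1]
    latencies = state["latenciesInSeconds"]
    mean_latency = sum(latencies) / len(latencies)
    return (total_epochs - done) * mean_latency

print(f"{estimate_remaining_seconds(model_state):.1f} s remaining")  # 17.7 s remaining
```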
-
-## 5. Inference with MVAD
-
-To perform inference, provide the blob source of the ZIP file that contains the inference data, the start time, and the end time.
-
-Inference is also asynchronous, so the results aren't returned immediately. Note that you need to save the results link from the **response header**, which contains the `resultId`, so that you know where to get the results afterward.
-
-Failures are usually caused by model issues or data issues. You can't perform inference if the model isn't ready or if the data link is invalid. Make sure that the training data and inference data are consistent: they should contain **exactly** the same variables, just with different timestamps. More variables, fewer variables, or inference with a different set of variables won't pass the data verification phase, and errors will occur. Data verification is deferred, so you'll get the error message only when you query the results.
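A minimal client-side consistency check along these lines (a sketch; the variable names are hypothetical):

```python
def check_variables(training_vars, inference_vars):
    """Raise if the inference data doesn't carry exactly the same set of
    variables as the training data."""
    missing = set(training_vars) - set(inference_vars)
    extra = set(inference_vars) - set(training_vars)
    if missing or extra:
        raise ValueError(
            f"variable mismatch: missing={sorted(missing)}, extra={sorted(extra)}")

check_variables(["cpu", "ad_input"], ["ad_input", "cpu"])  # same set: passes
try:
    check_variables(["cpu", "ad_input"], ["cpu"])          # ad_input missing: raises
except ValueError as ex:
    print(ex)
```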
-
-## 6. Get inference results
-
-You need the `resultId` to get results. `resultId` is obtained from the response header when you submit the inference request. [This page](https://westus2.dev.cognitive.microsoft.com/docs/services/AnomalyDetector-v1-1-preview/operations/GetDetectionResult) contains instructions to query the inference results.
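Assuming the results link in the response header ends with the `resultId`, extracting it is a one-liner (the URL below is illustrative):

```python
def result_id_from_location(location):
    """Take the last path segment of the results link as the resultId."""
    return location.rstrip("/").rsplit("/", 1)[-1]

loc = "https://{endpoint}/anomalydetector/v1.1-preview/multivariate/results/663884e6-b117-11ea-b3de-0242ac130004"
print(result_id_from_location(loc))  # 663884e6-b117-11ea-b3de-0242ac130004
```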
-
-A sample response looks like this:
-
-```json
- {
- "resultId": "663884e6-b117-11ea-b3de-0242ac130004",
- "summary": {
- "status": "READY",
- "errors": [],
- "variableStates": [
- {
- "variable": "ad_input",
- "filledNARatio": 0,
- "effectiveCount": 26,
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-01T00:25:00Z",
- "errors": []
- },
- {
- "variable": "ad_ontimer_output",
- "filledNARatio": 0,
- "effectiveCount": 26,
- "startTime": "2019-04-01T00:00:00Z",
- "endTime": "2019-04-01T00:25:00Z",
- "errors": []
- },
- // more variables
- ],
- "setupInfo": {
- "source": "https://aka.ms/AnomalyDetector/MVADSampleData",
- "startTime": "2019-04-01T00:15:00Z",
- "endTime": "2019-04-01T00:40:00Z"
- }
- },
- "results": [
- {
- "timestamp": "2019-04-01T00:15:00Z",
- "errors": [
- {
- "code": "InsufficientHistoricalData",
- "message": "historical data is not enough."
- }
- ]
- },
- // more results
- {
- "timestamp": "2019-04-01T00:20:00Z",
- "value": {
- "contributors": [],
- "isAnomaly": false,
- "severity": 0,
- "score": 0.17805261260751692
- }
- },
- // more results
- {
- "timestamp": "2019-04-01T00:27:00Z",
- "value": {
- "contributors": [
- {
- "contributionScore": 0.0007775013367514271,
- "variable": "ad_ontimer_output"
- },
- {
- "contributionScore": 0.0007989604079048129,
- "variable": "ad_series_init"
- },
- {
- "contributionScore": 0.0008900927229851369,
- "variable": "ingestion"
- },
- {
- "contributionScore": 0.008068144477478554,
- "variable": "cpu"
- },
- {
- "contributionScore": 0.008222036467507165,
- "variable": "data_in_speed"
- },
- {
- "contributionScore": 0.008674941549594993,
- "variable": "ad_input"
- },
- {
- "contributionScore": 0.02232242629793674,
- "variable": "ad_output"
- },
- {
- "contributionScore": 0.1583773213660846,
- "variable": "flink_last_ckpt_duration"
- },
- {
- "contributionScore": 0.9816531517495176,
- "variable": "data_out_speed"
- }
- ],
- "isAnomaly": true,
- "severity": 0.42135109874230336,
- "score": 1.213510987423033
- }
- },
- // more results
- ]
- }
-```
-
-The response contains the result status, variable information, inference parameters, and inference results.
-
-* `variableStates` lists information about each variable in the inference request.
-* `setupInfo` is the request body submitted for this inference.
-* `results` contains the detection results. There are three typical types of detection results.
- 1. Error code `InsufficientHistoricalData`. This usually happens only with the first few timestamps, because the model performs inference in a window-based manner and needs historical data to make a decision. For the first few timestamps, there's insufficient historical data, so inference can't be performed on them. In this case, the error message can be ignored.
- 1. `"isAnomaly": false` indicates that the current timestamp isn't an anomaly.
- * `severity` indicates the relative severity of the anomaly; for normal data, it's always 0.
- * `score` is the raw model output on which the model makes its decision; it can be non-zero even for normal data points.
- 1. `"isAnomaly": true` indicates an anomaly at the current timestamp.
- * `severity` indicates the relative severity of the anomaly; for abnormal data, it's always greater than 0.
- * `score` is the raw model output on which the model makes its decision. `severity` is derived from `score`, and every data point has a `score`.
- * `contributors` is a list containing the contribution score of each variable. Higher contribution scores indicate a higher likelihood of being the root cause. This list is often used for interpreting anomalies and diagnosing root causes.
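Sifting results by both `isAnomaly` and `severity`, and reading off the top contributor, can be sketched like this (the threshold is an application-specific choice, and the trimmed results are illustrative):

```python
def severe_anomalies(results, min_severity=0.3):
    """Return (timestamp, top-contributor) pairs for anomalies whose
    severity is at or above the chosen threshold."""
    picked = []
    for r in results:
        value = r.get("value", {})
        if value.get("isAnomaly") and value.get("severity", 0) >= min_severity:
            contributors = value.get("contributors", [])
            top = (max(contributors, key=lambda c: c["contributionScore"])["variable"]
                   if contributors else None)
            picked.append((r["timestamp"], top))
    return picked

# Trimmed from the sample response above.
results = [
    {"timestamp": "2019-04-01T00:20:00Z",
     "value": {"isAnomaly": False, "severity": 0, "score": 0.178, "contributors": []}},
    {"timestamp": "2019-04-01T00:27:00Z",
     "value": {"isAnomaly": True, "severity": 0.42,
               "contributors": [
                   {"contributionScore": 0.16, "variable": "flink_last_ckpt_duration"},
                   {"contributionScore": 0.98, "variable": "data_out_speed"}]}},
]
print(severe_anomalies(results))  # [('2019-04-01T00:27:00Z', 'data_out_speed')]
```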
-
-> [!NOTE]
-> A common pitfall is taking all data points with `isAnomaly`=`true` as anomalies. That may end up with too many false positives.
-> You should use both `isAnomaly` and `severity` (or `score`) to sift out anomalies that are not severe and (optionally) use grouping to check the duration of the anomalies to suppress random noise.
-> Refer to the [FAQ](../concepts/best-practices-multivariate.md#faq) in the best practices document for the difference between `severity` and `score`.
-
-## Next steps
-
-* [Best practices: Recommended practices to follow when using the multivariate Anomaly Detector APIs](../concepts/best-practices-multivariate.md)
-* [Quickstarts: Use the Anomaly Detector multivariate client library](../quickstarts/client-libraries-multivariate.md)
cognitive-services Concept Ocr https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Computer-vision/concept-ocr.md
Title: Reading text - Computer Vision
+ Title: OCR for images - Computer Vision
-description: Learn concepts related to the Read feature of the Computer Vision API - usage and limits.
+description: Extract text from in-the-wild and non-document images with a fast and synchronous Computer Vision API.
Last updated 09/12/2022
-# Computer Vision v4.0 Read OCR (preview)
+# OCR for images
-The new Computer Vision v4.0 Image Analysis REST API preview offers the ability to extract printed or handwritten text from images in a unified performance-enhanced synchronous API that makes it easy to get all image insights including OCR results in a single API operation. The Read OCR engine is built on top of multiple deep learning models supported by universal script-based models for [global language support](./language-support.md).
+> [!NOTE]
+>
+> For extracting text from PDF, Office, and HTML documents and document images, use the [Form Recognizer Read OCR model](../../applied-ai-services/form-recognizer/concept-read.md), which is optimized for text-heavy digital and scanned documents and has an asynchronous API that makes it easy to power your intelligent document processing scenarios.
+>
+
+OCR traditionally started as a machine-learning-based technique for extracting text from in-the-wild and non-document images like product labels, user-generated images, screenshots, street signs, and posters. For several scenarios, such as running OCR on single images that aren't text-heavy, you need a fast, synchronous API or service. This allows OCR to be embedded in near-real-time user experiences to enrich content understanding and follow-up user actions with fast turnaround times.
+
+## What is Computer Vision v4.0 Read OCR (preview)
+
+The new Computer Vision v4.0 Image Analysis REST API preview offers the ability to extract printed or handwritten text from images in a unified performance-enhanced synchronous API that makes it easy to get all image insights including OCR results in a single API operation. The Read OCR engine is built on top of multiple deep learning models supported by universal script-based models for [global language support](./language-support.md).
## Use the V4.0 REST API preview
cognitive-services Luis Container Howto https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/LUIS/luis-container-howto.md
Use the host, `http://localhost:5000`, for container APIs.
|Package type|HTTP verb|Route|Query parameters| |--|--|--|--|
-|Published|GET, POST|`/luis/v3.0/apps/{appId}/slots/{slotName}/predict?`|`query={query}`<br>[`&verbose`]<br>[`&log`]<br>[`&show-all-intents`]|
-|Versioned|GET, POST|`/luis/v3.0/apps/{appId}/versions/{versionId}/predict?`|`query={query}`<br>[`&verbose`]<br>[`&log`]<br>[`&show-all-intents`]|
+|Published|GET, POST|`/luis/v3.0/apps/{appId}/slots/{slotName}/predict?` `/luis/prediction/v3.0/apps/{appId}/slots/{slotName}/predict?`|`query={query}`<br>[`&verbose`]<br>[`&log`]<br>[`&show-all-intents`]|
+|Versioned|GET, POST|`/luis/v3.0/apps/{appId}/versions/{versionId}/predict?` `/luis/prediction/v3.0/apps/{appId}/versions/{versionId}/predict?`|`query={query}`<br>[`&verbose`]<br>[`&log`]<br>[`&show-all-intents`]|
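As a sketch, a prediction URL for the published-slot route in the table can be assembled like this (the app ID, slot name, and query are placeholders):

```python
import urllib.parse

def prediction_url(app_id, slot, query, host="http://localhost:5000"):
    """Assemble a published-slot prediction URL for a local LUIS container."""
    params = urllib.parse.urlencode({"query": query, "verbose": "true"})
    return f"{host}/luis/v3.0/apps/{app_id}/slots/{slot}/predict?{params}"

print(prediction_url("<appId>", "production", "turn on the lights"))
# -> http://localhost:5000/luis/v3.0/apps/<appId>/slots/production/predict?query=turn+on+the+lights&verbose=true
```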
The query parameters configure how and what is returned in the query response:
cognitive-services How To Custom Speech Upload Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/how-to-custom-speech-upload-data.md
Previously updated : 05/08/2022 Last updated : 10/28/2022 zone_pivot_groups: speech-studio-cli-rest
To upload your own datasets in Speech Studio, follow these steps:
1. Select **Custom Speech** > Your project name > **Speech datasets** > **Upload data**. 1. Select the **Training data** or **Testing data** tab. 1. Select a dataset type, and then select **Next**.
-1. Specify the dataset location, and then select **Next**. You can choose a local file or enter a remote location such as Azure Blob public access URL.
+1. Specify the dataset location, and then select **Next**. You can choose a local file or enter a remote location such as Azure Blob URL.
+
+ > [!NOTE]
+ > If you use an Azure Blob URL, you can ensure maximum security of your dataset files by using the trusted Azure services security mechanism. You'll use the same techniques as for batch transcription and plain storage account URLs for your dataset files. See details in the [trusted Azure services security mechanism](batch-transcription-audio-data.md#trusted-azure-services-security-mechanism) documentation.
+ 1. Enter the dataset name and description, and then select **Next**. 1. Review your settings, and then select **Save and close**.
To create a dataset and connect it to an existing project, use the `spx csr data
- Set the `project` parameter to the ID of an existing project. This is recommended so that you can also view and manage the dataset in Speech Studio. You can run the `spx csr project list` command to get available projects. - Set the required `kind` parameter. The possible set of values for dataset kind are: Language, Acoustic, Pronunciation, and AudioFiles. - Set the required `contentUrl` parameter. This is the location of the dataset.+
+ > [!NOTE]
+ > If you use an Azure Blob URL, you can ensure maximum security of your dataset files by using the trusted Azure services security mechanism. You'll use the same techniques as for batch transcription and plain storage account URLs for your dataset files. See details in the [trusted Azure services security mechanism](batch-transcription-audio-data.md#trusted-azure-services-security-mechanism) documentation.
+ - Set the required `language` parameter. The dataset locale must match the locale of the project. The locale can't be changed later. The Speech CLI `language` parameter corresponds to the `locale` property in the JSON request and response. - Set the required `name` parameter. This is the name that will be displayed in the Speech Studio. The Speech CLI `name` parameter corresponds to the `displayName` property in the JSON request and response.
To create a dataset and connect it to an existing project, use the [CreateDatase
- Set the `project` property to the URI of an existing project. This is recommended so that you can also view and manage the dataset in Speech Studio. You can make a [GetProjects](https://eastus.dev.cognitive.microsoft.com/docs/services/speech-to-text-api-v3-0/operations/GetProjects) request to get available projects. - Set the required `kind` property. The possible set of values for dataset kind are: Language, Acoustic, Pronunciation, and AudioFiles. - Set the required `contentUrl` property. This is the location of the dataset.+
+ > [!NOTE]
+ > If you use an Azure Blob URL, you can ensure maximum security of your dataset files by using the trusted Azure services security mechanism. You'll use the same techniques as for batch transcription and plain storage account URLs for your dataset files. See details in the [trusted Azure services security mechanism](batch-transcription-audio-data.md#trusted-azure-services-security-mechanism) documentation.
+ - Set the required `locale` property. The dataset locale must match the locale of the project. The locale can't be changed later. - Set the required `displayName` property. This is the name that will be displayed in the Speech Studio.
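Putting the required properties together, a CreateDatasets request body might look like this sketch (the `contentUrl`, locale, and names are placeholders, not real resources):

```python
import json

# Hypothetical values; contentUrl is a placeholder dataset location.
body = {
    "kind": "Acoustic",
    "contentUrl": "https://contoso.blob.core.windows.net/datasets/audio.zip",
    "locale": "en-US",  # must match the project locale; can't be changed later
    "displayName": "My acoustic dataset",
    "description": "Training audio uploaded via REST",
}
print(json.dumps(body, indent=2))
```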
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/language-support.md
To set the translation target language, with few exceptions you only specify the
# [Language identification](#tab/language-identification)
-The table in this section summarizes the locales supported for Language identification. With language identification, the Speech service compares speech at the language level, such as English and German. If you include multiple locales of the same language, for example, `en-IN` English (India) and `en-US` English (United States), we'll only compare `en` (English) with the other candidate languages.
+The table in this section summarizes the locales supported for [Language identification](language-identification.md).
+> [!NOTE]
+> Language Identification compares speech at the language level, such as English and German. Do not include multiple locales of the same language in your candidate list.
[!INCLUDE [Language support include](includes/language-support/language-identification.md)]
communication-services Call Recording https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/concepts/voice-video-calling/call-recording.md
Title: Azure Communication Services Call Recording overview description: Provides an overview of the Call Recording feature and APIs.--++ - Last updated 06/30/2021
Call Recording enables you to record multiple calling scenarios available in Azu
Depending on your business needs, you can use Call Recording for different Azure Communication Services calling implementations. For example, you can record 1:1 or 1:N scenarios for audio and video calls enabled by [Calling Client SDK](https://learn.microsoft.com/azure/communication-services/concepts/voice-video-calling/calling-sdk-features). -
-![Diagram showing call recording architecture using calling client sdk.](../media/call-recording-with-calling-client.png)
-
+![Diagram showing a call that's being recorded.](../media/call-recording-client.png)
But also, you can use Call Recording to record complex PSTN or VoIP inbound and outbound calling workflows managed by [Call Automation](https://learn.microsoft.com/azure/communication-services/concepts/voice-video-calling/call-automation). Regardless of how you established the call, Call Recording allows you to produce mixed or unmixed media files that are stored for 48 hours on a built-in temporary storage. You can retrieve the files and take them to the long-term storage solution of your choice. Call Recording supports all Azure Communication Services data regions.
-## Media output and Channel types supported
+![Diagram showing call recording architecture using calling client sdk.](../media/call-recording-with-call-automation.png)
+
+## Call Recording that supports your business needs
Call Recording supports multiple media outputs and content types to address your business needs and use cases. You might use mixed formats for scenarios such as keeping records, meeting notes, coaching and training, or even compliance and adherence. Or, you can use unmixed formats to address quality assurance use cases or even more complex scenarios like advanced analytics or AI-based (Artificial Intelligence) sophisticated post-call processes. ### Video
Call Recording supports multiple media outputs and content types to address your
-## Call Recording APIs
+## Get full control over your recordings with our Call Recording APIs
Call Recording APIs can be used to manage recording via internal business logic triggers, such as an application creating a group call and recording the conversation. Also, recordings can be triggered by a user action that tells the server application to start recording. Call Recording APIs use exclusively the `serverCallId` to initiate recording. To learn how to get the `serverCallId`, check our [Call Recording Quickstart](../../quickstarts/voice-video-calling/get-started-call-recording.md). A `recordingId` is returned when recording is started, which is then used for follow-on operations like pause and resume.
A `recordingId` is returned when recording is started, which is then used for fo
## Event Grid notifications
-Notifications related to media and metadata are emitted via Event Grid.
+Call Recording uses [Azure Event Grid](https://learn.microsoft.com/azure/event-grid/event-schema-communication-services) to provide you with notifications related to media and metadata.
> [!NOTE] > Azure Communication Services provides short term media storage for recordings. **Recordings will be available to download for 48 hours.** After 48 hours, recordings will no longer be available.
communication-services Call Recording Sample https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/voice-video-calling/call-recording-sample.md
- Title: Azure Communication Services Call Recording API quickstart-
-description: Provides a quickstart sample for the Call Recording APIs.
---- Previously updated : 06/30/2021---
-zone_pivot_groups: acs-csharp-java
--
-# Call Recording API Quickstart
--
-This quickstart gets you started recording voice and video calls. This quickstart assumes you've already used the [Calling client SDK](get-started-with-video-calling.md) to build the end-user calling experience. Using the **Calling Server APIs and SDKs** you can enable and manage recordings.
-
-> [!NOTE]
-> **Unmixed audio recording** is still in a **Private Preview**.
---
-## Clean up resources
-
-If you want to clean up and remove a Communication Services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it. Learn more about [cleaning up resources](../create-communication-resource.md#clean-up-resources).
-
-## Next steps
-
-For more information, see the following articles:
--- Check out our [calling hero sample](../../samples/calling-hero-sample.md)-- Learn about [Calling SDK capabilities](./getting-started-with-calling.md)-- Learn more about [how calling works](../../concepts/voice-video-calling/about-call-types.md)
communication-services Call Recording Unmixed Audio Private Preview Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/communication-services/quickstarts/voice-video-calling/call-recording-unmixed-audio-private-preview-quickstart.md
- Title: Azure Communication Services Unmixed Audio Recording API quickstart -
-description: quickstart for Unmixed Audio Call Recording APIs
--- Previously updated : 09/07/2022---
-zone_pivot_groups: acs-csharp-java
--
-# Unmixed Audio Recording Quickstart
--
-This quickstart gets you started with Call Recording for voice and video calls. To start using the Call Recording APIs, you must have a call in place. Make sure you're familiar with [Calling client SDK](get-started-with-video-calling.md) and/or [Call Automation](https://learn.microsoft.com/azure/communication-services/quickstarts/voice-video-calling/callflows-for-customer-interactions?pivots=programming-language-csharp#configure-programcs-to-answer-the-call) to build the end-user calling experience.
---
-## Clean up resources
-
-If you want to clean up and remove a Communication Services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it. Learn more about [cleaning up resources](../create-communication-resource.md#clean-up-resources).
-
-## Next steps
-
-For more information, see the following articles:
--- Learn more about [Call Recording](../../concepts/voice-video-calling/call-recording.md)-- Check out our [calling hero sample](../../samples/calling-hero-sample.md)-- Learn about [Calling SDK capabilities](./getting-started-with-calling.md)-- Learn more about [how calling works](../../concepts/voice-video-calling/about-call-types.md)
connectors Apis List https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/connectors/apis-list.md
ms.suite: integration Previously updated : 08/25/2022 Last updated : 10/25/2022
For more information about securing logic apps and connections, review [Secure a
If you use a firewall that limits traffic, and your logic app workflows need to communicate through that firewall, you have to set up your firewall to allow access for both the [inbound](../logic-apps/logic-apps-limits-and-config.md#inbound) and [outbound](../logic-apps/logic-apps-limits-and-config.md#outbound) IP addresses used by the Azure Logic Apps platform or runtime in the Azure region where your logic app workflows exist. If your workflows also use managed connectors, such as the Office 365 Outlook connector or SQL connector, or use custom connectors, your firewall also needs to allow access for *all* the [managed connector outbound IP addresses](/connectors/common/outbound-ip-addresses#azure-logic-apps) in your logic app's Azure region. For more information, review [Firewall configuration](../logic-apps/logic-apps-limits-and-config.md#firewall-configuration-ip-addresses-and-service-tags).
-## Recurrence behavior
-
-Recurring built-in triggers, such as the [Recurrence trigger](connectors-native-recurrence.md), run natively on the Azure Logic Apps runtime and differ from recurring connection-based triggers, such as the Office 365 Outlook connector trigger where you need to create a connection first.
-
-For both kinds of triggers, if a recurrence doesn't specify a specific start date and time, the first recurrence runs immediately when you save or deploy the logic app, despite your trigger's recurrence setup. To avoid this behavior, provide a start date and time for when you want the first recurrence to run.
-
-Some managed connectors have both recurrence-based and webhook-based triggers, so if you use a recurrence-based trigger, review the [Recurrence behavior overview](apis-list.md#recurrence-behavior).
-
-### Recurrence for built-in triggers
-
-Recurring built-in triggers follow the schedule that you set, including any specified time zone. However, if a recurrence doesn't specify other advanced scheduling options, such as specific times to run future recurrences, those recurrences are based on the last trigger execution. As a result, the start times for those recurrences might drift due to factors such as latency during storage calls.
-
-For more information, review the following documentation:
-
-* [Schedule and run recurring automated tasks, processes, and workflows with Azure Logic Apps](../logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md)
-* [Create, schedule, and run recurring tasks and workflows with the Recurrence trigger](connectors-native-recurrence.md)
-* [Troubleshooting recurrence issues](#recurrence-issues)
-
-### Recurrence for connection-based triggers
-
-For recurring connection-based triggers, such as Office 365 Outlook, the schedule isn't the only driver that controls execution. The time zone only determines the initial start time. Subsequent runs depend on the recurrence schedule, the last trigger execution, and other factors that might cause run times to drift or produce unexpected behavior, for example:
-
-* Whether the trigger accesses a server that has more data, which the trigger immediately tries to fetch.
-* Any failures or retries that the trigger incurs.
-* Latency during storage calls.
-* Not maintaining the specified schedule when daylight saving time (DST) starts and ends.
-* Other factors that can affect when the next run time happens.
-
-For more information, review the following documentation:
-
-* [Schedule and run recurring automated tasks, processes, and workflows with Azure Logic Apps](../logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md)
-* [Troubleshooting recurrence issues](#recurrence-issues)
-
-<a name="recurrence-issues"></a>
-
-### Troubleshooting recurrence issues
-
-To make sure that your workflow runs at your specified start time and doesn't miss a recurrence, especially when the frequency is in days or longer, try the following solutions:
-
-* When DST takes effect, manually adjust the recurrence so that your workflow continues to run at the expected time. Otherwise, the start time shifts one hour forward when DST starts and one hour backward when DST ends. For more information and examples, review [Recurrence for daylight saving time and standard time](../logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md#daylight-saving-standard-time).
-
-* If you're using a **Recurrence** trigger, specify a time zone, a start date, and start time. In addition, configure specific times to run subsequent recurrences in the properties **At these hours** and **At these minutes**, which are available only for the **Day** and **Week** frequencies. However, some time windows might still cause problems when the time shifts.
-
-* Consider using a [**Sliding Window** trigger](connectors-native-sliding-window.md) instead of a **Recurrence** trigger to avoid missed recurrences.
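As a quick illustration of the DST drift described above, here's a minimal Python sketch (illustration only: the zone and times are arbitrary, and Azure Logic Apps evaluates schedules on the service side). It shows why a recurrence pinned to a fixed UTC instant lands an hour earlier or later in local time across a DST boundary:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

eastern = ZoneInfo("America/New_York")

# The same 13:00 UTC firing time, once in July (EDT) and once in December (EST).
summer_run = datetime(2022, 7, 1, 13, 0, tzinfo=timezone.utc)
winter_run = datetime(2022, 12, 1, 13, 0, tzinfo=timezone.utc)

print(summer_run.astimezone(eastern).strftime("%H:%M"))  # 09:00
print(winter_run.astimezone(eastern).strftime("%H:%M"))  # 08:00
```

A workflow that should run at 9:00 AM Eastern year-round therefore needs its UTC schedule adjusted when DST starts and ends, which is exactly the manual adjustment recommended above.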
-
## Custom connectors and APIs

In Consumption logic apps that run in multi-tenant Azure Logic Apps, you can call Swagger-based or SOAP-based APIs that aren't available as out-of-the-box connectors. You can also run custom code by creating custom API Apps. For more information, review the following documentation:
connectors Built In https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/connectors/built-in.md
For more information, review the following documentation:
You can use the following built-in connectors to perform general tasks, for example:
-* Run workflows using custom and advanced schedules. For more information about scheduling, review the [Recurrence behavior in the connector overview for Azure Logic Apps](apis-list.md#recurrence-behavior).
+* Run workflows using custom and advanced schedules. For more information about scheduling, review the [Recurrence behavior for connectors in Azure Logic Apps](../logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md#recurrence-behavior).
* Organize and control your workflow's structure, for example, using loops and conditions.
connectors Connectors Create Api Sqlazure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/connectors/connectors-create-api-sqlazure.md
When you save your workflow, this step automatically publishes your updates to y
Recurring connection-based triggers where you need to create a connection first, such as the SQL Server managed connector trigger, differ from built-in triggers that run natively in Azure Logic Apps, such as the [Recurrence trigger](../connectors/connectors-native-recurrence.md). For recurring connection-based triggers, the recurrence schedule isn't the only driver that controls execution, and the time zone only determines the initial start time. Subsequent runs depend on the recurrence schedule, the last trigger execution, *and* other factors that might cause run times to drift or produce unexpected behavior. For example, unexpected behavior can include failure to maintain the specified schedule when daylight saving time (DST) starts and ends.
-To make sure that the recurrence time doesn't shift when DST takes effect, manually adjust the recurrence. That way, your workflow continues to run at the expected or specified start time. Otherwise, the start time shifts one hour forward when DST starts and one hour backward when DST ends. For more information, see [Recurrence for connection-based triggers](../connectors/apis-list.md#recurrence-for-connection-based-triggers).
+To make sure that the recurrence time doesn't shift when DST takes effect, manually adjust the recurrence. That way, your workflow continues to run at the expected or specified start time. Otherwise, the start time shifts one hour forward when DST starts and one hour backward when DST ends. For more information, see [Recurrence for connection-based triggers](../logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md#recurrence-for-connection-based-triggers).
<a name="add-sql-action"></a>
connectors Connectors Sftp Ssh https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/connectors/connectors-sftp-ssh.md
When a trigger finds a new file, the trigger checks that the new file is complet
Recurring connection-based triggers where you need to create a connection first, such as the managed SFTP-SSH trigger, differ from built-in triggers that run natively in Azure Logic Apps, such as the [Recurrence trigger](../connectors/connectors-native-recurrence.md). In recurring connection-based triggers, the recurrence schedule isn't the only driver that controls execution, and the time zone only determines the initial start time. Subsequent runs depend on the recurrence schedule, the last trigger execution, *and* other factors that might cause run times to drift or produce unexpected behavior. For example, unexpected behavior can include failure to maintain the specified schedule when daylight saving time (DST) starts and ends.
-To make sure that the recurrence time doesn't shift when DST takes effect, manually adjust the recurrence. That way, your workflow continues to run at the expected time or specified start time. Otherwise, the start time shifts one hour forward when DST starts and one hour backward when DST ends. For more information, see [Recurrence for connection-based triggers](../connectors/apis-list.md#recurrence-for-connection-based-triggers).
+To make sure that the recurrence time doesn't shift when DST takes effect, manually adjust the recurrence. That way, your workflow continues to run at the expected time or specified start time. Otherwise, the start time shifts one hour forward when DST starts and one hour backward when DST ends. For more information, see [Recurrence for connection-based triggers](../logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md#recurrence-for-connection-based-triggers).
## Prerequisites
cosmos-db How To Configure Multi Region Write https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/how-to-configure-multi-region-write.md
+
+ Title: Configure multi-region writes in your Azure Cosmos DB for MongoDB database
+description: Learn how to configure multi-region writes in Azure Cosmos DB for MongoDB
+++ Last updated : 10/27/2022+++
+# Configure multi-region writes in Azure Cosmos DB for MongoDB
+
+Multi-region writes in Azure Cosmos DB for MongoDB enable your clients to write to multiple regions, which lowers write latency and improves availability. Unlike other MongoDB services, Azure Cosmos DB for MongoDB lets you write data from the same shard to multiple regions: multi-region writes provide a true active-active setup.
+
+## Configure in Azure portal
+To enable multi-region writes from the Azure portal, use the following steps:
+
+1. Sign in to the [Azure portal](https://portal.azure.com/).
+
+1. Navigate to your Azure Cosmos DB for MongoDB account and from the menu, open the **Replicate data globally** pane.
+
+1. Under the **Multi-region writes** option, select **Enable**. This automatically adds the existing regions as both read and write regions.
+
+1. You can add more regions by selecting the icons on the map or by selecting the **Add region** button. All the regions you add will have both reads and writes enabled.
+
+1. After you update the region list, select **Save** to apply the changes.
+
+ :::image type="content" source="./media/how-to-multi-region-write/enable-multi-region-writes.png" alt-text="Screenshot to enable multi-region writes using Azure portal." lightbox="./media/how-to-multi-region-write/enable-multi-region-writes.png":::
++
+## Connect your client
+MongoDB connection strings support the "appName" parameter, which is a means to identify client workloads. The appName is used to identify the preferred write region for your connection. You can specify appName in the connection string or by using SDK-specific initialization methods or properties.
+
+The appName parameter can be in one of the following formats:
+
+```powershell
+appName=<user-workload-name>
+appName=<user-workload-name>@<preferred-write-region>
+appName=<user-workload-name>@<cosmosdb-account-name>@<preferred-write-region>
+```
+
+On multi-region write accounts, the Azure portal supports generation of region-specific connection strings that encode the preferred region list. Selecting the preferred region dropdown changes the appName in the connection string to set the preferred write region. Copy the connection string after setting the preferred region.
+
+ :::image type="content" source="./media/how-to-multi-region-write/connect-multi-region-writes.png" alt-text="Screenshot to connect to multi-region writes account using Azure portal." lightbox="./media/how-to-multi-region-write/connect-multi-region-writes.png":::
+
+We recommend that applications deployed to different regions use the region-specific connection string with the correct preferred region for low-latency writes.
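As a rough sketch of what assembling such a region-specific connection string looks like by hand (the helper, account name, workload name, and region below are hypothetical, and whether the region needs percent-encoding depends on your driver; the portal generates these strings for you):

```python
from urllib.parse import quote

def with_preferred_region(base_uri: str, workload: str,
                          account: str, region: str) -> str:
    # Build the three-part appName form documented above:
    # <user-workload-name>@<cosmosdb-account-name>@<preferred-write-region>
    app_name = f"{workload}@{account}@{quote(region)}"
    sep = "&" if "?" in base_uri else "?"
    return f"{base_uri}{sep}appName={app_name}"

uri = with_preferred_region(
    "mongodb://myaccount.mongo.cosmos.azure.com:10255/?ssl=true",
    "orders-api", "myaccount", "West US 2")
print(uri)
```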
+
+## Next steps
+
+- Get an overview of [secure access to data in Azure Cosmos DB](../secure-access-to-data.md).
+- Learn more about [RBAC for Azure Cosmos DB management](../role-based-access-control.md).
cosmos-db Indexing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/indexing.md
Here's how you can create a wildcard index on all fields:
You can also create wildcard indexes using the Data Explorer in the Azure portal:
+![Add wildcard index in indexing policy editor](./media/indexing/add-wildcard-index.png)
> [!NOTE]
> If you're just starting development, we **strongly** recommend starting with a wildcard index on all fields. This can simplify development and make it easier to optimize queries.
cosmos-db Quickstart App Stacks Csharp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/postgresql/quickstart-app-stacks-csharp.md
recommendations: false Previously updated : 09/27/2022 Last updated : 10/27/2022 # Use C# to connect and run SQL commands on Azure Cosmos DB for PostgreSQL
Last updated 09/27/2022
[!INCLUDE [App stack selector](includes/quickstart-selector.md)]
-This quickstart shows you how to use C# code to connect to a cluster, and then use SQL statements to create a table and insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with C# development, and are new to working with Azure Cosmos DB for PostgreSQL.
+This quickstart shows you how to use C# code to connect to a cluster, and use SQL statements to create a table. You'll then insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with C# development, and are new to working with Azure Cosmos DB for PostgreSQL.
-> [!TIP]
-> The process of creating a C# app with Azure Cosmos DB for PostgreSQL is the same as working with ordinary PostgreSQL.
+## Install PostgreSQL library
-## Prerequisites
-
-- An Azure account with an active subscription. If you don't have one, [create an account for free](https://azure.microsoft.com/free).
-- [Visual Studio](https://www.visualstudio.com/downloads) with the .NET desktop development workload installed. Or [install the .NET SDK](https://dotnet.microsoft.com/download) for your Windows, Ubuntu Linux, or macOS platform.
-- In Visual Studio, a C# console project with the [Npgsql](https://www.nuget.org/packages/Npgsql) NuGet package installed.
-- An Azure Cosmos DB for PostgreSQL cluster. To create a cluster, see [Create a cluster in the Azure portal](quickstart-create-portal.md).
-
-The code samples in this article use your cluster name and password. You can see your cluster name at the top of your cluster page in the Azure portal.
-
+The code examples in this article require the [Npgsql](https://www.nuget.org/packages/Npgsql) library. You'll need to install Npgsql with your language package manager (such as NuGet in [Visual Studio](https://www.visualstudio.com/downloads)).
## Connect, create a table, and insert data
-In Visual Studio, use the following code to connect to your cluster and load data using CREATE TABLE and INSERT INTO SQL statements. The code uses these `NpgsqlCommand` class methods:
+We'll connect to a cluster and load data using CREATE TABLE and INSERT INTO SQL statements. The code uses these `NpgsqlCommand` class methods:
* [Open()](https://www.npgsql.org/doc/api/Npgsql.NpgsqlConnection.html#Npgsql_NpgsqlConnection_Open) to establish a connection to Azure Cosmos DB for PostgreSQL * [CreateCommand()](https://www.npgsql.org/doc/api/Npgsql.NpgsqlConnection.html#Npgsql_NpgsqlConnection_CreateCommand) to set the CommandText property
cosmos-db Quickstart App Stacks Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/postgresql/quickstart-app-stacks-java.md
recommendations: false Previously updated : 09/28/2022 Last updated : 10/26/2022 # Java app to connect and run SQL commands on Azure Cosmos DB for PostgreSQL
Last updated 09/28/2022
[!INCLUDE [App stack selector](includes/quickstart-selector.md)]
-This quickstart shows you how to build a Java app that connects to a cluster, and then uses SQL statements to create a table and insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with Java development and [JDBC](https://en.wikipedia.org/wiki/Java_Database_Connectivity), and are new to working with Azure Cosmos DB for PostgreSQL.
-
-> [!TIP]
-> The process of creating a Java app with Azure Cosmos DB for PostgreSQL is the same as working with ordinary PostgreSQL.
-
-## Prerequisites
-
-- An Azure account with an active subscription. If you don't have one, [create an account for free](https://azure.microsoft.com/free).
-- A supported [Java Development Kit](/azure/developer/jav).
-- The [Apache Maven](https://maven.apache.org) build tool.
-- An Azure Cosmos DB for PostgreSQL cluster. To create a cluster, see [Create a cluster in the Azure portal](quickstart-create-portal.md)
-
-The code samples in this article use your cluster name and password. In the Azure portal, your cluster name appears at the top of your cluster page.
-
+This quickstart shows you how to use Java code to connect to a cluster, and use SQL statements to create a table. You'll then insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with Java development and [JDBC](https://en.wikipedia.org/wiki/Java_Database_Connectivity), and are new to working with Azure Cosmos DB for PostgreSQL.
## Set up the Java project and connection
public class DemoApplication
## Next steps
cosmos-db Quickstart App Stacks Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/postgresql/quickstart-app-stacks-nodejs.md
recommendations: false Previously updated : 09/28/2022 Last updated : 10/27/2022 # Use Node.js to connect and run SQL commands on Azure Cosmos DB for PostgreSQL
Last updated 09/28/2022
[!INCLUDE [App stack selector](includes/quickstart-selector.md)]
-This quickstart shows you how to use Node.js code to connect to a cluster, and then use SQL statements to create a table and insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with Node.js development, and are new to working with Azure Cosmos DB for PostgreSQL.
+This quickstart shows you how to use Node.js code to connect to a cluster, and use SQL statements to create a table. You'll then insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with Node.js development, and are new to working with Azure Cosmos DB for PostgreSQL.
-> [!TIP]
-> The process of creating a Node.js app with Azure Cosmos DB for PostgreSQL is the same as working with ordinary PostgreSQL.
+## Install PostgreSQL library
-## Prerequisites
-
-- An Azure account with an active subscription. If you don't have one, [create an account for free](https://azure.microsoft.com/free).
-- An Azure Cosmos DB for PostgreSQL cluster. To create a cluster, see [Create a cluster in the Azure portal](quickstart-create-portal.md).
-- [Node.js](https://nodejs.org) installed.
-- For various samples, the following packages installed:
-
- - [pg](https://www.npmjs.com/package/pg) PostgreSQL client for Node.js.
- - [pg-copy-streams](https://www.npmjs.com/package/pg-copy-streams).
- - [through2](https://www.npmjs.com/package/through2) to allow pipe chaining.
-
- Install these packages from your command line by using the JavaScript `npm` node package manager.
-
- ```bash
- npm install <package name>
- ```
-
- Verify the installation by listing the packages installed.
-
- ```bash
- npm list
- ```
-
-You can launch Node.js from the Bash shell, terminal, or Windows command prompt by typing `node`. Then run the example JavaScript code interactively by copying and pasting the code into the prompt. Or, you can save the JavaScript code into a *\<filename>.js* file, and then run `node <filename>.js` with the file name as a parameter.
-
-> [!NOTE]
-> Because each code sample finishes by ending the connection pool, you need to start a new Node.js session to build a new pool for each of the samples.
-
-The code samples in this article use your cluster name and password. You can see your cluster name at the top of your cluster page in the Azure portal.
-
+The code examples in this article require the [pg](https://node-postgres.com) library to interface with the PostgreSQL server. You'll need to install pg with your language package manager (such as npm).
## Connect, create a table, and insert data
-All examples in this article need to connect to the database. You can put the connection logic into its own module for reuse. Use the [pg](https://node-postgres.com) client object to interface with the PostgreSQL server.
+### Create the common connection module
[!INCLUDE[why-connection-pooling](includes/why-connection-pooling.md)]
-### Create the common connection module
-
Create a folder called *db*, and inside this folder create a *citus.js* file that contains the following common connection code. In this code, replace \<cluster> with your cluster name and \<password> with your administrator password.

```javascript
cosmos-db Quickstart App Stacks Python https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/postgresql/quickstart-app-stacks-python.md
recommendations: false Previously updated : 09/28/2022 Last updated : 10/27/2022 # Use Python to connect and run SQL commands on Azure Cosmos DB for PostgreSQL
Last updated 09/28/2022
[!INCLUDE [App stack selector](includes/quickstart-selector.md)]
-This quickstart shows you how to use Python code on macOS, Ubuntu Linux, or Windows to connect to a cluster, and use SQL statements to create a table and insert, query, update, and delete data. The steps in this article assume that you're familiar with Python development, and are new to working with Azure Cosmos DB for PostgreSQL.
+This quickstart shows you how to use Python code to connect to a cluster, and use SQL statements to create a table. You'll then insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with Python development, and are new to working with Azure Cosmos DB for PostgreSQL.
-> [!TIP]
-> The process of creating a Python app with Azure Cosmos DB for PostgreSQL is the same as working with ordinary PostgreSQL.
+## Install PostgreSQL library
-## Prerequisites
-
-- An Azure account with an active subscription. If you don't have one, [create an account for free](https://azure.microsoft.com/free).
-- [Python](https://www.python.org/downloads) 2.7 or 3.6+.
-- The latest [pip](https://pip.pypa.io/en/stable/installing) package installer. Most versions of Python already install `pip`.
-- [psycopg2](https://pypi.python.org/pypi/psycopg2-binary) installed by using `pip` in a terminal or command prompt window. For more information, see [How to install psycopg2](https://www.psycopg.org/docs/install.html).
-- An Azure Cosmos DB for PostgreSQL cluster. To create a cluster, see [Create a cluster in the Azure portal](quickstart-create-portal.md).
-
-The code samples in this article use your cluster name and password. In the Azure portal, your cluster name appears at the top of your cluster page.
-
+The code examples in this article require the [psycopg2](https://pypi.python.org/pypi/psycopg2-binary) library. You'll need to install psycopg2 with your language package manager (such as pip).
## Connect, create a table, and insert data
-The following code example creates a connection pool to your Postgres database by using the [psycopg2.pool](https://www.psycopg.org/docs/pool.html) library, and uses `pool.getconn()` to get a connection from the pool. The code then uses [cursor.execute](https://www.psycopg.org/docs/cursor.html#execute) functions with SQL CREATE TABLE and INSERT INTO statements to create a table and insert data.
+The following code example creates a [connection pool](https://www.psycopg.org/docs/pool.html) to your Postgres database. It then uses [cursor.execute](https://www.psycopg.org/docs/cursor.html#execute) functions with SQL CREATE TABLE and INSERT INTO statements to create a table and insert data.
[!INCLUDE[why-connection-pooling](includes/why-connection-pooling.md)]
cosmos-db Quickstart App Stacks Ruby https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/postgresql/quickstart-app-stacks-ruby.md
recommendations: false Previously updated : 09/28/2022 Last updated : 10/27/2022 # Use Ruby to connect and run SQL commands on Azure Cosmos DB for PostgreSQL
Last updated 09/28/2022
[!INCLUDE [App stack selector](includes/quickstart-selector.md)]
-This quickstart shows you how to use Ruby code to connect to a cluster, and then use SQL statements to create a table and insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with Ruby development, and are new to working with Azure Cosmos DB for PostgreSQL.
+This quickstart shows you how to use Ruby code to connect to a cluster, and use SQL statements to create a table. You'll then insert, query, update, and delete data in the database. The steps in this article assume that you're familiar with Ruby development, and are new to working with Azure Cosmos DB for PostgreSQL.
-> [!TIP]
-> The process of creating a Ruby app with Azure Cosmos DB for PostgreSQL is the same as working with ordinary PostgreSQL.
+## Install PostgreSQL library
-## Prerequisites
-
-- An Azure account with an active subscription. If you don't have one, [create an account for free](https://azure.microsoft.com/free).
-- [Ruby](https://www.ruby-lang.org/en/downloads) installed.
-- [Ruby pg](https://rubygems.org/gems/pg), the PostgreSQL module for Ruby.
-- An Azure Cosmos DB for PostgreSQL cluster. To create a cluster, see [Create a cluster in the Azure portal](quickstart-create-portal.md)
-
-The code samples in this article use your cluster name and password. In the Azure portal, your cluster name appears at the top of your cluster page.
-
+The code examples in this article require the [pg](https://rubygems.org/gems/pg) gem. You'll need to install pg with your language package manager (such as bundler).
## Connect, create a table, and insert data
-Use the following code to connect and create a table by using the CREATE TABLE SQL statement, then add rows to the table by using the INSERT INTO SQL statement.
-
-The code uses a `PG::Connection` object with constructor to connect to Azure Cosmos DB for PostgreSQL. Then it calls method `exec()` to run the DROP, CREATE TABLE, and INSERT INTO commands. The code checks for errors using the `PG::Error` class. Then it calls method `close()` to close the connection before terminating. For more information about these classes and methods, see the [Ruby pg reference documentation](https://rubygems.org/gems/pg).
+Use the following code to connect and create a table by using the CREATE TABLE SQL statement, then add rows to the table by using the INSERT INTO SQL statement. The code uses a `PG::Connection` object constructor to connect to Azure Cosmos DB for PostgreSQL. It then calls the `exec()` method to run the DROP, CREATE TABLE, and INSERT INTO commands. The code checks for errors by using the `PG::Error` class. It then calls the `close()` method to close the connection before terminating.
In the code, replace \<cluster> with your cluster name and \<password> with your administrator password.
cost-management-billing Enable Preview Features Cost Management Labs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/costs/enable-preview-features-cost-management-labs.md
Charts are enabled on the [Try preview](https://aka.ms/costmgmt/trypreview) page
Show the forecast for the current period at the top of the cost analysis preview.
-Charts can be enabled from the [Try preview](https://aka.ms/costmgmt/trypreview) page in the Azure portal. Use the **How would you rate the cost analysis preview?** Option at the bottom of the page to share feedback about the preview.
+The Forecast KPI can be enabled from the [Try preview](https://aka.ms/costmgmt/trypreview) page in the Azure portal. Use the **How would you rate the cost analysis preview?** option at the bottom of the page to share feedback about the preview.
++
+<a name="cav3delta"></a>
+
+## Compare cost with previous period in the cost analysis preview
+
+Show the percentage difference in cost compared to the previous period at the top of the cost analysis preview. When your view shows 3 months or less, the difference is calculated as the cost from the start of the period through yesterday, compared to the same days from the previous period. If the view shows more than 3 months, the date range uses the first month through the last month. If the current day or month isn't included, the entire period is compared to the previous period.
+
+The previous period delta can be enabled from the [Try preview](https://aka.ms/costmgmt/trypreview) page in the Azure portal. Use the **How would you rate the cost analysis preview?** option at the bottom of the page to share feedback about the preview.
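As a back-of-the-envelope illustration of the comparison described above (the figures are invented; Cost Management computes this for you):

```python
def period_delta_percent(current_cost: float, previous_cost: float) -> float:
    """Percentage difference in cost versus the previous period."""
    if previous_cost == 0:
        raise ValueError("previous period has no cost to compare against")
    return (current_cost - previous_cost) / previous_cost * 100

# Month to date (through yesterday) vs. the same days of the prior month.
print(round(period_delta_percent(1150.0, 1000.0), 1))  # 15.0
```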
<a name="productscolumn"></a>
Charts can be enabled from the [Try preview](https://aka.ms/costmgmt/trypreview)
Every service tracks different usage attributes of the resources you've deployed. Each of these usage attributes is tracked via a "meter" in your cost data. Meters are grouped into categories and include other metadata to help you understand the charges. We're testing new columns in the Resources and Services views in the cost analysis preview for Microsoft Customer Agreement. You may see a single Product column instead of the Service, Tier, and Meter columns.
-You can also enable this preview from the [Try preview](https://aka.ms/costmgmt/trypreview) page in the Azure portal. Note this preview is only applicable for Microsoft Customer Agreement accounts.
+**The Product column is available by default in the cost analysis preview for Microsoft Customer Agreement accounts.**
<a name="recommendationinsights"></a>
You can also enable this preview from the [Try preview](https://aka.ms/costmgmt/
Cost insights surface important details about your subscriptions, like potential anomalies or top cost contributors. To support your cost optimization goals, cost insights now include the total cost savings available from Azure Advisor for your subscription.
-You can enable cost savings insights for subscriptions from the [Try preview](https://aka.ms/costmgmt/trypreview) page in the Azure portal.
+**Cost savings insights are available by default for all subscriptions in the cost analysis preview.**
<a name="resourceessentials"></a>
The view cost link is enabled by default in the [Azure preview portal](https://p
Learn about new and updated features or other announcements directly from within the Cost Management experience in the Azure portal. You can also follow along using the [Cost Management updates on the Azure blog](https://aka.ms/costmgmt/blog).
-What's new can be enabled from the [Try preview](https://aka.ms/costmgmt/trypreview) page in the Azure portal.
+**What's new is available by default from the [Cost Management overview](https://aka.ms/costmgmt/whatsnew) in the Azure portal.**
<a name="onlyinconfig"></a>
data-factory Author Global Parameters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/author-global-parameters.md
Previously updated : 01/31/2022 Last updated : 09/26/2022
data-factory Author Visually https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/author-visually.md
The management hub, accessed by the *Manage* tab in the UI, is a portal that hos
Expressions and functions can be used instead of static values to specify many properties within the service.
-To specify an expression for a property value, select **Add Dynamic Content** or click **Alt + P** while focusing on the field.
+To specify an expression for a property value, select **Add Dynamic Content** or press **Alt + Shift + D** while focusing on the field.
:::image type="content" source="media/author-visually/dynamic-content-1.png" alt-text="Add Dynamic Content":::
data-factory Plan Manage Costs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/plan-manage-costs.md
To view Data Factory costs in cost analysis:
Actual monthly costs are shown when you initially open cost analysis. Here's an example showing all monthly usage costs.

To narrow costs for a single service, like Data Factory, select **Add filter** and then select **Service name**. Then, select **Azure Data Factory v2**. Here's an example showing costs for just Data Factory.

In the preceding example, you see the current cost for the service. Costs by Azure regions (locations) and Data Factory costs by resource group are also shown. From here, you can explore costs on your own.
data-factory Tutorial Incremental Copy Change Tracking Feature Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/tutorial-incremental-copy-change-tracking-feature-portal.md
In this tutorial, you create an Azure Data Factory with a pipeline that loads de
You perform the following steps in this tutorial: > [!div class="checklist"]
-> * Prepare the source data store
+> * Prepare the source data store.
> * Create a data factory. > * Create linked services. > * Create source, sink, and change tracking datasets.
-> * Create, run, and monitor the full copy pipeline
-> * Add or update data in the source table
-> * Create, run, and monitor the incremental copy pipeline
+> * Create, run, and monitor the full copy pipeline.
+> * Add or update data in the source table.
+> * Create, run, and monitor the incremental copy pipeline.
## Overview

In a data integration solution, incrementally loading data after initial data loads is a widely used scenario. In some cases, the changed data within a period in your source data store can be easily sliced up (for example, by LastModifyTime or CreationTime). In other cases, there's no explicit way to identify the delta data since the last time you processed the data. The Change Tracking technology supported by data stores such as Azure SQL Database and SQL Server can be used to identify the delta data. This tutorial describes how to use Azure Data Factory with SQL Change Tracking technology to incrementally load delta data from Azure SQL Database into Azure Blob Storage. For more concrete information about SQL Change Tracking technology, see [Change tracking in SQL Server](/sql/relational-databases/track-changes/about-change-tracking-sql-server).
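Change Tracking itself is specific to SQL Server and Azure SQL Database, but the watermark pattern this tutorial builds on can be sketched with a stand-in store. Here's a minimal, illustrative Python `sqlite3` version: the table names mirror the tutorial, while the manual `version` column stands in for what SQL Server tracks automatically.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE data_source_table (PersonID INT, Name TEXT, version INT)")
con.execute("CREATE TABLE table_store_ChangeTracking_version (last_version INT)")
con.execute("INSERT INTO table_store_ChangeTracking_version VALUES (0)")

# Full copy: load the initial rows, then record the version watermark.
con.executemany("INSERT INTO data_source_table VALUES (?, ?, 1)",
                [(1, "aaaa"), (2, "bbbb")])
con.execute("UPDATE table_store_ChangeTracking_version SET last_version = 1")

# A later change arrives at a higher version; the incremental copy
# selects only rows changed since the stored watermark.
con.execute("INSERT INTO data_source_table VALUES (3, 'cccc', 2)")
(last,) = con.execute(
    "SELECT last_version FROM table_store_ChangeTracking_version").fetchone()
delta = con.execute(
    "SELECT PersonID FROM data_source_table WHERE version > ?", (last,)
).fetchall()
print(delta)  # [(3,)]
```

In the real pipeline, `CHANGETABLE(CHANGES ...)` and `CHANGE_TRACKING_CURRENT_VERSION()` play the roles of the `version` column and the watermark update shown here.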
If you don't have an Azure subscription, create a [free](https://azure.microsoft
    Age int
    PRIMARY KEY (PersonID)
);

INSERT INTO data_source_table (PersonID, Name, Age) VALUES
If you don't have an Azure subscription, create a [free](https://azure.microsoft
(3, 'cccc', 20),
(4, 'dddd', 26),
(5, 'eeee', 22);
```

4. Enable the **Change Tracking** mechanism on your database and the source table (data_source_table) by running the following SQL query:
If you don't have an Azure subscription, create a [free](https://azure.microsoft
> [!NOTE]
> - Replace &lt;your database name&gt; with the name of the database in Azure SQL Database that has the data_source_table.
> - The changed data is kept for two days in the current example. If you load the changed data every three days or more, some changed data isn't included. Either change the value of CHANGE_RETENTION to a bigger number, or ensure that your period to load the changed data is within two days. For more information, see [Enable change tracking for a database](/sql/relational-databases/track-changes/enable-and-disable-change-tracking-sql-server#enable-change-tracking-for-a-database).

```sql
ALTER DATABASE <your database name>
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON)

ALTER TABLE data_source_table
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON)
```
5. Create a new table to store the change tracking version by running the following query:

   ```sql
   create table table_store_ChangeTracking_version
   (
       TableName varchar(255),
       SYS_CHANGE_VERSION BIGINT,
   );

   DECLARE @ChangeTracking_version BIGINT
   SET @ChangeTracking_version = CHANGE_TRACKING_CURRENT_VERSION();

   INSERT INTO table_store_ChangeTracking_version
   VALUES ('data_source_table', @ChangeTracking_version)
   ```
   > [!NOTE]
   > If the data is not changed after you enable change tracking for SQL Database, the value of the change tracking version is 0.
6. Run the following query to create a stored procedure in your database. The pipeline invokes this stored procedure to update the change tracking version in the table you created in the previous step.

   ```sql
   CREATE PROCEDURE Update_ChangeTracking_Version @CurrentTrackingVersion BIGINT, @TableName varchar(50)
   AS
   BEGIN
       UPDATE table_store_ChangeTracking_version
       SET [SYS_CHANGE_VERSION] = @CurrentTrackingVersion
       WHERE [TableName] = @TableName
   END
   ```
Install the latest Azure PowerShell modules by following the instructions in How to install and configure Azure PowerShell.
## Create a data factory

1. Launch the **Microsoft Edge** or **Google Chrome** web browser. Currently, the Data Factory UI is supported only in Microsoft Edge and Google Chrome web browsers.
1. On the left menu, select **Create a resource** > **Data + Analytics** > **Data Factory**:
   ![Screenshot that shows the data factory selection in the New pane.](media/tutorial-incremental-copy-change-tracking-feature-portal/new-azure-data-factory-menu.png)
1. In the **New data factory** page, enter **ADFTutorialDataFactory** for the **name**.

   ![New data factory page.](media/tutorial-incremental-copy-change-tracking-feature-portal/new-azure-data-factory.png)
1. The name of the Azure Data Factory must be **globally unique**. If you receive the following error, change the name of the data factory (for example, yournameADFTutorialDataFactory) and try creating it again. See the [Data Factory - Naming Rules](naming-rules.md) article for naming rules for Data Factory artifacts.
   *Data factory name "ADFTutorialDataFactory" is not available*
1. Select the Azure **subscription** in which you want to create the data factory.
1. For **Resource Group**, select **Create new**, and enter the name of a resource group. To learn about resource groups, see [Using resource groups to manage your Azure resources](../azure-resource-manager/management/overview.md).
1. Select **V2** for the **version**.
1. Select the **Region** for the data factory. Only supported locations are displayed in the drop-down list. The data stores (Azure Storage, Azure SQL Database, and so on) and computes (HDInsight, and so on) used by the data factory can be in other regions.
1. Select **Next: Git configuration** and set up the repository by following the instructions in [Configuration method 4: During factory creation](/azure/data-factory/source-control), or select the **Configure Git later** checkbox.

   ![Create Data Factory - Git Configuration.](media/tutorial-incremental-copy-change-tracking-feature-portal/new-azure-data-factory-menu-git-configuration.png)
1. Select **Review + create**.
1. Click **Create**.
1. On the dashboard, you see the following tile with status: **Deploying data factory**.

   :::image type="content" source="media/tutorial-incremental-copy-change-tracking-feature-portal/deploying-data-factory.png" alt-text="deploying data factory tile":::
1. After the creation is complete, you see the **Data Factory** page.
1. Select the **Launch studio** tile to launch the Azure Data Factory user interface (UI) in a separate tab.
1. In the home page, switch to the **Manage** tab in the left panel as shown in the following image:

   :::image type="content" source="media/doc-common-process/get-started-page-manage-button.png" alt-text="Screenshot that shows the Manage button.":::

## Create linked services

You create linked services in a data factory to link your data stores and compute services to the data factory. In this section, you create linked services to your Azure Storage account and your database in Azure SQL Database.

### Create the Azure Storage linked service

In this step, you link your Azure Storage account to the data factory.
1. Navigate to **Linked services** in **Connections** under the **Manage** tab, and click **+ New** or click the **Create linked service** button.

   ![New connection button.](media/tutorial-incremental-copy-change-tracking-feature-portal/new-connection-button-storage.png)
1. In the **New Linked Service** window, select **Azure Blob Storage**, and click **Continue**.
1. In the **New Linked Service** window, do the following steps:
    1. Enter **AzureStorageLinkedService** for the **Name** field.
    1. Select the integration runtime in **Connect via integration runtime**.
    1. Select the **Authentication type**.
    1. Select your Azure Storage account for **Storage account name**.
    1. Click **Create**.
### Create the Azure SQL Database linked service

In this step, you link your database to the data factory.
1. Select **Linked services** under **Connections**, and click **+ New**.
1. In the **New Linked Service** window, select **Azure SQL Database**, and click **Continue**.
1. In the **New Linked Service** window, do the following steps:
    1. Enter **AzureSqlDatabaseLinkedService** for the **Name** field.
    1. Select your server for the **Server name** field.
    1. Select your database for the **Database name** field.
    1. Select the authentication type for the **Authentication type** field. This tutorial uses SQL authentication.
    1. Enter the name of the user for the **User name** field.
    1. Enter the password for the user for the **Password** field, or provide the **Azure Key Vault - AKV linked service** name, **Secret name**, and **Secret version**.
    1. Click **Test connection** to test the connection.
    1. Click **Create** to create the linked service.

   ![Azure SQL Database linked service settings.](media/tutorial-incremental-copy-change-tracking-feature-portal/azure-sql-database-linked-service-setting.png)
## Create datasets

In this step, you create datasets to represent the data source, the data destination, and the place to store the SYS_CHANGE_VERSION.

### Create a dataset to represent source data

In this step, you create a dataset to represent the source data.
1. Click **+ (plus)** and click **Dataset** in the treeview under the **Author** tab, or click the ellipsis for Dataset actions.

   ![New Dataset menu 1.](media/tutorial-incremental-copy-change-tracking-feature-portal/new-dataset-menu.png)
1. Select **Azure SQL Database**, and click **Continue**.
1. In the **Set Properties** window, do the following steps:
    1. Set the name of the dataset to **SourceDataset**.
    1. Select **AzureSqlDatabaseLinkedService** for **Linked service**.
    1. Select **dbo.data_source_table** for **Table name**.
    1. Select **From connection/store** for **Import schema**.
    1. Click **OK**.

   ![Source Dataset Properties.](media/tutorial-incremental-copy-change-tracking-feature-portal/source-dataset-properties.png)
### Create a dataset to represent data copied to the sink data store

In this step, you create a dataset to represent the data that is copied from the source data store. You created the adftutorial container in your Azure Blob Storage as part of the prerequisites. Create the container if it does not exist, or set it to the name of an existing one. In this tutorial, the output file name is dynamically generated by using the expression: `@concat('Incremental-', pipeline().RunId, '.csv')`.
1. Click **+ (plus)** and click **Dataset** in the treeview under the **Author** tab, or click the ellipsis for Dataset actions.

   ![New Dataset menu 2.](media/tutorial-incremental-copy-change-tracking-feature-portal/new-dataset-menu.png)
1. Select **Azure Blob Storage**, and click **Continue**.
1. Select **DelimitedText** for the format of the data, and click **Continue**.
1. In the **Set properties** window, do the following steps:
    1. Change the name of the dataset to **SinkDataset**.
    1. Select **AzureBlobStorageLinkedService** for **Linked service**.
    1. Enter **adftutorial/incchgtracking** for the **folder** part of the **filePath**.
    1. Click **OK**.

   ![Sink dataset - Properties.](media/tutorial-incremental-copy-change-tracking-feature-portal/sink-dataset-properties.png)
1. After the dataset is visible in the treeview, do the following steps:
    1. In the **Connection** tab, click in the **File name** text box field. The **Add dynamic content [Alt+Shift+D]** option appears; click it.

       ![Sink Dataset - Setting dynamic file path.](media/tutorial-incremental-copy-change-tracking-feature-portal/sink-dataset-filepath.png)
    1. In the **Pipeline expression builder** window, paste the following in the text box field: `@concat('Incremental-',pipeline().RunId,'.csv')`
    1. Click **OK**.
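If it helps to see what the file name expression produces, here is a rough Python stand-in for `@concat('Incremental-', pipeline().RunId, '.csv')`. The run ID below is a made-up GUID, not the value ADF generates:

```python
import uuid

# Hypothetical stand-in for pipeline().RunId, which is a GUID string.
run_id = str(uuid.uuid4())

# Rough equivalent of @concat('Incremental-', pipeline().RunId, '.csv')
file_name = "Incremental-" + run_id + ".csv"
print(file_name)
```

Because the run ID is unique per pipeline run, each run writes a new blob instead of overwriting the previous one.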
### Create a dataset to represent change tracking data

In this step, you create a dataset for storing the change tracking version. You created the table table_store_ChangeTracking_version as part of the prerequisites.

1. In the treeview, click **+ (plus)**, and click **Dataset**.
1. Select **Azure SQL Database**, and click **Continue**.
1. In the **Set Properties** window, do the following steps:
    1. Set the name of the dataset to **ChangeTrackingDataset**.
    1. Select **AzureSqlDatabaseLinkedService** for **Linked service**.
    1. Select **dbo.table_store_ChangeTracking_version** for **Table name**.
    1. Select **From connection/store** for **Import schema**.
    1. Click **OK**.
## Create a pipeline for the full copy

In this step, you create a pipeline with a copy activity that copies the entire data from the source data store (Azure SQL Database) to the destination data store (Azure Blob Storage).
1. Click **+ (plus)** in the left pane, and click **Pipeline** > **Pipeline**.

   ![Screenshot shows the Pipeline option for a data factory.](media/tutorial-incremental-copy-change-tracking-feature-portal/new-pipeline-menu.png)
1. You see a new tab for configuring the pipeline. You also see the pipeline in the treeview. In the **Properties** window, change the name of the pipeline to **FullCopyPipeline**.
1. In the **Activities** toolbox, expand **Move & transform**, and drag-drop the **Copy data** activity to the pipeline designer surface, or search for it in the search bar under **Activities**. Set the name to **FullCopyActivity**.
1. Switch to the **Source** tab, and select **SourceDataset** for the **Source Dataset** field.
1. Switch to the **Sink** tab, and select **SinkDataset** for the **Sink Dataset** field.
1. To validate the pipeline definition, click **Validate** on the toolbar. Confirm that there is no validation error. Close the **Pipeline validation output** by clicking **Close**.
1. To publish entities (linked services, datasets, and pipelines), click **Publish all**. Wait until the publishing succeeds.
1. Wait until you see the **Successfully published** message.

   :::image type="content" source="./media/tutorial-incremental-copy-change-tracking-feature-portal/publishing-succeeded.png" alt-text="Publishing succeeded":::
1. You can also see notifications by clicking the **Show Notifications** button on the left. To close the notifications window, click **X** or the **Close** button at the bottom of the pane.
### Run the full copy pipeline
1. Click **Add trigger** on the toolbar for the pipeline, and click **Trigger now**.

   ![Screenshot shows the Trigger Now option selected from the Trigger menu.](media/tutorial-incremental-copy-change-tracking-feature-portal/trigger-now-menu.png)
1. Click **OK** in the **Pipeline run** window.

   ![Pipeline run confirmation with parameter check.](media/tutorial-incremental-copy-change-tracking-feature-portal/trigger-pipeline-run-confirmation.png)
### Monitor the full copy pipeline
1. Click the **Monitor** tab on the left. You see the pipeline run in the list along with its status. To refresh the list, click **Refresh**. Hover over the pipeline run for the option to **Rerun** or to check **Consumption**.

   ![Screenshot shows pipeline runs for a data factory.](media/tutorial-incremental-copy-change-tracking-feature-portal/monitor-full-copy-pipeline-run.png)
1. To view activity runs associated with the pipeline run, click the pipeline name in the **Pipeline name** column. There is only one activity in the pipeline, so you see only one entry in the list. To switch back to the pipeline runs view, click the **All pipeline runs** link at the top.
### Review the results
You see a file named `incremental-<GUID>.csv` in the `incchgtracking` folder of the `adftutorial` container.

![Output file from full copy.](media/tutorial-incremental-copy-change-tracking-feature-portal/full-copy-output-file.png)

The file should have the data from your database:

```
PersonID,Name,Age
1,"aaaa",21
2,"bbbb",24
3,"cccc",20
4,"dddd",26
5,"eeee",22
```
## Add more data to the source table

Run the following query against your database to add a row and update a row.

```sql
INSERT INTO data_source_table
(PersonID, Name, Age)
VALUES
(6, 'new', '50');

UPDATE data_source_table
SET [Age] = '10', [name] = 'update' WHERE [PersonID] = 1
```
## Create a pipeline for the delta copy

In this step, you create a pipeline with the following activities, and run it periodically. The **lookup activities** get the old and new SYS_CHANGE_VERSION from Azure SQL Database and pass them to the copy activity. The **copy activity** copies the inserted/updated/deleted data between the two SYS_CHANGE_VERSION values from Azure SQL Database to Azure Blob Storage. The **stored procedure activity** updates the value of SYS_CHANGE_VERSION for the next pipeline run.
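Before building the pipeline in the UI, it can help to see the three-activity pattern end to end. The sketch below is an illustration under stated assumptions: sqlite3 stands in for Azure SQL Database, a Python list stands in for the blob sink, and an integer column stands in for SYS_CHANGE_VERSION.

```python
import sqlite3

# Source table plus the version-watermark table from the prerequisites.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE data_source_table (PersonID INTEGER PRIMARY KEY, Name TEXT, Age INTEGER, sys_change_version INTEGER);
CREATE TABLE table_store_ChangeTracking_version (TableName TEXT, SYS_CHANGE_VERSION INTEGER);
INSERT INTO data_source_table VALUES (1, 'aaaa', 21, 1), (2, 'bbbb', 24, 2);
INSERT INTO table_store_ChangeTracking_version VALUES ('data_source_table', 2);
""")

# Changes arrive after the last load (versions 3 and 4).
conn.execute("INSERT INTO data_source_table VALUES (6, 'new', 50, 3)")
conn.execute("UPDATE data_source_table SET Age = 10, Name = 'update', sys_change_version = 4 WHERE PersonID = 1")

# Lookup activity 1: the version stored by the previous run.
(last_version,) = conn.execute(
    "SELECT SYS_CHANGE_VERSION FROM table_store_ChangeTracking_version WHERE TableName = 'data_source_table'"
).fetchone()

# Lookup activity 2: the current version (here simply the max in the table).
(current_version,) = conn.execute("SELECT MAX(sys_change_version) FROM data_source_table").fetchone()

# Copy activity: only rows changed between the two versions go to the "sink".
sink = conn.execute(
    "SELECT PersonID, Name, Age FROM data_source_table "
    "WHERE sys_change_version > ? AND sys_change_version <= ?",
    (last_version, current_version),
).fetchall()

# Stored procedure activity: persist the new version for the next run.
conn.execute(
    "UPDATE table_store_ChangeTracking_version SET SYS_CHANGE_VERSION = ? WHERE TableName = 'data_source_table'",
    (current_version,),
)
print(sorted(sink))  # [(1, 'update', 10), (6, 'new', 50)]
```

In the real pipeline, SQL Server's CHANGETABLE function plays the role of the version-stamped query, and the Update_ChangeTracking_Version stored procedure persists the watermark.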
1. In the Data Factory UI, switch to the **Author** tab. Click **+ (plus)** in the left pane treeview, and click **Pipeline** > **Pipeline**.

   ![Screenshot shows how to create a pipeline in a data factory.](media/tutorial-incremental-copy-change-tracking-feature-portal/new-pipeline-menu-2.png)
2. You see a new tab for configuring the pipeline. You also see the pipeline in the treeview. In the **Properties** window, change the name of the pipeline to **IncrementalCopyPipeline**.
3. Expand **General** in the **Activities** toolbox, and drag-drop the **Lookup** activity to the pipeline designer surface, or search for it in the **Search activities** box. Set the name of the activity to **LookupLastChangeTrackingVersionActivity**. This activity gets the change tracking version used in the last copy operation, which is stored in the table **table_store_ChangeTracking_version**.
4. Switch to the **Settings** tab in the **Properties** window, and select **ChangeTrackingDataset** for the **Source Dataset** field.
5. Drag-and-drop the **Lookup** activity from the **Activities** toolbox to the pipeline designer surface. Set the name of the activity to **LookupCurrentChangeTrackingVersionActivity**. This activity gets the current change tracking version.
6. Switch to the **Settings** tab in the **Properties** window, and do the following steps:
    1. Select **SourceDataset** for the **Source Dataset** field.
    2. Select **Query** for **Use Query**.
    3. Enter the following SQL query for **Query**.

       ```sql
       SELECT CHANGE_TRACKING_CURRENT_VERSION() as CurrentChangeTrackingVersion
       ```

       ![Screenshot shows a query added to the Settings tab in the Properties window.](media/tutorial-incremental-copy-change-tracking-feature-portal/second-lookup-activity-settings.png)
7. In the **Activities** toolbox, expand **Move & transform**, and drag-drop the **Copy data** activity to the pipeline designer surface. Set the name of the activity to **IncrementalCopyActivity**. This activity copies the data between the last change tracking version and the current change tracking version to the destination data store.
8. Switch to the **Source** tab in the **Properties** window, and do the following steps:
    1. Select **SourceDataset** for **Source Dataset**.
    2. Select **Query** for **Use Query**.
    3. Enter the following SQL query for **Query**.

       ```sql
       SELECT data_source_table.PersonID, data_source_table.Name, data_source_table.Age, CT.SYS_CHANGE_VERSION, SYS_CHANGE_OPERATION FROM data_source_table RIGHT OUTER JOIN CHANGETABLE(CHANGES data_source_table, @{activity('LookupLastChangeTrackingVersionActivity').output.firstRow.SYS_CHANGE_VERSION}) AS CT ON data_source_table.PersonID = CT.PersonID WHERE CT.SYS_CHANGE_VERSION <= @{activity('LookupCurrentChangeTrackingVersionActivity').output.firstRow.CurrentChangeTrackingVersion}
       ```

       ![Copy Activity - source settings.](media/tutorial-incremental-copy-change-tracking-feature-portal/inc-copy-source-settings.png)
9. Switch to the **Sink** tab, and select **SinkDataset** for the **Sink Dataset** field.
10. **Connect both Lookup activities to the Copy activity** one by one. Drag the **green** button attached to the **Lookup** activity to the **Copy** activity.
11. Drag-and-drop the **Stored Procedure** activity from the **Activities** toolbox to the pipeline designer surface. Set the name of the activity to **StoredProceduretoUpdateChangeTrackingActivity**. This activity updates the change tracking version in the **table_store_ChangeTracking_version** table.
12. Switch to the **Settings** tab, and do the following steps:
    1. Select **AzureSqlDatabaseLinkedService** for **Linked service**.
    2. For **Stored procedure name**, select **Update_ChangeTracking_Version**.
    3. Select **Import**.
    4. In the **Stored procedure parameters** section, specify the following values for the parameters:
       | Name | Type | Value |
       | ---- | ---- | ----- |
       | CurrentTrackingVersion | Int64 | @{activity('LookupCurrentChangeTrackingVersionActivity').output.firstRow.CurrentChangeTrackingVersion} |
       | TableName | String | @{activity('LookupLastChangeTrackingVersionActivity').output.firstRow.TableName} |
    ![Stored Procedure Activity - Parameters.](media/tutorial-incremental-copy-change-tracking-feature-portal/stored-procedure-parameters.png)

13. **Connect the Copy activity to the Stored Procedure activity**. Drag the **green** button attached to the Copy activity to the Stored Procedure activity.
- :::image type="content" source="./media/tutorial-incremental-copy-change-tracking-feature-portal/connect-copy-stored-procedure.png" alt-text="Connect Copy and Stored Procedure activities":::
-15. Click **Validate** on the toolbar. Confirm that there are no validation errors. Close the **Pipeline Validation Report** window by clicking **>>**.
+14. Click **Validate** on the toolbar. Confirm that there are no validation errors. Close the **Pipeline Validation Report** window by clicking **Close**.
+15. Publish entities (linked services, datasets, and pipelines) to the Data Factory service by clicking the **Publish All** button. Wait until you see the **Publishing succeeded** message.
- :::image type="content" source="./media/tutorial-incremental-copy-change-tracking-feature-portal/validate-button.png" alt-text="Validate button":::
-16. Publish entities (linked services, datasets, and pipelines) to the Data Factory service by clicking the **Publish All** button. Wait until you see the **Publishing succeeded** message.
-
- :::image type="content" source="./media/tutorial-incremental-copy-change-tracking-feature-portal/publish-button-2.png" alt-text="Screenshot shows the Publish All button for a data factory.":::
+ ![Screenshot shows the Publish All button for a data factory.](media/tutorial-incremental-copy-change-tracking-feature-portal/publish-button-2.png)
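The `@{...}` expressions bound to the stored procedure parameters in step 12 are evaluated by the service against the Lookup activities' outputs at run time. As a toy illustration only (mock outputs with hypothetical values, not the real Data Factory expression engine), the binding works like this:

```python
import re

# Mock outputs of the two Lookup activities (hypothetical values).
activity_output = {
    "LookupCurrentChangeTrackingVersionActivity":
        {"firstRow": {"CurrentChangeTrackingVersion": 5}},
    "LookupLastChangeTrackingVersionActivity":
        {"firstRow": {"TableName": "data_source_table"}},
}

def resolve(expr: str) -> str:
    """Resolve @{activity('Name').output.firstRow.Field} against the mocks."""
    pattern = r"@\{activity\('([^']+)'\)\.output\.firstRow\.(\w+)\}"
    return re.sub(
        pattern,
        lambda m: str(activity_output[m.group(1)]["firstRow"][m.group(2)]),
        expr,
    )

print(resolve("@{activity('LookupLastChangeTrackingVersionActivity').output.firstRow.TableName}"))
# -> data_source_table
```

In the service itself these expressions are re-evaluated on every pipeline run; the sketch only illustrates how the parameter values come from the Lookup results.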
### Run the incremental copy pipeline

1. Click **Trigger** on the toolbar for the pipeline, and click **Trigger Now**.
- :::image type="content" source="./media/tutorial-incremental-copy-change-tracking-feature-portal/trigger-now-menu-2.png" alt-text="Screenshot shows a pipeline with activities and the Trigger Now option selected from the Trigger menu.":::
-2. In the **Pipeline Run** window, select **Finish**.
-
+ ![Screenshot shows a pipeline with activities and the Trigger Now option selected from the Trigger menu.](media/tutorial-incremental-copy-change-tracking-feature-portal/trigger-now-menu-2.png)
+2. In the **Pipeline Run** window, select **OK**.
### Monitor the incremental copy pipeline
-1. Click the **Monitor** tab on the left. You see the pipeline run in the list and its status. To refresh the list, click **Refresh**. The links in the **Actions** column let you view activity runs associated with the pipeline run and to rerun the pipeline.
-
- :::image type="content" source="./media/tutorial-incremental-copy-change-tracking-feature-portal/inc-copy-pipeline-runs.png" alt-text="Screenshot shows pipeline runs for a data factory including your pipeline.":::
-2. To view activity runs associated with the pipeline run, click the **View Activity Runs** link in the **Actions** column. There is only one activity in the pipeline, so you see only one entry in the list. To switch back to the pipeline runs view, click **Pipelines** link at the top.
-
- :::image type="content" source="./media/tutorial-incremental-copy-change-tracking-feature-portal/inc-copy-activity-runs.png" alt-text="Screenshot shows pipeline runs for a data factory with several marked as succeeded.":::
+1. Click the **Monitor** tab on the left. You see the pipeline run in the list and its status. To refresh the list, click **Refresh**. The links in the **Pipeline name** column let you view activity runs associated with the pipeline run and rerun the pipeline.
+ ![Screenshot shows pipeline runs for a data factory including your pipeline.](media/tutorial-incremental-copy-change-tracking-feature-portal/inc-copy-pipeline-runs.png)
+1. To view activity runs associated with the pipeline run, click the **IncrementalCopyPipeline** link in the **Pipeline name** column.
+ ![Screenshot shows pipeline runs for a data factory with several marked as succeeded.](media/tutorial-incremental-copy-change-tracking-feature-portal/inc-copy-activity-runs.png)
### Review the results

You see the second file in the `incchgtracking` folder of the `adftutorial` container.
-The file should have only the delta data from your database. The record with `U` is the updated row in the database and `I` is the one added row.
+![Output file from incremental copy.](media/tutorial-incremental-copy-change-tracking-feature-portal/incremental-copy-output-file.png)
+
+The file should have only the delta data from your database. The record with `U` is the updated row in the database and `I` is the one added row.
```
+PersonID,Name,Age,SYS_CHANGE_VERSION,SYS_CHANGE_OPERATION
1,update,10,2,U
6,new,50,1,I
```
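A downstream consumer can apply such a delta file by upserting rows flagged `I` (insert) or `U` (update) and removing rows flagged `D` (delete). A minimal sketch, assuming a target table keyed on `PersonID` (the pre-existing row contents below are hypothetical):

```python
import csv, io

# The delta file shown above, inlined for the sketch.
delta_csv = """\
PersonID,Name,Age,SYS_CHANGE_VERSION,SYS_CHANGE_OPERATION
1,update,10,2,U
6,new,50,1,I
"""

# Target table keyed by PersonID; starting contents are hypothetical.
table = {"1": {"Name": "aaaa", "Age": "21"}}

for row in csv.DictReader(io.StringIO(delta_csv)):
    if row["SYS_CHANGE_OPERATION"] in ("I", "U"):   # upsert inserted/updated rows
        table[row["PersonID"]] = {"Name": row["Name"], "Age": row["Age"]}
    elif row["SYS_CHANGE_OPERATION"] == "D":        # drop deleted rows
        table.pop(row["PersonID"], None)

print(table["1"]["Name"], table["6"]["Name"])  # -> update new
```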
The first three columns are changed data from data_source_table. The last two co
PersonID Name Age SYS_CHANGE_VERSION SYS_CHANGE_OPERATION
==================================================================
1 update 10 2 U
-6 new 50 1 I
+6 new 50 1 I
```

## Next steps
Advance to the following tutorial to learn about copying new and changed files o
> [!div class="nextstepaction"] > [Copy new files by lastmodifieddate](tutorial-incremental-copy-lastmodified-copy-data-tool.md)++++
data-factory Whats New Archive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/whats-new-archive.md
Check out our [What's New video archive](https://www.youtube.com/playlist?list=P
## June 2022
+### Video summary
+
+> [!VIDEO https://www.youtube.com/embed?v=Ay3tsJe_vMM&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=3]
+
<table>
<tr><td><b>Service category</b></td><td><b>Service improvements</b></td><td><b>Details</b></td></tr>
<tr><td rowspan=3><b>Data flow</b></td><td>Fuzzy join supported for data flows</td><td>Fuzzy join is now supported in Join transformation of data flows with configurable similarity score on join conditions.<br><a href="data-flow-join.md#fuzzy-join">Learn more</a></td></tr>
Check out our [What's New video archive](https://www.youtube.com/playlist?list=P
<tr><td><b>Orchestration</b></td><td>'turnOffAsync' property is available in Web activity</td><td>Web activity supports an async request-reply pattern that invokes HTTP GET on the Location field in the response header of an HTTP 202 Response. It helps web activity automatically poll the monitoring end-point till the job runs. 'turnOffAsync' property is supported to disable this behavior in cases where polling isn't needed<br><a href="control-flow-web-activity.md#type-properties">Learn more</a></td></tr> </table> -
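The async request-reply pattern described in the Orchestration row (and disabled by `turnOffAsync`) boils down to: send the request, and while the service keeps answering HTTP 202, poll the URL from the `Location` response header until a final status arrives. A simulation of that loop with fake responses (this is not the actual Web activity implementation):

```python
def follow_async_reply(responses):
    """Return the final body from a stream of simulated (status, headers, body)
    HTTP exchanges, retrying while the service answers 202 Accepted."""
    status, headers, body = next(responses)
    while status == 202:
        # A real client would HTTP GET headers["Location"] here, then wait.
        status, headers, body = next(responses)
    return body

simulated = iter([
    (202, {"Location": "https://example.invalid/jobs/1"}, b""),   # job accepted
    (202, {"Location": "https://example.invalid/jobs/1"}, b""),   # still running
    (200, {}, b'{"status":"Succeeded"}'),                         # finished
])
print(follow_async_reply(simulated))  # -> b'{"status":"Succeeded"}'
```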
-### Video summary
-
-> [!VIDEO https://www.youtube.com/embed?v=Ay3tsJe_vMM&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=3]
-
## May 2022

<table>
Check out our [What's New video archive](https://www.youtube.com/playlist?list=P
## March 2022
+### Video summary
+
+> [!VIDEO https://www.youtube.com/embed?v=MkgBxFyYwhQ&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=2]
+
<table>
<tr><td><b>Service category</b></td><td><b>Service improvements</b></td><td><b>Details</b></td></tr>
Check out our [What's New video archive](https://www.youtube.com/playlist?list=P
</table>
-### Video summary
+## February 2022
-> [!VIDEO https://www.youtube.com/embed?v=MkgBxFyYwhQ&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=2]
+### Video summary
-## February 2022
+> [!VIDEO https://www.youtube.com/embed?v=r22nthp-f4g&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=1]
<table>
<tr><td><b>Service category</b></td><td><b>Service improvements</b></td><td><b>Details</b></td></tr>
Check out our [What's New video archive](https://www.youtube.com/playlist?list=P
<tr><td><b>Security</b></td><td>Connect to an Azure DevOps account in another Azure Active Directory (Azure AD) tenant</td><td>You can connect your Data Factory instance to an Azure DevOps account in a different Azure AD tenant for source control purposes.<br><a href="cross-tenant-connections-to-azure-devops.md">Learn more</a></td></tr> </table>
-### Video summary
-
-> [!VIDEO https://www.youtube.com/embed?v=r22nthp-f4g&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=1]
-
## January 2022

<table>
Check out our [What's New video archive](https://www.youtube.com/playlist?list=P
- [Blog - Azure Data Factory](https://techcommunity.microsoft.com/t5/azure-data-factory/bg-p/AzureDataFactoryBlog)
- [Stack Overflow forum](https://stackoverflow.com/questions/tagged/azure-data-factory)
- [Twitter](https://twitter.com/AzDataFactory?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor)
-- [Videos](https://www.youtube.com/channel/UC2S0k7NeLcEm5_IhHUwpN0g/featured)
+- [Videos](https://www.youtube.com/channel/UC2S0k7NeLcEm5_IhHUwpN0g/featured)
data-factory Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/whats-new.md
Check out our [What's New video archive](https://www.youtube.com/playlist?list=P
## September 2022
+### Video summary
+
+> [!VIDEO https://www.youtube.com/embed?v=Bh_VA8n-SL8&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=6]
### Data flow

- Amazon S3 source connector added [Learn more](connector-amazon-simple-storage-service.md?tabs=data-factory)
DELETE method in the Web activity now supports sending a body with HTTP request
- Native UI support of parameterization added for 6 additional linked services – SAP ODP, ODBC, Microsoft Access, Informix, Snowflake, and DB2 [Learn more](parameterize-linked-services.md?tabs=data-factory#supported-linked-service-types)
- Pipeline designer enhancements added in Studio Preview experience – users can view workflow inside pipeline objects like For Each, If Then, etc. [Learn more](https://techcommunity.microsoft.com/t5/azure-data-factory-blog/azure-data-factory-updated-pipeline-designer/ba-p/3618755)
-### Video summary
-
-> [!VIDEO https://www.youtube.com/embed?v=Bh_VA8n-SL8&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=7]
+## August 2022
+### Video summary
-## August 2022
+> [!VIDEO https://www.youtube.com/embed?v=KCJ2F6Y_nfo&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=5]
### Data flow

- Appfigures connector added as Source (Preview) [Learn more](connector-appfigures.md)
Service principal authentication type added for Azure Blob storage [Learn more](
### Continuous integration and continuous delivery (CI/CD) When CI/CD integrating ARM template, instead of turning off all triggers, it can exclude triggers that didn't change in deployment [Learn more](https://techcommunity.microsoft.com/t5/azure-data-factory-blog/ci-cd-improvements-related-to-pipeline-triggers-deployment/ba-p/3605064)
-### Video summary
+## July 2022
-> [!VIDEO https://www.youtube.com/embed?v=KCJ2F6Y_nfo&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=5]
+### Video summary
-## July 2022
+> [!VIDEO https://www.youtube.com/embed?v=EOVVt4qYvZI&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=4]
### Data flow
Include Global parameters supported in ARM template. [Learn more](https://techco
Be a part of Azure Data Factory studio preview features - Experience the latest Azure Data Factory capabilities and be the first to share your feedback [Learn more](https://techcommunity.microsoft.com/t5/azure-data-factory-blog/introducing-the-azure-data-factory-studio-preview-experience/ba-p/3563880)
-### Video summary
-
-> [!VIDEO https://www.youtube.com/embed?v=EOVVt4qYvZI&list=PLt4mCx89QIGS1rQlNt2-7iuHHAKSomVLv&index=4]
-
## More information

- [What's new archive](whats-new-archive.md)
data-share Disaster Recovery https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-share/disaster-recovery.md
Title: Disaster recovery for Azure Data Share description: Disaster recovery for Azure Data Share--++ Previously updated : 01/03/2022 Last updated : 10/27/2022 # Disaster recovery for Azure Data Share
data-share How To Add Datasets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-share/how-to-add-datasets.md
Title: Add datasets to an existing Azure Data Share description: Learn how to add datasets to an existing data share in Azure Data Share and share with the same recipients.--++ Previously updated : 01/03/2022 Last updated : 10/27/2022 # How to add datasets to an existing share in Azure Data Share
data-share How To Configure Mapping https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-share/how-to-configure-mapping.md
Title: Configure a dataset mapping in Azure Data Share description: Learn how to configure a dataset mapping for a received share using Azure Data Share.--++ Previously updated : 01/03/2022 Last updated : 10/27/2022 # How to configure a dataset mapping for a received share in Azure Data Share
data-share How To Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-share/how-to-monitor.md
Title: How to monitor Azure Data Share description: Learn how to monitor invitation status, share subscriptions, and snapshot history in Azure Data Share --++ Previously updated : 01/03/2022 Last updated : 10/27/2022 # Monitor Azure Data Share
data-share Samples Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-share/samples-powershell.md
Title: Azure PowerShell Samples for Azure Data Share description: Learn about Azure PowerShell Sample scripts to help you create and manage data shares in Azure Data Share. --++ Previously updated : 01/03/2022 Last updated : 10/27/2022 # Azure PowerShell samples for Azure Data Share
data-share Share Your Data Bicep https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-share/share-your-data-bicep.md
Title: 'Share outside your org (Bicep) - Azure Data Share quickstart' description: Learn how to share data with customers and partners using Azure Data Share and Bicep.--++ Previously updated : 04/04/2022 Last updated : 10/27/2022
devtest-labs How To Move Labs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/devtest-labs/how-to-move-labs.md
In this article, you'll learn how to:
- For preview features, ensure that your subscription is allowlisted for the target region. -- DevTest Labs doesn't store them nor expose passwords from the exported ARM template. You will need to know the passwords/secrets for:
+- DevTest Labs doesn't store or expose passwords from the exported ARM template. You will need to know the passwords/secrets for:
- the VMs - the Stored Secrets
dms Tutorial Azure Postgresql To Azure Postgresql Online Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/tutorial-azure-postgresql-to-azure-postgresql-online-portal.md
To complete all the database objects like table schemas, indexes and stored proc
1. Use the pg_dump -s command to create a schema dump file for a database.

```
- pg_dump -o -h hostname -U db_username -d db_name -s > your_schema.sql
+ pg_dump -O -h hostname -U db_username -d db_name -s > your_schema.sql
```

For example, to create a schema dump file for the **dvdrental** database:

```
- pg_dump -o -h mypgserver-source.postgres.database.azure.com -U pguser@mypgserver-source -d dvdrental -s -O -x > dvdrentalSchema.sql
+ pg_dump -O -h mypgserver-source.postgres.database.azure.com -U pguser@mypgserver-source -d dvdrental -s -x > dvdrentalSchema.sql
```

For more information about using the pg_dump utility, see the examples in the [pg-dump](https://www.postgresql.org/docs/9.6/static/app-pgdump.html#PG-DUMP-EXAMPLES) tutorial.
dms Tutorial Postgresql Azure Postgresql Online Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/tutorial-postgresql-azure-postgresql-online-portal.md
To complete all the database objects like table schemas, indexes and stored proc
1. Use the pg_dump -s command to create a schema dump file for a database.

```
- pg_dump -o -h hostname -U db_username -d db_name -s > your_schema.sql
+ pg_dump -O -h hostname -U db_username -d db_name -s > your_schema.sql
```

For example, to create a schema dump file for the **dvdrental** database:

```
- pg_dump -o -h localhost -U postgres -d dvdrental -s -O -x > dvdrentalSchema.sql
+ pg_dump -O -h localhost -U postgres -d dvdrental -s -x > dvdrentalSchema.sql
```

For more information about using the pg_dump utility, see the examples in the [pg-dump](https://www.postgresql.org/docs/9.6/static/app-pgdump.html#PG-DUMP-EXAMPLES) tutorial.
dms Tutorial Postgresql Azure Postgresql Online https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/tutorial-postgresql-azure-postgresql-online.md
To complete all the database objects like table schemas, indexes and stored proc
1. Use the pg_dump -s command to create a schema dump file for a database.

```
- pg_dump -o -h hostname -U db_username -d db_name -s > your_schema.sql
+ pg_dump -O -h hostname -U db_username -d db_name -s > your_schema.sql
```

For example, to dump a schema file for the **dvdrental** database:

```
- pg_dump -o -h localhost -U postgres -d dvdrental -s > dvdrentalSchema.sql
+ pg_dump -O -h localhost -U postgres -d dvdrental -s > dvdrentalSchema.sql
```

For more information about using the pg_dump utility, see the examples in the [pg-dump](https://www.postgresql.org/docs/9.6/static/app-pgdump.html#PG-DUMP-EXAMPLES) tutorial.
dms Tutorial Rds Postgresql Server Azure Db For Postgresql Online https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/tutorial-rds-postgresql-server-azure-db-for-postgresql-online.md
To complete this tutorial, you need to:
The easiest way to migrate only the schema is to use pg_dump with the -s option. For more information, see the [examples](https://www.postgresql.org/docs/9.6/app-pgdump.html#PG-DUMP-EXAMPLES) in the Postgres pg_dump tutorial.

```
- pg_dump -o -h hostname -U db_username -d db_name -s > your_schema.sql
+ pg_dump -O -h hostname -U db_username -d db_name -s > your_schema.sql
```

For example, to dump a schema file for the **dvdrental** database, use the following command:

```
- pg_dump -o -h localhost -U postgres -d dvdrental -s > dvdrentalSchema.sql
+ pg_dump -O -h localhost -U postgres -d dvdrental -s > dvdrentalSchema.sql
```

2. Create an empty database in the target service, which is Azure Database for PostgreSQL. To connect and create a database, refer to one of the following articles:
dms Tutorial Sql Server Managed Instance Online Ads https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/dms/tutorial-sql-server-managed-instance-online-ads.md
Title: "Tutorial: Migrate SQL Server to Azure SQL Managed Instance online using Azure Data Studio"-+ description: Migrate SQL Server to an Azure SQL Managed Instance online using Azure Data Studio with Azure Database Migration Service
energy-data-services How To Convert Segy To Ovds https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/energy-data-services/how-to-convert-segy-to-ovds.md
OSDU&trade; is a trademark of The Open Group.
## Next steps

<!-- Add a context sentence for the following links -->
> [!div class="nextstepaction"]
-> [How to convert a segy to zgy file](/how-to-convert-segy-to-zgy.md)
+> [How to convert a segy to zgy file](/azure/energy-data-services/how-to-convert-segy-to-zgy)
energy-data-services How To Convert Segy To Zgy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/energy-data-services/how-to-convert-segy-to-zgy.md
Seismic data stored in industry standard SEG-Y format can be converted to ZGY fo
> [!NOTE] > See [How to generate a refresh token](how-to-generate-refresh-token.md). Once you've generated the token, store it in a place where you'll be able to access it in the future.
-8. Run the following commands using **sdutil** to see its working fine. Follow the directions in [Setup and Usage for Azure env](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/tree/azure/stable#setup-and-usage-for-azure-env). Understand that depending on your OS and Python version, you may have to run `python3` command as opposed to `python`. If you run into errors with these commands, refer to the [SDUTIL tutorial](/tutorials/tutorial-seismic-ddms-sdutil.md).
+8. Run the following commands using **sdutil** to verify that it's working correctly. Follow the directions in [Setup and Usage for Azure env](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/tree/azure/stable#setup-and-usage-for-azure-env). Understand that depending on your OS and Python version, you may have to run the `python3` command as opposed to `python`. If you run into errors with these commands, refer to the [SDUTIL tutorial](/azure/energy-data-services/tutorial-seismic-ddms-sdutil).
> [!NOTE] > when running `python sdutil config init`, you don't need to enter anything when prompted with `Insert the azure (azureGlabEnv) application key:`.
Seismic data stored in industry standard SEG-Y format can be converted to ZGY fo
10. Create the manifest file (otherwise known as the records file)
- ZGY conversion uses a manifest file that you'll upload to your storage account in order to run the conversion. This manifest file is created by using multiple JSON files and running a script. The JSON files for this process are stored [here](https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-zgy-conversion/-/tree/master/doc/sample-records/volve). For more information on Volve, where the dataset definitions come from, visit [their website](https://www.equinor.com/en/what-we-do/digitalisation-in-our-dna/volve-field-data-village-download.html). Complete the following steps in order to create the manifest file:
+ ZGY conversion uses a manifest file that you'll upload to your storage account in order to run the conversion. This manifest file is created by using multiple JSON files and running a script. The JSON files for this process are stored [here](https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-zgy-conversion/-/tree/master/doc/sample-records/volve). For more information on Volve, where the dataset definitions come from, visit [their website](https://www.equinor.com/energy/volve-data-sharing). Complete the following steps in order to create the manifest file:
* Clone the [repo](https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-zgy-conversion/-/tree/master/) and navigate to the folder doc/sample-records/volve * Edit the values in the `prepare-records.sh` bash script:
OSDU&trade; is a trademark of The Open Group.
## Next steps

<!-- Add a context sentence for the following links -->
> [!div class="nextstepaction"]
-> [How to convert segy to ovds](/how-to-convert-segy-to-ovds.md)
+> [How to convert segy to ovds](/azure/energy-data-services/how-to-convert-segy-to-ovds)
event-grid Blob Event Quickstart Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/blob-event-quickstart-portal.md
Title: 'Use Azure Event Grid to send Blob storage events to web endpoint - portal' description: 'Quickstart: Use Azure Event Grid and Azure portal to create Blob storage account, and subscribe its events. Send the events to a Webhook.' Previously updated : 07/01/2021 Last updated : 10/27/2022
When you're finished, you see that the event data has been sent to the web app.
3. Enter the name for your storage account.
1. Select the **Region** in which you want the storage account to be created.
1. For **Redundancy**, select **Locally-redundant storage (LRS)** from the drop-down list.
- 1. Select **Review + create**.
+ 1. Select **Review** at the bottom of the page.
:::image type="content" source="./media/blob-event-quickstart-portal/create-storage-account-page.png" alt-text="Screenshot showing the Create a storage account page.":::
- 5. On the **Review + create** page, review the settings, and select **Create**.
+ 5. On the **Review** page, review the settings, and select **Create**.
>[!NOTE] > Only storage accounts of kind **StorageV2 (general purpose v2)** and **BlobStorage** support event integration. **Storage (general purpose v1)** does *not* support integration with Event Grid.
+1. The deployment may take a few minutes to complete. On the **Deployment** page, select **Go to resource**.
+
+ :::image type="content" source="./media/blob-event-quickstart-portal/go-to-resource-link.png" alt-text="Screenshot showing the deployment succeeded page with a link to go to the resource.":::
+1. On the **Storage account** page, select **Events** on the left menu.
+
+ :::image type="content" source="./media/blob-event-quickstart-portal/events-page.png" alt-text="Screenshot showing the Events page for an Azure storage account.":::
+1. Keep this page in the web browser open.
## Create a message endpoint

Before subscribing to the events for the Blob storage, let's create the endpoint for the event message. Typically, the endpoint takes actions based on the event data. To simplify this quickstart, you deploy a [pre-built web app](https://github.com/Azure-Samples/azure-event-grid-viewer) that displays the event messages. The deployed solution includes an App Service plan, an App Service web app, and source code from GitHub.
Before subscribing to the events for the Blob storage, let's create the endpoint
<a href="https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure-Samples%2Fazure-event-grid-viewer%2Fmaster%2Fazuredeploy.json" target="_blank"><img src="../media/template-deployments/deploy-to-azure.svg" alt="Button to deploy to Azure."></a> 2. On the **Custom deployment** page, do the following steps:
- 1. For **Resource group**, select the resource group that you created when creating the storage account. It will be easier for you to clean up after you are done with the tutorial by deleting the resource group.
+ 1. For **Resource group**, select the resource group that you created when creating the storage account. It will be easier for you to clean up after you're done with the tutorial by deleting the resource group.
2. For **Site Name**, enter a name for the web app.
3. For **Hosting plan name**, enter a name for the App Service plan to use for hosting the web app.
4. Select **Review + create**.

    :::image type="content" source="./media/blob-event-quickstart-portal/template-deploy-parameters.png" alt-text="Screenshot showing the Custom deployment page.":::
1. On the **Review + create** page, select **Create**.
-1. The deployment may take a few minutes to complete. Select Alerts (bell icon) in the portal, and then select **Go to resource group**.
+1. The deployment may take a few minutes to complete. On the **Deployment** page, select **Go to resource group**.
- ![Alert - navigate to resource group.](./media/blob-event-quickstart-portal/navigate-resource-group.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/navigate-resource-group.png" alt-text="Screenshot showing the deployment succeeded page with a link to go to the resource group.":::
4. On the **Resource group** page, in the list of resources, select the web app that you created. You also see the App Service plan and the storage account in this list.
- ![Select web site.](./media/blob-event-quickstart-portal/resource-group-resources.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/resource-group-resources.png" alt-text="Screenshot that shows the selection of web app in the resource group.":::
5. On the **App Service** page for your web app, select the URL to navigate to the web site. The URL should be in this format: `https://<your-site-name>.azurewebsites.net`.
-
- ![Navigate to web site.](./media/blob-event-quickstart-portal/web-site.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/web-site.png" alt-text="Screenshot that shows the selection of link to navigate to web app.":::
6. Confirm that you see the site but no events have been posted to it yet.

    ![View new site.](./media/blob-event-quickstart-portal/view-site.png)
Before subscribing to the events for the Blob storage, let's create the endpoint
You subscribe to a topic to tell Event Grid which events you want to track, and where to send the events.
-1. In the portal, navigate to your Azure Storage account that you created earlier. On the left menu, select **All resources** and select your storage account.
-2. On the **Storage account** page, select **Events** on the left menu.
-1. Select **More Options**, and **Web Hook**. You are sending events to your viewer app using a web hook for the endpoint.
+1. If you closed the **Storage account** page, navigate to your Azure Storage account that you created earlier. On the left menu, select **All resources** and select your storage account.
+1. On the **Storage account** page, select **Events** on the left menu.
+1. Select **More Options**, and **Web Hook**. You're sending events to your viewer app using a web hook for the endpoint.
- ![Select web hook.](./media/blob-event-quickstart-portal/select-web-hook.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/select-web-hook.png" alt-text="Screenshot showing the selection of Web Hook on the Events page.":::
3. On the **Create Event Subscription** page, do the following steps:
    1. Enter a **name** for the event subscription.
    2. Enter a **name** for the **system topic**. To learn about system topics, see [Overview of system topics](system-topics.md).
- ![Enter names for event subscription and system topic.](./media/blob-event-quickstart-portal/event-subscription-name-system-topic.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/event-subscription-name-system-topic.png" alt-text="Screenshot showing the Create Event Subscription page with a name for the system topic.":::
2. Select **Web Hook** for **Endpoint type**.
- ![Select web hook endpoint type.](./media/blob-event-quickstart-portal/select-web-hook-end-point-type.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/select-web-hook-end-point-type.png" alt-text="Screenshot showing the Create Event Subscription page with Web Hook selected as an endpoint.":::
4. For **Endpoint**, click **Select an endpoint**, and enter the URL of your web app and add `api/updates` to the home page URL (for example: `https://spegridsite.azurewebsites.net/api/updates`), and then select **Confirm Selection**.
- ![Confirm endpoint selection.](./media/blob-event-quickstart-portal/confirm-endpoint-selection.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/confirm-endpoint-selection.png" lightbox="./media/blob-event-quickstart-portal/confirm-endpoint-selection.png" alt-text="Screenshot showing the Select Web Hook page.":::
5. Now, on the **Create Event Subscription** page, select **Create** to create the event subscription.
- ![Select logs.](./media/blob-event-quickstart-portal/create-subscription.png)
-
+ :::image type="content" source="./media/blob-event-quickstart-portal/create-subscription.png" alt-text="Screenshot showing the Create Event Subscription page with all fields selected.":::
1. View your web app again, and notice that a subscription validation event has been sent to it. Select the eye icon to expand the event data. Event Grid sends the validation event so the endpoint can verify that it wants to receive event data. The web app includes code to validate the subscription.
- ![View subscription event.](./media/blob-event-quickstart-portal/view-subscription-event.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/view-subscription-event.png" alt-text="Screenshot showing the Event Grid Viewer with the subscription validation event.":::
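The validation succeeds because the endpoint echoes the validation code back to Event Grid. A minimal sketch of that handshake logic (the deployed viewer app's actual code differs; the payload shape below is the standard `SubscriptionValidationEvent`):

```python
import json

def handle_event_grid_post(body: str):
    """Answer an Event Grid webhook POST: echo the validation code for
    subscription-validation events, otherwise signal normal processing."""
    for event in json.loads(body):
        if event.get("eventType") == "Microsoft.EventGrid.SubscriptionValidationEvent":
            return {"validationResponse": event["data"]["validationCode"]}
    return None  # regular events: process them instead

sample = json.dumps([{
    "eventType": "Microsoft.EventGrid.SubscriptionValidationEvent",
    "data": {"validationCode": "512d38b6-c7b8-40c8-89fe-f46f9e9622b6"},
}])
print(handle_event_grid_post(sample))
```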
Now, let's trigger an event to see how Event Grid distributes the message to your endpoint.
You trigger an event for the Blob storage by uploading a file. The file doesn't
1. In the Azure portal, navigate to your Blob storage account, and select **Containers** on the left menu.
1. Select **+ Container**. Give your container a name, and use any access level, and select **Create**.
- ![Add container.](./media/blob-event-quickstart-portal/add-container.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/add-container.png" alt-text="Screenshot showing the New container page.":::
1. Select your new container.
- ![Select container.](./media/blob-event-quickstart-portal/select-container.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/select-container.png" alt-text="Screenshot showing the selection of the container.":::
1. To upload a file, select **Upload**. On the **Upload blob** page, browse and select a file that you want to upload for testing, and then select **Upload** on that page.
- ![Select upload.](./media/blob-event-quickstart-portal/upload-file.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/upload-file.png" alt-text="Screenshot showing Upload blob page.":::
1. Browse to your test file and upload it. 1. You've triggered the event, and Event Grid sent the message to the endpoint you configured when subscribing. The message is in JSON format and contains an array with one or more events. In the following example, the JSON message contains an array with one event. View your web app and notice that a **blob created** event was received.
- ![Blob created event.](./media/blob-event-quickstart-portal/blob-created-event.png)
+ :::image type="content" source="./media/blob-event-quickstart-portal/blob-created-event.png" alt-text="Screenshot showing the Event Grid Viewer page with the Blob Created event.":::
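A subscriber can pull the blob URL out of that JSON array with a few lines of code. The following is an illustrative sketch (the sample payload is abbreviated and hypothetical; the field names follow the Blob Created event schema):

```python
import json

# Abbreviated, hypothetical Blob Created event as delivered by Event Grid.
sample_message = """[{
  "eventType": "Microsoft.Storage.BlobCreated",
  "subject": "/blobServices/default/containers/testcontainer/blobs/testfile.txt",
  "data": {"api": "PutBlob", "url": "https://myaccount.blob.core.windows.net/testcontainer/testfile.txt"}
}]"""

# The payload is always an array; iterate and filter on eventType.
for event in json.loads(sample_message):
    if event["eventType"] == "Microsoft.Storage.BlobCreated":
        print(event["data"]["url"])
```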
## Clean up resources
event-grid Communication Services Voice Video Events https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/communication-services-voice-video-events.md
This section contains an example of what that data would look like for each even
[ { "id": "7283825e-f8f1-4c61-a9ea-752c56890500",
- "topic": "/subscriptions/{subscription-id}/resourcegroups/}{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
- "subject": "/recording/call/{call-id}/recordingId/{recording-id}",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "/recording/call/{call-id}/serverCallId/{server-call-id}/recordingId/{recording-id}",
"data": { "recordingStorageInfo": { "recordingChunks": [
This section contains an example of what that data would look like for each even
"endReason": "SessionEnded", "contentLocation": "https://storage.asm.skype.com/v1/objects/0-eus-d12-801b3f3fc462fe8a01e6810cbff729b8/content/video", "metadataLocation": "https://storage.asm.skype.com/v1/objects/0-eus-d12-801b3f3fc462fe8a01e6810cbff729b8/content/acsmetadata"
+ "deleteLocation": "https://us-storage.asm.skype.com/v1/objects/0-eus-d1-83e9599991e21ad21220427d78fbf558"
} ] },
event-hubs Event Hubs Capture Enable Through Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-hubs/event-hubs-capture-enable-through-portal.md
Title: Event Hubs - Capture streaming events using Azure portal description: This article describes how to enable capturing of events streaming through Azure Event Hubs by using the Azure portal. Previously updated : 09/16/2021 Last updated : 10/27/2021 # Enable capturing of events streaming through Azure Event Hubs
-Azure [Event Hubs Capture][capture-overview] enables you to automatically deliver the streaming data in Event Hubs to an [Azure Blob storage](https://azure.microsoft.com/services/storage/blobs/) or [Azure Data Lake Storage Gen1 or Gen 2](https://azure.microsoft.com/services/data-lake-store/) account of your choice.
-
-You can configure Capture at the event hub creation time using the [Azure portal](https://portal.azure.com). You can either capture the data to an Azure [Blob storage](https://azure.microsoft.com/services/storage/blobs/) container, or to an [Azure Data Lake Storage Gen 1 or Gen 2](https://azure.microsoft.com/services/data-lake-store/) account.
-
-For more information, see the [Event Hubs Capture overview][capture-overview].
+Azure [Event Hubs Capture][capture-overview] enables you to automatically deliver the streaming data in Event Hubs to an [Azure Blob storage](https://azure.microsoft.com/services/storage/blobs/) or [Azure Data Lake Storage Gen1 or Gen 2](https://azure.microsoft.com/services/data-lake-store/) account of your choice. You can configure capture settings using the [Azure portal](https://portal.azure.com) when creating an event hub or for an existing event hub. For conceptual information on this feature, see [Event Hubs Capture overview][capture-overview].
> [!IMPORTANT]
> - The destination storage (Azure Storage or Azure Data Lake Storage) account must be in the same subscription as the event hub.
> - Event Hubs doesn't support capturing events in a **premium** storage account.
-## Capture data to Azure Storage
+## Enable Capture when you create an event hub
-When you create an event hub, you can enable Capture by clicking the **On** button in the **Create Event Hub** portal screen. You then specify a Storage Account and container by clicking **Azure Storage** in the **Capture Provider** box. Because Event Hubs Capture uses service-to-service authentication with storage, you do not need to specify a storage connection string. The resource picker selects the resource URI for your storage account automatically. If you use Azure Resource Manager, you must supply this URI explicitly as a string.
+If you don't have an Event Hubs namespace to work with, create a **standard** tier namespace by following the steps in [Create an Event Hubs namespace](event-hubs-create.md#create-an-event-hubs-namespace). Make sure that you select **Standard** for the **pricing tier**. The basic tier doesn't support the Capture feature.
-The default time window is 5 minutes. The minimum value is 1, the maximum 15. The **Size** window has a range of 10-500 MB.
+To create an event hub within the namespace, follow these steps:
-You can enable or disable emitting empty files when no events occur during the Capture window.
+1. On the **Overview** page for your namespace, select **+ Event hub** on the command bar.
+
+ :::image type="content" source="./media/event-hubs-quickstart-portal/create-event-hub4.png" lightbox="./media/event-hubs-quickstart-portal/create-event-hub4.png" alt-text="Screenshot of the selection of Add event hub button on the command bar.":::
+2. On the **Create event hub** page, type a name for your event hub, then select **Next: Capture** at the bottom of the page.
+
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/create-event-hub-basics-page.png" alt-text="Screenshot of the Create event hub page.":::
+1. On the **Capture** tab, select **On** for **Capture**.
+1. Drag the slider to set the **Time window** in minutes. The default time window is 5 minutes. The minimum value is 1 and the maximum is 15.
+1. Drag the slider to set the **Size window (MB)**. The default value is 300 MB. The minimum value is 10 MB and the maximum value is 500 MB.
+1. Specify whether you want Event Hubs to **emit empty files when no events occur during the Capture time window**.
+
+See one of the following sections based on the type of storage you want to use to store captured files.
+
+## Capture data to Azure Storage
-![Time window for capture][1]
+1. For **Capture Provider**, select **Azure Storage Account** (default).
+1. For **Azure Storage Container**, click the **Select the container** link.
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/select-container-link.png" alt-text="Screenshot that shows the Create event hub page with the Select container link.":::
+1. On the **Storage accounts** page, select the storage account that you want to use to capture data.
+1. On the **Containers** page, select the container where you want to store captured files, and then click **Select**.
+ Because Event Hubs Capture uses service-to-service authentication with storage, you don't need to specify a storage connection string. The resource picker selects the resource URI for your storage account automatically. If you use Azure Resource Manager, you must supply this URI explicitly as a string.
+1. Now, on the **Create event hub** page, confirm that the selected container shows up.
+1. For **Capture file name format**, specify a format for the captured file names.
+1. Select **Review + create** at the bottom of the page.
+1. On the **Review + create** page, review settings, and select **Create** to create the event hub.
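For reference, the portal settings above correspond to the `captureDescription` property on the event hub resource in an ARM template. The following fragment is an illustrative sketch, not a complete template: the storage resource ID and container name are placeholders, and the interval and size values map to the portal's time and size windows (5 minutes, 300 MB):

```json
"captureDescription": {
  "enabled": true,
  "encoding": "Avro",
  "intervalInSeconds": 300,
  "sizeLimitInBytes": 314572800,
  "destination": {
    "name": "EventHubArchive.AzureBlockBlob",
    "properties": {
      "storageAccountResourceId": "/subscriptions/<subscription-id>/resourceGroups/<group>/providers/Microsoft.Storage/storageAccounts/<account>",
      "blobContainer": "capture",
      "archiveNameFormat": "{Namespace}/{EventHub}/{PartitionId}/{Year}/{Month}/{Day}/{Hour}/{Minute}/{Second}"
    }
  }
}
```

Because the template takes the storage account's resource ID rather than a connection string, this is where you supply the storage URI explicitly when deploying through Azure Resource Manager.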
+
## Capture data to Azure Data Lake Storage Gen 2
-1. Follow [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-portal#create-a-storage-account) article to create an Azure Storage account. Set **Hierarchical namespace** to **Enabled** on the **Advanced** tab to make it an Azure Data Lake Storage Gen 2 account. The Azure Storage account must be in the same subscription as the event hub.
-2. When creating an event hub, do the following steps:
Follow the [Create a storage account](../storage/common/storage-account-create.md?tabs=azure-portal#create-a-storage-account) article to create an Azure Storage account. Set **Hierarchical namespace** to **Enabled** on the **Advanced** tab to make it an Azure Data Lake Storage Gen 2 account. The Azure Storage account must be in the same subscription as the event hub.
- 1. Select **On** for **Capture**.
- 2. Select **Azure Storage** as the capture provider. The **Azure Data Lake Store** option you see for the **Capture provider** is for the Gen 1 of Azure Data Lake Storage. To use a Gen 2 of Azure Data Lake Storage, you select **Azure Storage**.
- 2. Select the **Select Container** button.
+1. Select **Azure Storage** as the capture provider. The **Azure Data Lake Store** option you see for the **Capture provider** is for Gen 1 of Azure Data Lake Storage. To use Gen 2 of Azure Data Lake Storage, select **Azure Storage**.
+2. For **Azure Storage Container**, click the **Select the container** link.
- ![Enable capture to Data Lake Storage Gen 2](./media/event-hubs-capture-enable-through-portal/data-lake-storage-gen2.png)
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/select-container-link.png" alt-text="Screenshot that shows the Create event hub page with the Select container link.":::
3. Select the **Azure Data Lake Storage Gen 2** account from the list.
- ![Select Data Lake Storage Gen 2](./media/event-hubs-capture-enable-through-portal/select-data-lake-storage-gen2.png)
-4. Select the **container** (file system in Data Lake Storage Gen 2).
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/select-data-lake-storage-gen2.png" alt-text="Screenshot showing the selection of Data Lake Storage Gen 2 account.":::
+4. Select the **container** (file system in Data Lake Storage Gen 2), and then click **Select** at the bottom of the page.
- ![Select file system in the storage](./media/event-hubs-capture-enable-through-portal/select-file-system-data-lake-storage.png)
-5. On the **Create Event Hub** page, select **Create**.
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/select-file-system-data-lake-storage.png" alt-text="Screenshot that shows the Containers page.":::
+1. For **Capture file name format**, specify a format for the captured file names.
+1. Select **Review + create** at the bottom of the page.
- ![Select Create button](./media/event-hubs-capture-enable-through-portal/create-event-hub-data-lake-storage.png)
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/create-event-hub-data-lake-storage.png" alt-text="Screenshot that shows the Create event hub page with all the fields specified.":::
+1. On the **Review + create** page, review settings, and select **Create** to create the event hub.
> [!NOTE] > The container you create in an Azure Data Lake Storage Gen 2 account using this user interface (UI) is shown under **File systems** in **Storage Explorer**. Similarly, the file system you create in a Data Lake Storage Gen 2 account shows up as a container in this UI.
You can enable or disable emitting empty files when no events occur during the C
To capture data to Azure Data Lake Storage Gen 1, you create a Data Lake Storage Gen 1 account, and an event hub:
+> [!IMPORTANT]
+> On Feb 29, 2024, Azure Data Lake Storage Gen1 will be retired. For more information, see the [official announcement](https://azure.microsoft.com/updates/action-required-switch-to-azure-data-lake-storage-gen2-by-29-february-2024/). If you use Azure Data Lake Storage Gen1, make sure to migrate to Azure Data Lake Storage Gen2 prior to that date. For more information, see [Azure Data Lake Storage migration guidelines and patterns](../storage/blobs/data-lake-storage-migrate-gen1-to-gen2.md).
+ ### Create an Azure Data Lake Storage Gen 1 account and folders 1. Create a Data Lake Storage account, following the instructions in [Get started with Azure Data Lake Storage Gen 1 using the Azure portal](../data-lake-store/data-lake-store-get-started-portal.md).
To capture data to Azure Data Lake Storage Gen 1, you create a Data Lake Storage
### Create an event hub 1. The event hub must be in the same Azure subscription as the Azure Data Lake Storage Gen 1 account you created. Create the event hub by clicking the **On** button under **Capture** in the **Create Event Hub** portal page.
-2. In the **Create Event Hub** portal page, select **Azure Data Lake Store** from the **Capture Provider** box.
+2. On the **Create Event Hub** page, select **Azure Data Lake Store** from the **Capture Provider** box.
3. In **Select Store** next to the **Data Lake Store** drop-down list, specify the Data Lake Storage Gen 1 account you created previously, and in the **Data Lake Path** field, enter the path to the data folder you created.
- ![Select Data Lake Storage account][3]
--
-## Add or configure Capture on an existing event hub
-
-You can configure Capture on existing event hubs that are in Event Hubs namespaces. To enable Capture on an existing event hub, or to change your Capture settings, click the namespace to load the overview screen, then click the event hub for which you want to enable or change the Capture setting. Finally, click the **Capture** option on the left side of the open page and then edit the settings, as shown in the following figures:
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/event-hubs-capture3.png" alt-text="Screenshot showing the selection of Data Lake Storage Account Gen 1.":::
-### Azure Blob Storage
+## Configure Capture for an existing event hub
-### Azure Data Lake Storage Gen 2
-It's same as above (for Azure Blob Storage) except that you will be selecting a container from an Azure Data Lake Storage Gen 2 account.
+You can configure Capture on existing event hubs that are in Event Hubs namespaces. To enable Capture on an existing event hub, or to change your Capture settings, follow these steps:
-### Azure Data Lake Storage Gen 1
+1. On the home page for your namespace, select **Event Hubs** under **Entities** on the left menu.
+1. Select the event hub for which you want to configure the Capture feature.
-![Configure Azure Data Lake Storage][4]
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/select-event-hub.png" alt-text="Screenshot showing the selection of an event hub in the list of event hubs.":::
+1. On the **Event Hubs Instance** page, select **Capture** on the left menu.
-[1]: ./media/event-hubs-capture-enable-through-portal/event-hubs-capture1.png
-[3]: ./media/event-hubs-capture-enable-through-portal/event-hubs-capture3.png
-[4]: ./media/event-hubs-capture-enable-through-portal/event-hubs-capture4.png
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/view-capture-page.png" alt-text="Screenshot showing the Capture page for your event hub.":::
+1. On the **Capture** page, select **Avro** for **Output event serialization format**. The **Parquet** format is supported only via Azure Stream Analytics integration. For more information, see [Capture Event Hubs data in parquet format and analyze with Azure Synapse Analytics](../stream-analytics/event-hubs-parquet-capture-tutorial.md).
+1. Select **On** for **Capture**.
+ :::image type="content" source="./media/event-hubs-capture-enable-through-portal/enable-capture.png" alt-text="Screenshot showing the Capture page for your event hub with the Capture feature enabled.":::
+1. To configure other settings, see the sections:
+ - [Capture data to Azure Storage](#capture-data-to-azure-storage)
+ - [Capture data to Azure Data Lake Storage Gen 2](#capture-data-to-azure-data-lake-storage-gen-2)
+ - [Capture data to Azure Data Lake Storage Gen 1](#capture-data-to-azure-data-lake-storage-gen-1)
+
## Next steps

- Learn more about Event Hubs Capture by reading the [Event Hubs Capture overview][capture-overview].
firewall Fqdn Filtering Network Rules https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/firewall/fqdn-filtering-network-rules.md
Previously updated : 09/01/2021 Last updated : 10/28/2022 + # Use FQDN filtering in network rules
firewall Tutorial Firewall Deploy Portal Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/firewall/tutorial-firewall-deploy-portal-policy.md
Previously updated : 10/18/2022 Last updated : 10/28/2022 -+ #Customer intent: As an administrator new to this service, I want to control outbound network access from resources located in an Azure subnet.
frontdoor Front Door Security Headers https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/frontdoor/front-door-security-headers.md
na Previously updated : 10/12/2022 Last updated : 10/28/2022 + # Customer intent: As an IT admin, I want to learn about Front Door and how to configure a security header via Rules Engine.
frontdoor Quickstart Create Front Door https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/frontdoor/quickstart-create-front-door.md
documentationcenter: na
Previously updated : 10/12/2022 Last updated : 10/28/2022 na-+ #Customer intent: As an IT admin, I want to direct user traffic to ensure high availability of web applications.
healthcare-apis Deploy 02 New Button https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/iot/deploy-02-new-button.md
Previously updated : 10/10/2022 Last updated : 10/28/2022 # Deploy the MedTech service with an Azure Resource Manager Quickstart template
-In this article, you'll learn how to deploy the MedTech service in the Azure portal using an Azure Resource Manager (ARM) Quickstart template. This template will be used with the **Deploy to Azure** button to make it easy to provide the information you need to automatically set up the infrastructure and configuration of your deployment. For more information about Azure ARM templates, see [What are ARM templates?](../../azure-resource-manager/templates/overview.md).
+In this article, you'll learn how to deploy the MedTech service in the Azure portal using an Azure Resource Manager (ARM) Quickstart template. This template will be used with the **Deploy to Azure** button to make it easy to provide the information you need to automatically create the infrastructure and configuration of your deployment. For more information about Azure Resource Manager (ARM) templates, see [What are ARM templates?](../../azure-resource-manager/templates/overview.md).
-If you need to see a diagram with information on the MedTech service deployment, there is an architecture overview at [Choose a deployment method](deploy-iot-connector-in-azure.md#deployment-architecture-overview). This diagram shows the data flow steps of deployment and how MedTech service processes data into a Fast Healthcare Interoperability Resources (FHIR&#174;) Observation.
+If you need to see a diagram with information on the MedTech service deployment, there's an architecture overview at [Choose a deployment method](deploy-iot-connector-in-azure.md#deployment-architecture-overview). This diagram shows the data flow steps of deployment and how MedTech service processes data into a Fast Healthcare Interoperability Resources (FHIR&#174;) Observation.
There are four simple tasks you need to complete to deploy the MedTech service with the ARM template **Deploy to Azure** button. They are:
In order to begin deployment, you need to have the following prerequisites:
- Two resource providers registered with your Azure subscription: **Microsoft.HealthcareApis** and **Microsoft.EventHub**. To learn more about registering resource providers, see [Azure resource providers and types](../../azure-resource-manager/management/resource-providers-and-types.md).
-When you've fulfilled these two prerequisites, you are ready to begin the second task.
+When you've fulfilled these two prerequisites, you're ready to begin the second task.
## Deploy to Azure button
Next, you need to select the ARM template **Deploy to Azure** button here:
[![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fquickstarts%2Fmicrosoft.healthcareapis%2Fworkspaces%2Fiotconnectors%2Fazuredeploy.json).
-This button will call a template from the Azure ARM QuickStart template library to get information from your Azure subscription environment and begin deploying the MedTech service.
+This button will call a template from the Azure Resource Manager (ARM) Quickstart template library to get information from your Azure subscription environment and begin deploying the MedTech service.
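If you'd rather script the deployment than use the portal button, the same Quickstart template (the URL encoded in the button's link) can be deployed with the Azure CLI. This is a hedged sketch, assuming placeholder resource group and location names that you'd replace with your own:

```azurecli
az group create --name my-medtech-rg --location eastus

az deployment group create \
  --resource-group my-medtech-rg \
  --template-uri "https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/quickstarts/microsoft.healthcareapis/workspaces/iotconnectors/azuredeploy.json"
```

The CLI prompts for any required template parameters, mirroring the fields you'd otherwise fill in on the portal page.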
After you select the **Deploy to Azure** button, it may take a few minutes to implement the following resources and roles:
When the Azure portal screen appears, your next task is to fill out five fields
Don't change the **Device Mapping** and **Destination Mapping** default values at this time.
-Select the **Review + create** button after all the fields are filled out. This will review your input and check to see if all your values are valid.
+Select the **Review + create** button after all the fields are filled out. This selection reviews your input and checks to see if all your values are valid.
When the validation is successful, select the **Create** button to begin the deployment. After a brief wait, a message will appear telling you that your deployment is complete.
Now that the MedTech service is successfully deployed, there are three post-depl
In this article, you learned how to deploy the MedTech service in the Azure portal using a Quickstart ARM template with a **Deploy to Azure** button. To learn more about other methods of deployment, see
->[!div class="nextstepaction"]
->[Choosing a method of deployment for MedTech service in Azure](deploy-iot-connector-in-azure.md)
+> [!div class="nextstepaction"]
+> [Choosing a method of deployment for MedTech service in Azure](deploy-iot-connector-in-azure.md)
->[!div class="nextstepaction"]
->[How to manually deploy MedTech service with Azure portal](deploy-03-new-manual.md)
+> [!div class="nextstepaction"]
+> [How to manually deploy MedTech service with Azure portal](deploy-03-new-manual.md)
->[!div class="nextstepaction"]
->[How to deploy MedTech service using an ARM template and Azure PowerShell or Azure CLI](deploy-08-new-ps-cli.md)
+> [!div class="nextstepaction"]
+> [How to deploy MedTech service using an ARM template and Azure PowerShell or Azure CLI](deploy-08-new-ps-cli.md)
FHIR&#174; is a registered trademark of Health Level Seven International, registered in the U.S. Trademark Office and is used with their permission.
healthcare-apis Deploy 03 New Manual https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/iot/deploy-03-new-manual.md
Previously updated : 10/10/2022 Last updated : 10/28/2022 # How to manually deploy MedTech service using the Azure portal
-You may prefer to manually deploy MedTech service if you need to track every step of the developmental process. This might be necessary if you have to customize or troubleshoot your deployment. Manual deployment will help you by providing all the details for implementing each task.
+You may prefer to manually deploy MedTech service if you need to track every step of the developmental process. Manual deployment might be necessary if you have to customize or troubleshoot your deployment. Manual deployment will help you by providing all the details for implementing each task.
The explanation of MedTech service manual deployment using the Azure portal is divided into three parts that cover each of the key tasks required:

- Part 1: Prerequisites (see Prerequisites below)
-- Part 2: Configuration (see [Configure for manual deployment](./deploy-05-new-config.md))
-- Part 3: Deployment and Post Deployment (see [Manual deployment and post-deployment](./deploy-06-new-deploy.md))
+- Part 2: Configuration (see [Configure for manual deployment](deploy-05-new-config.md))
+- Part 3: Deployment and Post Deployment (see [Manual deployment and post-deployment](deploy-06-new-deploy.md))
-If you need a diagram with information on the MedTech service deployment, there is an architecture overview at [Choose a deployment method](deploy-iot-connector-in-azure.md#deployment-architecture-overview). This diagram shows the data flow steps of deployment and how MedTech service processes data into a Fast Healthcare Interoperability Resources (FHIR&#174;) Observation.
+If you need a diagram with information on the MedTech service deployment, there's an architecture overview at [Choose a deployment method](deploy-iot-connector-in-azure.md#deployment-architecture-overview). This diagram shows the data flow steps of deployment and how MedTech service processes data into a Fast Healthcare Interoperability Resources (FHIR&#174;) Observation.
## Part 1: Prerequisites
The first thing you need to do is determine if you have a valid Azure subscripti
## Deploy a resource group in the Azure portal
-When you log in to your Azure account, go to Azure portal and select the Create a resource button. Then enter "Azure Health Data Services" in the "Search services and marketplace" box. This should take you to the Azure Health Data Services page.
+When you sign in to your Azure account, go to the Azure portal and select the **Create a resource** button. Enter "Azure Health Data Services" in the "Search services and marketplace" box. This step should take you to the Azure Health Data Services page.
## Deploy a workspace in Azure Health Data Services
-The first resource you must create is a workspace to contain your Azure Health Data Services resources. Start by selecting Create from the Azure Health Data Services resource page. This will take you to the first page of Create Azure Health Data Services workspace, when you need to do the following 8 steps:
+The first resource you must create is a workspace to contain your Azure Health Data Services resources. Start by selecting Create from the Azure Health Data Services resource page. This step will take you to the first page of Create Azure Health Data Services workspace, where you need to do the following eight steps:
1. Fill in the resource group you want to use or create a new one.
The first resource you must create is a workspace to contain your Azure Health D
5. Choose whether you want a public or private endpoint.
-6. Create tags if you want to use them. They are optional.
+6. Create tags if you want to use them. They're optional.
-7. When you are ready to continue, select the Review + create tab.
+7. When you're ready to continue, select the Review + create tab.
8. Select the Create button to deploy your workspace.
-After a short delay, you will start to see information about your new workspace. Make sure you wait until all parts of the screen are displayed. If your initial deployment was successful, you should see:
+After a short delay, you'll start to see information about your new workspace. Make sure you wait until all parts of the screen are displayed. If your initial deployment was successful, you should see:
- "Your deployment is complete" - Deployment name
After a short delay, you will start to see information about your new workspace.
## Deploy an event hub in the Azure portal using a namespace
-An event hub is the next prerequisite you need to create. It is important because it receives the data flow from a medical device and stores it there until MedTech can pick up the data and translate it into a FHIR service Observation resource. Because Internet propagation times are indeterminate, the event hub is needed to buffer the data and store it for as much as 24 hours before expiring.
+An event hub is the next prerequisite you need to create. It's an important step because the event hub receives the data flow from a device and stores it until the MedTech service picks up the device data. Once the MedTech service picks up the device data, it can begin the transformation of the device data into a FHIR service Observation resource. Because Internet propagation times are indeterminate, the event hub is needed to buffer the data and store it for as much as 24 hours before expiring.
Before you can create an event hub, you must create a namespace in the Azure portal to contain it. For more information on how to create a namespace and an event hub, see [Azure Event Hubs namespace and event hub deployed in the Azure portal](../../event-hubs/event-hubs-create.md).
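For readers who prefer scripting, the namespace and event hub can also be created with the Azure CLI. This is an illustrative sketch with placeholder resource names, not part of the documented portal procedure:

```azurecli
az eventhubs namespace create --resource-group my-rg --name my-medtech-ns --location eastus --sku Standard

az eventhubs eventhub create --resource-group my-rg --namespace-name my-medtech-ns --name devicedata
```

The default message retention on a Standard namespace keeps device events available for the MedTech service to pick up, consistent with the buffering behavior described above.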
After your prerequisites are successfully completed, you can go on to Part 2: Co
## Next steps
-When you are ready to begin Part 2 of Manual Deployment, see
+When you're ready to begin Part 2 of Manual Deployment, see
->[!div class="nextstepaction"]
->[Part 2: Configure the MedTech service for manual deployment using the Azure portal](deploy-05-new-config.md)
+> [!div class="nextstepaction"]
+> [Part 2: Configure the MedTech service for manual deployment using the Azure portal](deploy-05-new-config.md)
FHIR&#174; is a registered trademark of Health Level Seven International, registered in the U.S. Trademark Office and is used with their permission.
healthcare-apis Deploy 05 New Config https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/iot/deploy-05-new-config.md
Previously updated : 10/10/2022 Last updated : 10/28/2022
Before you can manually deploy the MedTech service, you must complete the follow
Start with these three steps to begin configuring the MedTech service so it's ready to accept your configuration input on each tab:
-1. Start by going to the Health Data Services workspace you created in the manual deployment [Prerequisites](deploy-03-new-manual.md#part-1-prerequisites) section. Select the Create MedTech service box.
+1. Start by going to the Health Data Services workspace you created in the manual deployment [Prerequisites](deploy-03-new-manual.md#part-1-prerequisites) section. Select the **Create MedTech service** box.
-2. This will take you to the Add MedTech service button. Select the button.
+2. This step will take you to the **Add MedTech service** button. Select the button.
-3. This will take you to the Create MedTech service page. This page has five tabs you need to fill out:
+3. This step will take you to the **Create MedTech service** page. This page has five tabs you need to fill out:
- Basics - Device mapping
Follow these six steps to fill in the Basics tab configuration:
> > - A MedTech service and a storage writer application accessing the same device message event hub.
-The Basics tab should now look like this after you have filled it out:
+The Basics tab should now look like this after you've filled it out:
:::image type="content" source="media\iot-deploy-manual-in-portal\select-device-mapping-button.png" alt-text="Screenshot of Basics tab filled out correctly." lightbox="media\iot-deploy-manual-in-portal\select-device-mapping-button.png":::
-You are now ready to select the Device mapping tab and begin setting up the connection from the medical device to MedTech service.
+You're now ready to select the Device mapping tab and begin setting up the device mappings for your MedTech service.
## Configure the Device mapping tab
-You need to configure device mapping so that your instance of MedTech service can connect to the device you want to receive data from. This means that the data will be first sent to your event hub instance and then picked up by the MedTech service.
+You need to configure device mappings so that your instance of the MedTech service can normalize the incoming device data. The device data will first be sent to your event hub instance and then picked up by the MedTech service.
The easiest way to configure the Device mapping tab is to use the Internet of Medical Things (IoMT) Connector Data Mapper tool to visualize, edit, and test your device mapping. This open source tool is available from [IoMT Connector Data Mapper](https://github.com/microsoft/iomt-fhir/tree/master/tools/data-mapper).
To begin configuring the device mapping tab, go to the Create MedTech service pa
2. Return to the Create MedTech service page. Enter the JSON code for the template you want to use into the **Device mapping** tab. After you enter the template code, the Device mapping code will be displayed on the screen.
-3. If the Device code is correct, select the **Next: Destination >** tab to enter the destination properties you want to use with your MedTech service. Note that your device configuration data will be saved for this session.
+3. If the Device code is correct, select the **Next: Destination >** tab to enter the destination properties you want to use with your MedTech service. Your device configuration data will be saved for this session.
For more information regarding device mappings, see the relevant GitHub open source documentation at [Device Content Mapping](https://github.com/microsoft/iomt-fhir/blob/master/docs/Configuration.md#device-content-mapping).
Under the **Destination** tab, use these values to enter the destination propert
- **Create**
- If you selected **Create**, and device or patient resources are missing when you are reading data, new resources will be created, containing just the identifier.
+ If **Create** was selected, and device or patient resources are missing when you're reading data, new resources will be created, containing just the identifier.
- **Lookup**
- If you selected **Lookup**, and device or patient resources are missing, an error will occur, and the data won't be processed. The errors **DeviceNotFoundException** and/or a **PatientNotFoundException** error will be generated, depending on the type of resource not found.
+ If **Lookup** was selected, and device or patient resources are missing, an error will occur, and the data won't be processed. A **DeviceNotFoundException** and/or a **PatientNotFoundException** error will be generated, depending on the type of resource not found.
For more information regarding destination mapping, see the FHIR service GitHub documentation at [FHIR mapping](https://github.com/microsoft/iomt-fhir/blob/master/docs/Configuration.md#fhir-mapping).
Before you can complete the FHIR destination mapping, you must get a FHIR destin
1. Go to the [IoMT Connector Data Mapper Tool](https://github.com/microsoft/iomt-fhir/tree/master/tools/data-mapper) and get the JSON template for your FHIR destination.
1. Go back to the Destination tab of the Create MedTech service page.
1. Go to the large box below the boxes for FHIR server name, Destination name, and Resolution type. Enter the JSON template request in that box.
-1. You will then receive the FHIR Destination mapping code which will be saved as part of your configuration.
+1. You'll then receive the FHIR Destination mapping code, which will be saved as part of your configuration.
## Configure the Tags tab (optional)
-Before you complete your configuration in the **Review + create** tab, you may want to configure tabs. You can do this by selecting the **Next: Tags >** tabs.
+Before you complete your configuration in the **Review + create** tab, you may want to configure tags. You can do this step by selecting the **Next: Tags >** tab.
-Tags are name and value pairs used for categorizing resources. This an optional step you may have a lot of resources and want to sort them. For more information about tags, see [Use tags to organize your Azure resources and management hierarchy](../../azure-resource-manager/management/tag-resources.md).
+Tags are name and value pairs used for categorizing resources. This optional step is useful when you have many resources and want to sort them. For more information about tags, see [Use tags to organize your Azure resources and management hierarchy](../../azure-resource-manager/management/tag-resources.md).
Follow these steps if you want to create tags:
- Enter a **Name**.
- Enter a **Value**.
-2. Once you've entered your tag(s), you are ready to do the last step of your configuration.
+2. Once you've entered your tag(s), you're ready to do the last step of your configuration.
## Select the Review + create tab to validate your deployment request
Your validation screen should look something like this:
If your MedTech service didn't validate, review the validation failure message, and troubleshoot the issue. Check all properties under each MedTech service tab that you've configured. Go back and try again.
-## Continue on to Part 3: Deployment and Post-deployment
+## Continue on to Part 3: Deployment and post-deployment
After your configuration is successfully completed, you can go on to Part 3: Deployment and post-deployment. See **Next steps** below.

## Next steps
-When you are ready to begin Part 3 of Manual Deployment, see
+When you're ready to begin Part 3 of Manual Deployment, see
->[!div class="nextstepaction"]
->[Part 3: Manual deployment and post-deployment of MedTech service](deploy-06-new-deploy.md)
+> [!div class="nextstepaction"]
+> [Part 3: Manual deployment and post-deployment of MedTech service](deploy-06-new-deploy.md)
FHIR&#174; is a registered trademark of Health Level Seven International, registered in the U.S. Trademark Office and is used with their permission.
healthcare-apis Deploy 06 New Deploy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/iot/deploy-06-new-deploy.md
Previously updated : 10/14/2022 Last updated : 10/28/2022
-# Part 3: Manual Deployment and Post-deployment of MedTech service
+# Part 3: Manual deployment and post-deployment of MedTech service
When you're satisfied with your configuration and it has been successfully validated, you can complete the deployment and post-deployment process.
Your screen should look something like this:
:::image type="content" source="media\iot-deploy-manual-in-portal\created-medtech-service.png" alt-text="Screenshot of the MedTech service deployment completion." lightbox="media\iot-deploy-manual-in-portal\created-medtech-service.png":::
-## Manual Post-deployment requirements
+## Manual post-deployment requirements
-There are two post-deployment steps you must perform or the MedTech service can't read device data from the device message event hub, and it also can't read or write to the Fast Healthcare Interoperability Resources (FHIR&#174;) service. These steps are:
+There are two post-deployment steps you must perform; otherwise, the MedTech service can't:
+
+1. Read device data from the device message event hub.
+2. Read or write to the Fast Healthcare Interoperability Resources (FHIR&#174;) service.
+
+These steps are:
1. Grant access to the device message event hub.
2. Grant access to the FHIR service.
-These two additional steps are needed because MedTech service uses [Azure role-based access control (Azure RBAC)](../../role-based-access-control/overview.md) and a [system-assigned managed identity](../../active-directory/managed-identities-azure-resources/overview.md) for extra security and control of your MedTech service assets.
+These two other steps are needed because MedTech service uses [Azure role-based access control (Azure RBAC)](../../role-based-access-control/overview.md) and a [system-assigned managed identity](../../active-directory/managed-identities-azure-resources/overview.md) for extra security and control of your MedTech service assets.
### Grant access to the device message event hub
Follow these steps to grant access to the device message event hub:
2. Select the **Event Hubs** button under **Entities**.
-3. Select the event hub that will be used for your MedTech service device messages. For this example, the device message event hub is named `devicedata'.
+3. Select the event hub that will be used for your MedTech service device messages. For this example, the device message event hub is named **devicedata**.
4. Select the **Access control (IAM)** button.
For more information about assigning roles to the FHIR service, see [Configure A
For more information about application roles, see [Authentication & Authorization for Azure Health Data Services](.././authentication-authorization.md).
-Now that you have granted access to the device message event hub and the FHIR service, your manual deployment is complete and MedTech service is ready to receive data from a medical device and process it into a FHIR Observation resource.
+Now that you have granted access to the device message event hub and the FHIR service, your manual deployment is complete. Your MedTech service is now ready to receive data from a device and process it into a FHIR Observation resource.
## Next steps

In this article, you learned how to perform the manual deployment and post-deployment steps to implement your MedTech service. To learn more about other methods of deployment, see
->[!div class="nextstepaction"]
->[Choosing a method of deployment for MedTech service in Azure](deploy-iot-connector-in-azure.md)
+> [!div class="nextstepaction"]
+> [Choosing a method of deployment for MedTech service in Azure](deploy-iot-connector-in-azure.md)
->[!div class="nextstepaction"]
->[Deploy the MedTech service with a QuickStart template](deploy-02-new-button.md)
+> [!div class="nextstepaction"]
+> [Deploy the MedTech service with a QuickStart template](deploy-02-new-button.md)
->[!div class="nextstepaction"]
->[Using Azure PowerShell and Azure CLI to deploy the MedTech service using Azure Resource Manager templates](deploy-08-new-ps-cli.md)
+> [!div class="nextstepaction"]
+> [Using Azure PowerShell and Azure CLI to deploy the MedTech service using Azure Resource Manager templates](deploy-08-new-ps-cli.md)
FHIR&#174; is a registered trademark of Health Level Seven International, registered in the U.S. Trademark Office and is used with their permission.
healthcare-apis Deploy 08 New Ps Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/iot/deploy-08-new-ps-cli.md
Title: Using Azure PowerShell and Azure CLI to deploy the MedTech service with Azure Resource Manager templates - Azure Health Data Services
+ Title: Using Azure PowerShell and Azure CLI to deploy the MedTech service with Azure Resource Manager (ARM) templates - Azure Health Data Services
description: In this article, you'll learn how to use Azure PowerShell and Azure CLI to deploy the MedTech service using an Azure Resource Manager template. Previously updated : 10/10/2022 Last updated : 10/28/2022 # Using Azure PowerShell and Azure CLI to deploy the MedTech service with Azure Resource Manager templates
-In this quickstart article, you'll learn how to use Azure PowerShell and Azure CLI to deploy the MedTech service using an Azure Resource Manager (ARM) template. When you call the template from PowerShell or CLI, it provides automation that enables you to distribute your deployment to large numbers of developers. Using PowerShell or CLI allows for modifiable automation capabilities that will speed up your deployment configuration in enterprise environments. For more information about ARM templates, see [What are ARM templates?](./../../azure-resource-manager/templates/overview.md).
+In this Quickstart article, you'll learn how to use Azure PowerShell and Azure CLI to deploy the MedTech service using an Azure Resource Manager (ARM) template. When you call the template from PowerShell or CLI, it provides automation that enables you to distribute your deployment to large numbers of developers. Using PowerShell or CLI allows for modifiable automation capabilities that will speed up your deployment configuration in enterprise environments. For more information about ARM templates, see [What are ARM templates?](./../../azure-resource-manager/templates/overview.md).
## Resources provided by the ARM template
The ARM template will help you automatically configure and deploy the following
The ARM template used in this article is available from the [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/iotconnectors/) site using the **azuredeploy.json** file located on [GitHub](https://github.com/Azure/azure-quickstart-templates/blob/master/quickstarts/microsoft.healthcareapis/workspaces/iotconnectors/azuredeploy.json).
-If you need to see a diagram with information on the MedTech service deployment, there is an architecture overview at [Choose a deployment method](deploy-iot-connector-in-azure.md#deployment-architecture-overview). This diagram shows the data flow steps of deployment and how MedTech service processes data into a FHIR Observation.
+If you need to see a diagram with information on the MedTech service deployment, there's an architecture overview at [Choose a deployment method](deploy-iot-connector-in-azure.md#deployment-architecture-overview). This diagram shows the data flow steps of deployment and how MedTech service processes data into a FHIR Observation.
## Azure PowerShell prerequisites
For example: `az group delete --resource-group ArmTestDeployment`
In this article, you learned how to use Azure PowerShell and Azure CLI to deploy the MedTech service using an Azure Resource Manager (ARM) template. To learn more about other methods of deployment, see
->[!div class="nextstepaction"]
->[Choosing a method of deployment for MedTech service in Azure](deploy-iot-connector-in-azure.md)
+> [!div class="nextstepaction"]
+> [Choosing a method of deployment for MedTech service in Azure](deploy-iot-connector-in-azure.md)
->[!div class="nextstepaction"]
->[How to deploy the MedTech service with a Azure ARM QuickStart template](deploy-03-new-manual.md)
+> [!div class="nextstepaction"]
+> [How to deploy the MedTech service with an Azure ARM Quickstart template](deploy-03-new-manual.md)
->[!div class="nextstepaction"]
->[How to manually deploy MedTech service with Azure portal](deploy-03-new-manual.md)
+> [!div class="nextstepaction"]
+> [How to manually deploy the MedTech service with Azure portal](deploy-03-new-manual.md)
FHIR&#174; is a registered trademark of Health Level Seven International, registered in the U.S. Trademark Office and is used with their permission.
healthcare-apis Deploy Iot Connector In Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/iot/deploy-iot-connector-in-azure.md
Previously updated : 10/20/2022 Last updated : 10/28/2022
MedTech service provides multiple methods for deploying it into an Azure Platfor
The different deployment methods are: -- Azure ARM Quickstart template with Deploy to Azure button
+- Azure Resource Manager (ARM) Quickstart template with Deploy to Azure button
- Azure PowerShell and Azure CLI automation - Manual deployment
For more information about Using an ARM template with Azure PowerShell and Azure
## Manual deployment
-The manual deployment method uses Azure portal to implement each deployment task individually. There are no shortcuts. Because you'll be able to see all the details of how to complete the sequence of each task, this procedure can be beneficial if you need to customize or troubleshoot your deployment process. This is the most complex method, but it provides valuable technical information and developmental options that will enable you to fine-tune your deployment precisely.
+The manual deployment method uses the Azure portal to implement each deployment task individually. Using the manual deployment method will allow you to see all the details of how to complete the sequence of each deployment task. The manual deployment method can be beneficial if you need to customize or troubleshoot your deployment process. The manual deployment is the most complex method, but it provides valuable technical information and developmental options that will enable you to fine-tune your deployment precisely.
For more information about manual deployment with portal, see [Overview of how to manually deploy the MedTech service using the Azure portal](deploy-03-new-manual.md).

## Deployment architecture overview
-The following data-flow diagram outlines the basic steps of MedTech service deployment and shows how these steps fit together with its data processing procedures. This may help you analyze the options and determine which deployment method is best for you.
+The following data-flow diagram outlines the basic steps of MedTech service deployment and shows how these steps fit together with its data processing procedures. These basic steps may help you analyze the options and determine which deployment method is best for you.
:::image type="content" source="media/iot-get-started/get-started-with-iot.png" alt-text="Diagram showing MedTech service architecture overview." lightbox="media/iot-get-started/get-started-with-iot.png":::
There are six different steps of the MedTech service PaaS. Only the first four a
### Step 1: Prerequisites - Have an Azure subscription-- Create RBAC roles contributor and user access administrator or owner. This feature is automatically done in the QuickStart template method with the Deploy to Azure button, but it isn't included in manual or PowerShell/CLI method and need to be implemented individually.
+- Create the RBAC roles Contributor and User Access Administrator, or Owner. This step is done automatically in the Quickstart template method with the Deploy to Azure button, but it isn't included in the manual or PowerShell/CLI methods and needs to be implemented individually.
### Step 2: Provision
Each method must add **all** these post-deployment tasks:
- Connect to services using device and destination mapping.
- Use managed identity to grant access to the device message event hub.
- Use managed identity to grant access to the FHIR service, enabling FHIR to receive data from the MedTech service.
-- Note: only the ARM QuickStart method requires a shared access key for post-deployment.
+- Note: only the ARM Quickstart method requires a shared access key for post-deployment.
### Granting access to the device message event hub
iot-dps Concepts Control Access Dps Azure Ad https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/concepts-control-access-dps-azure-ad.md
Title: Access control and security for DPS by using Azure Active Directory | Mi
description: Concepts - how to control access to Azure IoT Hub Device Provisioning Service (DPS) (DPS) for back-end apps. Includes information about Azure Active Directory and RBAC. -+ Last updated 02/07/2022 + # Control access to Azure IoT Hub Device Provisioning Service (DPS) by using Azure Active Directory (preview)
iot-dps Concepts Control Access Dps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/concepts-control-access-dps.md
Title: Access control and security for Azure IoT Hub Device Provisioning Servic
description: Overview on how to control access to Azure IoT Hub Device Provisioning Service (DPS), includes links to in-depth articles on Azure Active Directory integration (Public Preview) and SAS options. -+ Last updated 04/20/2022 + # Control access to Azure IoT Hub Device Provisioning Service (DPS)
iot-dps Concepts Device Oem Security Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/concepts-device-oem-security-practices.md
Last updated 3/02/2020 -+ + # Security practices for Azure IoT device manufacturers As more manufacturers release IoT devices, it's helpful to identify guidance around common practices. This article summarizes recommended security practices to consider when you manufacture devices for use with Azure IoT Device Provisioning Service (DPS).
iot-dps Iot Dps Customer Data Requests https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/iot-dps-customer-data-requests.md
Last updated 05/16/2018 -+ + # Summary of customer data request featuresΓÇï
iot-dps Iot Dps Https Sym Key Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/iot-dps-https-sym-key-support.md
+
+ Title: How to use raw HTTPS in Azure IoT Hub Device Provisioning Service
+description: This article shows how to use symmetric keys over HTTPS in your Device Provisioning Service (DPS) instance
++ Last updated : 10/27/2022++++++
+# How to use symmetric keys over HTTPS without an SDK
+
+In this how-to article, you'll provision a device using symmetric keys over HTTPS without using an Azure IoT DPS device SDK. Most languages provide libraries to send HTTP requests, but, rather than focus on a specific language, in this article, you'll use the [cURL](https://en.wikipedia.org/wiki/CURL) command-line tool to send and receive over HTTPS.
+
+You can follow the steps in this article on either a Linux or a Windows machine. If you're running on Windows Subsystem for Linux (WSL) or running on a Linux machine, you can enter all commands on your local system in a Bash prompt. If you're running on Windows, enter all commands on your local system in a GitBash prompt.
+
+There are different paths through this article depending on the type of enrollment entry you choose to use. After installing the prerequisites, be sure to read the [Overview](#overview) before proceeding.
+
+## Prerequisites
+
+* If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
+
+* Complete the steps in [Set up IoT Hub Device Provisioning Service with the Azure portal](./quick-setup-auto-provision.md).
+
+* Make sure [Python 3.7](https://www.python.org/downloads/) or later is installed on your machine. You can check your version of Python by running `python --version`.
+
+* If you're running in Windows, install the latest version of [Git](https://git-scm.com/download/). Make sure that Git is added to the environment variables accessible to the command window. See [Software Freedom Conservancy's Git client tools](https://git-scm.com/download/) for the latest version of `git` tools to install, which includes *Git Bash*, the command-line app that you can use to interact with your local Git repository. On Windows, you'll enter all commands on your local system in a GitBash prompt.
+
+* Azure CLI. You have two options for running Azure CLI commands in this article:
+ * Use the Azure Cloud Shell, an interactive shell that runs CLI commands in your browser. This option is recommended because you don't need to install anything. If you're using Cloud Shell for the first time, log into the [Azure portal](https://portal.azure.com). Follow the steps in [Cloud Shell quickstart](../cloud-shell/quickstart.md) to **Start Cloud Shell** and **Select the Bash environment**.
+ * Optionally, run Azure CLI on your local machine. If Azure CLI is already installed, run `az upgrade` to upgrade the CLI and extensions to the current version. To install Azure CLI, see [Install Azure CLI](/cli/azure/install-azure-cli).
+
+* If you're running in a Linux or a WSL environment, open a Bash prompt to run commands locally. If you're running in a Windows environment, open a GitBash prompt.
+
+## Overview
+
+For this article, you can use either an [individual enrollment](concepts-service.md#individual-enrollment) or an [enrollment group](concepts-service.md#enrollment-group) to provision through DPS.
+
+* For an individual enrollment, complete [Use an individual enrollment](#use-an-individual-enrollment).
+
+* For an enrollment group, complete [Use an enrollment group](#use-an-enrollment-group).
+
+After you've created the individual enrollment or enrollment group entry, continue on to [create a SAS token](#create-a-sas-token) and [register your device](#register-your-device) with DPS.
+
+## Use an individual enrollment
+
+If you want to create a new individual enrollment to use for this article, you can use the [az iot dps enrollment create](/cli/azure/iot/dps/enrollment#az-iot-dps-enrollment-create) command to create an individual enrollment for symmetric key attestation.
+
+The following command creates an enrollment entry with the default allocation policy for your DPS instance and lets DPS assign the primary and secondary keys for your device:
+
+```azurecli
+az iot dps enrollment create -g {resource_group_name} --dps-name {dps_name} --enrollment-id {enrollment_id} --attestation-type symmetrickey
+```
+
+* Substitute the name of your resource group and DPS instance.
+
+* The enrollment ID is the registration ID for your device. The registration ID is a case-insensitive string (up to 128 characters long) of alphanumeric characters plus the special characters: `'-'`, `'.'`, `'_'`, `':'`. The last character must be alphanumeric or dash (`'-'`). Make sure the enrollment ID you use in the command adheres to this format.
+
+The assigned symmetric keys are returned in the **attestation** property in the response:
+
+```json
+
+{
+ "allocationPolicy": null,
+ "attestation": {
+ "symmetricKey": {
+ "primaryKey": "G3vn0IZH9oK3d4wsxFpWBtd2KUrtjI+39dZVRf26To8w9OX0LaFV9yZ93ELXY7voqHEUsNhnb9bt717UP87KxA==",
+ "secondaryKey": "4lNxgD3lUAOEOied5/xOocyiUSCAgS+4b9OvXLDi8ug46/CJzIn/3rN6Ys6gW8SMDDxMQDaMRnIoSd1HJ5qn/g=="
+ },
+ "tpm": null,
+ "type": "symmetricKey",
+ "x509": null
+ },
+
+ ...
+
+}
+```
+
+Note down the primary key and the registration ID (enrollment ID) for your individual enrollment entry; you'll use them later in this article.
+
+If you want to use an existing individual enrollment for this article, you can get the primary key with the [az iot dps enrollment show](/cli/azure/iot/dps/enrollment#az-iot-dps-enrollment-show) command:
+
+```azurecli
+az iot dps enrollment show -g {resource_group_name} --dps-name {dps_name} --enrollment-id {enrollment_id} --show-keys true
+```
+
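The registration ID format described above can be checked with a short Python sketch. This is a hypothetical helper, not part of any Azure SDK, and the regular expression is an assumption derived from the rules stated here (alphanumerics plus `-`, `.`, `_`, `:`; up to 128 characters; case-insensitive; last character alphanumeric or dash):

```python
import re

# Assumed encoding of the documented rules: alphanumerics plus '-', '.', '_', ':',
# up to 128 characters, case-insensitive, last character alphanumeric or dash.
_REG_ID = re.compile(r"^[a-z0-9.\-_:]{0,127}[a-z0-9\-]$", re.IGNORECASE)

def is_valid_registration_id(reg_id: str) -> bool:
    """Return True if reg_id matches the documented DPS registration ID format."""
    return bool(_REG_ID.fullmatch(reg_id))
```

For example, `my-symkey-device` passes, while an ID ending in `:` does not.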
+## Use an enrollment group
+
+If you want to create a new enrollment group to use for this article, you can use the [az iot dps enrollment-group create](/cli/azure/iot/dps/enrollment-group#az-iot-dps-enrollment-group-create) command to create an enrollment group for symmetric key attestation.
+
+The following command creates an enrollment group entry with the default allocation policy for your DPS instance and lets DPS assign the primary and secondary keys for the enrollment group:
+
+```azurecli
+az iot dps enrollment-group create -g {resource_group_name} --dps-name {dps_name} --enrollment-id {enrollment_id}
+```
+
+* Substitute the name of your resource group and DPS instance.
+
+* The enrollment ID is a case-insensitive string (up to 128 characters long) of alphanumeric characters plus the special characters: `'-'`, `'.'`, `'_'`, `':'`. The last character must be alphanumeric or dash (`'-'`). It can be any name you choose to use for the enrollment group.
+
+The assigned symmetric keys are returned in the **attestation** property in the response:
+
+```json
+
+{
+ "allocationPolicy": null,
+ "attestation": {
+ "symmetricKey": {
+ "primaryKey": "G3vn0IZH9oK3d4wsxFpWBtd2KUrtjI+39dZVRf26To8w9OX0LaFV9yZ93ELXY7voqHEUsNhnb9bt717UP87KxA==",
+ "secondaryKey": "4lNxgD3lUAOEOied5/xOocyiUSCAgS+4b9OvXLDi8ug46/CJzIn/3rN6Ys6gW8SMDDxMQDaMRnIoSd1HJ5qn/g=="
+ },
+ "tpm": null,
+ "type": "symmetricKey",
+ "x509": null
+ },
+
+ ...
+
+}
+```
+
+Note down the primary key.
+
+If you want to use an existing enrollment group for this article, you can get the primary key with the [az iot dps enrollment-group show](/cli/azure/iot/dps/enrollment-group#az-iot-dps-enrollment-group-show) command:
+
+```azurecli
+az iot dps enrollment-group show -g {resource_group_name} --dps-name {dps_name} --enrollment-id {enrollment_id} --show-keys true
+```
+
+### Derive a device key
+
+When using symmetric key attestation with group enrollments, you don't use the enrollment group keys directly. Instead, you derive a unique key for each device from the enrollment group key. For more information, see [Group Enrollments with symmetric keys](concepts-symmetric-key-attestation.md#group-enrollments).
+
+In this section, you'll generate a device key by computing an [HMAC-SHA256](https://wikipedia.org/wiki/HMAC) hash of the device's unique registration ID, keyed with the enrollment group primary key. The result is then converted into Base64 format.
+
+1. Generate your unique key using **openssl**. You'll use the following Bash shell script. Replace `{primary-key}` with the enrollment group's **Primary Key** that you copied earlier and replace `{contoso-simdevice}` with the registration ID you want to use for the device. The registration ID is a case-insensitive string (up to 128 characters long) of alphanumeric characters plus the special characters: `'-'`, `'.'`, `'_'`, `':'`. The last character must be alphanumeric or dash (`'-'`).
+
+ ```bash
+ KEY={primary-key}
+ REG_ID={contoso-simdevice}
+
+ keybytes=$(echo $KEY | base64 --decode | xxd -p -u -c 1000)
+ echo -n $REG_ID | openssl sha256 -mac HMAC -macopt hexkey:$keybytes -binary | base64
+ ```
+
+2. The script will output something like the following key:
+
+ ```bash
+ p3w2DQr9WqEGBLUSlFi1jPQ7UWQL4siAGy75HFTFbf8=
+ ```
+
+Note down the derived device key and the registration ID you used to generate it; you'll use them in the next section.
+
+You can also use the Azure CLI or PowerShell to derive a device key. To learn more, see [Derive a device key](how-to-legacy-device-symm-key.md#derive-a-device-key).
+
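The same derivation can be sketched in Python using only the standard library. This mirrors the openssl pipeline shown earlier (HMAC-SHA256 of the registration ID, keyed with the Base64-decoded group key, then Base64-encoded); the function name is illustrative:

```python
import base64
import hashlib
import hmac

def derive_device_key(group_key_b64: str, registration_id: str) -> str:
    """Derive a per-device key from a Base64 enrollment group key (HMAC-SHA256)."""
    key_bytes = base64.b64decode(group_key_b64)
    digest = hmac.new(key_bytes, registration_id.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")
```

The result is deterministic: the same group key and registration ID always produce the same device key.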
+## Create a SAS token
+
+When using symmetric key attestation, devices authenticate with DPS using a Shared Access Signature (SAS) token. For devices provisioning through an individual enrollment, the token is signed using either the primary or secondary key set in the enrollment entry. For a device provisioning through an enrollment group, the token is signed using a derived device key, which, in turn, has been generated using either the primary or secondary key set in the enrollment group entry. The token specifies an expiry time and a target resource URI.
+
+The following Python script can be used to generate a SAS token:
+
+```python
+from base64 import b64encode, b64decode
+from hashlib import sha256
+from time import time
+from urllib.parse import quote_plus, urlencode
+from hmac import HMAC
+
+def generate_sas_token(uri, key, policy_name, expiry=3600):
+ ttl = time() + expiry
+ sign_key = "%s\n%d" % ((quote_plus(uri)), int(ttl))
+ print(sign_key)
+ signature = b64encode(HMAC(b64decode(key), sign_key.encode('utf-8'), sha256).digest())
+
+ rawtoken = {
+ 'sr' : uri,
+ 'sig': signature,
+ 'se' : str(int(ttl))
+ }
+
+ if policy_name is not None:
+ rawtoken['skn'] = policy_name
+
+ return 'SharedAccessSignature ' + urlencode(rawtoken)
+
+uri = '[resource_uri]'
+key = '[device_key]'
+expiry = [expiry_in_seconds]
+policy= '[policy]'
+
+print(generate_sas_token(uri, key, policy, expiry))
+```
+
+Where:
+
+* `[resource_uri]` is the URI of the resource you're trying to access with this token. For DPS, it's of the form `[dps_id_scope]/registrations/[dps_registration_id]`, where `[dps_id_scope]` is the ID scope of your DPS instance, and `[dps_registration_id]` is the registration ID you used for your device.
+
+ You can get the ID scope for your DPS instance from the **Overview** pane of your instance in Azure portal, or you can use the [az iot dps show](/cli/azure/iot/dps#az-iot-dps-show) Azure CLI command (replace the placeholders with the name of your resource group and DPS instance):
+
+ ```azurecli
+ az iot dps show -g {resource_group_name} --name {dps_name}
+ ```
+
+* `[device_key]` is the device key associated with your device. This key is either the one specified or auto-generated for you in an individual enrollment, or a derived key for a group enrollment.
+
+ * If you're using an individual enrollment, use the primary key you saved in [Use an individual enrollment](#use-an-individual-enrollment).
+
+ * If you're using an enrollment group, use the derived device key you generated in [Use an enrollment group](#use-an-enrollment-group).
+
+* `[expiry_in_seconds]` is the validity period of this SAS token in seconds.
+
+* `[policy]` is the policy with which the device key is associated. For DPS device registration, the policy is hard-coded to `registration`.
+
+An example set of inputs for a device called `my-symkey-device` with a validity period of 30 days might look like this.
+
+```python
+uri = '0ne00111111/registrations/my-symkey-device'
+key = '18RQk/hOPJR9EbsJlk2j8WA6vWaj/yi+oaYg7zmxfQNdOyMSu+SJ8O7TSlZhDJCYmn4rzEiVKIzNiVAWjLxrGA=='
+expiry = 2592000
+policy='registration'
+```
+
+Modify the script for your device and DPS instance and save it as a Python file; for example, *generate_token.py*. Run the script (for example, `python generate_token.py`). It should output a SAS token similar to the following:
+
+```output
+0ne00111111%2Fregistrations%2Fmy-symkey-device
+1663952627
+SharedAccessSignature sr=0ne00111111%2Fregistrations%2Fmy-symkey-device&sig=eNwg52xQdFTNf7bgPAlAJBCIcONivq%2Fck1lf3wtxI4A%3D&se=1663952627&skn=registration
+```
+
+Copy and save the entire line that begins with `SharedAccessSignature`. This line is the SAS token. You'll need it in the following sections.
+
+To learn more about using SAS tokens with DPS and their structure, see [Control Access to DPS with SAS](how-to-control-access.md).
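As a sanity check, you can split a SAS token back into its component fields. The following sketch (an illustration for this article, not part of any Azure SDK) parses the example token shown above:

```python
from urllib.parse import parse_qs

def parse_sas_token(token):
    """Split a 'SharedAccessSignature ...' string into its component fields."""
    scheme, _, params = token.partition(' ')
    if scheme != 'SharedAccessSignature':
        raise ValueError('not a SAS token')
    # parse_qs URL-decodes the values and returns lists; flatten to single values
    return {name: values[0] for name, values in parse_qs(params).items()}

token = ('SharedAccessSignature sr=0ne00111111%2Fregistrations%2Fmy-symkey-device'
         '&sig=eNwg52xQdFTNf7bgPAlAJBCIcONivq%2Fck1lf3wtxI4A%3D'
         '&se=1663952627&skn=registration')
fields = parse_sas_token(token)
print(fields['sr'])   # decoded resource URI
print(fields['se'])   # expiry as a Unix timestamp
print(fields['skn'])  # policy name ('registration' for DPS)
```

The `se` field is the Unix timestamp after which DPS rejects the token; `sr` is the URL-decoded resource URI.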
+
+## Register your device
+
+You call the [Register Device](/rest/api/iot-dps/device/runtime-registration/register-device) REST API to provision your device through DPS.
+
+Use the following curl command:
+
+```bash
+curl -L -i -X PUT -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -H 'Authorization: [sas_token]' -d '{"registrationId": "[registration_id]"}' https://global.azure-devices-provisioning.net/[dps_id_scope]/registrations/[registration_id]/register?api-version=2019-03-31
+```
+
+Where:
+
+* `-L` tells curl to follow HTTP redirects.
+
+* `-i` tells curl to include protocol headers in output. These headers aren't strictly necessary, but they can be useful.
+
+* `-X PUT` tells curl that this is an HTTP PUT command. Required for this API call.
+
+* `-H 'Content-Type: application/json'` tells DPS that we're posting JSON content. The value must be `application/json`.
+
+* `-H 'Content-Encoding: utf-8'` tells DPS the encoding we're using for our message body. Set to the proper value for your OS/client; however, it's generally `utf-8`.
+
+* `-H 'Authorization: [sas_token]'` tells DPS to authenticate using your SAS token. Replace [sas_token] with the token you generated in [Create a SAS token](#create-a-sas-token).
+
+* `-d '{"registrationId": "[registration_id]"}'`, the `-d` parameter is the 'data' or body of the message we're posting. It must be JSON, in the form `'{"registrationId":"[registration_id]"}'`. Note that for curl, it's wrapped in single quotes; otherwise, you need to escape the double quotes in the JSON.
+
+* Finally, the last parameter is the URL to post to. For "regular" (that is, not on-premises) DPS, the global DPS endpoint, *global.azure-devices-provisioning.net*, is used: `https://global.azure-devices-provisioning.net/[dps_id_scope]/registrations/[registration_id]/register?api-version=2019-03-31`. Note that you have to replace `[dps_id_scope]` and `[registration_id]` with the appropriate values.
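If you're scripting the registration call instead of using curl, the same pieces can be assembled in code. The following sketch (hypothetical helper names; it builds the request but doesn't send it) maps the parameters above onto a URL, headers, and body:

```python
import json

GLOBAL_ENDPOINT = 'https://global.azure-devices-provisioning.net'

def build_register_request(id_scope, registration_id, sas_token,
                           api_version='2019-03-31'):
    """Build the URL, headers, and JSON body for the DPS Register Device call."""
    url = (f'{GLOBAL_ENDPOINT}/{id_scope}/registrations/'
           f'{registration_id}/register?api-version={api_version}')
    headers = {
        'Content-Type': 'application/json',
        'Content-Encoding': 'utf-8',
        'Authorization': sas_token,
    }
    body = json.dumps({'registrationId': registration_id})
    return url, headers, body

url, headers, body = build_register_request(
    '0ne00111111', 'my-symkey-device', 'SharedAccessSignature sr=...')
print(url)
```

Pass the URL, headers, and body to the HTTP client of your choice to perform the PUT.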
+
+For example:
+
+```bash
+curl -L -i -X PUT -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -H 'Authorization: SharedAccessSignature sr=0ne00111111%2Fregistrations%2Fmy-symkey-device&sig=eNwg52xQdFTNf7bgPAlAJBCIcONivq%2Fck1lf3wtxI4A%3D&se=1663952627&skn=registration' -d '{"registrationId": "my-symkey-device"}' https://global.azure-devices-provisioning.net/0ne00111111/registrations/my-symkey-device/register?api-version=2021-06-01
+```
+
+A successful call will have a response similar to the following:
+
+```output
+HTTP/1.1 202 Accepted
+Date: Wed, 31 Aug 2022 22:02:49 GMT
+Content-Type: application/json; charset=utf-8
+Transfer-Encoding: chunked
+Location: https://global.azure-devices-provisioning.net/0ne00111111/registrations/my-symkey-device/register
+Retry-After: 3
+x-ms-request-id: a021814f-0cf6-4ce9-a1e9-ead7eb5118d9
+Strict-Transport-Security: max-age=31536000; includeSubDomains
+
+{"operationId":"5.316aac5bdc130deb.b1e02da8-c3a0-4ff2-a121-7ea7a6b7f550","status":"assigning"}
+```
+
+The response contains an operation ID and a status. In this case, the status is set to `assigning`. DPS enrollment is, potentially, a long-running operation, so it's done asynchronously. Typically, you'll poll for status using the [Operation Status Lookup](/rest/api/iot-dps/device/runtime-registration/operation-status-lookup) REST API to determine when your device has been assigned or whether a failure has occurred.
+
+The valid status values for DPS are:
+
+* `assigned`: the return value from the status call indicates the IoT hub that the device was assigned to.
+
+* `assigning`: the operation is still running.
+
+* `disabled`: the enrollment record is disabled in DPS, so the device can't be assigned.
+
+* `failed`: the assignment failed. An `errorCode` and `errorMessage` are returned in a `registrationState` record in the response to indicate what failed.
+
+* `unassigned`
+
+To call the **Operation Status Lookup** API, use the following curl command:
+
+```bash
+curl -L -i -X GET -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -H 'Authorization: [sas_token]' https://global.azure-devices-provisioning.net/[dps_id_scope]/registrations/[registration_id]/operations/[operation_id]?api-version=2019-03-31
+```
+
+You'll use the same ID scope, registration ID, and SAS token as you did in the **Register Device** request. Use the operation ID that was returned in the **Register Device** response.
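In a script, you'd typically wrap this lookup in a loop that polls until a terminal status is reached. The sketch below is an illustration only: `fetch` is a stand-in for whatever HTTP client you use and is expected to return the parsed JSON of the **Operation Status Lookup** response:

```python
import time

# Statuses after which polling can stop
TERMINAL_STATUSES = {'assigned', 'failed', 'disabled', 'unassigned'}

def poll_registration(fetch, max_attempts=10, delay_seconds=3):
    """Call fetch() until DPS reports a terminal status or attempts run out."""
    for _ in range(max_attempts):
        result = fetch()
        if result['status'] in TERMINAL_STATUSES:
            return result
        time.sleep(delay_seconds)  # a real client should honor the Retry-After header
    raise TimeoutError('registration still assigning after polling')

# Demonstrate the loop with a stub in place of a real HTTP call
responses = iter([{'status': 'assigning'}, {'status': 'assigned'}])
result = poll_registration(lambda: next(responses), delay_seconds=0)
print(result['status'])  # assigned
```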
+
+For example:
+
+```bash
+curl -L -i -X GET -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -H 'Authorization: SharedAccessSignature sr=0ne00111111%2Fregistrations%2Fmy-symkey-device&sig=eNwg52xQdFTNf7bgPAlAJBCIcONivq%2Fck1lf3wtxI4A%3D&se=1663952627&skn=registration' https://global.azure-devices-provisioning.net/0ne00111111/registrations/my-symkey-device/operations/5.316aac5bdc130deb.f4f1828c-4dab-4ca9-98b2-dfc63b5835d6?api-version=2021-06-01
+```
+
+The following output shows the response for a device that has been successfully assigned. Notice that the `status` property is `assigned` and that the `registrationState.assignedHub` property is set to the IoT hub where the device was provisioned.
+
+```output
+HTTP/1.1 200 OK
+Date: Wed, 31 Aug 2022 22:05:23 GMT
+Content-Type: application/json; charset=utf-8
+Transfer-Encoding: chunked
+x-ms-request-id: ffb98d42-023e-4e75-afb0-1807ff091cbb
+Strict-Transport-Security: max-age=31536000; includeSubDomains
+
+{
+ "operationId":"5.316aac5bdc130deb.b1e02da8-c3a0-4ff2-a121-7ea7a6b7f550",
+ "status":"assigned",
+ "registrationState":{
+ "registrationId":"my-symkey-device",
+ "createdDateTimeUtc":"2022-08-31T22:02:50.5163352Z",
+ "assignedHub":"MyExampleHub.azure-devices.net",
+ "deviceId":"my-symkey-device",
+ "status":"assigned",
+ "substatus":"initialAssignment",
+ "lastUpdatedDateTimeUtc":"2022-08-31T22:02:50.7370676Z",
+ "etag":"IjY5MDAzNTUyLTAwMDAtMDMwMC0wMDAwLTYzMGZkYThhMDAwMCI="
+ }
+}
+```
+
+## Send a telemetry message
+
+Before you can send a telemetry message, you need to create a SAS token for the IoT hub that the device was assigned to. You sign this token using the same primary key or derived device key that you used to sign the SAS token for your DPS instance.
+
+### Create a SAS token for your IoT hub
+
+To create the SAS token, you can run the same code you did to create the token for your DPS instance with the following changes:
+
+```python
+uri = '[resource_uri]'
+key = '[device_key]'
+expiry = [expiry_in_seconds]
+policy= None
+```
+
+Where:
+
+* `[resource_uri]` is the URI of the resource you're trying to access with this token. For a device sending messages to an IoT hub, it's of the form `[iot-hub-host-name]/devices/[device-id]`.
+
+ * For `[iot-hub-host-name]`, use the IoT Hub hostname returned in the `assignedHub` property in the previous section.
+
+ * For `[device-id]`, use the device ID returned in the `deviceId` property in the previous section.
+
+* `[device_key]` is the device key associated with your device. This key is either the one specified or auto-generated for you in an individual enrollment, or a derived key for a group enrollment. (It's the same key you used previously to create a token for DPS.)
+
+ * If you're using an individual enrollment, use the primary key you saved in [Use an individual enrollment](#use-an-individual-enrollment).
+
+ * If you're using an enrollment group, use the derived device key you generated in [Use an enrollment group](#use-an-enrollment-group).
+
+* `[expiry_in_seconds]` is the validity period of this SAS token in seconds.
+
+* `policy` is set to `None` because no policy is required for a device sending telemetry to an IoT hub.
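Putting these values together, a small helper (hypothetical; the field names come from the `registrationState` record returned in the previous section) can derive the token inputs directly from the registration response:

```python
def hub_token_inputs(registration_state, expiry=3600):
    """Map a DPS registrationState record to inputs for generate_sas_token."""
    uri = (f"{registration_state['assignedHub']}"
           f"/devices/{registration_state['deviceId']}")
    return {'uri': uri, 'expiry': expiry, 'policy': None}

state = {'assignedHub': 'MyExampleHub.azure-devices.net',
         'deviceId': 'my-symkey-device'}
inputs = hub_token_inputs(state)
print(inputs['uri'])  # MyExampleHub.azure-devices.net/devices/my-symkey-device
```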
+
+An example set of inputs for a device called `my-symkey-device` sending to an IoT Hub named `MyExampleHub` with a token validity period of one hour might look like this:
+
+```python
+uri = 'MyExampleHub.azure-devices.net/devices/my-symkey-device'
+key = '18RQk/hOPJR9EbsJlk2j8WA6vWaj/yi+oaYg7zmxfQNdOyMSu+SJ8O7TSlZhDJCYmn4rzEiVKIzNiVAWjLxrGA=='
+expiry = 3600
+policy= None
+```
+
+The following output shows a sample SAS token for these inputs:
+
+```output
+SharedAccessSignature sr=MyExampleHub.azure-devices.net%2Fdevices%2Fmy-symkey-device&sig=f%2BwW8XOKeJOtiPc9Iwjc4OpExvPM7NlhM9qxN2a1aAM%3D&se=1663119026
+```
+
+To learn more about creating SAS tokens for IoT Hub, including example code in other programming languages, see [Control access to IoT Hub using Shared Access Signatures](../iot-hub/iot-hub-dev-guide-sas.md?tabs=python).
+
+> [!NOTE]
+>
+> As a convenience, you can use the Azure CLI [az iot hub generate-sas-token](/cli/azure/iot/hub#az-iot-hub-generate-sas-token) command to get a SAS token for a device registered with an IoT hub. For example, the following command generates a SAS token with a duration of one hour. For the `{iothub_name}`, you only need the first part of the host name, for example, `MyExampleHub`.
+>
+> ```azurecli
+> az iot hub generate-sas-token -d {device_id} -n {iothub_name}
+> ```
+
+### Send data to your IoT hub
+
+You call the IoT Hub [Send Device Event](/rest/api/iothub/device/send-device-event) REST API to send telemetry from the device.
+
+Use the following curl command:
+
+```bash
+curl -L -i -X POST -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -H 'Authorization: [sas_token]' -d '{"temperature": 30}' https://[assigned_iot_hub_name].azure-devices.net/devices/[device_id]/messages/events?api-version=2020-03-13
+```
+
+Where:
+
+* `-X POST` tells curl that this is an HTTP POST command. Required for this API call.
+
+* `-H 'Content-Type: application/json'` tells IoT Hub that we're posting JSON content. The value must be `application/json`.
+
+* `-H 'Content-Encoding: utf-8'` tells IoT Hub the encoding we're using for our message body. Set to the proper value for your OS/client; however, it's generally `utf-8`.
+
+* `-H 'Authorization: [sas_token]'` tells IoT Hub to authenticate using your SAS token. Replace `[sas_token]` with the token you generated for the assigned IoT hub.
+
+* `-d '{"temperature": 30}'`, the `-d` parameter is the 'data' or body of the message we're posting. For this article, we're posting a single temperature data point. The content type was specified as `application/json`, so, for this request, the body is JSON. Note that for curl, it's wrapped in single quotes; otherwise, you need to escape the double quotes in the JSON.
+
+* The last parameter is the URL to post to. For the Send Device Event API, the URL is: `https://[assigned_iot_hub_name].azure-devices.net/devices/[device_id]/messages/events?api-version=2020-03-13`.
+
+ * Replace `[assigned_iot_hub_name]` with the name of the IoT hub that your device was assigned to.
+
+ * Replace `[device_id]` with the device ID that was assigned when you registered your device. For devices that provision through enrollment groups, the device ID is the registration ID. For individual enrollments, you can, optionally, specify a device ID that's different from the registration ID in the enrollment entry.
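As with the registration call, the request can be assembled in code. This sketch (hypothetical helper; it builds but doesn't send the request) mirrors the curl parameters above:

```python
import json

def build_telemetry_request(hub_name, device_id, sas_token, payload,
                            api_version='2020-03-13'):
    """Build the URL, headers, and body for the IoT Hub Send Device Event call."""
    url = (f'https://{hub_name}.azure-devices.net/devices/{device_id}'
           f'/messages/events?api-version={api_version}')
    headers = {
        'Content-Type': 'application/json',
        'Content-Encoding': 'utf-8',
        'Authorization': sas_token,
    }
    return url, headers, json.dumps(payload)

url, headers, body = build_telemetry_request(
    'MyExampleHub', 'my-symkey-device', 'SharedAccessSignature sr=...',
    {'temperature': 30})
print(url)
```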
+
+For example, for a device with a device ID of `my-symkey-device` sending a telemetry data point to an IoT hub named `MyExampleHub`:
+
+```bash
+curl -L -i -X POST -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -H 'Authorization: SharedAccessSignature sr=MyExampleHub.azure-devices.net%2Fdevices%2Fmy-symkey-device&sig=f%2BwW8XOKeJOtiPc9Iwjc4OpExvPM7NlhM9qxN2a1aAM%3D&se=1663119026' -d '{"temperature": 30}' https://MyExampleHub.azure-devices.net/devices/my-symkey-device/messages/events?api-version=2020-03-13
+```
+
+A successful call will have a response similar to the following:
+
+```output
+HTTP/1.1 204 No Content
+Content-Length: 0
+Vary: Origin
+Server: Microsoft-HTTPAPI/2.0
+x-ms-request-id: 9e278582-3561-417b-b807-76426195920f
+Date: Wed, 14 Sep 2022 00:32:53 GMT
+```
+
+## Next steps
+
+* To learn more about symmetric key attestation, see [Symmetric key attestation](concepts-symmetric-key-attestation.md).
+
+* To learn more about SAS tokens and their structure, see [Control access to DPS with SAS](how-to-control-access.md).
iot-dps Iot Dps Https X509 Support https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-dps/iot-dps-https-x509-support.md
+
+ Title: How to use raw HTTPS with X.509 certificates with Azure IoT Hub Device Provisioning Service
+description: This article shows how to use X.509 certificates over HTTPS in your Device Provisioning Service (DPS) instance
+Last updated: 10/27/2022
+# How to use X.509 certificates over HTTPS without an SDK
+
+In this how-to article, you'll provision a device using X.509 certificates over HTTPS without using an Azure IoT DPS device SDK. Most languages provide libraries to send HTTP requests, but, rather than focus on a specific language, in this article, you'll use the [cURL](https://en.wikipedia.org/wiki/CURL) command-line tool to send and receive over HTTPS.
+
+You can follow the steps in this article on either a Linux or a Windows machine. If you're running on Windows Subsystem for Linux (WSL) or running on a Linux machine, you can enter all commands on your local system in a Bash prompt. If you're running on Windows, enter all commands on your local system in a GitBash prompt.
+
+There are multiple paths through this article depending on the type of enrollment entry and X.509 certificate(s) you choose to use. After installing the prerequisites, be sure to read the [Overview](#overview) before proceeding.
+
+## Prerequisites
+
+* If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
+
+* Complete the steps in [Set up IoT Hub Device Provisioning Service with the Azure portal](./quick-setup-auto-provision.md).
+
+* Make sure you have [Python 3.7](https://www.python.org/downloads/) or later installed on your machine. You can check your version of Python by running `python --version` or `python3 --version`.
+
+* If you're running in Windows, install the latest version of [Git](https://git-scm.com/download/). Make sure that Git is added to the environment variables accessible to the command window. See [Software Freedom Conservancy's Git client tools](https://git-scm.com/download/) for the latest version of `git` tools to install, which includes *Git Bash*, the command-line app that you can use to interact with your local Git repository. On Windows, you'll enter all commands on your local system in a GitBash prompt.
+
+* Azure CLI. You have two options for running Azure CLI commands in this article:
+ * Use the Azure Cloud Shell, an interactive shell that runs CLI commands in your browser. This option is recommended because you don't need to install anything. If you're using Cloud Shell for the first time, log into the [Azure portal](https://portal.azure.com). Follow the steps in [Cloud Shell quickstart](../cloud-shell/quickstart.md) to **Start Cloud Shell** and **Select the Bash environment**.
+ * Optionally, run Azure CLI on your local machine. If Azure CLI is already installed, run `az upgrade` to upgrade the CLI and extensions to the current version. To install Azure CLI, see [Install Azure CLI]( /cli/azure/install-azure-cli).
+
+* If you're running in a Linux or a WSL environment, open a Bash prompt to run commands locally. If you're running in a Windows environment, open a GitBash prompt.
+
+## Overview
+
+There are three scenarios covered in this article, and the initial steps you'll perform will be different for each. If you want to:
+
+* Provision through an [individual enrollment](concepts-service.md#individual-enrollment) using a self-signed certificate, follow the steps in these sections:
+
+ 1. [Use a self-signed certificate](#use-a-self-signed-certificate) to create a self-signed certificate.
+ 1. [Use an individual enrollment](#use-an-individual-enrollment) to create an individual enrollment.
+
+* Provision through an individual enrollment using a certificate chain, follow the steps in these sections:
+
+ 1. [Use a certificate chain](#use-a-certificate-chain) to create a certificate chain.
+ 1. [Use an individual enrollment](#use-an-individual-enrollment) to create an individual enrollment.
+ 1. [Upload and verify a signing certificate](#upload-and-verify-a-signing-certificate) to upload and verify your root CA certificate.
+
+* Provision through an [enrollment group](concepts-service.md#enrollment-group), follow the steps in these sections:
+
+ 1. [Use a certificate chain](#use-a-certificate-chain) to create a certificate chain.
+ 1. [Use an enrollment group](#use-an-enrollment-group) to create an enrollment group.
+ 1. [Upload and verify a signing certificate](#upload-and-verify-a-signing-certificate) to upload and verify your root CA certificate.
+
+Once you've completed the steps for your chosen scenario, you can continue on to [Register your device](#register-your-device) and [Send a telemetry message](#send-a-telemetry-message).
+
+## Create a device certificate
+
+For this article, you'll use an X.509 certificate to authenticate with DPS using either an individual enrollment or an enrollment group.
+
+If you're using an individual enrollment, you can use a self-signed X.509 certificate or a [certificate chain](concepts-x509-attestation.md) composed of the device certificate plus one or more signing certificates. If you're using an enrollment group, you must use a certificate chain.
+
+> [!IMPORTANT]
+>
+> For X.509 enrollment authentication, the subject common name (CN) of the device certificate is used as the registration ID for the device. The registration ID is a case-insensitive string of alphanumeric characters plus the special characters: `'-'`, `'.'`, `'_'`, `':'`. The last character must be alphanumeric or dash (`'-'`). DPS supports registration IDs up to 128 characters long; however, the subject common name of an X.509 certificate is limited to 64 characters. If you change the subject common name for your device certificate in the following steps, make sure it adheres to this format.
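The format rules in the note above can be expressed as a quick validation sketch (an illustration only; DPS performs its own validation server-side):

```python
import re

# Alphanumerics plus '-', '.', '_', ':'; the last character must be
# alphanumeric or '-'
REGISTRATION_ID_RE = re.compile(r'[A-Za-z0-9.\-_:]{0,127}[A-Za-z0-9\-]')

def is_valid_registration_id(reg_id, from_cert_cn=True):
    """Check a registration ID against the format described above."""
    # A certificate subject CN caps the length at 64; DPS itself allows 128
    limit = 64 if from_cert_cn else 128
    return len(reg_id) <= limit and bool(REGISTRATION_ID_RE.fullmatch(reg_id))

print(is_valid_registration_id('my-x509-device'))  # True
print(is_valid_registration_id('bad-ending:'))     # False: ends with ':'
```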
+
+### Use a self-signed certificate
+
+To create a self-signed certificate to use with an individual enrollment, navigate to a directory where you want to create your certificate and follow these steps:
+
+1. Run the following command:
+
+ # [Windows (GitBash)](#tab/windows)
+
+ ```bash
+ winpty openssl req -outform PEM -x509 -sha256 -newkey rsa:4096 -keyout device-key.pem -out device-cert.pem -days 30 -extensions usr_cert -addext extendedKeyUsage=clientAuth -subj "//CN=my-x509-device"
+ ```
+
+ > [!IMPORTANT]
+ > The extra forward slash given for the subject name (`//CN=my-x509-device`) is only required to escape the string with Git on Windows platforms.
+
+ # [Linux/WSL](#tab/linux)
+
+ ```bash
+ openssl req -outform PEM -x509 -sha256 -newkey rsa:4096 -keyout device-key.pem -out device-cert.pem -days 30 -extensions usr_cert -addext extendedKeyUsage=clientAuth -subj "/CN=my-x509-device"
+ ```
+
+
+
+2. When asked to **Enter PEM pass phrase:**, use the pass phrase `1234`.
+
+3. When asked **Verifying - Enter PEM pass phrase:**, use the pass phrase `1234` again.
+
+ A public key certificate file (*device-cert.pem*) and private key file (*device-key.pem*) should now be generated in the directory where you ran the `openssl` command.
+
+ The certificate file has its subject common name (CN) set to `my-x509-device`.
+
+ The private key file is protected by the pass phrase: `1234`.
+
+4. The certificate file is Base64 encoded. To view the subject common name (CN) and other properties of the certificate file, enter the following command:
+
+ # [Windows (GitBash)](#tab/windows)
+
+ ```bash
+ winpty openssl x509 -in device-cert.pem -text -noout
+ ```
+
+ # [Linux/WSL](#tab/linux)
+
+ ```bash
+ openssl x509 -in device-cert.pem -text -noout
+ ```
+
+
+
+ ```output
+ Certificate:
+ Data:
+ Version: 3 (0x2)
+ Serial Number:
+ 77:3e:1d:e4:7e:c8:40:14:08:c6:09:75:50:9c:1a:35:6e:19:52:e2
+ Signature Algorithm: sha256WithRSAEncryption
+ Issuer: CN = my-x509-device
+ Validity
+ Not Before: May 5 21:41:42 2022 GMT
+ Not After : Jun 4 21:41:42 2022 GMT
+ Subject: CN = my-x509-device
+ Subject Public Key Info:
+ Public Key Algorithm: rsaEncryption
+ RSA Public-Key: (4096 bit)
+ Modulus:
+ 00:d2:94:37:d6:1b:f7:43:b4:21:c6:08:1a:d6:d7:
+ e6:40:44:4e:4d:24:41:6c:3e:8c:b2:2c:b0:23:29:
+ ...
+ 23:6e:58:76:45:18:03:dc:2e:9d:3f:ac:a3:5c:1f:
+ 9f:66:b0:05:d5:1c:fe:69:de:a9:09:13:28:c6:85:
+ 0e:cd:53
+ Exponent: 65537 (0x10001)
+ X509v3 extensions:
+ X509v3 Basic Constraints:
+ CA:FALSE
+ Netscape Comment:
+ OpenSSL Generated Certificate
+ X509v3 Subject Key Identifier:
+ 63:C0:B5:93:BF:29:F8:57:F8:F9:26:44:70:6F:9B:A4:C7:E3:75:18
+ X509v3 Authority Key Identifier:
+ keyid:63:C0:B5:93:BF:29:F8:57:F8:F9:26:44:70:6F:9B:A4:C7:E3:75:18
+
+ X509v3 Extended Key Usage:
+ TLS Web Client Authentication
+ Signature Algorithm: sha256WithRSAEncryption
+ 82:8a:98:f8:47:00:85:be:21:15:64:b9:22:b0:13:cc:9e:9a:
+ ed:f5:93:b9:4b:57:0f:79:85:9d:89:47:69:95:65:5e:b3:b1:
+ ...
+ cc:b2:20:9a:b7:f2:5e:6b:81:a1:04:93:e9:2b:92:62:e0:1c:
+ ac:d2:49:b9:36:d2:b0:21
+ ```
+
+### Use a certificate chain
+
+If you're using an enrollment group, you must authenticate with a certificate chain. With an individual enrollment, you can use a certificate chain or a self-signed certificate.
+
+To create a certificate chain, follow the instructions in [Create an X.509 certificate chain](tutorial-custom-hsm-enrollment-group-x509.md?tabs=linux#create-an-x509-certificate-chain). You only need one device for this article, so you can stop after creating the private key and certificate chain for the first device.
+
+When you're finished, you should have the following files:
+
+| Certificate | File | Description |
+| - | | - |
+| root CA certificate | *certs/azure-iot-test-only.root.ca.cert.pem* | Will be uploaded to DPS and verified. |
+| intermediate CA certificate | *certs/azure-iot-test-only.intermediate.cert.pem* | Will be used to create an enrollment group in DPS. |
+| device-01 private key | *private/device-01.key.pem* | Used by the device to verify ownership of the device certificate during authentication with DPS. |
+| device-01 certificate | *certs/device-01.cert.pem* | Used to create an individual enrollment entry with DPS. |
+| device-01 full chain certificate | *certs/device-01-full-chain.cert.pem* | Presented by the device to authenticate and register with DPS. |
+
+## Use an individual enrollment
+
+To create an individual enrollment to use for this article, use the [az iot dps enrollment create](/cli/azure/iot/dps/enrollment#az-iot-dps-enrollment-create) command.
+
+The following command creates an individual enrollment entry with the default allocation policy for your DPS instance using the device certificate you specify.
+
+```azurecli
+az iot dps enrollment create -g {resource_group_name} --dps-name {dps_name} --enrollment-id {enrollment_id} --attestation-type x509 --certificate-path {path to your certificate}
+```
+
+* Substitute the name of your resource group and DPS instance.
+
+* The enrollment ID is the registration ID for your device and, for X.509 enrollments, must match the subject common name (CN) of the device certificate.
+
+ * If you followed the instructions in [Use a self-signed certificate](#use-a-self-signed-certificate), the enrollment ID is `my-x509-device`.
+
+ * If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain), the enrollment ID is `device-01`.
+
+* The certificate path is the path to your device certificate.
+
+ * If you followed the instructions in [Use a self-signed certificate](#use-a-self-signed-certificate), the filename is *device-cert.pem*.
+
+ * If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain), the filename is *certs/device-01.cert.pem*.
+
+> [!NOTE]
+> If you're using Cloud Shell to run Azure CLI commands, you can use the upload button to upload your certificate file to your cloud drive before you run the command.
+>
+> :::image type="content" source="media/iot-dps-https-x509-support/upload-to-cloud-shell.png" alt-text="Screenshot that shows the upload file button in Azure Cloud Shell.":::
+
+## Use an enrollment group
+
+To create an enrollment group to use for this article, use the [az iot dps enrollment-group create](/cli/azure/iot/dps/enrollment-group#az-iot-dps-enrollment-group-create) command.
+
+The following command creates an enrollment group entry with the default allocation policy for your DPS instance using an intermediate CA certificate:
+
+```azurecli
+az iot dps enrollment-group create -g {resource_group_name} --dps-name {dps_name} --enrollment-id {enrollment_id} --certificate-path {path_to_your_certificate}
+```
+
+* Substitute the name of your resource group and DPS instance.
+
+* The enrollment ID is a case-insensitive string of alphanumeric characters plus the special characters: `'-'`, `'.'`, `'_'`, `':'`. The last character must be alphanumeric or dash (`'-'`). It can be any name you choose to use for the enrollment group.
+
+* The certificate path is the path to your intermediate certificate. If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain), the filename is *certs/azure-iot-test-only.intermediate.cert.pem*.
+
+> [!NOTE]
+> If you're using Cloud Shell to run Azure CLI commands, you can use the upload button to upload your certificate file to your cloud drive before you run the command.
+>
+> :::image type="content" source="media/iot-dps-https-x509-support/upload-to-cloud-shell.png" alt-text="Screenshot that shows the upload file button in Azure Cloud Shell.":::
+
+> [!NOTE]
+>
+> If you prefer, you can create an enrollment group based on a signing certificate that has been previously uploaded and verified with DPS (see the next section). To do so, specify the certificate name with the `--ca-name` parameter and omit the `--certificate-path` parameter in the `az iot dps enrollment-group create` command.
+
+## Upload and verify a signing certificate
+
+If you're using a certificate chain for either an individual enrollment or an enrollment group, you must upload and verify at least one certificate in the device certificate's signing chain to DPS.
+
+* For an individual enrollment, this can be any signing certificate in the device's certificate chain.
+
+* For an enrollment group, this can be the certificate set on the enrollment group or any certificate in its signing chain up to and including the root CA certificate.
+
+To upload and verify your certificate, use the [az iot dps certificate create](/cli/azure/iot/dps/certificate#az-iot-dps-certificate-create) command:
+
+```azurecli
+az iot dps certificate create -g {resource_group_name} --dps-name {dps_name} --certificate-name {friendly_name_for_your_certificate} --path {path_to_your_certificate} --verified true
+```
+
+* Substitute the name of your resource group and DPS instance.
+
+* The certificate path is the path to your signing certificate. For this article, we recommend you upload the root CA certificate. If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain), the filename is *certs/azure-iot-test-only.root.ca.cert.pem*.
+
+* The certificate name can contain only alphanumeric characters or the following special characters: `-._`. No whitespace is permitted. For example, "azure-iot-test-only-root".
+
+> [!NOTE]
+> If you're using Cloud Shell to run Azure CLI commands, you can use the upload button to upload your certificate file to your cloud drive before you run the command.
+>
+> :::image type="content" source="media/iot-dps-https-x509-support/upload-to-cloud-shell.png" alt-text="Screenshot that shows the upload file button in Azure Cloud Shell.":::
+
+> [!NOTE]
+>
+> The steps in this section automatically verified the certificate on upload. You can also do manual verification of the certificate. To learn more, see [Manual verification of intermediate or root CA](how-to-verify-certificates.md#manual-verification-of-intermediate-or-root-ca).
+
+## Register your device
+
+You call the [Register Device](/rest/api/iot-dps/device/runtime-registration/register-device) REST API to provision your device through DPS.
+
+Use the following curl command:
+
+```bash
+curl -L -i -X PUT --cert [path_to_your_device_cert] --key [path_to_your_device_private_key] -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -d '{"registrationId": "[registration_id]"}' https://global.azure-devices-provisioning.net/[dps_id_scope]/registrations/[registration_id]/register?api-version=2019-03-31
+```
+
+Where:
+
+* `-L` tells curl to follow HTTP redirects.
+
+* `-i` tells curl to include protocol headers in output. These headers aren't strictly necessary, but they can be useful.
+
+* `-X PUT` tells curl that this is an HTTP PUT command. Required for this API call.
+
+* `--cert [path_to_your_device_cert]` tells curl where to find your device's X.509 certificate. If your device private key is protected by a pass phrase, you can add the pass phrase after the certificate path preceded by a colon, for example: `--cert my-device.pem:1234`.
+
+ * If you're using a self-signed certificate, your device certificate file will only contain a single X.509 certificate. If you followed the instructions in [Use a self-signed certificate](#use-a-self-signed-certificate), the filename is *device-cert.pem* and the private key pass phrase is `1234`, so use `--cert device-cert.pem:1234`.
+
+ * If you're using a certificate chain, for example, when authenticating through an enrollment group, your device certificate file must contain a valid certificate chain. The certificate chain must include the device certificate and any signing certificates up to and including a verified certificate. If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain) to create the certificate chain, the filepath is *certs/device-01-full-chain.cert.pem*, so use `--cert certs/device-01-full-chain.cert.pem`.
+
+* `--key [path_to_your_device_private_key]` tells curl where to find your device's private key.
+
+  * If you followed the instructions in [Use a self-signed certificate](#use-a-self-signed-certificate), the filename is *device-key.pem*, so use `--key device-key.pem`.
+
+  * If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain), the key path is *private/device-01.key.pem*, so use `--key private/device-01.key.pem`.
+
+* `-H 'Content-Type: application/json'` tells DPS that the request body is JSON. This header must be set to `application/json` for this API call.
+
+* `-H 'Content-Encoding: utf-8'` tells DPS the encoding we're using for our message body. Set to the proper value for your OS/client; however, it's generally `utf-8`.
+
+* `-d '{"registrationId": "[registration_id]"}'`, the `-d` parameter is the 'data' or body of the message we're posting. It must be JSON, in the form `'{"registrationId":"[registration_id]"}'`. Note that for curl, the body is wrapped in single quotes; otherwise, you need to escape the double quotes in the JSON. For X.509 enrollment, the registration ID is the subject common name (CN) of your device certificate.
+
+* Finally, the last parameter is the URL to post to. For "regular" (that is, not on-premises) DPS, the global DPS endpoint, *global.azure-devices-provisioning.net*, is used: `https://global.azure-devices-provisioning.net/[dps_id_scope]/registrations/[registration_id]/register?api-version=2019-03-31`. Replace `[dps_id_scope]` and `[registration_id]` with the appropriate values.
+
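Before substituting values by hand, it can help to collect the placeholders into shell variables. The following bash sketch is illustrative only (the ID scope and registration ID are the hypothetical example values used later in this article); the actual curl call is left commented out because it needs your certificate files and a live DPS instance:

```shell
# Illustrative: parameterize the Register Device call with shell variables.
# The ID scope and registration ID below are example values; replace them
# with your own.
id_scope="0ne00111111"
registration_id="my-x509-device"
api_version="2019-03-31"

# Build the request URL and the JSON body from the variables.
register_url="https://global.azure-devices-provisioning.net/${id_scope}/registrations/${registration_id}/register?api-version=${api_version}"
body="{\"registrationId\": \"${registration_id}\"}"

echo "$register_url"

# curl -L -i -X PUT --cert device-cert.pem:1234 --key device-key.pem \
#   -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' \
#   -d "$body" "$register_url"
```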
+For example:
+
+* If you followed the instructions in [Use a self-signed certificate](#use-a-self-signed-certificate):
+
+ ```bash
+ curl -L -i -X PUT --cert device-cert.pem:1234 --key device-key.pem -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -d '{"registrationId": "my-x509-device"}' https://global.azure-devices-provisioning.net/0ne00111111/registrations/my-x509-device/register?api-version=2021-06-01
+ ```
+
+* If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain):
+
+ ```bash
+ curl -L -i -X PUT --cert certs/device-01-full-chain.cert.pem --key private/device-01.key.pem -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -d '{"registrationId": "device-01"}' https://global.azure-devices-provisioning.net/0ne00111111/registrations/device-01/register?api-version=2021-06-01
+ ```
+
+A successful call will have a response similar to the following:
+
+```output
+HTTP/1.1 202 Accepted
+Date: Sat, 27 Aug 2022 17:53:18 GMT
+Content-Type: application/json; charset=utf-8
+Transfer-Encoding: chunked
+Location: https://global.azure-devices-provisioning.net/0ne00111111/registrations/my-x509-device/register
+Retry-After: 3
+x-ms-request-id: 05cdec07-c0c7-48f3-b3cd-30cfe27cbe57
+Strict-Transport-Security: max-age=31536000; includeSubDomains
+
+{"operationId":"5.506603669bd3e2bf.b3602f8f-76fe-4341-9214-bb6cfb891b8a","status":"assigning"}
+```
+
+The response contains an operation ID and a status. In this case, the status is set to `assigning`. DPS enrollment is, potentially, a long-running operation, so it's done asynchronously. Typically, you'll poll for status using the [Operation Status Lookup](/rest/api/iot-dps/device/runtime-registration/operation-status-lookup) REST API to determine when your device has been assigned or whether a failure has occurred.
+
+The valid status values for DPS are:
+
+* `assigned`: the return value from the status call indicates the IoT hub that the device was assigned to.
+
+* `assigning`: the operation is still running.
+
+* `disabled`: the enrollment record is disabled in DPS, so the device can't be assigned.
+
+* `failed`: the assignment failed. An `errorCode` and `errorMessage` are returned in the `registrationState` record in the response to indicate what failed.
+
+* `unassigned`: the device hasn't been assigned to an IoT hub.
+
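In a script, you can branch on the returned status. The following bash sketch is illustrative only: it parses a hard-coded sample response with `grep` and `cut` rather than a JSON parser, and in practice you'd capture the body from the curl call instead.

```shell
# Illustrative: branch on the DPS operation status.
# The sample body below mirrors the "assigned" response shown later in
# this article; in a real script, capture it from curl instead.
response='{"operationId":"5.506603669bd3e2bf.b3602f8f-76fe-4341-9214-bb6cfb891b8a","status":"assigned","registrationState":{"assignedHub":"MyExampleHub.azure-devices.net","deviceId":"my-x509-device"}}'

# Extract the first (top-level) "status" value from the JSON.
status=$(echo "$response" | grep -o '"status":"[^"]*"' | head -n 1 | cut -d'"' -f4)

case "$status" in
  assigned)  echo "Device assigned; read registrationState.assignedHub for the hub." ;;
  assigning) echo "Operation still running; poll again after the Retry-After interval." ;;
  disabled)  echo "Enrollment record is disabled; the device can't be assigned." ;;
  failed)    echo "Assignment failed; check errorCode and errorMessage." ;;
  *)         echo "Status: $status" ;;
esac
```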
+To call the **Operation Status Lookup** API, use the following curl command:
+
+```bash
+curl -L -i -X GET --cert [path_to_your_device_cert] --key [path_to_your_device_private_key] -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' https://global.azure-devices-provisioning.net/[dps_id_scope]/registrations/[registration_id]/operations/[operation_id]?api-version=2019-03-31
+```
+
+You'll use the same ID scope, registration ID, and certificate and key as you did in the **Register Device** request. Use the operation ID that was returned in the **Register Device** response.
+
+For example, the following command is for the self-signed certificate created in [Use a self-signed certificate](#use-a-self-signed-certificate). (You need to modify the ID scope and operation ID.)
+
+```bash
+curl -L -i -X GET --cert device-cert.pem:1234 --key device-key.pem -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' https://global.azure-devices-provisioning.net/0ne00111111/registrations/my-x509-device/operations/5.506603669bd3e2bf.b3602f8f-76fe-4341-9214-bb6cfb891b8a?api-version=2021-06-01
+```
+
+The following output shows the response for a device that has been successfully assigned. Notice that the `status` property is `assigned` and that the `registrationState.assignedHub` property is set to the IoT hub where the device was provisioned.
+
+```output
+HTTP/1.1 200 OK
+Date: Sat, 27 Aug 2022 18:10:49 GMT
+Content-Type: application/json; charset=utf-8
+Transfer-Encoding: chunked
+x-ms-request-id: 8f211bc5-3ed8-4c8b-9a79-e003e756e9e4
+Strict-Transport-Security: max-age=31536000; includeSubDomains
+
+{
+ "operationId":"5.506603669bd3e2bf.b3602f8f-76fe-4341-9214-bb6cfb891b8a",
+ "status":"assigned",
+ "registrationState":{
+ "x509":{
+
+ },
+ "registrationId":"my-x509-device",
+ "createdDateTimeUtc":"2022-08-27T17:53:19.5143497Z",
+ "assignedHub":"MyExampleHub.azure-devices.net",
+ "deviceId":"my-x509-device",
+ "status":"assigned",
+ "substatus":"initialAssignment",
+ "lastUpdatedDateTimeUtc":"2022-08-27T17:53:19.7519141Z",
+ "etag":"IjEyMDA4NmYyLTAwMDAtMDMwMC0wMDAwLTYzMGE1YTBmMDAwMCI="
+ }
+}
+```
+
+Note down the device ID and the assigned IoT hub. You'll use them to send a telemetry message in the next section.
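Those two values can be pulled out of a captured response with standard shell tools. A minimal, illustrative sketch (it uses `grep` and `cut` on a hard-coded, abbreviated copy of the response above rather than a JSON parser):

```shell
# Illustrative: extract the assigned hub and device ID from the lookup
# response. The sample body is abbreviated from the response shown above;
# in a real script, capture it from the curl call instead.
response='{"status":"assigned","registrationState":{"registrationId":"my-x509-device","assignedHub":"MyExampleHub.azure-devices.net","deviceId":"my-x509-device","status":"assigned"}}'

assigned_hub=$(echo "$response" | grep -o '"assignedHub":"[^"]*"' | cut -d'"' -f4)
device_id=$(echo "$response" | grep -o '"deviceId":"[^"]*"' | cut -d'"' -f4)

echo "Assigned hub: $assigned_hub"
echo "Device ID: $device_id"
```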
+
+## Send a telemetry message
+
+You call the IoT Hub [Send Device Event](/rest/api/iothub/device/send-device-event) REST API to send telemetry from your device to your IoT hub.
+
+Use the following curl command:
+
+```bash
+curl -L -i -X POST --cert [path_to_your_device_cert] --key [path_to_your_device_private_key] -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -d '{"temperature": 30}' https://[assigned_iot_hub_name].azure-devices.net/devices/[device_id]/messages/events?api-version=2020-03-13
+```
+
+Where:
+
+* `-X POST` tells curl that this is an HTTP POST command. Required for this API call.
+
+* `--cert [path_to_your_device_cert]` tells curl where to find your device's X.509 certificate. If your device private key is protected by a pass phrase, you can add the pass phrase after the certificate path preceded by a colon, for example: `--cert my-device.pem:1234`.
+
+  * If you're using a self-signed certificate, your device certificate file will only contain a single X.509 certificate. If you followed the instructions in [Use a self-signed certificate](#use-a-self-signed-certificate), the filename is *device-cert.pem* and the private key pass phrase is `1234`, so use `--cert device-cert.pem:1234`.
+
+ * If you're using a certificate chain, your device certificate file must contain a valid certificate chain. If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain) to create the certificate chain, the filepath is *certs/device-01-full-chain.cert.pem*, so use `--cert certs/device-01-full-chain.cert.pem`.
+
+* `--key [path_to_your_device_private_key]` tells curl where to find your device's private key.
+
+  * If you followed the instructions in [Use a self-signed certificate](#use-a-self-signed-certificate), the filename is *device-key.pem*, so use `--key device-key.pem`.
+
+  * If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain), the key path is *private/device-01.key.pem*, so use `--key private/device-01.key.pem`.
+
+* `-H 'Content-Type: application/json'` tells IoT Hub that the request body is JSON. This header must be set to `application/json` for this API call.
+
+* `-H 'Content-Encoding: utf-8'` tells IoT Hub the encoding we're using for our message body. Set to the proper value for your OS/client; however, it's generally `utf-8`.
+
+* `-d '{"temperature": 30}'`, the `-d` parameter is the 'data' or body of the message we're posting. For this article, we're posting a single temperature data point. The content type was specified as `application/json`, so, for this request, the body is JSON. Note that for curl, the body is wrapped in single quotes; otherwise, you need to escape the double quotes in the JSON.
+
+* The last parameter is the URL to post to. For the Send Device Event API, the URL is: `https://[assigned_iot_hub_name].azure-devices.net/devices/[device_id]/messages/events?api-version=2020-03-13`.
+
+ * Replace `[assigned_iot_hub_name]` with the name of the IoT hub that your device was assigned to.
+
+  * Replace `[device_id]` with the device ID that was assigned when you registered your device. For devices that provision through enrollment groups, the device ID is the registration ID. For individual enrollments, you can, optionally, specify a device ID that's different from the registration ID in the enrollment entry.
+
+For example:
+
+* If you followed the instructions in [Use a self-signed certificate](#use-a-self-signed-certificate):
+
+ ```bash
+ curl -L -i -X POST --cert device-cert.pem:1234 --key device-key.pem -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -d '{"temperature": 30}' https://MyExampleHub.azure-devices.net/devices/my-x509-device/messages/events?api-version=2020-03-13
+ ```
+
+* If you followed the instructions in [Use a certificate chain](#use-a-certificate-chain):
+
+ ```bash
+  curl -L -i -X POST --cert certs/device-01-full-chain.cert.pem --key private/device-01.key.pem -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' -d '{"temperature": 30}' https://MyExampleHub.azure-devices.net/devices/device-01/messages/events?api-version=2020-03-13
+ ```
+
+A successful call will have a response similar to the following:
+
+```output
+HTTP/1.1 204 No Content
+Content-Length: 0
+Vary: Origin
+Server: Microsoft-HTTPAPI/2.0
+x-ms-request-id: aa58c075-20d9-4565-8058-de6dc8524f14
+Date: Wed, 31 Aug 2022 18:34:44 GMT
+```
+
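To send telemetry on a schedule, the same pieces can be combined in a small script. The following bash sketch is illustrative only (the hub and device names are the example values from this article); the curl loop is commented out because it needs your certificate files and a live hub:

```shell
# Illustrative: build the Send Device Event URL from the values noted
# after provisioning. These are the example values from this article;
# replace them with your own assigned hub and device ID.
assigned_hub="MyExampleHub.azure-devices.net"   # registrationState.assignedHub
device_id="my-x509-device"                      # registrationState.deviceId
api_version="2020-03-13"

events_url="https://${assigned_hub}/devices/${device_id}/messages/events?api-version=${api_version}"
echo "$events_url"

# Send one reading every 10 seconds (uncomment to run against a live hub):
# while true; do
#   curl -L -s -X POST --cert device-cert.pem:1234 --key device-key.pem \
#     -H 'Content-Type: application/json' -H 'Content-Encoding: utf-8' \
#     -d '{"temperature": 30}' "$events_url"
#   sleep 10
# done
```
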
+## Next steps
+
+* To learn more about attestation with X.509 certificates, see [X.509 certificate attestation](concepts-x509-attestation.md).
+
+* To learn more about uploading and verifying X.509 certificates, see [Configure verified CA certificates](how-to-verify-certificates.md).
iot-hub-device-update Create Device Update Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub-device-update/create-device-update-account.md
An IoT hub. It's recommended that you use an S1 (Standard) tier or above.
:::image type="content" source="media/create-device-update-account/account-details.png" alt-text="Screenshot of account details." lightbox="media/create-device-update-account/account-details.png":::
-4. Optionally, you can check the box to assign the Device Update administrator role to yourself. You can also use the steps listed in the [Configure access control roles](#configure-access-control-roles) section to provide a combination of roles to users and applications for the right level of access.
+4. Optionally, you can check the box to assign the Device Update administrator role to yourself. You can also use the steps listed in the [Configure access control roles](#configure-access-control-roles-for-device-update) section to provide a combination of roles to users and applications for the right level of access.
You need to have Owner or User Access Administrator permissions in your subscription to manage roles.
az iot device-update instance create --account <account_name> --instance <instan
-## Configure access control roles
+## Configure access control roles for Device Update
In order for other users to have access to Device Update, they must be granted access to this resource. You can skip this step if you assigned the Device Update administrator role to yourself during account creation and don't need to provide access to other users or applications.
az role assignment create --role '<role>' --assignee <user_group> --scope <accou
+## Configure access control roles for IoT Hub
+
+Device Update for IoT Hub communicates with IoT Hub to manage deployments and updates and to get information about devices. To enable this access, grant the **Azure Device Update** service principal the **IoT Hub Data Contributor** role.
+
+# [Azure portal](#tab/portal)
+
+1. In the Azure portal, navigate to the IoT hub connected to your Device Update instance.
+1. Select **Access control (IAM)** from the navigation menu.
+1. Select **Add** > **Add role assignment**.
+1. In the **Role** tab, select **IoT Hub Data Contributor**. Select **Next**.
+1. For **Assign access to**, select **User, group, or service principal**.
+1. Select **Select members** and search for **Azure Device Update**.
+1. Select **Next** > **Review + assign**.
+
+To validate that you've set permissions correctly:
+
+1. In the Azure portal, navigate to the IoT hub connected to your Device Update instance.
+1. Select **Access control (IAM)** from the navigation menu.
+1. Select **Check access**.
+1. Select **User, group, or service principal** and search for **Azure Device Update**.
+1. Select **Azure Device Update**, and verify that the **IoT Hub Data Contributor** role is listed under **Role assignments**.
+
+# [Azure CLI](#tab/cli)
+
+Use the [az role assignment create](/cli/azure/role/assignment#az-role-assignment-create) command to create a role assignment for the Azure Device Update service principal.
+
+Replace *\<resource_id>* with the resource ID of your IoT hub. You can retrieve the resource ID by using the [az iot hub show](/cli/azure/iot/hub#az-iot-hub-show) command and querying for the ID value: `az iot hub show -n <hub_name> --query id`.
+
+```azurecli
+az role assignment create --role "IoT Hub Data Contributor" --assignee https://api.adu.microsoft.com/ --scope <resource_id>
+```
+++ ## View and query accounts or instances You can view, sort, and query all of your Device Update accounts and instances.
iot-hub-device-update Device Update Control Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub-device-update/device-update-control-access.md
A combination of roles can be used to provide the right level of access. For exa
## Configuring access for Azure Device Update service principal in the IoT Hub
-Device Update for IoT Hub communicates with the IoT Hub for deployments and manage updates at scale. In order to enable Device Update to do this, users need to set IoT Hub Data Contributor Contributor access for Azure Device Update Service Principal in the IoT Hub permissions.
+Device Update for IoT Hub communicates with the IoT Hub to manage deployments and updates at scale. To enable Device Update to do this, users need to set IoT Hub Data Contributor access for the Azure Device Update service principal in the IoT Hub permissions.
The following actions will be blocked in an upcoming release if these permissions aren't set:+ * Create Deployment * Cancel Deployment
-* Retry Deployment
+* Retry Deployment
* Get Device 1. Go to the **IoT Hub** connected to your Device Update Instance. Click **Access Control(IAM)**
Below actions will be blocked with upcoming release, if these permissions are no
5. Click **Next** -> **Review + Assign** To validate that you've set permissions correctly:+ 1. Go to the **IoT Hub** connected to your Device Update Instance. Click **Access Control(IAM)** 2. Click **Check access** 3. Select **User, group, or service principal** and search for '**Azure Device Update**'
iot-hub-device-update Device Update Plug And Play https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-hub-device-update/device-update-plug-and-play.md
IoT Hub device twin example:
"deviceProperties": { "manufacturer": "contoso", "model": "virtual-vacuum-v1",
- "interfaceId": "dtmi:azure:iot:deviceUpdate;1",
+ "interfaceId": "dtmi:azure:iot:deviceUpdateModel;1",
"aduVer": "DU;agent/0.8.0-rc1-public-preview", "doVer": "DU;lib/v0.6.0+20211001.174458.c8c4051,DU;agent/v0.6.0+20211001.174418.c8c4051" },
key-vault Assign Access Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/general/assign-access-policy.md
A Key Vault access policy determines whether a given security principal, namely
1. In the [Azure portal](https://portal.azure.com), navigate to the Key Vault resource.
-1. Under **Settings**, select **Access policies**, then select **Add Access Policy**:
+1. Select **Access policies**, then select **Create**:
- ![Select Access policies, selecting Add role assignment](../media/authentication/assign-policy-portal-01.png)
+ ![Select Access policies, selecting Add role assignment](../media/authentication/assign-access-01.png)
-1. Select the permissions you want under **Certificate permissions**, **Key permissions**, and **Secret permissions**. You can also select a template that contains common permission combinations:
+1. Select the permissions you want under **Key permissions**, **Secret permissions**, and **Certificate permissions**.
- ![Specifying access policy permissions](../media/authentication/assign-policy-portal-02.png)
+ ![Specifying access policy permissions](../media/authentication/assign-access-02.png)
-1. Under **Select principal**, choose the **None selected** link to open the **Principal** selection pane. Enter the name of the user, app or service principal in the search field, select the appropriate result, then choose **Select**.
+1. Under the **Principal** selection pane, enter the name of the user, app or service principal in the search field and select the appropriate result.
- ![Selecting the security principal for the access policy](../media/authentication/assign-policy-portal-03.png)
+ ![Selecting the security principal for the access policy](../media/authentication/assign-access-03.png)
If you're using a managed identity for the app, search for and select the name of the app itself. (For more information on security principals, see [Key Vault authentication](authentication.md).)
-1. Back in the **Add access policy** pane, select **Add** to save the access policy.
+1. Review the access policy changes and select **Create** to save the access policy.
- ![Adding the access policy with the security principal assigned](../media/authentication/assign-policy-portal-04.png)
+ ![Adding the access policy with the security principal assigned](../media/authentication/assign-access-04.png)
-1. Back on the **Access policies** page, verify that your access policy is listed under **Current Access Policies**, then select **Save**. Access policies aren't applied until you save them.
+1. Back on the **Access policies** page, verify that your access policy is listed.
- ![Saving the access policy changes](../media/authentication/assign-policy-portal-05.png)
+ ![Saving the access policy changes](../media/authentication/assign-access-05.png)
# [Azure CLI](#tab/azure-cli)
key-vault Network Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/key-vault/general/network-security.md
To understand how to configure a private link connection on your key vault, plea
> * IP network rules are only allowed for public IP addresses. IP address ranges reserved for private networks (as defined in RFC 1918) are not allowed in IP rules. Private networks include addresses that start with **10.**, **172.16-31**, and **192.168.**. > * Only IPv4 addresses are supported at this time.
+### Public Access Disabled (Private Endpoint Only)
+
+To enhance network security, you can configure your vault to disable public access. Disabling public access denies all public network traffic and allows only connections through private endpoints.
+ ## References * ARM Template Reference: [Azure Key Vault ARM Template Reference](/azure/templates/Microsoft.KeyVault/vaults) * Azure CLI commands: [az keyvault network-rule](/cli/azure/keyvault/network-rule)
load-balancer Tutorial Load Balancer Port Forwarding Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/tutorial-load-balancer-port-forwarding-portal.md
Previously updated : 10/18/2022- Last updated : 10/28/2022+ # Tutorial: Create a single virtual machine inbound NAT rule using the Azure portal
logic-apps Concepts Schedule Automated Recurring Tasks Workflows https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md
ms.suite: integration Previously updated : 08/20/2022 Last updated : 10/26/2022 # Schedules for recurring triggers in Azure Logic Apps workflows
Here are some patterns that show how you can control recurrence with the start d
| {none} | Runs the first workload instantly. <p>Runs future workloads based on the last run time. | Runs the first workload instantly. <p>Runs future workloads based on the specified schedule. | | Start time in the past | **Recurrence** trigger: Calculates run times based on the specified start time and discards past run times. <p><p>Runs the first workload at the next future run time. <p><p>Runs future workloads based on the last run time. <p><p>**Sliding Window** trigger: Calculates run times based on the specified start time and honors past run times. <p><p>Runs future workloads based on the specified start time. <p><p>For more explanation, see the example following this table. | Runs the first workload *no sooner* than the start time, based on the schedule calculated from the start time. <p><p>Runs future workloads based on the specified schedule. <p><p>**Note:** If you specify a recurrence with a schedule, but don't specify hours or minutes for the schedule, Azure Logic Apps calculates future run times by using the hours or minutes, respectively, from the first run time. | | Start time now or in the future | Runs the first workload at the specified start time. <p><p>**Recurrence** trigger: Runs future workloads based on the last run time. <p><p>**Sliding Window** trigger: Runs future workloads based on the specified start time. | Runs the first workload *no sooner* than the start time, based on the schedule calculated from the start time. <p><p>Runs future workloads based on the specified schedule. If you use the **Day**, **Week**, or **Month** frequency, and you specify a future date and time, make sure that you set up the recurrence in advance: <p>- **Day**: Set up the daily recurrence at least 24 hours in advance. <p>- **Week**: Set up the weekly recurrence at least 7 days in advance. <p>- **Month**: Set up the monthly recurrence at least one month in advance. <p>Otherwise, the workflow might skip the first recurrence. 
<p>**Note:** If you specify a recurrence with a schedule, but don't specify hours or minutes for the schedule, Azure Logic Apps calculates future run times by using the hours or minutes, respectively, from the first run time. |
-||||
*Example for past start time and recurrence but no schedule*
Suppose the current date and time is September 8, 2017 at 1:00 PM. You specify t
| Start time | Current time | Recurrence | Schedule | ||--||-| | 2017-09-**07**T14:00:00Z <br>(2017-09-**07** at 2:00 PM) | 2017-09-**08**T13:00:00Z <br>(2017-09-**08** at 1:00 PM) | Every two days | {none} |
-|||||
For the Recurrence trigger, the Azure Logic Apps engine calculates run times based on the start time, discards past run times, uses the next future start time for the first run, and calculates future runs based on the last run time.
Here's how this recurrence looks:
| Start time | First run time | Future run times | ||-|| | 2017-09-**07** at 2:00 PM | 2017-09-**09** at 2:00 PM | 2017-09-**11** at 2:00 PM </br>2017-09-**13** at 2:00 PM </br>2017-09-**15** at 2:00 PM </br>and so on... |
-||||
So, no matter how far in the past you specify the start time, for example, 2017-09-**05** at 2:00 PM or 2017-09-**01** at 2:00 PM, your first run always uses the next future start time.
Here's how this recurrence looks:
| Start time | First run time | Future run times | ||-|| | 2017-09-**07** at 2:00 PM | 2017-09-**08** at 1:00 PM (Current time) | 2017-09-**09** at 2:00 PM </br>2017-09-**11** at 2:00 PM </br>2017-09-**13** at 2:00 PM </br>2017-09-**15** at 2:00 PM </br>and so on... |
-||||
So, no matter how far in the past you specify the start time, for example, 2017-09-**05** at 2:00 PM or 2017-09-**01** at 2:00 PM, your first run always uses the specified start time.
+## Recurrence behavior
+
+Recurring built-in triggers, such as the [Recurrence trigger](../connectors/connectors-native-recurrence.md), run natively on the Azure Logic Apps runtime. These triggers differ from recurring connection-based managed connector triggers where you need to create a connection first, such as the Office 365 Outlook managed connector trigger.
+
+For both kinds of triggers, if a recurrence doesn't specify a specific start date and time, the first recurrence runs immediately when you save or deploy the logic app resource, despite your trigger's recurrence setup. To avoid this behavior, provide a start date and time for when you want the first recurrence to run.
+
+### Recurrence for built-in triggers
+
+Recurring built-in triggers follow the schedule that you set, including any specified time zone. However, if a recurrence doesn't specify other advanced scheduling options, such as specific times to run future recurrences, those recurrences are based on the last trigger execution. As a result, the start times for those recurrences might drift due to factors such as latency during storage calls.
+
+For more information, review the following documentation:
+
+* [Trigger recurrence for daylight saving time and standard time](#daylight-saving-standard-time)
+* [Troubleshoot recurrence issues](../logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md#recurrence-issues)
+
+### Recurrence for connection-based triggers
+
+For recurring connection-based triggers, such as Office 365 Outlook, the schedule isn't the only driver that controls execution. The time zone only determines the initial start time. Subsequent runs depend on the recurrence schedule, the last trigger execution, and other factors that might cause run times to drift or produce unexpected behavior, for example:
+
+* Whether the trigger accesses a server that has more data, which the trigger immediately tries to fetch.
+* Any failures or retries that the trigger incurs.
+* Latency during storage calls.
+* Not maintaining the specified schedule when daylight saving time (DST) starts and ends.
+* Other factors that can affect when the next run time happens.
+
+For more information, review the following documentation:
+
+* [Trigger recurrence for daylight saving time and standard time](#daylight-saving-standard-time)
+* [Trigger recurrence shift and drift during daylight saving time and standard time](#recurrence-shift-drift)
+* [Troubleshoot recurrence issues](../logic-apps/concepts-schedule-automated-recurring-tasks-workflows.md#recurrence-issues)
+ <a name="daylight-saving-standard-time"></a>
-## Recurrence for daylight saving time and standard time
+### Trigger recurrence for daylight saving time and standard time
To schedule jobs, Azure Logic Apps puts the message for processing into the queue and specifies when that message becomes available, based on the UTC time when the last job ran and the UTC time when the next job is scheduled to run. If you specify a start time with your recurrence, *make sure that you select a time zone* so that your logic app workflow runs at the specified start time. That way, the UTC time for your logic app also shifts to counter the seasonal time change. Recurring triggers honor the schedule that you set, including any time zone that you specify.
-Otherwise, if you don't select a time zone, daylight saving time (DST) events might affect when triggers run. For example, the start time shifts one hour forward when DST starts and one hour backward when DST ends.
+If you don't select a time zone, daylight saving time (DST) events might affect when triggers run. For example, the start time shifts one hour forward when DST starts and one hour backward when DST ends.
+
+<a name="recurrence-shift-drift"></a>
+
+### Trigger recurrence shift and drift during daylight saving time and standard time
+
+For recurring connection-based triggers, the recurrence schedule isn't the only driver that controls execution. The time zone only determines the initial start time. Subsequent runs depend on the recurrence schedule, the last trigger execution, and other factors that might cause run times to drift or produce unexpected behavior, for example:
+
+* Failure to maintain the specified schedule when daylight saving time (DST) starts and ends.
+* Other factors that can affect when the next run time happens.
+* Latency during storage calls.
+* Whether the trigger accesses a server that has more data, which the trigger immediately tries to fetch.
+* Any failures or retries that the trigger incurs.
+
+To make sure that the recurrence time doesn't shift when DST takes effect, manually adjust the recurrence. That way, your workflow continues to run at the expected or specified start time. Otherwise, the start time shifts one hour forward when DST starts and one hour backward when DST ends.
<a name="dst-window"></a>
If these logic apps use the UTC-6:00 Central Time (US & Canada) zone, this simul
| 03/09/2019 | 1:30:00 AM | 7:30:00 AM | UTC before the day that DST takes effect. | | 03/10/2019 | 1:30:00 AM | 7:30:00 AM | UTC is the same because DST hasn't taken effect. | | 03/11/2019 | 1:30:00 AM | 6:30:00 AM | UTC shifted one hour backward after DST took effect. |
- |||||
* Logic app #2
If these logic apps use the UTC-6:00 Central Time (US & Canada) zone, this simul
| 03/09/2019 | 2:30:00 AM | 8:30:00 AM | UTC before the day that DST takes effect. | | 03/10/2019 | 3:30:00 AM* | 8:30:00 AM | DST is already in effect, so local time has moved one hour forward because the UTC-6:00 time zone changes to UTC-5:00. For more information, see [Triggers that start between 2:00 AM - 3:00 AM](#dst-window). | | 03/11/2019 | 2:30:00 AM | 7:30:00 AM | UTC shifted one hour backward after DST took effect. |
- |||||
* **11/03/2019: DST ends at 2:00 AM and shifts time one hour backward**
If these logic apps use the UTC-6:00 Central Time (US & Canada) zone, this simul
| 11/02/2019 | 1:30:00 AM | 6:30:00 AM || | 11/03/2019 | 1:30:00 AM | 6:30:00 AM || | 11/04/2019 | 1:30:00 AM | 7:30:00 AM ||
- |||||
* Logic app #2
If these logic apps use the UTC-6:00 Central Time (US & Canada) zone, this simul
| 11/02/2019 | 2:30:00 AM | 7:30:00 AM || | 11/03/2019 | 2:30:00 AM | 8:30:00 AM || | 11/04/2019 | 2:30:00 AM | 8:30:00 AM ||
- |||||
+
+<a name="recurrence-issues"></a>
+
+### Troubleshoot recurrence issues
+
+To make sure that your workflow runs at your specified start time and doesn't miss a recurrence, especially when the frequency is in days or longer, try the following solutions:
+
+* When DST takes effect, manually adjust the recurrence so that your workflow continues to run at the expected time. Otherwise, the start time shifts one hour forward when DST starts and one hour backward when DST ends. For more information and examples, review [Recurrence for daylight saving time and standard time](#daylight-saving-standard-time).
+
+* If you're using a **Recurrence** trigger, specify a time zone, a start date, and start time. In addition, configure specific times to run subsequent recurrences in the properties **At these hours** and **At these minutes**, which are available only for the **Day** and **Week** frequencies. However, some time windows might still cause problems when the time shifts.
+
+* Consider using a [**Sliding Window** trigger](../connectors/connectors-native-sliding-window.md) instead of a **Recurrence** trigger to avoid missed recurrences.
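
The DST tables earlier in this section can be reproduced with a short, standalone Python sketch (not part of the article) that uses the standard library's `zoneinfo` module. It pins a recurrence to 2:30 AM in the IANA zone `America/Chicago`, which corresponds to the UTC-6:00 Central Time (US & Canada) zone, and shows how the UTC firing time shifts one hour when DST takes effect on 03/10/2019:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

central = ZoneInfo("America/Chicago")  # UTC-6:00 Central Time (US & Canada)

# A recurrence pinned to 2:30 AM Central on the days around the
# 03/10/2019 DST transition, when clocks jump from 2:00 AM to 3:00 AM.
utc_times = []
for day in (9, 10, 11):
    local = datetime(2019, 3, day, 2, 30, tzinfo=central)
    utc_times.append(local.astimezone(timezone.utc))

for t in utc_times:
    print(t.strftime("%m/%d %H:%M UTC"))
# 03/09 08:30 UTC
# 03/10 08:30 UTC  (2:30 AM doesn't exist locally; it resolves forward)
# 03/11 07:30 UTC  (UTC shifted one hour backward after DST took effect)
```

This mirrors the table above: a workflow that must keep firing at 8:30 UTC needs its local start time adjusted manually after the transition.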
<a name="run-once"></a>
Here are various example recurrences that you can set up for the triggers that s
| Recurrence | Run every 15 minutes biweekly on Mondays only | 2 | Week | {none} | "Monday" | 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23 | 0, 15, 30, 45 | This schedule runs every other Monday at every 15-minute mark. |
| Recurrence | Run every month | 1 | Month | *startDate*T*startTime*Z | {unavailable} | {unavailable} | {unavailable} | This schedule doesn't start *any sooner* than the specified start date and time and calculates future recurrences on the start date and time. If you don't specify a start date and time, this schedule uses the creation date and time. |
| Recurrence | Run every hour for one day per month | 1 | Month | {see note} | {unavailable} | 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23 | {see note} | If you don't specify a start date and time, this schedule uses the creation date and time. To control the minutes for the recurrence schedule, specify the minutes of the hour, a start time, or use the creation time. For example, if the start time or creation time is 8:25 AM, this schedule runs at 8:25 AM, 9:25 AM, 10:25 AM, and so on. |
-|||||||||
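
The "At these hours" and "At these minutes" settings in these examples combine as a cross-product: the schedule fires once per listed minute within each listed hour. A minimal Python sketch (illustrative only, not the Azure Logic Apps implementation) of that expansion:

```python
from datetime import time

def daily_run_times(hours, minutes):
    """Expand 'At these hours' x 'At these minutes' into a day's run times."""
    return [time(h, m) for h in sorted(hours) for m in sorted(minutes)]

# "Run every 15 minutes" uses all 24 hours and minutes 0, 15, 30, 45.
runs = daily_run_times(range(24), [0, 15, 30, 45])
print(len(runs))  # 96 runs on each scheduled day
```

So the biweekly Monday schedule above produces 96 runs on each qualifying Monday, from 00:00 through 23:45.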
## Next steps
logic-apps Export From Consumption To Standard Logic App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/export-from-consumption-to-standard-logic-app.md
+
+ Title: Export workflows from Consumption to Standard
+description: Export logic app workflows created in the Consumption SKU to a Standard logic app by using Visual Studio Code.
+
+ms.suite: integration
++ Last updated : 10/28/2022
+#Customer intent: As a developer, I want to export one or more Consumption workflows to a Standard workflow.
++
+# Export Consumption workflows to a Standard logic app (Preview)
+
+> [!NOTE]
+>
+> This capability is in preview and is subject to the
+> [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+
+Standard logic app workflows, which run in single-tenant Azure Logic Apps, offer many new and improved capabilities. For example, you get compute isolation, virtual network integration, and private endpoints along with App Service Environment hosting, local development and debugging using Visual Studio Code, low latency with stateless workflows, and more.
+
+If you want the benefits of Standard workflows, but your workflows run in multi-tenant Azure Logic Apps, you can now move your Consumption workflows to single-tenant Azure Logic Apps. This switch makes sense in scenarios that require some of the Standard capabilities, such as isolation and network integration, lower latency, or more predictable costs.
+
+You can now export Consumption logic apps to a Standard logic app. Using Visual Studio Code and the latest Azure Logic Apps (Standard) extension, you export your logic apps as stateful workflows to a Standard logic app project. You can then locally update, test, and debug your workflows to get them ready for redeployment. When you're ready, you can deploy either directly from Visual Studio Code or through your own DevOps process.
+
+> [!NOTE]
+>
+> The export capability doesn't migrate your workflows. Instead, this tool replicates artifacts,
+> such as workflow definitions, connections, integration account artifacts, and others. Your source
+> logic app resources, workflows, trigger history, run history, and other data stay intact.
+>
+> You control the export process and your migration journey. You can test and validate your
+> exported workflows to your satisfaction with the destination environment. You choose when
+> to disable or delete your source logic apps.
+
+This article provides information about the export process and shows how to export your Consumption logic app workflows to a local Standard logic app project in Visual Studio Code.
+
+## Known issues and limitations
+
+- The following logic apps and scenarios are currently ineligible for export:
+
+ - Logic apps that use custom connectors
+ - Logic apps that use the Azure API Management connector
+
+- The export tool doesn't export any infrastructure information, such as integration account settings.
+
+- The export tool can export logic app workflows with triggers that have concurrency settings. However, single-tenant Azure Logic Apps ignores these settings.
+
+- Logic apps must exist in the same region if you want to export them within the same Standard logic app project.
+
+- For now, connectors deploy as their *managed* versions, which appear in the designer under the **Azure** tab. As built-in, service provider connectors gain parity with their Azure counterparts, the export tool will automatically convert the corresponding Azure connectors to their built-in, service provider versions during export.
+
+- By default, connection credentials aren't cloned from source logic app workflows. Before your logic app workflows can run, you'll have to reauthenticate these connections after export.
+
+## Exportable operation types
+
+| Operation | JSON type |
+|--|--|
+| Trigger | **Built-in**: `Http`, `HttpWebhook`, `Recurrence`, `manual` (Request) <br><br>**Managed**: `ApiConnection`, `ApiConnectionNotification`, `ApiConnectionWebhook` |
+| Action | **Built-in**: `AppendToArrayVariable`, `AppendToStringVariable`, `Compose`, `DecrementVariable`, `Foreach`, `Http`, `HttpWebhook`, `If`, `IncrementVariable`, `InitializeVariable`, `JavaScriptCode`, `Join`, `ParseJson`, `Response`, `Scope`, `Select`, `SetVariable`, `Switch`, `Table`, `Terminate`, `Until`, `Wait` <br><br>**Managed**: `ApiConnection`, `ApiConnectionWebhook` |
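
As a rough pre-check before running the export tool, you could scan a Consumption workflow's JSON definition for operation types outside this table. This is a hedged sketch, not part of the tool: the sets below are transcribed from the table, the walk covers the common nesting containers (`actions`, `else`, Switch `cases`), and the Request trigger is typically named `manual` with `"type": "Request"` in workflow JSON.

```python
# Operation types transcribed from the table above.
EXPORTABLE_TRIGGERS = {
    "Http", "HttpWebhook", "Recurrence", "Request",
    "ApiConnection", "ApiConnectionNotification", "ApiConnectionWebhook",
}
EXPORTABLE_ACTIONS = {
    "AppendToArrayVariable", "AppendToStringVariable", "Compose",
    "DecrementVariable", "Foreach", "Http", "HttpWebhook", "If",
    "IncrementVariable", "InitializeVariable", "JavaScriptCode", "Join",
    "ParseJson", "Response", "Scope", "Select", "SetVariable", "Switch",
    "Table", "Terminate", "Until", "Wait",
    "ApiConnection", "ApiConnectionWebhook",
}

def unexportable_operations(definition):
    """Return (name, type) pairs that fall outside the exportable set."""
    problems = []

    def walk(actions):
        for name, op in (actions or {}).items():
            if op.get("type") not in EXPORTABLE_ACTIONS:
                problems.append((name, op.get("type")))
            walk(op.get("actions"))                      # Scope, If, Foreach, Until
            walk((op.get("else") or {}).get("actions"))  # If's else branch
            for case in (op.get("cases") or {}).values():
                walk(case.get("actions"))                # Switch cases

    for name, trigger in (definition.get("triggers") or {}).items():
        if trigger.get("type") not in EXPORTABLE_TRIGGERS:
            problems.append((name, trigger.get("type")))
    walk(definition.get("actions"))
    return problems
```

Running this over each workflow's `definition` object flags, for example, custom connector or API Management operations before the export tool's own validation does.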
+
+## Prerequisites
+
+- One or more Consumption logic apps to export, deployed in the same Azure subscription and region, for example, East US 2.
+
+- Azure contributor subscription-level access to the subscription where the logic apps are currently deployed, not just resource group-level access.
+
+- Review and meet the requirements for [how to set up Visual Studio Code with the Azure Logic Apps (Standard) extension](create-single-tenant-workflows-visual-studio-code.md#prerequisites).
+
+## Group logic apps for export
+
+With the Azure Logic Apps (Standard) extension, you can combine multiple Consumption logic app workflows into a single Standard logic app project. In single-tenant Azure Logic Apps, one Standard logic app resource can have multiple workflows. With this approach, you can pre-validate your workflows so that you don't miss any dependencies when you select logic apps for export.
+
+Consider the following recommendations when you select logic apps for export:
+
+- Group logic apps where workflows share the same resources, such as integration account artifacts, maps, and schemas, or use resources through a chain of processes.
+
+- For the organization and number of workflows per logic app, review [Best practices and recommendations](create-single-tenant-workflows-azure-portal.md#best-practices-and-recommendations).
+
+## Export Consumption workflows to a local project
++
+### Select logic apps for export
+
+1. In Visual Studio Code, sign in to Azure, if you haven't already.
+
+1. In the left navigation bar, select **Azure** to open the **Azure** window (Shift + Alt + A), and expand the **Logic Apps (Standard)** extension view.
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-azure-view.png" alt-text="Screenshot showing Visual Studio Code with 'Azure' view selected.":::
+
+1. On the extension toolbar, select **Export Logic App...**.
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-export-logic-app.png" alt-text="Screenshot showing Visual Studio Code and 'Logic Apps (Standard)' extension toolbar with 'Export Logic App' selected.":::
+
+1. After the **Export** tab opens, select your Azure subscription and region, and then select **Next**.
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-subscription-consumption.png" alt-text="Screenshot showing 'Export' tab and 'Select logic app instance' section with Azure subscription and region selected.":::
+
+1. Select the logic apps to export. Each selected logic app appears on the **Selected logic apps** list to the side. When you're done, select **Next**.
+
+ > [!TIP]
+ >
+ > You can also search for logic apps and filter on resource group.
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-logic-apps.png" alt-text="Screenshot showing 'Select logic apps to export' section with logic apps selected for export.":::
+
+ The export tool starts to validate whether your selected logic apps are eligible for export.
+
+### Review export validation results
+
+1. After export validation completes, review the results by expanding the entry for each logic app.
+
+ - Logic apps that have errors are ineligible for export. You must remove these logic apps from the export list until you fix them at the source. To remove a logic app from the list, select **Back**.
+
+ For example, **SourceLogicApp2** has an error and can't be exported until fixed:
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-back-button-remove-app.png" alt-text="Screenshot showing 'Review export status' section and validation status for logic app workflow with error.":::
+
+ - Logic apps that pass validation with or without warnings are still eligible for export. To continue, select **Export** if all apps validate successfully, or select **Export with warnings** if apps have warnings.
+
+ For example, **SourceLogicApp3** has a warning, but you can still continue to export:
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-export-with-warnings.png" alt-text="Screenshot showing 'Review export status' section and validation status for logic app workflow with warning.":::
+
+ The following table provides more information about each validation icon and status:
+
+ | Validation icon | Validation status |
+ |--|-|
+ | ![Success icon](media/export-from-consumption-to-standard-logic-app/success-icon.png) | Item passed validation, so export can continue with no problems to resolve. |
+ | ![Failed icon](media/export-from-consumption-to-standard-logic-app/failed-icon.png) | Item failed validation, so export can't continue. <br><br>The validation entry for the failed item automatically appears expanded and provides information about the validation failure. |
+ | ![Warning icon](media/export-from-consumption-to-standard-logic-app/warning-icon.png) | Item passed validation with a warning, but export can continue with required post-export resolution. <br><br>The validation entry for the item with a warning automatically appears expanded and provides information about the warning and required post-export remediation. |
+
+1. After the **Finish export** section appears, for **Export location**, browse and select a local folder for your new Standard logic app project.
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-local-folder.png" alt-text="Screenshot showing 'Finish export' section and 'Export location' property with selected local export project folder.":::
+
+1. If your workflow has *managed* connections that you want to deploy, which is only recommended for non-production environments, select **Deploy managed connections**, which shows existing resource groups in your Azure subscription. Select the resource group where you want to deploy the managed connections.
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/select-deploy-managed-connections-resource-group.png" alt-text="Screenshot showing 'Finish export' section with selected local export folder, 'Deploy managed connections' selected, and target resource group selected.":::
+
+1. Under **After export steps**, review any required post-export steps, for example:
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/review-post-export-steps.png" alt-text="Screenshot showing 'After export steps' section and required post-export steps, if any.":::
+
+1. Based on your scenario, select **Export and finish** or **Export with warnings and finish**.
+
+ The export tool downloads your project to your selected folder location, expands the project in Visual Studio Code, and deploys any managed connections, if you selected that option.
+
+ :::image type="content" source="media/export-from-consumption-to-standard-logic-app/export-status.png" alt-text="Screenshot showing the 'Export status' section with export progress.":::
+
+1. After this process completes, Visual Studio Code opens a new workspace. You can now safely close the export window.
+
+1. From your Standard logic app project, open and review the README.md file for the required post-export steps.
+
+ :::image type="content" source="medi file opened.":::
+
+## Post-export steps
+
+### Remediation steps
+
+Some exported logic app workflows require post-export remediation steps to run on the Standard platform.
+
+1. From your Standard logic app project, open the README.md file, and review the remediation steps for your exported workflows. The export tool generates the README.md file, which contains all the required post-export steps.
+
+1. Before you make any changes to your source logic app workflow, make sure to test your new Standard logic app resource and workflows.
+
+### Integration account actions and settings
+
+If you export actions that depend on an integration account, you have to manually set up your Standard logic app with a reference link to the integration account that contains the required artifacts. For more information, review [Link integration account to a Standard logic app](logic-apps-enterprise-integration-create-integration-account.md#link-account).
+
+## Project folder structure
+
+After the export process finishes, your Standard logic app project contains new folders and files alongside most others in a [typical Standard logic app project](create-single-tenant-workflows-visual-studio-code.md).
+
+The following table describes these new folders and files added by the export process:
+
+| Folder | File | Description |
+|--||-|
+| .development\\deployment | LogicAppStandardConnections.parameters.json | Azure Resource Manager template parameters file for deploying managed connectors |
+| | LogicAppStandardConnections.template.json | Azure Resource Manager template definition for deploying managed connectors |
+| | LogicAppStandardInfrastructure.parameters.json | Azure Resource Manager template parameters file for deploying Standard logic app resource |
+| | LogicAppStandardInfrastructure.template.json | Azure Resource Manager template definition for deploying Standard logic app resource |
+| .logs\\export | exportReport.json | Export report summary raw file, which includes all the steps required for post-export remediation |
+| | exportValidation.json | Validation report raw file, which includes the validation results for each exported logic app |
+| | README.md | Markdown file with export results summary, including the created logic apps and all the required next steps |
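
A quick sanity-check sketch (not part of the export tool) that verifies these export artifacts exist in a project folder. The relative paths are taken from the table above, written with forward slashes for portability; the demo folder and its contents are hypothetical:

```python
from pathlib import Path
import tempfile

# Files added by the export process, per the table above.
EXPECTED = [
    ".development/deployment/LogicAppStandardConnections.parameters.json",
    ".development/deployment/LogicAppStandardConnections.template.json",
    ".development/deployment/LogicAppStandardInfrastructure.parameters.json",
    ".development/deployment/LogicAppStandardInfrastructure.template.json",
    ".logs/export/exportReport.json",
    ".logs/export/exportValidation.json",
    ".logs/export/README.md",
]

def missing_export_artifacts(project_root):
    """List expected export files that are absent from the project."""
    root = Path(project_root)
    return [rel for rel in EXPECTED if not (root / rel).is_file()]

# Demo against a scratch folder that has everything except the README.
with tempfile.TemporaryDirectory() as demo:
    for rel in EXPECTED[:-1]:
        path = Path(demo) / rel
        path.parent.mkdir(parents=True, exist_ok=True)
        path.touch()
    missing = missing_export_artifacts(demo)
print(missing)  # ['.logs/export/README.md']
```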
+
+## Next steps
+
+- [Run, test, and debug locally](create-single-tenant-workflows-visual-studio-code.md#run-test-and-debug-locally)
logic-apps Export From Ise To Standard Logic App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/export-from-ise-to-standard-logic-app.md
ms.suite: integration Previously updated : 09/28/2022 Last updated : 10/28/2022 #Customer intent: As a developer, I want to export one or more ISE workflows to a Standard workflow.
Consider the following recommendations when you select logic apps for export:
1. On the extension toolbar, select **Export Logic App...**.
- ![Screenshot showing Visual Studio Code and **Logic Apps (Standard)** extension toolbar with 'Export Logic App' selected.](media/export-from-ise-to-standard-logic-app/select-export-logic-app.png)
+ ![Screenshot showing Visual Studio Code and 'Logic Apps (Standard)' extension toolbar with 'Export Logic App' selected.](media/export-from-ise-to-standard-logic-app/select-export-logic-app.png)
1. After the **Export** tab opens, select your Azure subscription and ISE instance, and then select **Next**. ![Screenshot showing 'Export' tab and 'Select logic app instance' section with Azure subscription and ISE instance selected.](media/export-from-ise-to-standard-logic-app/select-subscription-ise.png)
-1. Select the logic apps to export. Each selected logic app appears on the **Selected logic apps** list to the side. When you're done, select **Next**.
-
- ![Screenshot showing 'Select logic apps to export' section with logic apps selected for export.](media/export-from-ise-to-standard-logic-app/select-logic-apps.png)
+1. Select the logic apps to export. Each selected logic app appears on the **Selected logic apps** list to the side. When you're done, select **Next**.
> [!TIP] > > You can also search for logic apps and filter on resource group.
+ ![Screenshot showing 'Select logic apps to export' section with logic apps selected for export.](media/export-from-ise-to-standard-logic-app/select-logic-apps.png)
+ The export tool starts to validate whether your selected logic apps are eligible for export. ### Review export validation results
Consider the following recommendations when you select logic apps for export:
1. Under **After export steps**, review any required post-export steps, for example:
- ![Screenshot showing **After export steps** section and required post-export steps, if any.](media/export-from-ise-to-standard-logic-app/review-post-export-steps.png)
+ ![Screenshot showing 'After export steps' section and required post-export steps, if any.](media/export-from-ise-to-standard-logic-app/review-post-export-steps.png)
1. Based on your scenario, select **Export and finish** or **Export with warnings and finish**.
logic-apps Logic Apps Enterprise Integration As2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/logic-apps/logic-apps-enterprise-integration-as2.md
Previously updated : 08/23/2022 Last updated : 10/20/2022 # Exchange AS2 messages using workflows in Azure Logic Apps [!INCLUDE [logic-apps-sku-consumption-standard](../../includes/logic-apps-sku-consumption-standard.md)]
-To send and receive AS2 messages in workflows that you create using Azure Logic Apps, use the **AS2** connector, which provides triggers and actions that support and manage AS2 (version 1.2) communication.
+To send and receive AS2 messages in workflows that you create using Azure Logic Apps, you can use the **AS2 (v2)** connector, which provides actions that support and manage AS2 communication. If you need tracking capabilities, the original **AS2** connector is still available, but is being deprecated.
-* If you're working with the **Logic App (Consumption)** resource type and don't need tracking capabilities, use the **AS2 (v2)** connector, rather than the original **AS2** connector, which is being deprecated.
+Except for tracking capabilities, the **AS2 (v2)** connector provides the same capabilities as the original **AS2** connector, runs natively with the Azure Logic Apps runtime, and offers significant performance improvements in message size, throughput, and latency. Unlike the original **AS2** connector, the **AS2 (v2)** connector doesn't require that you create a connection to your integration account. Instead, as described in the prerequisites, make sure that you link your integration account to the logic app resource where you plan to use the connector.
- Except for tracking, **AS2 (v2)** provides better performance, the same capabilities as the original version, is native to the Azure Logic Apps runtime, and has significant performance improvements in message size, throughput, and latency. Also, the v2 connector doesn't require that you create a connection to your integration account. Instead, as described in the prerequisites, make sure that you link your integration account to the logic app resource where you plan to use the connector.
+This article shows how to add the AS2 encoding and decoding actions to an existing logic app workflow. The **AS2 (v2)** connector doesn't include any triggers, so you can use any trigger to start your workflow. The examples in this article use the [Request](../connectors/connectors-native-reqres.md) trigger.
-* If you're working with the **Logic App (Standard)** resource type, only the original **AS2** connector is currently available.
+## Connector technical reference
- For technical information about the original **AS2** connector version, review the [connector's reference page](/connectors/as2/), which describes the triggers, actions, and limits as documented by the connector's Swagger file.
+The AS2 connector has different versions, based on [logic app type and host environment](logic-apps-overview.md#resource-environment-differences).
-### [Consumption](#tab/consumption)
-
-The following lists describe actions that the **AS2 (v2)** connector provides for establishing security and reliability when transmitting messages:
-
-* [**AS2 Encode** action](#encode) for providing encryption, digital signing, and acknowledgments through Message Disposition Notifications (MDN), which help support non-repudiation. For example, this action applies AS2/HTTP headers and performs these tasks when configured:
-
- * Signs outgoing messages.
- * Encrypts outgoing messages.
- * Compresses the message.
- * Transmits the file name in the MIME header.
+| Logic app | Environment | Connector version |
+|--|-|-|
+| **Consumption** | Multi-tenant Azure Logic Apps | **AS2 (v2)** and **AS2** managed connectors (Standard class). The **AS2 (v2)** connector provides only actions, but you can use any trigger that works for your scenario. For more information, review the following documentation: <br><br>- [AS2 managed connector reference](/connectors/as2/) <br>- [AS2 (v2) managed connector operations](#as2-v2-operations) <br>- [B2B protocol limits for message sizes](logic-apps-limits-and-config.md#b2b-protocol-limits) <br>- [Managed connectors in Azure Logic Apps](../connectors/managed.md) |
+| **Consumption** | Integration service environment (ISE) | **AS2 (v2)** and **AS2** managed connectors (Standard class) and **AS2** ISE version, which has different message limits than the Standard class. The **AS2 (v2)** connector provides only actions, but you can use any trigger that works for your scenario. For more information, review the following documentation: <br><br>- [AS2 managed connector reference](/connectors/as2/) <br>- [AS2 (v2) managed connector operations](#as2-v2-operations) <br>- [ISE message limits](logic-apps-limits-and-config.md#message-size-limits) <br>- [Managed connectors in Azure Logic Apps](../connectors/managed.md) |
+| **Standard** | Single-tenant Azure Logic Apps and App Service Environment v3 (Windows plans only) | **AS2 (v2)** built-in connector and **AS2** managed connector. The built-in version differs in the following ways: <br><br>- The built-in version provides only actions, but you can use any trigger that works for your scenario. <br><br>- The built-in version can directly access Azure virtual networks. You don't need an on-premises data gateway.<br><br>For more information, review the following documentation: <br><br>- [AS2 managed connector reference](/connectors/as2/) <br>- [AS2 (v2) built-in connector operations](#as2-v2-operations) <br>- [Built-in connectors in Azure Logic Apps](../connectors/built-in.md) |
-* [**AS2 Decode** action](#decode) for providing decryption, digital signing, and acknowledgments through Message Disposition Notifications (MDN). For example, this action performs these tasks:
+<a name="as2-v2-operations"></a>
- * Processes AS2/HTTP headers.
- * Reconciles received MDNs with the original outbound messages.
- * Updates and correlates records in the non-repudiation database.
- * Writes records for AS2 status reporting.
- * Outputs payload contents as base64-encoded.
- * Determines whether MDNs are required. Based on the AS2 agreement, determines whether MDNs should be synchronous or asynchronous.
- * Generates synchronous or asynchronous MDNs based on the AS2 agreement.
- * Sets the correlation tokens and properties on MDNs.
+### AS2 (v2) operations
- This action also performs these tasks when configured:
+The **AS2 (v2)** connector has no triggers. The following table describes the actions that the **AS2 (v2)** connector provides for establishing security and reliability when transmitting messages:
- * Verifies the signature.
- * Decrypts the messages.
- * Decompresses the message.
- * Check and disallow message ID duplicates.
-
-### [Standard](#tab/standard)
-
-For more information about the original **AS2** connector's triggers, actions, and limits version, review the [connector's reference page](/connectors/as2/) as documented by the connector's Swagger file.
---
-This article shows how to add the AS2 encoding and decoding actions to an existing logic app workflow. Although you can use any trigger to start your workflow, the examples use the [Request](../connectors/connectors-native-reqres.md) trigger.
-
-## Limits
-
-For information about the AS2 connector limits for workflows running in [multi-tenant Azure Logic Apps, single-tenant Azure Logic Apps, or the integration service environment (ISE)](logic-apps-overview.md#resource-environment-differences), review the [B2B protocol limits for message sizes](logic-apps-limits-and-config.md#b2b-protocol-limits).
+| Action | Description |
+|--|-|
+| [**AS2 Encode** action](#encode) | Provides encryption, digital signing, and acknowledgments through Message Disposition Notifications (MDN), which help support non-repudiation. For example, this action applies AS2/HTTP headers and performs the following tasks when configured: <br><br>- Sign outgoing messages. <br>- Encrypt outgoing messages. <br>- Compress the message. <br>- Transmit the file name in the MIME header. |
+| [**AS2 Decode** action](#decode) | Provides decryption, digital signing, and acknowledgments through Message Disposition Notifications (MDN). For example, this action performs the following tasks when configured: <br><br>- Process AS2/HTTP headers. <br>- Reconcile received MDNs with the original outbound messages. <br>- Update and correlate records in the non-repudiation database. <br>- Write records for AS2 status reporting. <br>- Output payload contents as base64-encoded. <br>- Determine whether MDNs are required. Based on the AS2 agreement, determine whether MDNs should be synchronous or asynchronous. <br>- Generate synchronous or asynchronous MDNs based on the AS2 agreement. <br>- Set the correlation tokens and properties on MDNs. <br>- Verify the signature. <br>- Decrypt the messages. <br>- Decompress the message. <br>- Check and disallow message ID duplicates. |
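
Because the decode action outputs payload contents as base64-encoded, a downstream step must decode them before use (in a workflow, typically with the `base64ToString()` expression). A minimal Python illustration of that round trip; the output property names here are hypothetical, not the connector's documented schema:

```python
import base64

# Hypothetical shape of an AS2 Decode output body; the property names
# are illustrative only, chosen for this sketch.
decode_output = {
    "aS2Message": {
        "payload": base64.b64encode(b"<Order Id='42'/>").decode("ascii"),
    }
}

# Recover the original message bytes from the base64-encoded payload.
payload = base64.b64decode(decode_output["aS2Message"]["payload"])
print(payload.decode("ascii"))  # <Order Id='42'/>
```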
## Prerequisites * An Azure account and subscription. If you don't have a subscription yet, [sign up for a free Azure account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
-* An [integration account resource](logic-apps-enterprise-integration-create-integration-account.md) where you define and store artifacts, such as trading partners, agreements, certificates, and so on, for use in your enterprise integration and B2B workflows. This resource has to meet the following requirements:
-
- * Is associated with the same Azure subscription as your logic app resource.
-
- * Exists in the same location or Azure region as your logic app resource.
+* The logic app resource and workflow where you want to use the AS2 operations.
- * When you use the [**Logic App (Consumption)** resource type](logic-apps-overview.md#resource-environment-differences) and the **AS2 (v2)** operations, your logic app resource doesn't need a link to your integration account. However, you still need this account to store artifacts, such as partners, agreements, and certificates, along with using the AS2, [X12](logic-apps-enterprise-integration-x12.md), or [EDIFACT](logic-apps-enterprise-integration-edifact.md) operations. Your integration account still has to meet other requirements, such as using the same Azure subscription and existing in the same location as your logic app resource.
+* An [integration account resource](logic-apps-enterprise-integration-create-integration-account.md) to define and store artifacts for use in enterprise integration and B2B workflows.
- * When you use the [**Logic App (Standard)** resource type](logic-apps-overview.md#resource-environment-differences) and the original **AS2** operations, your workflow requires a connection to your integration account that you create directly from your workflow when you add the AS2 operation.
+ > [!IMPORTANT]
+ >
+ > To work together, both your integration account and logic app resource must exist in the same Azure subscription and Azure region.
* At least two [trading partners](logic-apps-enterprise-integration-partners.md) in your integration account. The definitions for both partners must use the same *business identity* qualifier, which is **AS2Identity** for this scenario. * An [AS2 agreement](logic-apps-enterprise-integration-agreements.md) in your integration account between the trading partners that participate in your workflow. Each agreement requires a host partner and a guest partner. The content in the messages between you and the other partner must match the agreement type.
-* The logic app resource and workflow where you want to use the AS2 operations.
-
- > [!NOTE]
- > The **AS2 (v2)** connector provides only actions, not triggers. In this article, the examples for this connector use the
- > [Request](../connectors/connectors-native-reqres.md) trigger. The original **AS2** connector includes triggers and actions.
- > For more information about the original **AS2** connector's triggers, actions, and limits version, review the
- > [connector's reference page](/connectors/as2/) as documented by the connector's Swagger file.
+* Based on whether you're working on a Consumption or Standard logic app workflow, your logic app resource might require a link to your integration account:
- If you're new to logic apps, review [What is Azure Logic Apps](logic-apps-overview.md) and [Quickstart: Create your first logic app](quickstart-create-first-logic-app-workflow.md).
+ | Logic app workflow | Link required? |
+ |--|-|
+ | Consumption | - **AS2 (v2)** connector: [Link required](logic-apps-enterprise-integration-create-integration-account.md?tabs=consumption#link-account), but no connection required <br>- **AS2** connector: Connection required, but no link required |
+ | Standard | - **AS2 (v2)** connector: [Link required](logic-apps-enterprise-integration-create-integration-account.md?tabs=standard#link-account), but no connection required <br>- **AS2** connector: Connection required, but no link required |
* If you use [Azure Key Vault](../key-vault/general/overview.md) for certificate management, check that your vault keys permit the **Encrypt** and **Decrypt** operations. Otherwise, the encoding and decoding actions fail.
- 1. In the Azure portal, open your key vault. On the key vault menu, under **Settings**, select **Keys**.
+ 1. In the [Azure portal](https://portal.azure.com), open your key vault. On the key vault menu, under **Settings**, select **Keys**.
1. On the **Keys** pane, select your key. On the **Versions** pane, select the key version that you're using.
For information about the AS2 connector limits for workflows running in [multi-t
## Encode AS2 messages
+Select the tab for either Consumption or Standard logic app workflows:
+ ### [Consumption](#tab/consumption)
+#### AS2 v2 connector
+ 1. In the [Azure portal](https://portal.azure.com), open your logic app resource and workflow in the designer.
-1. On the designer, under the trigger or action where you want to add the AS2 action, select **New step**.
+1. On the designer, under the trigger or action where you want to add the **AS2 (v2)** action, select **New step**.
-1. Under the **Choose an operation** search box, select **All**. In the search box, enter `as2 encode`. Select the action named **AS2 Encode**.
+1. Under the **Choose an operation** search box, select **Standard**. In the search box, enter **as2**.
- ![Screenshot showing the Azure portal, workflow designer, and "AS2 Encode" action selected.](./media/logic-apps-enterprise-integration-as2/select-as2-encode.png)
+1. From the actions list, select the action named **AS2 Encode**.
-1. After the AS2 operation appears on the designer, provide information for the following properties:
+ ![Screenshot showing the Azure portal, designer for Consumption workflow, and "AS2 Encode" action selected.](./media/logic-apps-enterprise-integration-as2/select-as2-v2-encode-consumption.png)
+
+1. In the action information box, provide the following information:
| Property | Required | Description |
|-|-|-|
- | **Message to encode** | Yes | The message payload |
- | **AS2 from** | Yes | The business identifier for the message sender as specified by your AS2 agreement |
- | **AS2 to** | Yes | The business identifier for the message receiver as specified by your AS2 agreement |
- ||||
-
- For example, the message payload is the **Body** content output from the Request trigger:
+ | **Message to encode** | Yes | The message payload, for example, the **Body** output from the Request trigger. <br><br>1. Put your cursor in the **Message to encode** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Body**. |
+ | **AS2 from** | Yes | The business identifier for the message sender as specified by your AS2 agreement, for example, **Fabrikam**. |
+ | **AS2 to** | Yes | The business identifier for the message receiver as specified by your AS2 agreement, for example, **Contoso**. |
- ![Screenshot showing the "AS2 Encode" action with the message encoding properties.](./media/logic-apps-enterprise-integration-as2/as2-message-encode-details.png)
+ ![Screenshot showing the "AS2 Encode" action with the message encoding properties.](./media/logic-apps-enterprise-integration-as2/as2-v2-encode-details-consumption.png)
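For context, the **Body** output referenced above is simply whatever payload a client posts to the Request trigger's callback URL. A minimal client-side sketch (the callback URL and payload here are hypothetical placeholders):

```python
import json
import urllib.request

# Hypothetical callback URL copied from the workflow's Request trigger.
callback_url = "https://prod-00.westus.logic.azure.com/workflows/contoso/triggers/manual/paths/invoke"

# Example business payload; this becomes the trigger's Body output that
# feeds the "Message to encode" property.
payload = json.dumps({"orderId": "12345", "buyer": "Contoso"}).encode("utf-8")

request = urllib.request.Request(
    callback_url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# The request is only constructed here, not sent; urllib stores header
# names in capitalized form ("Content-type").
print(request.get_header("Content-type"))
```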
- > [!TIP]
- > If you experience problems when sending signed or encrypted messages, consider trying different SHA256 algorithm formats.
- > The AS2 specification doesn't provide any information about SHA256 formats, so each provider uses their own implementation or format.
-
-### [Standard](#tab/standard)
+#### AS2 connector
1. In the [Azure portal](https://portal.azure.com), open your logic app resource and workflow in the designer.
-1. On the designer, under the trigger or action where you want to add the AS2 action, select **Insert a new step** (plus sign), and then select **Add an action**.
+1. On the designer, under the trigger or action where you want to add the **AS2** action, select **New step**.
+
+1. Under the **Choose an operation** search box, select **Standard**. In the search box, enter **as2 encode**.
-1. Under the **Choose an operation** search box, select **Azure**. In the search box, enter `as2 encode`. Select the action named **Encode to AS2 message**.
+1. From the actions list, select the action named **Encode to AS2 message**.
- ![Screenshot showing the Azure portal, workflow designer, and "Encode to AS2 message" operation selected.](./media/logic-apps-enterprise-integration-as2/select-encode-as2-message.png)
+ ![Screenshot showing the Azure portal, designer for Consumption workflow, and "Encode to AS2 message" action selected.](./media/logic-apps-enterprise-integration-as2/select-encode-as2-consumption.png)
1. When prompted to create a connection to your integration account, provide the following information:

| Property | Required | Description |
|-|-|-|
| **Connection name** | Yes | A name for the connection |
- | **Integration account** | Yes | From the list of available integration accounts, select the account to use. |
- ||||
+ | **Integration Account** | Yes | From the list of available integration accounts, select the account to use. |
For example:
- ![Screenshot showing the "Encode to AS2 message" connection pane.](./media/logic-apps-enterprise-integration-as2/create-as2-encode-connection-standard.png)
+ ![Screenshot showing Consumption workflow and "Encode to AS2 message" connection information.](./media/logic-apps-enterprise-integration-as2/create-encode-as2-connection-consumption.png)
1. When you're done, select **Create**.
-1. After the AS2 details pane appears on the designer, provide information for the following properties:
+1. In the action information box, provide the following information:
+
+ | Property | Required | Description |
+ |-|-|-|
+ | **AS2-From** | Yes | The business identifier for the message sender as specified by your AS2 agreement, for example, **Fabrikam**. |
+ | **AS2-To** | Yes | The business identifier for the message receiver as specified by your AS2 agreement, for example, **Contoso**. |
+ | **body** | Yes | The message payload to encode, for example, the **Body** output from the Request trigger. <br><br>1. Put your cursor in the **body** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Body**. |
+
+ ![Screenshot showing the "Encode to AS2 message" action with the message encoding properties.](./media/logic-apps-enterprise-integration-as2/encode-as2-details-consumption.png)
+
+### [Standard](#tab/standard)
+
+#### AS2 v2 connector
+
+1. In the [Azure portal](https://portal.azure.com), open your logic app resource and workflow in the designer.
+
+1. On the designer, under the trigger or action where you want to add the **AS2 (v2)** action, select **Insert a new step** (plus sign), and then select **Add an action**.
+
+1. Under the **Choose an operation** search box, select **Built-in**. In the search box, enter **as2 encode**.
+
+1. From the actions list, select the action named **AS2 Encode**.
+
+ ![Screenshot showing the Azure portal, designer for Standard workflow, and "AS2 Encode" action selected.](./media/logic-apps-enterprise-integration-as2/select-as2-v2-encode-built-in-standard.png)
+
+1. In the action information pane, provide the following information:
| Property | Required | Description |
|-|-|-|
- | **Message to encode** | Yes | The message payload |
- | **AS2 from** | Yes | The business identifier for the message sender as specified by your AS2 agreement |
- | **AS2 to** | Yes | The business identifier for the message receiver as specified by your AS2 agreement |
- ||||
+ | **Message to encode** | Yes | The message payload to encode, for example, the **Body** output from the Request trigger. <br><br>1. Put your cursor in the **Message to encode** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Body**. |
+ | **AS2 from** | Yes | The business identifier for the message sender as specified by your AS2 agreement, for example, **Fabrikam**. |
+ | **AS2 to** | Yes | The business identifier for the message receiver as specified by your AS2 agreement, for example, **Contoso**. |
For example, the message payload is the **Body** content output from the Request trigger:
- ![Screenshot showing the "AS2 Encode" operation with the message encoding properties.](./media/logic-apps-enterprise-integration-as2/encode-as2-message-details.png)
+ ![Screenshot showing the Standard workflow designer and "AS2 Encode" action with the message encoding properties.](./media/logic-apps-enterprise-integration-as2/as2-v2-encode-details-built-in-standard.png)
+
+#### AS2 connector
+
+1. In the [Azure portal](https://portal.azure.com), open your logic app resource and workflow in the designer.
+
+1. On the designer, under the trigger or action where you want to add the **AS2** action, select **Insert a new step** (plus sign), and then select **Add an action**.
+
+1. Under the **Choose an operation** search box, select **Azure**. In the search box, enter **as2 encode**.
+
+1. From the actions list, select the action named **Encode to AS2 message**.
+
+ ![Screenshot showing the Azure portal, workflow designer for Standard, and "Encode to AS2 message" action selected.](./media/logic-apps-enterprise-integration-as2/select-encode-as2-message-managed-standard.png)
+
+1. When prompted to create a connection to your integration account, provide the following information:
+
+ | Property | Required | Description |
+ |-|-|-|
+ | **Connection name** | Yes | A name for the connection |
+ | **Integration Account** | Yes | From the list of available integration accounts, select the account to use. |
+
+ For example:
+
+ ![Screenshot showing "Encode to AS2 message" connection information.](./media/logic-apps-enterprise-integration-as2/create-encode-as2-connection-managed-standard.png)
+
+1. When you're done, select **Create**.
+
+1. In the action information pane, provide the following information:
+
+ | Property | Required | Description |
+ |-|-|-|
+ | **AS2-From** | Yes | The business identifier for the message sender as specified by your AS2 agreement, for example, **Fabrikam**. |
+ | **AS2-To** | Yes | The business identifier for the message receiver as specified by your AS2 agreement, for example, **Contoso**. |
+ | **body** | Yes | The message payload to encode, for example, the **Body** output from the Request trigger. <br><br>1. Put your cursor in the **body** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Body**. |
- > [!TIP]
- > If you experience problems when sending signed or encrypted messages, consider trying different SHA256 algorithm formats.
- > The AS2 specification doesn't provide any information about SHA256 formats, so each provider uses their own implementation or format.
+ ![Screenshot showing the "Encode to AS2 message" action with the message encoding properties.](./media/logic-apps-enterprise-integration-as2/encode-as2-message-details-managed-standard.png)
For information about the AS2 connector limits for workflows running in [multi-t
## Decode AS2 messages
+Select the tab for either Consumption or Standard logic app workflows:
+ ### [Consumption](#tab/consumption)
+#### AS2 v2 connector
+ 1. In the [Azure portal](https://portal.azure.com), open your logic app resource and workflow in the designer.
-1. On the designer, under the trigger or action where you want to add the AS2 action, select **New step**. This example uses the [Request](../connectors/connectors-native-reqres.md) trigger.
+1. On the designer, under the trigger or action where you want to add the **AS2 (v2)** action, select **New step**.
+
+1. Under the **Choose an operation** search box, select **Standard**. In the search box, enter **as2**.
+
+1. From the actions list, select the action named **AS2 Decode**.
+
+ ![Screenshot showing the Azure portal, designer for Consumption workflow, and "AS2 Decode" action selected.](media/logic-apps-enterprise-integration-as2/select-as2-v2-decode-consumption.png)
+
+1. In the action information box, provide the following information:
+
+ | Property | Required | Description |
+ |-|-|-|
+ | **body** | Yes | The body for the message to decode, for example, the **Body** output from the Request trigger. <br><br>1. Put your cursor in the **body** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Body**. |
+ | **Headers** | Yes | The headers for the message to decode, for example, the **Headers** output from the Request trigger. <br><br>1. Put your cursor in the **Headers** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Headers**. |
+
+ For example:
+
+ ![Screenshot showing the "AS2 Decode" action with the "Body" and "Headers" outputs entered from the Request trigger.](media/logic-apps-enterprise-integration-as2/as2-v2-decode-details-consumption.png)
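For context, the **Headers** output consumed above carries the AS2 routing identifiers that an inbound transmission includes as HTTP headers. A minimal sketch of what those trigger outputs might look like (the identifiers reuse this article's example values; the message ID and content type are hypothetical):

```python
# HTTP headers that an inbound AS2 transmission typically carries; these
# arrive on the Request trigger and flow into the AS2 Decode action's
# Headers input.
inbound_headers = {
    "AS2-From": "Fabrikam",   # sender's business identifier
    "AS2-To": "Contoso",      # receiver's business identifier
    "Message-Id": "<20221029-0001@fabrikam.example>",  # hypothetical
    "Content-Type": "application/pkcs7-mime",  # signed/encrypted content
}

# The decode action needs both trigger outputs: the raw body and the headers.
trigger_outputs = {
    "body": b"...raw AS2 message bytes...",
    "headers": inbound_headers,
}
print(trigger_outputs["headers"]["AS2-From"])
```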
+
+#### AS2 connector
+
+1. In the [Azure portal](https://portal.azure.com), open your logic app resource and workflow in the designer.
+
+1. On the designer, under the trigger or action where you want to add the **AS2** action, select **New step**.
+
+1. Under the **Choose an operation** search box, select **Standard**. In the search box, enter **as2 decode**.
+
+1. From the actions list, select the action named **Decode AS2 message**.
+
+ ![Screenshot showing the Azure portal, designer for Consumption workflow, and "Decode AS2 message" action selected.](./media/logic-apps-enterprise-integration-as2/select-decode-as2-consumption.png)
+
+1. When prompted to create a connection to your integration account, provide the following information:
+
+ | Property | Required | Description |
+ |-|-|-|
+ | **Connection name** | Yes | A name for the connection |
+ | **Integration Account** | Yes | From the list of available integration accounts, select the account to use. |
+
+ For example:
-1. Under the **Choose an operation** search box, select **All**. In the search box, enter `as2 decode`. Select the action named **AS2 Decode**.
+ ![Screenshot showing Consumption workflow and "Decode AS2 message" connection information.](./media/logic-apps-enterprise-integration-as2/create-decode-as2-connection-consumption.png)
- ![Screenshot showing the Azure portal, workflow designer, and "AS2 Decode" operation selected.](media/logic-apps-enterprise-integration-as2/select-as2-decode.png)
+1. When you're done, select **Create**.
-1. In the AS2 operation shape, select the values for the **Message to encode** and the **Message headers** properties from the previous trigger or action outputs.
+1. In the action information box, provide the following information:
- In this example, you can select the outputs from the Request trigger.
+ | Property | Required | Description |
+ |-|-|-|
+ | **body** | Yes | The message payload, for example, the **Body** output from the Request trigger. <br><br>1. Put your cursor in the **body** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Body**. |
+ | **Headers** | Yes | The headers for the message to decode, for example, the **Headers** output from the Request trigger. <br><br>1. Put your cursor in the **Headers** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Headers**. |
- ![Screenshot showing the Azure portal, workflow designer, and "AS2 Decode" operation with the "Body" and "Headers" output selected from the Request trigger.](media/logic-apps-enterprise-integration-as2/as2-message-decode-details.png)
+ ![Screenshot showing the "Decode AS2 message" action with the message decoding properties.](./media/logic-apps-enterprise-integration-as2/decode-as2-details-consumption.png)
### [Standard](#tab/standard)
+#### AS2 v2 connector
+
1. In the [Azure portal](https://portal.azure.com), open your logic app resource and workflow in the designer.

1. On the designer, under the trigger or action where you want to add the AS2 action, select **Insert a new step** (plus sign), and then select **Add an action**.
-1. Under the **Choose an operation** search box, select **Azure**. In the search box, enter `as2 decode`. Select the action named **Decode AS2 message**.
+1. Under the **Choose an operation** search box, select **Built-in**. In the search box, enter **as2 decode**.
+
+1. From the actions list, select the action named **AS2 Decode**.
+
+ ![Screenshot showing the Azure portal, designer for Standard workflow, and "AS2 Decode" action selected.](./media/logic-apps-enterprise-integration-as2/select-as2-v2-decode-built-in-standard.png)
+
+1. In the action information pane, provide the following information:
+
+ | Property | Required | Description |
+ |-|-|-|
+ | **Message to decode** | Yes | The message payload to decode, for example, the **Body** output from the Request trigger. <br><br>1. Put your cursor in the **Message to decode** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Body**. |
+ | **Message headers** | Yes | The headers for the message to decode, for example, the **Headers** output from the Request trigger. <br><br>1. Put your cursor in the **Message headers** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Headers**. |
+
+ For example:
+
+ ![Screenshot showing the Standard workflow designer and "AS2 Decode" action with the message decoding properties.](./media/logic-apps-enterprise-integration-as2/as2-v2-decode-details-built-in-standard.png)
- ![Screenshot showing the Azure portal, workflow designer, and "Decode AS2 message" operation selected.](./media/logic-apps-enterprise-integration-as2/select-decode-as2-message.png)
+#### AS2 connector
+
+1. In the [Azure portal](https://portal.azure.com), open your logic app resource and workflow in the designer.
+
+1. On the designer, under the trigger or action where you want to add the AS2 action, select **Insert a new step** (plus sign), and then select **Add an action**.
+
+1. Under the **Choose an operation** search box, select **Azure**. In the search box, enter **as2 decode**.
+
+1. From the actions list, select the action named **Decode AS2 message**.
+
+ ![Screenshot showing the Azure portal, designer for Standard workflow, and "Decode AS2 message" operation selected.](./media/logic-apps-enterprise-integration-as2/select-decode-as2-message-managed-standard.png)
1. When prompted to create a connection to your integration account, provide the following information:

| Property | Required | Description |
|-|-|-|
| **Connection name** | Yes | A name for the connection |
- | **Integration account** | Yes | From the list of available integration accounts, select the account to use. |
- ||||
+ | **Integration Account** | Yes | From the list of available integration accounts, select the account to use. |
For example:
- ![Screenshot showing the "Decode AS2 message" connection pane.](./media/logic-apps-enterprise-integration-as2/create-as2-decode-connection-standard.png)
+ ![Screenshot showing "Decode AS2 message" connection information.](./media/logic-apps-enterprise-integration-as2/create-decode-as2-connection-managed-standard.png)
1. When you're done, select **Create**.
-1. In the AS2 details pane, select the values for the **Message to encode** and the **Message headers** properties from the previous trigger or action outputs.
+1. In the action information pane, provide the following information:
- In this example, you can select the outputs from the Request trigger.
+ | Property | Required | Description |
+ |-|-|-|
+ | **body** | Yes | The message payload, for example, the **Body** output from the Request trigger. <br><br>1. Put your cursor in the **body** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Body**. |
+ | **Headers** | Yes | The headers for the message to decode, for example, the **Headers** output from the Request trigger. <br><br>1. Put your cursor in the **Headers** box so that the dynamic content list opens. <br>2. Next to the section name **When a HTTP request is received**, select **See more**. <br>3. From the outputs list, select **Headers**. |
+
+ For example:
- ![Screenshot showing the Azure portal, workflow designer, and "Decode AS2 message" operation with the "Body" and "Headers" output selected from the Request trigger.](media/logic-apps-enterprise-integration-as2/decode-as2-message-details.png)
+ ![Screenshot showing the "Decode AS2 message" action with the "Body" and "Headers" outputs entered from the Request trigger.](media/logic-apps-enterprise-integration-as2/decode-as2-message-details-managed-standard.png)
For information about the AS2 connector limits for workflows running in [multi-t
To try deploying a fully operational logic app and sample AS2 (v2) scenario, review the [AS2 (v2) logic app template and scenario](https://azure.microsoft.com/resources/templates/logic-app-as2-send-receive/).
+## Troubleshoot problems
+
+* Problems when sending signed or encrypted messages
+
+ Consider trying different SHA256 algorithm formats. The AS2 specification doesn't provide any information about SHA256 formats, so each provider uses their own implementation or format.
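For background on why the format matters: AS2 receipts carry a message integrity check (MIC), which for SHA-256 is the base64-encoded digest of the payload, while the algorithm label itself varies by provider (for example, `sha256` versus `sha-256`). A small illustration of the digest computation (the payload is an example):

```python
import base64
import hashlib

payload = b"<order><id>12345</id></order>"  # example business payload

# SHA-256 digest of the payload; a receipt's MIC value is the base64
# form of a digest like this one.
digest = hashlib.sha256(payload).digest()
mic = base64.b64encode(digest).decode("ascii")

# The digest bytes are identical regardless of how the algorithm is
# labeled; only the identifier string ("sha256", "sha-256", ...) differs.
print(len(digest))
print(mic)
```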
+ ## Next steps
-* Learn about other [connectors for Azure Logic Apps](../connectors/apis-list.md)
+* [Managed connectors for Azure Logic Apps](/connectors/connector-reference/connector-reference-logicapps-connectors)
machine-learning Concept Enterprise Security https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/concept-enterprise-security.md
For more information, see the following documents:
* [Virtual network isolation and privacy overview](how-to-network-security-overview.md)
* [Secure workspace resources](how-to-secure-workspace-vnet.md)
* [Secure training environment](how-to-secure-training-vnet.md)
-* [Secure inference environment](/how-to-secure-inferencing-vnet.md)
+* [Secure inference environment](/azure/machine-learning/how-to-secure-inferencing-vnet)
* [Use studio in a secured virtual network](how-to-enable-studio-virtual-network.md)
* [Use custom DNS](how-to-custom-dns.md)
* [Configure firewall](how-to-access-azureml-behind-firewall.md)
machine-learning How To Configure Network Isolation With V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-configure-network-isolation-with-v2.md
The Azure Machine Learning CLI v2 uses our new v2 API platform. New features suc
As mentioned in the previous section, there are two types of operations; with ARM and with the workspace. With the __legacy v1 API__, most operations used the workspace. With the v1 API, adding a private endpoint to the workspace provided network isolation for everything except CRUD operations on the workspace or compute resources.
-With the __new v2 API__, most operations use ARM. So enabling a private endpoint on your workspace doesn't provide the same level of network isolation. Operations that use ARM communicate over public networks, and include any metadata (such as your resource IDs) or parameters used by the operation. For example, the [create or update job](/rest/api/azureml/2022-05-01/jobs/create-or-update) api sends metadata, and [parameters](./reference-yaml-job-command.md).
+With the __new v2 API__, most operations use ARM. So enabling a private endpoint on your workspace doesn't provide the same level of network isolation. Operations that use ARM communicate over public networks, and include any metadata (such as your resource IDs) or parameters used by the operation. For example, the [create or update job](/rest/api/azureml/2022-10-01/jobs/create-or-update) api sends metadata, and [parameters](./reference-yaml-job-command.md).
> [!IMPORTANT]
> For most people, using the public ARM communications is OK:
machine-learning How To Create Component Pipelines Ui https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-create-component-pipelines-ui.md
The example below uses the CLI. If you want to learn more about
## Next steps

-- Use [these Jupyter notebooks on GitHub](https://github.com/Azure/azureml-examples/tree/pipeline/builder_function_samples/cli/jobs/pipelines-with-components) to explore machine learning pipelines further
+- Use [these Jupyter notebooks on GitHub](https://github.com/Azure/azureml-examples/tree/main/cli/jobs/pipelines-with-components) to explore machine learning pipelines further
- Learn [how to use CLI v2 to create pipeline using components](how-to-create-component-pipelines-cli.md).
- Learn [how to use SDK v2 to create pipeline using components](how-to-create-component-pipeline-python.md)
machine-learning How To Create Data Assets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-create-data-assets.md
When you create a data asset in Azure Machine Learning, you'll need to specify a
- The schema of your data is complex and/or changes frequently.
- You only need a subset of data (for example: a sample of rows or files, specific columns, etc).
- AutoML jobs requiring tabular data.
+
If your scenario does not fit the above, then it is likely that URIs are a more suitable type.

## Create a `uri_folder` data asset
machine-learning How To Deploy Managed Online Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-deploy-managed-online-endpoints.md
The `update` command also works with local deployments. Use the same `az ml onli
# [Python](#tab/python)
-If you want to update the code, model, or environment, update the configuration, and then run the `MLClient`'s [`online_deployments.begin_create_or_update`](/python/api/azure-ai-ml/azure.ai.ml.operations.onlinedeploymentoperations.md#azure-ai-ml-operations-onlinedeploymentoperations-begin-create-or-update) module/method.
+If you want to update the code, model, or environment, update the configuration, and then run the `MLClient`'s [`online_deployments.begin_create_or_update`](/python/api/azure-ai-ml/azure.ai.ml.operations.onlinedeploymentoperations#azure-ai-ml-operations-onlinedeploymentoperations-begin-create-or-update) module/method.
> [!NOTE]
> If you update the instance count along with other model settings (code, model, or environment) in a single `begin_create_or_update` method, the scaling operation is performed first, then the other updates are applied. In a production environment, it's a good practice to perform these operations separately.
machine-learning How To Identity Based Service Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-identity-based-service-authentication.md
If you're training a model on a remote compute target and want to access the dat
Certain machine learning scenarios involve working with private data. In such cases, data scientists may not have direct access to data as Azure AD users. In this scenario, the managed identity of a compute can be used for data access authentication. In this scenario, the data can only be accessed from a compute instance or a machine learning compute cluster executing a training job.
-With this approach, the admin grants the compute instance or compute cluster managed identity Storage Blob Data Reader permissions on the storage. The individual data scientists don't need to be granted access. For more information on configuring the managed identity for the compute cluster, see the [compute cluster](#compute-cluster) section. For information on using configuring Azure RBAC for the storage, see [role-based access controls](/storage/blobs/assign-azure-role-data-access).
+With this approach, the admin grants the compute instance or compute cluster managed identity Storage Blob Data Reader permissions on the storage. The individual data scientists don't need to be granted access. For more information on configuring the managed identity for the compute cluster, see the [compute cluster](#compute-cluster) section. For information on configuring Azure RBAC for the storage, see [role-based access controls](../storage/blobs/assign-azure-role-data-access.md).
### Work with virtual networks
The following steps outline how to set up identity-based data access for trainin
* Learn more about [enterprise security in Azure Machine Learning](concept-enterprise-security.md) * Learn about [data administration](how-to-administrate-data-authentication.md)
-* Learn about [managed identities on compute cluster](how-to-create-attach-compute-cluster.md).
+* Learn about [managed identities on compute cluster](how-to-create-attach-compute-cluster.md).
machine-learning How To Machine Learning Fairness Aml https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-machine-learning-fairness-aml.md
The following example shows how to use the fairness package. We will upload mode
1. If you registered your original model by following the previous steps, you can select **Models** in the left pane to view it.

1. Select a model, and then the **Fairness** tab to view the explanation visualization dashboard.
- To learn more about the visualization dashboard and what it contains, check out Fairlearn's [user guide](https://fairlearn.org/main/user_guide/assessment.html#fairlearn-dashboard).
+ To learn more about the visualization dashboard and what it contains, check out Fairlearn's [user guide](https://fairlearn.org/main/user_guide/assessment/index.html#fairlearn-dashboard).
## Upload fairness insights for multiple models
machine-learning How To Manage Workspace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-manage-workspace.md
The Azure Machine Learning workspace uses Azure Container Registry (ACR) for som
## Examples
-Examples in this article come from [workspace.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/resources/workspace/workspace.ipynb).
+Examples in this article come from [workspace.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/resources/workspace/workspace.ipynb).
## Next steps
machine-learning How To Prepare Datasets For Automl Images https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-prepare-datasets-for-automl-images.md
In this article, you learn how to prepare image data for training computer visio
To generate models for computer vision tasks with automated machine learning, you need to bring labeled image data as input for model training in the form of an `MLTable`. You can create an `MLTable` from labeled training data in JSONL format.
-If your labeled training data is in a different format (like, pascal VOC or COCO), you can use a [conversion script](https://github.com/Azure/azureml-examples/blob/main/sdk/jobs/automl-standalone-jobs/automl-image-object-detection-task-fridge-items/coco2jsonl.py) to first convert it to JSONL, and then create an `MLTable`. Alternatively, you can use Azure Machine Learning's [data labeling tool](how-to-create-image-labeling-projects.md) to manually label images, and export the labeled data to use for training your AutoML model.
+If your labeled training data is in a different format (like, pascal VOC or COCO), you can use a [conversion script](https://github.com/Azure/azureml-examples/blob/main/sdk/python/jobs/automl-standalone-jobs/automl-image-object-detection-task-fridge-items/coco2jsonl.py) to first convert it to JSONL, and then create an `MLTable`. Alternatively, you can use Azure Machine Learning's [data labeling tool](how-to-create-image-labeling-projects.md) to manually label images, and export the labeled data to use for training your AutoML model.
## Prerequisites
machine-learning How To Safely Rollout Managed Endpoints Sdk V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-safely-rollout-managed-endpoints-sdk-v2.md
-Title: Safe rollout for managed online endpoints using Python SDK v2
-description: Safe rollout for online endpoints using Python SDK v2.
-Previously updated : 05/25/2022
-# Safe rollout for managed online endpoints using Python SDK v2
--
-In this article, you learn how to deploy a new version of the model without causing any disruption. You'll use blue-green deployment (also called safe rollout), an approach in which a new version of a web service is introduced to production by rolling out the change to a small subset of users/requests before rolling it out completely. This article assumes you're using online endpoints; for more information, see [Azure Machine Learning endpoints](concept-endpoints.md).
-
-In this article, you'll learn to:
-
-* Deploy a new online endpoint called "blue" that serves version 1 of the model.
-* Scale this deployment so that it can handle more requests.
-* Deploy version 2 of the model to an endpoint called "green" that accepts no live traffic.
-* Test the green deployment in isolation.
-* Send 10% of live traffic to the green deployment.
-* Fully cut-over all live traffic to the green deployment.
-* Delete the now-unused v1 blue deployment.
-
-## Prerequisites
-
-* If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://azure.microsoft.com/free/) today.
-* The [Azure Machine Learning SDK v2 for Python](/python/api/overview/azure/ml/installv2).
-* You must have an Azure resource group, and you (or the service principal you use) must have Contributor access to it.
-* You must have an Azure Machine Learning workspace.
-* To deploy locally, you must install [Docker Engine](https://docs.docker.com/engine/) on your local computer. We highly recommend this option because it makes debugging issues easier.
-
-### Clone examples repository
-
-To run the training examples, first clone the examples repository and change into the `sdk` directory:
-
-```bash
-git clone --depth 1 https://github.com/Azure/azureml-examples
-cd azureml-examples/sdk
-```
-
-> [!TIP]
-> Use `--depth 1` to clone only the latest commit to the repository, which reduces time to complete the operation.
-
-## Connect to Azure Machine Learning workspace
-
-The [workspace](concept-workspace.md) is the top-level resource for Azure Machine Learning, providing a centralized place to work with all the artifacts you create when you use Azure Machine Learning. In this section, we'll connect to the workspace in which you'll perform deployment tasks.
-
-1. Import the required libraries:
-
- ```python
- # import required libraries
- from azure.ai.ml import MLClient
- from azure.ai.ml.entities import (
- ManagedOnlineEndpoint,
- ManagedOnlineDeployment,
- Model,
- Environment,
- CodeConfiguration,
- )
- from azure.identity import DefaultAzureCredential
- ```
-
-1. Configure workspace details and get a handle to the workspace:
-
- To connect to a workspace, we need identifier parameters - a subscription, resource group and workspace name. We'll use these details in the `MLClient` from `azure.ai.ml` to get a handle to the required Azure Machine Learning workspace. This example uses the [default Azure authentication](/python/api/azure-identity/azure.identity.defaultazurecredential).
-
- ```python
- # enter details of your AzureML workspace
- subscription_id = "<SUBSCRIPTION_ID>"
- resource_group = "<RESOURCE_GROUP>"
- workspace = "<AZUREML_WORKSPACE_NAME>"
- ```
-
- ```python
- # get a handle to the workspace
- ml_client = MLClient(
- DefaultAzureCredential(), subscription_id, resource_group, workspace
- )
- ```
-
-## Create online endpoint
-
-Online endpoints are endpoints that are used for online (real-time) inferencing. Online endpoints contain deployments that are ready to receive data from clients and can send responses back in real time.
-
-To create an online endpoint, we'll use `ManagedOnlineEndpoint`. This class allows users to configure the following key aspects:
-
-* `name` - Name of the endpoint. Needs to be unique at the Azure region level
-* `auth_mode` - The authentication method for the endpoint. Key-based authentication and Azure ML token-based authentication are supported. Key-based authentication doesn't expire but Azure ML token-based authentication does. Possible values are `key` or `aml_token`.
-* `identity`- The managed identity configuration for accessing Azure resources for endpoint provisioning and inference.
- * `type`- The type of managed identity. Azure Machine Learning supports `system_assigned` or `user_assigned` identity.
- * `user_assigned_identities` - List (array) of fully qualified resource IDs of the user-assigned identities. This property is required if `identity.type` is user_assigned.
-* `description`- Description of the endpoint.
-
-1. Configure the endpoint:
-
- ```python
- # Creating a unique endpoint name with current datetime to avoid conflicts
- import datetime
-
- online_endpoint_name = "endpoint-" + datetime.datetime.now().strftime("%m%d%H%M%f")
-
- # create an online endpoint
- endpoint = ManagedOnlineEndpoint(
- name=online_endpoint_name,
- description="this is a sample online endpoint",
- auth_mode="key",
- tags={"foo": "bar"},
- )
- ```
-
-1. Create the endpoint:
-
- Using the `MLClient` created earlier, we'll now create the Endpoint in the workspace. This command will start the endpoint creation and return a confirmation response while the endpoint creation continues.
-
- ```python
- ml_client.begin_create_or_update(endpoint)
- ```
-
-## Create the 'blue' deployment
-
-A deployment is a set of resources required for hosting the model that does the actual inferencing. We'll create a deployment for our endpoint using the `ManagedOnlineDeployment` class. This class allows users to configure the following key aspects.
-
-**Key aspects of deployment**
-* `name` - Name of the deployment.
-* `endpoint_name` - Name of the endpoint to create the deployment under.
-* `model` - The model to use for the deployment. This value can be either a reference to an existing versioned model in the workspace or an inline model specification.
-* `environment` - The environment to use for the deployment. This value can be either a reference to an existing versioned environment in the workspace or an inline environment specification.
-* `code_configuration` - the configuration for the source code and scoring script
- * `path`- Path to the source code directory for scoring the model
- * `scoring_script` - Relative path to the scoring file in the source code directory
-* `instance_type` - The VM size to use for the deployment. For the list of supported sizes, see [Managed online endpoints SKU list](reference-managed-online-endpoints-vm-sku-list.md).
-* `instance_count` - The number of instances to use for the deployment
-
-1. Configure blue deployment:
-
- ```python
- # create blue deployment
- model = Model(path="../model-1/model/sklearn_regression_model.pkl")
- env = Environment(
- conda_file="../model-1/environment/conda.yml",
- image="mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04:20210727.v1",
- )
-
- blue_deployment = ManagedOnlineDeployment(
- name="blue",
- endpoint_name=online_endpoint_name,
- model=model,
- environment=env,
- code_configuration=CodeConfiguration(
- code="../model-1/onlinescoring", scoring_script="score.py"
- ),
- instance_type="Standard_F2s_v2",
- instance_count=1,
- )
- ```
-
-1. Create the deployment:
-
- Using the `MLClient` created earlier, we'll now create the deployment in the workspace. This command will start the deployment creation and return a confirmation response while the deployment creation continues.
-
- ```python
- ml_client.begin_create_or_update(blue_deployment)
- ```
-
- ```python
- # blue deployment takes 100% of traffic
- endpoint.traffic = {"blue": 100}
- ml_client.begin_create_or_update(endpoint)
- ```
-
-## Test the endpoint with sample data
-
-Using the `MLClient` created earlier, we'll get a handle to the endpoint. The endpoint can be invoked using the `invoke` command with the following parameters:
-
-* `endpoint_name` - Name of the endpoint
-* `request_file` - File with request data
-* `deployment_name` - Name of the specific deployment to test in an endpoint
-
-We'll send a sample request using a [json](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/model-1/sample-request.json) file.
-
-```python
-# test the blue deployment with some sample data
-ml_client.online_endpoints.invoke(
- endpoint_name=online_endpoint_name,
- deployment_name="blue",
- request_file="../model-1/sample-request.json",
-)
-```
-
-## Scale the deployment
-
-Using the `MLClient` created earlier, we'll get a handle to the deployment. The deployment can be scaled by increasing or decreasing the `instance_count`.
-
-```python
-# scale the deployment
-blue_deployment = ml_client.online_deployments.get(
- name="blue", endpoint_name=online_endpoint_name
-)
-blue_deployment.instance_count = 2
-ml_client.online_deployments.begin_create_or_update(blue_deployment)
-```
-
-## Get endpoint details
-
-```python
-# Get the details for online endpoint
-endpoint = ml_client.online_endpoints.get(name=online_endpoint_name)
-
-# existing traffic details
-print(endpoint.traffic)
-
-# Get the scoring URI
-print(endpoint.scoring_uri)
-```
-
-## Deploy a new model, but send no traffic yet
-
-Create a new deployment named green:
-
-```python
-# create green deployment
-model2 = Model(path="../model-2/model/sklearn_regression_model.pkl")
-env2 = Environment(
- conda_file="../model-2/environment/conda.yml",
- image="mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04:20210727.v1",
-)
-
-green_deployment = ManagedOnlineDeployment(
- name="green",
- endpoint_name=online_endpoint_name,
- model=model2,
- environment=env2,
- code_configuration=CodeConfiguration(
- code="../model-2/onlinescoring", scoring_script="score.py"
- ),
- instance_type="Standard_F2s_v2",
- instance_count=1,
-)
-```
-
-```python
-# use MLClient to create green deployment
-ml_client.begin_create_or_update(green_deployment)
-```
-
-### Test the new deployment
-
-Though green has 0% of traffic allocated, you can still invoke the endpoint and deployment with [json](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/model-2/sample-request.json) file.
-
-```python
-ml_client.online_endpoints.invoke(
- endpoint_name=online_endpoint_name,
- deployment_name="green",
- request_file="../model-2/sample-request.json",
-)
-```
-
-## Test the deployment with mirrored traffic (preview)
--
-Once you've tested your `green` deployment, you can copy (or 'mirror') a percentage of the live traffic to it. Mirroring traffic doesn't change the results returned to clients; requests still flow 100% to the blue deployment. The mirrored percentage of traffic is copied and submitted to the `green` deployment so that you can gather metrics and logging without impacting your clients. Mirroring is useful when you want to validate a new deployment without affecting clients, for example, to check that latency is within acceptable bounds and that there are no HTTP errors.
-
-> [!WARNING]
-> Mirroring traffic uses your [endpoint bandwidth quota](how-to-manage-quotas.md#azure-machine-learning-managed-online-endpoints) (default 5 Mbps). Your endpoint bandwidth will be throttled if you exceed the allocated quota. For information on monitoring bandwidth throttling, see [Monitor managed online endpoints](how-to-monitor-online-endpoints.md#metrics-at-endpoint-scope).
-
-The following command mirrors 10% of the traffic to the `green` deployment:
-
-```python
-endpoint.mirror_traffic = {"green": 10}
-ml_client.begin_create_or_update(endpoint)
-```
-
-> [!IMPORTANT]
-> Mirroring has the following limitations:
-> * You can only mirror traffic to one deployment.
-> * Mirrored traffic is not currently supported with Kubernetes online endpoints.
-> * The maximum mirrored traffic you can configure is 50%. This limit is to reduce the impact on your endpoint bandwidth quota.
->
-> Also note the following behavior:
-> * A deployment can only be set to live or mirror traffic, not both.
-> * You can send traffic directly to the mirror deployment by specifying the deployment set for mirror traffic.
-> * You can send traffic directly to a live deployment by specifying the deployment set for live traffic, but in this case the traffic won't be mirrored to the mirror deployment. Mirror traffic is routed from traffic sent to endpoint without specifying the deployment.
--
-After testing, you can set the mirror traffic to zero to disable mirroring:
-
-```python
-endpoint.mirror_traffic = {"green": 0}
-ml_client.begin_create_or_update(endpoint)
-```
-
-## Test the new deployment with a small percentage of live traffic
-
-Once you've tested your green deployment, allocate a small percentage of traffic to it:
-
-```python
-endpoint.traffic = {"blue": 90, "green": 10}
-ml_client.begin_create_or_update(endpoint)
-```
-
-Now, your green deployment will receive 10% of requests.
-
-
-## Send all traffic to your new deployment
-
-Once you're satisfied with your green deployment, switch all traffic to it.
-
-```python
-endpoint.traffic = {"blue": 0, "green": 100}
-ml_client.begin_create_or_update(endpoint)
-```
-
-## Remove the old deployment
-
-```python
-ml_client.online_deployments.delete(name="blue", endpoint_name=online_endpoint_name)
-```
-
-## Delete endpoint
-
-If you aren't going to use the endpoint, delete it with:
-
-```python
-ml_client.online_endpoints.begin_delete(name=online_endpoint_name)
-```
-
-## Next steps
-- [Explore online endpoint samples](https://github.com/Azure/azureml-examples/tree/main/sdk/python/endpoints)
-- [Access Azure resources with an online endpoint and managed identity](how-to-access-resources-from-endpoints-managed-identities.md)
-- [Monitor managed online endpoints](how-to-monitor-online-endpoints.md)
-- [Manage and increase quotas for resources with Azure Machine Learning](how-to-manage-quotas.md#azure-machine-learning-managed-online-endpoints)
-- [View costs for an Azure Machine Learning managed online endpoint](how-to-view-online-endpoints-costs.md)
-- [Managed online endpoints SKU list](reference-managed-online-endpoints-vm-sku-list.md)
-- [Troubleshooting online endpoints deployment and scoring](how-to-troubleshoot-managed-online-endpoints.md)
machine-learning How To Safely Rollout Managed Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-safely-rollout-managed-endpoints.md
Previously updated : 04/29/2022
Last updated : 10/27/2022

# Safe rollout for online endpoints

[!INCLUDE [cli v2](../../includes/machine-learning-cli-v2.md)]
-You've an existing model deployed in production and you want to deploy a new version of the model. How do you roll out your new ML model without causing any disruption? A good answer is blue-green deployment, an approach in which a new version of a web service is introduced to production by rolling out the change to a small subset of users/requests before rolling it out completely. This article assumes you're using online endpoints; for more information, see [What are Azure Machine Learning endpoints?](concept-endpoints.md).
+
+In this article, you'll learn how to deploy a new version of a machine learning model in production without causing any disruption. You'll use blue-green deployment, also known as a safe rollout strategy, to introduce a new version of a web service to production. This strategy will allow you to roll out your new version of the web service to a small subset of users or requests before rolling it out completely.
+
+This article assumes you're using online endpoints, that is, endpoints that are used for online (real-time) inferencing. There are two types of online endpoints: **managed online endpoints** and **Kubernetes online endpoints**. For more information on endpoints and the differences between managed online endpoints and Kubernetes online endpoints, see [What are Azure Machine Learning endpoints?](concept-endpoints.md#managed-online-endpoints-vs-kubernetes-online-endpoints).
+
+> [!Note]
+> The main example in this article uses managed online endpoints for deployment. To use Kubernetes endpoints instead, see the notes that appear inline throughout the managed online endpoints discussion.
In this article, you'll learn to: > [!div class="checklist"]
-> * Deploy a new online endpoint called "blue" that serves version 1 of the model
-> * Scale this deployment so that it can handle more requests
-> * Deploy version 2 of the model to an endpoint called "green" that accepts no live traffic
-> * Test the green deployment in isolation
-> * Send 10% of live traffic to the green deployment
-> * Fully cut-over all live traffic to the green deployment
+> * Define an online endpoint and a deployment called "blue" to serve version 1 of a model
+> * Scale the blue deployment so that it can handle more requests
+> * Deploy version 2 of the model (called the "green" deployment) to the endpoint, but send the deployment no live traffic
+> * Test the green deployment in isolation
+> * Mirror a percentage of live traffic to the green deployment to validate it (preview)
+> * Send a small percentage of live traffic to the green deployment
+> * Send over all live traffic to the green deployment
> * Delete the now-unused v1 blue deployment ## Prerequisites
-* To use Azure machine learning, you must have an Azure subscription. If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://azure.microsoft.com/free/) today.
-
-* You must install and configure the Azure CLI and ML extension. For more information, see [Install, set up, and use the CLI (v2)](how-to-configure-cli.md).
+# [Azure CLI](#tab/azure-cli)
-* You must have an Azure Resource group, in which you (or the service principal you use) need to have `Contributor` access. You'll have such a resource group if you configured your ML extension per the above article.
-* You must have an Azure Machine Learning workspace. You'll have such a workspace if you configured your ML extension per the above article.
+* Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure Machine Learning. To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure Machine Learning workspace, or a custom role allowing `Microsoft.MachineLearningServices/workspaces/onlineEndpoints/*`. For more information, see [Manage access to an Azure Machine Learning workspace](how-to-assign-roles.md).
-* If you've not already set the defaults for Azure CLI, you should save your default settings. To avoid having to repeatedly pass in the values, run:
+* If you haven't already set the defaults for the Azure CLI, save your default settings. To avoid passing in the values for your subscription, workspace, and resource group multiple times, run this code:
```azurecli
az account set --subscription <subscription id>
az configure --defaults workspace=<azureml workspace name> group=<resource group>
```
-* An existing online endpoint and deployment. This article assumes that your deployment is as described in [Deploy and score a machine learning model with an online endpoint](how-to-deploy-managed-online-endpoints.md).
+* (Optional) To deploy locally, you must [install Docker Engine](https://docs.docker.com/engine/install/) on your local computer. We *highly recommend* this option because it makes debugging issues easier.
+
+# [Python](#tab/python)
+++
+* Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure Machine Learning. To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure Machine Learning workspace, or a custom role allowing `Microsoft.MachineLearningServices/workspaces/onlineEndpoints/*`. For more information, see [Manage access to an Azure Machine Learning workspace](how-to-assign-roles.md).
+
+* (Optional) To deploy locally, you must [install Docker Engine](https://docs.docker.com/engine/install/) on your local computer. We *highly recommend* this option because it makes debugging issues easier.
+++
+## Prepare your system
+
+# [Azure CLI](#tab/azure-cli)
+
+### Clone the examples repository
+
+To follow along with this article, first clone the [examples repository (azureml-examples)](https://github.com/azure/azureml-examples). Then, go to the repository's `cli/` directory:
+
+```azurecli
+git clone --depth 1 https://github.com/Azure/azureml-examples
+cd azureml-examples
+cd cli
+```
+
+> [!TIP]
+> Use `--depth 1` to clone only the latest commit to the repository. This reduces the time to complete the operation.
+
+The commands in this tutorial are in the file `deploy-safe-rollout-online-endpoints.sh` in the `cli` directory, and the YAML configuration files are in the `endpoints/online/managed/sample/` subdirectory.
+
+> [!NOTE]
+> The YAML configuration files for Kubernetes online endpoints are in the `endpoints/online/kubernetes/` subdirectory.
+
+# [Python](#tab/python)
+
+### Clone the examples repository
+
+To run the training examples, first clone the [examples repository (azureml-examples)](https://github.com/azure/azureml-examples). Then, go into the `azureml-examples/sdk/python/endpoints/online/managed` directory:
+
+```bash
+git clone --depth 1 https://github.com/Azure/azureml-examples
+cd azureml-examples/sdk/python/endpoints/online/managed
+```
+
+> [!TIP]
+> Use `--depth 1` to clone only the latest commit to the repository. This reduces the time to complete the operation.
+
+The information in this article is based on the [online-endpoints-safe-rollout.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb) notebook. It contains the same content as this article, although the order of the code cells is slightly different.
+
+> [!NOTE]
+> The steps for the Kubernetes online endpoint are based on the [kubernetes-online-endpoints-safe-rollout.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/kubernetes/kubernetes-online-endpoints-safe-rollout.ipynb) notebook.
+
+### Connect to Azure Machine Learning workspace
+
+The [workspace](concept-workspace.md) is the top-level resource for Azure Machine Learning, providing a centralized place to work with all the artifacts you create when you use Azure Machine Learning. In this section, we'll connect to the workspace where you'll perform deployment tasks.
+
+1. Import the required libraries:
+
+ [!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=import_libraries)]
+
+ > [!NOTE]
+ > If you're using the Kubernetes online endpoint, import the `KubernetesOnlineEndpoint` and `KubernetesOnlineDeployment` class from the `azure.ai.ml.entities` library.
+
+1. Configure workspace details and get a handle to the workspace:
+
+ To connect to a workspace, we need identifier parameters: a subscription, resource group, and workspace name. We'll use these details in the `MLClient` from `azure.ai.ml` to get a handle to the required Azure Machine Learning workspace. This example uses the [default Azure authentication](/python/api/azure-identity/azure.identity.defaultazurecredential).
+
+ [!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=workspace_details)]
+
+ [!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=workspace_handle)]
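The two notebook snippets above are pulled in by reference. As a minimal sketch of what they do (the subscription, resource group, and workspace values are placeholders you'd replace with your own), connecting to the workspace looks like this:

```python
# import required libraries
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# enter details of your Azure ML workspace (placeholder values)
subscription_id = "<SUBSCRIPTION_ID>"
resource_group = "<RESOURCE_GROUP>"
workspace = "<AZUREML_WORKSPACE_NAME>"

# get a handle to the workspace using the default Azure credential chain
ml_client = MLClient(
    DefaultAzureCredential(), subscription_id, resource_group, workspace
)
```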
-* If you haven't already set the environment variable $ENDPOINT_NAME, do so now:
++
+## Define the endpoint and deployment
+
+Online endpoints are used for online (real-time) inferencing. Online endpoints contain deployments that are ready to receive data from clients and can send responses back in real time.
+
+# [Azure CLI](#tab/azure-cli)
+
+### Create online endpoint
+
+To create an online endpoint:
+
+1. Set your endpoint name:
+
+ For Unix, run this command (replace `YOUR_ENDPOINT_NAME` with a unique name):
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="set_endpoint_name":::
-* (Recommended) Clone the samples repository and switch to the repository's `cli/` directory:
+ > [!IMPORTANT]
+ > Endpoint names must be unique within an Azure region. For example, in the Azure `westus2` region, there can be only one endpoint with the name `my-endpoint`.
- ```azurecli
- git clone https://github.com/Azure/azureml-examples
- cd azureml-examples/cli
- ```
+1. To create the endpoint in the cloud, run the following code:
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="create_endpoint":::
+
+### Create the 'blue' deployment
+
+A deployment is a set of resources required for hosting the model that does the actual inferencing. To create a deployment named `blue` for your endpoint, run the following command:
+
+ :::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="create_blue":::
+
+# [Python](#tab/python)
+
+### Create online endpoint
+
+To create a managed online endpoint, use the `ManagedOnlineEndpoint` class. This class allows users to configure the following key aspects of the endpoint:
+
+* `name` - Name of the endpoint. Needs to be unique at the Azure region level
+* `auth_mode` - The authentication method for the endpoint. Key-based authentication and Azure ML token-based authentication are supported. Key-based authentication doesn't expire but Azure ML token-based authentication does. Possible values are `key` or `aml_token`.
+* `identity`- The managed identity configuration for accessing Azure resources for endpoint provisioning and inference.
+ * `type`- The type of managed identity. Azure Machine Learning supports `system_assigned` or `user_assigned` identity.
+ * `user_assigned_identities` - List (array) of fully qualified resource IDs of the user-assigned identities. This property is required if `identity.type` is user_assigned.
+* `description`- Description of the endpoint.
+
+1. Configure the endpoint:
+
+ [!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=configure_endpoint)]
-The commands in this tutorial are in the file `deploy-safe-rollout-online-endpoints.sh` and the YAML configuration files are in the `endpoints/online/managed/sample/` subdirectory.
+ > [!NOTE]
+ > To create a Kubernetes online endpoint, use the `KubernetesOnlineEndpoint` class.
-## Confirm your existing deployment is created
+1. Create the endpoint:
-You can view the status of your existing endpoint and deployment by running:
+ [!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=create_endpoint)]
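For reference, the configure and create snippets included above correspond to code along these lines. This is a sketch based on the sample notebook; the endpoint description, tags, and workspace identifiers are illustrative placeholders:

```python
import datetime

from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineEndpoint
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(), "<SUBSCRIPTION_ID>", "<RESOURCE_GROUP>", "<WORKSPACE_NAME>"
)

# a timestamp suffix keeps the endpoint name unique within the Azure region
online_endpoint_name = "endpoint-" + datetime.datetime.now().strftime("%m%d%H%M%f")

endpoint = ManagedOnlineEndpoint(
    name=online_endpoint_name,
    description="this is a sample online endpoint",
    auth_mode="key",
    tags={"foo": "bar"},
)

# creation starts asynchronously and continues after this call returns
ml_client.begin_create_or_update(endpoint)
```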
+
+### Create the 'blue' deployment
+
+A deployment is a set of resources required for hosting the model that does the actual inferencing. To create a deployment for your managed online endpoint, use the `ManagedOnlineDeployment` class. This class allows users to configure the following key aspects of the deployment:
+
+**Key aspects of deployment**
+* `name` - Name of the deployment.
+* `endpoint_name` - Name of the endpoint to create the deployment under.
+* `model` - The model to use for the deployment. This value can be either a reference to an existing versioned model in the workspace or an inline model specification.
+* `environment` - The environment to use for the deployment. This value can be either a reference to an existing versioned environment in the workspace or an inline environment specification.
+* `code_configuration` - the configuration for the source code and scoring script
+ * `path`- Path to the source code directory for scoring the model
+ * `scoring_script` - Relative path to the scoring file in the source code directory
+* `instance_type` - The VM size to use for the deployment. For the list of supported sizes, see [Managed online endpoints SKU list](reference-managed-online-endpoints-vm-sku-list.md).
+* `instance_count` - The number of instances to use for the deployment
+
+1. Configure blue deployment:
+
+ [!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=configure_deployment)]
+
+ > [!NOTE]
+ > To create a deployment for a Kubernetes online endpoint, use the `KubernetesOnlineDeployment` class.
+
+1. Create the deployment:
+
+ [!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=create_deployment)]
+
+ [!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=deployment_traffic)]
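Pieced together, the deployment snippets above look roughly like the following sketch. The model and environment paths are relative to the sample repository's `model-1` folder, and the endpoint name and workspace identifiers are placeholders:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import (
    CodeConfiguration,
    Environment,
    ManagedOnlineDeployment,
    Model,
)
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(), "<SUBSCRIPTION_ID>", "<RESOURCE_GROUP>", "<WORKSPACE_NAME>"
)

# model and environment are specified inline rather than as registered assets
model = Model(path="../model-1/model/sklearn_regression_model.pkl")
env = Environment(
    conda_file="../model-1/environment/conda.yml",
    image="mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04:20210727.v1",
)

blue_deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="<YOUR_ENDPOINT_NAME>",
    model=model,
    environment=env,
    code_configuration=CodeConfiguration(
        code="../model-1/onlinescoring", scoring_script="score.py"
    ),
    instance_type="Standard_F2s_v2",
    instance_count=1,
)

ml_client.begin_create_or_update(blue_deployment)

# route 100% of live traffic to the blue deployment
endpoint = ml_client.online_endpoints.get(name="<YOUR_ENDPOINT_NAME>")
endpoint.traffic = {"blue": 100}
ml_client.begin_create_or_update(endpoint)
```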
+++
+## Confirm your existing deployment
+
+# [Azure CLI](#tab/azure-cli)
+
+You can view the status of your existing endpoint and deployment by running:
```azurecli
az ml online-endpoint show --name $ENDPOINT_NAME
az ml online-deployment show --name blue --endpoint $ENDPOINT_NAME
```
-You should see the endpoint identified by `$ENDPOINT_NAME` and, a deployment called `blue`.
+You should see the endpoint identified by `$ENDPOINT_NAME` and a deployment called `blue`.
+
+### Test the endpoint with sample data
+
+The endpoint can be invoked using the `invoke` command. We'll send a sample request using a [json](https://github.com/Azure/azureml-examples/tree/main/sdk/python/endpoints/online/model-1/sample-request.json) file.
++
+# [Python](#tab/python)
+
+Check the status to see whether the model was deployed without error:
+
+```python
+ml_client.online_endpoints.get(name=online_endpoint_name)
+```
+
+### Test the endpoint with sample data
+
+Using the `MLClient` created earlier, we'll get a handle to the endpoint. The endpoint can be invoked using the `invoke` command with the following parameters:
+
+* `endpoint_name` - Name of the endpoint
+* `request_file` - File with request data
+* `deployment_name` - Name of the specific deployment to test in an endpoint
+
+We'll send a sample request using a [json](https://github.com/Azure/azureml-examples/tree/main/sdk/python/endpoints/online/model-1/sample-request.json) file.
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=test_deployment)]
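The include above maps to an `invoke` call along these lines (the endpoint name, workspace identifiers, and request-file path are placeholders):

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(), "<SUBSCRIPTION_ID>", "<RESOURCE_GROUP>", "<WORKSPACE_NAME>"
)

# test the blue deployment with some sample data
ml_client.online_endpoints.invoke(
    endpoint_name="<YOUR_ENDPOINT_NAME>",
    deployment_name="blue",
    request_file="../model-1/sample-request.json",
)
```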
++ ## Scale your existing deployment to handle more traffic
+# [Azure CLI](#tab/azure-cli)
+ In the deployment described in [Deploy and score a machine learning model with an online endpoint](how-to-deploy-managed-online-endpoints.md), you set the `instance_count` to the value `1` in the deployment yaml file. You can scale out using the `update` command: :::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="scale_blue" :::
In the deployment described in [Deploy and score a machine learning model with a
> [!Note] > Notice that in the above command we use `--set` to override the deployment configuration. Alternatively you can update the yaml file and pass it as an input to the `update` command using the `--file` input.
+# [Python](#tab/python)
+
+Using the `MLClient` created earlier, we'll get a handle to the deployment. The deployment can be scaled by increasing or decreasing the `instance_count`.
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=scale_deployment)]
+
+### Get endpoint details
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=get_endpoint_details)]
+++
## Deploy a new model, but send it no traffic yet
-Create a new deployment named `green`:
+# [Azure CLI](#tab/azure-cli)
+
+Create a new deployment named `green`:
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="create_green" :::
-Since we haven't explicitly allocated any traffic to green, it will have zero traffic allocated to it. You can verify that using the command:
+Since we haven't explicitly allocated any traffic to `green`, it will have zero traffic allocated to it. You can verify that using the command:
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="get_traffic" :::
If you want to use a REST client to invoke the deployment directly without going
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="test_green_using_curl" :::
-## Test the deployment with mirrored traffic (preview)
+# [Python](#tab/python)
+
+Create a new deployment for your managed online endpoint and name the deployment `green`:
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=configure_new_deployment)]
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=create_new_deployment)]
+
+> [!NOTE]
+> If you're creating a deployment for a Kubernetes online endpoint, use the `KubernetesOnlineDeployment` class and specify a [Kubernetes instance type](how-to-manage-kubernetes-instance-types.md) in your Kubernetes cluster.
+
+### Test the new deployment
+
+Though `green` has 0% of traffic allocated, you can still invoke the endpoint and deployment with the [json](https://github.com/Azure/azureml-examples/tree/main/sdk/python/endpoints/online/model-2/sample-request.json) file.
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=test_new_deployment)]
+++
+## Test the deployment with mirrored traffic (preview)
[!INCLUDE [preview disclaimer](../../includes/machine-learning-preview-generic-disclaimer.md)]
-Once you've tested your `green` deployment, you can copy (or 'mirror') a percentage of the live traffic to it. Mirroring traffic doesn't change results returned to clients. Requests still flow 100% to the blue deployment. The mirrored percentage of the traffic is copied and submitted to the `green` deployment so you can gather metrics and logging without impacting your clients. Mirroring is useful when you want to validate a new deployment without impacting clients. For example, to check if latency is within acceptable bounds and that there are no HTTP errors.
+Once you've tested your `green` deployment, you can copy (or 'mirror') a percentage of the live traffic to it. Mirroring traffic doesn't change results returned to clients. Requests still flow 100% to the `blue` deployment. The mirrored percentage of the traffic is copied and submitted to the `green` deployment so you can gather metrics and logging without impacting your clients. Mirroring is useful when you want to validate a new deployment without impacting clients. For example, to check if latency is within acceptable bounds and that there are no HTTP errors.
> [!WARNING]
> Mirroring traffic uses your [endpoint bandwidth quota](how-to-manage-quotas.md#azure-machine-learning-managed-online-endpoints) (default 5 Mbps). Your endpoint bandwidth will be throttled if you exceed the allocated quota. For information on monitoring bandwidth throttling, see [Monitor managed online endpoints](how-to-monitor-online-endpoints.md#metrics-at-endpoint-scope).
+# [Azure CLI](#tab/azure-cli)
+
The following command mirrors 10% of the traffic to the `green` deployment:
+
+You can test mirror traffic by invoking the endpoint several times:
+ ```azurecli
-az ml online-endpoint update --name $ENDPOINT_NAME --mirror-traffic "green=10"
+for i in {1..20} ; do
+ az ml online-endpoint invoke --name $ENDPOINT_NAME --request-file endpoints/online/model-1/sample-request.json
+done
```
-> [!IMPORTANT]
-> Mirroring has the following limitations:
-> * You can only mirror traffic to one deployment.
-> * Mirrored traffic is not currently supported with K8s.
-> * The maximum mirrored traffic you can configure is 50%. This limit is to reduce the impact on your endpoint bandwidth quota.
->
-> Also note the following behavior:
-> * A deployment can only be set to live or mirror traffic, not both.
-> * You can send traffic directly to the mirror deployment by specifying the deployment set for mirror traffic.
-> * You can send traffic directly to a live deployment by specifying the deployment set for live traffic, but in this case the traffic won't be mirrored to the mirror deployment. Mirror traffic is routed from traffic sent to endpoint without specifying the deployment.
+# [Python](#tab/python)
+
+The following command mirrors 10% of the traffic to the `green` deployment:
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=new_deployment_traffic)]
+
+You can test mirror traffic by invoking the endpoint several times:
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=several_tests_to_mirror_traffic)]
+++
+Mirroring has the following limitations:
+* You can only mirror traffic to one deployment.
+* Mirror traffic isn't currently supported for Kubernetes online endpoints.
+* The maximum mirrored traffic you can configure is 50%. This limit is to reduce the impact on your endpoint bandwidth quota.
+
+Also note the following behavior:
+* A deployment can only be set to live or mirror traffic, not both.
+* You can send traffic directly to the mirror deployment by specifying the deployment set for mirror traffic.
+* You can send traffic directly to a live deployment by specifying the deployment set for live traffic, but in this case the traffic won't be mirrored to the mirror deployment. Mirror traffic is routed from traffic sent to endpoint without specifying the deployment.
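The routing behavior and limits above can be illustrated with a small simulation (hypothetical code, not the service implementation): live traffic is split across deployments by percentage, while the mirror deployment only receives copies, so client responses are unaffected.

```python
import random

def route(live: dict, mirror: dict, n: int, seed: int = 0):
    """Simulate n requests against an endpoint with live and mirror traffic."""
    if sum(live.values()) != 100:
        raise ValueError("live traffic must sum to 100")
    if sum(mirror.values()) > 50:
        raise ValueError("mirrored traffic is capped at 50%")
    rng = random.Random(seed)
    served = {name: 0 for name in live}
    mirrored = {name: 0 for name in mirror}
    for _ in range(n):
        r = rng.random() * 100
        cumulative = 0
        for name, pct in live.items():  # pick the live deployment that answers
            cumulative += pct
            if r < cumulative:
                served[name] += 1
                break
        for name, pct in mirror.items():  # copy a share; client is unaffected
            if rng.random() * 100 < pct:
                mirrored[name] += 1
    return served, mirrored

served, mirrored = route({"blue": 100}, {"green": 10}, 1000)
print(served["blue"])  # -> 1000
```

With 10% mirroring, `green` sees roughly 100 copies of the 1,000 requests, while every client response still comes from `blue`.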
:::image type="content" source="./media/how-to-safely-rollout-managed-endpoints/endpoint-concept-mirror.png" alt-text="Diagram showing 10% traffic mirrored to one deployment.":::
-After testing, you can set the mirror traffic to zero to disable mirroring:
+# [Azure CLI](#tab/azure-cli)
+You can confirm that the specific percentage of the traffic was sent to the `green` deployment by checking the logs from the deployment:
```azurecli
-az ml online-endpoint update --name $ENDPOINT_NAME --mirror-traffic "green=0"
+az ml online-deployment get-logs --name green --endpoint $ENDPOINT_NAME
+```
+
+After testing, you can set the mirror traffic to zero to disable mirroring:
++
+# [Python](#tab/python)
+You can confirm that the specific percentage of the traffic was sent to the `green` deployment by checking the logs from the deployment:
+
+```python
+ml_client.online_deployments.get_logs(
+ name="green", endpoint_name=online_endpoint_name, lines=50
+)
```
+After testing, you can set the mirror traffic to zero to disable mirroring:
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=disable_traffic_mirroring)]
+++
## Test the new deployment with a small percentage of live traffic
+# [Azure CLI](#tab/azure-cli)
+
Once you've tested your `green` deployment, allocate a small percentage of traffic to it:

:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="green_10pct_traffic" :::
-Now, your `green` deployment will receive 10% of requests.
+# [Python](#tab/python)
+
+Once you've tested your `green` deployment, allocate a small percentage of traffic to it:
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=allocate_some_traffic)]
+++
+Now, your `green` deployment will receive 10% of requests.
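Live traffic percentages across an endpoint's deployments must total 100. A small hypothetical helper makes that invariant explicit:

```python
def allocate_traffic(current: dict, updates: dict) -> dict:
    """Return a new live-traffic allocation, validating it totals 100%."""
    allocation = {**current, **updates}
    total = sum(allocation.values())
    if total != 100:
        raise ValueError(f"traffic percentages must sum to 100, got {total}")
    return allocation

# Shift from all-blue to a 90/10 blue-green split.
traffic = allocate_traffic({"blue": 100}, {"blue": 90, "green": 10})
print(traffic)  # -> {'blue': 90, 'green': 10}
```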
:::image type="content" source="./media/how-to-safely-rollout-managed-endpoints/endpoint-concept.png" alt-text="Diagram showing traffic split between deployments.":::

## Send all traffic to your new deployment
-Once you're satisfied that your `green` deployment is fully satisfactory, switch all traffic to it.
+# [Azure CLI](#tab/azure-cli)
+
+Once you're fully satisfied with your `green` deployment, switch all traffic to it.
:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="green_100pct_traffic" :::
+# [Python](#tab/python)
+
+Once you're fully satisfied with your `green` deployment, switch all traffic to it.
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=allocate_all_traffic)]
+++
## Remove the old deployment
+# [Azure CLI](#tab/azure-cli)
+ :::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="delete_blue" :::
+# [Python](#tab/python)
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=remove_old_deployment)]
+++
## Delete the endpoint and deployment
+# [Azure CLI](#tab/azure-cli)
+
If you aren't going to use the deployment, you should delete it with:

:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-safe-rollout-online-endpoints.sh" ID="delete_endpoint" :::
+# [Python](#tab/python)
+
+If you aren't going to use the deployment, you should delete it with:
+
+[!notebook-python[](~/azureml-examples-main/sdk/python/endpoints/online/managed/online-endpoints-safe-rollout.ipynb?name=delete_endpoint)]
++
## Next steps
+- [Explore online endpoint samples](https://github.com/Azure/azureml-examples/tree/v2samplesreorg/sdk/python/endpoints)
- [Deploy models with REST](how-to-deploy-with-rest.md)
- [Create and use online endpoints in the studio](how-to-use-managed-online-endpoint-studio.md)
- [Access Azure resources with an online endpoint and managed identity](how-to-access-resources-from-endpoints-managed-identities.md)
If you aren't going to use the deployment, you should delete it with:
- [View costs for an Azure Machine Learning managed online endpoint](how-to-view-online-endpoints-costs.md)
- [Managed online endpoints SKU list](reference-managed-online-endpoints-vm-sku-list.md)
- [Troubleshooting online endpoints deployment and scoring](how-to-troubleshoot-managed-online-endpoints.md)
-- [Online endpoint YAML reference](reference-yaml-endpoint-online.md)
+- [Online endpoint YAML reference](reference-yaml-endpoint-online.md)
machine-learning Migrate To V2 Execution Automl https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/migrate-to-v2-execution-automl.md
This article gives a comparison of scenario(s) in SDK v1 and SDK v2.
## Submit AutoML run
-* SDK v1: Below is a sample AutoML classification task. For the entire code, check out our [examples repo](https://github.com/azure/azureml-examples/blob/main/python-sdk/tutorials/automl-with-azureml/classification-credit-card-fraud/auto-ml-classification-credit-card-fraud.ipynb).
+* SDK v1: Below is a sample AutoML classification task. For the entire code, check out our [examples repo](https://github.com/Azure/azureml-examples/blob/main/v1/python-sdk/tutorials/automl-with-azureml/classification-credit-card-fraud/auto-ml-classification-credit-card-fraud.ipynb).
```python
# Imports
This article gives a comparison of scenario(s) in SDK v1 and SDK v2.
print(azureml_url)
```
-* SDK v2: Below is a sample AutoML classification task. For the entire code, check out our [examples repo](https://github.com/Azure/azureml-examples/blob/main/sdk/jobs/automl-standalone-jobs/automl-classification-task-bankmarketing/automl-classification-task-bankmarketing-mlflow.ipynb).
+* SDK v2: Below is a sample AutoML classification task. For the entire code, check out our [examples repo](https://github.com/Azure/azureml-examples/blob/main/sdk/python/jobs/automl-standalone-jobs/automl-classification-task-bankmarketing/automl-classification-task-bankmarketing-mlflow.ipynb).
```python
# Imports
machine-learning Migrate To V2 Execution Hyperdrive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/migrate-to-v2-execution-hyperdrive.md
This article gives a comparison of scenario(s) in SDK v1 and SDK v2.
For more information, see:
-* [SDK v1 - Tune Hyperparameters](/v1/how-to-tune-hyperparameters-v1.md)
+* [SDK v1 - Tune Hyperparameters](/azure/machine-learning/v1/how-to-tune-hyperparameters-v1)
* [SDK v2 - Tune Hyperparameters](/python/api/azure-ai-ml/azure.ai.ml.sweep)
* [SDK v2 - Sweep in Pipeline](how-to-use-sweep-in-pipeline.md)
machine-learning Migrate To V2 Execution Pipeline https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/migrate-to-v2-execution-pipeline.md
This article gives a comparison of scenario(s) in SDK v1 and SDK v2. In the foll
```
-* SDK v2. [Full sample link](https://github.com/Azure/azureml-examples/blob/main/sdk/jobs/pipelines/1b_pipeline_with_python_function_components/pipeline_with_python_function_components.ipynb)
+* SDK v2. [Full sample link](https://github.com/Azure/azureml-examples/blob/main/sdk/python/jobs/pipelines/1b_pipeline_with_python_function_components/pipeline_with_python_function_components.ipynb)
```python # import required libraries
This article gives a comparison of scenario(s) in SDK v1 and SDK v2. In the foll
|Functionality in SDK v1|Rough mapping in SDK v2|
|-|-|
-|[azureml.pipeline.core.Pipeline](/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline?view=azure-ml-py&preserve-view=true)|[azure.ai.ml.dsl.pipeline]/python/api/azure-ai-ml/azure.ai.ml.dsl#azure-ai-ml-dsl-pipeline)|
-|[OutputDatasetConfig](/python/api/azureml-core/azureml.data.output_dataset_config.outputdatasetconfig?view=azure-ml-py&preserve-view=true)|[Output]/python/api/azure-ai-ml/azure.ai.ml.output|
+|[azureml.pipeline.core.Pipeline](/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline?view=azure-ml-py&preserve-view=true)|[azure.ai.ml.dsl.pipeline](/python/api/azure-ai-ml/azure.ai.ml.dsl#azure-ai-ml-dsl-pipeline)|
+|[OutputDatasetConfig](/python/api/azureml-core/azureml.data.output_dataset_config.outputdatasetconfig?view=azure-ml-py&preserve-view=true)|[Output](/python/api/azure-ai-ml/azure.ai.ml.output)|
|[dataset as_mount](/python/api/azureml-core/azureml.data.filedataset?view=azure-ml-py#azureml-data-filedataset-as-mount&preserve-view=true)|[Input](/python/api/azure-ai-ml/azure.ai.ml.input)|

## Step and job/component type mapping
machine-learning How To Secure Workspace Vnet https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/v1/how-to-secure-workspace-vnet.md
In this article you learn how to enable the following workspaces resources in a
### Azure Container Registry
-* Your Azure Container Registry must be Premium version. For more information on upgrading, see [Changing SKUs](../../container-registry/container-registry-skus.md#changing-tiers).
+* Your Azure Container Registry must be Premium version. For more information on upgrading, see [Changing SKUs](/azure/container-registry/container-registry-skus#changing-tiers).
* If your Azure Container Registry uses a __private endpoint__, it must be in the same _virtual network_ as the storage account and compute targets used for training or inference. If it uses a __service endpoint__, it must be in the same _virtual network_ and _subnet_ as the storage account and compute targets.
This article is part of a series on securing an Azure Machine Learning workflow.
* [Use a firewall](../how-to-access-azureml-behind-firewall.md)
* [Tutorial: Create a secure workspace](../tutorial-create-secure-workspace.md)
* [Tutorial: Create a secure workspace using a template](../tutorial-create-secure-workspace-template.md)
-* [API platform network isolation](../how-to-configure-network-isolation-with-v2.md)
+* [API platform network isolation](../how-to-configure-network-isolation-with-v2.md)
migrate Migrate Appliance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/migrate-appliance.md
The appliance can be deployed using a couple of methods:
- For physical or virtualized servers on-premises or any other cloud, you always deploy the appliance using a PowerShell installer script. Refer to the deployment steps [here](how-to-set-up-appliance-physical.md).
- Download links are available in the tables below.
+> [!Note]
+> Don't install any other components, such as the **Microsoft Monitoring Agent (MMA)** or the **Replication appliance**, on the same server hosting the Azure Migrate appliance. If you install the MMA agent, you might face problems like **"Multiple custom attributes of the same type found"**. It's recommended to use a dedicated server to deploy the appliance.
+
## Appliance services

The appliance has the following
migrate Migrate Replication Appliance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/migrate-replication-appliance.md
MySQL must be installed on the replication appliance machine. It can be installe
**Method** | **Details** |
-Download and install manually | Download MySQL application & place it in the folder C:\Temp\ASRSetup, then install manually.<br> When you set up the appliance, MySQL will show as already installed.
+Download and install manually | [Download](https://dev.mysql.com/get/Downloads/MySQLInstaller/mysql-installer-community-5.7.20.0.msi) the MySQL application & place it in the folder C:\Temp\ASRSetup, then install manually.<br> When you set up the appliance, MySQL will show as already installed.
Without online download | Place the MySQL installer application in the folder C:\Temp\ASRSetup. When you install the appliance and select download and install MySQL, setup will use the installer you added.
Download and install in Azure Migrate | When you install the appliance and are prompted for MySQL, select **Download and install**.
migrate Tutorial Discover Physical https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/tutorial-discover-physical.md
Check that the zipped file is secure, before you deploy it.
**Download** | **Hash value** |
- [Latest version](https://go.microsoft.com/fwlink/?linkid=2191847) | 7745817a5320628022719f24203ec0fbf56a0e0f02b4e7713386cbc003f0053c
+ [Latest version](https://go.microsoft.com/fwlink/?linkid=2191847) | 277c53620db299f57e3ac5a65569e9720f06190a245476810b36bf651c8b795b
> [!NOTE]
> The same script can be used to set up the physical appliance for either Azure public or Azure Government cloud with public or private endpoint connectivity.
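To verify the downloaded zip against the published hash value, compute its SHA-256 digest. This sketch uses Python's standard library (on Windows, PowerShell's `Get-FileHash` does the same job); the demo hashes a throwaway file rather than the real download:

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Demo with a temporary file; for the appliance zip, compare the digest
# against the hash value published in the table above.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
    path = f.name
digest = sha256_of(path)
os.remove(path)
print(digest)  # -> 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```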
After the discovery has been initiated, you can delete any of the added servers
## Next steps

- [Assess physical servers](tutorial-assess-physical.md) for migration to Azure VMs.
-- [Review the data](discovered-metadata.md#collected-data-for-physical-servers) that the appliance collects during discovery.
+- [Review the data](discovered-metadata.md#collected-data-for-physical-servers) that the appliance collects during discovery.
migrate Tutorial Migrate Vmware https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/tutorial-migrate-vmware.md
# Migrate VMware VMs to Azure (agentless)
-This article shows you how to migrate on-premises VMware VMs to Azure, using the [Azure Migrate:Server Migration](migrate-services-overview.md#azure-migrate-server-migration-tool) tool, with agentless migration. You can also migrate VMware VMs using agent-based migration. [Compare](server-migrate-overview.md#compare-migration-methods) the methods.
+This article shows you how to migrate on-premises VMware VMs to Azure, using the [Azure Migrate: Server Migration](migrate-services-overview.md#azure-migrate-server-migration-tool) tool, with agentless migration. You can also migrate VMware VMs using agent-based migration. [Compare](server-migrate-overview.md#compare-migration-methods) the methods.
This tutorial is the third in a series that demonstrates how to assess and migrate VMware VMs to Azure.
mysql Concepts High Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/concepts-high-availability.md
Here are some considerations to keep in mind when you use high availability:
## Frequently asked questions (FAQ)
+- **What are the SLAs for same-zone vs zone-redundant HA-enabled Flexible Server?**
+
+SLA information for Azure Database for MySQL Flexible Server can be found at [SLA for Azure Database for MySQL](https://azure.microsoft.com/support/legal/sla/mysql/v1_2/).
- **How am I billed for highly available (HA) servers?**

Servers enabled with HA have a primary and a secondary replica. The secondary replica can be in the same zone or zone redundant. You're billed for the provisioned compute and storage for both the primary and secondary replica. For example, if you have a primary with 4 vCores of compute and 512 GB of provisioned storage, your secondary replica will also have 4 vCores and 512 GB of provisioned storage. Your zone-redundant HA server will be billed for 8 vCores and 1,024 GB of storage. Depending on your backup storage volume, you may also be billed for backup storage.
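The billing rule in the example above is simply "HA doubles the provisioned resources, because the secondary replica mirrors the primary's size." A quick sketch of that arithmetic (a hypothetical helper, not a billing API):

```python
def ha_billed_resources(vcores: int, storage_gb: int, ha_enabled: bool = True) -> dict:
    """Billed compute/storage; HA provisions an identically sized secondary."""
    factor = 2 if ha_enabled else 1
    return {"vcores": vcores * factor, "storage_gb": storage_gb * factor}

# A 4 vCore / 512 GB primary with HA is billed as 8 vCores and 1,024 GB.
print(ha_billed_resources(4, 512))  # -> {'vcores': 8, 'storage_gb': 1024}
```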
mysql Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/mysql/flexible-server/overview.md
One advantage of running your workload in Azure is its global reach. The flexibl
| Central US | :heavy_check_mark: | :heavy_check_mark: | :x: | :heavy_check_mark: |
| China East 2 | :heavy_check_mark: | :heavy_check_mark: | :x: | :x: |
| China North 2 | :heavy_check_mark: | :heavy_check_mark: | :x: | :x: |
+| China North 3 |:heavy_check_mark: | :heavy_check_mark: | :x: | :x: |
| East Asia (Hong Kong) | :heavy_check_mark: | :heavy_check_mark: | :x: | :heavy_check_mark: |
| East US | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: |
| East US 2 | :heavy_check_mark: | :heavy_check_mark: | :x: | :heavy_check_mark: |
One advantage of running your workload in Azure is its global reach. The flexibl
| West US | :heavy_check_mark: | :heavy_check_mark: | :x: | :heavy_check_mark: |
| West US 2 | :heavy_check_mark: | :heavy_check_mark: | :x: | :heavy_check_mark: |
| West US 3 | :heavy_check_mark: | :heavy_check_mark: | :x: | :x: |
+| Qatar Central | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :x: |
+| Sweden Central | :heavy_check_mark: | :heavy_check_mark: | :x: | :x: |
## Contacts
network-watcher Connection Monitor Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/connection-monitor-overview.md
> To minimize service disruption to your current workloads, [migrate your tests from Network Performance Monitor](migrate-to-connection-monitor-from-network-performance-monitor.md), or [migrate from Connection Monitor (Classic)](migrate-to-connection-monitor-from-connection-monitor-classic.md) to the new Connection Monitor in Azure Network Watcher before February 29, 2024.

> [!IMPORTANT]
-> Connection Monitor will now support end to end connectivity checks from and to *Azure Virtual Machine Scale Sets*, enabling faster performance monitoring and network troubleshooting across scale sets
+> Connection Monitor will now support end-to-end connectivity checks from and to *Azure Virtual Machine Scale Sets*, enabling faster performance monitoring and network troubleshooting across scale sets
Connection Monitor provides unified, end-to-end connection monitoring in Azure Network Watcher. The Connection Monitor feature supports hybrid and Azure cloud deployments. Network Watcher provides tools to monitor, diagnose, and view connectivity-related metrics for your Azure deployments. Here are some use cases for Connection Monitor:
-- Your front-end web server virtual machine (VM) or virtual machine scale set(VMSS) communicates with a database server VM in a multi-tier application. You want to check network connectivity between the two VM/or scale sets.
+- Your front-end web server virtual machine (VM) or virtual machine scale set (VMSS) communicates with a database server VM in a multi-tier application. You want to check network connectivity between the two VMs or scale sets.
- You want VMs/scale sets in, for example, the East US region to ping VMs/scale sets in the Central US region, and you want to compare cross-region network latencies.
- You have multiple on-premises office sites, one in Seattle, Washington, for example, and another in Ashburn, Virginia. Your office sites connect to Microsoft 365 URLs. For your users of Microsoft 365 URLs, you want to compare the latencies between Seattle and Ashburn.
- Your hybrid application needs connectivity to an Azure storage account endpoint. Your on-premises site and your Azure application connect to the same endpoint. You want to compare the latencies of the on-premises site with the latencies of the Azure application.
You can install the Network Watcher extension when you [create a VM](./connectio
Rules for a network security group (NSG) or firewall can block communication between the source and destination. Connection Monitor detects this issue and shows it as a diagnostics message in the topology. To enable connection monitoring, ensure that the NSG and firewall rules allow packets over TCP or ICMP between the source and destination.
-If you wish to escape the installation process for enabling Network Watcher extension, you can proceed with the creation of Connection Monitor and allow auto enablement of Network Watcher extensions on your Azure VMs and VM scale sets.
+If you wish to skip the installation process for enabling the Network Watcher extension, you can proceed with the creation of Connection Monitor and allow auto enablement of Network Watcher extensions on your Azure VMs and VM scale sets.
> [!Note]
- > In the case the virtual machine scale sets is set for manual upgradation, the user will have to upgrade the scale set post Network Watcher extension installation in order to continue setting up the Connection Monitor with virtual machine scale sets as endpoints. In-case the virtual machine scale set is set to auto upgradation, the user need not worry about any upgradation after Network Watcher extension installation.
+ > If the virtual machine scale set is set for manual upgrade, the user will have to upgrade the scale set after Network Watcher extension installation in order to continue setting up the Connection Monitor with virtual machine scale sets as endpoints. If the virtual machine scale set is set to auto upgrade, the user doesn't need to take any action after Network Watcher extension installation.
 > As Connection Monitor now supports unified auto enablement of monitoring extensions, users can consent to auto upgrade of the VM scale set along with auto enablement of the Network Watcher extension during the creation of Connection Monitor for VM scale sets with manual upgrade.

### Agents for on-premises machines
For more information, see the "Network requirements" section of [Log Analytics a
The script configures only Windows Firewall locally. If you have a network firewall, make sure that it allows traffic destined for the TCP port that's used by Network Performance Monitor.
-The Log Analytics Windows agent can be multihomed to send data to multiple workspaces and System Center Operations Manager management groups. The Linux agent can send data only to a single destination, either a workspace or management group.
+The Log Analytics Windows agent can be multi-homed to send data to multiple workspaces and System Center Operations Manager management groups. The Linux agent can send data only to a single destination, either a workspace or management group.
#### Enable the Network Performance Monitor solution for on-premises machines
To enable the Network Performance Monitor solution for on-premises machines, do
Unlike Log Analytics agents, the Network Performance Monitor solution can be configured to send data only to a single Log Analytics workspace.
-If you wish to escape the installation process for enabling Network Watcher extension, you can proceed with the creation of Connection Monitor and allow auto enablement of monitoring solution on your on-premises machines.
+If you wish to skip the installation process for enabling the Network Watcher extension, you can proceed with the creation of Connection Monitor and allow auto enablement of the monitoring solution on your on-premises machines.
## Enable Network Watcher on your subscription
Connection monitors have the following scale limits:
Monitoring coverage for Azure and non-Azure resources:
-Connection Monitor now provides 5 different coverage levels for monitoring compound resources i.e. VNets, SubNets, VM Scale Sets. Coverage level is defined as the % of instances of a compound resource actually included in monitoring those resources as source or destinations.
-Users can manually select a coverage level from Low, Below Average, Average, Above Average and Full to define an approximate % of instances to be included in monitoring the particular resource as an endpoint
+Connection Monitor now provides five different coverage levels for monitoring compound resources, that is, virtual networks, subnets, and virtual machine scale sets. The coverage level is defined as the percentage of instances of a compound resource actually included in monitoring those resources as sources or destinations.
+Users can manually select a coverage level from Low, Below Average, Average, Above Average, and Full to define the approximate percentage of instances to be included in monitoring the particular resource as an endpoint.
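The idea of a coverage level can be sketched as random sampling of a compound resource's instances. The percentage mapping below is purely illustrative (the article doesn't document the exact values the service uses):

```python
import math
import random

# Illustrative mapping only; actual percentages are chosen by the service.
COVERAGE_PCT = {"Low": 10, "Below Average": 30, "Average": 50,
                "Above Average": 75, "Full": 100}

def sample_instances(instances: list, level: str, seed: int = 0) -> list:
    """Randomly select roughly the coverage-level share of instances."""
    pct = COVERAGE_PCT[level]
    count = max(1, math.ceil(len(instances) * pct / 100))
    return random.Random(seed).sample(instances, count)

vmss_instances = [f"vmss_{i}" for i in range(20)]
picked = sample_instances(vmss_instances, "Average")
print(len(picked))  # -> 10
```

Random selection here mirrors the advice later in the article to prefer random instance selection within coverage levels over pinning specific scale set instances.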
## Analyze monitoring data and set alerts

After you create a connection monitor, sources check connectivity to destinations based on your test configuration.
-While monitoring endpoints, Connection Monitor re-evaluates status of end points once every 24 hours. Hence, incase a VM gets deallocated or is turned-off during a 24-hour cycle, Connection Monitor would report indeterminate state due to absence of data in the network path till the end of 24-hour cycle before re evaluating the status of the VM and reporting the VM status as deallocated.
+While monitoring endpoints, Connection Monitor re-evaluates the status of endpoints once every 24 hours. Hence, if a VM gets deallocated or is turned off during a 24-hour cycle, Connection Monitor reports an indeterminate state due to the absence of data in the network path until the end of the 24-hour cycle, before re-evaluating the status of the VM and reporting the VM status as deallocated.
> [!NOTE]
- > In case of monitoring an Azure Virtual Machine Scale Set, instances of a particular scale set selected for monitoring (either by the user or picked up by default as part of the coverage level selected) might get deallocated or scaled down in the middle of the 24 hour cycle. In this particular time-period, Connection Monitor will not be able to recognize this action and thus end-up reporting indeterminate state due to absence of data.
- > Users are adviced to allow random selection of virtual machine scale sets instances within coverage levels instead of selecting particular instances of scale sets for monitoring, to minimize the risks of non-discoverability of deallocated or scaled down virtual machine scale sets instances in a 24 hours cycle and lead to indeterminate state of connection monitor.
+ > In case of monitoring an Azure Virtual Machine Scale Set, instances of a particular scale set selected for monitoring (either by the user or picked up by default as part of the coverage level selected) might get deallocated or scaled down in the middle of the 24-hour cycle. In this particular time period, Connection Monitor will not be able to recognize this action and thus end-up reporting an indeterminate state due to the absence of data.
+ > To minimize the risk that deallocated or scaled-down virtual machine scale set instances go undiscovered during a 24-hour cycle and cause an indeterminate state, users are advised to allow random selection of virtual machine scale set instances within coverage levels instead of selecting particular instances for monitoring.
### Checks in a test Depending on the protocol that you select in the test configuration, Connection Monitor runs a series of checks for the source-destination pair. The checks run according to the test frequency that you select.
-If you use HTTP, the service calculates the number of HTTP responses that returned a valid response code. You can set valid response codes by using PowerShell and the Azure CLI. The result determines the percentage of failed checks. To calculate RTT, the service measures the time between an HTTP call and the response.
+If you use HTTP, the service calculates the number of HTTP responses that returned a valid response code. You can set valid response codes by using PowerShell and Azure CLI. The result determines the percentage of failed checks. To calculate RTT, the service measures the time between an HTTP call and the response.
If you use TCP or ICMP, the service calculates the packet-loss percentage to determine the percentage of failed checks. To calculate RTT, the service measures the time taken to receive the acknowledgment (ACK) for the packets that were sent. If you've enabled traceroute data for your network tests, you can view the hop-by-hop loss and latency for your on-premises network.
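The per-protocol computations described above can be sketched in a few lines. This is an illustrative reconstruction only, not Connection Monitor's actual implementation; the check results and timestamps below are hypothetical:

```python
# Illustrative only: how the percentage of failed checks and RTT could be
# derived from raw check results. Not Connection Monitor source code.

def failed_check_percent(checks: list) -> float:
    """checks: True when a check succeeded (a valid HTTP response code, or
    a packet ACK received for TCP/ICMP), False otherwise."""
    if not checks:
        return 0.0
    failed = sum(1 for ok in checks if not ok)
    return 100.0 * failed / len(checks)

def average_rtt_ms(send_times_ms: list, reply_times_ms: list) -> float:
    """RTT is the time between sending (an HTTP call, or a packet) and
    receiving the response or ACK."""
    rtts = [reply - send for send, reply in zip(send_times_ms, reply_times_ms)]
    return sum(rtts) / len(rtts)

# 1 failed check out of 4 -> 25% failed checks
print(failed_check_percent([True, True, False, True]))   # 25.0
# Two probes sent at t=0 ms and t=1000 ms, answered 45 ms and 52 ms later
print(average_rtt_ms([0.0, 1000.0], [45.0, 1052.0]))     # 48.5
```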
You can filter a list based on:
* **Top-level filters**: Search the list by text, entity type (Connection Monitor, test group, or test), timestamp, and scope. Scope includes subscriptions, regions, sources, and destination types. See box 1 in the following image. * **State-based filters**: Filter by the state of the connection monitor, test group, or test. See box 2 in the following image.
-* **Alert based filter**: Filter by alerts that are fired on the connection monitor resource. See box 3 in the following image.
+* **Alert-based filter**: Filter by alerts that are fired on the connection monitor resource. See box 3 in the following image.
:::image type="content" source="./media/connection-monitor-2-preview/cm-view.png" alt-text="Screenshot showing how to filter views of connection monitors, test groups, and tests in Connection Monitor." lightbox="./media/connection-monitor-2-preview/cm-view.png"::: For example, to view all tests in Connection Monitor where the source IP is 10.192.64.56, do the following: 1. Change the view to **Test**. 1. In the **Search** box, enter **10.192.64.56**. 1. Under **Scope**, in the top-level filter, select **Sources**.
To view the trends in RTT and the percentage of failed checks for a test, do the
#### Log queries in Log Analytics
-Use Log Analytics to create custom views of your monitoring data. All displayed data is from Log Analytics. You can interactively analyze data in the repository. Correlate the data from Agent Health or other solutions that are based in Log Analytics. Export the data to Excel or Power BI, or create a shareable link.
+Use Log Analytics to create custom views of your monitoring data. All displayed data is from Log Analytics. You can interactively analyze data in the repository. Correlate the data from Agent Health or other solutions that are based on Log Analytics. Export the data to Excel or Power BI, or create a shareable link.
#### Network topology in Connection Monitor
In connection monitors that were created before the Connection Monitor experienc
In connection monitors that were created in the Connection Monitor experience, data is available only for ChecksFailedPercent, RoundTripTimeMs, and Test Result metrics.
-Metrics are generated according to monitoring frequency, and they describe aspects of a connection monitor at a particular time. Connection Monitor metrics also have multiple dimensions, such as SourceName, DestinationName, TestConfiguration, and TestGroup. You can use these dimensions to visualize specific data and to target it while defining alerts.
+Metrics are generated according to monitoring frequency, and they describe aspects of a connection monitor at a particular time. Connection Monitor metrics also have multiple dimensions, such as SourceName, DestinationName, TestConfiguration, and TestGroup. You can use these dimensions to visualize specific data and target it while defining alerts.
Azure metrics currently allow a minimum granularity of 1 minute. If the frequency is less than 1 minute, aggregated results will be displayed.
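To illustrate the aggregation, here's a small sketch (with hypothetical sample values) of how checks taken every 30 seconds collapse into one displayed value per minute:

```python
# Illustrative only: sub-minute samples aggregated to 1-minute granularity.
from collections import defaultdict

def aggregate_to_minute(samples):
    """samples: (epoch_seconds, value) pairs; returns per-minute averages
    keyed by the start of each minute."""
    buckets = defaultdict(list)
    for ts, value in samples:
        buckets[ts // 60].append(value)
    return {minute * 60: sum(v) / len(v) for minute, v in buckets.items()}

# A 30-second test frequency yields two samples per displayed data point:
samples = [(0, 10.0), (30, 20.0), (60, 30.0), (90, 50.0)]
print(aggregate_to_minute(samples))  # {0: 15.0, 60: 40.0}
```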
The migration helps produce the following results:
* Azure Virtual Machines with the Network Watcher extension send data to both the workspace and the metrics. Connection Monitor makes the data available through the new metrics (ChecksFailedPercent and RoundTripTimeMs) instead of the old metrics (ProbesFailedPercent and AverageRoundtripMs). The old metrics will get migrated to new metrics as ProbesFailedPercent > ChecksFailedPercent and AverageRoundtripMs > RoundTripTimeMs. * Data monitoring: * **Alerts**: Migrated automatically to the new metrics.
- * **Dashboards and integrations**: Require manually editing of the metrics set.
+ * **Dashboards and integrations**: Require manual editing of the metrics set.
There are several reasons to migrate from Network Performance Monitor and Connection Monitor (Classic) to Connection Monitor. The following table lists a few use cases that show how the latest Connection Monitor performs against Network Performance Monitor and Connection Monitor (Classic). | Feature | Network Performance Monitor | Connection Monitor (Classic) | Connection Monitor | | - | | -- | | | Unified experience for Azure and hybrid monitoring | Not available | Not available | Available |
- | Cross-subscription, cross-region, cross-workspace monitoring | Allows cross-subscription, cross-region monitoring, but doesnΓÇÖt allow cross-workspace monitoring. | Not available | Allows cross-subscription, cross-workspace monitoring; cross-workspaces have regional boundary. |
+ | Cross-subscription, cross-region, and cross-workspace monitoring | Allows cross-subscription and cross-region monitoring, but doesn't allow cross-workspace monitoring. | Not available | Allows cross-subscription and cross-workspace monitoring; cross-workspaces have a regional boundary. |
| Centralized workspace support | Not available | Not available | Available | | Multiple sources can ping multiple destinations | Performance monitoring allows multiple sources to ping multiple destinations. Service connectivity monitoring allows multiple sources to ping a single service or URL. ExpressRoute allows multiple sources to ping multiple destinations. | Not available | Available | | Unified topology across on-premises, internet hops, and Azure | Not available | Not available | Available |
There are several reasons to migrate from Network Performance Monitor and Connec
## FAQ ### Does Connection Monitor support classic VMs?
-No, Connection Monitor doesn't support classic VMs. We recommended that you migrate infrastructure as a service (IaaS) resources from classic to Azure Resource Manager, because classic resources [will be deprecated](../virtual-machines/classic-vm-deprecation.md). For more information, see [Migrate IaaS resources from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-overview.md).
+No, Connection Monitor doesn't support classic VMs. We recommend that you migrate infrastructure as a service (IaaS) resources from classic to Azure Resource Manager because classic resources [will be deprecated](../virtual-machines/classic-vm-deprecation.md). For more information, see [Migrate IaaS resources from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-overview.md).
### What if my topology isn't decorated or my hops have missing information?
-Topology can be decorated from non-Azure to Azure only if the destination Azure resource and the Connection Monitor resource are in same region.
+Topology can be decorated from non-Azure to Azure only if the destination Azure resource and the Connection Monitor resource are in the same region.
### What happens if the Connection Monitor creation fails with the following error: "We don't allow creating different endpoints for the same VM"?
-The same Azure VM can't be used with different configurations in the same connection monitor. For example, using same VM with a filter and without a filter in same connection monitor isn't supported.
+The same Azure VM can't be used with different configurations in the same connection monitor. For example, using the same VM with and without a filter in the same connection monitor isn't supported.
### What happens if the test failure reason is "Nothing to display"? Issues that are displayed on the Connection Monitor dashboard are found during topology discovery or hop exploration. There can be cases where the threshold set for % loss or RTT is breached but no issues are found on hops.
network-watcher Connection Monitor Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/connection-monitor-schema.md
Here are some benefits of Connection Monitor:
* Support for connectivity checks that are based on HTTP, TCP, and ICMP * Metrics and Log Analytics support for both Azure and non-Azure test setups
-There are two types of logs or data ingested into Log Analytics. The test data (NWConnectionMonitorTestResult query) is updated based on monitoring frequency of a particular test group. The path data (NWConnectionMonitorPathResult query) is updated when there is significant change in loss percentage or round-trip time. For some time durations, test data might keep getting updated while path data is not frequently updated, because both are independent.
+There are two types of logs or data ingested into Log Analytics. The test data (NWConnectionMonitorTestResult query) is updated based on the monitoring frequency of a particular test group. The path data (NWConnectionMonitorPathResult query) is updated when there is a significant change in loss percentage or round-trip time. For some time durations, test data might keep getting updated while path data is not frequently updated because both are independent.
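The independence of the two streams can be sketched as follows. The thresholds here are hypothetical; the text above only says path data is updated on a significant change in loss percentage or round-trip time:

```python
# Illustrative only: test records are emitted every monitoring interval,
# while path records are emitted only on a significant change, so the two
# streams update independently. The thresholds below are hypothetical.

def emit_records(measurements, loss_delta=5.0, rtt_delta=10.0):
    test_records, path_records = [], []
    last = None
    for loss_pct, rtt_ms in measurements:
        test_records.append((loss_pct, rtt_ms))  # every interval
        if last is None or abs(loss_pct - last[0]) >= loss_delta \
                or abs(rtt_ms - last[1]) >= rtt_delta:
            path_records.append((loss_pct, rtt_ms))  # only on notable change
            last = (loss_pct, rtt_ms)
    return test_records, path_records

# Three intervals: only the first and third change loss/RTT significantly.
tests, paths = emit_records([(0.0, 40.0), (1.0, 42.0), (20.0, 80.0)])
print(len(tests), len(paths))  # 3 2
```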
## Connection Monitor Tests schema
network-watcher Connection Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/connection-monitor.md
tags: azure-resource-manager
Previously updated : 10/17/2022 Last updated : 10/28/2022 -+ # Customer intent: I need to monitor communication between a VM and another VM. If the communication fails, I need to know why, so that I can resolve the problem.
network-watcher Network Insights Topology https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-insights-topology.md
When you drill down to a VM within the topology, the summary pane contains the *
## Next steps
-[Learn more](/connection-monitor-overview.md) about connectivity related metrics.
+[Learn more](/azure/network-watcher/connection-monitor-overview) about connectivity related metrics.
network-watcher Network Watcher Ip Flow Verify Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-watcher-ip-flow-verify-overview.md
# Introduction to IP flow verify in Azure Network Watcher
-IP flow verify checks if a packet is allowed or denied to or from a virtual machine. The information consists of direction, protocol, local IP, remote IP, local port, and remote port. If the packet is denied by a security group, the name of the rule that denied the packet is returned. While any source or destination IP can be chosen, IP flow verify helps administrators quickly diagnose connectivity issues from or to the internet and from or to the on-premises environment.
+IP flow verify checks if a packet is allowed or denied to or from a virtual machine. The information consists of direction, protocol, local IP, remote IP, local port, and remote port. If the packet is denied by a security group, the name of the rule that denied the packet is returned. While any source or destination IP can be chosen, IP flow verify helps administrators quickly diagnose connectivity issues from or to the internet and from or to the on-premises environment.
IP flow verify looks at the rules for all Network Security Groups (NSGs) applied to the network interface, such as a subnet or virtual machine NIC. Traffic flow is then verified based on the configured settings to or from that network interface. IP flow verify is useful in confirming whether a rule in a Network Security Group is blocking ingress or egress traffic to or from a virtual machine. Along with the NSG rules, the Azure Virtual Network Manager rules are also evaluated. [Azure Virtual Network Manager (AVNM)](../virtual-network-manager/overview.md) is a management service that enables users to group, configure, deploy, and manage Virtual Networks globally across subscriptions. AVNM security configuration allows users to define a collection of rules that can be applied to one or more network groups at the global level. These security rules have a higher priority than network security group (NSG) rules. An important difference to note here is that admin rules are a resource delivered by AVNM in a central location controlled by governance and security teams, and they bubble down to each virtual network. NSGs are a resource controlled by the virtual network owners and apply at each subnet or NIC level.
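The evaluation order described above — admin (AVNM) rules first, then NSG rules, each in priority order with the first match deciding the verdict — can be sketched as follows. The rule shape and sample rules are hypothetical, not the actual evaluation engine:

```python
# Illustrative only: priority-ordered, first-match rule evaluation with
# admin (AVNM) rules taking precedence over NSG rules.

def evaluate(packet, admin_rules, nsg_rules):
    """Each rule is (priority, match_fn, action); lower priority numbers
    are evaluated first, and admin rules run before NSG rules."""
    ordered = sorted(admin_rules, key=lambda r: r[0]) + \
              sorted(nsg_rules, key=lambda r: r[0])
    for priority, matches, action in ordered:
        if matches(packet):
            return action, priority
    return "Deny", None  # nothing matched

packet = {"protocol": "TCP", "remote_port": 3389}
nsg_rules = [
    (100, lambda p: p["remote_port"] == 3389, "Deny"),
    (200, lambda p: True, "Allow"),
]
print(evaluate(packet, [], nsg_rules))  # ('Deny', 100)
```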
-An instance of Network Watcher needs to be created in all regions that you plan to run IP flow verify. Network Watcher is a regional service and can only be ran against resources in the same region. The instance used does not affect the results of IP flow verify, as any route associated with the NIC or subnet is still be returned.
+An instance of Network Watcher needs to be created in all regions where you plan to run IP flow verify. Network Watcher is a regional service and can only be run against resources in the same region. The instance used does not affect the results of IP flow verify, as any route associated with the NIC or subnet is still returned.
![1][1]
network-watcher Network Watcher Network Configuration Diagnostics Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-watcher-network-configuration-diagnostics-overview.md
# Introduction to Network Configuration Diagnostics in Azure Network Watcher
-The Network Configuration Diagnostic tool helps customers understand which traffic flows will be allowed or denied in your Azure Virtual Network along with detailed information for debugging. It can help your in understanding if your NSG rules are configured correctly.
+The Network Configuration Diagnostic tool helps you understand which traffic flows will be allowed or denied in your Azure virtual network, along with detailed information for debugging. It can help you understand whether your NSG rules are configured correctly.
## Pre-requisites For using Network Configuration Diagnostics, Network Watcher must be enabled in your subscription. See [Create an Azure Network Watcher instance](./network-watcher-create.md) to enable.
network-watcher Network Watcher Nsg Flow Logging Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-watcher-nsg-flow-logging-overview.md
description: This article explains how to use the NSG flow logs feature of Azure Network Watcher. documentationcenter: na-+
network-watcher Network Watcher Nsg Flow Logging Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/network-watcher/network-watcher-nsg-flow-logging-portal.md
Previously updated : 10/17/2022 Last updated : 10/28/2022 -+ # Customer intent: I need to log the network traffic to and from a VM so I can analyze it for anomalies.
object-anchors Model Conversion Error Codes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/object-anchors/model-conversion-error-codes.md
Any errors that occur outside the actual asset conversion jobs are thrown as exc
- [Quickstart: Create an Object Anchors model from a 3D model](quickstarts/get-started-model-conversion.md) - [Frequently asked questions about Azure Object Anchors](faq.md)-- [Azure Object Anchors client library for .NET](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre)
+- [Azure Object Anchors client library for .NET](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme)
object-anchors Get Started Hololens Directx https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/object-anchors/quickstarts/get-started-hololens-directx.md
The app aligns a 3D model to its physical counterpart closely. A user can air ta
> [FAQ](../faq.md) > [!div class="nextstepaction"]
-> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre)
+> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme)
> [!div class="nextstepaction"] > [Troubleshooting object detection](../troubleshoot/object-detection.md)
object-anchors Get Started Model Conversion https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/object-anchors/quickstarts/get-started-model-conversion.md
Azure Object Anchors is a managed cloud service that converts 3D models into AI models that enable object-aware mixed reality experiences for the HoloLens. This quickstart covers how to create an Object Anchors model from a 3D model using
-the [Azure Object Anchors Conversion SDK for .NET](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre).
+the [Azure Object Anchors Conversion SDK for .NET](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme).
You'll learn how to: > [!div class="checklist"] > * Create an Object Anchors account.
-> * Convert a 3D model to create an Object Anchors model using the [Azure Object Anchors Conversion SDK for .NET](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre) ([NuGet](https://www.nuget.org/packages/Azure.MixedReality.ObjectAnchors.Conversion/)).
+> * Convert a 3D model to create an Object Anchors model using the [Azure Object Anchors Conversion SDK for .NET](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme) ([NuGet](https://www.nuget.org/packages/Azure.MixedReality.ObjectAnchors.Conversion/)).
## Prerequisites
In this quickstart, you created an Object Anchors account and converted a 3D mod
> [HoloLens DirectX](get-started-hololens-directx.md) > [!div class="nextstepaction"]
-> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre)
+> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme)
object-anchors Get Started Unity Hololens Mrtk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/object-anchors/quickstarts/get-started-unity-hololens-mrtk.md
You can also do other actions using the <a href="/windows/mixed-reality/mrtk-uni
> [FAQ](../faq.md) > [!div class="nextstepaction"]
-> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre)
+> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme)
> [!div class="nextstepaction"] > [Troubleshooting object detection](../troubleshoot/object-detection.md)
object-anchors Get Started Unity Hololens https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/object-anchors/quickstarts/get-started-unity-hololens.md
The app looks for objects in the current field of view and then tracks them once
> [FAQ](../faq.md) > [!div class="nextstepaction"]
-> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre)
+> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme)
> [!div class="nextstepaction"] > [Troubleshooting object detection](../troubleshoot/object-detection.md)
object-anchors In Depth Mrtk Walkthrough https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/object-anchors/quickstarts/in-depth-mrtk-walkthrough.md
The bottom and right submenus don't appear automatically, but are toggled with `
> [FAQ](../faq.md) > [!div class="nextstepaction"]
-> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre)
+> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme)
object-anchors New Unity Hololens App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/object-anchors/quickstarts/new-unity-hololens-app.md
You're ready to start adding your own code to the **ObjectSearch** script, using
> [FAQ](../faq.md) > [!div class="nextstepaction"]
-> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre)
+> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme)
object-anchors Upgrade Unity Quickstart To 2020 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/object-anchors/quickstarts/upgrade-unity-quickstart-to-2020.md
Your project is now fully upgraded to Unity 2020. Follow the instructions from e
> [FAQ](../faq.md) > [!div class="nextstepaction"]
-> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme-pre)
+> [Conversion SDK](/dotnet/api/overview/azure/mixedreality.objectanchors.conversion-readme)
private-link Create Private Link Service Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/create-private-link-service-powershell.md
$par2 = @{
ServiceName = 'myPrivateLinkService' ResourceGroupName = 'CreatePrivLinkService-rg' Description = 'Approved'
+ PrivateLinkResourceType = 'Microsoft.Network/privateLinkServices'
} Approve-AzPrivateEndpointConnection @par2
purview Concept Policies Devops https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/concept-policies-devops.md
+
+ Title: Microsoft Purview DevOps policies concepts
+description: Understand Microsoft Purview DevOps policies
+++++ Last updated : 10/07/2022++
+# Concepts for Microsoft Purview DevOps policies
++
+This article discusses concepts related to managing access to data sources in your data estate from within the Microsoft Purview governance portal. In particular, it focuses on DevOps policies.
+
+> [!Note]
+> This capability is different from access control for Microsoft Purview itself, which is described in [Access control in Microsoft Purview](./catalog-permissions.md).
+
+## Overview
+Access to system metadata is crucial for database administrators and other DevOps users to perform their job. That access can be granted and revoked efficiently and at scale through Microsoft Purview DevOps policies.
+
+### Microsoft Purview access policies vs. DevOps policies
+Microsoft Purview access policies enable customers to manage access to different data systems across their entire data estate, all from a central location in the cloud. These policies are access grants that can be created through Microsoft Purview Studio, avoiding the need for code. They dictate whether a set of Azure AD principals (users, groups, etc.) should be allowed or denied a specific type of access to a data source or asset within it. These policies get communicated to the data sources where they get natively enforced.
+
+DevOps policies are a special type of Microsoft Purview access policies. They grant access to database system metadata instead of user data. They simplify access provisioning for IT operations and security auditing functions. DevOps policies only grant access, that is, they don't deny access.
+
+## Elements of a DevOps policy
+A DevOps policy is defined by three elements: the *data resource path*, the *role*, and the *subject*. In essence, the DevOps policy assigns the *subject* to the *role* for the scope of the *data resource path*.
+
+#### The subject
+The subject is a set of Azure AD users, groups, or service principals.
+
+#### The role
+The role maps to a set of actions that the policy permits on the data resource. DevOps policies support two roles: *SQL Performance Monitor* and *SQL Security Auditor*. The DevOps policy how-to docs detail the role definition for each data source, that is, the mapping between the role in Microsoft Purview and the actions that get permitted in the data source. For example, the role definition for SQL Performance Monitor and SQL Security Auditor includes Connect actions at server and database level on the data source side.
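As a sketch of this mapping, a role can be modeled as a set of permitted actions. The action names below are hypothetical placeholders — consult the per-source how-to docs for the actual role definitions:

```python
# Illustrative only: a DevOps role modeled as a set of permitted actions.
# Action names are hypothetical placeholders, not Purview's real names.

ROLE_ACTIONS = {
    "SQL Performance Monitor": {"Connect/Server", "Connect/Database",
                                "ViewPerformanceState"},
    "SQL Security Auditor": {"Connect/Server", "Connect/Database",
                             "ViewSecurityState"},
}

def permitted(role: str, action: str) -> bool:
    return action in ROLE_ACTIONS.get(role, set())

# Both roles include Connect actions; each adds its own view actions.
print(permitted("SQL Security Auditor", "Connect/Server"))        # True
print(permitted("SQL Performance Monitor", "ViewSecurityState"))  # False
```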
+
+#### The data resource
+Microsoft Purview DevOps policies currently support SQL-type data sources and can be configured on individual data sources, resource groups, and subscriptions. DevOps policies can only be created if the data source is first registered in Microsoft Purview with the option *Data use management* enabled. The data resource path is the composition of subscription > resource group > data source.
+
+#### Hierarchical enforcement of policies
+A DevOps policy on a data resource is enforced on the data resource itself and all children contained by it. For example, a DevOps policy on an Azure subscription applies to all resource groups, to all policy-enabled data sources within each resource group, and to all databases contained within each data source.
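The hierarchical scoping described above can be sketched as a path-prefix check (the resource paths here are hypothetical):

```python
# Illustrative only: a policy granted on a scope applies to that scope
# and every resource beneath it, which reduces to a path-prefix match.

def policy_applies(policy_scope: str, resource_path: str) -> bool:
    """Paths are '/'-separated: subscription/resource-group/data-source/..."""
    scope = policy_scope.strip("/").split("/")
    path = resource_path.strip("/").split("/")
    return path[:len(scope)] == scope

# A policy on a resource group covers every server and database inside it:
print(policy_applies("sub1/rg1", "sub1/rg1/sqlserver01/db1"))  # True
print(policy_applies("sub1/rg1", "sub1/rg2/sqlserver02/db1"))  # False
```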
+
+## A sample scenario to demonstrate the concept and the benefits
+Bob and Alice are DevOps users at their company. Given their role, they need to log in to dozens of Azure SQL logical servers to monitor their performance so that critical DevOps processes don't break. Their manager, Mateo, creates an Azure AD group and includes Alice and Bob. He then uses Microsoft Purview DevOps policies (Policy 1 in the diagram below) to grant this Azure AD group access at the resource group level to Resource Group 1, which hosts the Azure SQL servers.
+
+![Diagram shows an example of DevOps policy on resource group.](./media/concept-policies-devops/devops-policy-on-resource-group.png)
+
+#### Benefits
+- Mateo doesn't have to create local logins in each logical server
+- The policies from Microsoft Purview improve security by helping limit local privileged access. This is what we call PoLP (Principle of Least Privilege). In the scenario, Mateo grants only the minimum access that Bob and Alice need to perform the task of monitoring performance.
+- When new Azure SQL servers are added to the Resource Group, Mateo doesn't need to update the policies in Microsoft Purview for them to be effective on the new logical servers.
+- If Alice or Bob leave their job and get backfilled, Mateo just updates the Azure AD group, without having to make any changes to the servers or to the policies he created in Microsoft Purview.
+- At any point in time, Mateo or the company's auditor can see what access has been granted directly in Microsoft Purview Studio.
+
+## More info
+- DevOps policies can be created, updated, and deleted by any user holding the *Policy Author* role at root collection level in Microsoft Purview.
+- Once saved, DevOps policies get automatically published.
+
+## Next steps
+To get started with DevOps policies, consult the following guides:
+* Document: [Microsoft Purview DevOps policies on Arc-enabled SQL Server](./how-to-policies-devops-arc-sql-server.md)
+* Document: [Microsoft Purview DevOps policies on Azure SQL DB](./how-to-policies-devops-azure-sql-db.md)
+* Blog: [New granular permissions for SQL Server 2022 and Azure SQL to help PoLP](https://techcommunity.microsoft.com/t5/sql-server-blog/new-granular-permissions-for-sql-server-2022-and-azure-sql-to/ba-p/3607507)
purview How To Policies Data Owner Arc Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-data-owner-arc-sql-server.md
Previously updated : 10/11/2022 Last updated : 10/12/2022 # Provision access by data owner for SQL Server on Azure Arc-enabled servers (preview)
purview How To Policies Devops Arc Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-devops-arc-sql-server.md
+
+ Title: Provision access to Arc-enabled SQL Server for DevOps actions (preview)
+description: Step-by-step guide on provisioning access to Arc-enabled SQL Server through Microsoft Purview DevOps policies
+++++ Last updated : 10/11/2022++
+# Provision access to Arc-enabled SQL Server for DevOps actions (preview)
++
+[DevOps policies](concept-policies-devops.md) are a type of Microsoft Purview access policies. They allow you to manage access to system metadata on data sources that have been registered for *Data use management* in Microsoft Purview. These policies are configured directly in the Microsoft Purview governance portal, and after publishing, they get enforced by the data source.
+
+This how-to guide covers how to provision access from Microsoft Purview to Arc-enabled SQL Server system metadata (DMVs and DMFs) via *SQL Performance Monitoring* or *SQL Security Auditing* actions. Microsoft Purview access policies apply to Azure AD accounts only.
+
+## Prerequisites
+
+## Microsoft Purview configuration
+
+### Register data sources in Microsoft Purview
+The Arc-enabled SQL Server data source must first be registered with Microsoft Purview before policies can be created.
+
+1. Sign in to Microsoft Purview Studio.
+
+1. Navigate to the **Data map** feature on the left pane, select **Sources**, then select **Register**. Type "Azure Arc" in the search box and select **SQL Server on Azure Arc**. Then select **Continue**.
+![Screenshot shows how to select a source for registration.](./media/how-to-policies-data-owner-sql/select-arc-sql-server-for-registration.png)
+
+1. Enter a **Name** for this registration. It is best practice to make the name of the registration the same as the server name in the next step.
+
+1. Select an **Azure subscription**, **Server name**, and **Server endpoint**.
+
+1. **Select a collection** to put this registration in.
+
+1. Enable Data Use Management. Data Use Management needs certain permissions and can affect the security of your data, because it delegates access management for the data sources to certain Microsoft Purview roles. **Go through the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
+
+1. Upon enabling Data Use Management, Microsoft Purview will automatically capture the **Application ID** of the App Registration related to this Arc-enabled SQL server. If the association between the Arc-enabled SQL server and the App Registration changes in the future, come back to this screen and select the refresh button next to it.
+
+1. Select **Register** or **Apply** at the bottom.
+
+Once your data source has the **Data Use Management** toggle *Enabled*, it will look like this picture.
+![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-arc-sql.png)
+
+> [!Note]
+> If you want to create a policy on a resource group or subscription and have it enforced in Arc-enabled SQL servers, you will need to also register those servers independently for *Data use management* to provide their App ID.
+
+## Create a new DevOps policy
+Follow this link for the steps to [create a new DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#create-a-new-devops-policy).
+
+## List DevOps policies
+Follow this link for the steps to [list DevOps policies in Microsoft Purview](how-to-policies-devops-authoring-generic.md#list-devops-policies).
+
+## Update a DevOps policy
+Follow this link for the steps to [update a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#update-a-devops-policy).
+
+## Delete a DevOps policy
+Follow this link for the steps to [delete a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#delete-a-devops-policy).
+
+>[!Important]
+> DevOps policies are auto-published and changes can take up to **5 minutes** to be enforced by the data source.
+
+### Test the policy
+
+The Azure AD Accounts referenced in the access policies should now be able to connect to any database in the server to which the policies are published.
+
+#### Force policy download
+You can force an immediate download of the latest published policies to the current SQL database by running the following command. The minimum permission required is membership in the ##MS_ServerStateManager## server role.
+
+```sql
+-- Force immediate download of latest published policies
+exec sp_external_policy_refresh reload
+```
+
+#### Analyze downloaded policy state from SQL
+The following DMVs can be used to analyze which policies have been downloaded and are currently assigned to Azure AD accounts. The minimum permission required to run them is VIEW DATABASE SECURITY STATE, or being assigned the *SQL Security Auditor* action group.
+
+```sql
+
+-- Lists generally supported actions
+SELECT * FROM sys.dm_server_external_policy_actions
+
+-- Lists the roles that are part of a policy published to this server
+SELECT * FROM sys.dm_server_external_policy_roles
+
+-- Lists the links between the roles and actions, could be used to join the two
+SELECT * FROM sys.dm_server_external_policy_role_actions
+
+-- Lists all Azure AD principals that were given connect permissions
+SELECT * FROM sys.dm_server_external_policy_principals
+
+-- Lists Azure AD principals assigned to a given role on a given resource scope
+SELECT * FROM sys.dm_server_external_policy_role_members
+
+-- Lists Azure AD principals, joined with roles, joined with their data actions
+SELECT * FROM sys.dm_server_external_policy_principal_assigned_actions
+```
+
+## Additional information
+
+### Policy action mapping
+
+This section contains a reference of how actions in Microsoft Purview data policies map to specific actions in SQL Server on Azure Arc-enabled servers.
+
+| **Microsoft Purview policy action** | **Data source specific actions** |
+|-|--|
+| | |
+| *SQL Performance Monitor* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabasePerformanceState/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/ServerPerformanceState/rows/select |
+|||
+| *SQL Security Auditor* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityState/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseSecurityState/rows/select |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityMetadata/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseSecurityMetadata/rows/select |
+|||
+
+## Next steps
+Check the blog, videos, and related docs
+* Blog: [Microsoft Purview DevOps policies enable at scale access provisioning for IT operations](https://techcommunity.microsoft.com/t5/microsoft-purview-blog/microsoft-purview-devops-policies-enable-at-scale-access/ba-p/3604725)
+* Video: [Pre-requisite for policies: The "Data use management" option](https://youtu.be/v_lOzevLW-Q)
+* Video: [Microsoft Purview DevOps policies on data sources and resource groups](https://youtu.be/YCDJagrgEAI)
+* Video: [Reduce the effort with Microsoft Purview DevOps policies on resource groups](https://youtu.be/yMMXCeIFCZ8)
+* Doc: [Microsoft Purview DevOps policies on Azure SQL DB](./how-to-policies-devops-azure-sql-db.md)
+* Blog: [Deep dive on SQL Performance Monitor and SQL Security Auditor permissions](https://techcommunity.microsoft.com/t5/sql-server-blog/new-granular-permissions-for-sql-server-2022-and-azure-sql-to/ba-p/3607507)
purview How To Policies Devops Authoring Generic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-devops-authoring-generic.md
+
+ Title: Create, list, update and delete DevOps policies (preview)
+description: Step-by-step guide on provisioning access through Microsoft Purview DevOps policies
+Last updated: 10/11/2022
+# Create, list, update and delete DevOps policies (preview)
++
+[DevOps policies](concept-policies-devops.md) are a type of Microsoft Purview access policies. They allow you to manage access to system metadata on data sources that have been registered for *Data use management* in Microsoft Purview. These policies are configured directly in the Microsoft Purview governance portal, and after publishing, they get enforced by the data source.
+
+This how-to guide covers how to provision access from Microsoft Purview to SQL-type data sources via *SQL Performance Monitoring* or *SQL Security Auditing* actions. Microsoft Purview access policies apply to Azure AD Accounts only.
+
+## Prerequisites
+
+### Data source configuration
+Before authoring policies in the Microsoft Purview policy portal, you'll need to configure the data sources so that they can enforce those policies.
+
+1. Follow any policy-specific prerequisites for your source. Check the [Microsoft Purview supported data sources table](./microsoft-purview-connector-overview.md) and select the link in the **Access Policy** column for sources where access policies are available. Follow any steps listed in the Access policy or Prerequisites sections.
+1. Register the data source in Microsoft Purview. Follow the **Prerequisites** and **Register** sections of the [source pages](./microsoft-purview-connector-overview.md) for your resources.
+1. [Enable the "Data use management" toggle on the data source](how-to-enable-data-use-management.md). Additional permissions for this step are described in the linked document.
++
+## Create a new DevOps policy
+To create a new DevOps policy, ensure first that you have the Microsoft Purview Policy author role at **root collection level**. Check the section on managing Microsoft Purview role assignments in this [guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
+
+1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
+
+1. Navigate to the **Data policy** feature using the left side panel. Then select **DevOps policies**.
+
+1. Select the **New Policy** button on the policy page. The policy detail page opens.
+![Screenshot shows to enter SQL DevOps policies to create.](./media/how-to-policies-devops-authoring-generic/enter-devops-policies-to-create.png)
+
+1. Select the **Data source type** and then one of the listed data sources under **Data source name**. Then click **Select**. This takes you back to the New Policy experience.
+![Screenshot shows to select a data source for policy.](./media/how-to-policies-devops-authoring-generic/select-a-data-source.png)
+
+1. Select one of the two roles, *SQL Performance monitor* or *SQL Security auditor*. Then select **Add/remove subjects**. This opens the Subject window. Type the name of an Azure AD principal (user, group, or service principal) in the **Select subjects** box. Keep adding or removing subjects until you're satisfied. Select **Save**. This takes you back to the prior window.
+![Screenshot shows to select role and subject for policy.](./media/how-to-policies-devops-authoring-generic/select-role-and-subjects.png)
+
+1. Select **Save** to save the policy. The policy is created and automatically published. Enforcement starts at the data source within 5 minutes.
+
+## List DevOps policies
+To list DevOps policies, ensure first that you have one of the following Microsoft Purview roles at **root collection level**: Policy author, Data source admin, Data curator, or Data reader. Check the section on managing Microsoft Purview role assignments in this [guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
+
+1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
+
+1. Navigate to the **Data policy** feature using the left side panel. Then select **DevOps policies**.
+
+1. If any DevOps policies have been created, they'll be listed as shown in the following screenshot.
+![Screenshot shows to enter SQL DevOps policies to list.](./media/how-to-policies-devops-authoring-generic/enter-devops-policies-to-list.png)
++
+## Update a DevOps policy
+To update a DevOps policy, ensure first that you have the Microsoft Purview Policy author role at **root collection level**. Check the section on managing Microsoft Purview role assignments in this [guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
+
+1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
+
+1. Navigate to the **Data policy** feature using the left side panel. Then select **DevOps policies**.
+
+1. Open the policy detail for one of the policies by selecting its Data resource path, as shown in the following screenshot.
+![Screenshot shows to enter SQL DevOps policies to update.](./media/how-to-policies-devops-authoring-generic/enter-devops-policies-to-update.png)
+
+1. In the policy detail page, select **Edit**.
+
+1. Continue with steps 5 and 6 of [Create a new DevOps policy](#create-a-new-devops-policy).
+
+## Delete a DevOps policy
+To delete a DevOps policy, ensure first that you have the Microsoft Purview Policy author role at **root collection level**. Check the section on managing Microsoft Purview role assignments in this [guide](./how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
+
+1. Sign in to the [Microsoft Purview governance portal](https://web.purview.azure.com/resource/).
+
+1. Navigate to the **Data policy** feature using the left side panel. Then select **DevOps policies**.
+
+1. Check one of the policies and then select **Delete** as shown in the following screenshot:
+![Screenshot shows to enter SQL DevOps policies to delete.](./media/how-to-policies-devops-authoring-generic/enter-devops-policies-to-delete.png)
+
+## Next steps
+Check the blog, related videos and documents
+* Blog: [Microsoft Purview DevOps policies enable at scale access provisioning for IT operations](https://techcommunity.microsoft.com/t5/microsoft-purview-blog/microsoft-purview-devops-policies-enable-at-scale-access/ba-p/3604725)
+* Video: [Pre-requisite for policies: The "Data use management" option](https://youtu.be/v_lOzevLW-Q)
+* Video: [Microsoft Purview DevOps policies on data sources and resource groups](https://youtu.be/YCDJagrgEAI)
+* Video: [Reduce the effort with Microsoft Purview DevOps policies on resource groups](https://youtu.be/yMMXCeIFCZ8)
+* Document: [Microsoft Purview DevOps policies on Arc-enabled SQL Server](./how-to-policies-devops-arc-sql-server.md)
+* Document: [Microsoft Purview DevOps policies on Azure SQL DB](./how-to-policies-devops-azure-sql-db.md)
purview How To Policies Devops Azure Sql Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-policies-devops-azure-sql-db.md
+
+ Title: Provision access to Azure SQL Database for DevOps actions (preview)
+description: Step-by-step guide on provisioning access to Azure SQL Database through Microsoft Purview DevOps policies
+Last updated: 10/11/2022
+# Provision access to Azure SQL Database for DevOps actions (preview)
++
+[DevOps policies](concept-policies-devops.md) are a type of Microsoft Purview access policies. They allow you to manage access to system metadata on data sources that have been registered for *Data use management* in Microsoft Purview. These policies are configured directly in the Microsoft Purview governance portal, and after publishing, they get enforced by the data source.
+
+This how-to guide covers how to provision access from Microsoft Purview to Azure SQL Database system metadata (DMVs and DMFs) via *SQL Performance Monitoring* or *SQL Security Auditing* actions. Microsoft Purview access policies apply to Azure AD Accounts only.
+
+## Prerequisites
+
+## Microsoft Purview Configuration
+
+### Register the data sources in Microsoft Purview
+The Azure SQL Database data source needs to be registered with Microsoft Purview before access policies can be created. You can follow this guide:
+
+[Register and scan Azure SQL Database](./register-scan-azure-sql-database.md)
+
+After you've registered your resources, you'll need to enable Data Use Management. Data Use Management requires certain permissions and can affect the security of your data, because it delegates access management for the data sources to certain Microsoft Purview roles. **Review the secure practices related to Data Use Management in this guide**: [How to enable Data Use Management](./how-to-enable-data-use-management.md)
+
+Once your data source has the **Data Use Management** toggle set to *Enabled*, it will look like the following screenshot. This enables access policies to be used with the data source.
+![Screenshot shows how to register a data source for policy.](./media/how-to-policies-data-owner-sql/register-data-source-for-policy-azure-sql-db.png)
+
+## Create a new DevOps policy
+Follow this link for the steps to [create a new DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#create-a-new-devops-policy).
+
+## List DevOps policies
+Follow this link for the steps to [list DevOps policies in Microsoft Purview](how-to-policies-devops-authoring-generic.md#list-devops-policies).
+
+## Update a DevOps policy
+Follow this link for the steps to [update a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#update-a-devops-policy).
+
+## Delete a DevOps policy
+Follow this link for the steps to [delete a DevOps policy in Microsoft Purview](how-to-policies-devops-authoring-generic.md#delete-a-devops-policy).
+
+>[!Important]
+> DevOps policies are auto-published and changes can take up to **5 minutes** to be enforced by the data source.
+
+### Test the policy
+The Azure AD Accounts referenced in the access policies should now be able to connect to any database in the server to which the policies are published.
+
+#### Force policy download
+You can force an immediate download of the latest published policies to the current SQL database by running the following command. The minimum permission required is membership in the ##MS_ServerStateManager## server role.
+
+```sql
+-- Force immediate download of latest published policies
+exec sp_external_policy_refresh reload
+```
+
+#### Analyze downloaded policy state from SQL
+The following DMVs can be used to analyze which policies have been downloaded and are currently assigned to Azure AD accounts. The minimum permission required to run them is VIEW DATABASE SECURITY STATE, or being assigned the *SQL Security Auditor* action group.
+
+```sql
+
+-- Lists generally supported actions
+SELECT * FROM sys.dm_server_external_policy_actions
+
+-- Lists the roles that are part of a policy published to this server
+SELECT * FROM sys.dm_server_external_policy_roles
+
+-- Lists the links between the roles and actions, could be used to join the two
+SELECT * FROM sys.dm_server_external_policy_role_actions
+
+-- Lists all Azure AD principals that were given connect permissions
+SELECT * FROM sys.dm_server_external_policy_principals
+
+-- Lists Azure AD principals assigned to a given role on a given resource scope
+SELECT * FROM sys.dm_server_external_policy_role_members
+
+-- Lists Azure AD principals, joined with roles, joined with their data actions
+SELECT * FROM sys.dm_server_external_policy_principal_assigned_actions
+```
+
+## Additional information
+
+### Policy action mapping
+
+This section contains a reference of how actions in Microsoft Purview data policies map to specific actions in Azure SQL DB.
+
+| **Microsoft Purview policy action** | **Data source specific actions** |
+|-|--|
+| | |
+| *SQL Performance Monitor* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabasePerformanceState/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/ServerPerformanceState/rows/select |
+|||
+| *SQL Security Auditor* |Microsoft.Sql/sqlservers/Connect |
+||Microsoft.Sql/sqlservers/databases/Connect |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityState/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseSecurityState/rows/select |
+||Microsoft.Sql/sqlservers/SystemViewsAndFunctions/ServerSecurityMetadata/rows/select |
+||Microsoft.Sql/sqlservers/databases/SystemViewsAndFunctions/DatabaseSecurityMetadata/rows/select |
+|||
+
+## Next steps
+Check the blog, videos, and related docs
+* Blog: [Microsoft Purview DevOps policies enable at scale access provisioning for IT operations](https://techcommunity.microsoft.com/t5/microsoft-purview-blog/microsoft-purview-devops-policies-enable-at-scale-access/ba-p/3604725)
+* Video: [Pre-requisite for policies: The "Data use management" option](https://youtu.be/v_lOzevLW-Q)
+* Video: [Microsoft Purview DevOps policies on data sources and resource groups](https://youtu.be/YCDJagrgEAI)
+* Video: [Reduce the effort with Microsoft Purview DevOps policies on resource groups](https://youtu.be/yMMXCeIFCZ8)
+* Doc: [Microsoft Purview DevOps policies on Arc-enabled SQL Server](./how-to-policies-devops-arc-sql-server.md)
+* Blog: [Deep dive on SQL Performance Monitor and SQL Security Auditor permissions](https://techcommunity.microsoft.com/t5/sql-server-blog/new-granular-permissions-for-sql-server-2022-and-azure-sql-to/ba-p/3607507)
purview Manage Integration Runtimes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/manage-integration-runtimes.md
Previously updated : 05/09/2022 Last updated : 10/27/2022 # Create and manage a self-hosted integration runtime
Then go to path C:\Program Files\Microsoft Integration Runtime\5.0\Gateway\DataS
</configuration> ```
+Local traffic must be excluded from the proxy, for example if your Microsoft Purview account is behind private endpoints. In such cases, update the following four files under C:\Program Files\Microsoft Integration Runtime\5.0\ to include the required bypass list:
+
+- .\Shared\diahost.exe.config
+- .\Shared\diawp.exe.config
+- .\Gateway\DataScan\Microsoft.DataMap.Agent.exe.config
+- .\Gateway\DataScan\DataTransfer\Microsoft.DataMap.Agent.Connectors.Azure.DataFactory.ServiceHost.exe.config
+
+An example bypass list for scanning an Azure SQL Database and ADLS Gen2 storage:
+
+ ```xml
+ <system.net>
+ <defaultProxy>
+ <bypasslist>
+ <add address="scaneastus4123.blob.core.windows.net" />
+ <add address="scaneastus4123.queue.core.windows.net" />
+ <add address="Atlas-abc12345-1234-abcd-a73c-394243a566fa.servicebus.windows.net" />
+ <add address="contosopurview123.purview.azure.com" />
+ <add address="contososqlsrv123.database.windows.net" />
+ <add address="contosoadls123.dfs.core.windows.net" />
+ <add address="contosoakv123.vault.azure.net" />
+ </bypasslist>
+ <proxy proxyaddress="http://proxy.domain.org:8888" bypassonlocal="True" />
+ </defaultProxy>
+ </system.net>
+ ```
Restart the self-hosted integration runtime host service, which picks up the changes. To restart the service, use the services applet from Control Panel. Or from Integration Runtime Configuration Manager, select the **Stop Service** button, and then select **Start Service**. If the service doesn't start, you likely added incorrect XML tag syntax in the application configuration file that you edited. > [!IMPORTANT]
search Query Lucene Syntax https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/query-lucene-syntax.md
You can embed Boolean operators in a query string to improve the precision of a
|--|-- |--|-| | AND | `+` | `wifi AND luxury` | Specifies terms that a match must contain. In the example, the query engine will look for documents containing both `wifi` and `luxury`. The plus character (`+`) can also be used directly in front of a term to make it required. For example, `+wifi +luxury` stipulates that both terms must appear somewhere in the field of a single document.| | OR | (none) <sup>1</sup> | `wifi OR luxury` | Finds a match when either term is found. In the example, the query engine will return match on documents containing either `wifi` or `luxury` or both. Because OR is the default conjunction operator, you could also leave it out, such that `wifi luxury` is the equivalent of `wifi OR luxury`.|
-| NOT | `!`, `-` | `wifi ΓÇôluxury` | Returns matches on documents that exclude the term. For example, `wifi ΓÇôluxury` will search for documents that have the `wifi` term but not `luxury`. </p>The `searchMode` parameter on a query request controls whether a term with the NOT operator is ANDed or ORed with other terms in the query (assuming there's no boolean operators on the other terms). Valid values include `any` or `all`. </p>`searchMode=any` increases the recall of queries by including more results, and by default `-` will be interpreted as "OR NOT". For example, `wifi -luxury` will match documents that either contain the term `wifi` or those that don't contain the term `luxury`. </p>`searchMode=all` increases the precision of queries by including fewer results, and by default - will be interpreted as "AND NOT". For example, `wifi -luxury` will match documents that contain the term `wifi` and don't contain the term "luxury". This is arguably a more intuitive behavior for the `-` operator. Therefore, you should consider using `searchMode=all` instead of `searchMode=any` if you want to optimize searches for precision instead of recall, *and* Your users frequently use the `-` operator in searches.</p> When deciding on a `searchMode` setting, consider the user interaction patterns for queries in various applications. Users who are searching for information are more likely to include an operator in a query, as opposed to e-commerce sites that have more built-in navigation structures. |
+| NOT | `!`, `-` | `wifi -luxury` | Returns matches on documents that exclude the term. For example, `wifi -luxury` will search for documents that have the `wifi` term but not `luxury`. </p>It's important to note that the NOT operator (`NOT`, `!`, or `-`) behaves differently in full syntax than it does in simple syntax. In full syntax, negations are always ANDed onto the query, so `wifi -luxury` is interpreted as "wifi AND NOT luxury" regardless of whether the `searchMode` parameter is set to `any` or `all`. This gives you a more intuitive behavior for negations by default. </p>A single negation such as the query `-luxury` isn't allowed in full search syntax and will always return an empty result set.|
<sup>1</sup> The `|` character isn't supported for OR operations.
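As a sketch of the full-syntax behavior described above, a [Search Documents](/rest/api/searchservice/search-documents) POST request body might look like the following (the target index and its fields are hypothetical):

```json
{
  "search": "wifi -luxury",
  "queryType": "full",
  "searchMode": "any"
}
```

Because `queryType` is set to `full`, the `-luxury` negation is ANDed onto the query regardless of the `searchMode` value, so only documents that contain `wifi` and don't contain `luxury` are returned.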
search Query Simple Syntax https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/query-simple-syntax.md
Last updated 10/27/2022
Azure Cognitive Search implements two Lucene-based query languages: [Simple Query Parser](https://lucene.apache.org/core/6_6_1/queryparser/org/apache/lucene/queryparser/simple/SimpleQueryParser.html) and the [Lucene Query Parser](https://lucene.apache.org/core/6_6_1/queryparser/org/apache/lucene/queryparser/classic/package-summary.html). The simple parser is more flexible and will attempt to interpret a request even if it's not perfectly composed. Because it's flexible, it's the default for queries in Azure Cognitive Search.
-The simple syntax is used for query expressions passed in the "search" parameter of a [Search Documents (REST API)](/rest/api/searchservice/search-documents) request, not to be confused with the [OData syntax](query-odata-filter-orderby-syntax.md) used for the ["$filter"](search-filters.md) and ["$orderby"](search-query-odata-orderby.md) expressions in the same request. OData parameters have different syntax and rules for constructing queries, escaping strings, and so on.
+Query syntax for either parser applies to query expressions passed in the "search" parameter of a [Search Documents (REST API)](/rest/api/searchservice/search-documents) request, not to be confused with the [OData syntax](query-odata-filter-orderby-syntax.md) used for the ["$filter"](search-filters.md) and ["$orderby"](search-query-odata-orderby.md) expressions in the same request. OData parameters have different syntax and rules for constructing queries, escaping strings, and so on.
Although the simple parser is based on the [Apache Lucene Simple Query Parser](https://lucene.apache.org/core/6_6_1/queryparser/org/apache/lucene/queryparser/simple/SimpleQueryParser.html) class, its implementation in Cognitive Search excludes fuzzy search. If you need [fuzzy search](search-query-fuzzy.md), consider the alternative [full Lucene query syntax](query-lucene-syntax.md) instead.
You can embed Boolean operators in a query string to improve the precision of a
|-- |--|-| | `+` | `pool + ocean` | An AND operation. For example, `pool + ocean` stipulates that a document must contain both terms.| | `|` | `pool | ocean` | An OR operation finds a match when either term is found. In the example, the query engine will return match on documents containing either `pool` or `ocean` or both. Because OR is the default conjunction operator, you could also leave it out, such that `pool ocean` is the equivalent of `pool | ocean`.|
-| `-` | `pool ΓÇô ocean` | A NOT operation returns matches on documents that exclude the term. </p>To get the expected behavior on a NOT expression, set `"searchMode=all"` on the request. Otherwise, under the default of `"searchMode=any"`, you'll get matches on `pool`, plus matches on all documents that don't contain `ocean`, which could be a lot of documents. The "searchMode" parameter on a query request controls whether a term with the NOT operator is ANDed or ORed with other terms in the query (assuming there's no `+` or `|` operator on the other terms). Using `"searchMode=all"` increases the precision of queries by including fewer results, and by default - will be interpreted as "AND NOT". </p>When deciding on a "searchMode" setting, consider the user interaction patterns for queries in various applications. Users who are searching for information are more likely to include an operator in a query, as opposed to e-commerce sites that have more built-in navigation structures. |
+| `-` | `pool - ocean` | A NOT operation returns matches on documents that exclude the term. </p>The `searchMode` parameter on a query request controls whether a term with the NOT operator is ANDed or ORed with other terms in the query (assuming there are no Boolean operators on the other terms). Valid values include `any` or `all`. </p>`searchMode=any` increases the recall of queries by including more results, and by default `-` will be interpreted as "OR NOT". For example, `pool -ocean` will match documents that either contain the term `pool` or those that don't contain the term `ocean`. </p>`searchMode=all` increases the precision of queries by including fewer results, and by default `-` will be interpreted as "AND NOT". For example, `pool -ocean` will match documents that contain the term `pool` and don't contain the term `ocean`. This is arguably a more intuitive behavior for the `-` operator. Therefore, consider using `searchMode=all` instead of `searchMode=any` if you want to optimize searches for precision instead of recall, *and* your users frequently use the `-` operator in searches.</p>When deciding on a `searchMode` setting, consider the user interaction patterns for queries in various applications. Users who are searching for information are more likely to include an operator in a query, as opposed to e-commerce sites that have more built-in navigation structures. |
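As a sketch, a [Search Documents](/rest/api/searchservice/search-documents) POST request body that applies the stricter interpretation of `-` might look like this (a hotel-style index is assumed):

```json
{
  "search": "pool -ocean",
  "queryType": "simple",
  "searchMode": "all"
}
```

With `searchMode=all`, the request matches only documents that contain `pool` and don't contain `ocean`; under the default `searchMode=any`, the same query would also match every document that merely lacks `ocean`.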
<a name="prefix-search"></a>
search Resource Demo Sites https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/resource-demo-sites.md
Previously updated : 09/20/2022 Last updated : 10/27/2022 # Demos - Azure Cognitive Search
The following demos are built and hosted by Microsoft.
| Demo name | Description | Source code | |--| |-|
+| [AzSearchLab](https://azuresearchlab.azurewebsites.net/) | A web front end that makes calls to a search index. | [https://github.com/Azure-Samples/azure-search-lab](https://github.com/Azure-Samples/azure-search-lab) |
| [NYC Jobs demo](https://azjobsdemo.azurewebsites.net/) | An ASP.NET app with facets, filters, details, geo-search (map controls). | [https://github.com/Azure-Samples/search-dotnet-asp-net-mvc-jobs](https://github.com/Azure-Samples/search-dotnet-asp-net-mvc-jobs) | | [JFK files demo](https://jfk-demo-2019.azurewebsites.net/#/) | An ASP.NET web app built on a public data set, transformed with custom and predefined skills to extract searchable content from scanned document (JPEG) files. [Learn more...](https://www.microsoft.com/ai/ai-lab-jfk-files) | [https://github.com/Microsoft/AzureSearch_JFK_Files](https://github.com/Microsoft/AzureSearch_JFK_Files) | | [Semantic search for retail](https://brave-meadow-0f59c9b1e.1.azurestaticapps.net/) | Web app for a fictitious online retailer, "Terra" | Not available |
search Search Filters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-filters.md
Previously updated : 10/06/2022 Last updated : 10/27/2022 # Filters in Azure Cognitive Search
-A *filter* provides value-based criteria for including or excluding content before query execution. For example, you could set filters to select documents based on dates, locations, or some other field. Filters are specified on individual fields. A field definition must be attributed as "filterable" if you want to use it in filter expressions.
+A *filter* provides value-based criteria for including or excluding content before query execution. For example, including or excluding documents based on dates, locations, or language. Filters are specified on individual fields. A field definition must be attributed as "filterable" if you want to use it in filter expressions.
-A filter can be a single value or an OData [filter expression](search-query-odata-filter.md). In contrast with full text search, a filter succeeds only if an exact match is made.
+A filter is specified using [OData filter expression syntax](search-query-odata-filter.md). In contrast with full text search, a filter succeeds only if the match is exact.
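For example, a Search Documents POST request body that combines full text search with a filter might look like the following sketch (the `Rating` and `Category` fields are hypothetical; `Rating` would need the "filterable" attribution, and "sortable" to support `orderby`):

```json
{
  "search": "spa",
  "filter": "Rating ge 4 and Category eq 'Luxury'",
  "orderby": "Rating desc"
}
```

The filter narrows the candidate set by exact value comparisons before the `spa` search terms are evaluated over the remaining documents.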
## When to use a filter
search Search What Is Azure Search https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/search-what-is-azure-search.md
Key strengths include:
Among our customers, those able to apply the widest range of features in Azure Cognitive Search include online catalogs, line-of-business programs, and document discovery applications.
-## Watch this video
+<!-- ## Watch this video
In this 15-minute video, review the main capabilities of Azure Cognitive Search.
->[!VIDEO https://www.youtube.com/embed/kOJU0YZodVk?version=3]
+>[!VIDEO https://www.youtube.com/embed/kOJU0YZodVk?version=3] -->
search Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/whats-new.md
# What's new in Azure Cognitive Search
-Learn about the latest updates to Azure Cognitive Search functionality.
+Learn about the latest updates to Azure Cognitive Search functionality, docs, and samples.
> [!NOTE]
-> Looking for preview feature status? Preview features are announced in this what's new article, but we also maintain a [preview features list](search-api-preview.md) so that you can find them all in one place.
+> Looking for preview features? Previews are announced here, but we also maintain a [preview features list](search-api-preview.md) so you can find them in one place.
+
+## November 2022
+
+|Sample&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; | Description |
+|--|--|
+| [Query performance dashboard](https://github.com/Azure-Samples/azure-samples-search-evaluation) | This Application Insights sample demonstrates an approach for deep monitoring of query usage and performance of an Azure Cognitive Search index. It includes a JSON template that creates a workbook and dashboard in Application Insights and a Jupyter Notebook that populates the dashboard with simulated data. |
+
+## October 2022
+
+|Content&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; | Description |
+|--|--|
+| [Compliance risk analysis using Azure Cognitive Search](/azure/architecture/guide/ai/compliance-risk-analysis) | Published on Azure Architecture Center, this guide covers the implementation of a compliance risk analysis solution that uses Azure Cognitive Search. |
+| [Beiersdorf customer story using Azure Cognitive Search](https://customers.microsoft.com/story/1552642769228088273-Beiersdorf-consumer-goods-azure-cognitive-search) | This customer story showcases semantic search and document summarization to provide researchers with ready access to institutional knowledge. |
+
+## September 2022
+
+|Sample&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; | Description |
+|--|--|
+| [Azure Cognitive Search Lab](https://github.com/Azure-Samples/azure-search-lab/blob/main/README.md) | This C# sample provides the source code for building a web front-end that accesses all of the REST API calls against an index. This tool is used by support engineers to investigate customer support issues. You can try this [demo site](https://azuresearchlab.azurewebsites.net/) before building your own copy. |
+| [Event-driven indexing for Cognitive Search](https://github.com/aditmer/Event-Driven-Indexing-For-Cognitive-Search/blob/main/README.md) | This C# sample is an Azure Function app that demonstrates event-driven indexing in Azure Cognitive Search. If you've used indexers and skillsets before, you know that indexers can run on demand or on a schedule, but not in response to events. This demo shows you how to set up an indexing pipeline that responds to data update events. |
+
+## August 2022
+
+|Tutorial&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; | Description |
+|--|--|
+| [Tutorial: Index large data from Apache Spark](search-synapseml-cognitive-services.md) | This tutorial explains how to use the SynapseML open-source library to push data from Apache Spark into a search index. It also shows you how to make calls to Cognitive Services to get AI enrichment without skillsets and indexers. |
## June 2022
Learn about the latest updates to Azure Cognitive Search functionality.
|Feature&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; | Description | Availability |
|--|--|--|
-| [Power Query connector preview](search-how-to-index-power-query-data-sources.md) | This indexer data source was introduced in May 2021 but will not be moving forward. Please migrate your data ingestion code by November 2022. See the feature documentation for migration guidance. | Retired |
+| [Power Query connector preview](search-how-to-index-power-query-data-sources.md) | This indexer data source was introduced in May 2021 but won't be moving forward. Please migrate your data ingestion code by November 2022. See the feature documentation for migration guidance. | Retired |
## February 2022
Learn about the latest updates to Azure Cognitive Search functionality.
| February | [Azure CLI](/cli/azure/search) </br>[Azure PowerShell](/powershell/module/az.search/) | New revisions now provide the full range of operations in the Management REST API 2020-08-01, including support for IP firewall rules and private endpoint. Generally available. |
| January | [Solution accelerator for Azure Cognitive Search and QnA Maker](https://github.com/Azure-Samples/search-qna-maker-accelerator) | Pulls questions and answers out of the document and suggests the most relevant answers. A live demo app can be found at [https://aka.ms/qnaWithAzureSearchDemo](https://aka.ms/qnaWithAzureSearchDemo). This feature is an open-source project (no SLA). |
-## 2019 and 2020 announcements
+## 2020 announcements
+
+See [2020 Archive for "What's New in Cognitive Search"](/previous-versions/azure/search/search-whats-new-2020) in the content archive.
+
+## 2019 announcements
-For feature announcements from 2019 and 2020, see the content archive, [**Previous versions**](/previous-versions/azure/search/) on the Microsoft Learn website.
+See [2019 Archive for "What's New in Cognitive Search"](/previous-versions/azure/search/search-whats-new-2019) in the content archive.
<a name="new-service-name"></a>
security Iaas https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/security/fundamentals/iaas.md
In most infrastructure as a service (IaaS) scenarios, [Azure virtual machines (V
The first step in protecting your VMs is to ensure that only authorized users can set up new VMs and access VMs.

> [!NOTE]
-> To improve the security of Linux VMs on Azure, you can integrate with Azure AD authentication. When you use [Azure AD authentication for Linux VMs](/azure-docs-archive-pr/virtual-machines/linux/login-using-aad), you centrally control and enforce policies that allow or deny access to the VMs.
+> To improve the security of Linux VMs on Azure, you can integrate with Azure AD authentication. When you use [Azure AD authentication for Linux VMs](/azure/active-directory/devices/howto-vm-sign-in-azure-ad-linux), you centrally control and enforce policies that allow or deny access to the VMs.
> >
security Trusted Hardware Identity Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/security/fundamentals/trusted-hardware-identity-management.md
+
+ Title: Trusted Hardware Identity Management
+description: Technical overview of Trusted Hardware Identity Management, which handles cache management of certificates and provides a trusted computing base.
+++++ Last updated : 10/24/2022++
+# Trusted Hardware Identity Management
+
+The Trusted Hardware Identity Management (THIM) service handles cache management of certificates for all Trusted Execution Environments (TEE) residing in Azure and provides trusted computing base (TCB) information to enforce a minimum baseline for attestation solutions.
+
+## THIM & attestation interactions
+
+THIM defines the Azure security baseline for Azure Confidential computing (ACC) nodes and caches collateral from TEE providers. The cached information can be further used by attestation services and ACC nodes in validating TEEs. The diagram below shows the interactions between an attestation service or node, THIM, and an enclave host.
++
+## Frequently asked questions
+
+### The "next update" date of the Azure-internal caching service API, used by Microsoft Azure Attestation, seems to be out of date. Is it still in operation and can it be used?
+
+The "tcbinfo" field contains the TCB information. By default, the THIM service provides an older tcbinfo. Updating to the latest tcbinfo from Intel would cause attestation failures for customers who haven't migrated to the latest Intel SDK, and could result in outages.
+
+However, the Open Enclave SDK and Microsoft Azure Attestation don't evaluate the "next update" date, so attestation passes.
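As a simplified illustration of that behavior (a hypothetical sketch, not the actual Open Enclave SDK or Microsoft Azure Attestation code), a verifier that enforces only a TCB baseline accepts cached collateral even after its "next update" date has passed:

```python
from datetime import datetime, timezone

# Hypothetical, simplified sketch; field names ("tcbSvn", "nextUpdate") and
# the verification flow are illustrative assumptions, not the real services'
# logic. It shows why collateral with a past "nextUpdate" can still verify:
# the check compares the TCB level against the enforced baseline and, by
# default, ignores "nextUpdate".
def verify_tcb(tcb_info: dict, baseline_svn: int, enforce_next_update: bool = False) -> bool:
    if enforce_next_update:
        next_update = datetime.fromisoformat(tcb_info["nextUpdate"])
        if next_update < datetime.now(timezone.utc):
            return False  # stale collateral would be rejected here
    # The baseline check: reported TCB security version number must be at or
    # above the baseline the verifier enforces.
    return tcb_info["tcbSvn"] >= baseline_svn

cached = {"tcbSvn": 5, "nextUpdate": "2021-01-01T00:00:00+00:00"}
print(verify_tcb(cached, baseline_svn=4))  # True: nextUpdate isn't evaluated
```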
+
+### What is the Azure DCAP Library?
+
+Azure Data Center Attestation Primitives (DCAP), a replacement for Intel Quote Provider Library (QPL), fetches quote generation collateral and quote validation collateral directly from the THIM Service. Fetching collateral directly from the THIM service ensures that all Azure hosts have collateral readily available within the Azure cloud to reduce external dependencies. The current recommended version of the DCAP library is 1.11.2.
+
+### Where can I download the latest DCAP packages?
+
+- Ubuntu 20.04: <https://packages.microsoft.com/ubuntu/20.04/prod/pool/main/64.deb>
+- Ubuntu 18.04: <https://packages.microsoft.com/ubuntu/18.04/prod/pool/main/64.deb>
+- Windows: <https://www.nuget.org/packages/Microsoft.Azure.DCAP/1.11.2>
+
+### Why are there different baselines between THIM and Intel?
+
+THIM and Intel provide different baseline levels of the trusted computing base. While Intel's baseline is the latest and greatest, it requires the consumer to ensure that all of its requirements are satisfied, which can break customers who haven't yet updated to those requirements. THIM takes a slower approach to updating the TCB baseline so that customers can make the necessary changes at their own pace. Although this approach provides an older TCB baseline, customers won't break if they haven't been able to meet the requirements of the new TCB baseline. This is why THIM's TCB baseline is a different version from Intel's. We want to empower customers to meet the requirements of the new TCB baseline at their own pace, instead of forcing them to update and causing a disruption that would require them to reprioritize their workstreams.
+
+THIM is also introducing a new feature that enables customers to select their own custom baseline. Customers can decide between enforcing the newest TCB from Intel or an older one, ensuring that the enforced TCB version is compliant with their specific configuration. This new feature will be reflected in a future iteration of the THIM documentation.
+
+### With Coffeelake I could get my certificates directly from Intel PCK. Why, with Icelake, do I need to get the certificates from THIM, and what do I need to do to fetch those certificates?
+
+The certificates are fetched and cached in THIM service using platform manifest and indirect registration. As a result, Key Caching Policy will be set to never store platform root keys for a given platform. Direct calls to the Intel service from inside the VM are expected to fail.
+
+To retrieve the certificate, you must install the [Azure DCAP library](#what-is-the-azure-dcap-library), which replaces Intel QPL. This library directs the fetch requests to the THIM service running in the Azure cloud. To download the latest DCAP packages, see [Where can I download the latest DCAP packages?](#where-can-i-download-the-latest-dcap-packages)
+
+### How do I request collateral in a Confidential Virtual Machine (CVM)?
+
+Use the following sample in a CVM guest for requesting AMD collateral that includes the VCEK certificate and certificate chain. For details on this collateral and where it originates from, see [Versioned Chip Endorsement Key (VCEK) Certificate and KDS Interface Specification](https://www.amd.com/system/files/TechDocs/57230.pdf) (from <amd.com>).
+
+#### URI parameters
+
+```bash
+GET "http://169.254.169.254/metadata/certification"
+```
+
+##### Request body
+
+| Name | Type | Description |
+|--|--|--|
+| Metadata | Boolean | Setting to True allows for collateral to be returned |
+
+##### Sample request
+
+```bash
+curl "http://169.254.169.254/metadata/certification" -H "Metadata: true"
+```
+
+##### Responses
+
+| Name | Description |
+|--|--|
+| 200 OK | Lists the available collateral in the HTTP body in JSON format. For details on the keys in the JSON, see Definitions |
+| Other Status Codes | Error response describing why the operation failed |
+
+##### Definitions
+
+| Key | Description |
+|--|--|
+| VcekCert | X.509v3 certificate as defined in RFC 5280. |
+| tcbm | Trusted Computing Base |
+| certificateChain | Includes the AMD SEV Key (ASK) and AMD Root Key (ARK) certificates |
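As a sketch of how a response with those keys might be consumed (the JSON shape and placeholder values below are assumed from the Definitions table, not captured from a live CVM), a client could assemble the full PEM chain for verification:

```python
import json

# Hypothetical response body shaped after the Definitions table above;
# the certificate contents are placeholders, not real AMD certificates.
sample = json.dumps({
    "VcekCert": "-----BEGIN CERTIFICATE-----\n<VCEK leaf placeholder>\n-----END CERTIFICATE-----\n",
    "tcbm": "example-tcbm-value",
    "certificateChain": (
        "-----BEGIN CERTIFICATE-----\n<ASK placeholder>\n-----END CERTIFICATE-----\n"
        "-----BEGIN CERTIFICATE-----\n<ARK placeholder>\n-----END CERTIFICATE-----\n"
    ),
})

collateral = json.loads(sample)
# Concatenate the VCEK leaf with the ASK + ARK chain into one PEM bundle,
# the form a certificate-chain verifier would typically consume.
pem_bundle = collateral["VcekCert"] + collateral["certificateChain"]
print(pem_bundle.count("BEGIN CERTIFICATE"))  # 3: VCEK leaf + ASK + ARK
```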
+
+## Next steps
+
+- Learn more about [Azure Attestation documentation](../../attestation/overview.md)
+- Learn more about [Azure Confidential Computing](https://azure.microsoft.com/blog/introducing-azure-confidential-computing)
service-bus-messaging Service Bus Quickstart Topics Subscriptions Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-bus-messaging/service-bus-quickstart-topics-subscriptions-portal.md
Title: Use the Azure portal to create Service Bus topics and subscriptions
description: 'Quickstart: In this quickstart, you learn how to create a Service Bus topic and subscriptions to that topic by using the Azure portal.' Previously updated : 09/15/2021 Last updated : 10/28/2022 #Customer intent: In a retail scenario, how do I update inventory assortment and send a set of messages from the back office to the stores?
In this quickstart, you use the Azure portal to create a Service Bus topic and t
Service Bus topics and subscriptions support a *publish/subscribe* messaging communication model. When using topics and subscriptions, components of a distributed application do not communicate directly with each other; instead they exchange messages via a topic, which acts as an intermediary.
-![TopicConcepts](./media/service-bus-java-how-to-use-topics-subscriptions/sb-topics-01.png)
In contrast with Service Bus queues, in which each message is processed by a single consumer, topics and subscriptions provide a one-to-many form of communication, using a publish/subscribe pattern. It is possible to register multiple subscriptions to a topic. When a message is sent to a topic, it is then made available to each subscription to handle/process independently. A subscription to a topic resembles a virtual queue that receives copies of the messages that were sent to the topic. You can optionally register filter rules for a topic on a per-subscription basis, which allows you to filter or restrict which messages to a topic are received by which topic subscriptions.
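The one-to-many semantics described above can be sketched with a minimal in-memory model (illustrative only, not the Azure Service Bus API): every subscription receives its own copy of each message sent to the topic, optionally filtered by a per-subscription rule.

```python
class Topic:
    """Toy in-memory model of topic/subscription semantics. Illustrative
    only, not the Azure Service Bus SDK. Each subscription acts as a
    virtual queue that receives a copy of every message sent to the topic,
    unless its optional filter rule rejects the message."""

    def __init__(self):
        self.subscriptions = {}

    def subscribe(self, name, rule=None):
        # rule: optional predicate deciding which messages this subscription receives
        self.subscriptions[name] = {"rule": rule or (lambda msg: True), "queue": []}

    def send(self, message):
        # One send fans out to every matching subscription independently.
        for sub in self.subscriptions.values():
            if sub["rule"](message):
                sub["queue"].append(message)

topic = Topic()
topic.subscribe("all-stores")
topic.subscribe("store-42", rule=lambda m: m.get("store") == 42)

topic.send({"sku": "A100", "store": 42})
topic.send({"sku": "B200", "store": 7})

print(len(topic.subscriptions["all-stores"]["queue"]))  # 2
print(len(topic.subscriptions["store-42"]["queue"]))    # 1
```

Each subscription processes its copies independently, which is why one publisher can serve many consumers without the consumers coordinating with each other.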
service-health Resource Health Vm Annotation https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/service-health/resource-health-vm-annotation.md
The below table summarizes all the annotations that the platform emits today:
||-|
| VirtualMachineRestarted | The Virtual Machine is undergoing a reboot as requested by a restart action triggered by an authorized user or process from within the Virtual Machine. No other action is required at this time. For more information, see [understanding Virtual Machine reboots in Azure](/troubleshoot/azure/virtual-machines/understand-vm-reboot). |
| VirtualMachineCrashed | The Virtual Machine is undergoing a reboot due to a guest OS crash. The local data remains unaffected during this process. No other action is required at this time. For more information, see [understanding Virtual Machine crashes in Azure](/troubleshoot/azure/virtual-machines/understand-vm-reboot#vm-crashes). |
-| VirtualMachineStorageOffline | The Virtual Machine is either currently undergoing a reboot or experiencing an application freeze due to a temporary loss of access to disk. |
+| VirtualMachineStorageOffline | The Virtual Machine is either currently undergoing a reboot or experiencing an application freeze due to a temporary loss of access to disk. No other action is required at this time, while the platform is working on reestablishing disk connectivity. |
| VirtualMachineFailedToSecureBoot | Applicable to Azure Confidential Compute Virtual Machines when guest activity such as unsigned booting components leads to a guest OS issue preventing the Virtual Machine from booting securely. You can attempt to retry deployment after ensuring OS boot components are signed by trusted publishers. For more information, see [Secure Boot](/windows-hardware/design/device-experiences/oem-secure-boot). |
| LiveMigrationSucceeded | The Virtual Machine was briefly paused as a Live Migration operation was successfully performed on your Virtual Machine. This operation was carried out either as a repair action, for allocation optimization or as part of routine maintenance workflows. No other action is required at this time. For more information, see [Live Migration](../virtual-machines/maintenance-and-updates.md#live-migration). |
| LiveMigrationFailure | A Live Migration operation was attempted on your Virtual Machine as either a repair action, for allocation optimization or as part of routine maintenance workflows. This operation, however, could not be successfully completed and may have resulted in a brief pause of your Virtual Machine. No other action is required at this time. <br/> Also note that [M Series](../virtual-machines/m-series.md), [L Series](../virtual-machines/lasv3-series.md) VM SKUs are not applicable for Live Migration. For more information, see [Live Migration](../virtual-machines/maintenance-and-updates.md#live-migration). |
The below table summarizes all the annotations that the platform emits today:
| VirtualMachineStartInitiatedByControlPlane | The Virtual Machine is starting as requested by an authorized user or process. No other action is required at this time. |
| VirtualMachineStopInitiatedByControlPlane | The Virtual Machine is stopping as requested by an authorized user or process. No other action is required at this time. |
| VirtualMachineStoppedInternally | The Virtual Machine is stopping as requested by an authorized user or process, or due to a guest activity from within the Virtual Machine. No other action is required at this time. |
-| VirtualMachineProvisioningTimedOut | The Virtual Machine provisioning has failed due to Guest OS issues or incorrect user run scripts. You can attempt to either re-create this Virtual Machine. If this Virtual Machine is part of a virtual machine scale set, you can try reimaging it. |
-
+| VirtualMachineProvisioningTimedOut | The Virtual Machine provisioning has failed due to Guest OS issues or incorrect user run scripts. You can attempt to either re-create this Virtual Machine or if this Virtual Machine is part of a virtual machine scale set, you can try reimaging it. |
| AccelnetUnhealthy | Applicable if Accelerated Networking is enabled for your Virtual Machine. We have detected that the Accelerated Networking feature is not functioning as expected. You can attempt to redeploy your Virtual Machine to potentially mitigate the issue. |
static-web-apps Deploy Nextjs Static Export https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/static-web-apps/deploy-nextjs-static-export.md
By default, the application is treated as a hybrid rendered Next.js application,
uses: azure/static-web-apps-deploy@latest with: azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_TOKEN }}
- repo_token: ${{ secrets.GITHUB_TOKEN }} # Used for Github integrations (i.e. PR comments)
+ repo_token: ${{ secrets.GITHUB_TOKEN }} # Used for GitHub integrations (i.e. PR comments)
action: "upload" app_location: "/" # App source code path api_location: "" # Api source code path - optional
static-web-apps Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/static-web-apps/monitor.md
Use the following steps to add Application Insights monitoring to your static we
Once you create the Application Insights instance, it creates an associated application setting in the Azure Static Web Apps instance used to link the services together.

> [!NOTE]
-> If you want to track how the different features of your web app are used end-to-end client side, you can insert trace calls in your JavaScript code. For more information, see [Application Insights for webpages](/azure-monitor/app/javascript?tabs=snippet).
+> If you want to track how the different features of your web app are used end-to-end client side, you can insert trace calls in your JavaScript code. For more information, see [Application Insights for webpages](/azure/azure-monitor/app/javascript?tabs=snippet).
## Access data
static-web-apps Publish Hugo https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/static-web-apps/publish-hugo.md
jobs:
runs-on: ubuntu-latest name: Build and Deploy Job steps:
- - uses: actions/checkout@v2
+ - uses: actions/checkout@v3
with: submodules: true - name: Build And Deploy
If your Hugo application uses the [Git Info feature](https://gohugo.io/variables
Update your workflow file to [fetch your full Git history](https://github.com/actions/checkout/blob/main/README.md#fetch-all-history-for-all-tags-and-branches) by adding a new parameter under the `actions/checkout` step to set the `fetch-depth` to `0` (no limit): ```yaml
- - uses: actions/checkout@v2
+ - uses: actions/checkout@v3
with: submodules: true fetch-depth: 0
static-web-apps Review Publish Pull Requests https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/static-web-apps/review-publish-pull-requests.md
Title: Review pull requests in pre-production environments in Azure Static Web Apps
+ Title: Review pull requests in pre-production environments
description: Learn how to use pre-production environments to review pull requests changes in Azure Static Web Apps. + Previously updated : 05/08/2020 Last updated : 10/27/2022
-# Review pull requests in pre-production environments in Azure Static Web Apps
+# Review pull requests in pre-production environments
-This article demonstrates how to use pre-production environments to review changes to applications deployed with [Azure Static Web Apps](overview.md).
+This article shows you how to use pre-production environments to review changes to applications that are deployed with [Azure Static Web Apps](overview.md). A pre-production environment is a fully functional staged version of your application that includes changes not available in production.
-A pre-production (staging) environment is a fully-functional staged version of your application that includes changes not available in production.
+Azure Static Web Apps generates a GitHub Actions workflow in the repo. When a pull request is created against a branch that the workflow watches, the pre-production environment gets built. The pre-production environment stages the app, so you can review the changes before you push them to production.
-Azure Static Web Apps generates a GitHub Actions workflow in the repository. When a pull request is created against a branch that the workflow watches, the pre-production environment is built. The pre-production environment stages the app, enables you to perform reviews before pushing them to production.
+You can do the following tasks within pre-production environments:
-Multiple pre-production environments can co-exist at the same time when using Azure Static Web Apps. Each time you create a pull request against the watched branch, a staged version with your changes is deployed to a distinct pre-production environment.
-
-There are many benefits of using pre-production environments. For example, you can:
-
-- Review visual changes between production and staging. For example, viewing updates to content and layout.
-- Demonstrate the changes to your team.
-- Compare different versions of your application.
-- Validate changes using acceptance tests.
-- Perform sanity checks before deploying to production.
+- Review visual changes between production and staging, like updates to content and layout
+- Demonstrate the changes to your team
+- Compare different versions of your application
+- Validate changes using acceptance tests
+- Perform sanity checks before you deploy to production
> [!NOTE]
-> Pull requests and pre-production environments are currently only supported in GitHub Actions deployments.
+> Pull requests and pre-production environments are only supported in GitHub Actions deployments.
## Prerequisites

-- An existing GitHub repository configured with Azure Static Web Apps. See [Building your first static app](getting-started.md) if you don't have one.
+- An existing GitHub repo configured with Azure Static Web Apps. See [Building your first static app](getting-started.md) if you don't have one.
## Make a change
-Begin by making a change in your repository. You can do it directly on GitHub as shown in the following steps.
+Make a change in your repo directly on GitHub, as shown in the following steps.
-1. Go to your project's repository on GitHub, then select **Branch** to create a new branch.
+1. Go to your project's repo on GitHub, and then select **Branch**.
:::image type="content" source="./media/review-publish-pull-requests/create-branch.png" alt-text="Create new branch using GitHub interface":::
- Type a branch name and select **Create branch**.
+1. Enter a branch name and select **Create branch**.
-2. Go to your _app_ folder and change some text content. For example, you can change a title or paragraph. Once you found the file you want to edit, select **Edit** to make the change.
+1. Go to your _app_ folder and change some text content, like a title or paragraph. Select **Edit** to make the change in the file.
:::image type="content" source="./media/review-publish-pull-requests/edit-file.png" alt-text="Edit file button in GitHub interface":::
-3. After you make the changes, select **Commit changes** to commit your changes to the branch.
+1. Select **Commit changes** when you're done.
- :::image type="content" source="./media/review-publish-pull-requests/commit-changes.png" alt-text="Commit changes button in GitHub interface":::
+ :::image type="content" source="./media/review-publish-pull-requests/commit-changes.png" alt-text="Screenshot showing the Commit changes button in the GitHub interface.":::
## Create a pull request
-Next, create a pull request from this change.
+Create a pull request to publish your update.
-1. Open the **Pull request** tab of your project on GitHub:
+1. Open the **Pull request** tab of your project on GitHub.
- :::image type="content" source="./media/review-publish-pull-requests/tab.png" alt-text="Pull request tab in a GitHub repository":::
+ :::image type="content" source="./media/review-publish-pull-requests/tab.png" alt-text="Screenshot showing the pull request tab in a GitHub repo.":::
-2. Select **Compare & pull request** on your branch.
+1. Select **Compare & pull request**.
-3. You can optionally fill in some details about your changes, then select **Create pull request**.
+1. Optionally, enter details about your changes, and then select **Create pull request**.
- :::image type="content" source="./media/review-publish-pull-requests/open.png" alt-text="Pull request creation in GitHub":::
+ :::image type="content" source="./media/review-publish-pull-requests/open.png" alt-text="Screenshot showing the pull request creation in GitHub.":::
-You can assign reviewers and add comments to discuss your changes if needed.
+Assign reviewers and add comments to discuss your changes, if needed.
-> [!NOTE]
-> You can make multiple changes by pushing new commits to your branch. The pull request is then automatically updated to reflect all changes.
+Multiple pre-production environments can co-exist at the same time when you use Azure Static Web Apps. Each time you create a pull request against the watched branch, a staged version with your changes deploys to a distinct pre-production environment.
-## Review changes
+You can make multiple changes and push new commits to your branch. The pull request automatically updates to reflect all changes.
-After the pull request is created, the [GitHub Actions](https://github.com/features/actions) deployment workflow runs and deploys your changes to a pre-production environment.
+## Review changes
-Once the workflow has completed building and deploying your app, the GitHub bot adds a comment to your pull request which contains the URL of the pre-production environment. You can select this link to see your staged changes.
+The [GitHub Actions](https://github.com/features/actions) deployment workflow runs and deploys your pull request changes to a pre-production environment.
+Once the workflow completes building and deploying your app, the GitHub bot adds a comment to your pull request, which contains the URL of the pre-production environment.
-Select the generated URL to see the changes.
+1. Select the pre-production URL to see your staged changes.
-If you take a closer look at the URL, you can see that it's composed like this: `https://<SUBDOMAIN-PULL_REQUEST_ID>.<AZURE_REGION>.azurestaticapps.net`.
+ :::image type="content" source="./media/review-publish-pull-requests/bot-comment.png" alt-text="Screenshot of pull request comment with the pre-production URL.":::
-For a given pull request, the URL remains the same even if you push new updates. In addition to the URL staying constant, the same pre-production environment is reused for the life of the pull request.
+ The URL is composed like this: `https://<SUBDOMAIN-PULL_REQUEST_ID>.<AZURE_REGION>.azurestaticapps.net`. For a given pull request, the URL remains the same, even if you push new updates. The same pre-production environment also gets reused for the life of the pull request.
-To automate the review process with end-to-end testing, the [Azure Static Web App Deploy GitHub Action](https://github.com/Azure/static-web-apps-deploy) has the `static_web_app_url` output variable.
-This URL can be referenced in the rest of your workflow to run your tests against the pre-production environment.
+To automate the review process with end-to-end testing, the [GitHub Action for deploying Azure Static Web Apps](https://github.com/Azure/static-web-apps-deploy) has the `static_web_app_url` output variable.
+You can reference this URL in the rest of your workflow to run your tests against the pre-production environment.
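For example, a workflow might capture that output and run tests against it. In this sketch, the step `id` value `swa_deploy` and the `npm run test:e2e` command are placeholders; adapt them to your own workflow:

```yaml
      - name: Build And Deploy
        id: swa_deploy
        uses: azure/static-web-apps-deploy@latest
        with:
          azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_TOKEN }}
          repo_token: ${{ secrets.GITHUB_TOKEN }}
          action: "upload"
          app_location: "/"

      - name: Run tests against the pre-production environment
        run: npm run test:e2e -- --base-url "${{ steps.swa_deploy.outputs.static_web_app_url }}"
```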
## Publish changes
-Once changes are approved, you can publish your changes to production by merging the pull request.
+Merge the pull request to publish to production.
-Select **Merge pull request**:
+1. Select **Merge pull request**.
+ :::image type="content" source="./media/review-publish-pull-requests/merge.png" alt-text="Screenshot showing the Merge pull request button in GitHub interface.":::
-Merging copies your changes to the tracked branch (the "production" branch). Then, the deployment workflow starts on the tracked branch and the changes are live after your application has been rebuilt.
+ Your changes get copied to the tracked branch (the "production" branch). Then, the deployment workflow starts on the tracked branch and the changes go live after your application rebuilds.
-To verify the changes in production, open your production URL to load the live version of the website.
+1. Open your production URL to load the live version of the website and verify.
## Limitations

-- Staged versions of your application are currently accessible publicly by their URL, even if your GitHub repository is private.
+- Anyone can access the staged versions of your application via their URL, even if your GitHub repo is private.
> [!WARNING]
- > Be careful when publishing sensitive content to staged versions, as access to pre-production environments are not restricted.
--- The number of pre-production environments available for each app deployed with Static Web Apps depends on the [hosting plan](plans.md) you are using. For example, with the Free tier you can have 3 pre-production environments in addition to the production environment.--- Pre-production environments are not geo-distributed.
+ > Be careful with sensitive content, since anyone can access pre-production environments.
-- Currently, only GitHub Actions deployments support pre-production environments.
+- The number of pre-production environments available for each app deployed with Static Web Apps depends your [hosting plan](plans.md). For example, with the [Free tier](https://azure.microsoft.com/pricing/details/devops/azure-devops-services/) you can have three pre-production environments along with the production environment.
+- Pre-production environments aren't geo-distributed.
+- Only GitHub Actions deployments support pre-production environments.
## Next steps

> [!div class="nextstepaction"]
-> [Branch preview environments](branch-environments.md)
+> [Create branch preview environments](branch-environments.md)
storage Anonymous Read Access Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/anonymous-read-access-configure.md
Previously updated : 03/01/2022 Last updated : 10/28/2022 -+ ms.devlang: azurecli
storage Anonymous Read Access Prevent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/anonymous-read-access-prevent.md
Previously updated : 12/09/2020 Last updated : 10/28/2022 -+ # Prevent anonymous public read access to containers and blobs
+This article describes how to use a DRAG (Detection-Remediation-Audit-Governance) framework to continuously manage public access for your storage accounts.
+ Anonymous public read access to containers and blobs in Azure Storage is a convenient way to share data, but may also present a security risk. It's important to manage anonymous access judiciously and to understand how to evaluate anonymous access to your data. Operational complexity, human error, or malicious attack against data that is publicly accessible can result in costly data breaches. Microsoft recommends that you enable anonymous access only when necessary for your application scenario.

By default, public access to your blob data is always prohibited. However, the default configuration for a storage account permits a user with appropriate permissions to configure public access to containers and blobs in a storage account. For enhanced security, you can disallow all public access to the storage account, regardless of the public access setting for an individual container. Disallowing public access to the storage account prevents a user from enabling public access for a container in the account.

Microsoft recommends that you disallow public access to a storage account unless your scenario requires it. Disallowing public access helps to prevent data breaches caused by undesired anonymous access. When you disallow public blob access for the storage account, Azure Storage rejects all anonymous requests to that account. After public access is disallowed for an account, containers in that account cannot subsequently be configured for public access. Any containers that have already been configured for public access will no longer accept anonymous requests. For more information, see [Configure anonymous public read access for containers and blobs](anonymous-read-access-configure.md).
-This article describes how to use a DRAG (Detection-Remediation-Audit-Governance) framework to continuously manage public access for your storage accounts.
+> [!WARNING]
+> When a container is configured for public access, any client can read data in that container. Public access presents a potential security risk, so if your scenario does not require it, Microsoft recommends that you disallow it for the storage account.
## Detect anonymous requests from client applications
storage Storage Blob Change Feed https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-change-feed.md
description: Learn about change feed logs in Azure Blob Storage and how to use t
Previously updated : 06/15/2022 Last updated : 10/28/2022 -+ # Change feed support in Azure Blob Storage
This section describes known issues and conditions in the current release of the
- The `url` property of the log file is currently always empty.
- The `LastConsumable` property of the segments.json file does not list the very first segment that the change feed finalizes. This issue occurs only after the first segment is finalized. All subsequent segments after the first hour are accurately captured in the `LastConsumable` property.
- You currently cannot see the **$blobchangefeed** container when you call the ListContainers API. You can view the contents by calling the ListBlobs API on the $blobchangefeed container directly.
+- [Storage account failover](../common/storage-disaster-recovery-guidance.md) is not supported on accounts with the change feed enabled. Disable the change feed before initiating a failover.
- Storage accounts that have previously initiated an [account failover](../common/storage-disaster-recovery-guidance.md) may have issues with the log file not appearing. Any future account failovers may also impact the log file. ## Feature support
storage Storage Quickstart Blobs Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-quickstart-blobs-nodejs.md
Title: "Quickstart: Azure Blob storage library v12 - JavaScript"
-description: In this quickstart, you learn how to use the Azure Blob storage blob npm package version 12 for JavaScript to create a container and a blob in Blob (object) storage. Next, you learn how to download the blob to your local computer, and how to list all of the blobs in a container.
+ Title: "Quickstart: Azure Blob storage library - JavaScript"
+description: In this quickstart, you learn how to use the Azure Blob Storage client library for JavaScript to create a container and a blob in Blob (object) storage. Next, you learn how to download the blob to your local computer, and how to list all of the blobs in a container.
Previously updated : 09/13/2022 Last updated : 10/28/2022
ms.devlang: javascript
-# Quickstart: Manage blobs with JavaScript v12 SDK in Node.js
+# Quickstart: Manage blobs with JavaScript SDK in Node.js
In this quickstart, you learn to manage blobs by using Node.js. Blobs are objects that can hold large amounts of text or binary data, including images, documents, streaming media, and archive data.
-These [**example code**](https://github.com/Azure-Samples/AzureStorageSnippets/tree/master/blobs/quickstarts/JavaScript/V12/nodejs) snippets show you how to perform the following with the Azure Blob storage package library for JavaScript:
-
-- [Get the connection string](#get-the-connection-string)
-- [Create a container](#create-a-container)
-- [Upload blobs to a container](#upload-blobs-to-a-container)
-- [List the blobs in a container](#list-the-blobs-in-a-container)
-- [Download blobs](#download-blobs)
-- [Delete a container](#delete-a-container)
-
-Additional resources:
- [API reference](/javascript/api/@azure/storage-blob) | [Library source code](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/storage/storage-blob) | [Package (npm)](https://www.npmjs.com/package/@azure/storage-blob) | [Samples](../common/storage-samples-javascript.md?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#blob-samples)
Additional resources:
- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
- An Azure Storage account. [Create a storage account](../common/storage-account-create.md).
- [Node.js LTS](https://nodejs.org/en/download/).
-- [Microsoft Visual Studio Code](https://code.visualstudio.com)
-
-## Object model
-
-Azure Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn't adhere to a particular data model or definition, such as text or binary data. Blob storage offers three types of resources:
-- The storage account
-- A container in the storage account
-- A blob in the container
-
-The following diagram shows the relationship between these resources.
-
-![Diagram of Blob storage architecture](./media/storage-blobs-introduction/blob1.png)
-
-Use the following JavaScript classes to interact with these resources:
-
-- [BlobServiceClient](/javascript/api/@azure/storage-blob/blobserviceclient): The `BlobServiceClient` class allows you to manipulate Azure Storage resources and blob containers.
-- [ContainerClient](/javascript/api/@azure/storage-blob/containerclient): The `ContainerClient` class allows you to manipulate Azure Storage containers and their blobs.
-- [BlobClient](/javascript/api/@azure/storage-blob/blobclient): The `BlobClient` class allows you to manipulate Azure Storage blobs.

## Create the Node.js project
-Create a JavaScript application named *blob-quickstart-v12*.
+Create a JavaScript application named *blob-quickstart*.
1. In a console window (such as cmd, PowerShell, or Bash), create a new directory for the project.

    ```console
- mkdir blob-quickstart-v12
+ mkdir blob-quickstart
```
-1. Switch to the newly created *blob-quickstart-v12* directory.
+1. Switch to the newly created *blob-quickstart* directory.
```console
- cd blob-quickstart-v12
+ cd blob-quickstart
    ```

1. Create a *package.json*.
Create a JavaScript application named *blob-quickstart-v12*.
code . ```
-## Install the npm package for blob storage
+## Install the packages
+
+From the project directory, install the following packages using the `npm install` command.
1. Install the Azure Storage npm package:

    ```console
    npm install @azure/storage-blob
    ```
+
+1. Install the Azure Identity npm package for a passwordless connection:
+
+ ```console
+ npm install @azure/identity
+ ```
1. Install other dependencies used in this quickstart:
From the project directory:
1. Create a new file named `index.js`. 1. Copy the following code into the file. More code will be added as you go through this quickstart.
- ```javascript
- const { BlobServiceClient } = require('@azure/storage-blob');
- const { v1: uuidv1} = require('uuid');
- require('dotenv').config()
-
- async function main() {
- console.log('Azure Blob storage v12 - JavaScript quickstart sample');
+ :::code language="javascript" source="~/azure_storage-snippets/blobs/quickstarts/JavaScript/V12/nodejs/boilerplate.js" :::
+
+## Object model
+
+Azure Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn't adhere to a particular data model or definition, such as text or binary data. Blob storage offers three types of resources:
+
+- The storage account
+- A container in the storage account
+- A blob in the container
+
+The following diagram shows the relationship between these resources.
+
+![Diagram of Blob storage architecture](./media/storage-blobs-introduction/blob1.png)
+
+Use the following JavaScript classes to interact with these resources:
+
+- [BlobServiceClient](/javascript/api/@azure/storage-blob/blobserviceclient): The `BlobServiceClient` class allows you to manipulate Azure Storage resources and blob containers.
+- [ContainerClient](/javascript/api/@azure/storage-blob/containerclient): The `ContainerClient` class allows you to manipulate Azure Storage containers and their blobs.
+- [BlobClient](/javascript/api/@azure/storage-blob/blobclient): The `BlobClient` class allows you to manipulate Azure Storage blobs.
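The three resource levels these classes model also show up in the blob endpoint URL itself. The sketch below is only an illustration with placeholder names (not real resources); with the client library you walk the same hierarchy via `BlobServiceClient`, `getContainerClient`, and `getBlobClient`.

```javascript
// Illustration only: how the three resource levels map onto a blob URL.
// The names below are placeholders, not real Azure resources.
const account = "mystorageacct";          // the storage account
const container = "quickstart-container"; // a container in the account
const blobName = "hello.txt";             // a blob in the container

// Each level of the hierarchy adds one URL segment:
const accountUrl = `https://${account}.blob.core.windows.net`;
const containerUrl = `${accountUrl}/${container}`;
const blobUrl = `${containerUrl}/${blobName}`;

console.log(blobUrl);
// https://mystorageacct.blob.core.windows.net/quickstart-container/hello.txt
```

In the real SDK, `BlobServiceClient` corresponds to the account URL, `blobServiceClient.getContainerClient(container)` to the container URL, and `containerClient.getBlobClient(blobName)` to the blob URL.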
+
- // Quick start code goes here
+## Code examples
- }
+These example code snippets show you how to do the following tasks with the Azure Blob Storage client library for JavaScript:
- main()
- .then(() => console.log('Done'))
- .catch((ex) => console.log(ex.message));
+- [Authenticate to Azure and authorize access to blob data](#authenticate-to-azure-and-authorize-access-to-blob-data)
+- [Create a container](#create-a-container)
+- [Upload blobs to a container](#upload-blobs-to-a-container)
+- [List the blobs in a container](#list-the-blobs-in-a-container)
+- [Download blobs](#download-blobs)
+- [Delete a container](#delete-a-container)
+
+Sample code is also available on [GitHub](https://github.com/Azure-Samples/AzureStorageSnippets/tree/master/blobs/quickstarts/JavaScript/V12/nodejs).
+
+### Authenticate to Azure and authorize access to blob data
++
+### [Passwordless (Recommended)](#tab/managed-identity)
+
+`DefaultAzureCredential` supports multiple authentication methods and determines which method should be used at runtime. This approach enables your app to use different authentication methods in different environments (local vs. production) without implementing environment-specific code.
+
+The order and locations in which `DefaultAzureCredential` looks for credentials can be found in the [Azure Identity library overview](/javascript/api/overview/azure/identity-readme#defaultazurecredential).
+
+For example, your app can authenticate using your Azure CLI sign-in credentials when developing locally. Your app can then use a [managed identity](/azure/active-directory/managed-identities-azure-resources/overview) once it has been deployed to Azure. No code changes are required for this transition.
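The fallback behavior described above can be pictured as a simple chain: try each credential source in priority order and use the first one that yields a token. This is only a sketch of the idea, not the real `@azure/identity` implementation, and the source names are invented for illustration.

```javascript
// Sketch of a credential chain (illustrative, not the real @azure/identity code).
// Each "source" returns a token string, or null if that method isn't available.
function firstAvailableToken(sources) {
  for (const source of sources) {
    const token = source();
    if (token !== null) return token;
  }
  throw new Error("No credential source available");
}

// Hypothetical sources in priority order: environment, managed identity, CLI login.
const fromEnvironment = () => null;               // no service principal configured locally
const fromManagedIdentity = () => null;           // not running on an Azure host
const fromCliLogin = () => "token-from-az-login"; // developer ran `az login`

console.log(firstAvailableToken([fromEnvironment, fromManagedIdentity, fromCliLogin]));
// token-from-az-login
```

When the same code runs in Azure, the managed identity source succeeds first instead, which is why no code changes are needed between environments.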
+
+#### Assign roles to your Azure AD user account
++
+#### Sign in and connect your app code to Azure using DefaultAzureCredential
+
+You can authorize access to data in your storage account using the following steps:
+
+1. Make sure you're authenticated with the same Azure AD account you assigned the role to on your storage account. You can authenticate via the Azure CLI, Visual Studio Code, or Azure PowerShell.
+
+ #### [Azure CLI](#tab/sign-in-azure-cli)
+
+ Sign in to Azure through the Azure CLI using the following command:
+
+ ```azurecli
+ az login
```
+ #### [Visual Studio Code](#tab/sign-in-visual-studio-code)
-## Get the connection string
+ You'll need to [install the Azure CLI](/cli/azure/install-azure-cli) to work with `DefaultAzureCredential` through Visual Studio Code.
-The code below retrieves the connection string for the storage account from the environment variable created in the [Configure your storage connection string](#configure-your-storage-connection-string) section.
+ On the main menu of Visual Studio Code, navigate to **Terminal > New Terminal**.
-Add this code inside the `main` function:
Sign in to Azure through the Azure CLI using the following command:
+
+ ```azurecli
+ az login
+ ```
+
+ #### [PowerShell](#tab/sign-in-powershell)
+
+ Sign in to Azure using PowerShell via the following command:
+
+ ```azurepowershell
+ Connect-AzAccount
+ ```
+2. To use `DefaultAzureCredential`, make sure that the **@azure/identity** package is [installed](#install-the-packages), and the class is imported:
+
+ :::code language="javascript" source="~/azure_storage-snippets/blobs/quickstarts/JavaScript/V12/nodejs/index.js" id="snippet_StorageAcctInfo_without_secrets":::
+
+3. Add this code inside the `try` block. When the code runs on your local workstation, `DefaultAzureCredential` authenticates to Azure using the developer credentials of the highest-priority tool you're signed in to, such as the Azure CLI or Visual Studio Code.
+
+ :::code language="javascript" source="~/azure_storage-snippets/blobs/quickstarts/JavaScript/V12/nodejs/index.js" id="snippet_StorageAcctInfo_create_client":::
+
+4. Make sure to update the storage account name, `AZURE_STORAGE_ACCOUNT_NAME`, in the `.env` file or your environment variables. The storage account name can be found on the overview page of the Azure portal.
+
+ :::image type="content" source="./media/storage-quickstart-blobs-python/storage-account-name.png" alt-text="A screenshot showing how to find the storage account name.":::
+
+ > [!NOTE]
+ > When deployed to Azure, this same code can be used to authorize requests to Azure Storage from an application running in Azure. However, you'll need to enable managed identity on your app in Azure. Then configure your storage account to allow that managed identity to connect. For detailed instructions on configuring this connection between Azure services, see the [Auth from Azure-hosted apps](/azure/developer/javascript/sdk/authentication-azure-hosted-apps) tutorial.
+
+### [Connection String](#tab/connection-string)
+
+A connection string includes the storage account access key and uses it to authorize requests. Take care never to expose the access key in an insecure location.
+
+> [!NOTE]
+> To authorize data access with the storage account access key, you'll need permissions for the following Azure RBAC action: [Microsoft.Storage/storageAccounts/listkeys/action](../../role-based-access-control/resource-provider-operations.md#microsoftstorage). The least privileged built-in role with permissions for this action is [Reader and Data Access](../../role-based-access-control/built-in-roles.md#reader-and-data-access), but any role which includes this action will work.
++
+#### Configure your storage connection string
+
+After you copy the connection string, write it to a new environment variable on the local machine running the application. To set the environment variable, open a console window, and follow the instructions for your operating system. Replace `<yourconnectionstring>` with your actual connection string.
+
+**Windows**:
+
+```cmd
+setx AZURE_STORAGE_CONNECTION_STRING "<yourconnectionstring>"
+```
+
+After you add the environment variable in Windows, you must start a new instance of the command window.
+
+**Linux**:
+
+```bash
+export AZURE_STORAGE_CONNECTION_STRING="<yourconnectionstring>"
+```
+
+**.env file**:
+
+```bash
+AZURE_STORAGE_CONNECTION_STRING="<yourconnectionstring>"
+```
+
+The code below retrieves the connection string for the storage account from the environment variable created earlier, and uses the connection string to construct a service client object.
+
+Add this code inside the `try` block:
++
+> [!IMPORTANT]
+> The account access key should be used with caution. If your account access key is lost or accidentally placed in an insecure location, your service may become vulnerable. Anyone who has the access key is able to authorize requests against the storage account, and effectively has access to all the data. `DefaultAzureCredential` provides enhanced security features and benefits and is the recommended approach for managing authorization to Azure services.
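For reference, a connection string is a semicolon-delimited list of `key=value` pairs. The client library parses it for you (via `BlobServiceClient.fromConnectionString`); the sketch below only illustrates the string's shape, using an obviously fake placeholder key.

```javascript
// Illustrative parser for the key=value;key=value shape of a connection string.
// The sample string below uses placeholder values, not real credentials.
function parseConnectionString(conn) {
  const parts = {};
  for (const segment of conn.split(";")) {
    if (!segment) continue;
    const idx = segment.indexOf("=");                       // split on the first '=' only,
    parts[segment.slice(0, idx)] = segment.slice(idx + 1);  // since key values may contain '='
  }
  return parts;
}

const sample =
  "DefaultEndpointsProtocol=https;AccountName=mystorageacct;" +
  "AccountKey=PLACEHOLDERKEY==;EndpointSuffix=core.windows.net";

const parsed = parseConnectionString(sample);
console.log(parsed.AccountName); // mystorageacct
```

Splitting on the first `=` matters because the base64-encoded account key itself ends in `=` padding characters.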
++

## Create a container
The preceding code cleans up the resources the app created by removing the entir
2. The output of the app is similar to the following example:

    ```output
- Azure Blob storage v12 - JavaScript quickstart sample
+ Azure Blob storage - JavaScript quickstart sample
Creating container... quickstart4a0780c0-fb72-11e9-b7b9-b387d3c488da
This quickstart created a container and blob on the Azure cloud. You can also us
## Clean up
-1. When you're done with this quickstart, delete the `blob-quickstart-v12` directory.
+1. When you're done with this quickstart, delete the `blob-quickstart` directory.
1. If you're done using your Azure Storage resource, use the [Azure CLI to remove the Storage resource](storage-quickstart-blobs-cli.md#clean-up-resources).

## Next steps
For tutorials, samples, quickstarts, and other documentation, visit:
> [Azure for JavaScript developer center](/azure/developer/javascript/)

- To learn how to deploy a web app that uses Azure Blob storage, see [Tutorial: Upload image data in the cloud with Azure Storage](./storage-upload-process-images.md?preserve-view=true&tabs=javascript)
-- To see Blob storage sample apps, continue to [Azure Blob storage package library v12 JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/storage/storage-blob/samples).
-- To learn more, see the [Azure Blob storage client library for JavaScript](https://github.com/Azure/azure-sdk-for-js/blob/master/sdk/storage/storage-blob).
+- To see Blob storage sample apps, continue to [Azure Blob storage package library JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/storage/storage-blob/samples).
+- To learn more, see the [Azure Blob storage client library for JavaScript](https://github.com/Azure/azure-sdk-for-js/blob/master/sdk/storage/storage-blob).
storage Redundancy Migration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/redundancy-migration.md
Previously updated : 09/21/2022 Last updated : 10/28/2022 -+ # Change how a storage account is replicated
For more detailed guidance on how to perform a manual migration, see [Move an Az
Limitations apply to some replication change scenarios depending on:

-- [Storage account type](#storage-account-type)
- [Region](#region)
+- [Feature conflicts](#feature-conflicts)
+- [Storage account type](#storage-account-type)
- [Access tier](#access-tier)
- [Protocol support](#protocol-support)
- [Failover and failback](#failover-and-failback)
+### Region
+
+Make sure the region where your storage account is located supports all of the desired replication settings. For example, if you are converting your account to zone-redundant (ZRS, GZRS, or RA-GZRS), make sure your storage account is in a region that supports it. See the lists of supported regions for [Zone-redundant storage](storage-redundancy.md#zone-redundant-storage) and [Geo-zone-redundant storage](storage-redundancy.md#geo-zone-redundant-storage).
+
+The [customer-initiated conversion (preview)](#customer-initiated-conversion-preview) to ZRS is available in all public ZRS regions except for the following:
+
+- (Europe) West Europe
+- (Europe) UK South
+- (North America) Canada Central
+- (North America) East US
+- (North America) East US 2
+
+### Feature conflicts
+
+Some storage account features are not compatible with other features or operations. For example, the ability to fail over to the secondary region is the key feature of geo-redundancy, but other features are not compatible with failover. For more information about features and services not supported with failover, see [Unsupported features and services](storage-disaster-recovery-guidance.md#unsupported-features-and-services). Converting an account to GRS, GZRS, or RA-GZRS might be blocked if a conflicting feature is enabled, or you might need to disable the conflicting feature before initiating a failover.
+
### Storage account type

When planning to change your replication settings, consider the following limitations related to the storage account type.
az storage account update -g <resource_group> -n <storage_account> --set kind=St
To manually migrate your ZRS Classic account data to another type of replication, follow the steps to [perform a manual migration](#manual-migration).
-### Region
-
-Make sure the region where your storage account is located supports all of the desired replication settings. For example, if you are converting your account to zone-redundant (ZRS, GZRS, or RA-GZRS), make sure your storage account is in a region that supports it. See the lists of supported regions for [Zone-redundant storage](storage-redundancy.md#zone-redundant-storage) and [Geo-zone-redundant storage](storage-redundancy.md#geo-zone-redundant-storage).
-
-The [customer-initiated conversion (preview)](#customer-initiated-conversion-preview) to ZRS is available in all public ZRS regions except for the following:
-
-- (Europe) West Europe
-- (Europe) UK South
-- (North America) Canada Central
-- (North America) East US
-- (North America) East US 2
-
If you want to migrate your data into a zone-redundant storage account located in a region different from the source account, you must perform a manual migration. For more details, see [Move an Azure Storage account to another region](storage-account-move.md).

### Access tier
storage Storage Disaster Recovery Guidance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/common/storage-disaster-recovery-guidance.md
Previously updated : 03/01/2022 Last updated : 10/28/2022 + # Disaster recovery and storage account failover
Keep in mind that any data stored in a temporary disk is lost when the VM is shu
The following features and services are not supported for account failover:
+- Storage accounts that have [change feed](../blobs/storage-blob-change-feed.md) enabled are not supported for failover. For example, [operational backup of Azure Blob Storage](../../backup/blob-backup-support-matrix.md#limitations) requires the change feed. For this reason, storage accounts that have operational backup configured do not support failover. You must disable operational backup and any other features that require the change feed before initiating a failover.
- Azure File Sync does not support storage account failover. Storage accounts containing Azure file shares being used as cloud endpoints in Azure File Sync should not be failed over. Doing so will cause sync to stop working and may also cause unexpected data loss in the case of newly tiered files.
- Storage accounts that have hierarchical namespace enabled (such as for Data Lake Storage Gen2) are not supported at this time.
- A storage account containing premium block blobs cannot be failed over. Storage accounts that support premium block blobs do not currently support geo-redundancy.
storage Elastic San Create https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/elastic-san/elastic-san-create.md
The following command creates an Elastic SAN that uses locally-redundant storage
```azurecli
## Variables
-$sanName="yourSANNameHere"
-$resourceGroupName="yourResourceGroupNameHere"
-$sanLocation="desiredRegion"
-$volumeGroupName="desiredVolumeGroupName"
+sanName="yourSANNameHere"
+resourceGroupName="yourResourceGroupNameHere"
+sanLocation="desiredRegion"
+volumeGroupName="desiredVolumeGroupName"
az elastic-san create -n $sanName -g $resourceGroupName -l $sanLocation --base-size-tib 100 --extended-capacity-size-tib 20 --sku "{name:Premium_LRS,tier:Premium}"
```
storage File Sync Troubleshoot Sync Group Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/file-sync/file-sync-troubleshoot-sync-group-management.md
A sync group defines the sync topology for a set of files. Endpoints within a sy
<a id="cloud-endpoint-mgmtinternalerror"></a>**Cloud endpoint creation fails, with this error: "MgmtInternalError"**
This error can occur if the Azure File Sync service cannot access the storage account due to SMB security settings. To enable Azure File Sync to access the storage account, the SMB security settings on the storage account must allow **SMB 3.1.1** protocol version, **NTLM v2** authentication and **AES-128-GCM** encryption. To check the SMB security settings on the storage account, see [SMB security settings](../files/files-smb-protocol.md#smb-security-settings).
+<a id="cloud-endpoint-mgmtforbidden"></a>**Cloud endpoint creation fails, with this error: "MgmtForbidden"**
+This error occurs if the Azure File Sync service cannot access the storage account.
+
+To resolve this issue, perform the following steps:
+- Verify the "Allow trusted Microsoft services to access this storage account" setting is checked on your storage account. To learn more, see [Restrict access to the storage account public endpoint](file-sync-networking-endpoints.md#restrict-access-to-the-storage-account-public-endpoint).
+- Verify the SMB security settings on your storage account. To enable Azure File Sync to access the storage account, the SMB security settings on the storage account must allow **SMB 3.1.1** protocol version, **NTLM v2** authentication and **AES-128-GCM** encryption. To check the SMB security settings on the storage account, see [SMB security settings](../files/files-smb-protocol.md#smb-security-settings).
+
<a id="cloud-endpoint-authfailed"></a>**Cloud endpoint creation fails, with this error: "AuthorizationFailed"**
This error occurs if your user account doesn't have sufficient rights to create a cloud endpoint.
stream-analytics Azure Data Explorer Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/azure-data-explorer-managed-identity.md
Previously updated : 08/12/2022 Last updated : 10/27/2022
-# Use managed identities to access Azure Data Explorer from an Azure Stream Analytics job (preview)
+# Use managed identities to access Azure Data Explorer from an Azure Stream Analytics job
Azure Stream Analytics supports managed identity authentication for Azure Data Explorer output. Managed identities for Azure resources is a cross-Azure feature that enables you to create a secure identity associated with the deployment under which your application code runs. You can then associate that identity with access-control roles that grant custom permissions for accessing specific Azure resources that your application needs.
stream-analytics Azure Database Explorer Output https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/azure-database-explorer-output.md
Previously updated : 09/26/2022 Last updated : 10/27/2022 # Azure Data Explorer output from Azure Stream Analytics
For more information about Azure Data Explorer, visit the [What is Azure Data Ex
To learn more about how to create an Azure Data Explorer cluster by using the Azure portal, visit: [Quickstart: Create an Azure Data Explorer cluster and database](/azure/data-explorer/create-cluster-database-portal/) > [!NOTE]
-> Azure Data Explorer from Azure Stream Analytics supports output to Synapse Data Explorer clusters. To write to your synapse data explorer clusters, you have to specify the url of your cluster in the configuration blade in for Azure Data Explorer output in your Azure Stream Analytics job.
+> Azure Data Explorer output from Azure Stream Analytics supports output to Synapse Data Explorer clusters. To write to your Synapse Data Explorer clusters, you have to specify the URL of your cluster in the configuration blade for Azure Data Explorer output in your Azure Stream Analytics job.
## Output configuration
stream-analytics Postgresql Database Output https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/postgresql-database-output.md
Previously updated : 04/27/2022 Last updated : 10/27/2022 # Azure Database for PostgreSQL output from Azure Stream Analytics
You can use [Azure Database for PostgreSQL](https://azure.microsoft.com/services
Azure Database for PostgreSQL powered by the PostgreSQL community edition is available in three deployment options: * Single Server
-* Flexible Server (Preview)
+* Flexible Server
* Hyperscale (Citus) For more information about Azure Database for PostgreSQL please visit the: [What is Azure Database for PostgreSQL documentation.](../postgresql/overview.md)
stream-analytics Stream Analytics Managed Identities Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/stream-analytics/stream-analytics-managed-identities-overview.md
Previously updated : 08/10/2022 Last updated : 10/27/2022 # Managed identities for Azure Stream Analytics
Stream Analytics supports two types of managed identities:
Below is a table that shows Azure Stream Analytics inputs and outputs that support system-assigned managed identity or user-assigned managed identity:
-| Type |  Adapter | User-assigned managed identity (Preview) | System-assigned managed identity |
+| Type |  Adapter | User-assigned managed identity | System-assigned managed identity |
|--|-||| | Storage Account | Blob/ADLS Gen 2 | Yes | Yes | | Inputs | Event Hubs | Yes | Yes |
virtual-desktop Autoscale Scenarios https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/autoscale-scenarios.md
Before you create your plan, keep the following things in mind:
- Make sure you understand usage patterns before defining your schedule. You'll need to schedule around the following times of day: - Ramp-up: the start of the day, when usage picks up.
- - Peak hours: the time of day when usage is highest.
+ - Peak hours: the time of day when usage is expected to be at its highest.
- Ramp-down: when usage tapers off. This is usually when you shut down your VMs to save costs.
- - Off-peak hours: the time with the lowest possible usage. You can define the maximum number of VMs that can be active during this time.
+ - Off-peak hours: the time of the day when usage is expected to be at its lowest.
- The scaling plan will take effect as soon as you enable it.
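The four schedule phases listed above can be thought of as a mapping from time of day to expected usage. The sketch below is purely illustrative: the boundary hours are assumptions for this example, not Autoscale defaults, and you define your own times in the scaling plan.

```javascript
// Illustrative only: classify an hour of the day into the four schedule phases.
// The boundary hours are assumptions for this example, not Autoscale defaults.
function schedulePhase(hour) {
  if (hour >= 7 && hour < 9) return "ramp-up";     // usage picks up
  if (hour >= 9 && hour < 17) return "peak";       // usage expected at its highest
  if (hour >= 17 && hour < 19) return "ramp-down"; // usage tapers off
  return "off-peak";                               // usage expected at its lowest
}

console.log(schedulePhase(8));  // ramp-up
console.log(schedulePhase(12)); // peak
console.log(schedulePhase(23)); // off-peak
```

Mapping your own observed usage pattern onto boundaries like these is the starting point for defining the schedule in your scaling plan.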
virtual-desktop Configure Adfs Sso https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/configure-adfs-sso.md
Title: Configure Azure Virtual Desktop AD FS single sign-on - Azure
-description: How to configure AD FS single sign-on for a Azure Virtual Desktop environment.
+ Title: Configure single sign-on for Azure Virtual Desktop using AD FS - Azure
+description: How to configure single sign-on for an Azure Virtual Desktop environment using Active Directory Federation Services.
Last updated 06/30/2021
-# Configure AD FS single sign-on for Azure Virtual Desktop
+# Configure single sign-on for Azure Virtual Desktop using AD FS
This article will walk you through the process of configuring Active Directory Federation Service (AD FS) single sign-on (SSO) for Azure Virtual Desktop.
virtual-desktop Configure Single Sign On https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/configure-single-sign-on.md
Title: Configure single sign-on for Azure Virtual Desktop - Azure
-description: How to configure single sign-on for an Azure Virtual Desktop environment.
+ Title: Configure single sign-on for Azure Virtual Desktop using Azure AD Authentication - Azure
+description: How to configure single sign-on for an Azure Virtual Desktop environment using Azure AD Authentication.
Last updated 09/22/2022
-# Configure single sign-on for Azure Virtual Desktop
+# Configure single sign-on for Azure Virtual Desktop using Azure AD Authentication
> [!IMPORTANT]
> Single sign-on using Azure AD authentication is currently in public preview.
> This preview version is provided without a service level agreement, and is not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
-This article will walk you through the process of configuring single sign-on (SSO) using Azure AD authentication for Azure Virtual Desktop (preview). When you enable SSO, you can use passwordless authentication and third-party Identity Providers that federate with Azure AD to sign in to your resources.
+This article will walk you through the process of configuring single sign-on (SSO) using Azure Active Directory (Azure AD) authentication for Azure Virtual Desktop (preview). When you enable SSO, you can use passwordless authentication and third-party Identity Providers that federate with Azure AD to sign in to your resources.
> [!NOTE]
> Azure Virtual Desktop (classic) doesn't support this feature.
virtual-desktop Set Up Scaling Script https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/set-up-scaling-script.md
In this article, you'll learn about the scaling tool that uses an Azure Automation runbook and Azure Logic App to automatically scale session host VMs in your Azure Virtual Desktop environment. To learn more about the scaling tool, see [Scale session hosts using Azure Automation and Azure Logic Apps](scaling-automation-logic-apps.md).

> [!NOTE]
-> You can't scale session hosts using Azure Automation and Azure Logic Apps together with [autoscale](autoscale-scaling-plan.md) on the same host pool. You must use one or the other.
+> - Autoscale is an alternative way to scale session host VMs and is a native feature of Azure Virtual Desktop. We recommend you use Autoscale instead. For more information, see [Autoscale scaling plans](autoscale-scenarios.md).
+>
+> - You can't scale session hosts using Azure Automation and Azure Logic Apps together with [autoscale](autoscale-scaling-plan.md) on the same host pool. You must use one or the other.
## Prerequisites
Before you start setting up the scaling tool, make sure you have the following t
- An [Azure Virtual Desktop host pool](create-host-pools-azure-marketplace.md).
- Session host pool VMs configured and registered with the Azure Virtual Desktop service.
-- A user with the [Contributor role](../role-based-access-control/role-assignments-portal.md) assigned on the Azure subscription.
+- A user with the [*Contributor*](../role-based-access-control/role-assignments-portal.md) role-based access control (RBAC) role assigned on the Azure subscription to create the resources. You'll also need the *Application administrator* and/or *Owner* RBAC role to create a Run As account.
- A Log Analytics workspace (optional).

The machine you use to deploy the tool must have:
First, you'll need an Azure Automation account to run the PowerShell runbook. Th
Now that you have an Azure Automation account, you'll also need to create an Azure Automation Run As account if you don't have one already. This account will let the tool access your Azure resources.
+> [!IMPORTANT]
+> This scaling tool uses a Run As account with Azure Automation. Azure Automation Run As accounts will retire on September 30, 2023. Microsoft won't provide support beyond that date. From now through September 30, 2023, you can continue to use Azure Automation Run As accounts. This scaling tool won't be updated to create the resources using managed identities; however, you can transition to [managed identities](../automation/automation-security-overview.md#managed-identities) and will need to do so before then. For more information, see [Migrate from an existing Run As account to a managed identity](../automation/migrate-run-as-accounts-managed-identity.md).
+>
+> Autoscale is an alternative way to scale session host VMs and is a native feature of Azure Virtual Desktop. We recommend you use Autoscale instead. For more information, see [Autoscale scaling plans](autoscale-scenarios.md).
+ An [Azure Automation Run As account](../automation/manage-runas-account.md) provides authentication for managing resources in Azure with Azure cmdlets. When you create a Run As account, it creates a new service principal user in Azure Active Directory and assigns the Contributor role to the service principal user at the subscription level. An Azure Run As account is a great way to authenticate securely with certificates and a service principal name without needing to store a username and password in a credential object. To learn more about Run As account authentication, see [Limit Run As account permissions](../automation/manage-runas-account.md#limit-run-as-account-permissions).
-Any user who's a member of the Subscription Admins role and coadministrator of the subscription can create a Run As account.
+Any user who's assigned the *Application administrator* and/or *Owner* RBAC role on the subscription can create a Run As account.
To create a Run As account in your Azure Automation account:
If you decided to use Log Analytics, you can view all the log data in a custom l
| project TimeStampUTC = TimeGenerated, TimeStampLocal = TimeStamp_s, HostPool = hostpoolName_s, LineNumAndMessage = logmessage_s, AADTenantId = TenantId ```
-## Report issues
+## Limitations
+
+Here are some limitations with scaling session host VMs with this scaling script:
-Issue reports for the scaling tool are currently being handled by Microsoft Support. When you make an issue report, make sure to follow the instructions in [Reporting issues](#reporting-issues). If you have feedback about the tool or want to request new features, open a GitHub issue labeled *4-WVD-scaling-tool* on the [RDS GitHub page](https://github.com/Azure/RDS-Templates/issues?q=is%3Aissue+is%3Aopen+label%3A4-WVD-scaling-tool).
+- The scaling script doesn't consider time changes between standard time and daylight saving time.
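To see why ignoring daylight saving time matters: a job pinned to a fixed UTC hour shifts by an hour in local business time when clocks change. A small illustration, using `America/New_York` purely as an example zone:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

eastern = ZoneInfo("America/New_York")
utc = ZoneInfo("UTC")

# The same 7:00 AM local ramp-up time maps to different UTC hours
# depending on whether daylight saving time is in effect.
summer = datetime(2022, 7, 1, 7, 0, tzinfo=eastern)   # EDT, UTC-4
winter = datetime(2022, 12, 1, 7, 0, tzinfo=eastern)  # EST, UTC-5

print(summer.astimezone(utc).hour)  # 11
print(winter.astimezone(utc).hour)  # 12
```

A schedule that fires at a fixed 11:00 UTC would therefore ramp up at 7 AM local time in summer but 6 AM in winter.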
virtual-desktop Store Fslogix Profile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/store-fslogix-profile.md
Title: Storage FSLogix profile container Azure Virtual Desktop - Azure
description: Options for storing your Azure Virtual Desktop FSLogix profile on Azure Storage. Previously updated : 01/04/2021 Last updated : 10/27/2022
The following tables compare the storage solutions Azure Storage offers for Azur
|Features|Azure Files|Azure NetApp Files|Storage Spaces Direct|
|--|--|--|--|
-|Use case|General purpose|Ultra performance or migration from NetApp on-premises|Cross-platform|
+|Use case|General purpose|General purpose to enterprise scale|Cross-platform|
|Platform service|Yes, Azure-native solution|Yes, Azure-native solution|No, self-managed|
-|Regional availability|All regions|[Select regions](https://azure.microsoft.com/global-infrastructure/services/?products=netapp&regions=all)|All regions|
+|Regional availability|All regions|[Select regions](https://azure.microsoft.com/explore/global-infrastructure/products-by-region/?products=netapp&regions=all&rar=true)|All regions|
|Redundancy|Locally redundant/zone-redundant/geo-redundant/geo-zone-redundant|Locally redundant/geo-redundant [with cross-region replication](../azure-netapp-files/cross-region-replication-introduction.md)|Locally redundant/zone-redundant/geo-redundant|
-|Tiers and performance| Standard (Transaction optimized)<br>Premium<br>Up to max 100K IOPS per share with 10 GBps per share at about 3 ms latency|Standard<br>Premium<br>Ultra<br>Up to 4.5GBps per volume at about 1 ms latency. For IOPS and performance details, see [Azure NetApp Files performance considerations](../azure-netapp-files/azure-netapp-files-performance-considerations.md) and [the FAQ](../azure-netapp-files/faq-performance.md#how-do-i-convert-throughput-based-service-levels-of-azure-netapp-files-to-iops).|Standard HDD: up to 500 IOPS per-disk limits<br>Standard SSD: up to 4k IOPS per-disk limits<br>Premium SSD: up to 20k IOPS per-disk limits<br>We recommend Premium disks for Storage Spaces Direct|
-|Capacity|100 TiB per share, Up to 5 PiB per general purpose account |100 TiB per volume, up to 12.5 PiB per subscription|Maximum 32 TiB per disk|
+|Tiers and performance| Standard (Transaction optimized)<br>Premium<br>Up to max 100K IOPS per share with 10 GBps per share at about 3-ms latency|Standard<br>Premium<br>Ultra<br>Up to max 460K IOPS per volume with 4.5 GBps per volume at about 1 ms latency. For IOPS and performance details, see [Azure NetApp Files performance considerations](../azure-netapp-files/azure-netapp-files-performance-considerations.md) and [the FAQ](../azure-netapp-files/faq-performance.md#how-do-i-convert-throughput-based-service-levels-of-azure-netapp-files-to-iops).|Standard HDD: up to 500 IOPS per-disk limits<br>Standard SSD: up to 4k IOPS per-disk limits<br>Premium SSD: up to 20k IOPS per-disk limits<br>We recommend Premium disks for Storage Spaces Direct|
+|Capacity|100 TiB per share, Up to 5 PiB per general purpose account |100 TiB per volume, up to 12.5 PiB per NetApp account|Maximum 32 TiB per disk|
|Required infrastructure|Minimum share size 1 GiB|Minimum capacity pool 4 TiB, min volume size 100 GiB|Two VMs on Azure IaaS (+ Cloud Witness) or at least three VMs without and costs for disks|
-|Protocols|SMB 3.0/2.1, NFSv4.1 (preview), REST|NFSv3, NFSv4.1 (preview), SMB 3.x/2.x|NFSv3, NFSv4.1, SMB 3.1|
+|Protocols|SMB 3.0/2.1, NFSv4.1 (preview), REST|[NFSv3, NFSv4.1](../azure-netapp-files/azure-netapp-files-create-volumes.md), [SMB 3.x/2.x](../azure-netapp-files/azure-netapp-files-create-volumes-smb.md), [dual-protocol](../azure-netapp-files/create-volumes-dual-protocol.md)|NFSv3, NFSv4.1, SMB 3.1|
## Azure management details

|Features|Azure Files|Azure NetApp Files|Storage Spaces Direct|
|--|--|--|--|
-|Access|Cloud, on-premises and hybrid (Azure file sync)|Cloud, on-premises (via ExpressRoute)|Cloud, on-premises|
-|Backup|Azure backup snapshot integration|Azure NetApp Files snapshots|Azure backup snapshot integration|
-|Security and compliance|[All Azure supported certificates](https://www.microsoft.com/trustcenter/compliance/complianceofferings)|ISO completed|[All Azure supported certificates](https://www.microsoft.com/trustcenter/compliance/complianceofferings)|
+|Access|Cloud, on-premises and hybrid (Azure file sync)|Cloud, on-premises|Cloud, on-premises|
+|Backup|Azure backup snapshot integration|Azure NetApp Files snapshots<br>Azure NetApp Files backup|Azure backup snapshot integration|
+|Security and compliance|[All Azure supported certificates](https://www.microsoft.com/trustcenter/compliance/complianceofferings)|[Azure supported certificates](https://www.microsoft.com/trustcenter/compliance/complianceofferings)|[All Azure supported certificates](https://www.microsoft.com/trustcenter/compliance/complianceofferings)|
|Azure Active Directory integration|[Native Active Directory and Azure Active Directory Domain Services](../storage/files/storage-files-active-directory-overview.md)|[Azure Active Directory Domain Services and Native Active Directory](../azure-netapp-files/faq-smb.md#does-azure-netapp-files-support-azure-active-directory)|Native Active Directory or Azure Active Directory Domain Services support only|

Once you've chosen your storage method, check out [Azure Virtual Desktop pricing](https://azure.microsoft.com/pricing/details/virtual-desktop/) for information about our pricing plans.
For more information about Azure Files performance, see [File share and file sca
## Azure NetApp Files tiers
-Azure NetApp Files volumes are organized in capacity pools. Volume performance is defined by the service level of the hosting capacity pool. Three performance levels are offered, ultra, premium and standard. For more information, see [Storage hierarchy of Azure NetApp Files](../azure-netapp-files/azure-netapp-files-understand-storage-hierarchy.md).
+Azure NetApp Files volumes are organized in capacity pools. Volume performance is defined by the service level of the hosting capacity pool. Three performance levels are offered: Ultra, Premium, and Standard. For more information, see [Storage hierarchy of Azure NetApp Files](../azure-netapp-files/azure-netapp-files-understand-storage-hierarchy.md). Azure NetApp Files performance is [a function of tier times capacity](../azure-netapp-files/azure-netapp-files-performance-considerations.md). More provisioned capacity leads to a higher performance budget, which often allows a lower tier and therefore a lower TCO.
+
+The following table lists our recommendations for which performance tier to use based on workload defaults.
+
+| Workload | Example Users | Azure NetApp Files |
+|--|--|--|
+| Light | Users doing basic data entry tasks | Standard tier |
+| Medium | Consultants and market researchers | Premium tier: small-medium user count<br>Standard tier: large user count |
+| Heavy | Software engineers, content creators | Premium tier: small-medium user count<br>Standard tier: large user count |
+| Power | Graphic designers, 3D model makers, machine learning researchers | Ultra tier: small user count<br>Premium tier: medium user count<br>Standard tier: large user count |
+
+In order to provision the optimal tier and volume size, consider using [this calculator](https://github.com/ANFTechTeam/Fslogix-Calculator) for guidance.
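As a rough sketch of the tier-times-capacity relationship, a volume's throughput budget can be estimated as the service level's per-TiB rate multiplied by the provisioned capacity. The per-TiB rates below are assumptions to verify against the current Azure NetApp Files service-level documentation:

```python
# Assumed Azure NetApp Files service-level rates, in MiB/s per provisioned TiB.
# Verify these against current Azure documentation before sizing a deployment.
RATES_MIB_PER_TIB = {"Standard": 16, "Premium": 64, "Ultra": 128}

def volume_throughput_mib(tier: str, capacity_tib: float) -> float:
    """Estimated throughput budget in MiB/s for a volume of the given tier and size."""
    return RATES_MIB_PER_TIB[tier] * capacity_tib

# Over-provisioning a lower tier can match a smaller, higher-tier volume:
print(volume_throughput_mib("Premium", 4))   # 256.0
print(volume_throughput_mib("Standard", 8))  # 128.0
```

This is why provisioning more capacity at a lower tier can sometimes meet the same performance target at a lower cost.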
## Next steps
To learn more about FSLogix profile containers, user profile disks, and other us
If you're ready to create your own FSLogix profile containers, get started with one of these tutorials:

- [Set up FSLogix Profile Container with Azure Files and Active Directory](fslogix-profile-container-configure-azure-files-active-directory.md)
-- [Set up FSLogix Profile Container with Azure NetApp Files](create-fslogix-profile-container.md)
+- [Set up FSLogix Profile Container with Azure NetApp Files](create-fslogix-profile-container.md)
virtual-desktop Create Service Principal Role Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/virtual-desktop-fall-2019/create-service-principal-role-powershell.md
In this tutorial, learn how to:
## Prerequisites
-Before you can create service principals and role assignments, you need to do three things:
+Before you can create service principals and role assignments, you need to do the following:
-1. Install the AzureAD module. To install the module, run PowerShell as an administrator and run the following cmdlet:
-
- ```powershell
- Install-Module AzureAD
- ```
+1. Follow the steps to [Install the Azure Az PowerShell module](/powershell/azure/install-az-ps).
2. [Download and import the Azure Virtual Desktop PowerShell module](/powershell/windows-virtual-desktop/overview/).
-3. Follow all instructions in this article in the same PowerShell session. The process might not work if you interrupt your PowerShell session by closing the window and reopening it later.
+> [!IMPORTANT]
+> Follow all instructions in this article in the same PowerShell session. The process might not work if you interrupt your PowerShell session by closing the window and reopening it later.
## Create a service principal in Azure Active Directory

After you've fulfilled the prerequisites in your PowerShell session, run the following PowerShell cmdlets to create a multitenant service principal in Azure.

```powershell
-Import-Module AzureAD
-$aadContext = Connect-AzureAD
-$svcPrincipal = New-AzureADApplication -AvailableToOtherTenants $true -DisplayName "Azure Virtual Desktop Svc Principal"
-$svcPrincipalCreds = New-AzureADApplicationPasswordCredential -ObjectId $svcPrincipal.ObjectId
+Import-Module Az.Resources
+Connect-AzAccount
+$aadContext = Get-AzContext
+$svcPrincipal = New-AzADApplication -AvailableToOtherTenants $true -DisplayName "Azure Virtual Desktop Svc Principal"
+$svcPrincipalCreds = New-AzADAppCredential -ObjectId $svcPrincipal.Id
```

## View your credentials in PowerShell

Before you create the role assignment for your service principal, view your credentials and write them down for future reference. The password is especially important because you won't be able to retrieve it after you close this PowerShell session.
-Here are the three credentials you should write down and the cmdlets you need to run to get them:
+Here are the three values you should write down and the cmdlets you need to run to get them:
- Password:

```powershell
- $svcPrincipalCreds.Value
+ $svcPrincipalCreds.SecretText
```

- Tenant ID:

```powershell
- $aadContext.TenantId.Guid
+ $aadContext.Tenant.Id
```

- Application ID:

After you create a role assignment for the service principal, make sure the serv

```powershell
$creds = New-Object System.Management.Automation.PSCredential($svcPrincipal.AppId, (ConvertTo-SecureString $svcPrincipalCreds.SecretText -AsPlainText -Force))
-Add-RdsAccount -DeploymentUrl "https://rdbroker.wvd.microsoft.com" -Credential $creds -ServicePrincipal -AadTenantId $aadContext.TenantId.Guid
+Add-RdsAccount -DeploymentUrl "https://rdbroker.wvd.microsoft.com" -Credential $creds -ServicePrincipal -AadTenantId $aadContext.Tenant.Id
```

After you've signed in, make sure everything works by testing a few Azure Virtual Desktop PowerShell cmdlets with the service principal.
virtual-desktop What Is App Attach https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-desktop/what-is-app-attach.md
MSIX is a new packaging format that offers many features aimed to improve packaging experience for all Windows apps. To learn more about MSIX, see the [MSIX overview](/windows/msix/overview).
-MSIX app attach is a way to deliver MSIX applications to both physical and virtual machines. However, MSIX app attach is different from regular MSIX because it's made especially for Azure Virtual Desktop. This article will describe what MSIX app attach is and what it can do for you.
+MSIX app attach is a way to deliver MSIX applications to both physical and virtual machines. However, MSIX app attach is different from regular MSIX because it's made especially for supported products, such as Azure Virtual Desktop. This article will describe what MSIX app attach is and what it can do for you.
## Application delivery options in Azure Virtual Desktop

You can deliver apps in Azure Virtual Desktop through one of the following methods:

- Put apps in a master image
-- Use tools like SCCM or Intune for central management
-- Dynamic app provisioning (AppV, VMware AppVolumes, or Citrix AppLayering)
+- Use tools like Microsoft Endpoint Configuration Manager or Intune for central management
+- Dynamic app provisioning (App-V, VMware AppVolumes, or Citrix AppLayering)
- Create custom tools or scripts using Microsoft and a third-party tool

## What does MSIX app attach do?
virtual-machines Image Builder Json https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/image-builder-json.md
properties: {
If the `stagingResourceGroup` property is specified with a resource group that does exist, then the Image Builder service will check to make sure the resource group isn't associated with another image template, is empty (no resources inside), in the same region as the image template, and has either "Contributor" or "Owner" RBAC applied to the identity assigned to the Azure Image Builder image template resource. If any of the aforementioned requirements aren't met, an error will be thrown. The staging resource group will have the following tags added to it: `usedBy`, `imageTemplateName`, `imageTemplateResourceGroupName`. Pre-existing tags aren't deleted.

> [!IMPORTANT]
-> You will need to assign the contributor role to the resource group for the service principal corresponding to Azure Image Builder's first party app when trying to specify a pre-existing resource group and VNet to the Azure Image Builder service with a Windows source image. For the CLI command and portal instructions on how to assign the contributor role to the resource group see the following documentation [Troubleshoot VM Azure Image Builder: Authorization error creating disk](https://learn.microsoft.com/us/azure/virtual-machines/linux/image-builder-troubleshoot#authorization-error-creating-disk)
+> You will need to assign the contributor role to the resource group for the service principal corresponding to Azure Image Builder's first party app when trying to specify a pre-existing resource group and VNet to the Azure Image Builder service with a Windows source image. For the CLI command and portal instructions on how to assign the contributor role to the resource group, see [Troubleshoot VM Azure Image Builder: Authorization error creating disk](/azure/virtual-machines/linux/image-builder-troubleshoot#authorization-error-creating-disk).
- **The stagingResourceGroup property is specified with a resource group that doesn't exist**
virtual-machines Automation Bom Get Files https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-bom-get-files.md
Title: Get SAP media for Bill of Materials
-description: How to download SAP media to use in your Bill of Materials (BOM) for the SAP deployment automation framework on Azure.
+description: How to download SAP media to use in your Bill of Materials (BOM) for the SAP on Azure Deployment Automation Framework.
# Acquire media for BOM creation
-The [SAP deployment automation framework on Azure](automation-deployment-framework.md) uses a Bill of Materials (BOM). To create your BOM, you have to locate and download relevant SAP installation media. Then, you need to upload these media files to your Azure storage account.
+The [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) uses a Bill of Materials (BOM). To create your BOM, you have to locate and download relevant SAP installation media. Then, you need to upload these media files to your Azure storage account.
> [!NOTE] > This guide covers advanced deployment topics. For a basic explanation of how to deploy the automation framework, see the [get started guide](automation-get-started.md) instead.
virtual-machines Automation Bom Prepare https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-bom-prepare.md
Title: Prepare Bill of Materials for automation
-description: How to prepare a full SAP Bill of Materials (BOM) for use with the SAP deployment automation framework on Azure.
+description: How to prepare a full SAP Bill of Materials (BOM) for use with the SAP on Azure Deployment Automation Framework.
# Prepare SAP BOM
-The [SAP deployment automation framework on Azure](automation-deployment-framework.md) uses a Bill of Materials (BOM). The BOM helps configure your SAP systems.
+The [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) uses a Bill of Materials (BOM). The BOM helps configure your SAP systems.
The automation framework's GitHub repository contains a set of [Sample BOMs](https://github.com/Azure/sap-automation/tree/main/deploy/ansible/BOM-catalog) that you can use to get started. It is also possible to create BOMs for other SAP Applications and databases.
virtual-machines Automation Bom Templates Db https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-bom-templates-db.md
Title: Generate Application Installation templates
-description: How to generate SAP Application templates for use with the SAP deployment automation framework on Azure.
+description: How to generate SAP Application templates for use with the SAP on Azure Deployment Automation Framework.
# Generate SAP Application templates for automation
-The [SAP deployment automation framework on Azure](automation-deployment-framework.md) uses a Bill of Materials (BOM) to define the SAP Application. Before you can deploy a system using a custom BOM, you need to also create the templates for the ini-files used in the unattended SAP installation. This guide covers how to create the application templates for an SAP/S4 deployment. The process is the same for the other SAP applications.
+The [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) uses a Bill of Materials (BOM) to define the SAP Application. Before you can deploy a system using a custom BOM, you need to also create the templates for the ini-files used in the unattended SAP installation. This guide covers how to create the application templates for an SAP/S4 deployment. The process is the same for the other SAP applications.
## Prerequisites
virtual-machines Automation Configure Control Plane https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-configure-control-plane.md
Title: Configure control plane for automation framework
-description: Configure your deployment control plane for the SAP deployment automation framework on Azure.
+description: Configure your deployment control plane for the SAP on Azure Deployment Automation Framework.
# Configure the control plane
-The control plane for the [SAP deployment automation framework on Azure](automation-deployment-framework.md) consists of the following components:
+The control plane for the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) consists of the following components:
- Deployer - SAP library
virtual-machines Automation Configure Devops https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-configure-devops.md
Title: Configure Azure DevOps Services for SAP deployment automation framework
-description: Configure your Azure DevOps Services for the SAP deployment automation framework on Azure.
+ Title: Configure Azure DevOps Services for SAP on Azure Deployment Automation Framework
+description: Configure your Azure DevOps Services for the SAP on Azure Deployment Automation Framework.
-# Use SAP deployment automation framework from Azure DevOps Services
+# Use SAP on Azure Deployment Automation Framework from Azure DevOps Services
Using Azure DevOps will streamline the deployment process by providing pipelines that can be executed to perform both the infrastructure deployment and the configuration and SAP installation activities. You can use Azure Repos to store your configuration files and Azure Pipelines to deploy and configure the infrastructure and the SAP application.
You can use Azure Repos to store your configuration files and Azure Pipelines to
To use Azure DevOps Services, you'll need an Azure DevOps organization. An organization is used to connect groups of related projects. Use your work or school account to automatically connect your organization to your Azure Active Directory (Azure AD). To create an account, open [Azure DevOps](https://azure.microsoft.com/services/devops/) and either _sign in_ or create a new account.
-## Configure Azure DevOps Services for the SAP deployment automation framework
+## Configure Azure DevOps Services for the SAP on Azure Deployment Automation Framework
-You can use the following script to do a basic installation of Azure Devops Services for the SAP deployment automation framework.
+You can use the following script to do a basic installation of Azure DevOps Services for the SAP on Azure Deployment Automation Framework.
Log in to Azure Cloud Shell ```bash
Invoke-WebRequest -Uri https://raw.githubusercontent.com/Azure/sap-automation/ma
You can run the 'Create Sample Deployer Configuration' pipeline to create a sample configuration for the Control Plane. When running choose the appropriate Azure region.
-## Manual configuration of Azure DevOps Services for the SAP deployment automation framework
+## Manual configuration of Azure DevOps Services for the SAP on Azure Deployment Automation Framework
### Create a new project
Open (https://dev.azure.com) and create a new project by clicking on the _New Pr
Record the URL of the project.

### Import the repository
-Start by importing the SAP deployment automation framework GitHub repository into Azure Repos.
+Start by importing the SAP on Azure Deployment Automation Framework GitHub repository into Azure Repos.
Navigate to the Repositories section and choose Import a repository, then import the 'https://github.com/Azure/sap-automation.git' repository into Azure DevOps. For more info, see [Import a repository](/azure/devops/repos/git/import-git-repository?view=azure-devops&preserve-view=true).
-If you're unable to import a repository, you can create the 'sap-automation' repository, and manually import the content from the SAP deployment automation framework GitHub repository to it.
+If you're unable to import a repository, you can create the 'sap-automation' repository, and manually import the content from the SAP on Azure Deployment Automation Framework GitHub repository to it.
### Create the repository for manual import
Clone the repository to a local folder by clicking the _Clone_ button in the Fi
### Manually importing the repository content using a local clone
-You can also download the content from the SAP deployment automation framework repository manually and add it to your local clone of the Azure DevOps repository.
+You can also download the content from the SAP on Azure Deployment Automation Framework repository manually and add it to your local clone of the Azure DevOps repository.
Navigate to the 'https://github.com/Azure/SAP-automation' repository and download the repository content as a ZIP file by clicking the _Code_ button and choosing _Download ZIP_.
Select the source control icon and provide a message about the change, for examp
### Create configuration root folder

> [!IMPORTANT]
- > In order to ensure that your configuration files are not overwritten by changes in the SAP deployment automation framework, store them in a separate folder hierarchy.
+ > In order to ensure that your configuration files are not overwritten by changes in the SAP on Azure Deployment Automation Framework, store them in a separate folder hierarchy.
-Create a top level folder called 'WORKSPACES', this folder will be the root folder for all the SAP deployment configuration files. Create the following folders in the 'WORKSPACES' folder: 'DEPLOYER', 'LIBRARY', 'LANDSCAPE' and 'SYSTEM'. These will contain the configuration files for the different components of the SAP deployment automation framework.
+Create a top-level folder called 'WORKSPACES'; this folder will be the root folder for all the SAP deployment configuration files. Create the following folders in the 'WORKSPACES' folder: 'DEPLOYER', 'LIBRARY', 'LANDSCAPE' and 'SYSTEM'. These will contain the configuration files for the different components of the SAP on Azure Deployment Automation Framework.
Optionally, you may copy the sample configuration files from the 'samples/WORKSPACES' folders to the 'WORKSPACES' folder you created; this will allow you to experiment with sample deployments.
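The folder layout described above can be sketched with a few shell commands (the folder names come from the text; run these from the root of your configuration repository):

```shell
# Create the configuration root folder and the four component folders
mkdir -p WORKSPACES/DEPLOYER WORKSPACES/LIBRARY WORKSPACES/LANDSCAPE WORKSPACES/SYSTEM

# Optionally, copy the sample configurations to experiment with:
# cp -R samples/WORKSPACES/. WORKSPACES/
```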
virtual-machines Automation Configure Extra Disks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-configure-extra-disks.md
Title: Custom disk configurations
-description: Provide custom disk configurations for your system in the SAP deployment automation framework on Azure. Add extra disks to a new system, or an existing system.
+description: Provide custom disk configurations for your system in the SAP on Azure Deployment Automation Framework. Add extra disks to a new system, or an existing system.
# Change the disk configuration for the SAP deployment automation
-By default, the [SAP deployment automation framework on Azure](automation-deployment-framework.md) defines the disk configuration for the SAP systems. As needed, you can change the default configuration by providing a custom disk configuration json file.
+By default, the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) defines the disk configuration for the SAP systems. As needed, you can change the default configuration by providing a custom disk configuration json file.
> [!TIP]
> When possible, it's a best practice to increase the disk size instead of adding more disks.
virtual-machines Automation Configure Sap Parameters https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-configure-sap-parameters.md
disks:
From the v3.4 release, it is possible to deploy SAP on Azure systems in a Shared Home configuration using an Oracle database backend. For more information on running SAP on Oracle in Azure, see [Azure Virtual Machines Oracle DBMS deployment for SAP workload](dbms_guide_oracle.md).
-In order to install the Oracle backend using the SAP deployment automation framework, you need to provide the following parameters
+In order to install the Oracle backend using the SAP on Azure Deployment Automation Framework, you need to provide the following parameters
> [!div class="mx-tdCol2BreakAll "]
> | Parameter | Description | Type |
virtual-machines Automation Configure System https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-configure-system.md
Title: Configure SAP system parameters for automation
-description: Define the SAP system properties for the SAP deployment automation framework on Azure using a parameters file.
+description: Define the SAP system properties for the SAP on Azure Deployment Automation Framework using a parameters file.
# Configure SAP system parameters
-Configuration for the [SAP deployment automation framework on Azure](automation-deployment-framework.md)] happens through parameters files. You provide information about your SAP system properties in a tfvars file, which the automation framework uses for deployment. You can find examples of the variable file in the 'samples/WORKSPACES/SYSTEM' folder.
+Configuration for the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) happens through parameters files. You provide information about your SAP system properties in a tfvars file, which the automation framework uses for deployment. You can find examples of the variable file in the 'samples/WORKSPACES/SYSTEM' folder.
The automation supports both creating resources (green field deployment) or using existing resources (brownfield deployment).
The table below contains the parameters that define the resource group.
## SAP Virtual Hostname parameters
-In the SAP deployment automation framework, the SAP virtual hostname is defined by specifying the `use_secondary_ips` parameter.
+In the SAP on Azure Deployment Automation Framework, the SAP virtual hostname is defined by specifying the `use_secondary_ips` parameter.
> [!div class="mx-tdCol2BreakAll "]
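As a sketch, a system tfvars fragment enabling SAP virtual hostnames might look like the following; only the `use_secondary_ips` parameter name comes from the text above, and the value shown is purely illustrative:

```terraform
# Illustrative fragment only: enable SAP virtual hostnames by
# assigning secondary IPs to the virtual machines
use_secondary_ips = true
```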
The table below defines the parameters used for defining the Key Vault informati
### Anchor virtual machine parameters
-The SAP deployment automation framework supports having an Anchor virtual machine. The anchor virtual machine will be the first virtual machine to be deployed and is used to anchor the proximity placement group.
+The SAP on Azure Deployment Automation Framework supports having an Anchor virtual machine. The anchor virtual machine will be the first virtual machine to be deployed and is used to anchor the proximity placement group.
The table below contains the parameters related to the anchor virtual machine.
virtual-machines Automation Configure Webapp https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-configure-webapp.md
- Title: Configure a Deployer Web Application for SAP deployment automation framework
+ Title: Configure a Deployer Web Application for SAP on Azure Deployment Automation Framework
description: Configure a web app as a part of the control plane to help creating and deploying SAP workload zones and systems on Azure.
rm ./manifest.json
## Deploy via Azure Pipelines
-For full instructions on setting up the web app using Azure DevOps, see [Use SAP deployment automation framework from Azure DevOps Services](automation-configure-devops.md)
+For full instructions on setting up the web app using Azure DevOps, see [Use SAP on Azure Deployment Automation Framework from Azure DevOps Services](automation-configure-devops.md)
### Summary of steps required to set up the web app before deploying the control plane

1. Add the web app deployment pipeline (deploy/pipelines/21-deploy-web-app.yaml).
virtual-machines Automation Configure Workload Zone https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-configure-workload-zone.md
Title: About workload zone configuration in automation framework
-description: Overview of the SAP workload zone configuration process within the SAP deployment automation framework on Azure.
+description: Overview of the SAP workload zone configuration process within the SAP on Azure Deployment Automation Framework.
# Workload zone configuration in SAP automation framework
-An [SAP application](automation-deployment-framework.md#sap-concepts) typically has multiple development tiers. For example, you might have development, quality assurance, and production tiers. The [SAP deployment automation framework on Azure](automation-deployment-framework.md) refers to these tiers as [workload zones](automation-deployment-framework.md#deployment-components). See the following diagram for an example of a workload zone with two SAP systems.
+An [SAP application](automation-deployment-framework.md#sap-concepts) typically has multiple development tiers. For example, you might have development, quality assurance, and production tiers. The [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) refers to these tiers as [workload zones](automation-deployment-framework.md#deployment-components). See the following diagram for an example of a workload zone with two SAP systems.
:::image type="content" source="./media/automation-deployment-framework/workload-zone-architecture.png" alt-text="Diagram of SAP workflow zones and systems.":::
virtual-machines Automation Deploy Control Plane https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-deploy-control-plane.md
- Title: About Control Plane deployment for the SAP Deployment automation framework
-description: Overview of the Control Plan deployment process within the SAP deployment automation framework on Azure.
+ Title: About Control Plane deployment for the SAP on Azure Deployment Automation Framework
+description: Overview of the Control Plan deployment process within the SAP on Azure Deployment Automation Framework.
# Deploy the control plane
-The control plane deployment for the [SAP deployment automation framework on Azure](automation-deployment-framework.md) consists of the following components:
+The control plane deployment for the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) consists of the following components:
- Deployer
- SAP library
virtual-machines Automation Deploy System https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-deploy-system.md
Title: About SAP system deployment for the automation framework
-description: Overview of the SAP system deployment process within the SAP deployment automation framework on Azure.
+description: Overview of the SAP system deployment process within the SAP on Azure Deployment Automation Framework.
# SAP system deployment for the automation framework
-The creation of the [SAP system](automation-deployment-framework.md#sap-concepts) is part of the [SAP deployment automation framework on Azure](automation-deployment-framework.md) process. The SAP system creates your virtual machines (VMs), and supporting components for your [SAP application](automation-deployment-framework.md#sap-concepts).
+The creation of the [SAP system](automation-deployment-framework.md#sap-concepts) is part of the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) process. The SAP system creates your virtual machines (VMs), and supporting components for your [SAP application](automation-deployment-framework.md#sap-concepts).
The SAP system deploys:
virtual-machines Automation Deploy Workload Zone https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-deploy-workload-zone.md
Title: About workload zone deployment in automation framework
-description: Overview of the SAP workload zone deployment process within the SAP deployment automation framework on Azure.
+description: Overview of the SAP workload zone deployment process within the SAP on Azure Deployment Automation Framework.
# Workload zone deployment in SAP automation framework
-An [SAP application](automation-deployment-framework.md#sap-concepts) typically has multiple development tiers. For example, you might have development, quality assurance, and production tiers. The [SAP deployment automation framework on Azure](automation-deployment-framework.md) refers to these tiers as [workload zones](automation-deployment-framework.md#deployment-components).
+An [SAP application](automation-deployment-framework.md#sap-concepts) typically has multiple development tiers. For example, you might have development, quality assurance, and production tiers. The [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) refers to these tiers as [workload zones](automation-deployment-framework.md#deployment-components).
You can use workload zones in multiple Azure regions. Each workload zone then has its own Azure Virtual Network (Azure VNet)
virtual-machines Automation Deployment Framework https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-deployment-framework.md
- Title: About SAP deployment automation framework on Azure
-description: Overview of the framework and tooling for the SAP deployment automation framework on Azure.
+ Title: About SAP on Azure Deployment Automation Framework
+description: Overview of the framework and tooling for the SAP on Azure Deployment Automation Framework.
Last updated 05/29/2022
-# SAP deployment automation framework on Azure
+# SAP on Azure Deployment Automation Framework
-The [SAP deployment automation framework on Azure](https://github.com/Azure/sap-automation) is an open-source orchestration tool for deploying, installing and maintaining SAP environments. You can create infrastructure for SAP landscapes based on SAP HANA and NetWeaver with AnyDB. The framework uses [Terraform](https://www.terraform.io/) for infrastructure deployment, and [Ansible](https://www.ansible.com/) for the operating system and application configuration. The systems can be deployed on any of the SAP-supported operating system versions and deployed into any Azure region.
+The [SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation) is an open-source orchestration tool for deploying, installing and maintaining SAP environments. You can create infrastructure for SAP landscapes based on SAP HANA and NetWeaver with AnyDB. The framework uses [Terraform](https://www.terraform.io/) for infrastructure deployment, and [Ansible](https://www.ansible.com/) for the operating system and application configuration. The systems can be deployed on any of the SAP-supported operating system versions and deployed into any Azure region.
Hashicorp [Terraform](https://www.terraform.io/) is an open-source tool for provisioning and managing cloud infrastructure.
The [automation framework](https://github.com/Azure/sap-automation) has two main
- Deployment infrastructure (control plane)
- SAP Infrastructure (SAP Workload)
-You'll use the control plane of the SAP deployment automation framework to deploy the SAP Infrastructure and the SAP application infrastructure. The deployment uses Terraform templates to create the [infrastructure as a service (IaaS)](https://azure.microsoft.com/overview/what-is-iaas) defined infrastructure to host the SAP Applications.
+You'll use the control plane of the SAP on Azure Deployment Automation Framework to deploy the SAP Infrastructure and the SAP application infrastructure. The deployment uses Terraform templates to create the [infrastructure as a service (IaaS)](https://azure.microsoft.com/overview/what-is-iaas) defined infrastructure to host the SAP Applications.
> [!NOTE]
> This automation framework is based on Microsoft best practices and principles for SAP on Azure. Review the [get-started guide for SAP on Azure virtual machines (Azure VMs)](get-started.md) to understand how to use certified virtual machines and storage solutions for stability, reliability, and performance.
The Distributed (Highly Available) deployment is similar to the Distributed arch
The dependency between the control plane and the application plane is illustrated in the diagram below. In a typical deployment, a single control plane is used to manage multiple SAP deployments.

## About the control plane
The key components of the control plane are:
The following diagram shows the key components of the control plane and workload zone. The application configuration will be performed from the Ansible Controller in the Control plane using a set of pre-defined playbooks. These playbooks will:
virtual-machines Automation Devops Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-devops-tutorial.md
- Title: SAP deployment automation framework DevOps hands-on lab
-description: DevOps Hands-on lab for the SAP deployment automation framework on Azure.
+ Title: SAP on Azure Deployment Automation Framework DevOps hands-on lab
+description: DevOps Hands-on lab for the SAP on Azure Deployment Automation Framework.
-# SAP deployment automation framework DevOps - Hands-on lab
+# SAP on Azure Deployment Automation Framework DevOps - Hands-on lab
-This tutorial shows how to perform the deployment activities of the [SAP deployment automation framework on Azure](automation-deployment-framework.md) using Azure DevOps Services.
+This tutorial shows how to perform the deployment activities of the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) using Azure DevOps Services.
You'll perform the following tasks during this lab:
virtual-machines Automation Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-get-started.md
Title: Get started with the SAP on Azure deployment automation framework
-description: Quickly get started with the SAP deployment automation framework on Azure. Deploy an example configuration using sample parameter files.
+description: Quickly get started with the SAP on Azure Deployment Automation Framework. Deploy an example configuration using sample parameter files.
# Get started with SAP automation framework on Azure
-Get started quickly with the [SAP deployment automation framework on Azure](automation-deployment-framework.md).
+Get started quickly with the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md).
## Prerequisites
Import-Module C:\Azure_SAP_Automated_Deployment\sap-automation\deploy\scripts\pw
> [!TIP]
-> The deployer already clones [SAP deployment automation framework repository](https://github.com/Azure/sap-automation).
+> The deployer already clones the [SAP on Azure Deployment Automation Framework repository](https://github.com/Azure/sap-automation).
## Copy the samples
virtual-machines Automation Manual Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-manual-deployment.md
Title: Get started with manual deployment of automation framework
-description: Manually deploy the SAP deployment automation framework on Azure using an example configuration and sample parameter files.
+description: Manually deploy the SAP on Azure Deployment Automation Framework using an example configuration and sample parameter files.
# Get started with manual deployment
-Along with [automated deployment](automation-get-started.md), you can also do manual deployment of the [SAP deployment automation framework on Azure](automation-deployment-framework.md). Use this example configuration and sample parameter files to get started.
+Along with [automated deployment](automation-get-started.md), you can also do manual deployment of the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md). Use this example configuration and sample parameter files to get started.
> [!TIP]
> This guide covers only how to perform a **manual** deployment. If you want to get started quickly, see the [**automated** deployment guide](automation-get-started.md) instead.
virtual-machines Automation Naming Module https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-naming-module.md
Title: Configure custom naming for the automation framework
-description: Explanation of how to implement custom naming conventions for the SAP deployment automation framework on Azure.
+description: Explanation of how to implement custom naming conventions for the SAP on Azure Deployment Automation Framework.
# Overview
-The [SAP deployment automation framework on Azure](automation-deployment-framework.md) uses a standard naming convention for Azure [resource naming](automation-naming.md).
+The [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) uses a standard naming convention for Azure [resource naming](automation-naming.md).
The Terraform module `sap_namegenerator` defines the names of all resources that the automation framework deploys. The module is located at `/deploy/terraform/terraform-units/modules/sap_namegenerator/` in the repository. The framework also supports providing your own names for some of the resources using the [parameter files](automation-configure-system.md).
The names for the key vaults are defined in the "keyvault_names" structure. The
```

> [!NOTE]
-> This key vault names need to be unique across Azure, SAP deployment automation framework appends 3 random characters (ABC in the example) at the end of the key vault name to reduce the likelihood for name conflicts.
+> These key vault names need to be unique across Azure. The SAP on Azure Deployment Automation Framework appends 3 random characters (ABC in the example) at the end of the key vault name to reduce the likelihood of name conflicts.
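The suffixing behavior can be illustrated with a small shell sketch; the base vault name below is hypothetical, and only the 3-character random suffix comes from the note above:

```shell
# Hypothetical key vault base name; the framework appends 3 random
# characters to reduce the chance of a global name collision
base="DEVWEEUSAPvault"
suffix=$(LC_ALL=C tr -dc 'a-z0-9' < /dev/urandom | head -c 3)
vault_name="${base}${suffix}"
echo "$vault_name"
```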
The "private_access" names are currently not used.
The names for the storage accounts are defined in the "storageaccount_names" str
```

> [!NOTE]
-> This key vault names need to be unique across Azure, SAP deployment automation framework appends 3 random characters (abc in the example) at the end of the key vault name to reduce the likelihood for name conflicts.
+> These storage account names need to be unique across Azure. The SAP on Azure Deployment Automation Framework appends 3 random characters (abc in the example) at the end of the storage account name to reduce the likelihood of name conflicts.
### Virtual Machine names
virtual-machines Automation Naming https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-naming.md
Title: Naming standards for the automation framework
-description: Explanation of naming conventions for the SAP deployment automation framework on Azure.
+description: Explanation of naming conventions for the SAP on Azure Deployment Automation Framework.
# Naming conventions for SAP automation framework
-The [SAP deployment automation framework on Azure](automation-deployment-framework.md) uses standard naming conventions. Consistent naming helps the automation framework run correctly with Terraform. Standard naming helps you deploy the automation framework smoothly. For example, consistent naming helps you to:
+The [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) uses standard naming conventions. Consistent naming helps the automation framework run correctly with Terraform. Standard naming helps you deploy the automation framework smoothly. For example, consistent naming helps you to:
- Deploy the SAP virtual network infrastructure into any supported Azure region.
virtual-machines Automation New Vs Existing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-new-vs-existing.md
Title: Configuring the automation framework for new and existing deployments
-description: How to configure the SAP deployment automation framework on Azure for both new and existing scenarios.
+description: How to configure the SAP on Azure Deployment Automation Framework for both new and existing scenarios.
# Configuring for new and existing deployments
-You can use the [SAP deployment automation framework on Azure](automation-deployment-framework.md) in both new and existing deployment scenarios.
+You can use the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) in both new and existing deployment scenarios.
In new deployment scenarios, the automation framework doesn't use existing Azure infrastructure. The deployment process creates the virtual networks, subnets, key vaults, and more.
In this scenario, the automation framework creates all Azure components, and use
To test this scenario:
-Clone the [SAP deployment automation framework](https://github.com/Azure/sap-automation/) repository and copy the sample files to your root folder for parameter files:
+Clone the [SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation/) repository and copy the sample files to your root folder for parameter files:
```bash
cd ~/Azure_SAP_Automated_Deployment
```
virtual-machines Automation Plan Deployment https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-plan-deployment.md
Title: Plan your SAP deployment with the automation framework on Azure
-description: Prepare for using the SAP deployment automation framework on Azure. Steps include planning for credentials management, DevOps structure, and deployment scenarios.
+description: Prepare for using the SAP on Azure Deployment Automation Framework. Steps include planning for credentials management, DevOps structure, and deployment scenarios.
# Plan your deployment of SAP automation framework
-There are multiple considerations for planning an SAP deployment and running the [SAP deployment automation framework on Azure](automation-deployment-framework.md), this include topics like deployment credentials management, virtual network design.
+There are multiple considerations for planning an SAP deployment and running the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md). These include topics like deployment credentials management and virtual network design.
For generic SAP on Azure design considerations, see [Introduction to an SAP adoption scenario](/azure/cloud-adoption-framework/scenarios/sap).

> [!NOTE]
-> The Terraform deployment uses Terraform templates provided by Microsoft from the [SAP deployment automation framework repository](https://github.com/Azure/sap-automation/). The templates use parameter files with your system-specific information to perform the deployment.
+> The Terraform deployment uses Terraform templates provided by Microsoft from the [SAP on Azure Deployment Automation Framework repository](https://github.com/Azure/sap-automation/). The templates use parameter files with your system-specific information to perform the deployment.
## Credentials management
For more information, see [the Azure CLI documentation for creating a service pr
## DevOps structure
-The Terraform automation templates are in the [SAP deployment automation framework repository](https://github.com/Azure/sap-automation/). For most use cases, consider this repository as read-only and don't modify it.
+The Terraform automation templates are in the [SAP on Azure Deployment Automation Framework repository](https://github.com/Azure/sap-automation/). For most use cases, consider this repository as read-only and don't modify it.
-For your own parameter files, it's a best practice to keep these files in a source control repository that you manage. You can clone the [SAP deployment automation framework repository](https://github.com/Azure/sap-automation/) into your source control repository and then [create an appropriate folder structure](#folder-structure) in the repository.
+For your own parameter files, it's a best practice to keep these files in a source control repository that you manage. You can clone the [SAP on Azure Deployment Automation Framework repository](https://github.com/Azure/sap-automation/) into your source control repository and then [create an appropriate folder structure](#folder-structure) in the repository.
> [!IMPORTANT]
> Your parameter file's name becomes the name of the Terraform state file. Make sure to use a unique parameter file name for this reason.
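A quick shell sketch of why unique parameter file names matter: two systems sharing a file name would share (and clobber) one state file. The parameter file name is hypothetical, and the exact state-file suffix shown is an assumption for illustration, not taken from the docs:

```shell
# Hypothetical parameter file; the Terraform state file takes its
# name from it, so the parameter file name must be unique
param_file="DEV-WEEU-SAP01-X00.tfvars"
state_file="${param_file%.tfvars}.terraform.tfstate"
echo "$state_file"
```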
virtual-machines Automation Reference Bash https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-reference-bash.md
- Title: SAP deployment automation framework Bash reference | Microsoft Docs
-description: SAP deployment automation framework on Azure Bash reference
+ Title: SAP on Azure Deployment Automation Framework Bash reference | Microsoft Docs
+description: SAP on Azure Deployment Automation Framework Bash reference
Last updated 11/17/2021
-# Using SAP deployment automation framework shell scripts
+# Using SAP on Azure Deployment Automation Framework shell scripts
-You can deploy all [SAP deployment automation framework on Azure](automation-deployment-framework.md) components using shell scripts.
+You can deploy all [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) components using shell scripts.
## Control Plane operations
virtual-machines Automation Reference Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-reference-powershell.md
- Title: SAP deployment automation framework PowerShell reference | Microsoft Docs
-description: SAP deployment automation framework on Azure PowerShell reference
+ Title: SAP on Azure Deployment Automation Framework PowerShell reference | Microsoft Docs
+description: SAP on Azure Deployment Automation Framework PowerShell reference
Last updated 11/17/2021
-# Using PowerShell in SAP deployment automation framework
+# Using PowerShell in SAP on Azure Deployment Automation Framework
-You can deploy all [SAP deployment automation framework on Azure](automation-deployment-framework.md) components using Microsoft PowerShell.
+You can deploy all [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) components using Microsoft PowerShell.
## Control Plane operations
virtual-machines Automation Run Ansible https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-run-ansible.md
Title: Run Ansible to configure SAP system
-description: Configure the environment and install SAP using Ansible playbooks with the SAP deployment automation framework on Azure.
+description: Configure the environment and install SAP using Ansible playbooks with the SAP on Azure Deployment Automation Framework.
# Get started Ansible configuration
-When you use the [SAP deployment automation framework on Azure](automation-deployment-framework.md), you have the option to do an [automated infrastructure deployment](automation-get-started.md), However, you can also do the required operating system configurations and install SAP using Ansible playbooks provided in the repository. These playbooks are located in the automation framework repository in the `/sap-automation/deploy/ansible` folder.
+When you use the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md), you have the option to do an [automated infrastructure deployment](automation-get-started.md). However, you can also do the required operating system configurations and install SAP using Ansible playbooks provided in the repository. These playbooks are located in the automation framework repository in the `/sap-automation/deploy/ansible` folder.
| Filename | Description |
| -------- | ----------- |
virtual-machines Automation Software https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-software.md
Title: Download SAP software for automation framework
-description: Download the SAP software to your Azure environment using Ansible playbooks to use the SAP deployment automation framework on Azure.
+description: Download the SAP software to your Azure environment using Ansible playbooks to use the SAP on Azure Deployment Automation Framework.
# Download SAP software
-You need a copy of the SAP software before you can use [the SAP deployment automation framework on Azure](automation-deployment-framework.md). [Prepare your Azure environment](#configure-key-vault) so you can put the SAP media in your storage account. Then, [download the SAP software using Ansible playbooks](#download-sap-software).
+You need a copy of the SAP software before you can use [the SAP on Azure Deployment Automation Framework](automation-deployment-framework.md). [Prepare your Azure environment](#configure-key-vault) so you can put the SAP media in your storage account. Then, [download the SAP software using Ansible playbooks](#download-sap-software).
## Prerequisites
virtual-machines Automation Supportability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-supportability.md
- Title: Supportability matrix for the SAP deployment automation framework
-description: Supported platforms, topologies, and capabilities for the SAP deployment automation framework on Azure.
+ Title: Supportability matrix for the SAP on Azure Deployment Automation Framework
+description: Supported platforms, topologies, and capabilities for the SAP on Azure Deployment Automation Framework.
# Supportability matrix for the SAP Automation Framework
-The [SAP deployment automation framework on Azure](automation-deployment-framework.md) supports deployment of all the supported SAP on Azure topologies.
+The [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md) supports deployment of all the supported SAP on Azure topologies.
## Supported operating systems
virtual-machines Automation Tools Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-tools-configuration.md
Title: Configuring external tools for the SAP deployment automation framework
-description: Describes how to configure external tools for using SAP deployment automation framework.
+ Title: Configuring external tools for the SAP on Azure Deployment Automation Framework
+description: Describes how to configure external tools for using SAP on Azure Deployment Automation Framework.
-# Configuring external tools to use with the SAP deployment automation framework
+# Configuring external tools to use with the SAP on Azure Deployment Automation Framework
-This document describes how to configure external tools to use the SAP deployment automation framework.
+This document describes how to configure external tools to use the SAP on Azure Deployment Automation Framework.
## Configuring Visual Studio Code
virtual-machines Automation Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/automation-tutorial.md
Title: SAP deployment automation framework hands-on lab
-description: Hands-on lab for the SAP deployment automation framework on Azure.
+ Title: SAP on Azure Deployment Automation Framework hands-on lab
+description: Hands-on lab for the SAP on Azure Deployment Automation Framework.
-# Enterprise Scale for SAP deployment automation framework - Hands-on Lab
+# Enterprise Scale for SAP on Azure Deployment Automation Framework - Hands-on Lab
-This tutorial shows how to do enterprise scaling for deployments using the [SAP deployment automation framework on Azure](automation-deployment-framework.md). This example uses Azure Cloud Shell to deploy the control plane infrastructure. The deployer virtual machine (VM) creates the remaining infrastructure and SAP HANA configurations.
+This tutorial shows how to do enterprise scaling for deployments using the [SAP on Azure Deployment Automation Framework](automation-deployment-framework.md). This example uses Azure Cloud Shell to deploy the control plane infrastructure. The deployer virtual machine (VM) creates the remaining infrastructure and SAP HANA configurations.
You'll perform the following tasks during this lab:
The following diagram shows the dependency between the control plane and the app
The framework uses Terraform for infrastructure deployment, and Ansible for the operating system and application configuration. The following diagram shows the logical separation of the control plane and workload zone.
You configure the deployer and library in a Terraform `.tfvars` variable file. S
#### Workload Zone
-An SAP application typically has multiple deployment tiers. For example, you might have development, quality assurance, and production tiers. The SAP deployment automation framework refers to these tiers as workload zones.
+An SAP application typically has multiple deployment tiers. For example, you might have development, quality assurance, and production tiers. The SAP on Azure Deployment Automation Framework refers to these tiers as workload zones.
:::image type="content" source="./media/automation-deployment-framework/workload-zone.png" alt-text="Workload zone.":::
The system deployment consists of the virtual machines that will be running the
### Prerequisites
-The [SAP deployment automation framework repository](https://github.com/Azure/sap-automation) is available on GitHub.
+The [SAP on Azure Deployment Automation Framework repository](https://github.com/Azure/sap-automation) is available on GitHub.
You need an SSH client to connect to the Deployer. Use any SSH client that you feel comfortable with.
virtual-machines Automation Advanced_State_Management https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-advanced_state_management.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Install_Deployer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-install_deployer.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Install_Library https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-install_library.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Install_Workloadzone https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-install_workloadzone.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Installer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-installer.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Prepare Region https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-prepare-region.md
Licensed under the MIT license.
## Related Links
-+[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
++[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Remove Region https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-remove-region.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation )
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Remover https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-remover.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Set Secrets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-set-secrets.md
Copyright (c) Microsoft Corporation.
Licensed under the MIT license. ## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-hana)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-hana)
virtual-machines Automation Update_Sas_Token https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/bash/automation-update_sas_token.md
Copyright (c) Microsoft Corporation.
Licensed under the MIT license. ## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/get-started.md
Besides hosting SAP NetWeaver and S/4HANA scenarios with the different DBMS on A
We just announced our new services of Azure Center for SAP solutions and Azure Monitor for SAP solutions 2.0 entering the public preview stage. These services will give you the possibility to deploy SAP workload on Azure in a highly automated manner in an optimal architecture and configuration. And monitor your Azure infrastructure, OS, DBMS, and ABAP stack deployments on one single pane of glass.
-For customers and partners who are focussed on deploying and operating their assets in public cloud through Terraform and Ansible, leverage our SAP deployment automation framework to jump start your SAP deployments into Azure using our public Terraform and Ansible modules on [github](https://github.com/Azure/sap-automation).
+For customers and partners who are focused on deploying and operating their assets in the public cloud through Terraform and Ansible, use our SAP on Azure Deployment Automation Framework to jump-start your SAP deployments into Azure using our public Terraform and Ansible modules on [GitHub](https://github.com/Azure/sap-automation).
Hosting SAP workload scenarios in Azure also can create requirements of identity integration and single sign-on. This situation can occur when you use Azure Active Directory (Azure AD) to connect different SAP components and SAP software-as-a-service (SaaS) or platform-as-a-service (PaaS) offers. A list of such integration and single sign-on scenarios with Azure AD and SAP entities is described and documented in the section "Azure AD SAP identity integration and single sign-on."
virtual-machines Automation New Sapautomationregion https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-new-sapautomationregion.md
Licensed under the MIT license.
## Related Links
-+[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
++[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation New Sapdeployer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-new-sapdeployer.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation New Saplibrary https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-new-saplibrary.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation New Sapsystem https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-new-sapsystem.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation New Sapworkloadzone https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-new-sapworkloadzone.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-automation)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-automation)
virtual-machines Automation Remove Sapautomationregion https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-remove-sapautomationregion.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-hana)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-hana)
virtual-machines Automation Remove Sapsystem https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-remove-sapsystem.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-hana)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-hana)
virtual-machines Automation Set Sapsecrets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-set-sapsecrets.md
Copyright (c) Microsoft Corporation.
Licensed under the MIT license. ## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-hana)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-hana)
virtual-machines Automation Update Tfstate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/sap/module/automation-update-tfstate.md
Licensed under the MIT license.
## Related links
-[GitHub repository: SAP deployment automation framework](https://github.com/Azure/sap-hana)
+[GitHub repository: SAP on Azure Deployment Automation Framework](https://github.com/Azure/sap-hana)
virtual-network Associate Public Ip Address Vm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/associate-public-ip-address-vm.md
Previously updated : 10/26/2022 Last updated : 10/28/2022 -+ # Associate a public IP address to a virtual machine
virtual-network Public Ip Upgrade Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/public-ip-upgrade-cli.md
Previously updated : 10/25/2022- Last updated : 10/28/2022+ ms.devlang: azurecli
virtual-network Public Ip Upgrade Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/public-ip-upgrade-portal.md
Previously updated : 10/25/2022- Last updated : 10/28/2022+ # Upgrade a public IP address using the Azure portal
virtual-network Public Ip Upgrade Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/public-ip-upgrade-powershell.md
Previously updated : 10/25/2022- Last updated : 10/28/2022+ # Upgrade a public IP address using Azure PowerShell
virtual-network Virtual Networks Static Private Ip Arm Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/virtual-networks-static-private-ip-arm-cli.md
Title: Create a VM with a static private IP address - Azure CLI
+ Title: 'Create a VM with a static private IP address - Azure CLI'
description: Learn how to create a virtual machine with a static private IP address using the Azure CLI. Previously updated : 10/01/2021- Last updated : 10/28/2022+ ms.devlang: azurecli
The following command creates a Windows Server virtual machine. When prompted, p
--resource-group myResourceGroup \ --public-ip-address myPublicIP \ --public-ip-sku Standard \
- --size Standard_A2 \
--image MicrosoftWindowsServer:WindowsServer:2019-Datacenter:latest \ --admin-username azureuser ```
The following command changes the private IP address of the virtual machine to s
``` > [!WARNING]
-> Though you can add private IP address settings to the operating system, we recommend not doing so until after reading [Add a private IP address to an operating system](virtual-network-network-interface-addresses.md#private).
+> From within the operating system of a VM, you shouldn't statically assign the *private* IP that's assigned to the Azure VM. Only do static assignment of a private IP when it's necessary, such as when [assigning many IP addresses to VMs](virtual-network-multiple-ip-addresses-portal.md).
+>
+>If you manually set the private IP address within the operating system, make sure it matches the private IP address assigned to the Azure [network interface](virtual-network-network-interface-addresses.md#change-ip-address-settings). Otherwise, you can lose connectivity to the VM. Learn more about [private IP address](virtual-network-network-interface-addresses.md#private) settings.
## Clean up resources
When no longer needed, you can use [az group delete](/cli/azure/group#az-group-d
- Learn more about [public IP addresses](public-ip-addresses.md#public-ip-addresses) in Azure. - Learn more about all [public IP address settings](virtual-network-public-ip-address.md#create-a-public-ip-address). - Learn more about [private IP addresses](private-ip-addresses.md) and assigning a [static private IP address](virtual-network-network-interface-addresses.md#add-ip-addresses) to an Azure virtual machine.-- Learn more about creating [Linux](../../virtual-machines/windows/tutorial-manage-vm.md?toc=%2fazure%2fvirtual-network%2ftoc.json) and [Windows](../../virtual-machines/windows/tutorial-manage-vm.md?toc=%2fazure%2fvirtual-network%2ftoc.json) virtual machines.
+- Learn more about creating [Linux](../../virtual-machines/windows/tutorial-manage-vm.md) and [Windows](../../virtual-machines/windows/tutorial-manage-vm.md) virtual machines.
virtual-network Virtual Networks Static Private Ip Arm Pportal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/virtual-networks-static-private-ip-arm-pportal.md
Previously updated : 10/27/2022- Last updated : 10/28/2022+ # Create a virtual machine with a static private IP address using the Azure portal
virtual-network Virtual Networks Static Private Ip Arm Ps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/ip-services/virtual-networks-static-private-ip-arm-ps.md
Previously updated : 10/27/2022- Last updated : 10/28/2022+ # Create a virtual machine with a static private IP address using Azure PowerShell
virtual-network Manage Nat Gateway https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/nat-gateway/manage-nat-gateway.md
+
+ Title: Manage a NAT gateway
+
+description: Learn how to create and remove a NAT gateway resource from a virtual network subnet. Add and remove public IP addresses and prefixes used for outbound connectivity.
+++++ Last updated : 10/31/2022+++
+# Manage NAT gateway
+
+Learn how to create and remove a NAT gateway resource from a virtual network subnet. A NAT gateway enables outbound connectivity for resources in an Azure Virtual Network. You may wish to change the IP address or prefix your resources use for outbound connectivity to the internet. The public IP addresses and public IP prefixes associated with the NAT gateway can be changed after deployment.
+
+This article explains how to manage the following aspects of NAT gateway:
+
+- Create a NAT gateway and associate it with an existing subnet.
+
+- Remove a NAT gateway from an existing subnet and delete the resource.
+
+- Add or remove a public IP address or public IP prefix.
+
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+
+- An existing Azure Virtual Network. For information about creating an Azure Virtual Network, see [Quickstart: Create a virtual network using the Azure portal](/azure/virtual-network/quick-create-portal).
+
+ - The example virtual network used in this article is named **myVNet**. Replace the example value with the name of your virtual network.
+
+ - The example subnet used in this article is named **mySubnet**. Replace the example value with the name of your subnet.
+
+ - The example NAT gateway used in this article is named **myNATgateway**.
+
+
+- This how-to article requires version 2.31.0 or later of the Azure CLI. If using Azure Cloud Shell, the latest version is already installed.
+
+- Azure PowerShell installed locally or Azure Cloud Shell.
+
+- Sign in to Azure PowerShell and ensure you've selected the subscription with which you want to use this feature. For more information, see [Sign in with Azure PowerShell](/powershell/azure/authenticate-azureps).
+
+- Ensure your `Az.Network` module is 4.3.0 or later. To verify the installed module, use the command `Get-InstalledModule -Name "Az.Network"`. If the module requires an update, use the command `Update-Module -Name Az.Network`.
+
+If you choose to install and use PowerShell locally, this article requires the Azure PowerShell module version 5.4.1 or later. Run `Get-Module -ListAvailable Az` to find the installed version. If you need to upgrade, see [Install Azure PowerShell module](/powershell/azure/install-Az-ps). If you're running PowerShell locally, you also need to run `Connect-AzAccount` to create a connection with Azure.
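+
+The PowerShell prerequisite checks above can be combined into one session. This is a minimal sketch; the module name and the 4.3.0 minimum version come from the list above.
+
+```azurepowershell
+## Verify that the installed Az.Network module meets the 4.3.0 minimum. ##
+Get-InstalledModule -Name "Az.Network"
+
+## Update the module if the reported version is older than 4.3.0. ##
+Update-Module -Name Az.Network
+
+## When running locally, sign in to create a connection with Azure. ##
+Connect-AzAccount
+```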
+
+## Create a NAT gateway and associate it with an existing subnet
+
+You can create a NAT gateway resource and add it to an existing subnet with the Azure portal, PowerShell, and the Azure CLI.
+
+# [**Portal**](#tab/manage-nat-portal)
+
+1. Sign-in to the [Azure portal](https://portal.azure.com).
+
+2. In the search box at the top of the portal, enter **NAT gateway**. Select **NAT gateways** in the search results.
+
+3. Select **+ Create**.
+
+4. Enter or select the following information in the **Basics** tab of **Create network address translation (NAT) gateway**.
+
+ | Setting | Value |
+ | - | -- |
+ | **Project details** | |
+ | Subscription | Select your subscription. |
+ | Resource group | Select your resource group or select **Create new** to create a new resource group. |
+ | **Instance details** | |
+ | NAT gateway name | Enter **myNATgateway**. |
+ | Region | Select your region. **East US 2** is used in this example. |
+ | Availability zone | Select an availability zone. **No Zone** is used in this example. </br> For more information about NAT gateway availability, see [NAT gateway and availability zones](nat-availability-zones.md). |
+ | TCP idle timeout (minutes) | Select an idle timeout. The default of **4** is used in this example. |
+
+5. Select the **Outbound IP** tab, or select **Next: Outbound IP**.
+
+6. You can select an existing public IP address or prefix or both to associate with the NAT gateway and enable outbound connectivity.
+
+ - To create a new public IP for the NAT gateway, select **Create a new public IP address**. Enter **myPublicIP-NAT** in **Name**. Select **OK**.
+
+ - To create a new public IP prefix for the NAT gateway, select **Create a new public IP prefix**. Enter **myPublicIPPrefix-NAT** in **Name**. Select a **Prefix size**. Select **OK**.
+
+7. Select the **Subnet** tab, or select **Next: Subnet**.
+
+8. Select your virtual network or select **Create new** to create a new virtual network. In this example, select **myVNet** or your existing virtual network in the pull-down box.
+
+9. Select the checkbox next to **mySubnet** or your existing subnet.
+
+10. Select **Review + create**.
+
+11. Select **Create**.
+
+# [**PowerShell**](#tab/manage-nat-powershell)
+
+### Public IP address
+
+To create a NAT gateway with a public IP address, continue with the following steps.
+
+Use [New-AzPublicIpAddress](/powershell/module/az.network/new-azpublicipaddress) to create a public IP address for the NAT gateway.
+
+```azurepowershell
+## Create public IP address for NAT gateway ##
+$ip = @{
+ Name = 'myPublicIP-NAT'
+ ResourceGroupName = 'myResourceGroup'
+ Location = 'eastus2'
+ Sku = 'Standard'
+ AllocationMethod = 'Static'
+}
+New-AzPublicIpAddress @ip
+```
+
+Use [New-AzNatGateway](/powershell/module/az.network/new-aznatgateway) to create a NAT gateway resource and associate the public IP you created previously. You'll use [Set-AzVirtualNetworkSubnetConfig](/powershell/module/az.network/set-azvirtualnetworksubnetconfig) to configure the NAT gateway for your virtual network subnet.
+
+```azurepowershell
+## Place the virtual network into a variable. ##
+$net = @{
+ Name = 'myVNet'
+ ResourceGroupName = 'myResourceGroup'
+}
+$vnet = Get-AzVirtualNetwork @net
+
+## Place the public IP address you created previously into a variable. ##
+$pip = @{
+ Name = 'myPublicIP-NAT'
+ ResourceGroupName = 'myResourceGroup'
+}
+$publicIP = Get-AzPublicIPAddress @pip
+
+## Create NAT gateway resource ##
+$nat = @{
+ ResourceGroupName = 'myResourceGroupNAT'
+ Name = 'myNATgateway'
+ IdleTimeoutInMinutes = '10'
+ Sku = 'Standard'
+ Location = 'eastus2'
+ PublicIpAddress = $publicIP
+}
+$natGateway = New-AzNatGateway @nat
+
+## Create the subnet configuration. ##
+$sub = @{
+ Name = 'mySubnet'
+ VirtualNetwork = $vnet
+ NatGateway = $natGateway
+}
+Set-AzVirtualNetworkSubnetConfig @sub
+
+## Save the configuration to the virtual network. ##
+$vnet | Set-AzVirtualNetwork
+```
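+
+To confirm that the subnet is now associated with the NAT gateway, you can read the configuration back. This is an optional verification sketch using standard `Az.Network` cmdlets and the example names from this article.
+
+```azurepowershell
+## Retrieve the virtual network and inspect the subnet's NAT gateway association. ##
+$vnet = Get-AzVirtualNetwork -Name 'myVNet' -ResourceGroupName 'myResourceGroup'
+Get-AzVirtualNetworkSubnetConfig -Name 'mySubnet' -VirtualNetwork $vnet |
+    Select-Object Name, NatGateway
+```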
+
+### Public IP prefix
+
+To create a NAT gateway with a public IP prefix, continue with the following steps.
+
+Use [New-AzPublicIpPrefix](/powershell/module/az.network/new-azpublicipprefix) to create a public IP prefix for the NAT gateway.
+
+```azurepowershell
+## Create public IP prefix for NAT gateway ##
+$ip = @{
+ Name = 'myPublicIPPrefix-NAT'
+ ResourceGroupName = 'myResourceGroup'
+ Location = 'eastus2'
+ Sku = 'Standard'
+ PrefixLength ='29'
+}
+New-AzPublicIpPrefix @ip
+```
+
+Use [New-AzNatGateway](/powershell/module/az.network/new-aznatgateway) to create a NAT gateway resource and associate the public IP prefix you created previously. You'll use [Set-AzVirtualNetworkSubnetConfig](/powershell/module/az.network/set-azvirtualnetworksubnetconfig) to configure the NAT gateway for your virtual network subnet.
+
+```azurepowershell
+## Place the virtual network into a variable. ##
+$net = @{
+ Name = 'myVNet'
+ ResourceGroupName = 'myResourceGroup'
+}
+$vnet = Get-AzVirtualNetwork @net
+
+## Place the public IP prefix you created previously into a variable. ##
+$pip = @{
+ Name = 'myPublicIPPrefix-NAT'
+ ResourceGroupName = 'myResourceGroup'
+}
+$publicIPprefix = Get-AzPublicIPPrefix @pip
+
+## Create NAT gateway resource ##
+$nat = @{
+ ResourceGroupName = 'myResourceGroupNAT'
+ Name = 'myNATgateway'
+ IdleTimeoutInMinutes = '10'
+ Sku = 'Standard'
+ Location = 'eastus2'
+ PublicIpPrefix = $publicIPprefix
+}
+$natGateway = New-AzNatGateway @nat
+
+## Create the subnet configuration. ##
+$sub = @{
+ Name = 'mySubnet'
+ VirtualNetwork = $vnet
+ NatGateway = $natGateway
+}
+Set-AzVirtualNetworkSubnetConfig @sub
+
+## Save the configuration to the virtual network. ##
+$vnet | Set-AzVirtualNetwork
+```
+
+# [**Azure CLI**](#tab/manage-nat-cli)
+
+### Public IP address
+
+To create a NAT gateway with a public IP address, continue with the following steps.
+
+Use [az network public-ip create](/cli/azure/network/public-ip#az-network-public-ip-create) to create a public IP address for the NAT gateway.
+
+```azurecli
+az network public-ip create \
+ --resource-group myResourceGroup \
+ --location eastus2 \
+ --name myPublicIP-NAT \
+ --sku standard
+```
+
+Use [az network nat gateway create](/cli/azure/network/nat/gateway#az-network-nat-gateway-create) to create a NAT gateway resource and associate the public IP you created previously.
+
+```azurecli
+az network nat gateway create \
+ --resource-group myResourceGroup \
+ --name myNATgateway \
+ --public-ip-addresses myPublicIP-NAT \
+ --idle-timeout 10
+
+```
+
+Use [az network vnet subnet update](/cli/azure/network/vnet/subnet#az-network-vnet-subnet-update) to associate the NAT gateway with your virtual network subnet.
+
+```azurecli
+az network vnet subnet update \
+ --resource-group myResourceGroup \
+ --vnet-name myVNet \
+ --name mySubnet \
+ --nat-gateway myNATgateway
+```
+
+### Public IP prefix
+
+To create a NAT gateway with a public IP prefix, continue with the following steps.
+
+Use [az network public-ip prefix create](/cli/azure/network/public-ip/prefix#az-network-public-ip-prefix-create) to create a public IP prefix for the NAT gateway.
+
+```azurecli
+az network public-ip prefix create \
+ --length 29 \
+ --resource-group myResourceGroup \
+ --location eastus2 \
+ --name myPublicIPprefix-NAT
+```
+
+Use [az network nat gateway create](/cli/azure/network/nat/gateway#az-network-nat-gateway-create) to create a NAT gateway resource and associate the public IP prefix you created previously.
+
+```azurecli
+az network nat gateway create \
+ --resource-group myResourceGroup \
+ --name myNATgateway \
+ --public-ip-prefixes myPublicIPprefix-NAT \
+ --idle-timeout 10
+
+```
+
+Use [az network vnet subnet update](/cli/azure/network/vnet/subnet#az-network-vnet-subnet-update) to associate the NAT gateway with your virtual network subnet.
+```azurecli
+az network vnet subnet update \
+ --resource-group myResourceGroup \
+ --vnet-name myVNet \
+ --name mySubnet \
+ --nat-gateway myNATgateway
+```
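+
+To confirm the association, you can query the subnet and return the ID of the attached NAT gateway. This is an optional verification sketch using the example names from this article.
+
+```azurecli
+az network vnet subnet show \
+  --resource-group myResourceGroup \
+  --vnet-name myVNet \
+  --name mySubnet \
+  --query natGateway.id \
+  --output tsv
+```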
+++
+## Remove a NAT gateway from an existing subnet and delete the resource
+
+To remove a NAT gateway from an existing subnet, complete the following steps.
+
+# [**Portal**](#tab/manage-nat-portal)
+
+1. Sign-in to the [Azure portal](https://portal.azure.com).
+
+2. In the search box at the top of the portal, enter **NAT gateway**. Select **NAT gateways** in the search results.
+
+3. Select **myNATgateway** or the name of your NAT gateway.
+
+4. Select **Subnets** in **Settings**.
+
+5. Select **Disassociate** to remove the NAT gateway from the configured subnet.
+
+You can now associate the NAT gateway with a different subnet or virtual network in your subscription. To delete the NAT gateway resource, complete the following steps.
+
+1. In the search box at the top of the portal, enter **NAT gateway**. Select **NAT gateways** in the search results.
+
+2. Select **myNATgateway** or the name of your NAT gateway.
+
+3. Select **Delete**.
+
+4. Select **Yes**.
+
+# [**PowerShell**](#tab/manage-nat-powershell)
+
+Removing the NAT gateway from a subnet with Azure PowerShell is currently unsupported.
+
+# [**Azure CLI**](#tab/manage-nat-cli)
+
+Use [az network vnet subnet update](/cli/azure/network/vnet/subnet#az-network-vnet-subnet-update) to remove the NAT gateway from the subnet.
+
+```azurecli
+az network vnet subnet update \
+  --resource-group myResourceGroup \
+  --vnet-name myVNet \
+  --name mySubnet \
+  --remove natGateway
+```
+
+Use [az network nat gateway delete](/cli/azure/network/nat/gateway#az-network-nat-gateway-delete) to delete the NAT gateway resource.
+
+```azurecli
+az network nat gateway delete \
+ --name myNATgateway \
+ --resource-group myResourceGroup
+```
+++
+> [!NOTE]
+> The public IP addresses or prefixes associated with the NAT gateway aren't deleted when you delete the NAT gateway resource.
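+
+If a public IP address is no longer needed after the NAT gateway is deleted, you can delete it separately. This is a minimal sketch, assuming the example address **myPublicIP-NAT** created earlier in this article and that no other resource references it.
+
+```azurecli
+az network public-ip delete \
+  --resource-group myResourceGroup \
+  --name myPublicIP-NAT
+```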
+
+## Add or remove a public IP address
+
+Complete the following steps to add or remove a public IP address from a NAT gateway.
+
+# [**Portal**](#tab/manage-nat-portal)
+
+1. Sign-in to the [Azure portal](https://portal.azure.com).
+
+2. In the search box at the top of the portal, enter **Public IP address**. Select **Public IP addresses** in the search results.
+
+3. Select **+ Create**.
+
+4. Enter or select the following information in **Create public IP address**.
+
+ | Setting | Value |
+ | - | -- |
+ | IP version | Select **IPv4**. |
+ | SKU | Select **Standard**. |
+ | Tier | Select **Regional**. |
+ | **IPv4 IP Address Configuration** | |
+ | Name | Enter **myPublicIP-NAT2**. |
+ | Routing preference | Leave the default of **Microsoft network**. |
+ | Subscription | Select your subscription. |
+ | Resource group | Select your resource group. **myResourceGroup** is used in this example. |
+ | Location | Select a location. **East US 2** is used in this example. |
+ | Availability zone | Leave the default of **Zone-redundant**. |
+
+5. Select **Create**.
+
+6. In the search box at the top of the portal, enter **NAT gateway**. Select **NAT gateways** in the search results.
+
+7. Select **myNATgateway** or the name of your NAT gateway.
+
+8. Select **Outbound IP** in **Settings**.
+
+9. The IP addresses and prefixes associated with the NAT gateway are displayed. Select **Change** next to **Public IP addresses**.
+
+10. Select the pull-down box next to **Public IP addresses**. Select the checkbox next to the IP address you created previously to add the IP address to the NAT gateway. To remove an address, uncheck the box next to its name.
+
+11. Select **OK**.
+
+12. Select **Save**.
+
+# [**PowerShell**](#tab/manage-nat-powershell)
+
+### Add public IP address
+
+The public IP address that you want to add to the NAT gateway must be placed in an array along with the IP addresses currently associated with the gateway. The PowerShell cmdlets perform a full replacement of the values, not an addition, when they're executed.
+
+For the purposes of this example, the existing IP address associated with the NAT gateway is named **myPublicIP-NAT**. Replace this value with the existing IP associated with your NAT gateway. If you have multiple IPs already configured, they must also be added to the array.
+
+Use [New-AzPublicIpAddress](/powershell/module/az.network/new-azpublicipaddress) to create a new IP address for the NAT gateway.
+
+```azurepowershell
+## Create public IP address for NAT gateway ##
+$ip = @{
+ Name = 'myPublicIP-NAT2'
+ ResourceGroupName = 'myResourceGroup'
+ Location = 'eastus2'
+ Sku = 'Standard'
+ AllocationMethod = 'Static'
+}
+New-AzPublicIpAddress @ip
+```
+
+Use [Set-AzNatGateway](/powershell/module/az.network/set-aznatgateway) to add the public IP address to the NAT gateway.
+
+```azurepowershell
+## Place NAT gateway into a variable. ##
+$ng = @{
+ Name = 'myNATgateway'
+ ResourceGroupName = 'myResourceGroup'
+}
+$nat = Get-AzNatGateway @ng
+
+## Place the existing public IP address associated with the NAT gateway into a variable. ##
+$ip = @{
+ Name = 'myPublicIP-NAT'
+ ResourceGroupName = 'myResourceGroup'
+}
+$publicIP1 = Get-AzPublicIpAddress @ip
+
+## Place the public IP address you created previously into a variable. ##
+$ip = @{
+ Name = 'myPublicIP-NAT2'
+ ResourceGroupName = 'myResourceGroup'
+}
+$publicIP2 = Get-AzPublicIpAddress @ip
+
+## Place the public IP address variables into an array. ##
+$pipArray = $publicIP1,$publicIP2
+
+## Add the IP address to the NAT gateway. ##
+$nt = @{
+ NatGateway = $nat
+ PublicIpAddress = $pipArray
+}
+Set-AzNatGateway @nt
+```
+
+### Remove public IP address
+
+To remove a public IP from a NAT gateway, you must create an array object that **doesn't** contain the IP address you wish to remove. For example, you have a NAT gateway configured with two public IP addresses. You wish to remove one of the IP addresses. The IP addresses associated with the NAT gateway are named **myPublicIP-NAT** and **myPublicIP-NAT2**. To remove **myPublicIP-NAT2**, you create an array object for the PowerShell command that **only** contains **myPublicIP-NAT**. When you apply the command, the array is reapplied to the NAT gateway, and **myPublicIP-NAT** is the only public IP associated.
+
+Use [Set-AzNatGateway](/powershell/module/az.network/set-aznatgateway) to remove a public IP address from the NAT gateway.
+
+```azurepowershell
+## Place NAT gateway into a variable. ##
+$ng = @{
+ Name = 'myNATgateway'
+ ResourceGroupName = 'myResourceGroup'
+}
+$nat = Get-AzNatGateway @ng
+
+## Place the existing public IP address associated with the NAT gateway into a variable. ##
+$ip = @{
+    Name = 'myPublicIP-NAT'
+    ResourceGroupName = 'myResourceGroup'
+}
+$publicIP1 = Get-AzPublicIpAddress @ip
+
+## Place the secondary public IP address into a variable. ##
+$ip = @{
+ Name = 'myPublicIP-NAT2'
+ ResourceGroupName = 'myResourceGroup'
+}
+$publicIP2 = Get-AzPublicIPAddress @ip
+
+## Place ONLY the public IP you wish to keep in the array. ##
+$pipArray = $publicIP1
+
+## Apply the array with the remaining public IP address to the NAT gateway. ##
+$nt = @{
+ NatGateway = $nat
+ PublicIpAddress = $pipArray
+}
+Set-AzNatGateway @nt
+```
+
+# [**Azure CLI**](#tab/manage-nat-cli)
+
+### Add public IP address
+
+For the purposes of this example, the existing public IP address associated with the NAT gateway is named **myPublicIP-NAT**.
+
+Use [az network public-ip create](/cli/azure/network/public-ip#az-network-public-ip-create) to create a new IP address for the NAT gateway.
+
+```azurecli
+az network public-ip create \
+ --resource-group myResourceGroup \
+ --location eastus2 \
+ --name myPublicIP-NAT2 \
+ --sku standard
+```
+
+Use [az network nat gateway update](/cli/azure/network/nat/gateway#az-network-nat-gateway-update) to add the public IP address you created previously to the NAT gateway. The Azure CLI command performs a replacement of the values, not an addition. To add the new IP address to the NAT gateway, you must also include any other IP addresses associated to the NAT gateway, or they'll be removed.
+
+```azurecli
+az network nat gateway update \
+ --name myNATgateway \
+ --resource-group myResourceGroup \
+ --public-ip-addresses myPublicIP-NAT myPublicIP-NAT2
+```
+
+### Remove public IP address
+
+Use [az network nat gateway update](/cli/azure/network/nat/gateway#az-network-nat-gateway-update) to remove a public IP address from the NAT gateway. The Azure CLI command performs a replacement of the values, not a subtraction. To remove a public IP address, list every IP address you wish to keep and omit the one you wish to remove. For example, suppose your NAT gateway is configured with two public IP addresses named **myPublicIP-NAT** and **myPublicIP-NAT2**, and you wish to remove **myPublicIP-NAT2**. Omit **myPublicIP-NAT2** from the command. The command reapplies the IP addresses listed to the NAT gateway; any IP address not listed is removed.
+
+```azurecli
+az network nat gateway update \
+ --name myNATgateway \
+ --resource-group myResourceGroup \
+ --public-ip-addresses myPublicIP-NAT
+```
+++
+## Add or remove a public IP prefix
+
+Complete the following steps to add or remove a public IP prefix from a NAT gateway.
+
+# [**Portal**](#tab/manage-nat-portal)
+
+1. Sign in to the [Azure portal](https://portal.azure.com).
+
+2. In the search box at the top of the portal, enter **Public IP prefix**. Select **Public IP Prefixes** in the search results.
+
+3. Select **+ Create**.
+
+4. Enter or select the following information in the **Basics** tab of **Create a public IP prefix**.
+
+ | Setting | Value |
+ | - | -- |
+ | **Project details** | |
+ | Subscription | Select your subscription. |
+ | Resource group | Select your resource group. **myResourceGroup** is used in this example. |
+ | **Instance details** | |
+ | Name | Enter **myPublicIPPrefix-NAT**. |
+ | Region | Select your region. **East US 2** is used in this example. |
+ | IP version | Select **IPv4**. |
+ | Prefix ownership | Select **Microsoft owned**. |
+ | Prefix size | Select a prefix size. **/28 (16 addresses)** is used in this example. |
+
+5. Select **Review + create**.
+
+6. Select **Create**.
+
+7. In the search box at the top of the portal, enter **NAT gateway**. Select **NAT gateways** in the search results.
+
+8. Select **myNATgateway** or the name of your NAT gateway.
+
+9. Select **Outbound IP** in **Settings**.
+
+10. The IP addresses and prefixes associated with the NAT gateway are displayed. Select **Change** next to **Public IP prefixes**.
+
+11. Select the pull-down box next to **Public IP Prefixes**. Select the checkbox next to the IP address prefix you created previously to add the prefix to the NAT gateway. To remove a prefix, uncheck the box next to its name.
+
+12. Select **OK**.
+
+13. Select **Save**.
+
+# [**PowerShell**](#tab/manage-nat-powershell)
+
+### Add public IP prefix
+
+The public IP prefix that you want to add to the NAT gateway must be placed in an array along with the prefixes currently associated with the gateway. The PowerShell cmdlets perform a full replacement of the values, not an addition, when they're executed.
+
+For the purposes of this example, the existing public IP prefix associated with the NAT gateway is named **myPublicIPprefix-NAT**. Replace this value with the existing IP prefix associated with your NAT gateway. If you have multiple prefixes already configured, they must also be added to the array.
+
+Use [New-AzPublicIpPrefix](/powershell/module/az.network/new-azpublicipprefix) to create a new public IP prefix for the NAT gateway.
+
+```azurepowershell
+## Create public IP prefix for NAT gateway ##
+$ip = @{
+ Name = 'myPublicIPPrefix-NAT2'
+ ResourceGroupName = 'myResourceGroup'
+ Location = 'eastus2'
+ Sku = 'Standard'
+ PrefixLength = '29'
+}
+New-AzPublicIpPrefix @ip
+```
+
+Use [Set-AzNatGateway](/powershell/module/az.network/set-aznatgateway) to add the public IP prefix to the NAT gateway.
+
+```azurepowershell
+## Place NAT gateway into a variable. ##
+$ng = @{
+ Name = 'myNATgateway'
+ ResourceGroupName = 'myResourceGroup'
+}
+$nat = Get-AzNatGateway @ng
+
+## Place the existing public IP prefix associated with the NAT gateway into a variable. ##
+$ip = @{
+ Name = 'myPublicIPprefix-NAT'
+ ResourceGroupName = 'myResourceGroup'
+}
+$prefixIP1 = Get-AzPublicIPPrefix @ip
+
+## Place the public IP prefix you created previously into a variable. ##
+$ip = @{
+ Name = 'myPublicIPprefix-NAT2'
+ ResourceGroupName = 'myResourceGroup'
+}
+$prefixIP2 = Get-AzPublicIpPrefix @ip
+
+## Place the public IP address variables into an array. ##
+$preArray = $prefixIP1,$prefixIP2
+
+## Add the IP address prefix to the NAT gateway. ##
+$nt = @{
+ NatGateway = $nat
+ PublicIpPrefix = $preArray
+}
+Set-AzNatGateway @nt
+```
+
+### Remove public IP prefix
+
+To remove a public IP prefix from a NAT gateway, you must create an array object that **doesn't** contain the IP address prefix you wish to remove. For example, you have a NAT gateway configured with two public IP prefixes. You wish to remove one of the IP prefixes. The IP prefixes associated with the NAT gateway are named **myPublicIPprefix-NAT** and **myPublicIPprefix-NAT2**. To remove **myPublicIPprefix-NAT2**, you create an array object for the PowerShell command that **only** contains **myPublicIPprefix-NAT**. When you apply the command, the array is reapplied to the NAT gateway, and **myPublicIPprefix-NAT** is the only prefix associated.
+
+Use [Set-AzNatGateway](/powershell/module/az.network/set-aznatgateway) to remove a public IP prefix from the NAT gateway.
+
+```azurepowershell
+## Place NAT gateway into a variable. ##
+$ng = @{
+ Name = 'myNATgateway'
+ ResourceGroupName = 'myResourceGroup'
+}
+$nat = Get-AzNatGateway @ng
+
+## Place the existing public IP prefix associated with the NAT gateway into a variable. ##
+$ip = @{
+ Name = 'myPublicIPprefix-NAT'
+ ResourceGroupName = 'myResourceGroup'
+}
+$prefixIP1 = Get-AzPublicIPPrefix @ip
+
+## Place the secondary public IP prefix into a variable. ##
+$ip = @{
+ Name = 'myPublicIPprefix-NAT2'
+ ResourceGroupName = 'myResourceGroup'
+}
+$prefixIP2 = Get-AzPublicIpPrefix @ip
+
+## Place ONLY the prefix you wish to keep in the array. DO NOT ADD THE SECONDARY VARIABLE ##
+$preArray = $prefixIP1
+
+## Apply the array with the remaining public IP prefix to the NAT gateway. ##
+$nt = @{
+ NatGateway = $nat
+ PublicIpPrefix = $preArray
+}
+Set-AzNatGateway @nt
+```
+
+# [**Azure CLI**](#tab/manage-nat-cli)
+
+### Add public IP prefix
+
+For the purposes of this example, the existing public IP prefix associated with the NAT gateway is named **myPublicIPprefix-NAT**.
+
+Use [az network public-ip prefix create](/cli/azure/network/public-ip/prefix#az-network-public-ip-prefix-create) to create a public IP prefix for the NAT gateway.
+
+```azurecli
+az network public-ip prefix create \
+ --length 29 \
+ --resource-group myResourceGroup \
+ --location eastus2 \
+ --name myPublicIPprefix-NAT2
+```
+
+Use [az network nat gateway update](/cli/azure/network/nat/gateway#az-network-nat-gateway-update) to add the public IP prefix you created previously to the NAT gateway. The Azure CLI command performs a replacement of the values, not an addition. To add the new IP address prefix to the NAT gateway, you must also include any other IP prefixes associated to the NAT gateway, or they'll be removed.
+
+```azurecli
+az network nat gateway update \
+ --name myNATgateway \
+ --resource-group myResourceGroup \
+ --public-ip-prefixes myPublicIPprefix-NAT myPublicIPprefix-NAT2
+```
+
+### Remove public IP prefix
+
+Use [az network nat gateway update](/cli/azure/network/nat/gateway#az-network-nat-gateway-update) to remove a public IP prefix from the NAT gateway. The Azure CLI command performs a replacement of the values, not a subtraction. To remove a public IP prefix, list every prefix you wish to keep and omit the one you wish to remove. For example, suppose your NAT gateway is configured with two public IP prefixes named **myPublicIPprefix-NAT** and **myPublicIPprefix-NAT2**, and you wish to remove **myPublicIPprefix-NAT2**. Omit **myPublicIPprefix-NAT2** from the command. The command reapplies the prefixes listed to the NAT gateway; any prefix not listed is removed.
+
+```azurecli
+az network nat gateway update \
+ --name myNATgateway \
+ --resource-group myResourceGroup \
+ --public-ip-prefixes myPublicIPprefix-NAT
+```
++
+## Next steps
+To learn more about Azure Virtual Network NAT and its capabilities, see the following articles:
+
+- [What is Azure Virtual Network NAT?](nat-overview.md)
+
+- [NAT gateway and availability zones](nat-availability-zones.md)
+
+- [Design virtual networks with NAT gateway](nat-gateway-resource.md)
+
virtual-network Service Tags Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/service-tags-overview.md
By default, service tags reflect the ranges for the entire cloud. Some service tags also allow more granular control by restricting the corresponding IP ranges to a specified region.
| **AppConfiguration** | App Configuration. | Outbound | No | No |
| **AppService** | Azure App Service. This tag is recommended for outbound security rules to web apps and Function apps.<br/><br/>**Note**: This tag doesn't include IP addresses assigned when using IP-based SSL (App-assigned address). | Outbound | Yes | Yes |
| **AppServiceManagement** | Management traffic for deployments dedicated to App Service Environment. | Both | No | Yes |
-| **AutonomousDevelopmentPlatform** | Autonomous Development Platform | Both | Yes | Yes |
+| **AutonomousDevelopmentPlatform** | Autonomous Development Platform | Both | Yes | No |
| **AzureActiveDirectory** | Azure Active Directory. | Outbound | No | Yes |
| **AzureActiveDirectoryDomainServices** | Management traffic for deployments dedicated to Azure Active Directory Domain Services. | Both | No | Yes |
| **AzureAdvancedThreatProtection** | Azure Advanced Threat Protection. | Outbound | No | No |
| **AzureCloud** | All [datacenter public IP addresses](https://www.microsoft.com/download/details.aspx?id=56519). | Both | Yes | Yes |
| **AzureCognitiveSearch** | Azure Cognitive Search. <br/><br/>This tag or the IP addresses covered by this tag can be used to grant indexers secure access to data sources. For more information about indexers, see [indexer connection documentation](../search/search-indexer-troubleshooting.md#connection-errors). <br/><br/> **Note**: The IP of the search service isn't included in the list of IP ranges for this service tag and **also needs to be added** to the IP firewall of data sources. | Inbound | No | No |
| **AzureConnectors** | This tag represents the IP addresses used for managed connectors that make inbound webhook callbacks to the Azure Logic Apps service and outbound calls to their respective services, for example, Azure Storage or Azure Event Hubs. | Both | Yes | Yes |
-| **AzureContainerAppsService** | Azure Container Apps Service | Both | Yes | Yes |
+| **AzureContainerAppsService** | Azure Container Apps Service | Both | Yes | No |
| **AzureContainerRegistry** | Azure Container Registry. | Outbound | Yes | Yes |
| **AzureCosmosDB** | Azure Cosmos DB. | Outbound | Yes | Yes |
| **AzureDatabricks** | Azure Databricks. | Both | No | No |
| **AzurePlatformIMDS** | Azure Instance Metadata Service (IMDS), which is a basic infrastructure service.<br/><br/>You can use this tag to disable the default IMDS. Be cautious when you use this tag. We recommend that you read [Azure platform considerations](./network-security-groups-overview.md#azure-platform-considerations). We also recommend that you perform testing before you use this tag. | Outbound | No | No |
| **AzurePlatformLKM** | Windows licensing or key management service.<br/><br/>You can use this tag to disable the defaults for licensing. Be cautious when you use this tag. We recommend that you read [Azure platform considerations](./network-security-groups-overview.md#azure-platform-considerations). We also recommend that you perform testing before you use this tag. | Outbound | No | No |
| **AzureResourceManager** | Azure Resource Manager. | Outbound | No | No |
-| **AzureSentinel** | Microsoft Sentinel. | Inbound | Yes | Yes |
+| **AzureSentinel** | Microsoft Sentinel. | Inbound | Yes | No |
| **AzureSignalR** | Azure SignalR. | Outbound | No | No |
| **AzureSiteRecovery** | Azure Site Recovery.<br/><br/>**Note**: This tag has a dependency on the **AzureActiveDirectory**, **AzureKeyVault**, **EventHub**, **GuestAndHybridManagement** and **Storage** tags. | Outbound | No | No |
| **AzureSphere** | This tag or the IP addresses covered by this tag can be used to restrict access to Azure Sphere Security Services. | Both | No | Yes |
| **AzureStack** | Azure Stack Bridge services. </br> This tag represents the Azure Stack Bridge service endpoint per region. | Outbound | No | Yes |
| **AzureTrafficManager** | Azure Traffic Manager probe IP addresses.<br/><br/>For more information on Traffic Manager probe IP addresses, see [Azure Traffic Manager FAQ](../traffic-manager/traffic-manager-faqs.md). | Inbound | No | Yes |
| **AzureUpdateDelivery** | For accessing Windows Updates. <br/><br/>**Note**: This tag provides access to Windows Update metadata services. To successfully download updates, you must also enable the **AzureFrontDoor.FirstParty** service tag and configure outbound security rules with the protocol and port defined as follows: <ul><li>AzureUpdateDelivery: TCP, port 443</li><li>AzureFrontDoor.FirstParty: TCP, port 80</li></ul> | Outbound | No | No |
+| **AzureWebPubSub** | Azure Web PubSub. | Both | Yes | No |
| **BatchNodeManagement** | Management traffic for deployments dedicated to Azure Batch. | Both | No | Yes |
-| **ChaosStudio** | Azure Chaos Studio. <br/><br/>**Note**: If you have enabled Application Insights integration on the Chaos Agent, the AzureMonitor tag is also required. | Both | Yes | Yes |
+| **ChaosStudio** | Azure Chaos Studio. <br/><br/>**Note**: If you have enabled Application Insights integration on the Chaos Agent, the AzureMonitor tag is also required. | Both | Yes | No |
| **CognitiveServicesManagement** | The address ranges for traffic for Azure Cognitive Services. | Both | No | No |
| **DataFactory** | Azure Data Factory. | Both | No | No |
| **DataFactoryManagement** | Management traffic for Azure Data Factory. | Outbound | No | No |
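As an illustration, a network security group rule can reference one of these tags by name in place of explicit IP ranges. The following is a sketch with the Azure CLI — the resource group, NSG, and rule names are hypothetical:

```azurecli
az network nsg rule create \
    --resource-group myResourceGroup \
    --nsg-name myNSG \
    --name AllowAppServiceOutbound \
    --priority 100 \
    --direction Outbound \
    --access Allow \
    --protocol Tcp \
    --destination-port-ranges 443 \
    --destination-address-prefixes AppService
```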
virtual-wan User Groups Create https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-wan/user-groups-create.md
Before beginning, make sure you've configured a virtual WAN that uses one or mor
1. Every address pool specified on the gateway. Address pools are split into two address pools and assigned to each active-active instance in a point-to-site VPN gateway pair. These split addresses should show up in the effective route table. For example, if you specify 10.0.0.0/24, you should see two /25 routes in the effective route table. If this isn't the case, try changing the address pools defined on the gateway.
1. Make sure all point-to-site VPN connection configurations are associated to the defaultRouteTable and propagate to the same set of route tables. This should be configured automatically if you're using the portal, but if you're using REST, PowerShell, or CLI, make sure all propagations and associations are set appropriately.
1. If you're using the Azure VPN client, make sure the Azure VPN client installed on user devices is the latest version.
+1. If you're using Azure Active Directory authentication, make sure the tenant URL input in the server configuration (`https://login.microsoftonline.com/<tenant ID>`) does **not** end in a `\`. If the URL ends with `\`, the gateway can't properly process Azure Active Directory user groups, and all users are assigned to the default group. To remediate, modify the server configuration to remove the trailing `\`, then modify the address pools configured on the gateway to apply the changes. This is a known issue that will be fixed in a later release.
## Next steps
-* For more information about user groups, see [About user groups and IP address pools for P2S User VPNs](user-groups-about.md).
+* For more information about user groups, see [About user groups and IP address pools for P2S User VPNs](user-groups-about.md).
web-application-firewall Waf Front Door Create Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/web-application-firewall/afds/waf-front-door-create-portal.md
Previously updated : 10/21/2022 Last updated : 10/28/2022 + # Tutorial: Create a Web Application Firewall policy on Azure Front Door using the Azure portal
web-application-firewall Application Gateway Web Application Firewall Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/web-application-firewall/ag/application-gateway-web-application-firewall-portal.md
Previously updated : 10/21/2022 Last updated : 10/28/2022 + #Customer intent: As an IT administrator, I want to use the Azure portal to set up an application gateway with Web Application Firewall so I can protect my applications.