Updates from: 08/27/2021 03:05:38
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Customize Ui With Html https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/customize-ui-with-html.md
When using your own HTML and CSS files to customize the UI, host your UI content
- Use an absolute URL when you include external resources like media, CSS, and JavaScript files in your HTML file.
- Using [page layout version](page-layout.md) 1.2.0 and above, you can add the `data-preload="true"` attribute in your HTML tags to control the load order for CSS and JavaScript. With `data-preload="true"`, the page is constructed before being shown to the user. This attribute helps prevent the page from "flickering" by preloading the CSS file, without the un-styled HTML being shown to the user. The following HTML code snippet shows the use of the `data-preload` tag.
- ```HTML
+
+ ```html
<link href="https://path-to-your-file/sample.css" rel="stylesheet" type="text/css" data-preload="true"/>
```

- We recommend that you start with the default page content and build on top of it.
When using your own HTML and CSS files to customize the UI, host your UI content
## Localize content
-You localize your HTML content by enabling [language customization](language-customization.md) in your Azure AD B2C tenant. Enabling this feature allows Azure AD B2C to forward the OpenID Connect parameter `ui_locales` to your endpoint. Your content server can use this parameter to provide language-specific HTML pages.
+You localize your HTML content by enabling [language customization](language-customization.md) in your Azure AD B2C tenant. Enabling this feature allows Azure AD B2C to set the HTML page language attribute and pass the OpenID Connect parameter `ui_locales` to your endpoint.
-> [!NOTE]
-> Azure AD B2C doesn't pass OpenID Connect parameters, such as `ui_locales` to the [Exception pages](page-layout.md#exception-page-globalexception).
+#### Single-template approach
+
+During page load, Azure AD B2C sets the HTML page language attribute to the current language. For example, `<html lang="en">`. To render different styles based on the current language, use the CSS `:lang` selector in your CSS definitions.
+
+The following example defines these classes:
+
+* `imprint-en` - Used when the current language is English.
+* `imprint-de` - Used when the current language is German.
+* `imprint` - Default class that is used when the current language is neither English nor German.
+
+```css
+.imprint-en:lang(en),
+.imprint-de:lang(de) {
+ display: inherit !important;
+}
+.imprint {
+ display: none;
+}
+```
+The following HTML elements will be shown or hidden according to the page language:
+
+```html
+<a class="imprint imprint-en" href="Link EN">Imprint</a>
+<a class="imprint imprint-de" href="Link DE">Impressum</a>
+```
+
+#### Multi-template approach
+
+The language customization feature allows Azure AD B2C to pass the OpenID Connect parameter `ui_locales` to your endpoint. Your content server can use this parameter to provide language-specific HTML pages.
+
+> [!NOTE]
+> Azure AD B2C doesn't pass OpenID Connect parameters, such as `ui_locales`, to the [exception pages](page-layout.md#exception-page-globalexception).
Content can be pulled from different places based on the locale that's used. In your CORS-enabled endpoint, you set up a folder structure to host content for specific languages. If you use the wildcard value `{Culture:RFC5646}`, Azure AD B2C requests the content from the folder that matches the current locale.
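For example, in a custom policy you can put the wildcard directly in a content definition's `LoadUri`. The following fragment is a sketch; the content definition ID and storage URL are placeholders, and Azure AD B2C replaces `{Culture:RFC5646}` with the current locale (such as `fr`) when it requests the page.

```xml
<!-- Sketch: placeholder storage URL; {Culture:RFC5646} is replaced with the current locale at page load. -->
<ContentDefinition Id="api.signuporsignin">
  <LoadUri>https://contoso.blob.core.windows.net/{Culture:RFC5646}/myHTMLPage.html</LoadUri>
</ContentDefinition>
```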
active-directory-b2c Deploy Custom Policies Devops https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/deploy-custom-policies-devops.md
Previously updated : 06/01/2021 Last updated : 08/26/2021
Param(
[Parameter(Mandatory = $true)][string]$ClientID,
[Parameter(Mandatory = $true)][string]$ClientSecret,
[Parameter(Mandatory = $true)][string]$TenantId,
- [Parameter(Mandatory = $true)][string]$PolicyId,
- [Parameter(Mandatory = $true)][string]$PathToFile
+ [Parameter(Mandatory = $true)][string]$Folder,
+ [Parameter(Mandatory = $true)][string]$Files
) try {
try {
$headers.Add("Content-Type", 'application/xml')
$headers.Add("Authorization", 'Bearer ' + $token)
- $graphuri = 'https://graph.microsoft.com/beta/trustframework/policies/' + $PolicyId + '/$value'
- $policycontent = Get-Content $PathToFile
-
- # Optional: Change the content of the policy. For example, replace the tenant-name with your tenant name.
- # $policycontent = $policycontent.Replace("your-tenant.onmicrosoft.com", "contoso.onmicrosoft.com")
-
- $response = Invoke-RestMethod -Uri $graphuri -Method Put -Body $policycontent -Headers $headers
-
- Write-Host "Policy" $PolicyId "uploaded successfully."
+ # Get the list of files to upload
+ $filesArray = $Files.Split(",")
+
+ Foreach ($file in $filesArray) {
+
+ $filePath = $Folder + $file.Trim()
+
+ # Check if file exists
+ $FileExists = Test-Path -Path $filePath -PathType Leaf
+
+ if ($FileExists) {
+ $policycontent = Get-Content $filePath
+
+ # Optional: Change the content of the policy. For example, replace the tenant-name with your tenant name.
+ # $policycontent = $policycontent.Replace("your-tenant.onmicrosoft.com", "contoso.onmicrosoft.com")
+
+
+ # Get the policy name from the XML document
+ $match = Select-String -InputObject $policycontent -Pattern '(?<=\bPolicyId=")[^"]*'
+
+ If ($match.matches.groups.count -ge 1) {
+ $PolicyId = $match.matches.groups[0].value
+
+ Write-Host "Uploading the" $PolicyId "policy..."
+
+ $graphuri = 'https://graph.microsoft.com/beta/trustframework/policies/' + $PolicyId + '/$value'
+ $response = Invoke-RestMethod -Uri $graphuri -Method Put -Body $policycontent -Headers $headers
+
+ Write-Host "Policy" $PolicyId "uploaded successfully."
+ }
+ }
+ else {
+ $warning = "File " + $filePath + " couldn't be found."
+ Write-Warning -Message $warning
+ }
+ }
}
catch {
    Write-Host "StatusCode:" $_.Exception.Response.StatusCode.value__
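If you want to verify the script locally before wiring it into the pipeline, you can call it directly. The following invocation is a sketch with placeholder values; note the trailing slash on `-Folder`, because the script concatenates the folder and each file name.

```PowerShell
# Local test run of DeployToB2C.ps1 with placeholder values.
# In the pipeline, these values come from pipeline variables instead.
.\DeployToB2C.ps1 `
    -ClientID "00000000-0000-0000-0000-000000000000" `
    -ClientSecret "your-client-secret" `
    -TenantId "your-tenant.onmicrosoft.com" `
    -Folder "C:\B2CAssets\" `
    -Files "TrustFrameworkBase.xml,TrustFrameworkExtensions.xml,SignUpOrSignin.xml"
```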
A pipeline task is a pre-packaged script that performs an action. Add a task tha
* **Display name**: The name of the policy that this task should upload. For example, *B2C_1A_TrustFrameworkBase*.
* **Type**: File Path
* **Script Path**: Select the ellipsis (***...***), navigate to the *Scripts* folder, and then select the *DeployToB2C.ps1* file.
- * **Arguments:**
-
- Enter the following values for **Arguments**. Replace the `{alias-name}` with the alias you specified in the previous section. Replace the `{policy-id}` with the policy name. Replace the `{policy-file-name}` with the policy file name.
-
- The first policy your upload must be the *TrustFrameworkBase.xml*.
-
- ```PowerShell
- -ClientID $(clientId) -ClientSecret $(clientSecret) -TenantId $(tenantId) -PolicyId {policy-id} -PathToFile $(System.DefaultWorkingDirectory)/{alias-name}/B2CAssets/{policy-file-name}
- ```
-
- The `PolicyId` is a value found at the start of an XML policy file within the TrustFrameworkPolicy node. For example, the `PolicyId` in the following policy XML is *B2C_1A_TrustFrameworkBase*:
-
- ```xml
- <TrustFrameworkPolicy
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
- xmlns:xsd="http://www.w3.org/2001/XMLSchema"
- xmlns="http://schemas.microsoft.com/online/cpim/schemas/2013/06"
- PolicySchemaVersion="0.3.0.0"
- TenantId="your-tenant.onmicrosoft.com"
- PolicyId= "B2C_1A_TrustFrameworkBase"
- PublicPolicyUri="http://your-tenant.onmicrosoft.com/B2C_1A_TrustFrameworkBase">
- ```
+ * **Arguments**: Enter the following PowerShell script.
- Your final arguments should look like the following example:
```PowerShell
- -ClientID $(clientId) -ClientSecret $(clientSecret) -TenantId $(tenantId) -PolicyId B2C_1A_TrustFrameworkBase -PathToFile $(System.DefaultWorkingDirectory)/policyRepo/B2CAssets/TrustFrameworkBase.xml
+ -ClientID $(clientId) -ClientSecret $(clientSecret) -TenantId $(tenantId) -Folder $(System.DefaultWorkingDirectory)/policyRepo/B2CAssets/ -Files "TrustFrameworkBase.xml,TrustFrameworkExtensions.xml,SignUpOrSignin.xml,ProfileEdit.xml,PasswordReset.xml"
```
+
+ The `-Files` parameter is a comma-delimited list of policy files to deploy. Update the list with your policy files.
+
+ > [!IMPORTANT]
+ > Ensure the policies are uploaded in the correct order: first the base policy, then the extensions policy, and then the relying party policies. For example, `TrustFrameworkBase.xml,TrustFrameworkExtensions.xml,SignUpOrSignin.xml`.
+
1. Select **Save** to save the Agent job.

## Test your pipeline
To test your release pipeline:
You should see a notification banner that says that a release has been queued. To view its status, select the link in the notification banner, or select it in the list on the **Releases** tab.
-## Add more pipeline tasks
-
-To deploy the rest of your policies, repeat the [preceding steps](#add-pipeline-tasks) for each of the custom policy files.
-
-When running the agents and uploading the policy files, ensure they're uploaded in the correct order:
-
-1. *TrustFrameworkBase.xml*
-1. *TrustFrameworkExtensions.xml*
-1. *SignUpOrSignin.xml*
-1. *ProfileEdit.xml*
-1. *PasswordReset.xml*
## Next steps
active-directory-b2c Identity Provider Generic Saml Options https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-generic-saml-options.md
Previously updated : 03/22/2021 Last updated : 08/25/2021
active-directory-b2c Phone Authentication User Flows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/phone-authentication-user-flows.md
To enable consent information
13. Select **Phone signUp page**, and then repeat steps 10 through 12.
+## Get a user's phone number in your directory
+
+1. Run the following request in Graph Explorer:
+
+ `GET https://graph.microsoft.com/v1.0/users/{object_id}?$select=identities`
+
+1. Find the `issuerAssignedId` property in the response returned:
+
+ ```json
+ "identities": [
+ {
+ "signInType": "phoneNumber",
+ "issuer": "contoso.onmicrosoft.com",
+ "issuerAssignedId": "+11231231234"
+ }
+ ]
+ ```
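If you prefer to run the lookup programmatically instead of in Graph Explorer, the following PowerShell sketch makes the same call. It assumes you already have an access token with permission to read users and the user's object ID; `$token` and `$objectId` are placeholders.

```PowerShell
# Sketch only: $token and $objectId are assumed to exist already.
$headers = @{ Authorization = "Bearer $token" }
$uri = "https://graph.microsoft.com/v1.0/users/$($objectId)?`$select=identities"
$response = Invoke-RestMethod -Uri $uri -Method Get -Headers $headers

# Pick out the phone number sign-in identity from the identities collection.
$response.identities |
    Where-Object { $_.signInType -eq "phoneNumber" } |
    Select-Object -ExpandProperty issuerAssignedId
```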
+
## Next steps
- [Add external identity providers](add-identity-provider.md)
-- [Create a user flow](tutorial-create-user-flows.md)
+- [Create a user flow](tutorial-create-user-flows.md)
active-directory-b2c Saml Identity Provider Technical Profile https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/saml-identity-provider-technical-profile.md
Previously updated : 12/01/2020 Last updated : 08/25/2021
active-directory-b2c Troubleshoot With Application Insights https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/troubleshoot-with-application-insights.md
Previously updated : 04/05/2021 Last updated : 08/26/2021
After you save the settings, the Application Insights logs appear on the **Azure
## Configure Application Insights in Production
-To improve your production environment performance and better user experience, it's important to configure your policy to ignore messages that are unimportant. Use the following configuration to send only critical error messages to your Application Insights.
+To improve the performance of your production environment and provide a better user experience, configure your policy to ignore unimportant messages. Use the following configuration in production environments.
1. Set the `DeploymentMode` attribute of the [TrustFrameworkPolicy](trustframeworkpolicy.md) to `Production`.
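For reference, the attribute goes on the policy's root element. The following fragment is a sketch with placeholder tenant and policy names; only the `DeploymentMode` attribute is the point here.

```xml
<!-- Sketch: placeholder tenant and policy names. -->
<TrustFrameworkPolicy
  xmlns="http://schemas.microsoft.com/online/cpim/schemas/2013/06"
  PolicySchemaVersion="0.3.0.0"
  TenantId="your-tenant.onmicrosoft.com"
  PolicyId="B2C_1A_TrustFrameworkExtensions"
  PublicPolicyUri="http://your-tenant.onmicrosoft.com/B2C_1A_TrustFrameworkExtensions"
  DeploymentMode="Production">
```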
active-directory Active Directory Enterprise App Role Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/active-directory-enterprise-app-role-management.md
Last updated 02/15/2021
-# How to: Configure the role claim issued in the SAML token for enterprise applications
+# Configure the role claim issued in the SAML token for enterprise applications
By using Azure Active Directory (Azure AD), you can customize the claim type for the role claim in the response token that you receive after you authorize an app.
active-directory Active Directory Optional Claims https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/active-directory-optional-claims.md
-# How to: Provide optional claims to your app
+# Provide optional claims to your app
Application developers can use optional claims in their Azure AD applications to specify which claims they want in tokens sent to their application.
active-directory Active Directory Saml Claims Customization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/active-directory-saml-claims-customization.md
-# How to: customize claims issued in the SAML token for enterprise applications
+# Customize claims issued in the SAML token for enterprise applications
Today, the Microsoft identity platform supports single sign-on (SSO) with most enterprise applications, including both applications pre-integrated in the Azure AD app gallery and custom applications. When a user authenticates to an application through the Microsoft identity platform using the SAML 2.0 protocol, the Microsoft identity platform sends a token to the application (via an HTTP POST). The application then validates and uses the token to sign the user in instead of prompting for a username and password. These SAML tokens contain pieces of information about the user known as *claims*.
active-directory Authorization Basics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/authorization-basics.md
One advantage of ABAC is that more granular and dynamic access control can be ac
One method for achieving ABAC with Azure Active Directory is using [dynamic groups](../enterprise-users/groups-create-rule.md). Dynamic groups allow administrators to dynamically assign users to groups based on specific user attributes with desired values. For example, an Authors group could be created where all users with the job title Author are dynamically assigned to the Authors group. Dynamic groups can be used in combination with RBAC for authorization where you map roles to groups and dynamically assign users to groups.
+[Azure ABAC](../../role-based-access-control/conditions-overview.md) is an example of an ABAC solution that is available today. Azure ABAC builds on Azure RBAC by adding role assignment conditions based on attributes in the context of specific actions.
+
## Implementing authorization

Authorization logic is often implemented within the applications or solutions where access control is required. In many cases, application development platforms offer middleware or other API solutions that simplify the implementation of authorization. Examples include use of the [AuthorizeAttribute](/aspnet/core/security/authorization/simple?view=aspnetcore-5.0&preserve-view=true) in ASP.NET or [Route Guards](./scenario-spa-sign-in.md?tabs=angular2#sign-in-with-a-pop-up-window) in Angular.
active-directory Config Authority https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/config-authority.md
-# How to: Configure MSAL for iOS and macOS to use different identity providers
+# Configure MSAL for iOS and macOS to use different identity providers
This article will show you how to configure your Microsoft authentication library app for iOS and macOS (MSAL) for different authorities such as Azure Active Directory (Azure AD), Business-to-Consumer (B2C), sovereign clouds, and guest users. Throughout this article, you can generally think of an authority as an identity provider.
active-directory Customize Webviews https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/customize-webviews.md
-# How to: Customize browsers and WebViews for iOS/macOS
+# Customize browsers and WebViews for iOS/macOS
A web browser is required for interactive authentication. On iOS and macOS 10.15+, the Microsoft Authentication Library (MSAL) uses the system web browser by default (which might appear on top of your app) to do interactive authentication to sign in users. Using the system browser has the advantage of sharing the Single Sign On (SSO) state with other applications and with web applications.
active-directory Howto Add App Roles In Azure Ad Apps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-add-app-roles-in-azure-ad-apps.md
-# How to: Add app roles to your application and receive them in the token
+# Add app roles to your application and receive them in the token
Role-based access control (RBAC) is a popular mechanism to enforce authorization in applications. When using RBAC, an administrator grants permissions to roles, and not to individual users or groups. The administrator can then assign roles to different users and groups to control who has access to what content and functionality.
active-directory Howto Add Terms Of Service Privacy Statement https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-add-terms-of-service-privacy-statement.md
-# How to: Configure terms of service and privacy statement for an app
+# Configure terms of service and privacy statement for an app
Developers who build and manage multi-tenant apps that integrate with Azure Active Directory (Azure AD) and Microsoft accounts should include links to the app's terms of service and privacy statement. The terms of service and privacy statement are surfaced to users through the user consent experience. They help your users know that they can trust your app. The terms of service and privacy statement are especially critical for user-facing multi-tenant apps--apps that are used by multiple directories or are available to any Microsoft account.
active-directory Howto Authenticate Service Principal Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-authenticate-service-principal-powershell.md
-# How to: Use Azure PowerShell to create a service principal with a certificate
+# Use Azure PowerShell to create a service principal with a certificate
When you have an app or script that needs to access resources, you can set up an identity for the app and authenticate the app with its own credentials. This identity is known as a service principal. This approach enables you to:
active-directory Howto Build Services Resilient To Metadata Refresh https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-build-services-resilient-to-metadata-refresh.md
# Customer intent: As a web app or web API developer, I want to learn how to ensure that my app is resilient to outages due to Azure AD OpenID Connect metadata refresh.
-# How to: Build services that are resilient to Azure AD's OpenID Connect metadata refresh
+# Build services that are resilient to Azure AD's OpenID Connect metadata refresh
Protected web APIs need to validate access tokens. Web apps also validate ID tokens. Token validation has multiple parts: checking that the token belongs to the application, was issued by a trusted identity provider (IdP), has a lifetime that's still in range, and hasn't been tampered with. There can also be special validations. For instance, the app needs to validate the signature and that signing keys (when embedded in a token) are trusted and that the token isn't being replayed. When the signing keys aren't embedded in the token, they need to be fetched from the identity provider (Discovery or Metadata). Sometimes it's also necessary to obtain keys dynamically at runtime.
active-directory Howto Configure Publisher Domain https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-configure-publisher-domain.md
-# How to: Configure an application's publisher domain
+# Configure an application's publisher domain
An application's publisher domain is displayed to users on the [application's consent prompt](application-consent-experience.md) to let users know where their information is being sent. Multi-tenant applications that are registered after May 21, 2019 that don't have a publisher domain show up as **unverified**. Multi-tenant applications are applications that support accounts outside of a single organizational directory; for example, support all Azure AD accounts, or support all Azure AD accounts and personal Microsoft accounts.
active-directory Howto Convert App To Be Multi Tenant https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-convert-app-to-be-multi-tenant.md
-# How to: Sign in any Azure Active Directory user using the multi-tenant application pattern
+# Sign in any Azure Active Directory user using the multi-tenant application pattern
If you offer a Software as a Service (SaaS) application to many organizations, you can configure your application to accept sign-ins from any Azure Active Directory (Azure AD) tenant. This configuration is called *making your application multi-tenant*. Users in any Azure AD tenant will be able to sign in to your application after consenting to use their account with your application.
active-directory Howto Create Self Signed Certificate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-create-self-signed-certificate.md
#Customer intent: As an application developer, I want to understand the basic concepts of authentication and authorization in the Microsoft identity platform.
-# How to: Create a self-signed public certificate to authenticate your application
+# Create a self-signed public certificate to authenticate your application
Azure Active Directory (Azure AD) supports two types of authentication for service principals: **password-based authentication** (app secret) and **certificate-based authentication**. While app secrets can easily be created in the Azure portal, it's recommended that your application uses a certificate.
active-directory Howto Create Service Principal Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-create-service-principal-portal.md
-# How to: Use the portal to create an Azure AD application and service principal that can access resources
+# Use the portal to create an Azure AD application and service principal that can access resources
This article shows you how to create a new Azure Active Directory (Azure AD) application and service principal that can be used with role-based access control. When you have applications, hosted services, or automated tools that need to access or modify resources, you can create an identity for the app. This identity is known as a service principal. Access to resources is restricted by the roles assigned to the service principal, giving you control over which resources can be accessed and at which level. For security reasons, it's always recommended to use service principals with automated tools rather than allowing them to log in with a user identity.
active-directory Howto Get List Of All Active Directory Auth Library Apps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-get-list-of-all-active-directory-auth-library-apps.md
# Customer intent: As an application developer / IT admin, I need to know / identify which of my apps are using ADAL.
-# How to: Get a complete list of apps using ADAL in your tenant
+# Get a complete list of apps using ADAL in your tenant
Support for Active Directory Authentication Library (ADAL) will end on June 30, 2022. Apps using ADAL on existing OS versions will continue to work, but technical support and security updates will end. Without continued security updates, apps using ADAL will become increasingly vulnerable to the latest security attack patterns. This article provides guidance on how to use Azure Monitor workbooks to obtain a list of all apps that use ADAL in your tenant.
active-directory Howto Modify Supported Accounts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-modify-supported-accounts.md
# Customer intent: As an application developer, I need to know how to modify which account types can sign in to or access my application or API.
-# How to modify the accounts supported by an application
+# Modify the accounts supported by an application
When you registered your application with the Microsoft identity platform, you specified who--which account types--can access it. For example, you might've specified accounts only in your organization, which is a *single-tenant* app. Or, you might've specified accounts in any organization (including yours), which is a *multi-tenant* app.
active-directory Howto Remove App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-remove-app.md
#Customer intent: As an application developer, I want to know how to remove my application from the Microsoft identity registered.
-# How to remove an application registered with the Microsoft identity platform
+# Remove an application registered with the Microsoft identity platform
Enterprise developers and software-as-a-service (SaaS) providers who have registered applications with the Microsoft identity platform may need to remove an application's registration.
active-directory Howto Restrict Your App To A Set Of Users https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-restrict-your-app-to-a-set-of-users.md
#Customer intent: As a tenant administrator, I want to restrict an application that I have registered in Azure AD to a select set of users available in my Azure AD tenant
-# How to: Restrict your Azure AD app to a set of users in an Azure AD tenant
+# Restrict your Azure AD app to a set of users in an Azure AD tenant
Applications registered in an Azure Active Directory (Azure AD) tenant are, by default, available to all users of the tenant who authenticate successfully.
active-directory Msal Android Single Sign On https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-android-single-sign-on.md
-# How to: Enable cross-app SSO on Android using MSAL
+# Enable cross-app SSO on Android using MSAL
Single sign-on (SSO) allows users to only enter their credentials once and have those credentials automatically work across applications.
active-directory Request Custom Claims https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/request-custom-claims.md
-# How to: Request custom claims using MSAL for iOS and macOS
+# Request custom claims using MSAL for iOS and macOS
OpenID Connect allows you to optionally request the return of individual claims from the UserInfo Endpoint and/or in the ID Token. A claims request is represented as a JSON object that contains a list of requested claims. See [OpenID Connect Core 1.0](https://openid.net/specs/openid-connect-core-1_0-final.html#ClaimsParameter) for more details.
active-directory Single Sign On Macos Ios https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/single-sign-on-macos-ios.md
-# How to: Configure SSO on macOS and iOS
+# Configure SSO on macOS and iOS
The Microsoft Authentication Library (MSAL) for macOS and iOS supports Single Sign-on (SSO) between macOS/iOS apps and browsers. This article covers the following SSO scenarios:
active-directory Ssl Issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/ssl-issues.md
-# How to: Troubleshoot MSAL for iOS and macOS TLS/SSL issues
+# Troubleshoot MSAL for iOS and macOS TLS/SSL issues
This article provides information to help you troubleshoot issues that you may come across while using the [Microsoft Authentication Library (MSAL) for iOS and macOS](reference-v2-libraries.md)
active-directory Licensing Service Plan Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/licensing-service-plan-reference.md
Previously updated : 8/24/2021 Last updated : 8/26/2021
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
- **Service plans included (friendly names)**: A list of service plans (friendly names) in the product that correspond to the string ID and GUID

>[!NOTE]
->This information last updated on August 24th, 2021.
+>This information last updated on August 26th, 2021.
| Product name | String ID | GUID | Service plans included | Service plans included (friendly names) |
| --- | --- | --- | --- | --- |
When managing licenses in [the Azure portal](https://portal.azure.com/#blade/Mic
| INTUNE | INTUNE_A | 061f9ace-7d42-4136-88ac-31dc755f143f | INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5) | MICROSOFT INTUNE (c1ec4a95-1f05-45b3-a911-aa3fa01094f5) | | Microsoft Dynamics AX7 User Trial | AX7_USER_TRIAL | fcecd1f9-a91e-488d-a918-a96cdb6ce2b0 | ERP_TRIAL_INSTANCE (e2f705fd-2468-4090-8c58-fad6e6b1e724)<br/>EXCHANGE_S_FOUNDATION (113feb6c-3fe4-4440-bddc-54d774bf0318) | Dynamics 365 Operations Trial Environment (e2f705fd-2468-4090-8c58-fad6e6b1e724)<br/>Exchange Foundation (113feb6c-3fe4-4440-bddc-54d774bf0318) | | Microsoft 365 A1 | M365EDU_A1 | b17653a4-2443-4e8c-a550-18249dda78bb | AAD_EDU (3a3976ce-de18-4a87-a78e-5e9245e252df)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>WINDOWS_STORE (a420f25f-a7b3-4ff5-a9d0-5d58f73b537d) | Azure Active Directory for Education (3a3976ce-de18-4a87-a78e-5e9245e252df)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Windows Store Service (a420f25f-a7b3-4ff5-a9d0-5d58f73b537d) |
-| MICROSOFT 365 A3 FOR FACULTY | M365EDU_A3_FACULTY | 4b590615-0888-425a-a965-b3bf7789848d | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU 
(9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>PowerApps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
+| Microsoft 365 A3 for Faculty | M365EDU_A3_FACULTY | 4b590615-0888-425a-a965-b3bf7789848d | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>CDS_O365_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>POWER_VIRTUAL_AGENTS_O365_P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/> Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Common Data Service - O365 P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Common Data Service for Teams_P2 (95b76021-6a53-4741-ab8b-1d1f3d66a95a)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection and Governance Analytics ΓÇô Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 ΓÇô Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Microsoft 365 Apps for Enterprise 
(43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Power Virtual Agents for Office 365 P2 (041fe683-03e4-45b6-b1af-c0cdc516daee)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
| MICROSOFT 365 A3 FOR STUDENTS | M365EDU_A3_STUDENT | 7cfd9a2b-e110-4c39-bf20-c6a3f36a3121 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>MYANALYTICS_P2 (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Insights by MyAnalytics (33c4f319-9bdd-48d6-9c4d-410b750a4a5a)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU 
(9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>PowerApps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) | | Microsoft 365 A3 for students use benefit | M365EDU_A3_STUUSEBNFT | 18250162-5d87-4436-a834-d795c15c80f3 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Common Data Service - O365 P2 
(4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection for Office 365 ΓÇô Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Apps for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) | | Microsoft 365 A3 - Unattended License for students use benefit | M365EDU_A3_STUUSEBNFT_RPA1 | 1aa94593-ca12-4254-a738-81a5972958e8 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>OFFICESUBSCRIPTION_unattended (8d77e2d9-9e28-4450-8431-0def64078fc5)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_O365_P3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 
(57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>PROJECT_O365_P2 (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU(63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>WHITEBOARD_PLAN2 (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Common Data Service - O365 P2 (4ff01e01-1ba7-4d71-8cf8-ce96c3bbcf14)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection and Governance Analytics ΓÇô Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 ΓÇô Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Apps for enterprise (unattended) (8d77e2d9-9e28-4450-8431-0def64078fc5)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala Pro Plan 3 (aebd3021-9f8f-4bf8-bbe3-0ed2f4f047a1)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Project for Office (Plan E3) (31b4e2fc-4cd6-4e7d-9c1b-41407303bd66)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Whiteboard (Plan 2) (94a54592-cd8b-425e-87c6-97868b000b91)<br/>Windows 10 Enterprise (New) 
(e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
-| MICROSOFT 365 A5 FOR FACULTY | M365EDU_A5_FACULTY | e97c048c-37a4-45fb-ab50-922fbf07a370 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Advanced Threat Protection (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Azure Information Protection Premium P1 
(6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Cloud App Security (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>MICROSOFT DEFENDER FOR ENDPOINT (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PowerApps for Office 365 Plan 3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
+| Microsoft 365 A5 for Faculty | M365EDU_A5_FACULTY | e97c048c-37a4-45fb-ab50-922fbf07a370 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>CDS_O365_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>MIP_S_Exchange (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>ContentExplorer_Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MICROSOFT_COMMUNICATION_COMPLIANCE (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>COMMUNICATIONS_DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>CUSTOMER_KEY (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>DATA_INVESTIGATIONS (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>MICROSOFTENDPOINTDLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>EXCEL_PREMIUM (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INFO_GOVERNANCE (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>INSIDER_RISK (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>ML_CLASSIFICATION (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>RECORDS_MANAGEMENT (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>BI_AZURE_P2 
(70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>POWER_VIRTUAL_AGENTS_O365_P3 (ded3d325-1bdc-453e-8432-5bac26d7a014)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>COMMUNICATIONS_COMPLIANCE (41fcdd7d-4733-4863-9cf4-c65b83ce2df4)<br/>INSIDER_RISK_MANAGEMENT (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>UNIVERSAL_PRINT_01 (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>WINDOWSUPDATEFORBUSINESS_DEPLOYMENTSERVICE (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for Education (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1(41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Information Protection Premium P1 (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Common Data Service - O365 P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Common Data Service for Teams_P3 (afa73018-811e-46e9-988f-f75d2b1b8430)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Data Classification in Microsoft 365 (cd31b152-6326-4d1b-ae1b-997b625182e6)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection and Governance Analytics -(Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection and Governance Analytics ΓÇô Standard (2b815d45-56e4-4e3a-b65c-66cb9175b560)<br/>Information Protection for Office 365 ΓÇô Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 ΓÇô Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Apps for Enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Communication Compliance (a413a9ff-720c-4822-98ef-2f37c2a21f4c)<br/> Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Cloud App Security (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Communications DLP (6dc145d6-95dd-4191-b9c3-185575ee6f6b)<br/>Microsoft Customer Key (6db1f1db-2b46-403f-be40-e39395f08dbb)<br/>Microsoft Data Investigations (46129a58-a698-46f0-aa5b-17f6586297d9)<br/>Microsoft Defender for Endpoint (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) 
(8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Endpoint DLP (64bfac92-2b17-4482-b5e5-a0304429de3e)<br/>Microsoft Excel Advanced Analytics (531ee2f8-b1cb-453b-9c21-d2180d014ca5)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Information Governance (e26c2fcc-ab91-4a61-b35c-03cdc8dddf66)<br/>Microsoft Insider Risk Management (d587c7a3-bda9-4f99-8776-9bcf59c84f75)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft ML-Based Classification (d2d51368-76c9-4317-ada2-a12c004c432f)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Records Management (65cc641f-cccd-4643-97e0-a17e3045e541)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the Web for Education (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (Plan 3) (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Power Automate for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>Power Virtual Agents for Office 365 P3 (ded3d325-1bdc-453e-8432-5bac26d7a014)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>Microsoft Communications Compliance(41fcdd7d-4733-4863-9cf4-c65b83ce2d f4)<br/>Microsoft Insider Risk Management (9d0c4ee5-e4a1-4625-ab39-d82b619b1a34)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint (Plan 2) for Education (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Universal Print (795f6fe0-cc4d-4773-b050-5dde4dc704c9)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Windows Update for Business Deployment Service (7bf960f6-2cd9-443a-8046-5dbff9558365)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
| MICROSOFT 365 A5 FOR STUDENTS | M365EDU_A5_STUDENT | 46c119d4-0379-4a9d-85e4-97c66d3f909e | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>RMS_S_PREMIUM (6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>RMS_S_PREMIUM2 (5689bec4-755d-4753-8b61-40975025187c)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>LOCKBOX_ENTERPRISE (9f431833-0334-42de-a7dc-70aa40db46db)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>FLOW_O365_P3 (07699545-9485-468e-95b6-2fca3738be01)<br/>INFORMATION_BARRIERS (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>MIP_S_CLP2 (efb0351d-3b08-4503-993d-383af8de41e3)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>M365_ADVANCED_AUDITING (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>MCOMEETADV (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>MCOEV (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>MICROSOFTBOOKINGS (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>WINDEFATP (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>OFFICE_FORMS_PLAN_3 (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>EXCHANGE_ANALYTICS (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E5 (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>EQUIVIO_ANALYTICS (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>PAM_ENTERPRISE (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>BI_AZURE_P2 (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>POWERAPPS_O365_P3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>PREMIUM_ENCRYPTION (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_3 (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Azure Advanced Threat Protection (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Azure Information Protection Premium P1 
(6c57d4b6-3b23-47a5-9bc9-69f17b4947b3)<br/>Azure Information Protection Premium P2 (5689bec4-755d-4753-8b61-40975025187c)<br/>Azure Rights Management (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Customer Lockbox (9f431833-0334-42de-a7dc-70aa40db46db)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Flow for Office 365 (07699545-9485-468e-95b6-2fca3738be01)<br/>Information Barriers (c4801e8a-cb58-4c35-aca6-f2dcc106f287)<br/>Information Protection for Office 365 - Premium (efb0351d-3b08-4503-993d-383af8de41e3)<br/>Information Protection for Office 365 - Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Advanced Auditing (2f442157-a11c-46b9-ae5b-6e39ff4e5849)<br/>Microsoft 365 Audio Conferencing (3e26ee1f-8a5f-4d52-aee2-b81ce45c8f40)<br/>Microsoft 365 Phone System (4828c8ec-dc2e-4779-b502-87ac9ce28ab7)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Bookings (199a5c09-e0ca-4e37-8f7c-b05d533e1ea2)<br/>Microsoft Cloud App Security (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>MICROSOFT DEFENDER FOR ENDPOINT (871d91ec-ec1a-452b-a83f-bd76c7d770ef)<br/>Microsoft Forms (Plan 3) (96c1e14a-ef43-418d-b115-9636cdaa8eed)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft MyAnalytics (Full) (34c0d7a0-a70f-4668-9238-47f9fc208882)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E5 SKU (6c6042f5-6f01-4d67-b8c1-eb99d36eed3e)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced eDiscovery (4de31727-a228-4ec3-a5bf-8e45b5ca48cc)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Office 365 Privileged Access Management (b1188c4c-1b36-4018-b48b-ee07604f6feb)<br/>Office 365 ProPlus (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power BI Pro (70d33638-9c74-4d01-bfd3-562de28bd4ba)<br/>PowerApps for Office 365 Plan 3 (9c0dab89-a30c-4117-86e7-97bda240acd2)<br/>Premium Encryption in Office 365 (617b097b-4b93-4ede-83de-5f075bb5fb2f)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 3) (3fb82609-8c27-4f7b-bd51-30634711ee67)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) | | Microsoft 365 A5 for students use benefit | M365EDU_A5_STUUSEBNFT | 31d57bc7-3a05-4867-ab53-97a17835a411 | AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM 
(41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>ADALLOM_S_DISCOVERY (932ad362-64a8-4783-9106-97849a1a30b9)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Cloud App Security Discovery (932ad362-64a8-4783-9106-97849a1a30b9)<br/>Common Data Service - O365 P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection and Governance Analytics ΓÇô Premium (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 ΓÇô Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Apps for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication 
(8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Cloud App Security (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Project for Office (Plan E5) (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) | | Microsoft 365 A5 without Audio Conferencing for students use benefit | M365EDU_A5_NOPSTNCONF_STUUSEBNFT | 81441ae1-0b31-4185-a6c0-32b6b84d419f| AAD_BASIC_EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>AAD_PREMIUM (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>AAD_PREMIUM_P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>DYN365_CDS_O365_P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>EducationAnalyticsP1 (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>EXCHANGE_S_ENTERPRISE (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Content_Explorer (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>MIP_S_CLP1 (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>INTUNE_EDU (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>OFFICESUBSCRIPTION (43de0ff5-c92c-492b-9116-175376d08c38)<br/>MTP (bf28f719-7844-4079-9c78-c1307898e192)<br/>RMS_S_ENTERPRISE (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>MFA_PREMIUM (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>ADALLOM_S_STANDALONE (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>ATA (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>ATP_ENTERPRISE (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>THREAT_INTELLIGENCE (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>OFFICE_FORMS_PLAN_2 (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>INTUNE_A (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>KAIZALA_STANDALONE (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>PROJECTWORKMANAGEMENT (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>MICROSOFT_SEARCH (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Deskless (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>STREAM_O365_E3 
(9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>TEAMS1 (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>MINECRAFT_EDUCATION_EDITION (4c246bbc-f513-4311-beff-eba54c353256)<br/>INTUNE_O365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>ADALLOM_S_O365 (8c098270-9dd4-4350-9b30-ba4703f3b36b <br/>SAFEDOCS (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7 <br/>SHAREPOINTWAC_EDU (e03c7e47-402c-463c-ab25-949079bedb21)<br/>POWERAPPS_O365_P2 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>FLOW_O365_P2 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>PROJECT_O365_P3 (b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>SCHOOL_DATA_SYNC_P2 (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SHAREPOINTENTERPRISE_EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>MCOSTANDARD (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>SWAY (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>BPOS_S_TODO_2 (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>UNIVERSAL_PRINT_NO_SEEDING (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>WHITEBOARD_PLAN3 (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Virtualization Rights for Windows 10 (E3/E5+VDA) (e7c91390-7625-45be-94e0-e16907e03118)<br/>YAMMER_EDU (2078e8df-cff6-4290-98cb-5408261a760a) | Azure Active Directory Basic for EDU (1d0f309f-fdf9-4b2a-9ae7-9c48b91f1426)<br/>Azure Active Directory Premium P1 (41781fb2-bc02-4b7c-bd55-b576c07bb09d)<br/>Azure Active Directory Premium P2 (eec0eb4f-6444-4f95-aba0-50c24d67f998)<br/>Common Data Service - O365 P3 (28b0fa46-c39a-4188-89e2-58e979a6b014)<br/>Education Analytics (a9b86446-fa4e-498f-a92a-41b447e03337)<br/>Exchange Online (Plan 2) (efb87545-963c-4e0d-99df-69c6916d9eb0)<br/>Information Protection and Governance Analytics - Premium) (d9fa6af4-e046-4c89-9226-729a0786685d)<br/>Information Protection for Office 365 ΓÇô Standard (5136a095-5cf0-4aff-bec3-e84448b38ea5)<br/>Intune for Education (da24caf9-af8e-485c-b7c8-e73336da2693)<br/>Microsoft 365 Apps for enterprise (43de0ff5-c92c-492b-9116-175376d08c38)<br/>Microsoft 365 Defender (bf28f719-7844-4079-9c78-c1307898e192)<br/>Microsoft Azure Active Directory Rights (bea4c11e-220a-4e6d-8eb8-8ea15d019f90)<br/>Microsoft Azure Multi-Factor Authentication (8a256a2b-b617-496d-b51b-e76466e88db0)<br/>Microsoft Cloud App Security (2e2ddb96-6af9-4b1d-a3f0-d6ecfd22edb2)<br/>Microsoft Defender for Identity (14ab5db5-e6c4-4b20-b4bc-13e36fd2227f)<br/>Microsoft Defender for Office 365 (Plan 1) (f20fedf3-f3c3-43c3-8267-2bfdd51c0939)<br/>Microsoft Defender for Office 365 (Plan 2) (8e0c0a52-6a6c-4d40-8370-dd62790dcd70)<br/>Microsoft Forms (Plan 2) (9b5de886-f035-4ff2-b3d8-c9127bea3620)<br/>Microsoft Intune (c1ec4a95-1f05-45b3-a911-aa3fa01094f5)<br/>Microsoft Kaizala (0898bdbb-73b0-471a-81e5-20f1fe4dd66e)<br/>Microsoft Planner (b737dad2-2f6c-4c65-90e3-ca563267e8b9)<br/>Microsoft Search (94065c59-bc8e-4e8b-89e5-5138d471eaff)<br/>Microsoft StaffHub (8c7d2df8-86f0-4902-b2ed-a0458298f3b3)<br/>Microsoft Stream for O365 E3 SKU (9e700747-8b1d-45e5-ab8d-ef187ceec156)<br/>Microsoft Teams (57ff2da0-773e-42df-b2af-ffb7a2317929)<br/>Minecraft Education Edition (4c246bbc-f513-4311-beff-eba54c353256)<br/>Mobile Device Management for Office 365 (882e1d05-acd1-4ccb-8708-6ee03664b117)<br/>Office 365 Advanced Security Management (8c098270-9dd4-4350-9b30-ba4703f3b36b)<br/>Office 365 SafeDocs (bf6f5520-59e3-4f82-974b-7dbbc4fd27c7)<br/>Office for the web (Education) (e03c7e47-402c-463c-ab25-949079bedb21)<br/>Power Apps for Office 365 (c68f8d98-5534-41c8-bf36-22fa496fa792)<br/>Power Automate for Office 365 (76846ad7-7776-4c40-a281-a386362dd1b9)<br/>Project for Office (Plan E5) 
(b21a6b06-1988-436e-a07b-51ec6d9f52ad)<br/>School Data Sync (Plan 2) (500b6a2a-7a50-4f40-b5f9-160e5b8c2f48)<br/>SharePoint Plan 2 for EDU (63038b2c-28d0-45f6-bc36-33062963b498)<br/>Skype for Business Online (Plan 2) (0feaeb32-d00e-4d66-bd5a-43b5b83db82c)<br/>Sway (a23b959c-7ce8-4e57-9140-b90eb88a9e97)<br/>To-Do (Plan 2) (c87f142c-d1e9-4363-8630-aaea9c4d9ae5)<br/>Universal Print Without Seeding (b67adbaf-a096-42c9-967e-5a84edbe0086)<br/>Whiteboard (Plan 3) (4a51bca5-1eff-43f5-878c-177680f191af)<br/>Windows 10 Enterprise (New) (e7c91390-7625-45be-94e0-e16907e03118)<br/>Yammer for Academic (2078e8df-cff6-4290-98cb-5408261a760a) |
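The service plan tables above pair each product name with the string ID (for example, `M365EDU_A5_FACULTY`) and GUID that you also see when querying license assignments programmatically. As an illustrative, hedged sketch (not part of the original tables), the following Python snippet calls the Microsoft Graph `licenseDetails` endpoint for a user and prints each SKU with its service plans. The user ID is a placeholder, and it assumes the `azure-identity` and `requests` packages are installed and the signed-in identity has a Graph permission such as `User.Read.All`.

```python
# Minimal sketch: list a user's license SKUs and service plans via
# Microsoft Graph (GET /users/{id}/licenseDetails).
# Assumptions: `pip install azure-identity requests`, and a credential
# with a suitable Graph permission (for example, User.Read.All).
import requests
from azure.identity import DefaultAzureCredential

USER_ID = "<user-object-id-or-upn>"  # placeholder

token = DefaultAzureCredential().get_token("https://graph.microsoft.com/.default").token
resp = requests.get(
    f"https://graph.microsoft.com/v1.0/users/{USER_ID}/licenseDetails",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

for sku in resp.json().get("value", []):
    # skuPartNumber corresponds to string IDs such as M365EDU_A5_FACULTY;
    # skuId is the SKU GUID shown in the tables above.
    print(sku["skuPartNumber"], sku["skuId"])
    for plan in sku.get("servicePlans", []):
        print("  ", plan["servicePlanName"], plan["servicePlanId"], plan["provisioningStatus"])
```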
active-directory Entitlement Management Access Package Request Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/governance/entitlement-management-access-package-request-policy.md
Follow these steps if you want to bypass access requests and allow administrator
1. Skip to the [Enable requests](#enable-requests) section.
+> [!NOTE]
+> When assigning users to an access package, administrators need to verify that the users are eligible for that access package based on the existing policy requirements. Otherwise, the users won't be successfully assigned to the access package. If the access package contains a policy that requires user requests to be approved, users can't be assigned to the package directly without the necessary approval(s) from the designated approver(s).
+ ## Open and edit an existing policy of request settings
active-directory Reference Connect Version History https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/reference-connect-version-history.md
ms.assetid: ef2797d7-d440-4a9a-a648-db32ad137494
Previously updated : 03/16/2021 Last updated : 08/26/2021
The Azure Active Directory (Azure AD) team regularly updates Azure AD Connect wi
This article is designed to help you keep track of the versions that have been released, and to understand what the changes are in the latest version.
-- This table is a list of related topics:
Topic | Details
Steps to upgrade from Azure AD Connect | Different methods to [upgrade from a previous version to the latest](how-to-upgrade-previous-version.md) Azure AD Connect release.
Required permissions | For permissions required to apply an update, see [accounts and permissions](reference-connect-accounts-permissions.md#upgrade).
+
+>[!IMPORTANT]
+> **On 31 August 2022, all 1.x versions of Azure Active Directory (Azure AD) Connect will be retired because they include SQL Server 2012 components that will no longer be supported.** Either upgrade to the most recent version of Azure AD Connect (2.x version) by that date, or [evaluate and switch to Azure AD cloud sync](https://docs.microsoft.com/azure/active-directory/cloud-sync/what-is-cloud-sync).
+>
+> You need to make sure you are running a recent version of Azure AD Connect to receive an optimal support experience.
+>
+> If you run a retired version of Azure AD Connect, it may unexpectedly stop working and you may not have the latest security fixes, performance improvements, troubleshooting and diagnostic tools, and service enhancements. Moreover, if you require support, we may not be able to provide you with the level of service your organization needs.
+>
+> Go to this article to learn more about [Azure Active Directory Connect V2.0](whatis-azure-ad-connect-v2.md), what has changed in V2.0 and how this change impacts you.
+>
+> Please refer to [this article](./how-to-upgrade-previous-version.md) to learn more about how to upgrade Azure AD Connect to the latest version.
+>
+> For version history information on retired versions, see [Azure AD Connect version release history archive](reference-connect-version-history-archive.md)
+ >[!NOTE] >Releasing a new version of Azure AD Connect is a process that requires several quality control steps to ensure the operational functionality of the service. While we go through this process, the version number of a new release, as well as the release status, will be updated to reflect the most recent state.
-While we go through this process, the version number of the release will be shown with an "X" in the minor release number position, as in "1.3.X.0" - this indicates that the release notes in this document are valid for all versions beginning with "1.3.". As soon as we have finalized the release process the release version number will be updated to the most recently released version and the release status will be updated to "Released for download and auto upgrade".
Not all releases of Azure AD Connect will be made available for auto upgrade. The release status will indicate whether a release is made available for auto upgrade or for download only. If auto upgrade was enabled on your Azure AD Connect server then that server will automatically upgrade to the latest version of Azure AD Connect that is released for auto upgrade. Note that not all Azure AD Connect configurations are eligible for auto upgrade.
-To clarify the use of Auto Upgrade, it is meant to push all important updates and critical fixes to you. This is not necessarily the latest version because not all versions will require/include a fix to a critical security issue (just one example of many). An issue like that would be addressed with a new version provided via Auto Upgrade. If there are no such issues, there are no updates pushed out using Auto Upgrade, and in general if you are using the latest auto upgrade version you should be good.
+>To clarify the use of Auto Upgrade, it is meant to push all important updates and critical fixes to you. This is not necessarily the latest version, because not all versions will require or include a fix to a critical security issue (just one example of many). Critical issues would usually be addressed with a new version provided via Auto Upgrade. If there are no such issues, no updates are pushed out using Auto Upgrade, and in general, if you are using the latest Auto Upgrade version, you should be fine.
However, if you'd like all the latest features and updates, the best way to see if there are any is to check this page and install them as you see fit.
-Please follow this link to read more about [auto upgrade](how-to-connect-install-automatic-upgrade.md)
+>Please follow this link to read more about [auto upgrade](how-to-connect-install-automatic-upgrade.md)
->[!IMPORTANT]
-> Starting on April 1st, 2024, we will retire versions of Azure AD Connect that were released before May 1st, 2018 - version 1.1.751.0 and older.
->
-> You need to make sure you are running a recent version of Azure AD Connect to receive an optimal support experience.
->
->If you run a retired version of Azure AD Connect you may not have the latest security fixes, performance improvements, troubleshooting and diagnostic tools and service enhancements, and if you require support we may not be able to provide you with the level of service your organization needs.
->
-
->
->Please refer to [this article](./how-to-upgrade-previous-version.md) to learn more about how to upgrade Azure AD Connect to the latest version.
->
->For version history information on retired versions, see [Azure AD Connect version release history archive](reference-connect-version-history-archive.md)
## Download links
If you are using Windows Server 2016 or newer, you should use Azure AD Connect V2.0. You can download the latest version of Azure AD Connect 2.0 using [this link](https://www.microsoft.com/en-us/download/details.aspx?id=47594).
active-directory Whatis Azure Ad Connect V2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/hybrid/whatis-azure-ad-connect-v2.md
Previously updated : 06/24/2021 Last updated : 08/26/2021
Yes, you can do that, and it is a great way to migrate to Azure AD Connect V2.0
No – Azure AD Connect V2.0 will not be made available for auto upgrade at this time. **I am not ready to upgrade yet – how much time do I have?** </br>
-You should upgrade to Azure AD Connect V2.0 as soon as you can. For the time being we will continue to support older versions of Azure AD Connect, but it may prove difficult to provide a good support experience if some of the components in Azure AD Connect have dropped out of support. This upgrade is particularly important for ADAL and TLS1.0/1.1 as these services might stop working unexpectedly after they are deprecated.
+You should upgrade to Azure AD Connect V2.0 as soon as you can. **__All Azure AD Connect V1 versions will be retired on 31 August 2022.__** For the time being, we will continue to support older versions of Azure AD Connect, but it may prove difficult to provide a good support experience if some of the components in Azure AD Connect have dropped out of support. This upgrade is particularly important for ADAL and TLS1.0/1.1 as these services might stop working unexpectedly after they are deprecated.
**I use an external SQL database and do not use SQL 2012 LocalDb – do I still have to upgrade?** </br> Yes, you still need to upgrade to remain in a supported state even if you do not use SQL Server 2012, due to the TLS1.0/1.1 and ADAL deprecation.
active-directory Datawiza With Azure Ad https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/datawiza-with-azure-ad.md
+
+ Title: Secure hybrid access with Azure AD and Datawiza
+description: In this tutorial, learn how to integrate Datawiza with Azure AD for secure hybrid access
+++++++ Last updated : 8/30/2021++++
+# Tutorial: Configure Datawiza with Azure Active Directory for secure hybrid access
+
+In this sample tutorial, learn how to integrate Azure Active Directory (Azure AD) with [Datawiza](https://www.datawiza.com/) for secure hybrid access.
+
+Datawiza's [Datawiza Access Broker (DAB)](https://www.datawiza.com/access-broker) extends Azure AD to enable Single Sign-on (SSO) and granular access controls to protect on-premises and cloud-hosted applications, such as Oracle E-Business Suite, Microsoft IIS, and SAP.
+
+Using this solution, enterprises can quickly transition from legacy Web Access Managers (WAMs), such as Symantec SiteMinder, NetIQ, Oracle, and IBM, to Azure AD without rewriting applications. Enterprises can also use Datawiza as a no-code or low-code solution to integrate new applications with Azure AD. This approach saves engineering time, significantly reduces cost, and delivers projects in a secure manner.
+
+## Prerequisites
+
+To get started, you'll need:
+
+- An Azure subscription. If you don't have a subscription, you can get a [trial account](https://azure.microsoft.com/free/).
+
+- An [Azure AD tenant](https://docs.microsoft.com/azure/active-directory/fundamentals/active-directory-access-create-new-tenant)
+that's linked to your Azure subscription.
+
+- [Docker](https://docs.docker.com/get-docker/) and [docker-compose](https://docs.docker.com/compose/install/) are required to run DAB. Your applications can run on any platform, such as virtual machines and bare metal.
+
+- An application that you'll transition from a legacy identity system to Azure AD. In this example, DAB is deployed on the same server as the application. The application runs on localhost:3001, and DAB proxies traffic to it via localhost:9772. Traffic to the application reaches DAB first and is then proxied to the application.
+
+## Scenario description
+
+Datawiza integration includes the following components:
+
+- [Azure AD](https://docs.microsoft.com/azure/active-directory/fundamentals/active-directory-whatis) - Microsoft's cloud-based identity and access management service, which helps users sign in and access external and internal resources.
+
+- Datawiza Access Broker (DAB) - The service that users sign on to, and that transparently passes identity to applications through HTTP headers.
+
+- Datawiza Cloud Management Console (DCMC) - A centralized management console that manages DAB. DCMC provides a UI and RESTful APIs for administrators to manage the configuration of DAB and its access control policies.
+
+The following architecture diagram shows the implementation.
+
+![image shows architecture diagram](./media/datawiza-with-azure-active-directory/datawiza-architecture-diagram.png)
+
+|Steps| Description|
+|:-|:--|
+| 1. | The user makes a request to access the on-premises or cloud-hosted application. DAB proxies the request made by the user to the application.|
+| 2. |The DAB checks the user's authentication state. If it doesn't receive a session token, or the supplied session token is invalid, then it sends the user to Azure AD for authentication.|
+| 3. | Azure AD sends the user request to the endpoint specified during the DAB application's registration in the Azure AD tenant.|
+| 4. | The DAB evaluates access policies and calculates attribute values to be included in HTTP headers forwarded to the application. During this step, the DAB may call out to the identity provider to retrieve the information needed to set the header values correctly. The DAB sets the header values and sends the request to the application. |
+| 5. | The user is now authenticated and has access to the application.|
+
+## Onboard with Datawiza
+
+To integrate your on-premises or cloud-hosted application with Azure AD, log in to the [Datawiza Cloud Management Console](https://console.datawiza.com/) (DCMC).
+
+## Create an application on DCMC
+
+[Create an application](https://docs.datawiza.com/step-by-step/step2.html) and generate a key pair of `PROVISIONING_KEY` and `PROVISIONING_SECRET` for the application on the DCMC.
+
+For Azure AD, Datawiza offers a convenient [One click integration](https://docs.datawiza.com/tutorial/web-app-azure-one-click.html). This method of integrating Azure AD with DCMC can create an application registration on your behalf in your Azure AD tenant.
+
+![image shows configure idp](./media/datawiza-with-azure-active-directory/configure-idp.png)
+
+Instead, if you want to use an existing web application in your Azure AD tenant, you can disable the option and populate the fields of the form. You'll need the tenant ID, client ID, and client secret. [Create a web application and get these values in your tenant](https://docs.datawiza.com/idp/azure.html).
+
+![image shows configure idp using form](./media/datawiza-with-azure-active-directory/use-form.png)
+
+## Run DAB with a header-based application
+
+1. You can use either Docker or Kubernetes to run DAB. The Docker image is needed to create a sample header-based application. See [Configure DAB and SSO integration](https://docs.datawiza.com/step-by-step/step3.html) and [Deploy DAB with Kubernetes](https://docs.datawiza.com/tutorial/web-app-AKS.html). A sample `docker-compose.yml` file is provided for you to download and use. [Log in to the container registry](https://docs.datawiza.com/step-by-step/step3.html#important-step) to download the images of DAB and the header-based application.
+
+    ```YML
+    datawiza-access-broker:
+      image: registry.gitlab.com/datawiza/access-broker
+      container_name: datawiza-access-broker
+      restart: always
+      ports:
+        - "9772:9772"
+      environment:
+        PROVISIONING_KEY: #############################
+        PROVISIONING_SECRET: #############################
+
+    header-based-app:
+      image: registry.gitlab.com/datawiza/header-based-app
+      restart: always
+      ports:
+        - "3001:3001"
+    ```
+
+2. After executing `docker-compose -f docker-compose.yml up`, the header-based application should have SSO enabled with Azure AD. Open a browser and type in `http://localhost:9772/`.
+
+3. An Azure AD login page will show up.
+
+## Pass user attributes to the header-based application
+
+1. DAB gets user attributes from the IdP and can pass them to the application via headers or cookies. See the instructions on how to [pass user attributes](https://docs.datawiza.com/step-by-step/step4.html) such as email address, first name, and last name to the header-based application. A minimal sketch of how an application might read such headers appears after these steps.
+
+2. After successfully configuring the user attributes, you should see a green check mark for each of the user attributes.
+
+ ![image shows datawiza application home page](./media/datawiza-with-azure-active-directory/datawiza-application-home-page.png)
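To make the header-based pattern concrete, here is a minimal sketch, assuming a Python (Flask) application sitting behind DAB. The header names used below (`X-Datawiza-Email`, `X-Datawiza-Firstname`, `X-Datawiza-Lastname`) are placeholders for whatever attribute-to-header mapping you configure in the DCMC, not documented Datawiza defaults.

```python
# Minimal sketch of a header-based app behind DAB (Flask).
# Header names below are illustrative placeholders; use the names
# you configured in the Datawiza Cloud Management Console.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def whoami():
    email = request.headers.get("X-Datawiza-Email", "unknown")
    first = request.headers.get("X-Datawiza-Firstname", "")
    last = request.headers.get("X-Datawiza-Lastname", "")
    # The app can trust these headers only if it is reachable
    # exclusively through the broker, as in this tutorial's setup.
    return f"Signed in as {first} {last} ({email})"

if __name__ == "__main__":
    # DAB proxies localhost:9772 -> localhost:3001 in this example.
    app.run(port=3001)
```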
+
+## Test the flow
+
+1. Navigate to the application URL.
+
+2. The DAB should redirect to the Azure AD login page.
+
+3. After successfully authenticating, you should be redirected to DAB.
+
+4. The DAB evaluates policies, calculates headers, and sends the user to the upstream application.
+
+5. Your requested application should show up.
+
+## Next steps
+
+- [Configure Datawiza with Azure AD B2C](https://docs.microsoft.com/azure/active-directory-b2c/partner-datawiza)
+
+- [Datawiza documentation](https://docs.datawiza.com)
active-directory Grant Admin Consent https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/grant-admin-consent.md
Previously updated : 11/04/2019 Last updated : 08/21/2021 +
+#customer intent: As an admin, I want to grant tenant-wide admin consent to an application in Azure AD.
# Grant tenant-wide admin consent to an application
To grant tenant-wide admin consent to an app listed in **Enterprise applications
1. Sign in to the [Azure portal](https://portal.azure.com) with a role that allows granting admin consent (see [Prerequisites](#prerequisites)). 2. Select **Azure Active Directory** then **Enterprise applications**. 3. Select the application to which you want to grant tenant-wide admin consent.
-4. Select **Permissions** and then click **Grant admin consent**.
+4. Select **Permissions** and then click **Grant admin consent**. In this example, we use the 10,000ft Plans application.
+
+ :::image type="content" source="media/grant-tenant-wide-admin-consent/grant-tenant-wide-admin-consent.png" alt-text="Screenshot shows how to grant tenant wide admin consent.":::
+ 5. Carefully review the permissions the application requires. 6. If you agree with the permissions the application requires, grant consent. If not, click **Cancel** or close the window.
active-directory Secure Hybrid Access https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/secure-hybrid-access.md
The following partners offer pre-built solutions to support conditional access p
- [Citrix Application Delivery Controller (ADC)](https://docs.microsoft.com/azure/active-directory/saas-apps/citrix-netscaler-tutorial)
-- [Datawiza Access Broker](https://docs.microsoft.com/azure/active-directory/manage-apps/add-application-portal-setup-oidc-sso)
+- [Datawiza Access Broker](datawiza-with-azure-ad.md)
- [F5 Big-IP APM ADC](https://docs.microsoft.com/azure/active-directory/manage-apps/f5-aad-integration)
active-directory How To View Managed Identity Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/how-to-view-managed-identity-activity.md
+
+ Title: View update and sign-in activities for Managed identities
+description: Step-by-step instructions for viewing the activities made to managed identities, and authentications carried out by managed identities
+
+documentationcenter: ''
++
+editor: ''
+++
+ms.devlang: na
+
+ na
+ Last updated : 08/26/2021++++
+# View update and sign-in activities for Managed identities
+
+This article will explain how to view updates carried out to managed identities, and sign-in attempts made by managed identities.
+
+## Prerequisites
+
+- If you're unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md).
+- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/).
+
+## View updates made to user-assigned managed identities
+
+This procedure demonstrates how to view updates carried out to user-assigned managed identities.
+
+1. In the Azure portal, browse to **Activity Log**.
+
+ ![Browse to the activity log in the Azure portal](./media/how-to-view-managed-identity-activity/browse-to-activity-log.png)
+
+2. Select the **Add Filter** search pill and select **Operation** from the list.
+
+![Start building the search filter](./media/how-to-view-managed-identity-activity/start-adding-search-filter.png)
+
+3. In the **Operation** dropdown list, enter these operation names: "Delete User Assigned Identity" and "Write UserAssignedIdentities".
+
+![Add operations to the search filter](./media/how-to-view-managed-identity-activity/add-operations-to-search-filter.png)
+
+4. When matching operations are displayed, select one to view the summary.
+
+![View summary of the operation](./media/how-to-view-managed-identity-activity/view-summary-of-operation.png)
+
+5. Select the **JSON** tab to view more detailed information about the operation, and scroll to the **properties** node to view information about the identity that was modified.
+
+![View detail of the operation](./media/how-to-view-managed-identity-activity/view-json-of-operation.png)
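If you prefer to script this check rather than use the portal, the following minimal Python sketch queries the same data through the Azure Monitor Activity Log REST API (`Microsoft.Insights/eventtypes/management/values`). The subscription ID and time window are placeholders, and it assumes the `azure-identity` and `requests` packages are installed and a credential (Azure CLI sign-in, environment variables, or a managed identity) is available.

```python
# Minimal sketch: list recent activity-log events for user-assigned
# managed identity operations via the Azure Monitor REST API.
import requests
from azure.identity import DefaultAzureCredential

SUBSCRIPTION_ID = "<your-subscription-id>"           # placeholder
FILTER = "eventTimestamp ge '2021-08-01T00:00:00Z'"  # adjust the window as needed

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
    "/providers/Microsoft.Insights/eventtypes/management/values"
)
resp = requests.get(
    url,
    headers={"Authorization": f"Bearer {token}"},
    params={"api-version": "2015-04-01", "$filter": FILTER},
)
resp.raise_for_status()

for event in resp.json().get("value", []):
    op = event.get("operationName", {}).get("value", "")
    # Keep only the user-assigned identity write/delete operations
    # that this procedure looks at in the portal.
    if "userassignedidentities" in op.lower():
        print(op, event.get("eventTimestamp"))
```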
+
+## View role assignments added and removed for managed identities
+
+ > [!NOTE]
+ > You will need to search by the object (principal) ID of the managed identity you want to view role assignment changes for.
+
+1. Locate the managed identity you wish to view the role assignment changes for. If you're looking for a system-assigned managed identity, the object ID will be displayed in the **Identity** screen under the resource. If you're looking for a user-assigned identity, the object ID will be displayed in the **Overview** page of the managed identity.
+
+User-assigned identity:
+
+![Get object ID of user-assigned identity](./media/how-to-view-managed-identity-activity/get-object-id-of-user-assigned-identity.png)
+
+System-assigned identity:
+
+![Get object ID of system-assigned identity](./media/how-to-view-managed-identity-activity/get-object-id-of-system-assigned-identity.png)
+
+2. Copy the object ID.
+3. Browse to the **Activity log**.
+
+ ![Browse to the activity log in the Azure portal](./media/how-to-view-managed-identity-activity/browse-to-activity-log.png)
+
+4. Select the **Add Filter** search pill and select **Operation** from the list.
+
+![Start building the search filter](./media/how-to-view-managed-identity-activity/start-adding-search-filter.png)
+
+5. In the **Operation** dropdown list, enter these operation names: "Create role assignment" and "Delete role assignment".
+
+![Add role assignment operations to the search filter](./media/how-to-view-managed-identity-activity/add-role-assignment-operations-to-search-filter.png)
+
+6. Paste the object ID in the search box; the results will be filtered automatically.
+
+![Search by object ID](./media/how-to-view-managed-identity-activity/search-by-object-id.png)
+
+7. When matching operations are displayed, select one to view the summary.
+
+![Summary of role assignment for managed identity](./media/how-to-view-managed-identity-activity/summary-of-role-assignment-for-msi.png)
+
+## View authentication attempts by managed identities
+
+1. Browse to **Azure Active Directory**.
+
+![Browse to active directory](./media/how-to-view-managed-identity-activity/browse-to-active-directory.png)
+
+2. Select **Sign-in logs** from the **Monitoring** section.
+
+![Select sign-in logs](./media/how-to-view-managed-identity-activity/sign-in-logs-menu-item.png)
+
+3. Select the **Managed identity sign-ins** tab.
+
+![managed identity sign-in](./media/how-to-view-managed-identity-activity/msi-sign-ins.png)
+
+4. The sign-in events will now be filtered by managed identities.
+
+![managed identity sign-in events](./media/how-to-view-managed-identity-activity/msi-sign-in-events.png)
+
+5. To view the identity's Enterprise application in Azure Active Directory, select the "Managed Identity ID" column.
+6. To view the Azure resource or user-assigned managed identity, search by name in the search bar of the Azure portal.
+
+## Next steps
+
+* [Managed identities for Azure resources](./overview.md)
+* [Azure Activity log](/azure/azure-monitor/essentials/activity-log)
+* [Azure Active Directory sign-ins log](/azure/active-directory/reports-monitoring/concept-sign-ins)
active-directory Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/overview.md
ms.devlang: Previously updated : 05/20/2021 Last updated : 08/26/2021
A common challenge for developers is the management of secrets and credentials used to secure communication between different components making up a solution. Managed identities eliminate the need for developers to manage credentials. Managed identities provide an identity for applications to use when connecting to resources that support Azure Active Directory (Azure AD) authentication. Applications may use the managed identity to obtain Azure AD tokens. For example, an application may use a managed identity to access resources like [Azure Key Vault](../../key-vault/general/overview.md) where developers can store credentials in a secure manner or to access storage accounts.
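As a hedged illustration of that pattern (not part of the original article), the sketch below uses the `azure-identity` and `azure-keyvault-secrets` Python packages so that code running on an Azure resource with a managed identity can read a Key Vault secret without any stored credentials. The vault URL and secret name are placeholders.

```python
# Minimal sketch: an app running on an Azure resource with a managed
# identity reads a secret from Key Vault without any stored credentials.
# Requires: pip install azure-identity azure-keyvault-secrets
from azure.identity import ManagedIdentityCredential
from azure.keyvault.secrets import SecretClient

# No client secret or certificate here; the token comes from the
# resource's managed identity endpoint at run time.
credential = ManagedIdentityCredential()

client = SecretClient(
    vault_url="https://<your-key-vault-name>.vault.azure.net",  # placeholder
    credential=credential,
)
secret = client.get_secret("database-connection-string")  # placeholder name
print(secret.value)
```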
-What can a managed identity be used for?</br>
+Take a look at how you can use managed identities</br>
+
+> [!VIDEO https://channel9.msdn.com/Shows/On-NET/Using-Azure-Managed-identities/player?format=ny]
+
-> [!VIDEO https://www.youtube.com/embed/5lqayO_oeEo]
Here are some of the benefits of using Managed identities:
There are two types of managed identities:
- **System-assigned** Some Azure services allow you to enable a managed identity directly on a service instance. When you enable a system-assigned managed identity an identity is created in Azure AD that is tied to the lifecycle of that service instance. So when the resource is deleted, Azure automatically deletes the identity for you. By design, only that Azure resource can use this identity to request tokens from Azure AD. - **User-assigned** You may also create a managed identity as a standalone Azure resource. You can [create a user-assigned managed identity](how-to-manage-ua-identity-portal.md) and assign it to one or more instances of an Azure service. In the case of user-assigned managed identities, the identity is managed separately from the resources that use it. </br></br>
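Because a single resource can have several user-assigned identities, client code typically has to specify which identity to use. A minimal sketch with the `azure-identity` Python package follows; the client ID and token scope are placeholders.

```python
# Minimal sketch: request a token with a specific user-assigned managed
# identity by passing its client ID (placeholder value below).
from azure.identity import ManagedIdentityCredential

credential = ManagedIdentityCredential(
    client_id="<client-id-of-the-user-assigned-identity>"
)
# Scope for Azure Resource Manager; swap in the resource you need.
token = credential.get_token("https://management.azure.com/.default")
print(token.expires_on)
```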
-> [!VIDEO https://www.youtube.com/embed/OzqpxeD3fG0]
The table below shows the differences between the two types of managed identities.
active-directory Reference Azure Monitor Sign Ins Log Schema https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/reports-monitoring/reference-azure-monitor-sign-ins-log-schema.md
description: Describe the Azure AD sign in log schema for use in Azure Monitor
documentationcenter: '' -+ editor: '' ms.assetid: 4b18127b-d1d0-4bdc-8f9c-6a4c991c5f75
na Previously updated : 07/30/2021 Last updated : 08/26/2021
This article describes the Azure Active Directory (Azure AD) sign-in log schema
```json
-{
- "time": "2019-03-12T16:02:15.5522137Z",
- "resourceId": "/tenants/<TENANT ID>/providers/Microsoft.aadiam",
- "operationName": "Sign-in activity",
- "operationVersion": "1.0",
- "category": "SignInLogs",
- "tenantId": "<TENANT ID>",
- "resultType": "50140",
- "resultSignature": "None",
- "resultDescription": "This error occurred due to 'Keep me signed in' interrupt when the user was signing-in.",
- "durationMs": 0,
- "callerIpAddress": "<CALLER IP ADDRESS>",
- "correlationId": "a75a10bd-c126-486b-9742-c03110d36262",
- "identity": "Timothy Perkins",
- "Level": 4,
- "location": "US",
- "properties":
- {
- "id":"0231f922-93fa-4005-bb11-b344eca03c01",
- "createdDateTime":"2019-03-12T16:02:15.5522137+00:00",
- "userDisplayName":"Timothy Perkins",
- "userPrincipalName":"<USER PRINCIPAL NAME>",
- "userId":"<USER ID>",
- "appId":"<APPLICATION ID>",
- "appDisplayName":"Azure Portal",
- "ipAddress":"<IP ADDRESS>",
- "status":
- {
- "errorCode":50140,
- "failureReason":"This error occurred due to 'Keep me signed in' interrupt when the user was signing-in."
- },
- "clientAppUsed":"Browser",
- "deviceDetail":
- {
- "operatingSystem":"Windows 10",
- "browser":"Chrome 72.0.3626"
- },
- "location":
- {
- "city":"Bellevue",
- "state":"Washington",
- "countryOrRegion":"US",
- "geoCoordinates":
- {
- "latitude":45,
- "longitude":122
- }
- },
- "correlationId":"a75a10bd-c126-486b-9742-c03110d36262",
- "conditionalAccessStatus":"notApplied",
- "appliedConditionalAccessPolicies":
- [
- {
- "id":"ae11ffaa-9879-44e0-972c-7538fd5c4d1a",
- "displayName":"Hr app access policy",
- "enforcedGrantControls":
- [
- "Mfa"
- ],
- "enforcedSessionControls":
- [
- ],
- "result":"notApplied"
- },
- {
- "id":"b915a70b-2eee-47b6-85b6-ff4f4a66256d",
- "displayName":"MFA for all but global support access",
- "enforcedGrantControls":[],
- "enforcedSessionControls":[],
- "result":"notEnabled"
- },
- {
- "id":"830f27fa-67a8-461f-8791-635b7225caf1",
- "displayName":"Header Based Application Control",
- "enforcedGrantControls":["Mfa"],
- "enforcedSessionControls":[],
- "result":"notApplied"
- },
- {
- "id":"8ed8d7f7-0a2e-437b-b512-9e47bed562e6",
- "displayName":"MFA for everyones",
- "enforcedGrantControls":[],
- "enforcedSessionControls":[],
- "result":"notEnabled"
- },
- {
- "id":"52924e0f-798b-4afd-8c42-49055c7d6395",
- "displayName":"Device compliant",
- "enforcedGrantControls":[],
- "enforcedSessionControls":[],
- "result":"notEnabled"
- },
- ],
- "isInteractive":true,
- "tokenIssuerType":"AzureAD",
- "authenticationProcessingDetails":[],
- "networkLocationDetails":[],
- "processingTimeInMilliseconds":0,
- "riskDetail":"hidden",
- "riskLevelAggregated":"hidden",
- "riskLevelDuringSignIn":"hidden",
- "riskState":"none",
- "riskEventTypes":[],
- "resourceDisplayName":"windows azure service management api",
- "resourceId":"797f4846-ba00-4fd7-ba43-dac1f8f63013"
- }
+{
+ "time": "2019-03-12T16:02:15.5522137Z",
+ "resourceId": "/tenants/<TENANT ID>/providers/Microsoft.aadiam",
+ "operationName": "Sign-in activity",
+ "operationVersion": "1.0",
+ "category": "SignInLogs",
+ "tenantId": "<TENANT ID>",
+ "resultType": "50140",
+ "resultSignature": "None",
+ "resultDescription": "This error occurred due to 'Keep me signed in' interrupt when the user was signing-in.",
+ "durationMs": 0,
+ "callerIpAddress": "<CALLER IP ADDRESS>",
+ "correlationId": "a75a10bd-c126-486b-9742-c03110d36262",
+ "identity": "Timothy Perkins",
+ "Level": 4,
+ "location": "US",
+ "properties":
+ {
+ "id": "0231f922-93fa-4005-bb11-b344eca03c01",
+ "createdDateTime": "2019-03-12T16:02:15.5522137+00:00",
+ "userDisplayName": "Timothy Perkins",
+ "userPrincipalName": "<USER PRINCIPAL NAME>",
+ "userId": "<USER ID>",
+ "appId": "<APPLICATION ID>",
+ "appDisplayName": "Azure Portal",
+ "ipAddress": "<IP ADDRESS>",
+ "status": {
+ "errorCode": 50140,
+ "failureReason": "This error occurred due to 'Keep me signed in' interrupt when the user was signing-in."
+ },
+ "clientAppUsed": "Browser",
+ "userAgent": "<USER AGENT>",
+ "deviceDetail":
+ {
+ "deviceId": "8bfcb982-6856-4402-924c-ada2486321cc",
+ "operatingSystem": "Windows 10",
+ "browser": "Chrome 72.0.3626"
+ },
+ "location":
+ {
+ "city": "Bellevue",
+ "state": "Washington",
+ "countryOrRegion": "US",
+ "geoCoordinates":
+ {
+ "latitude": 45,
+ "longitude": 122
+ }
+ },
+ "correlationId": "a75a10bd-c126-486b-9742-c03110d36262",
+ "conditionalAccessStatus": "notApplied",
+ "appliedConditionalAccessPolicies": [
+ {
+ "id": "ae11ffaa-9879-44e0-972c-7538fd5c4d1a",
+ "displayName": "HR app access policy",
+ "enforcedGrantControls": [
+ "Mfa"
+ ],
+ "enforcedSessionControls": [],
+ "result": "notApplied",
+ "conditionsSatisfied": 0,
+ "conditionsNotSatisfied": 0
+ },
+ {
+ "id": "b915a70b-2eee-47b6-85b6-ff4f4a66256d",
+ "displayName": "MFA for all but global support access",
+ "enforcedGrantControls": [],
+ "enforcedSessionControls": [],
+ "result": "notEnabled",
+ "conditionsSatisfied": 0,
+ "conditionsNotSatisfied": 0
+ },
+ {
+ "id": "830f27fa-67a8-461f-8791-635b7225caf1",
+ "displayName": "Header Based Application Control",
+ "enforcedGrantControls": [
+ "Mfa"
+ ],
+ "enforcedSessionControls": [],
+ "result": "notApplied",
+ "conditionsSatisfied": 0,
+ "conditionsNotSatisfied": 0
+ },
+ {
+ "id": "8ed8d7f7-0a2e-437b-b512-9e47bed562e6",
+ "displayName": "MFA for everyones",
+ "enforcedGrantControls": [],
+ "enforcedSessionControls": [],
+ "result": "notEnabled",
+ "conditionsSatisfied": 0,
+ "conditionsNotSatisfied": 0
+ },
+ {
+ "id": "52924e0f-798b-4afd-8c42-49055c7d6395",
+ "displayName": "Device compliant",
+ "enforcedGrantControls": [],
+ "enforcedSessionControls": [],
+ "result": "notEnabled",
+ "conditionsSatisfied": 0,
+ "conditionsNotSatisfied": 0
+ }
+ ],
+ "originalRequestId": "f2f0a254-f831-43b9-bcb0-2646fb645c00",
+ "isInteractive": true,
+ "authenticationProcessingDetails": [
+ {
+ "key": "Login Hint Present",
+ "value": "True"
+ }
+ ],
+ "networkLocationDetails": [],
+ "processingTimeInMilliseconds": 238,
+ "riskDetail": "none",
+ "riskLevelAggregated": "none",
+ "riskLevelDuringSignIn": "none",
+ "riskState": "none",
+ "riskEventTypes": [],
+ "riskEventTypes_v2": [],
+ "resourceDisplayName": "Office 365 SharePoint Online",
+ "resourceId": "00000003-0000-0ff1-ce00-000000000000",
+ "resourceTenantId": "72f988bf-86f1-41af-91ab-2d7cd011db47",
+ "homeTenantId": "<USER HOME TENANT ID>",
+ "tokenIssuerName": "",
+ "tokenIssuerType": "AzureAD",
+ "authenticationDetails": [
+ {
+ "authenticationStepDateTime": "2019-03-12T16:02:15.5522137+00:00",
+ "authenticationMethod": "Previously satisfied",
+ "succeeded": true,
+ "authenticationStepResultDetail": "First factor requirement satisfied by claim in the token",
+ "authenticationStepRequirement": "Primary authentication",
+ "StatusSequence": 0,
+ "RequestSequence": 0
+ },
+ {
+ "authenticationStepDateTime": "2021-08-12T15:48:12.8677211+00:00",
+ "authenticationMethod": "Previously satisfied",
+ "succeeded": true,
+ "authenticationStepResultDetail": "MFA requirement satisfied by claim in the token",
+ "authenticationStepRequirement": "Multi-factor authentication"
+ }
+ ],
+ "authenticationRequirementPolicies": [
+ {
+ "requirementProvider": "multiConditionalAccess",
+ "detail": "Conditional Access"
+ }
+ ],
+ "authenticationRequirement": "multiFactorAuthentication",
+ "alternateSignInName": "<ALTERNATE SIGN IN>",
+ "signInIdentifier": "<SIGN IN IDENTIFIER>",
+ "servicePrincipalId": "",
+ "userType": "Member",
+ "flaggedForReview": false,
+ "isTenantRestricted": false,
+ "autonomousSystemNumber": 8000,
+ "crossTenantAccessType": "none",
+ "privateLinkDetails": {},
+ "ssoExtensionVersion": ""
+ }
}
+
```
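If you route these logs to a storage account or another downstream system, each record follows the schema above and can be processed directly. A minimal Python sketch, assuming a single record saved to a hypothetical local file named `signin.json`:

```python
import json

# "signin.json" is a hypothetical file holding one record that follows the schema above.
with open("signin.json") as f:
    record = json.load(f)

props = record["properties"]
print(f'{props["userDisplayName"]} -> {props["appDisplayName"]}: result {record["resultType"]}')

# List each Conditional Access policy that was evaluated for this sign-in and its outcome.
for policy in props.get("appliedConditionalAccessPolicies", []):
    print(f'  {policy["displayName"]}: {policy["result"]}')
```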
active-directory Reports Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/reports-monitoring/reports-faq.md
na
ms.devlang: na Previously updated : 07/28/2021 Last updated : 08/26/2021
You can also export that log data to Azure Monitor, Azure Event Hubs, and Azure
+## Sign-in logs
+
+**Q: What data is included in the CSV file I can download from the Azure AD Sign-in logs blade in the Azure portal?**
+**A:** The CSV includes sign-in logs for your users and service principals. However, data that is represented as a nested array in the Microsoft Graph API for sign-in logs is not included in CSV downloads. For example, Conditional Access policies and report-only information are not included in the CSV download. If you need to export all the information contained in your sign-in logs, use the **Export Data Settings** button in the Azure AD Sign-in logs blade to export all data.
++
+
+**Q: Why is Client app not populated when a guest signs into my tenant?**
+**A:** When a guest user signs into your tenant, the client app information for that user is not displayed in your tenant's sign-in logs to maintain customer privacy. Your users' client apps will not be displayed to other tenants that your users attempt to access.
+++
+**Q: Why is Device ID not populated when a guest signs into my tenant?**
+**A:** When a user signs into your tenant using a device registered with another tenant, the Device ID for that device is not displayed in your tenant's sign-in logs to maintain customer privacy. Your Device IDs will not be displayed to other tenants that your users attempt to access.
+++
+**Q: In some interrupted sign-ins, why do I see an Object ID rather than a UPN for my user?**
+**A:** When our service is unable to resolve the UPN of a user due to an interrupted or failed sign-in, it may display an object ID instead.
+++
+**Q: Why is a user's sign-in shown as an interactive sign-in even if the property isInteractive is false?**
+**A:** This property is being deprecated. It does not reliably indicate which sign-in events are interactive and which are non-interactive.
+
+Within the Azure AD sign-in logs blade in the Azure portal, you can find interactive sign-ins in the **User sign-ins (interactive)** tab and non-interactive sign-ins in the **User sign-ins (non-interactive)** tab. In the MS Graph API, rely on the signInEventTypes property to determine which sign-ins are interactive. For example:
+
+`"signInEventTypes":["interactiveUser"],`
+
+You can also filter using the `$filter` query parameter when requesting sign-in logs from the MS Graph API.
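As a sketch, the following Python snippet issues such a filtered request with the `requests` library. It assumes you have separately acquired a Graph access token with permission to read audit logs, and it uses the `beta` endpoint, where the signInEventTypes property is exposed at the time of writing.

```python
import requests

# Assumed: a Graph access token acquired separately (for example, with MSAL) that can read audit logs.
token = "<GRAPH-ACCESS-TOKEN>"

response = requests.get(
    "https://graph.microsoft.com/beta/auditLogs/signIns",
    headers={"Authorization": f"Bearer {token}"},
    params={"$filter": "signInEventTypes/any(t: t eq 'interactiveUser')"},
)
response.raise_for_status()

# Print each returned sign-in ID and its event types.
for signin in response.json().get("value", []):
    print(signin["id"], signin.get("signInEventTypes"))
```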
+ ## Risky sign-ins **Q: There is a risk detection in Identity Protection but I'm not seeing the corresponding sign-in in the sign-ins report. Is this expected?**
active-directory Amazon Managed Grafana Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/amazon-managed-grafana-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Amazon Managed Grafana | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Amazon Managed Grafana.
++++++++ Last updated : 08/25/2021++++
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Amazon Managed Grafana
+
+In this tutorial, you'll learn how to integrate Amazon Managed Grafana with Azure Active Directory (Azure AD). When you integrate Amazon Managed Grafana with Azure AD, you can:
+
+* Control in Azure AD who has access to Amazon Managed Grafana.
+* Enable your users to be automatically signed-in to Amazon Managed Grafana with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Amazon Web Services (AWS) [free account](https://aws.amazon.com/free/).
+* Amazon Managed Grafana single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Amazon Managed Grafana supports **SP** initiated SSO.
+* Amazon Managed Grafana supports **Just In Time** user provisioning.
+
+## Add Amazon Managed Grafana from the gallery
+
+To configure the integration of Amazon Managed Grafana into Azure AD, you need to add Amazon Managed Grafana from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Amazon Managed Grafana** in the search box.
+1. Select **Amazon Managed Grafana** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Amazon Managed Grafana
+
+Configure and test Azure AD SSO with Amazon Managed Grafana using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Amazon Managed Grafana.
+
+To configure and test Azure AD SSO with Amazon Managed Grafana, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Amazon Managed Grafana SSO](#configure-amazon-managed-grafana-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Amazon Managed Grafana test user](#create-amazon-managed-grafana-test-user)** - to have a counterpart of B.Simon in Amazon Managed Grafana that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Amazon Managed Grafana** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ `https://<namespace>.grafana-workspace.<region>.amazonaws.com/saml/metadata`
+
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
+ `https://<namespace>.grafana-workspace.<region>.amazonaws.com/login/saml`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [Amazon Managed Grafana Client support team](https://aws.amazon.com/contact-us/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. Amazon Managed Grafana application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![image](common/default-attributes.png)
+
+1. In addition to the above, the Amazon Managed Grafana application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also prepopulated, but you can review and update them per your requirements.
+
+ | Name | Source attribute |
+ | --- | --- |
+ | displayName | user.displayname |
+ | mail | user.userprincipalname |
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/metadataxml.png)
+
+1. On the **Set up Amazon Managed Grafana** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Amazon Managed Grafana.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Amazon Managed Grafana**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Amazon Managed Grafana SSO
+
+1. Log in to your Amazon Managed Grafana Console as an administrator.
+
+1. Click **Create workspace**.
+
+ ![Screenshot shows creating workspace.](./media/amazon-managed-grafana-tutorial/console.png "Workspace")
+
+1. In the **Specify workspace details** page, type a unique **Workspace name** and click **Next**.
+
+ ![Screenshot shows workspace details.](./media/amazon-managed-grafana-tutorial/details.png "Specify details")
+
+1. In the **Configure settings** page, select the **Security Assertion Markup Language (SAML)** checkbox, enable **Service managed** as the permission type, and click **Next**.
+
+ ![Screenshot shows workspace settings.](./media/amazon-managed-grafana-tutorial/security.png "Settings")
+
+1. In the **Service managed permission settings**, select **Current account** and click **Next**.
+
+ ![Screenshot shows permission settings.](./media/amazon-managed-grafana-tutorial/setting.png "permissions")
+
+1. In the **Review and create** page, verify all the workspace details and click **Create workspace**.
+
+ ![Screenshot shows review and create page.](./media/amazon-managed-grafana-tutorial/review-workspace.png " Create Workspace")
+
+1. After creating the workspace, click **Complete setup** to complete the SAML configuration.
+
+ ![Screenshot shows SAML configuration.](./media/amazon-managed-grafana-tutorial/setup.png "SAML Configuration")
+
+1. In the **Security Assertion Markup Language (SAML)** page, perform the following steps.
+
+ ![Screenshot shows SAML Setup.](./media/amazon-managed-grafana-tutorial/configuration.png "SAML Setup")
+
+ 1. Copy the **Service provider identifier (Entity ID)** value and paste it into the **Identifier** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. Copy the **Service provider reply URL (Assertion consumer service URL)** value and paste it into the **Reply URL** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. Copy the **Service provider login URL** value and paste it into the **Sign on URL** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. Open the downloaded **Federation Metadata XML** from the Azure portal in Notepad, and upload the XML file by clicking the **Choose file** option.
+
+ 1. In the **Assertion mapping** section, fill in the required values according to your requirements.
+
+ 1. Click **Save SAML configuration**.
+
+### Create Amazon Managed Grafana test user
+
+In this section, a user called Britta Simon is created in Amazon Managed Grafana. Amazon Managed Grafana supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Amazon Managed Grafana, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with following options.
+
+* Click on **Test this application** in Azure portal. This will redirect to Amazon Managed Grafana Sign-on URL where you can initiate the login flow.
+
+* Go to Amazon Managed Grafana Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the Amazon Managed Grafana tile in the My Apps, this will redirect to Amazon Managed Grafana Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Amazon Managed Grafana you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Animaker Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/animaker-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Animaker | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Animaker.
++++++++ Last updated : 08/24/2021++++
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Animaker
+
+In this tutorial, you'll learn how to integrate Animaker with Azure Active Directory (Azure AD). When you integrate Animaker with Azure AD, you can:
+
+* Control in Azure AD who has access to Animaker.
+* Enable your users to be automatically signed-in to Animaker with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Animaker single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Animaker supports **SP and IDP** initiated SSO.
+
+## Add Animaker from the gallery
+
+To configure the integration of Animaker into Azure AD, you need to add Animaker from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Animaker** in the search box.
+1. Select **Animaker** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Animaker
+
+Configure and test Azure AD SSO with Animaker using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Animaker.
+
+To configure and test Azure AD SSO with Animaker, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Animaker SSO](#configure-animaker-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Animaker test user](#create-animaker-test-user)** - to have a counterpart of B.Simon in Animaker that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Animaker** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, the user does not have to perform any step as the app is already pre-integrated with Azure.
+
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type the URL:
+ `https://app.animaker.com/login/samlsuccess/azure/`
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url** and save it on your computer.
+
+ ![The Certificate download link](common/copy-metadataurl.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Animaker.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Animaker**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Animaker SSO
+
+To configure single sign-on on **Animaker** side, you need to send the **App Federation Metadata Url** to [Animaker support team](mailto:help@animaker.com). They set this setting to have the SAML SSO connection set properly on both sides.
+
+### Create Animaker test user
+
+In this section, you create a user called Britta Simon in Animaker. Work with [Animaker support team](mailto:help@animaker.com) to add the users in the Animaker platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in Azure portal. This will redirect to Animaker Sign on URL where you can initiate the login flow.
+
+* Go to Animaker Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the Animaker for which you set up the SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the Animaker tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the Animaker for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Animaker you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Baldwin Safety & Compliance Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/baldwin-safety-&-compliance-tutorial.md
Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Baldwin Safety & Compliance | Microsoft Docs'
-description: Learn how to configure single sign-on between Azure Active Directory and Baldwin Safety & Compliance.
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Baldwin Safety and Compliance | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Baldwin Safety and Compliance.
Previously updated : 08/16/2021 Last updated : 08/25/2021
-# Tutorial: Azure Active Directory single sign-on (SSO) integration with Baldwin Safety & Compliance
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Baldwin Safety and Compliance
-In this tutorial, you'll learn how to integrate Baldwin Safety & Compliance with Azure Active Directory (Azure AD). When you integrate Baldwin Safety & Compliance with Azure AD, you can:
+In this tutorial, you'll learn how to integrate Baldwin Safety and Compliance with Azure Active Directory (Azure AD). When you integrate Baldwin Safety and Compliance with Azure AD, you can:
-* Control in Azure AD who has access to Baldwin Safety & Compliance.
-* Enable your users to be automatically signed-in to Baldwin Safety & Compliance with their Azure AD accounts.
+* Control in Azure AD who has access to Baldwin Safety and Compliance.
+* Enable your users to be automatically signed-in to Baldwin Safety and Compliance with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal. ## Prerequisites
In this tutorial, you'll learn how to integrate Baldwin Safety & Compliance with
To get started, you need the following items: * An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
-* Baldwin Safety & Compliance single sign-on (SSO) enabled subscription.
+* Baldwin Safety and Compliance single sign-on (SSO) enabled subscription.
## Scenario description In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Baldwin Safety & Compliance supports **IDP** initiated SSO.
+* Baldwin Safety and Compliance supports **IDP** initiated SSO.
-## Add Baldwin Safety & Compliance from the gallery
+## Add Baldwin Safety and Compliance from the gallery
-To configure the integration of Baldwin Safety & Compliance into Azure AD, you need to add Baldwin Safety & Compliance from the gallery to your list of managed SaaS apps.
+To configure the integration of Baldwin Safety and Compliance into Azure AD, you need to add Baldwin Safety and Compliance from the gallery to your list of managed SaaS apps.
1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account. 1. On the left navigation pane, select the **Azure Active Directory** service. 1. Navigate to **Enterprise Applications** and then select **All Applications**. 1. To add new application, select **New application**.
-1. In the **Add from the gallery** section, type **Baldwin Safety & Compliance** in the search box.
-1. Select **Baldwin Safety & Compliance** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+1. In the **Add from the gallery** section, type **Baldwin Safety and Compliance** in the search box.
+1. Select **Baldwin Safety and Compliance** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD SSO for Baldwin Safety & Compliance
+## Configure and test Azure AD SSO for Baldwin Safety and Compliance
-Configure and test Azure AD SSO with Baldwin Safety & Compliance using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Baldwin Safety & Compliance.
+Configure and test Azure AD SSO with Baldwin Safety and Compliance using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Baldwin Safety and Compliance.
-To configure and test Azure AD SSO with Baldwin Safety & Compliance, perform the following steps:
+To configure and test Azure AD SSO with Baldwin Safety and Compliance, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature. 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon. 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on. 1. **[Configure Baldwin Safety and Compliance SSO](#configure-baldwin-safety-and-compliance-sso)** - to configure the single sign-on settings on application side.
- 1. **[Create Baldwin Safety and Compliance test user](#create-baldwin-safety-and-compliance-test-user)** - to have a counterpart of B.Simon in Baldwin Safety & Compliance that is linked to the Azure AD representation of user.
+ 1. **[Create Baldwin Safety and Compliance test user](#create-baldwin-safety-and-compliance-test-user)** - to have a counterpart of B.Simon in Baldwin Safety and Compliance that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works. ## Configure Azure AD SSO Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the Azure portal, on the **Baldwin Safety & Compliance** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Baldwin Safety and Compliance** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**. 1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy Thumbprint value](common/copy-thumbprint.png)
-1. On the **Set up Baldwin Safety & Compliance** section, copy the appropriate URL(s) based on your requirement.
+1. On the **Set up Baldwin Safety and Compliance** section, copy the appropriate URL(s) based on your requirement.
![Copy configuration URLs](common/copy-configuration-urls.png)
In this section, you'll create a test user in the Azure portal called B.Simon.
### Assign the Azure AD test user
-In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Baldwin Safety & Compliance.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Baldwin Safety and Compliance.
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
-1. In the applications list, select **Baldwin Safety & Compliance**.
+1. In the applications list, select **Baldwin Safety and Compliance**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**. 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog. 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
## Configure Baldwin Safety and Compliance SSO
-To configure single sign-on on **Baldwin Safety & Compliance** side, you need to send the **Thumbprint Value** and appropriate copied URLs from Azure portal to [Baldwin Safety & Compliance support team](mailto:support@baldwinaviation.com). They set this setting to have the SAML SSO connection set properly on both sides.
+To configure single sign-on on **Baldwin Safety and Compliance** side, you need to send the **Thumbprint Value** and appropriate copied URLs from Azure portal to [Baldwin Safety and Compliance support team](mailto:support@baldwinaviation.com). They set this setting to have the SAML SSO connection set properly on both sides.
### Create Baldwin Safety and Compliance test user
-In this section, you create a user called Britta Simon in Baldwin Safety & Compliance. Work with [Baldwin Safety & Compliance support team](mailto:support@baldwinaviation.com) to add the users in the Baldwin Safety & Compliance platform. Users must be created and activated before you use single sign-on.
+In this section, you create a user called Britta Simon in Baldwin Safety and Compliance. Work with [Baldwin Safety and Compliance support team](mailto:support@baldwinaviation.com) to add the users in the Baldwin Safety and Compliance platform. Users must be created and activated before you use single sign-on.
## Test SSO In this section, you test your Azure AD single sign-on configuration with following options.
-* Click on Test this application in Azure portal and you should be automatically signed in to the Baldwin Safety & Compliance for which you set up the SSO.
+* Click on Test this application in Azure portal and you should be automatically signed in to the Baldwin Safety and Compliance for which you set up the SSO.
-* You can use Microsoft My Apps. When you click the Baldwin Safety & Compliance tile in the My Apps, you should be automatically signed in to the Baldwin Safety & Compliance for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* You can use Microsoft My Apps. When you click the Baldwin Safety and Compliance tile in the My Apps, you should be automatically signed in to the Baldwin Safety and Compliance for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
## Next steps
-Once you configure Baldwin Safety & Compliance you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
+Once you configure Baldwin Safety and Compliance you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Bldng App Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/bldng-app-provisioning-tutorial.md
+
+ Title: 'Tutorial: Configure BLDNG APP for automatic user provisioning with Azure Active Directory | Microsoft Docs'
+description: Learn how to automatically provision and de-provision user accounts from Azure AD to BLDNG APP.
++
+writer: twimmers
++
+ms.assetid: 5ccc1176-c244-4003-8486-67586bcdf317
++++ Last updated : 05/10/2021+++
+# Tutorial: Configure BLDNG APP for automatic user provisioning in BLDNG.AI
+
+This tutorial describes the steps you need to perform in both BLDNG APP and Azure Active Directory (Azure AD) to configure automatic user provisioning. When configured, Azure AD automatically provisions and de-provisions users and groups to [BLDNG APP](https://dashboard.bldng.ai/) using the Azure AD Provisioning service. For important details on what this service does, how it works, and frequently asked questions, see [Automate user provisioning and deprovisioning to SaaS applications with Azure Active Directory](../app-provisioning/user-provisioning.md).
++
+## Capabilities supported
+> [!div class="checklist"]
+> * Create users in BLDNG.AI
+> * Remove users in BLDNG.AI when they do not require access anymore.
+> * Keep user attributes synchronized between Azure AD and BLDNG.AI
+> * Provision groups and group memberships in BLDNG.AI
+> * [Single sign-on](../manage-apps/add-application-portal-setup-oidc-sso.md) to BLDNG.AI (recommended).
++
+## Prerequisites
+
+The scenario outlined in this tutorial assumes that you already have the following prerequisites:
+
+* [An Azure AD tenant](../develop/quickstart-create-new-tenant.md)
+* A user account in Azure AD with [permission](../roles/permissions-reference.md) to configure provisioning (for example, Application Administrator, Cloud Application administrator, Application Owner, or Global Administrator).
+* A [BLDNG.AI](https://dashboard.bldng.ai/) agreement.
+* An invitation from BLDNG.AI to enable user provisioning and use BLDNG APP
+
+## Step 1. Plan your provisioning deployment
+1. Learn about [how the provisioning service works](../app-provisioning/user-provisioning.md).
+1. Determine who will be in [scope for provisioning](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).
+1. Determine what data to [map between Azure AD and BLDNG APP](../app-provisioning/customize-application-attributes.md).
+
+## Step 2. Configure BLDNG APP to support provisioning with Azure AD
+
+* To configure provisioning of users, user groups, and group memberships from Azure, you'll need a BLDNG.AI agreement and tenant.
+* To obtain an agreement, contact [sales](mailto:salg@bldng.ai) to get in touch with a sales representative. You will not be able to proceed or use BLDNG APP if an agreement does not exist.
+* If you already have an active agreement but need to enable user provisioning only, contact [support](mailto:support@bldng.ai) directly.
+
+When an agreement has been established, you will receive an email with detailed instructions on how to set up user provisioning. The email will also include details regarding admin consent (on behalf of your organization) for using BLDNG APP if needed.
+
+The email will also include Tenant URL and Secret Token for use when configuring automatic user provisioning.
+
+## Step 3. Add BLDNG APP from the Azure AD application gallery
+
+Add BLDNG APP from the Azure AD application gallery to start managing provisioning to BLDNG APP. If you have previously set up BLDNG APP for SSO, you can use the same application. However, it is recommended that you create a separate app when testing out the integration initially. Learn more about adding an application from the gallery [here](../manage-apps/add-application-portal.md).
+
+## Step 4. Define who will be in scope for provisioning
+
+The Azure AD provisioning service allows you to scope who will be provisioned based on assignment to the application and/or based on attributes of the user or group. If you choose to scope who will be provisioned to your app based on assignment, you can use the following [steps](../manage-apps/assign-user-or-group-access-portal.md) to assign users and groups to the application. If you choose to scope who will be provisioned based solely on attributes of the user or group, you can use a scoping filter as described [here](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).
+
+* When assigning users and groups to BLDNG APP, you must select a role other than **Default Access**. Users with the Default Access role are excluded from provisioning and will be marked as not effectively entitled in the provisioning logs. If the only role available on the application is the default access role, you can [update the application manifest](../develop/howto-add-app-roles-in-azure-ad-apps.md) to add additional roles.
+
+* Start small. Test with a small set of users and groups before rolling out to everyone. When scope for provisioning is set to assigned users and groups, you can control this by assigning one or two users or groups to the app. When scope is set to all users and groups, you can specify an [attribute based scoping filter](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).
++
+## Step 5. Configure automatic user provisioning to BLDNG APP
+
+This section guides you through the steps to configure the Azure AD provisioning service to create, update, and disable users and/or groups in BLDNG APP based on user and/or group assignments in Azure AD.
+
+### To configure automatic user provisioning for BLDNG APP in Azure AD:
+
+1. Sign in to the [Azure portal](https://portal.azure.com). Select **Enterprise Applications**, then select **All applications**.
+
+ ![Enterprise applications blade](common/enterprise-applications.png)
+
+1. In the applications list, select **BLDNG APP**.
+
+ ![The BLDNG APP link in the Applications list](common/all-applications.png)
+
+1. Select the **Provisioning** tab.
+
+ ![Provisioning tab](common/provisioning.png)
+
+1. Set the **Provisioning Mode** to **Automatic**.
+
+ ![Provisioning tab automatic](common/provisioning-automatic.png)
+
+1. In the **Admin Credentials** section, input your BLDNG APP **Tenant URL** and **Secret Token**. Click **Test Connection** to ensure Azure AD can connect to BLDNG APP. If the connection fails, ensure your BLDNG APP account has Admin permissions and try again.
+
+ ![Token](common/provisioning-testconnection-tenanturltoken.png)
+
+1. In the **Notification Email** field, enter the email address of a person or group who should receive the provisioning error notifications and select the **Send an email notification when a failure occurs** check box.
+
+ ![Notification Email](common/provisioning-notification-email.png)
+
+1. Select **Save**.
+
+1. In the **Mappings** section, select **Synchronize Azure Active Directory Users to BLDNG APP**.
+
+1. Review the user attributes that are synchronized from Azure AD to BLDNG APP in the **Attribute-Mapping** section. The attributes selected as **Matching** properties are used to match the user accounts in BLDNG APP for update operations. If you choose to change the [matching target attribute](../app-provisioning/customize-application-attributes.md), you will need to ensure that the BLDNG APP API supports filtering users based on that attribute. Select the **Save** button to commit any changes.
+
+> [!NOTE]
+> It is important to note that if you change the mapping of **externalId**, the users in your tenant will not be able to log in using BLDNG APP.
+
+ |Attribute|Type|Supported for filtering|
+ |---|---|---|
+ |userName|String|&check;
+ |active|Boolean|
+ |displayName|String|
+ |emails[type eq "work"].value|String|
+ |name.givenName|String|
+ |name.familyName|String|
+ |phoneNumbers[type eq "mobile"].value|String|
+ |externalId|String|
+ |urn:ietf:params:scim:schemas:extension:enterprise:2.0:User:employeeNumber|String|
+ |urn:ietf:params:scim:schemas:extension:enterprise:2.0:User:department|String|
+
+
+1. Under the **Mappings** section, select **Synchronize Azure Active Directory Groups to BLDNG APP**.
+
+1. Review the group attributes that are synchronized from Azure AD to BLDNG APP in the **Attribute-Mapping** section. The attributes selected as **Matching** properties are used to match the groups in BLDNG APP for update operations. Select the **Save** button to commit any changes.
+
+ |Attribute|Type|Supported for filtering|
+ |---|---|---|
+ |displayName|String|&check;
+ |members|Reference|
+ |externalId|String|
+
+1. To configure scoping filters, refer to the following instructions provided in the [Scoping filter tutorial](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).
+
+1. To enable the Azure AD provisioning service for BLDNG APP, change the **Provisioning Status** to **On** in the **Settings** section.
+
+ ![Provisioning Status Toggled On](common/provisioning-toggle-on.png)
+
+1. Define the users and/or groups that you would like to provision to BLDNG APP by choosing the desired values in **Scope** in the **Settings** section.
+
+ ![Provisioning Scope](common/provisioning-scope.png)
+
+1. When you are ready to provision, click **Save**.
+
+ ![Saving Provisioning Configuration](common/provisioning-configuration-save.png)
+
+This operation starts the initial synchronization cycle of all users and groups defined in **Scope** in the **Settings** section. The initial cycle takes longer to perform than subsequent cycles, which occur approximately every 40 minutes as long as the Azure AD provisioning service is running.
+
+## Step 6. Monitor your deployment
+Once you've configured provisioning, use the following resources to monitor your deployment:
+
+* Use the [provisioning logs](../reports-monitoring/concept-provisioning-logs.md) to determine which users have been provisioned successfully or unsuccessfully
+* Check the [progress bar](../app-provisioning/application-provisioning-when-will-provisioning-finish-specific-user.md) to see the status of the provisioning cycle and how close it is to completion
+* If the provisioning configuration seems to be in an unhealthy state, the application will go into quarantine. Learn more about quarantine states [here](../app-provisioning/application-provisioning-quarantine-status.md).
+
+## More resources
+
+* [Managing user account provisioning for Enterprise Apps](../app-provisioning/configure-automatic-user-provisioning-portal.md)
+* [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+
+## Next steps
+
+* [Learn how to review logs and get reports on provisioning activity](../app-provisioning/check-status-user-account-provisioning.md)
active-directory Logzio Cloud Observability For Engineers Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/logzio-cloud-observability-for-engineers-tutorial.md
Last updated 06/16/2021
-# Tutorial: Azure Active Directory single sign-on (SSO) integration with Logz.io - Azure AD Integration
+# Tutorial: Azure Active Directory single sign-on (SSO) set up for Logz.io - Azure AD Integration
-In this tutorial, you'll learn how to integrate Logz.io - Azure AD Integration with Azure Active Directory (Azure AD). When you integrate Logz.io - Azure AD Integration with Azure AD, you can:
+## Single sign-on (SSO) for the Logz.io - Azure portal integration
+
+Logz.io offers an integration with Azure Marketplace.
+This topic provides guidance for admins to set up SSO for the Logz.io-Azure portal integration, which enables an SSO link for users who access Logz.io resources via Microsoft Azure Marketplace.
+
+### Benefits
+
+The advantages of providing your users access to the Logz.io Azure resource via SSO:
+
++ No need to predefine a unique username and password for each user: Any user who has the SSO link can sign in to the application.
++ Better user control: A user must be defined in the Azure account to be able to use the SSO link.
+
+Prepare SSO connectivity before setting up the Azure resource for Logz.io. You'll need the credentials you create in this process to set up the resource.
+
+### Creating SSO connectivity for your Logz.io resource in Azure Active Directory
+
+You'll create an Azure Active Directory (AD) Enterprise application to allow you to use SSO to connect to your Logz.io account from your Azure resource.
+
+### Prerequisites:
+
+To get started, you need the following privileges:
+
+* Access to Azure Active Directory (AAD)
+* Permissions to create a new Enterprise Application
+* Owner role permissions for the Azure subscription for which you are creating the Logz.io resource
+
+To be able to access and use the SSO link that is created for a Logz.io-Azure integration resource, users must be defined in the associated Azure account.
+
+#### Setting up an SSO link for the Logz.io - Azure portal resource
+
+##### Add the Logz.io-Azure Active Directory Integration from the gallery
+
+To configure SSO for the Logz.io resource in the Azure portal, you need to add the Logz.io - Azure AD Integration from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using a Microsoft account.
+2. In the Azure portal, in **Logz.io | Overview**, in the **+ Add** menu, select **Enterprise application**.
+
+ ![Enterprise application option](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-ovrview-enterprise-apps.png)
+
+3. In the Azure Active Directory Gallery, browse to the **Logz.io - Azure AD Integration** application and select it.
+4. Rename the integration with a relevant name and click **Create**. (In the steps that follow, we used the name **AD app for a logz.io resource**)
+
+ ![Rename the integration](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-rename-logzio-ad-integration.png)
+
+##### Copy the Application ID
+
+In **AD app for a logz.io resource | Overview > Properties**, copy the **Application ID** property.
+
+![Copy Application ID](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-copy-application-id-2.png)
+
+##### Configure Azure AD SSO
+
+1. In **AD app for a logz.io resource | Overview > Getting Started**, in **2. Set up single sign on**, click **Get started** to open **Single sign-on**.
+
+ ![Set up SSO](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-set-up-sso.png)
+
+2. In **AD app for a logz.io resource | Single sign-on**, select the **SAML** method.
+
+ ![Select SAML SSO method](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-select-saml.png)
+
+##### Basic SAML configuration
+
+1. In **AD app for a logz.io resource | SAML-based Sign-on**, click **Edit** to open the **Basic SAML Configuration** panel.
+
+ ![Edit basic SAML](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-edit-basic-saml.png)
+
+2. In the **Identifier (Entity ID)** text box, type a value using the pattern `urn:auth0:logzio:*`: Replace the `*` with the **Application ID** you copied in procedure 2, and click the **Default** option.
+
+3. In the **Reply URL (Assertion Consumer Service URL)** text box, type a URL using the pattern `https://logzio.auth0.com/login/callback?connection=CONNECTION_NAME`: Replace `CONNECTION_NAME` with the **Application ID** you copied in procedure 2 (a scripted sketch of this substitution follows these steps).
+
+4. Click **Save** at the top of the panel.
+
+ ![Set SAML](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-basic-saml-config.png)
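The two values in steps 2 and 3 are plain substitutions of the Application ID into fixed patterns. A minimal Python sketch of that substitution, following the steps above (the Application ID shown is a placeholder):

```python
# Placeholder - the Application ID copied from the AD app's Properties page.
application_id = "00000000-0000-0000-0000-000000000000"

# Identifier (Entity ID) and Reply URL values built from the patterns in steps 2 and 3.
identifier = f"urn:auth0:logzio:{application_id}"
reply_url = f"https://logzio.auth0.com/login/callback?connection={application_id}"

print(identifier)
print(reply_url)
```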
+
+##### Configure the user assignment option
+
+In **AD app for a logz.io resource|Properties (Manage > Properties)**, set **User assignment required?** to **No** and click **Save**.
+This step enables users with access to the SSO link to sign in to Logz.io via Microsoft Azure portal, without requiring that you predefine each user in Active Directory.
+
+This option allows any user who is defined under Active Directory to use the SSO link, instead of requiring that you define specific access rights for each user through the AD app that was just created.
+
+If you don't want to configure this option, your organization will have to assign specific access rights to Logz.io for each user.
+
+![User assignment not required](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-user-assignment-required-no.png)
+
+### Enable SSO for your Logz.io resource via Azure Active Directory
+
+When you create a Logz.io account, use the AD app you created for the Logz.io resource to enable single sign-on with Azure Active Directory.
+
+The Logz.io AAD app resource name is automatically populated as you type.
+
+![Select your Logz AAD app to enable SSO](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-select-logz-aad-app.png)
+
+The SSO link is displayed when you sign in to your Logz.io resource. <br>
+Click the link to access your account in Logz.io.
+
+If you don't configure SSO while you are creating the Logz.io resource, you can configure it later via the Single sign-on blade.
+
+You'll have to configure your logs in Azure to ensure they're sent to Logz.io.
+
+![One click SSO to Logz.io](./media/logzio-cloud-observability-for-engineers-tutorial/liftr-logzio-sso-link.png)
+
+## Azure Active Directory single sign-on for an existing Logz.io account
+
+In this section, you'll learn how to integrate Logz.io - Azure AD Integration with Azure Active Directory (Azure AD). When you integrate Logz.io - Azure AD Integration with Azure AD, you can:
* Control in Azure AD who has access to Logz.io - Azure AD Integration. * Enable your users to be automatically signed-in to Logz.io - Azure AD Integration with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-## Prerequisites
+### Prerequisites
To get started, you need the following items: * An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/). * Logz.io - Azure AD Integration single sign-on (SSO) enabled subscription.
-## Scenario description
+### Scenario description
In this tutorial, you configure and test Azure AD SSO in a test environment. * Logz.io - Azure AD Integration supports **IDP** initiated SSO.
-## Add Logz.io - Azure AD Integration from the gallery
+### Add Logz.io - Azure AD Integration from the gallery
To configure the integration of Logz.io - Azure AD Integration into Azure AD, you need to add Logz.io - Azure AD Integration from the gallery to your list of managed SaaS apps.
To configure the integration of Logz.io - Azure AD Integration into Azure AD, yo
1. In the **Add from the gallery** section, type **Logz.io - Azure AD Integration** in the search box. 1. Select **Logz.io - Azure AD Integration** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD SSO for Logz.io - Azure AD Integration
+### Configure and test Azure AD SSO for Logz.io - Azure AD Integration
Configure and test Azure AD SSO with Logz.io - Azure AD Integration using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Logz.io - Azure AD Integration.
To configure and test Azure AD SSO with Logz.io - Azure AD Integration, perform
1. **[Create Logz.io - Azure AD Integration test user](#create-logzio-azure-ad-integration-test-user)** - to have a counterpart of B.Simon in Logz.io - Azure AD Integration that is linked to the Azure AD representation of user. 1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-## Configure Azure AD SSO
+### Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)
-### Create an Azure AD test user
+#### Create an Azure AD test user
In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll create a test user in the Azure portal called B.Simon.
1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box. 1. Click **Create**.
-### Assign the Azure AD test user
+#### Assign the Azure AD test user
In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Logz.io - Azure AD Integration.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen. 1. In the **Add Assignment** dialog, click the **Assign** button.
-## Configure Logz.io Azure AD Integration SSO
+### Configure Logz.io Azure AD Integration SSO
To configure single sign-on on **Logz.io - Azure AD Integration** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [Logz.io - Azure AD Integration support team](mailto:help@logz.io). They set this setting to have the SAML SSO connection set properly on both sides.
-### Create Logz.io Azure AD Integration test user
+#### Create Logz.io Azure AD Integration test user
-In this section, you create a user called Britta Simon in Logz.io - Azure AD Integration. Work with [Logz.io - Azure AD Integration support team](mailto:help@logz.io) to add the users in the Logz.io - Azure AD Integration platform. Users must be created and activated before you use single sign-on.
+In this section, you create a user called Britta Simon in Logz.io - Azure AD Integration. Work with [Logz.io - Azure AD Integration support team](mailto:help@logz.io) to add the users in the Logz.io - Azure AD Integration platform. Users must be created and activated before you use single sign-on.
-## Test SSO
+### Test SSO
In this section, you test your Azure AD single sign-on configuration with following options.
In this section, you test your Azure AD single sign-on configuration with follow
* You can use Microsoft My Apps. When you click the Logz.io Azure AD Integration tile in the My Apps, you should be automatically signed in to the Logz.io Azure AD Integration for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-## Next steps
+### Next steps
+Once you configure Logz.io Azure AD Integration, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Merchlogix Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/merchlogix-tutorial.md
Previously updated : 02/20/2019 Last updated : 08/24/2021 # Tutorial: Azure Active Directory integration with Merchlogix
-In this tutorial, you learn how to integrate Merchlogix with Azure Active Directory (Azure AD).
-Integrating Merchlogix with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Merchlogix with Azure Active Directory (Azure AD). When you integrate Merchlogix with Azure AD, you can:
-* You can control in Azure AD who has access to Merchlogix.
-* You can enable your users to be automatically signed-in to Merchlogix (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Merchlogix.
+* Enable your users to be automatically signed-in to Merchlogix with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Merchlogix, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Merchlogix single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Merchlogix single sign-on (SSO) enabled subscription.
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Merchlogix supports **SP** initiated SSO
-
-## Adding Merchlogix from the gallery
-
-To configure the integration of Merchlogix into Azure AD, you need to add Merchlogix from the gallery to your list of managed SaaS apps.
-
-**To add Merchlogix from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
+* Merchlogix supports **SP** initiated SSO.
-4. In the search box, type **Merchlogix**, select **Merchlogix** from result panel then click **Add** button to add the application.
+* Merchlogix supports [Automated user provisioning](merchlogix-provisioning-tutorial.md).
- ![Merchlogix in the results list](common/search-new-app.png)
+## Add Merchlogix from the gallery
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with Merchlogix based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Merchlogix needs to be established.
-
-To configure and test Azure AD single sign-on with Merchlogix, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Merchlogix Single Sign-On](#configure-merchlogix-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Merchlogix test user](#create-merchlogix-test-user)** - to have a counterpart of Britta Simon in Merchlogix that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of Merchlogix into Azure AD, you need to add Merchlogix from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Merchlogix** in the search box.
+1. Select **Merchlogix** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for Merchlogix
-To configure Azure AD single sign-on with Merchlogix, perform the following steps:
+Configure and test Azure AD SSO with Merchlogix using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Merchlogix.
-1. In the [Azure portal](https://portal.azure.com/), on the **Merchlogix** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with Merchlogix, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Merchlogix SSO](#configure-merchlogix-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Merchlogix test user](#create-merchlogix-test-user)** - to have a counterpart of B.Simon in Merchlogix that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **Merchlogix** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![Merchlogix Domain and URLs single sign-on information](common/sp-identifier.png)
+ a. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ `https://<DOMAIN>/simplesaml/module.php/saml/sp/metadata.php/<SAML_NAME>`
- a. In the **Sign on URL** text box, type a URL using the following pattern:
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
`https://<DOMAIN>/login.php?saml=true`
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
- `https://<DOMAIN>/simplesaml/module.php/saml/sp/metadata.php/<SAML_NAME>`
- > [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [Merchlogix Client support team](https://www.merchlogix.com/contact/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [Merchlogix Client support team](https://www.merchlogix.com/contact/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
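+
+   To double-check the substitution before you paste the values, here is a small Python sketch. The domain and SAML name are placeholders (assumptions); use the real values from Merchlogix support.
+
+   ```python
+   domain = "merchlogix.example.com"   # hypothetical Merchlogix domain
+   saml_name = "example-saml"          # hypothetical SAML name
+
+   identifier = f"https://{domain}/simplesaml/module.php/saml/sp/metadata.php/{saml_name}"
+   sign_on_url = f"https://{domain}/login.php?saml=true"
+
+   print(identifier)
+   print(sign_on_url)
+   ```
+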
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with Merchlogix, perform the following step
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure AD Identifier
-
- c. Logout URL
-
-### Configure Merchlogix Single Sign-On
-
-To configure single sign-on on **Merchlogix** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [Merchlogix support team](https://www.merchlogix.com/contact/). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
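+
+If you script test setup, the same user can be created with Microsoft Graph. This is a hedged sketch; the token, domain, and password are assumptions, not values from this tutorial.
+
+```python
+import requests
+
+TOKEN = "<access-token>"  # token with rights to create users (for example, User.ReadWrite.All)
+
+resp = requests.post(
+    "https://graph.microsoft.com/v1.0/users",
+    headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
+    json={
+        "accountEnabled": True,
+        "displayName": "B.Simon",
+        "mailNickname": "BSimon",
+        "userPrincipalName": "B.Simon@contoso.com",  # use a verified domain in your tenant
+        "passwordProfile": {
+            "forceChangePasswordNextSignIn": True,
+            "password": "<initial-password>",
+        },
+    },
+)
+resp.raise_for_status()
+print(resp.json()["id"])  # object ID of the new test user
+```
+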
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Merchlogix.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Merchlogix.
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Merchlogix**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Merchlogix**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
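+
+The equivalent assignment can also be made with Microsoft Graph by adding an app role assignment on the Merchlogix service principal. This is a sketch under assumptions: the token and object IDs are placeholders, and the all-zeros GUID stands for the default access role used when an app defines no roles.
+
+```python
+import requests
+
+TOKEN = "<access-token>"                 # token with rights to manage app role assignments
+USER_ID = "<user-object-id>"             # object ID of B.Simon
+SP_ID = "<service-principal-object-id>"  # object ID of the Merchlogix enterprise application
+
+resp = requests.post(
+    f"https://graph.microsoft.com/v1.0/servicePrincipals/{SP_ID}/appRoleAssignedTo",
+    headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
+    json={
+        "principalId": USER_ID,
+        "resourceId": SP_ID,
+        "appRoleId": "00000000-0000-0000-0000-000000000000",  # default access role
+    },
+)
+resp.raise_for_status()
+```
+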
- ![Enterprise applications blade](common/enterprise-applications.png)
+## Configure Merchlogix SSO
-2. In the applications list, select **Merchlogix**.
-
- ![The Merchlogix link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on the **Merchlogix** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from the Azure portal to the [Merchlogix support team](https://www.merchlogix.com/contact/). They configure this setting so that the SAML SSO connection is set properly on both sides.
### Create Merchlogix test user In this section, you create a user called Britta Simon in Merchlogix. Work with [Merchlogix support team](https://www.merchlogix.com/contact/) to add the users in the Merchlogix platform. Users must be created and activated before you use single sign-on.
-### Test single sign-on
+Merchlogix also supports automatic user provisioning. You can find more details [here](./merchlogix-provisioning-tutorial.md) on how to configure automatic user provisioning.
+
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Merchlogix tile in the Access Panel, you should be automatically signed in to the Merchlogix for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in Azure portal. This will redirect to Merchlogix Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to Merchlogix Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Merchlogix tile in the My Apps, this will redirect to Merchlogix Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Merchlogix, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Metanetworksconnector Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/metanetworksconnector-tutorial.md
Previously updated : 02/21/2019 Last updated : 08/24/2021 # Tutorial: Azure Active Directory integration with Meta Networks Connector
-In this tutorial, you learn how to integrate Meta Networks Connector with Azure Active Directory (Azure AD).
-Integrating Meta Networks Connector with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Meta Networks Connector with Azure Active Directory (Azure AD). When you integrate Meta Networks Connector with Azure AD, you can:
-* You can control in Azure AD who has access to Meta Networks Connector.
-* You can enable your users to be automatically signed-in to Meta Networks Connector (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Meta Networks Connector.
+* Enable your users to be automatically signed-in to Meta Networks Connector with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Meta Networks Connector, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Meta Networks Connector single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Meta Networks Connector single sign-on (SSO) enabled subscription.
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Meta Networks Connector supports **SP** and **IDP** initiated SSO
+* Meta Networks Connector supports **SP** and **IDP** initiated SSO.
-* Meta Networks Connector supports **Just In Time** user provisioning
-
-## Adding Meta Networks Connector from the gallery
-
-To configure the integration of Meta Networks Connector into Azure AD, you need to add Meta Networks Connector from the gallery to your list of managed SaaS apps.
-
-**To add Meta Networks Connector from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Meta Networks Connector**, select **Meta Networks Connector** from result panel then click **Add** button to add the application.
-
- ![Meta Networks Connector in the results list](common/search-new-app.png)
+* Meta Networks Connector supports **Just In Time** user provisioning.
-## Configure and test Azure AD single sign-on
+* Meta Networks Connector supports [Automated user provisioning](meta-networks-connector-provisioning-tutorial.md).
-In this section, you configure and test Azure AD single sign-on with Meta Networks Connector based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Meta Networks Connector needs to be established.
+## Add Meta Networks Connector from the gallery
-To configure and test Azure AD single sign-on with Meta Networks Connector, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Meta Networks Connector Single Sign-On](#configure-meta-networks-connector-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Meta Networks Connector test user](#create-meta-networks-connector-test-user)** - to have a counterpart of Britta Simon in Meta Networks Connector that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of Meta Networks Connector into Azure AD, you need to add Meta Networks Connector from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Meta Networks Connector** in the search box.
+1. Select **Meta Networks Connector** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for Meta Networks Connector
-To configure Azure AD single sign-on with Meta Networks Connector, perform the following steps:
+Configure and test Azure AD SSO with Meta Networks Connector using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Meta Networks Connector.
-1. In the [Azure portal](https://portal.azure.com/), on the **Meta Networks Connector** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with Meta Networks Connector, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Meta Networks Connector SSO](#configure-meta-networks-connector-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Meta Networks Connector test user](#create-meta-networks-connector-test-user)** - to have a counterpart of B.Simon in Meta Networks Connector that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **Meta Networks Connector** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, If you wish to configure the application in **IDP** initiated mode, perform the following steps:
- ![Screenshot shows the Basic SAML Configuration, where you can enter Identifier, Reply U R L, and select Save.](common/idp-intiated.png)
- a. In the **Identifier** text box, type a URL using the following pattern: `https://login.nsof.io/v1/<ORGANIZATION-SHORT-NAME>/saml/metadata`
To configure Azure AD single sign-on with Meta Networks Connector, perform the f
5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- ![Screenshot shows Set additional U R Ls where you can enter a Sign on U R L.](common/both-advanced-urls.png)
- a. In the **Sign-on URL** text box, type a URL using the following pattern: `https://<ORGANIZATION-SHORT-NAME>.metanetworks.com/login`
To configure Azure AD single sign-on with Meta Networks Connector, perform the f
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
- b. Azure AD Identifier
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
- c. Logout URL
+### Assign the Azure AD test user
-### Configure Meta Networks Connector Single Sign-On
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Meta Networks Connector.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Meta Networks Connector**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Meta Networks Connector SSO
1. Open a new tab in your browser and log in to your Meta Networks Connector administrator account.
To configure Azure AD single sign-on with Meta Networks Connector, perform the f
2. Go to **Administrator** and select **Settings**.
- ![Screenshot shows Settings selected from the Administration menu.](./media/metanetworksconnector-tutorial/configure3.png)
+ ![Screenshot shows Settings selected from the Administration menu.](./media/metanetworksconnector-tutorial/menu.png)
3. Make sure **Log Internet Traffic** and **Force VPN MFA** are set to off.
- ![Screenshot shows turning off these settings.](./media/metanetworksconnector-tutorial/configure1.png)
+ ![Screenshot shows turning off these settings.](./media/metanetworksconnector-tutorial/settings.png)
4. Go to **Administrator** and select **SAML**.
- ![Screenshot shows SAML selected from the Administration menu.](./media/metanetworksconnector-tutorial/configure4.png)
+ ![Screenshot shows SAML selected from the Administration menu.](./media/metanetworksconnector-tutorial/admin.png)
5. Perform the following steps on the **DETAILS** page:
- ![Screenshot shows the DETAILS page where you can enter the values described.](./media/metanetworksconnector-tutorial/configure2.png)
+ ![Screenshot shows the DETAILS page where you can enter the values described.](./media/metanetworksconnector-tutorial/details.png)
a. Copy **SSO URL** value and paste it into the **Sign-In URL** textbox in the **Meta Networks Connector Domain and URLs** section.
To configure Azure AD single sign-on with Meta Networks Connector, perform the f
c. Copy **Audience URI (SP Entity ID)** value and paste it into the **Identifier (Entity ID)** textbox in the **Meta Networks Connector Domain and URLs** section.
- d. Enable the SAML
+ d. Enable the SAML.
-6. On the **GENERAL** tab. perform the following steps:
+6. On the **GENERAL** tab, perform the following steps:
- ![Screenshot shows the GENERAL page where you can enter the values described.](./media/metanetworksconnector-tutorial/configure5.png)
+ ![Screenshot shows the GENERAL page where you can enter the values described.](./media/metanetworksconnector-tutorial/configuration.png)
a. In the **Identity Provider Single Sign-On URL**, paste the **Login URL** value which you have copied from the Azure portal.
To configure Azure AD single sign-on with Meta Networks Connector, perform the f
d. Enable the **Just-in-Time Provisioning**.
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field, enter **BrittaSimon**.
-
- b. In the **User name** field, type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Meta Networks Connector.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Meta Networks Connector**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Meta Networks Connector**.
-
- ![The Meta Networks Connector link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog, click the **Assign** button.
- ### Create Meta Networks Connector test user In this section, a user called Britta Simon is created in Meta Networks Connector. Meta Networks Connector supports just-in-time provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Meta Networks Connector, a new one is created when you attempt to access Meta Networks Connector.
In this section, a user called Britta Simon is created in Meta Networks Connecto
>[!Note] >If you need to create a user manually, contact [Meta Networks Connector Client support team](mailto:support@metanetworks.com).
-### Test single sign-on
+Meta Networks also supports automatic user provisioning. You can find more details [here](./meta-networks-connector-provisioning-tutorial.md) on how to configure automatic user provisioning.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in Azure portal. This will redirect to Meta Networks Connector Sign on URL where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to Meta Networks Connector Sign-on URL directly and initiate the login flow from there.
-When you click the Meta Networks Connector tile in the Access Panel, you should be automatically signed in to the Meta Networks Connector for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the Meta Networks Connector for which you set up the SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Meta Networks Connector tile in the My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you are automatically signed in to the Meta Networks Connector for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
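+
+For a rough scripted smoke test of the SP-initiated flow, you can check where the Meta Networks sign-on URL sends an unauthenticated request. Treat this as a best-effort probe only: the organization short name below is an assumption, and some sign-in pages redirect via JavaScript rather than HTTP, in which case you will only see the login page itself.
+
+```python
+import requests
+
+# Hypothetical organization short name -- substitute your own.
+sign_on_url = "https://contoso.metanetworks.com/login"
+
+resp = requests.get(sign_on_url, allow_redirects=True, timeout=10)
+print(resp.status_code)
+print(resp.url)  # after SSO is enabled, look for a hand-off toward login.microsoftonline.com
+```
+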
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Meta Networks Connector, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Mindtickle Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/mindtickle-tutorial.md
Previously updated : 01/23/2019 Last updated : 08/24/2021 # Tutorial: Azure Active Directory integration with MindTickle
-In this tutorial, you learn how to integrate MindTickle with Azure Active Directory (Azure AD).
-Integrating MindTickle with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate MindTickle with Azure Active Directory (Azure AD). When you integrate MindTickle with Azure AD, you can:
-* You can control in Azure AD who has access to MindTickle.
-* You can enable your users to be automatically signed-in to MindTickle (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to MindTickle.
+* Enable your users to be automatically signed-in to MindTickle with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with MindTickle, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* MindTickle single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* MindTickle single sign-on (SSO) enabled subscription.
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* MindTickle supports **SP** initiated SSO
-
-* MindTickle supports **Just In Time** user provisioning
-
-## Adding MindTickle from the gallery
-
-To configure the integration of MindTickle into Azure AD, you need to add MindTickle from the gallery to your list of managed SaaS apps.
-
-**To add MindTickle from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
+* MindTickle supports **SP** initiated SSO.
-3. To add new application, click **New application** button on the top of dialog.
+* MindTickle supports **Just In Time** user provisioning.
- ![The New application button](common/add-new-app.png)
+* MindTickle supports [Automated user provisioning](mindtickle-provisioning-tutorial.md).
-4. In the search box, type **MindTickle**, select **MindTickle** from result panel then click **Add** button to add the application.
+## Add MindTickle from the gallery
- ![MindTickle in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with MindTickle based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in MindTickle needs to be established.
-
-To configure and test Azure AD single sign-on with MindTickle, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure MindTickle Single Sign-On](#configure-mindtickle-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create MindTickle test user](#create-mindtickle-test-user)** - to have a counterpart of Britta Simon in MindTickle that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of MindTickle into Azure AD, you need to add MindTickle from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **MindTickle** in the search box.
+1. Select **MindTickle** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for MindTickle
-To configure Azure AD single sign-on with MindTickle, perform the following steps:
+Configure and test Azure AD SSO with MindTickle using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in MindTickle.
-1. In the [Azure portal](https://portal.azure.com/), on the **MindTickle** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with MindTickle, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure MindTickle SSO](#configure-mindtickle-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create MindTickle test user](#create-mindtickle-test-user)** - to have a counterpart of B.Simon in MindTickle that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **MindTickle** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, if you have **Service Provider metadata file**, perform the following steps:
To configure Azure AD single sign-on with MindTickle, perform the following step
![choose metadata file](common/browse-upload-metadata.png)
- c. After the metadata file is successfully uploaded, the **Identifier** value gets auto populated in **Basic SAML Configuration** section:
-
- ![MindTickle Domain and URLs single sign-on information](common/sp-identifier.png)
+ c. After the metadata file is successfully uploaded, the **Identifier** value gets auto populated in **Basic SAML Configuration** section.
In the **Sign-on URL** text box, type a URL using the following pattern: `https://<subdomain>.mindtickle.com` > [!Note]
- > If the **Identifier** value does not get auto polulated, then please fill in the value manually according to your requirement. The Sign-on URL value is not real. Update the value with the actual Sign-on URL. Contact [MindTickle support team](mailto:support@mindtickle.com) to get this value.
+ > If the **Identifier** value does not get auto populated, then please fill in the value manually according to your requirement. The Sign-on URL value is not real. Update the value with the actual Sign-on URL. Contact [MindTickle support team](mailto:support@mindtickle.com) to get this value.
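+
+   If you're curious what drives the auto-population, the **Identifier** comes from the `entityID` attribute in the service provider metadata file. A minimal Python sketch for inspecting it (the file name is an assumption):
+
+   ```python
+   import xml.etree.ElementTree as ET
+
+   # Hypothetical file name -- the SP metadata file you received from MindTickle.
+   root = ET.parse("mindtickle-sp-metadata.xml").getroot()
+
+   # The root element is typically an EntityDescriptor; its entityID attribute
+   # is the value Azure AD uses for 'Identifier (Entity ID)'.
+   print(root.attrib.get("entityID"))
+   ```
+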
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Federation Metadata XML** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with MindTickle, perform the following step
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure Ad Identifier
-
- c. Logout URL
-
-### Configure MindTickle Single Sign-On
-
-To configure single sign-on on **MindTickle** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [MindTickle support team](mailto:support@mindtickle.com). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to MindTickle.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to MindTickle.
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **MindTickle**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **MindTickle**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![Enterprise applications blade](common/enterprise-applications.png)
+## Configure MindTickle SSO
-2. In the applications list, select **MindTickle**.
-
- ![The MindTickle link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on the **MindTickle** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from the Azure portal to the [MindTickle support team](mailto:support@mindtickle.com). They configure this setting so that the SAML SSO connection is set properly on both sides.
### Create MindTickle test user In this section, a user called Britta Simon is created in MindTickle. MindTickle supports **just-in-time user provisioning**, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in MindTickle, a new one is created after authentication.
-### Test single sign-on
+MindTickle also supports automatic user provisioning. You can find more details [here](./mindtickle-provisioning-tutorial.md) on how to configure automatic user provisioning.
+
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the MindTickle tile in the Access Panel, you should be automatically signed in to the MindTickle for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in Azure portal. This will redirect to MindTickle Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to MindTickle Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the MindTickle tile in the My Apps, this will redirect to MindTickle Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure MindTickle, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Mixpanel Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/mixpanel-tutorial.md
Previously updated : 07/23/2021 Last updated : 08/20/2021 # Tutorial: Azure Active Directory integration with Mixpanel
In this tutorial, you configure and test Azure AD single sign-on in a test envir
* Mixpanel supports **SP** initiated SSO.
+* Mixpanel supports [Automated user provisioning](mixpanel-provisioning-tutorial.md).
+ > [!NOTE] > Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
The objective of this section is to create a user called Britta Simon in Mixpane
> [!Note] > The user will get an email to set up the profile.
+> [!NOTE]
+> Mixpanel also supports automatic user provisioning. You can find more details [here](./mixpanel-provisioning-tutorial.md) on how to configure automatic user provisioning.
+ ## Test SSO In this section, you test your Azure AD single sign-on configuration with the following options.
active-directory Nulab Pass Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/nulab-pass-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Nulab Pass (Backlog,Cacoo,Typetalk) | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Nulab Pass (Backlog,Cacoo,Typetalk).
++++++++ Last updated : 08/24/2021++++
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Nulab Pass (Backlog,Cacoo,Typetalk)
+
+In this tutorial, you'll learn how to integrate Nulab Pass (Backlog,Cacoo,Typetalk) with Azure Active Directory (Azure AD). When you integrate Nulab Pass (Backlog,Cacoo,Typetalk) with Azure AD, you can:
+
+* Control in Azure AD who has access to Nulab Pass (Backlog,Cacoo,Typetalk).
+* Enable your users to be automatically signed-in to Nulab Pass (Backlog,Cacoo,Typetalk) with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Nulab Pass (Backlog,Cacoo,Typetalk) single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Nulab Pass (Backlog,Cacoo,Typetalk) supports **SP and IDP** initiated SSO.
+
+## Add Nulab Pass (Backlog,Cacoo,Typetalk) from the gallery
+
+To configure the integration of Nulab Pass (Backlog,Cacoo,Typetalk) into Azure AD, you need to add Nulab Pass (Backlog,Cacoo,Typetalk) from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Nulab Pass (Backlog,Cacoo,Typetalk)** in the search box.
+1. Select **Nulab Pass (Backlog,Cacoo,Typetalk)** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Nulab Pass (Backlog,Cacoo,Typetalk)
+
+Configure and test Azure AD SSO with Nulab Pass (Backlog,Cacoo,Typetalk) using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Nulab Pass (Backlog,Cacoo,Typetalk).
+
+To configure and test Azure AD SSO with Nulab Pass (Backlog,Cacoo,Typetalk), perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Nulab Pass SSO](#configure-nulab-pass-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create Nulab Pass test user](#create-nulab-pass-test-user)** - to have a counterpart of B.Simon in Nulab Pass (Backlog,Cacoo,Typetalk) that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Nulab Pass (Backlog,Cacoo,Typetalk)** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
+
+ a. In the **Identifier** text box, type a URL using the following pattern:
+ `https://apps.nulab.com/signin/spaces/<Space Key>/saml`
+
+ b. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://apps.nulab.com/signin/spaces/<Space Key>/saml/callback`
+
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://apps.nulab.com/signin/spaces/<INSTANCE_NAME>`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [Nulab Pass (Backlog,Cacoo,Typetalk) Client support team](mailto:support@apps.nulab.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. Your Nulab Pass (Backlog,Cacoo,Typetalk) application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows an example of this. The default value of **Unique User Identifier** is **user.userprincipalname**, but Nulab Pass (Backlog,Cacoo,Typetalk) expects this to be mapped with the user's email address. For that, you can use the **user.mail** attribute from the list, or use the appropriate attribute value based on your organization's configuration.
+
+ ![image](common/default-attributes.png)
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/certificatebase64.png)
+
+1. On the **Set up Nulab Pass (Backlog,Cacoo,Typetalk)** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
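If you prefer to script this step, the same test user can be created with the Azure CLI. This is a minimal sketch only; the display name, user principal name, and password below are placeholder values, not values required by this tutorial.

```azurecli
# Create the B.Simon test user (replace the domain and password with your own values).
az ad user create \
    --display-name "B.Simon" \
    --user-principal-name "B.Simon@contoso.com" \
    --password "<your-strong-password>"
```
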
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Nulab Pass (Backlog,Cacoo,Typetalk).
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Nulab Pass (Backlog,Cacoo,Typetalk)**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Nulab Pass SSO
+
+To configure single sign-on on the **Nulab Pass (Backlog,Cacoo,Typetalk)** side, you need to send the downloaded **Certificate (Base64)** and the appropriate copied URLs from the Azure portal to the [Nulab Pass (Backlog,Cacoo,Typetalk) support team](mailto:support@apps.nulab.com). They configure this setting to have the SAML SSO connection set properly on both sides.
+
+### Create Nulab Pass test user
+
+In this section, you create a user called Britta Simon in Nulab Pass (Backlog,Cacoo,Typetalk). Work with [Nulab Pass (Backlog,Cacoo,Typetalk) support team](mailto:support@apps.nulab.com) to add the users in the Nulab Pass (Backlog,Cacoo,Typetalk) platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect to the Nulab Pass (Backlog,Cacoo,Typetalk) Sign-on URL, where you can initiate the login flow.
+
+* Go to the Nulab Pass (Backlog,Cacoo,Typetalk) Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the Nulab Pass (Backlog,Cacoo,Typetalk) instance for which you set up the SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the Nulab Pass (Backlog,Cacoo,Typetalk) tile in My Apps, you are redirected to the application sign-on page to initiate the login flow if the app is configured in SP mode, or you are automatically signed in to the Nulab Pass (Backlog,Cacoo,Typetalk) instance for which you set up the SSO if it's configured in IDP mode. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Nulab Pass (Backlog,Cacoo,Typetalk), you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Qiita Team Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/qiita-team-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Qiita Team | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Qiita Team.
++++++++ Last updated : 08/24/2021++++
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Qiita Team
+
+In this tutorial, you'll learn how to integrate Qiita Team with Azure Active Directory (Azure AD). When you integrate Qiita Team with Azure AD, you can:
+
+* Control in Azure AD who has access to Qiita Team.
+* Enable your users to be automatically signed-in to Qiita Team with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Qiita Team single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Qiita Team supports **IDP** initiated SSO.
+
+* Qiita Team supports **Just In Time** user provisioning.
+
+## Add Qiita Team from the gallery
+
+To configure the integration of Qiita Team into Azure AD, you need to add Qiita Team from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Qiita Team** in the search box.
+1. Select **Qiita Team** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Qiita Team
+
+Configure and test Azure AD SSO with Qiita Team using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Qiita Team.
+
+To configure and test Azure AD SSO with Qiita Team, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Qiita Team SSO](#configure-qiita-team-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create Qiita Team test user](#create-qiita-team-test-user)** - to have a counterpart of B.Simon in Qiita Team that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Qiita Team** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** text box, type a URL using the following pattern:
+ `https://<SUBDOMAIN>.qiita.com/saml/metadata`
+
+ b. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://<SUBDOMAIN>.qiita.com/saml/consume`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier and Reply URL. Contact [Qiita Team Client support team](mailto:engineers+team@qiita.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. The Qiita Team application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![image](common/default-attributes.png)
+
+1. In addition to the above, the Qiita Team application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also pre-populated, but you can review them as per your requirements.
+
+ | Name | Source Attribute|
+ | | |
+ | User.FirstName | user.surname |
+ | User.LastName | user.givenname |
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/metadataxml.png)
+
+1. On the **Set up Qiita Team** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Qiita Team.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Qiita Team**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Qiita Team SSO
+
+To configure single sign-on on the **Qiita Team** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [Qiita Team support team](mailto:engineers+team@qiita.com). They configure this setting to have the SAML SSO connection set properly on both sides.
+
+### Create Qiita Team test user
+
+In this section, a user called Britta Simon is created in Qiita Team. Qiita Team supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Qiita Team, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the Qiita Team instance for which you set up the SSO.
+
+* You can use Microsoft My Apps. When you click the Qiita Team tile in My Apps, you should be automatically signed in to the Qiita Team instance for which you set up the SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Qiita Team, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Uniflow Online Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/uniflow-online-tutorial.md
Previously updated : 02/04/2021 Last updated : 08/26/2021
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* uniFLOW Online supports **SP** initiated SSO
+* uniFLOW Online supports **SP** initiated SSO.
## Add uniFLOW Online from the gallery
Follow these steps to enable Azure AD SSO in the Azure portal.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, perform the following steps:
- a. In the **Sign on URL** text box, type a URL using one of the following patterns:
+ a. In the **Identifier (Entity ID)** text box, type a URL using one of the following patterns:
- - `https://<tenant_domain_name>.eu.uniflowonline.com`
- - `https://<tenant_domain_name>.us.uniflowonline.com`
- - `https://<tenant_domain_name>.sg.uniflowonline.com`
- - `https://<tenant_domain_name>.jp.uniflowonline.com`
- - `https://<tenant_domain_name>.au.uniflowonline.com`
+ | **Identifier** |
+ ||
+ | `https://<tenant_domain_name>.eu.uniflowonline.com` |
+ | `https://<tenant_domain_name>.us.uniflowonline.com` |
+ | `https://<tenant_domain_name>.sg.uniflowonline.com` |
+ | `https://<tenant_domain_name>.jp.uniflowonline.com` |
+ | `https://<tenant_domain_name>.au.uniflowonline.com` |
- b. In the **Identifier (Entity ID)** text box, type a URL using one of the following patterns:
+ b. In the **Sign on URL** text box, type a URL using one of the following patterns:
- - `https://<tenant_domain_name>.eu.uniflowonline.com`
- - `https://<tenant_domain_name>.us.uniflowonline.com`
- - `https://<tenant_domain_name>.sg.uniflowonline.com`
- - `https://<tenant_domain_name>.jp.uniflowonline.com`
- - `https://<tenant_domain_name>.au.uniflowonline.com`
+ | **Sign on URL** |
+ ||
+ | `https://<tenant_domain_name>.eu.uniflowonline.com` |
+ | `https://<tenant_domain_name>.us.uniflowonline.com` |
+ | `https://<tenant_domain_name>.sg.uniflowonline.com` |
+ | `https://<tenant_domain_name>.jp.uniflowonline.com` |
+ | `https://<tenant_domain_name>.au.uniflowonline.com` |
> [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [uniFLOW Online Client support team](mailto:support@nt-ware.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal or refer to the reply URL displayed in your uniFLOW Online tenant.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [uniFLOW Online Client support team](mailto:support@nt-ware.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal or refer to the reply URL displayed in your uniFLOW Online tenant.
-1. uniFLOW Online application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+1. The uniFLOW Online application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes, where **nameidentifier** is mapped with **user.userprincipalname**. The uniFLOW Online application expects **nameidentifier** to be mapped with **user.objectid**, so you need to edit the attribute mapping by clicking the **Edit** icon and changing the attribute mapping.
- ![image](common/default-attributes.png)
+ ![Screenshot shows the User Attributes pane with the edit icon highlighted.](common/edit-attribute.png)
1. In addition to the above, the uniFLOW Online application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also pre-populated, but you can review them as per your requirements.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. From the left navigation panel, select **User** tab.
- ![Screenshot shows User selected from the uniflow Online site.](./media/uniflow-online-tutorial/configure-1.png)
+ ![Screenshot shows User selected from the uniflow Online site.](./media/uniflow-online-tutorial/user.png)
1. Click **Identity providers**.
- ![Screenshot shows Identity Providers selected.](./media/uniflow-online-tutorial/configure-2.png)
+ ![Screenshot shows Identity Providers selected.](./media/uniflow-online-tutorial/profile.png)
1. Click on **Add identity provider**.
- ![Screenshot shows Add identity provider selected.](./media/uniflow-online-tutorial/configure-3.png)
+ ![Screenshot shows Add identity provider selected.](./media/uniflow-online-tutorial/add-profile.png)
1. On the **ADD IDENTITY PROVIDER** section, perform the following steps:
- ![Screenshot shows the ADD IDENTITY PROVIDER section where you can enter the values described.](./media/uniflow-online-tutorial/configure-4.png)
+ ![Screenshot shows the ADD IDENTITY PROVIDER section where you can enter the values described.](./media/uniflow-online-tutorial/configuration.png)
- a. Enter the Display name Ex: *AzureAD SSO*.
+ a. Enter the Display name, for example **AzureAD SSO**.
b. For **Provider type**, select **WS-Fed** option from the dropdown.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. On the **General** tab, perform the following steps:
- ![Screenshot shows the General tab where you can enter the values described.](./media/uniflow-online-tutorial/configure-5.png)
+ ![Screenshot shows the General tab where you can enter the values described.](./media/uniflow-online-tutorial/general-tab.png)
- a. Enter the Display name Ex: *AzureAD SSO*.
+ a. Enter the Display name, for example **AzureAD SSO**.
b. Select the **From URL** option for the **ADFS Federation Metadata**.
- c. In the **Federation Metadata URl** textbox, paste the **App Federation Metadata Url** value, which you have copied from the Azure portal.
+ c. In the **Federation Metadata URL** textbox, paste the **App Federation Metadata Url** value, which you have copied from the Azure portal.
d. Select **Identity provider** as **Enabled**.
active-directory Zero Networks Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/zero-networks-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Zero Networks | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Zero Networks.
++++++++ Last updated : 08/26/2021++++
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Zero Networks
+
+In this tutorial, you'll learn how to integrate Zero Networks with Azure Active Directory (Azure AD). When you integrate Zero Networks with Azure AD, you can:
+
+* Control in Azure AD who has access to Zero Networks.
+* Enable your users to be automatically signed-in to Zero Networks with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Zero Networks single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Zero Networks supports **SP** initiated SSO.
+
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
+
+## Add Zero Networks from the gallery
+
+To configure the integration of Zero Networks into Azure AD, you need to add Zero Networks from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Zero Networks** in the search box.
+1. Select **Zero Networks** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Zero Networks
+
+Configure and test Azure AD SSO with Zero Networks using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Zero Networks.
+
+To configure and test Azure AD SSO with Zero Networks, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Zero Networks SSO](#configure-zero-networks-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create Zero Networks test user](#create-zero-networks-test-user)** - to have a counterpart of B.Simon in Zero Networks that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Zero Networks** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, perform the following step.
+
+ a. In the **Sign on URL** text box, type the URL:
+ `https://portal.zeronetworks.com/#/login`
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/certificatebase64.png)
+
+1. On the **Set up Zero Networks** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Zero Networks.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Zero Networks**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Zero Networks SSO
+
+1. Log in to your Zero Networks company site as an administrator.
+
+1. Navigate to **Settings** > **Identity Providers**.
+
+1. Click on **Microsoft Azure** and perform the following steps.
+
+ ![Screenshot shows settings of SSO configuration.](./media/zero-networks-tutorial/settings.png "Account")
+
+ 1. Copy the **Identifier (Entity ID)** value and paste it into the **Identifier** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. Copy the **Reply URL (Assertion Consumer Service URL)** value and paste it into the **Reply URL** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. In the **Login URL** textbox, paste the **Login URL** value which you have copied from the Azure portal.
+
+ 1. In the **Logout URL** textbox, paste the **Logout URL** value which you have copied from the Azure portal.
+
+ 1. Open the downloaded **Certificate (Base64)** file from the Azure portal in Notepad and paste its content into the **Certificate (Base64)** textbox.
+
+ 1. Click **Save**.
+
+### Create Zero Networks test user
+
+In this section, you create a user called Britta Simon in Zero Networks. Work with [Zero Networks support team](mailto:support@zeronetworks.com) to add the users in the Zero Networks platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click on **Test this application** in the Azure portal. This will redirect to the Zero Networks Sign-on URL, where you can initiate the login flow.
+
+* Go to the Zero Networks Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the Zero Networks tile in My Apps, this will redirect to the Zero Networks Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Zero Networks, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
automation Automation Hrw Run Runbooks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/automation-hrw-run-runbooks.md
Title: Run Azure Automation runbooks on a Hybrid Runbook Worker
description: This article describes how to run runbooks on machines in your local datacenter or other cloud provider with the Hybrid Runbook Worker. Previously updated : 07/27/2021 Last updated : 08/12/2021
Enabling the Azure Firewall on [Azure Storage](../storage/common/storage-network
Azure Automation handles jobs on Hybrid Runbook Workers differently from jobs run in Azure sandboxes. If you have a long-running runbook, make sure that it's resilient to possible restart. For details of the job behavior, see [Hybrid Runbook Worker jobs](automation-hybrid-runbook-worker.md#hybrid-runbook-worker-jobs).
-Jobs for Hybrid Runbook Workers run under the local **System** account on Windows, or the **nxautomation** account on Linux. For Linux, verify the **nxautomation** account has access to the location where the runbook modules are stored. When you use the [Install-Module](/powershell/module/powershellget/install-module) cmdlet, be sure to specify AllUsers for the `Scope` parameter to ensure that the **nxautomation** account has access. For more information on PowerShell on Linux, see [Known Issues for PowerShell on Non-Windows Platforms](/powershell/scripting/whats-new/what-s-new-in-powershell-70).
+Jobs for Hybrid Runbook Workers run under the local **System** account on Windows, or the **nxautomation** account on Linux. For Linux, verify the **nxautomation** account has access to the location where the runbook modules are stored. To ensure **nxautomation** account access:
+
+- When you use the [Install-Module](/powershell/module/powershellget/install-module) cmdlet, be sure to specify `AllUsers` for the `Scope` parameter.
+- When you use `pip install`, `apt install`, or another method for installing packages on Linux, ensure the package is installed for all users, for example `sudo -H pip install <package_name>` (see the sketch after this list).
+
+For more information on PowerShell on Linux, see [Known Issues for PowerShell on Non-Windows Platforms](/powershell/scripting/whats-new/what-s-new-in-powershell-70).
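For illustration, the following commands show one way to install a package for all users on a Linux Hybrid Runbook Worker; the package names are examples only, not modules required by Azure Automation.

```bash
# Install a Python package system-wide so that the nxautomation account can import it.
sudo -H pip install requests

# Install a distribution package for all users (Debian/Ubuntu example).
sudo apt-get install -y python3-requests
```
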
## Configure runbook permissions
Once you have configured signature validation, use the following GPG command to
gpg --clear-sign <runbook name> ```
-The signed runbook is called **<runbook name>.asc**.
+The signed runbook is called **\<runbook name>.asc**.
You can now upload the signed runbook to Azure Automation and execute it like a regular runbook.
automation Python 3 Packages https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/python-3-packages.md
Title: Manage Python 3 packages in Azure Automation
description: This article tells how to manage Python 3 packages (preview) in Azure Automation. Previously updated : 06/22/2021 Last updated : 08/13/2021
for group in groups:
print(group.name) ```
+> [!NOTE]
+> The Python `automationassets` package is not available on pypi.org, so it's not available for import onto a Windows machine.
+ ## Next steps To prepare a Python runbook, see [Create a Python runbook](learn/automation-tutorial-runbook-textual-python-3.md).
automation Python Packages https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/python-packages.md
Title: Manage Python 2 packages in Azure Automation
description: This article tells how to manage Python 2 packages in Azure Automation. Previously updated : 12/17/2020 Last updated : 08/13/2021 + # Manage Python 2 packages in Azure Automation Azure Automation allows you to run Python 2 runbooks on Azure and on Linux Hybrid Runbook Workers. To help in simplification of runbooks, you can use Python packages to import the modules that you need. This article describes how to manage and use Python packages in Azure Automation.
for group in groups:
print group.name ```
+> [!NOTE]
+> The Python `automationassets` package is not available on pypi.org, so it's not available for import onto a Windows machine.
+ ## Develop and test runbooks offline To develop and test your Python 2 runbooks offline, you can use the [Azure Automation Python emulated assets](https://github.com/azureautomation/python_emulated_assets) module on GitHub. This module allows you to reference your shared resources such as credentials, variables, connections, and certificates.
azure-app-configuration Howto Move Resource Between Regions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-app-configuration/howto-move-resource-between-regions.md
+
+ Title: Move an App Configuration store to another region
+description: Learn how to move an App Configuration store to a different region.
++++ Last updated : 8/23/2021+
+#Customer intent: I want to move my App Configuration resource from one Azure region to another.
+++
+# Move an App Configuration store to another region
+
+App Configuration stores are region-specific and can't be moved across regions automatically. You must create a new App Configuration store in the target region, then move your content from the source store to the new target store. You might move your configuration to another region for a number of reasons. For example, to take advantage of a new Azure region with Availability Zone support, to deploy features or services available in specific regions only, or to meet internal policy and governance requirements.
+
+The following steps walk you through the process of creating a new target store and exporting your current store to the new region.
+
+## Design considerations
+
+Before you begin, keep in mind the following concepts:
+
+* Configuration store names are globally unique.
+* You need to reconfigure your access policies and network configuration settings in the new configuration store.
+
+## Create the target configuration store
+
+### [Portal](#tab/portal)
+To create a new App Configuration store in the Portal, follow these steps:
+1. Sign in to the [Azure portal](https://portal.azure.com). In the upper-left corner of the home page, select **Create a resource**. In the **Search the Marketplace** box, enter *App Configuration* and select <kbd>Enter</kbd>.
+
+ ![Search for App Configuration](../../includes/media/azure-app-configuration-create/azure-portal-search.png)
+1. Select **App Configuration** from the search results, and then select **Create**.
+
+ ![Select Create](../../includes/media/azure-app-configuration-create/azure-portal-app-configuration-create.png)
+1. On the **Create App Configuration** pane, enter the following settings:
+
+ | Setting | Suggested value | Description |
+ ||||
+ | **Subscription** | Your subscription | Select the Azure subscription of your original store |
+ | **Resource group** | Your resource group | Select the Azure resource group of your original store |
+ | **Resource name** | Globally unique name | Enter a unique resource name to use for the target App Configuration store. This cannot be the same name as the previous configuration store. |
+ | **Location** | Your target Location | Select the target region you want to move your configuration store to. |
+ | **Pricing tier** | *Standard* | Select the desired pricing tier. For more information, see the [App Configuration pricing page](https://azure.microsoft.com/pricing/details/app-configuration). |
+1. Select **Review + create** to validate your settings.
+1. Select **Create**. The deployment might take a few minutes.
+1. Once the resource has been deployed, recreate the access policies and network configuration settings of your source store. These will not be transferred with the configuration. This can include managed identities, virtual networks, and public network access.
+
+#### [Azure CLI](#tab/azcli)
+To create a new App Configuration store in the CLI, follow these steps:
+1. Log in to the Azure CLI with your credentials.
+ ```azurecli
+ az login
+ ```
+1. Create a new configuration store with the `create` command,
+ ```azurecli
+ az appconfig create -g MyResourceGroup -n MyResourceName -l targetlocation --sku Standard
+ ```
+ and enter the following settings:
+
+ | Setting | Suggested value | Description |
+ ||||
+ | **Resource group** | Your resource group | Select the Azure resource group of your original store |
+ | **Resource name** | Globally unique name | Enter a unique resource name to use for the target App Configuration store. This cannot be the same name as the previous configuration store. |
+ | **Location** | Your target Location | Select the target region you want to move your configuration store to. |
+ | **Sku** | *Standard* | Select the desired pricing tier. For more information, see the [App Configuration pricing page](https://azure.microsoft.com/pricing/details/app-configuration). |
+1. The deployment might take a few minutes. Once it is complete, recreate the access policies and network configuration settings of your source store. These will not be transferred with the configuration values. This can include managed identities, virtual networks, and public network access (see the sketch below). For more information, reference the [CLI documentation](./cli-samples.md).
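    For example, the following Azure CLI commands sketch how a couple of these settings might be recreated on the target store. The resource names match the earlier example, and which settings you need depends on how your source store was configured.
    ```azurecli
    # Enable a system-assigned managed identity on the new store.
    az appconfig identity assign -g MyResourceGroup -n MyResourceName

    # Disable public network access if the source store was restricted to private endpoints.
    az appconfig update -g MyResourceGroup -n MyResourceName --enable-public-network false
    ```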
++
+## Transfer your configuration key-values
+
+### [Portal](#tab/portal)
+Follow these steps to export your configuration to the target store using the Portal:
+1. Navigate to your source configuration store in the [Azure portal](https://portal.azure.com) and select **Import/Export** under **Operations**.
+1. Select **Export** and choose **App Configuration** in the **Target Service** dropdown.
+ ![Export to another configuration store](media/export-to-config-store.png)
+1. Click on **Select Resource** and enter your **Subscription** and **Resource group**. The **Resource** is the name of the target configuration store you created previously.
+1. Select **Apply** to verify your target configuration store.
+1. Leave the from label, time, and label fields at their default values and select **Apply**.
+1. To verify that your configurations have been successfully transferred from your source to your target store, navigate to your target configuration store in the portal. Select **Configuration Explorer** under **Operations** and verify that it contains the same key-value pairs as those in your original store.
+ > [!NOTE]
+ > This process only allows for configuration key-values to be exported by one label at a time. To export multiple, repeat steps 2-5 for each label.
+
+### [Azure CLI](#tab/azcli)
+Follow these steps to export your configuration to the target store using the Azure CLI:
+1. In the Azure CLI, enter the following command that will export all of the values from your source configuration store to your target configuration store.
+ ```azurecli
+ az appconfig kv export -n SourceConfigurationStore -d appconfig --dest-name TargetConfigurationStore --key * --label * --preserve-labels
+ ```
+1. To verify that your configurations have been successfully transferred from your source to your target store, list all of the key values in your target store.
+ ```azurecli
+ az appconfig kv list -n TargetAppConfiguration --all
+ ```
+
+## Delete your source configuration store
+
+If the configuration has been transferred to the target store, you can choose to delete your source configuration store.
+
+### [Portal](#tab/portal)
+Follow these steps to delete your source configuration store in the Portal:
+1. Sign in to the [Azure portal](https://portal.azure.com), and select **Resource groups**.
+1. In the **Filter by name** box, enter the name of your resource group.
+1. In the result list, select the resource group name to see an overview.
+1. Select your source configuration store, and on the **Overview** blade, select **Delete**.
+1. When you're asked to confirm the deletion of the configuration store, select **Yes**.
+
+After a few moments, the source configuration store will have been deleted.
+
+### [Azure CLI](#tab/azcli)
+Follow these steps to delete your source configuration store in the Azure CLI:
+1. In the Azure CLI, run the following command:
+ ```azurecli
+ az appconfig delete -g ResourceGroupName -n SourceConfiguration
+ ```
+ Note that the **Resource Group** is the one associated with your source Configuration store.
+1. Deleting the source configuration store might take a few moments. You can verify that the operation was successful by listing all of the current configuration stores in your resource group.
+ ```azurecli
+ az appconfig list -g MyResourceGroup
+ ```
+ After a few moments, the source configuration store will have been deleted.
++
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Automatically back up key-values from Azure App Configuration stores](./howto-move-resource-between-regions.md)
+>[Azure App Configuration resiliency and disaster recovery](./concept-disaster-recovery.md)
azure-app-configuration Use Feature Flags Dotnet Core https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-app-configuration/use-feature-flags-dotnet-core.md
When a feature flag has multiple filters, the filter list is traversed in order
The feature manager supports *appsettings.json* as a configuration source for feature flags. The following example shows how to set up feature flags in a JSON file: ```JSON
-"FeatureManagement": {
- "FeatureA": true, // Feature flag set to on
- "FeatureB": false, // Feature flag set to off
- "FeatureC": {
- "EnabledFor": [
- {
- "Name": "Percentage",
- "Parameters": {
- "Value": 50
+{"FeatureManagement": {
+ "FeatureA": true, // Feature flag set to on
+ "FeatureB": false, // Feature flag set to off
+ "FeatureC": {
+ "EnabledFor": [
+ {
+ "Name": "Percentage",
+ "Parameters": {
+ "Value": 50
+ }
}
- }
- ]
+ ]
+ }
} } ```
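For context, the feature manager shown above comes from the .NET feature management library. If your project doesn't already reference it, the library is typically added as a NuGet package, for example:

```bash
dotnet add package Microsoft.FeatureManagement.AspNetCore
```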
azure-arc Quickstart Connect Cluster https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/quickstart-connect-cluster.md
In this quickstart, you'll learn the benefits of Azure Arc enabled Kubernetes an
| `https://login.microsoftonline.com`, `login.windows.net` (for Azure Cloud), `https://login.microsoftonline.us` (for Azure US Government) | Required to fetch and update Azure Resource Manager tokens. | | `https://mcr.microsoft.com` | Required to pull container images for Azure Arc agents. | | `https://gbl.his.arc.azure.com` | Required to get the regional endpoint for pulling system-assigned Managed Service Identity (MSI) certificates. |
-| `https://<region-code>.his.arc.azure.com` (for Azure Cloud), `https://usgv.his.arc.azure.us` (for Azure US Government) | Required to pull system-assigned Managed Service Identity (MSI) certificates. `<region-code>` mapping for Azure cloud regions: `eus` (East US), `weu` (West Europe), `wcus` (West Central US), `scus` (South Central US), `sea` (South East Asia), `uks` (UK South), `wus2` (West US 2), `ae` (Australia East), `eus2` (East US 2), `ne` (North Europe), `fc` (France Central). |
+| `https://*.his.arc.azure.com` (for Azure Cloud), `https://usgv.his.arc.azure.us` (for Azure US Government) | Required to pull system-assigned Managed Service Identity (MSI) certificates. |
|`*.servicebus.windows.net`, `guestnotificationservice.azure.com`, `*.guestnotificationservice.azure.com`, `sts.windows.net` | For [Cluster Connect](cluster-connect.md) and for [Custom Location](custom-locations.md) based scenarios. | ## 1. Register providers for Azure Arc enabled Kubernetes
eastus AzureArcTest1 microsoft.kubernetes/connectedclusters
-## 4. Verify cluster connection
-
-Run the following command:
-
-### [Azure CLI](#tab/azure-cli)
-
-```azurecli
-az connectedk8s list --resource-group AzureArcTest --output table
-```
-
-Output:
-<pre>
-Name Location ResourceGroup
-- -
-AzureArcTest1 eastus AzureArcTest
-</pre>
-
-### [Azure PowerShell](#tab/azure-powershell)
-
-```azurepowershell
-Get-AzConnectedKubernetes -ResourceGroupName AzureArcTest
-```
-
-Output:
-<pre>
-Location Name Type
- -
-eastus AzureArcTest1 microsoft.kubernetes/connectedclusters
-</pre>
---
-> [!NOTE]
-> After onboarding the cluster, it takes around 5 to 10 minutes for the cluster metadata (cluster version, agent version, number of nodes, etc.) to surface on the overview page of the Azure Arc enabled Kubernetes resource in Azure portal.
-
-## 5. Connect using an outbound proxy server
+## 4a. Connect using an outbound proxy server
### [Azure CLI](#tab/azure-cli)
If your cluster is behind an outbound proxy server, Azure CLI and the Azure Arc
az connectedk8s connect --name <cluster-name> --resource-group <resource-group> --proxy-https https://<proxy-server-ip-address>:<port> --proxy-http http://<proxy-server-ip-address>:<port> --proxy-skip-range <excludedIP>,<excludedCIDR> --proxy-cert <path-to-cert-file> ```
-> [!NOTE]
-> * Specify `excludedCIDR` under `--proxy-skip-range` to ensure in-cluster communication is not broken for the agents.
-> * `--proxy-http`, `--proxy-https`, and `--proxy-skip-range` are expected for most outbound proxy environments. `--proxy-cert` is *only* required if you need to inject trusted certificates expected by proxy into the trusted certificate store of agent pods.
+ > [!NOTE]
+ > * Some network requests such as the ones involving in-cluster service-to-service communication need to be separated from the traffic that is routed via the proxy server for outbound communication. The `--proxy-skip-range` parameter can be used to specify the CIDR range and endpoints in a comma-separated way so that any communication from the agents to these endpoints do not go via the outbound proxy. At a minimum, the CIDR range of the services in the cluster should be specified as value for this parameter. For example, let's say `kubectl get svc -A` returns a list of services where all the services have ClusterIP values in the range `10.0.0.0/16`. Then the value to specify for `--proxy-skip-range` is '10.0.0.0/16,kubernetes.default.svc'.
+ > * `--proxy-http`, `--proxy-https`, and `--proxy-skip-range` are expected for most outbound proxy environments. `--proxy-cert` is *only* required if you need to inject trusted certificates expected by proxy into the trusted certificate store of agent pods.
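As a filled-in illustration of the note above, a complete command might look like the following; the proxy address, cluster name, and resource group are placeholders.

```azurecli
az connectedk8s connect --name AzureArcTest1 --resource-group AzureArcTest \
    --proxy-https https://proxy.contoso.com:3128 \
    --proxy-http http://proxy.contoso.com:3128 \
    --proxy-skip-range 10.0.0.0/16,kubernetes.default.svc
```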
### [Azure PowerShell](#tab/azure-powershell)
If your cluster is behind an outbound proxy server, Azure PowerShell and the Azu
+## 5. Verify cluster connection
+
+Run the following command:
+
+### [Azure CLI](#tab/azure-cli)
+
+```azurecli
+az connectedk8s list --resource-group AzureArcTest --output table
+```
+
+Output:
+<pre>
+Name Location ResourceGroup
+- -
+AzureArcTest1 eastus AzureArcTest
+</pre>
+
+### [Azure PowerShell](#tab/azure-powershell)
+
+```azurepowershell
+Get-AzConnectedKubernetes -ResourceGroupName AzureArcTest
+```
+
+Output:
+<pre>
+Location Name Type
+-- - -
+eastus AzureArcTest1 microsoft.kubernetes/connectedclusters
+</pre>
+++
+> [!NOTE]
+> After onboarding the cluster, it takes around 5 to 10 minutes for the cluster metadata (cluster version, agent version, number of nodes, etc.) to surface on the overview page of the Azure Arc enabled Kubernetes resource in Azure portal.
+ ## 6. View Azure Arc agents for Kubernetes Azure Arc enabled Kubernetes deploys a few operators into the `azure-arc` namespace.
azure-arc Tutorial Use Gitops Connected Cluster https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/tutorial-use-gitops-connected-cluster.md
Just like private keys, you can provide your known_hosts content directly or in
>[!NOTE] >* Helm operator chart version 1.2.0+ supports the HTTPS Helm release private auth. >* HTTPS Helm release is not supported for AKS managed clusters.
->* If you need Flux to access the Git repository through your proxy, you will need to update the Azure Arc agents with the proxy settings. For more information, see [Connect using an outbound proxy server](./quickstart-connect-cluster.md#5-connect-using-an-outbound-proxy-server).
+>* If you need Flux to access the Git repository through your proxy, you will need to update the Azure Arc agents with the proxy settings. For more information, see [Connect using an outbound proxy server](./quickstart-connect-cluster.md#4a-connect-using-an-outbound-proxy-server).
## Additional Parameters
azure-arc Validation Program https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/validation-program.md
The following providers and their corresponding Kubernetes distributions have su
| Canonical | [Charmed Kubernetes](https://ubuntu.com/kubernetes) | [1.19](https://ubuntu.com/kubernetes/docs/1.19/components) | | SUSE Rancher | [Rancher Kubernetes Engine](https://rancher.com/products/rke/) | RKE CLI version: [v1.2.4](https://github.com/rancher/rke/releases/tag/v1.2.4); Kubernetes versions: [1.19.6](https://github.com/kubernetes/kubernetes/releases/tag/v1.19.6)), [1.18.14](https://github.com/kubernetes/kubernetes/releases/tag/v1.18.14)), [1.17.16](https://github.com/kubernetes/kubernetes/releases/tag/v1.17.16)) | | Nutanix | [Karbon](https://www.nutanix.com/products/karbon) | Version 2.2.1 |
+| Platform9 | [Platform9 Managed Kubernetes (PMK)](https://platform9.com/managed-kubernetes/) | PMK Version [5.3.0](https://platform9.com/docs/kubernetes/release-notes#platform9-managed-kubernetes-version-53-release-notes); Kubernetes versions: v1.20.5, v1.19.6, v1.18.10 |
The Azure Arc team also ran the conformance tests and validated Azure Arc enabled Kubernetes scenarios on the following public cloud providers:
azure-arc Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/overview.md
When you connect your machine to Azure Arc-enabled servers, it enables the abili
|**Govern** || | Azure Policy |Assign [Azure Policy guest configurations](../../governance/policy/concepts/guest-configuration.md) to audit settings inside the machine. To understand the cost of using Azure Policy Guest Configuration policies with Arc-enabled servers, see Azure Policy [pricing guide](https://azure.microsoft.com/pricing/details/azure-policy/)| |**Protect** ||
-| Azure Security Center | Protect non-Azure servers with [Microsoft Defender for Endpoint](/microsoft-365/security/endpoint-defender), included through [Azure Defender](../../security-center/defender-for-servers-introduction.md), for threat detection, for vulnerability management, and to proactively monitor for potential security threats. Azure Security Center presents the alerts and remediation suggestions from the threats detected. |
+| Azure Security Center | Protect non-Azure servers with [Microsoft Defender for Endpoint](/microsoft-365/security/defender-endpoint), included through [Azure Defender](../../security-center/defender-for-servers-introduction.md), for threat detection, for vulnerability management, and to proactively monitor for potential security threats. Azure Security Center presents the alerts and remediation suggestions from the threats detected. |
| Azure Sentinel | Machines connected to Arc-enabled servers can be [configured with Azure Sentinel](scenario-onboard-azure-sentinel.md) to collect security-related events and correlate them with other data sources. | |**Configure** || | Azure Automation |Assess configuration changes about installed software, Microsoft services, Windows registry and files, and Linux daemons using [Change Tracking and Inventory](../../automation/change-tracking/overview.md).<br> Use [Update Management](../../automation/update-management/overview.md) to manage operating system updates for your Windows and Linux servers. |
azure-functions Durable Functions Singletons https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/durable/durable-functions-singletons.md
async def main(req: func.HttpRequest, starter: str) -> func.HttpResponse:
existing_instance = await client.get_status(instance_id)
- if existing_instance is None or existing_instance.runtime_status in [df.OrchestrationRuntimeStatus.Completed, df.OrchestrationRuntimeStatus.Failed, df.OrchestrationRuntimeStatus.Terminated]:
+ if existing_instance.runtime_status in [df.OrchestrationRuntimeStatus.Completed, df.OrchestrationRuntimeStatus.Failed, df.OrchestrationRuntimeStatus.Terminated, None]:
event_data = req.get_body() instance_id = await client.start_new(function_name, instance_id, event_data) logging.info(f"Started orchestration with ID = '{instance_id}'.")
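For context, a fuller sketch of the HTTP-starter singleton pattern that the snippet above comes from is shown below. It assumes `instanceId` and `functionName` arrive as route parameters and otherwise follows the same status check; treat it as an illustration, not the canonical sample.

```python
import logging

import azure.functions as func
import azure.durable_functions as df


async def main(req: func.HttpRequest, starter: str) -> func.HttpResponse:
    client = df.DurableOrchestrationClient(starter)

    # Assumed for this sketch: the function route exposes these two parameters.
    instance_id = req.route_params["instanceId"]
    function_name = req.route_params["functionName"]

    existing_instance = await client.get_status(instance_id)
    if existing_instance.runtime_status in [
        df.OrchestrationRuntimeStatus.Completed,
        df.OrchestrationRuntimeStatus.Failed,
        df.OrchestrationRuntimeStatus.Terminated,
        None,
    ]:
        # No running instance with this ID, so it's safe to start a new one.
        event_data = req.get_body()
        instance_id = await client.start_new(function_name, instance_id, event_data)
        logging.info(f"Started orchestration with ID = '{instance_id}'.")
        return client.create_check_status_response(req, instance_id)
    else:
        # A singleton instance is already running; report a conflict instead of starting another.
        return func.HttpResponse(
            f"An instance with ID '{instance_id}' already exists.",
            status_code=409,
        )
```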
azure-functions Functions App Settings https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-app-settings.md
Only used when deploying to a Premium plan or to a Consumption plan running on W
## WEBSITE\_CONTENTOVERVNET
-A value of `1` enables your function app to scale when you have your storage account restricted to a virtual network. You should enable this setting when restricting your storage account to a virtual network. To learn more, see [Restrict your storage account to a virtual network](functions-networking-options.md#restrict-your-storage-account-to-a-virtual-network).
+A value of `1` enables your function app to scale when you have your storage account restricted to a virtual network. You should enable this setting when restricting your storage account to a virtual network. To learn more, see [Restrict your storage account to a virtual network](configure-networking-how-to.md#restrict-your-storage-account-to-a-virtual-network).
|Key|Sample value| |||
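For example, a minimal sketch of enabling this setting with the Azure CLI; the app and resource group names are placeholders.

```azurecli
az functionapp config appsettings set \
  --name <function-app-name> \
  --resource-group <resource-group-name> \
  --settings WEBSITE_CONTENTOVERVNET=1
```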
azure-functions Functions Premium Plan https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-premium-plan.md
And for plans with more than 4GB memory, ensure the Bitness Platform Setting is
## Region Max Scale Out
-Below are the currently supported maximum scale-out values for a single plan in each region and OS configuration. To request an increase, you can open a support ticket.
+Below are the currently supported maximum scale-out values for a single plan in each region and OS configuration.
See the complete regional availability of Functions on the [Azure web site](https://azure.microsoft.com/global-infrastructure/services/?products=functions).
azure-government Azure Services In Fedramp Auditscope https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/compliance/azure-services-in-fedramp-auditscope.md
This article provides a detailed list of Azure, Dynamics 365, Microsoft 365, and
| [Azure Red Hat OpenShift](https://azure.microsoft.com/services/openshift/) | &#x2705; | &#x2705; | | | [Azure Resource Graph](../../governance/resource-graph/overview.md) | &#x2705; | &#x2705; | | | [Azure Resource Manager](https://azure.microsoft.com/features/resource-manager/) | &#x2705; | &#x2705; | |
-| [Azure Scheduler](../../scheduler/scheduler-intro.md) | &#x2705; | &#x2705; | |
+| [Azure Scheduler](../../scheduler/scheduler-intro.md) (replaced by [Azure Logic Apps](https://azure.microsoft.com/services/logic-apps/)) | &#x2705; | &#x2705; | |
| [Azure Security Center](https://azure.microsoft.com/services/security-center/) | &#x2705; | &#x2705; | | | [Azure Service Fabric](https://azure.microsoft.com/services/service-fabric/) | &#x2705; | &#x2705; | | | [Azure Service Health](https://azure.microsoft.com/features/service-health/) | &#x2705; | &#x2705; | |
This article provides a detailed list of Azure, Dynamics 365, Microsoft 365, and
| [Azure Database Migration Service](https://azure.microsoft.com/services/database-migration/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | | | [Azure Databricks](https://azure.microsoft.com/services/databricks/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | | | [Azure DDoS Protection](https://azure.microsoft.com/services/ddos-protection/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
-| [Azure Dedicated HSM](https://azure.microsoft.com/services/azure-dedicated-hsm/) | &#x2705; | &#x2705; | &#x2705; | | |
+| [Azure Dedicated HSM](https://azure.microsoft.com/services/azure-dedicated-hsm/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
| [Azure Defender for IoT](https://azure.microsoft.com/services/azure-defender-for-iot/) (formerly Azure Security for IoT) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | | | **Service** | **FedRAMP High** | **DoD IL2** | **DoD IL4** | **DoD IL5** | **DoD IL6** | | [Azure DevTest Labs](https://azure.microsoft.com/services/devtest-lab/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
This article provides a detailed list of Azure, Dynamics 365, Microsoft 365, and
| [Azure Resource Graph](../../governance/resource-graph/overview.md) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | | | [Azure Resource Manager](https://azure.microsoft.com/features/resource-manager/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | &#x2705; | | **Service** | **FedRAMP High** | **DoD IL2** | **DoD IL4** | **DoD IL5** | **DoD IL6** |
-| [Azure Scheduler](../../scheduler/scheduler-intro.md) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
+| [Azure Scheduler](../../scheduler/scheduler-intro.md) (replaced by [Azure Logic Apps](https://azure.microsoft.com/services/logic-apps/)) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | |
| [Azure Security Center](https://azure.microsoft.com/services/security-center/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | | | [Azure Sentinel](https://azure.microsoft.com/services/azure-sentinel/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | | | [Azure Service Fabric](https://azure.microsoft.com/services/service-fabric/) | &#x2705; | &#x2705; | &#x2705; | &#x2705; | &#x2705; |
azure-government Documentation Government Csp List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/documentation-government-csp-list.md
Below you can find a list of all the authorized Cloud Solution Providers, AOS-G
|[Vidizmo LLC](https://www.vidizmo.com)| |[Virsage Solutions](https://www.virsage.com/)| |[ViON Corp.](https://www.vion.com/)|
-|[Viscon Networking Innovations Inc.](https://www.visconni.com/)|
|[VisioLogix Corporation](https://www.visiologix.com)| |[VVL Systems & Consulting, LLC](https://www.vvlsystems.com/)| |Vistronix, LLC|
azure-monitor Sampling https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/sampling.md
Title: Telemetry sampling in Azure Application Insights | Microsoft Docs description: How to keep the volume of telemetry under control. Previously updated : 01/17/2020 Last updated : 08/26/2021
When metric counts are presented in the portal, they are renormalized to take in
* There are three different types of sampling: adaptive sampling, fixed-rate sampling, and ingestion sampling. * Adaptive sampling is enabled by default in all the latest versions of the Application Insights ASP.NET and ASP.NET Core Software Development Kits (SDKs). It is also used by [Azure Functions](../../azure-functions/functions-overview.md). * Fixed-rate sampling is available in recent versions of the Application Insights SDKs for ASP.NET, ASP.NET Core, Java (both the agent and the SDK), and Python.
+* In Java, sampling overrides are available and are useful when you need to apply different sampling rates to selected dependencies, requests, or health checks. Use [sampling overrides](https://docs.microsoft.com/azure/azure-monitor/app/java-standalone-sampling-overrides) to tune out noisy dependencies while, for example, keeping all important errors at 100%. This is a form of fixed-rate sampling that gives you fine-grained control over your telemetry.
* Ingestion sampling works on the Application Insights service endpoint. It only applies when no other sampling is in effect. If the SDK samples your telemetry, ingestion sampling is disabled. * For web applications, if you log custom events and need to ensure that a set of events is retained or discarded together, the events must have the same `OperationId` value. * If you write Analytics queries, you should [take account of sampling](/azure/data-explorer/kusto/query/samples?&pivots=azuremonitor#aggregations). In particular, instead of simply counting records, you should use `summarize sum(itemCount)`.
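As an illustration of the `summarize sum(itemCount)` guidance above, a minimal sampling-aware query sketch, assuming the standard Application Insights `requests` table:

```kusto
// Sum itemCount rather than counting rows: each sampled record stands in for
// itemCount original telemetry items, so this approximates the true request volume.
requests
| where timestamp > ago(1d)
| summarize estimatedRequests = sum(itemCount), storedRecords = count()
```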
The following table summarizes the sampling types available for each SDK and typ
| ASP.NET | [Yes (on by default)](#configuring-adaptive-sampling-for-aspnet-applications) | [Yes](#configuring-fixed-rate-sampling-for-aspnet-applications) | Only if no other sampling is in effect | | ASP.NET Core | [Yes (on by default)](#configuring-adaptive-sampling-for-aspnet-core-applications) | [Yes](#configuring-fixed-rate-sampling-for-aspnet-core-applications) | Only if no other sampling is in effect | | Azure Functions | [Yes (on by default)](#configuring-adaptive-sampling-for-azure-functions) | No | Only if no other sampling is in effect |
-| Java | No | [Yes](#configuring-fixed-rate-sampling-for-java-applications) | Only if no other sampling is in effect |
+| Java | No | [Yes](#configuring-sampling-overrides-and-fixed-rate-sampling-for-java-applications) | Only if no other sampling is in effect |
| Node.JS | No | [Yes](./nodejs.md#sampling) | Only if no other sampling is in effect | Python | No | [Yes](#configuring-fixed-rate-sampling-for-opencensus-python-applications) | Only if no other sampling is in effect | | All others | No | No | [Yes](#ingestion-sampling) |
In Metrics Explorer, rates such as request and exception counts are multiplied b
} ```
-### Configuring fixed-rate sampling for Java applications
+### Configuring sampling overrides and fixed-rate sampling for Java applications
-By default no sampling is enabled in the Java agent and SDK. Currently it only supports fixed rate sampling. Adaptive sampling is not supported in Java.
+By default, no sampling is enabled in the Java auto-instrumentation and SDK. Currently the Java auto-instrumentation supports [sampling overrides](https://docs.microsoft.com/azure/azure-monitor/app/java-standalone-sampling-overrides) and fixed-rate sampling. Adaptive sampling is not supported in Java.
-#### Configuring Java Agent
+#### Configuring Java auto-instrumentation
-1. Download [applicationinsights-agent-3.0.0-PREVIEW.5.jar](https://github.com/microsoft/ApplicationInsights-Java/releases/download/3.0.0-PREVIEW.5/applicationinsights-agent-3.0.0-PREVIEW.5.jar)
-
-1. To enable sampling add the following to your `applicationinsights.json` file:
-
-```json
-{
- "sampling": {
- "percentage": 10 //this is just an example that shows you how to enable only 10% of transaction
- }
-}
-```
+* To configure sampling overrides that override the default sampling rate and apply different sampling rates to selected requests and dependencies, use the [sampling override guide](https://docs.microsoft.com/azure/azure-monitor/app/java-standalone-sampling-overrides#getting-started).
+* To configure fixed-rate sampling that applies to all of your telemetry, use the [fixed-rate sampling guide](https://docs.microsoft.com/azure/azure-monitor/app/java-standalone-config#sampling).
#### Configuring Java 2.x SDK
azure-monitor Ad Replication Status https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/insights/ad-replication-status.md
The AD Replication Status solution regularly monitors your Active Directory envi
[!INCLUDE [azure-monitor-log-analytics-rebrand](../../../includes/azure-monitor-log-analytics-rebrand-solution.md)] ## Installing and configuring the solution+ Use the following information to install and configure the solution. ### Prerequisites
Use the following information to install and configure the solution.
### Install agents on domain controllers+ You must install agents on domain controllers that are members of the domain to be evaluated. Or, you must install agents on member servers and configure the agents to send AD replication data to Azure Monitor. To understand how to connect Windows computers to Azure Monitor, see [Connect Windows computers to Azure Monitor](../agents/agent-windows.md). If your domain controller is already part of an existing System Center Operations Manager environment that you want to connect to Azure Monitor, see [Connect Operations Manager to Azure Monitor](../agents/om-agents.md). ### Enable non-domain controller+ If you don't want to connect any of your domain controllers directly to Azure Monitor, you can use any other computer in your domain connected to Azure Monitor to collect data for the AD Replication Status solution pack and have it send the data. 1. Verify that the computer is a member of the domain that you wish to monitor using the AD Replication Status solution.
azure-netapp-files Azure Netapp Files Solution Architectures https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/azure-netapp-files-solution-architectures.md
na ms.devlang: na Previously updated : 08/18/2021 Last updated : 08/25/2021 # Solution architectures using Azure NetApp Files
This section provides references for Windows applications and SQL Server solutio
### File sharing and Global File Caching * [Build Your Own Azure NFS? Wrestling Linux File Shares into Cloud](https://cloud.netapp.com/blog/ma-anf-blg-build-your-own-linux-nfs-file-shares)
-* [Global File Cache / Azure NetApp Files Deployment](https://youtu.be/91LKb1qsLIM)
+* [Globally Distributed Enterprise File Sharing with Azure NetApp Files and NetApp Global File Cache](https://f.hubspotusercontent20.net/hubfs/525875/NA-580-0521-Architecture-Doc-R3.pdf)
* [Cloud Compliance for Azure NetApp Files](https://cloud.netapp.com/hubfs/Cloud%20Compliance%20for%20Azure%20NetApp%20Files%20-%20November%202020.pdf) ### SQL Server
azure-percept Azure Percept Devkit Software Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/azure-percept-devkit-software-release-notes.md
+
+ Title: Azure Percept DK software release notes
+description: Information about changes made to the Azure Percept DK software.
++++ Last updated : 08/23/2021+++
+# Azure Percept DK software release notes
+
+This page provides information of changes and fixes for each Azure Percept DK OS and firmware release.
+
+To download the update images, refer to [Azure Percept DK software releases for USB cable update](./software-releases-usb-cable-updates.md) or [Azure Percept DK software releases for OTA update](./software-releases-over-the-air-updates.md).
+
+## July (2107) Release
+
+> [!IMPORTANT]
+> Due to a code signing change, the OTA (over-the-air) package for this release is only compatible with Azure Percept DKs running the 2106 release. For Azure Percept DK users currently running an older software release, Microsoft recommends performing an update over a USB cable, or performing an OTA update to the 2106 release first before updating to 2107.
+
+- Wi-Fi:
+ - Security hardening to ensure the Wi-Fi access point is shut down after setup completes.
+ - Fixed an issue where pushing the **Setup** button on the dev kit could cause the dev kit's Wi-Fi access point to be out of sync with the setup experience web service.
+ - Enhanced the Wi-Fi access point iptables rules to be more resilient and removed unnecessary rules.
+ - Fixed an issue where multiple connected Wi-Fi networks wouldn't be properly prioritized.
+- Setup experience:
+ - Added localization for supported regions and updated the text for better readability.
+ - Fixed an issue where the setup experience would sometimes get stuck on a loading page.
+- General networking:
+ - Resolved issues with IPv6 not obtaining a valid DHCP lease.
+- Operating system:
+ - Security fixes.
+
+## June (2106) Release
+
+- Updated image verification mechanism for OTA agent.
+- UI improvements and bug fixes to the setup experience.
+
+## May (2105) Release
+
+- Security updates to CBL-Mariner OS.
+
+## April (2104) Release
+
+- Fixed log rotation issue that may cause Azure Percept DK storage to get full.
+- Enabled TPM based provisioning to Azure in the setup experience.
+- Added an automatic timeout to the setup experience and Wi-Fi access point, which now shut down after 30 minutes or after the setup experience completes.
+- Wi-Fi access point SSID changed from "**scz-[xxx]**" to "**apd-[xxx]**".
+
+## Next steps
+
+- [How to determine your update strategy](./how-to-determine-your-update-strategy.md)
azure-percept How To Determine Your Update Strategy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-determine-your-update-strategy.md
+
+ Title: Determine your update strategy for Azure Percept DK
+description: Pros and cons of Azure Percept DK OTA or USB cable updates. Recommendation for choosing the best update approach for different users.
++++ Last updated : 08/23/2021+++
+# How to determine your update strategy
+
+To keep your Azure Percept DK software up to date, Microsoft offers two update methods for the dev kit: **update over a USB cable** or **over-the-air (OTA) update**.
+
+An update over a USB cable performs a clean install on the dev kit. Existing configurations and all the user data in each partition are wiped out after the new image is deployed. To do this type of update, connect the dev kit to a host system with a USB-C cable. The host system can be a Windows or Linux machine. You can also use this update method as a factory reset by redeploying the exact same version to the dev kit. Refer to [Update the Azure Percept DK over a USB-C cable connection](./how-to-update-via-usb.md) for details about the USB cable update.
+
+The OTA update is built on top of the [Device Update for IoT Hub](https://docs.microsoft.com/azure/iot-hub-device-update/device-update-resources) Azure service. Connect the dev kit to Azure IoT Hub to do this type of update. Configurations and user data are preserved after the OTA update. Refer to [Update your Azure Percept DK over-the-air (OTA)](./how-to-update-over-the-air.md) for details about doing the OTA update.
+
+Review the pros and cons of both the USB cable update and the OTA update, then follow the Microsoft recommendations for different scenarios.
+
+## USB cable update
+
+- Pros
+  - You don't need to connect the dev kit to the internet or to Azure.
+  - The latest image is always applicable, no matter what version of software and firmware is currently loaded on the dev kit.
+- Cons
+  - Reimages the device and removes configurations and user data.
+  - You need to rerun the OOBE and download any non-preloaded containers.
+ - Cannot be performed remotely.
+
+## OTA update
+
+- Pros
+  - Preserves user data, configurations, and downloaded containers. The dev kit keeps working as before once the OTA completes.
+ - Update can be performed remotely.
+  - Several similar devices can be updated at the same time. Updates can also be scheduled, for example, to happen during the night.
+- Cons
+ - There may be hard-stop version(s) that cannot be skipped. Refer to [Hard-Stop Version of OTA](./software-releases-over-the-air-updates.md#hard-stop-version-of-ota).
+  - The device needs to be connected to an IoT Hub that has the "Device Update for IoT Hub" feature properly configured.
+  - It doesn't work well for downgrades.
+
+> [!IMPORTANT]
+> Device Update for IoT Hub does not block deploying an image whose version is older than the currently running OS. However, doing so on the dev kit will result in loss of data and functionality.
+
+## Microsoft recommendations
+
+|Type|Scenario|Update Method|
+|:-:|--|:-:|
+|Production|Keep dev kit up to date for latest fix and security patch while it's already running your solution or deployed to the field.|OTA|
+|Production/Develop|Unboxing a new dev kit and update it to the latest software.|USB|
+|Production/Develop|Want to update to the latest software version after having skipped several monthly releases.|USB|
+|Production/Develop|Factory reset a dev kit.|USB|
+|Develop|During solution development, want to keep the dev kit OS and F/W up to date.|USB/OTA|
+|Develop|Jump to any specific (older) version for issue investigation/debugging.|USB|
+
+## Next steps
+
+After deciding on an update method, visit the following pages to get ready for the update:
+
+USB cable update
+
+- [Update the Azure Percept DK over a USB-C cable connection](./how-to-update-via-usb.md)
+- [Azure Percept DK software releases for USB cable update](./software-releases-usb-cable-updates.md)
+
+OTA
+
+- [Update your Azure Percept DK over-the-air (OTA)](./how-to-update-over-the-air.md)
+- [Azure Percept DK software releases for OTA update](./software-releases-over-the-air-updates.md)
azure-percept How To Select Update Package https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-select-update-package.md
Title: Select your Azure Percept DK update package description: How to identify your Azure Percept DK version and select the best update package for it --++ Last updated 07/23/2021
-# How to determine and download the best update package for OTA and USB updates
+# Select your Azure Percept DK update package
This page provides guidance on how to select the update package that is best for your dev kit and the download locations for the update packages. For more information on how to update your device, see these articles:+ - [Update your Azure Percept DK over-the-air](./how-to-update-over-the-air.md) - [Update your Azure Percept DK via USB](./how-to-update-via-usb.md) - ## Prerequisites - An [Azure Percept DK](https://go.microsoft.com/fwlink/?linkid=2155270) that has been [set up and connected to Azure Percept Studio and IoT Hub](./quickstart-percept-dk-set-up.md). ## Identify the model name and software version of your dev kit+ To ensure you apply the correct update package to your dev kit, you must first determine which software version it's currently running. > [!WARNING] > Applying the incorrect update package could result in your dev kit becoming inoperable. It is important that you follow these steps to ensure you apply the correct update package. Option 1:+ 1. Log in to the [Azure Percept Studio](./overview-azure-percept-studio.md).
-2. In **Devices**, choose your devkit device.
-3. In the **General** tab, look for the **Model** and **SW Version** information.
+1. In **Devices**, choose your devkit device.
+1. In the **General** tab, look for the **Model** and **SW Version** information.
Option 2:
-1. View the **IoT Edge Device** of **IoT Hub** service from Microsoft Azure Portal.
-2. Choose your devkit device from the device list.
-3. Select **Device twin**.
-4. Scroll through the device twin properties and locate **"model"** and **"swVersion"** under **"deviceInformation"** and make a note of their values.
+
+1. View the **IoT Edge Device** of **IoT Hub** service from Microsoft Azure portal.
+1. Choose your devkit device from the device list.
+1. Select **Device twin**.
+1. Scroll through the device twin properties and locate **"model"** and **"swVersion"** under **"deviceInformation"** and make a note of their values.
## Determine the correct update package
-Using the **model** and **swVersion** identified in the previous section, check the table below to determine which update package to download.
+Using the **model** and **swVersion** identified in the previous section, check the table below to determine which update package to download.
|model |swVersion |Update method |Download links |Note | ||||||
Using the **model** and **swVersion** identified in the previous section, check
|APDK-101 |Any swVersion earlier than 2021.106.111.115 |**USB only** |[2021.107.129.116 USB update package](https://go.microsoft.com/fwlink/?linkid=2169086) |July release (2107) | |APDK-101 |2021.106.111.115 |OTA or USB |[2021.107.129.116 OTA update package](https://go.microsoft.com/fwlink/?linkid=2169245)<br>[2021.107.129.116 USB update package](https://go.microsoft.com/fwlink/?linkid=2169086) |July release (2107) | - ## Next steps+ Update your dev kits via the methods and update packages determined in the previous section.+ - [Update your Azure Percept DK over-the-air](./how-to-update-over-the-air.md) - [Update your Azure Percept DK via USB](./how-to-update-via-usb.md)
azure-percept How To Set Up Over The Air Updates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-set-up-over-the-air-updates.md
Title: Set up Azure IoT Hub to deploy over-the-air updates description: Learn how to configure Azure IoT Hub to deploy updates over-the-air to Azure Percept DK--++ Last updated 03/30/2021
azure-percept How To Update Over The Air https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-update-over-the-air.md
Title: Update your Azure Percept DK using over-the-air (OTA) updates description: Learn how to receive over-the air (OTA) updates to your Azure Percept DK--++ Last updated 03/30/2021
Group Tag Requirements:
- A device can only belong to one group. 1. Add a Tag to your device(s):
+ 1. From **IoT Edge** on the left navigation pane, find your Azure Percept DK and navigate to its **Device Twin**.
+ 1. Add a new **Device Update for IoT Hub** tag value as shown below (```<CustomTagValue>``` refers to your tag value/name, for example, AzurePerceptGroup1). Learn more about device twin [JSON document tags](../iot-hub/iot-hub-devguide-device-twins.md#device-twins).
- 1. From **IoT Edge** on the left navigation pane, find your Azure Percept DK and navigate to its **Device Twin**.
-
- 1. Add a new **Device Update for IoT Hub** tag value as shown below (```<CustomTagValue>``` refers to your tag value/name, for example, AzurePerceptGroup1). Learn more about device twin [JSON document tags](../iot-hub/iot-hub-devguide-device-twins.md#device-twins).
-
- ```
+ ```json
"tags": { "ADUGroup": "<CustomTagValue>" },
Group Tag Requirements:
1. Create a group by selecting an existing Azure IoT Hub tag:
- 1. Navigate back to your Azure IoT Hub page.
-
- 1. Select **Device Updates** under **Automatic Device Management** on the left-hand menu panel.
-
- 1. Select the **Groups** tab. This page will display the number of ungrouped devices connected to Device Update.
-
- 1. Select **+ Add** to create a new group.
-
- 1. Select an IoT Hub tag from the list and click **Submit**.
-
- 1. Once the group is created, the update compliance chart and groups list will update. The chart shows the number of devices in various states of compliance: **On latest update**, **New updates available**, **Updates in progress**, and **Not yet grouped**.
+ 1. Navigate back to your Azure IoT Hub page.
+ 1. Select **Device Updates** under **Automatic Device Management** on the left-hand menu panel.
+ 1. Select the **Groups** tab. This page will display the number of ungrouped devices connected to Device Update.
+ 1. Select **+ Add** to create a new group.
+ 1. Select an IoT Hub tag from the list and click **Submit**.
+ 1. Once the group is created, the update compliance chart and groups list will update. The chart shows the number of devices in various states of compliance: **On latest update**, **New updates available**, **Updates in progress**, and **Not yet grouped**.
## Deploy an update
azure-percept How To Update Via Usb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-update-via-usb.md
Title: Update your Azure Percept DK over a USB-C cable connection description: Learn how to update the Azure Percept DK over a USB-C cable connection--++ Last updated 03/18/2021
# Update the Azure Percept DK over a USB-C cable connection This guide will show you how to successfully update your dev kit's operating system and firmware over a USB connection. Here's an overview of what you will be doing during this procedure.+ 1. Download the update package to a host computer 1. Run the command that transfers the update package to the dev kit 1. Set the dev kit into USB mode using SSH or DIP switches
This guide will show you how to successfully update your dev kit's operating sys
> > Follow all instructions in order. Skipping steps could put your dev kit in an unusable state. - ## Prerequisites - An Azure Percept DK
azure-percept Overview Update Experience https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/overview-update-experience.md
Title: Azure Percept DK update experience description: Learn more about how to keep the Azure Percept DK up-to-date--++ Last updated 03/24/2021
azure-percept Software Releases Over The Air Updates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/software-releases-over-the-air-updates.md
+
+ Title: Software releases for Azure Percept DK OTA updates
+description: Information and download links for the Azure Percept DK over-the-air update packages
++++ Last updated : 08/23/2021++++
+# Software releases for OTA updates
+
+The OTA update is built for users who tend to always keep the dev kit up to date. That's why only the hard-stop versions and the latest version are provided here. To change your dev kit to a specific (older) version, use the USB cable update. Refer to [Update the Azure Percept DK over a USB-C cable connection](./how-to-update-via-usb.md). Also use the USB update if you want to jump ahead to a much newer version.
+
+>[!CAUTION]
+>The dev kit doesn't support software version downgrades with OTA. The Device Update for IoT Hub framework will NOT block deploying an image with a version older than the current one. However, doing so on the dev kit will result in loss of data and functionality.
+
+>[!IMPORTANT]
+>Be sure to check the following document before you decide to go with either OTA or USB cable update.
+>
+>[How to determine your update strategy](./how-to-determine-your-update-strategy.md)
+
+## Hard-stop version of OTA
+
+Microsoft services each dev kit release with OTA packages. However, when there are breaking changes to the dev kit OS/firmware or the OTA platform, updating directly from an old version to a much newer version via OTA may be problematic. Generally, when a breaking change happens, Microsoft will make sure the OTA update process transitions the old system seamlessly to **the very first version that introduces/delivers that breaking change**. This specific version becomes a hard-stop version for OTA. Take a known hard-stop version, the **June release**, as an example: OTA will work if a user updates the dev kit from 2104 to 2106, then from 2106 to 2107. However, it will NOT work if a user tries to skip the hard-stop (2106) and update the dev kit from 2104 directly to 2107.
++
+## Recommendations for applying the OTA update
+
+**Scenario 1:** Frequently (monthly) update the dev kit to make sure it's always up to date.
+
+- There should be no problem if you always do OTA to update the dev kit from last release to the newly released version.
+
+**Scenario 2:** Update after having skipped a few versions.
+
+1. Identify the current software version of dev kit.
+1. Review the OTA package release list to look for any hard-stop version between the current version and target version.
+ - If there is, you need to sequentially deploy the hard-stop version(s) until you can deploy the latest update package.
+ - If there isn't, then you can directly deploy the latest OTA package to the dev kit.
+
+## Identify the current software version of dev kit
+
+**Option 1:**
+
+1. Sign in to the [Azure Percept Studio](./overview-azure-percept-studio.md).
+1. In **Devices**, choose your dev kit device.
+1. In the **General** tab, look for the **Model** and **SW Version** information.
+
+**Option 2:**
+
+1. View the **IoT Edge Device** of **IoT Hub** service from Microsoft Azure portal.
+1. Choose your dev kit device from the device list.
+1. Select **Device twin**.
+1. Scroll through the device twin properties and locate **"model"** and **"swVersion"** under **"deviceInformation"** and make a note of their values.
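As an alternative to the portal steps above, a sketch that reads the same device twin with the Azure CLI; it assumes the `azure-iot` extension is installed, and the hub and device names are placeholders.

```azurecli
# Show the full device twin, then look for "model" and "swVersion" under "deviceInformation".
az iot hub device-twin show --hub-name <iot-hub-name> --device-id <dev-kit-device-id>
```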
+
+## Identify the OTA package(s) to be deployed
+
+>[!IMPORTANT]
+>If the current version of your dev kit isn't included in any of the releases below, it's NOT supported for OTA update. Please do a USB cable update to get to the latest version.
+
+**Latest release:**
+
+|Release|Applicable Version(s)|Download Links|Note|
+|--|--|--|--|
+|July Service Release (2107)|2021.106.111.115|[2021.107.129.116 OTA update package](https://go.microsoft.com/fwlink/?linkid=2169245)||
+
+**Hard-stop releases:**
+
+|Release|Applicable Version(s)|Download Links|Note|
+|--|--|--|--|
+|June Service Release (2106)|2021.102.108.112, 2021.104.110.103, 2021.105.111.122 |[2021.106.111.115 OTA manifest (for PE-101)](https://go.microsoft.com/fwlink/?linkid=2167127)<br>[2021.106.111.115 OTA manifest (for APDK-101)](https://go.microsoft.com/fwlink/?linkid=2167235) <br>[2021.106.111.115 OTA update package](https://go.microsoft.com/fwlink/?linkid=2167128) |Be sure to use the correct manifest based on "model name" (PE-101/APDK-101)|
+
+## Next steps
+
+[Update your Azure Percept DK over-the-air (OTA)](./how-to-update-over-the-air.md)
azure-percept Software Releases Usb Cable Updates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/software-releases-usb-cable-updates.md
+
+ Title: Azure Percept DK software releases for update over USB cable
+description: Information and download links for the USB cable update package of Azure Percept DK
++++ Last updated : 08/23/2021+++
+# Azure Percept DK software releases for updating over USB
+
+This page provides information and download links for all the dev kit OS/firmware image releases. For details of the changes/fixes in each version, refer to the release notes:
+
+- [Azure Percept DK software release notes](./azure-percept-devkit-software-release-notes.md).
+
+>[!IMPORTANT]
+>Be sure to check the following document before you decide to go with either OTA or USB cable update.
+>
+>[How to determine your update strategy](./how-to-determine-your-update-strategy.md)
+
+## Latest releases
+
+- **Latest service release**
+July Service Release (2107): [Azure-Percept-DK-1.0.20210729.0957-public_preview_1.0.zip](https://go.microsoft.com/fwlink/?linkid=2169086)
+- **Latest major update or known stable version**
+Feature Update (2104): [Azure-Percept-DK-1.0.20210409.2055.zip](https://download.microsoft.com/download/6/4/d/64d53e60-f702-432d-a446-007920a4612c/Azure-Percept-DK-1.0.20210409.2055.zip)
+
+## Full list of releases
+
+|Release|Download Links|Note|
+|--|--|:-:|
+|July Service Release (2107)|[Azure-Percept-DK-1.0.20210729.0957-public_preview_1.0.zip](https://go.microsoft.com/fwlink/?linkid=2169086)||
+|June Service Release (2106)|[Azure-Percept-DK-1.0.20210511.1825.zip](https://download.microsoft.com/download/e/0/1/e01b6f7e-04f7-45ee-8933-8514c2fdbe6a/Azure-Percept-DK-1.0.20210511.1825.zip)||
+|May Service Release (2105)|[Azure-Percept-DK-1.0.20210611.0952-public_preview_1.0.zip](https://download.microsoft.com/download/1/5/8/1588f7e3-f8ae-4c06-baa2-c559364daae5/Azure-Percept-DK-1.0.20210611.0952-public_preview_1.0.zip)||
+|Feature Update (2104) |[Azure-Percept-DK-1.0.20210409.2055.zip](https://download.microsoft.com/download/6/4/d/64d53e60-f702-432d-a446-007920a4612c/Azure-Percept-DK-1.0.20210409.2055.zip)||
+
+## Next steps
+
+- [Update the Azure Percept DK over a USB-C cable connection](./how-to-update-via-usb.md)
azure-percept Troubleshoot Dev Kit https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/troubleshoot-dev-kit.md
In this section, you'll get guidance on which logs to collect and how to collect
|*OOBE logs* - records details about the setup experience.|Collect when you find issues during the setup experience.|```sudo journalctl -u oobe -b```| |*edgeAgent logs* - records the version numbers of all modules running on your device.|Collect when one or more modules aren't working.|```sudo iotedge logs edgeAgent```| |*Module container logs* - records details about specific IoT Edge module containers|Collect when you find issues with a module|```sudo iotedge logs [container name]```|
-|*Wi-Fi access point logs* - records details about the connection to the dev kit's Wi-Fi access point.|Collect when you find issues when connecting to the dev kit's Wi-Fi access point.|```sudo journalctl -u hostapd.service```|
|*Network logs* - a set of logs covering Wi-Fi services and the network stack.|Collect when you find Wi-Fi or network issues.|```sudo journalctl -u hostapd.service -u wpa_supplicant.service -u ztpd.service -u systemd-networkd > network_log.txt```<br><br>```cat /etc/os-release && cat /etc/os-subrelease && cat /etc/adu-version && rpm -q ztpd > system_ver.txt```<br><br>Run both commands. Each command collects multiple logs and puts them into a single output.| ## Troubleshooting commands
For more information on the Azure IoT Edge commands, see the [Azure IoT Edge dev
|Function |When to use |Command | ||-||
-|Checks the software version on the dev kit.|Use anytime you need confirm which software version is on your dev kit.|```cat /etc/adu-version```|
+|Checks the software version on the dev kit.|Use anytime you need confirm which software version is on your dev kit.|```cat /etc/os-release && cat /etc/os-subrelease && cat /etc/adu-version```|
|Checks the temperature of the dev kit|Use in cases where you think the dev kit might be overheating.|```cat /sys/class/thermal/thermal_zone0/temp```| |Checks the dev kit's telemetry ID|Use in cases where you need to know the dev kits unique telemetry identifier.|```sudo azure-device-health-id```| |Checks the status of IoT Edge|Use whenever there are issues with IoT Edge modules connecting to the cloud.|```sudo iotedge check```|
azure-resource-manager Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/install.md
Title: Set up Bicep development and deployment environments description: How to configure Bicep development and deployment environments Previously updated : 07/19/2021 Last updated : 08/26/2021
To verify you've installed the extension, open any file with the `.bicep` file e
The easiest way to get the commands you need to deploy a Bicep file is to install the latest version of Azure CLI. You can also use PowerShell, but it requires an extra installation.
+- [Azure CLI](#azure-cli)
+- [Azure PowerShell](#azure-powershell)
+- [Install manually](#install-manually)
+ ### Azure CLI You must have Azure CLI version 2.20.0 or later installed. To install or update Azure CLI, see:
For more commands, see [Bicep CLI](bicep-cli.md).
> [!IMPORTANT] > Azure CLI installs a self-contained instance of the Bicep CLI. This instance doesn't conflict with any versions you may have manually installed. Azure CLI doesn't add Bicep CLI to your PATH.
-### PowerShell
+#### Install on an air-gapped cloud
+
+To install the Bicep CLI in an air-gapped environment, you need to download the Bicep CLI executable manually and save it to the directory where Azure CLI expects to find it.
+
+- **Linux**
+
+  1. Download **bicep-linux-x64** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
+ 1. Copy the executable to the **$HOME/.azure/bin** directory on an air-gapped machine.
+
+- **macOS**
+
+  1. Download **bicep-osx-x64** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
+ 1. Copy the executable to the **$HOME/.azure/bin** directory on an air-gapped machine.
+
+- **Windows**
+
+  1. Download **bicep-win-x64.exe** from the [Bicep release page](https://github.com/Azure/bicep/releases/latest/) in a non-air-gapped environment.
+ 1. Copy the executable to the **%UserProfile%/.azure/bin** directory on an air-gapped machine.
+
+Note that the `bicep install` and `bicep upgrade` commands don't work in an air-gapped environment.
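For example, a sketch of the Linux steps above, assuming the `bicep-linux-x64` asset is published on the GitHub releases page:

```sh
# On a machine with internet access, download the Linux executable.
curl -Lo bicep https://github.com/Azure/bicep/releases/latest/download/bicep-linux-x64

# Transfer the file to the air-gapped machine (for example, via removable media),
# then place it where Azure CLI looks for it and make it executable.
mkdir -p "$HOME/.azure/bin"
cp bicep "$HOME/.azure/bin/bicep"
chmod +x "$HOME/.azure/bin/bicep"
```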
+
+### Azure PowerShell
You must have Azure PowerShell version 5.6.0 or later installed. To update or install, see [Install Azure PowerShell](/powershell/azure/install-az-ps).
bicep --version
The following methods install the Bicep CLI and add it to your PATH. You must manually install for any use other than Azure CLI.
+- [Linux](#linux)
+- [macOS](#macos)
+- [Windows](#windows)
+ #### Linux ```sh
azure-resource-manager App Service Move Limitations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/move-limitations/app-service-move-limitations.md
Title: Move Azure App Service resources description: Use Azure Resource Manager to move App Service resources to a new resource group or subscription. Previously updated : 08/10/2020 Last updated : 08/26/2021 # Move guidance for App Service resources
When moving a Web App across subscriptions, the following guidance applies:
- All App Service resources in the resource group must be moved together. - App Service Environments can't be moved to a new resource group or subscription. However, you can move a web app and app service plan to a new subscription without moving the App Service Environment. After the move, the web app is no longer hosted in the App Service Environment. - You can move a certificate bound to a web app without deleting the TLS bindings, as long as the certificate is moved with all other resources in the resource group.-- App Service resources can only be moved from the resource group in which they were originally created. If an App Service resource is no longer in its original resource group, move it back to its original resource group. Then, move the resource across subscriptions.
+- App Service resources can only be moved from the resource group in which they were originally created. If an App Service resource is no longer in its original resource group, move it back to its original resource group. Then, move the resource across subscriptions. For help with finding the original resource group, see the next section.
+
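For reference, a hedged sketch of moving App Service resources across subscriptions with the Azure CLI; the resource IDs are placeholders, and all App Service resources in the group must be included in the same operation.

```azurecli
az resource move \
  --destination-group <target-resource-group> \
  --destination-subscription-id <target-subscription-id> \
  --ids <web-app-resource-id> <app-service-plan-resource-id>
```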
+## Find original resource group
If you don't remember the original resource group, you can find it through diagnostics. For your web app, select **Diagnose and solve problems**. Then, select **Configuration and Management**.
You see the recommended actions to take before moving the resources. The informa
![Screen capture shows recommended steps for moving Microsoft dot Web resources.](./media/app-service-move-limitations/recommendations.png)
+## Move hidden resource types in portal
+
+When using the portal to move your App Service resources, you may see an error indicating that you haven't moved all of the resources. If you see this error, check if there are resource types that the portal didn't display. Select **Show hidden types**. Then, select all of the resources to move.
++ ## Move support To determine which App Service resources can be moved, see move support status for:
azure-resource-manager Resource Providers And Types https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/resource-providers-and-types.md
Title: Resource providers and resource types description: Describes the resource providers that support Azure Resource Manager. It describes their schemas, available API versions, and the regions that can host the resources. Previously updated : 03/15/2021 Last updated : 08/26/2021
For a list that maps resource providers to Azure services, see [Resource provide
## Register resource provider
-Before using a resource provider, your Azure subscription must be registered for the resource provider. Registration configures your subscription to work with the resource provider. Some resource providers are registered by default. For a list of resource providers registered by default, see [Resource providers for Azure services](azure-services-resource-providers.md).
+Before using a resource provider, your Azure subscription must be registered for the resource provider. Registration configures your subscription to work with the resource provider.
-Other resource providers are registered automatically when you take certain actions. When you deploy an Azure Resource Manager template, all required resource providers are automatically registered. When you create a resource through the portal, the resource provider is typically registered for you. For other scenarios, you may need to manually register a resource provider.
+> [!IMPORTANT]
+> Only register a resource provider when you're ready to use it. The registration step enables you to maintain least privileges within your subscription. A malicious user can't use resource providers that aren't registered.
-This article shows you how to check the registration status of a resource provider, and register it as needed. You must have permission to do the `/register/action` operation for the resource provider. The permission is included in the Contributor and Owner roles.
+Some resource providers are registered by default. For a list of resource providers registered by default, see [Resource providers for Azure services](azure-services-resource-providers.md).
+
+Other resource providers are registered automatically when you take certain actions. When you deploy an Azure Resource Manager template, all required resource providers are automatically registered. When you create a resource through the portal, the resource provider is typically registered for you. For other scenarios, you may need to manually register a resource provider.
> [!IMPORTANT]
-> Only register a resource provider when you're ready to use it. The registration step enables you to maintain least privileges within your subscription. A malicious user can't use resource providers that aren't registered.
+> Your application code **shouldn't block the creation of resources** for a resource provider that is in the **registering** state. When you register the resource provider, the operation is done individually for each supported region. To create resources in a region, the registration only needs to be completed in that region. By not blocking a resource provider in the registering state, your application can continue much sooner than waiting for all regions to complete.
-Your application code shouldn't block the creation of resources for a resource provider that is in the **registering** state. When you register the resource provider, the operation is done individually for each supported region. To create resources in a region, the registration only needs to be completed in that region. By not blocking resource provider in the registering state, your application can continue much sooner than waiting for all regions to complete.
+You must have permission to do the `/register/action` operation for the resource provider. The permission is included in the Contributor and Owner roles.
You can't unregister a resource provider when you still have resource types from that resource provider in your subscription.
To see all resource providers, and the registration status for your subscription
:::image type="content" source="./media/resource-providers-and-types/register-resource-provider.png" alt-text="register resource providers":::
+> [!IMPORTANT]
+> As [noted earlier](#register-resource-provider), **don't block the creation of resources** for a resource provider that is in the **registering** state. By not blocking a resource provider in the registering state, your application can continue much sooner than waiting for all regions to complete.
++ ### View resource provider To see information for a particular resource provider:
To see all resource providers in Azure, and the registration status for your sub
Get-AzResourceProvider -ListAvailable | Select-Object ProviderNamespace, RegistrationState ```
-Which returns results similar to:
+The command returns:
```output ProviderNamespace RegistrationState
To maintain least privileges in your subscription, only register those resource
Register-AzResourceProvider -ProviderNamespace Microsoft.Batch ```
-Which returns results similar to:
+The command returns:
```output ProviderNamespace : Microsoft.Batch
ResourceTypes : {batchAccounts, operations, locations, locations/quotas}
Locations : {West Europe, East US, East US 2, West US...} ```
+> [!IMPORTANT]
+> As [noted earlier](#register-resource-provider), **don't block the creation of resources** for a resource provider that is in the **registering** state. By not blocking a resource provider in the registering state, your application can continue much sooner than waiting for all regions to complete.
+ To see information for a particular resource provider, use: ```azurepowershell-interactive Get-AzResourceProvider -ProviderNamespace Microsoft.Batch ```
-Which returns results similar to:
+The command returns:
```output {ProviderNamespace : Microsoft.Batch
To see the resource types for a resource provider, use:
(Get-AzResourceProvider -ProviderNamespace Microsoft.Batch).ResourceTypes.ResourceTypeName ```
-Which returns:
+The command returns:
```output batchAccounts
To get the available API versions for a resource type, use:
((Get-AzResourceProvider -ProviderNamespace Microsoft.Batch).ResourceTypes | Where-Object ResourceTypeName -eq batchAccounts).ApiVersions ```
-Which returns:
+The command returns:
```output 2017-05-01
To get the supported locations for a resource type, use.
((Get-AzResourceProvider -ProviderNamespace Microsoft.Batch).ResourceTypes | Where-Object ResourceTypeName -eq batchAccounts).Locations ```
-Which returns:
+The command returns:
```output West Europe
To see all resource providers in Azure, and the registration status for your sub
az provider list --query "[].{Provider:namespace, Status:registrationState}" --out table ```
-Which returns results similar to:
+The command returns:
```output Provider Status
To maintain least privileges in your subscription, only register those resource
az provider register --namespace Microsoft.Batch ```
-Which returns a message that registration is on-going.
+The command returns a message that registration is on-going.
To see information for a particular resource provider, use:
To see information for a particular resource provider, use:
az provider show --namespace Microsoft.Batch ```
-Which returns results similar to:
+The command returns:
```output {
Which returns results similar to:
} ```
+> [!IMPORTANT]
+> As [noted earlier](#register-resource-provider), **don't block the creation of resources** for a resource provider that is in the **registering** state. By not blocking a resource provider in the registering state, your application can continue much sooner than waiting for all regions to complete.
+ To see the resource types for a resource provider, use: ```azurecli-interactive az provider show --namespace Microsoft.Batch --query "resourceTypes[*].resourceType" --out table ```
-Which returns:
+The command returns:
```output Result
To get the available API versions for a resource type, use:
az provider show --namespace Microsoft.Batch --query "resourceTypes[?resourceType=='batchAccounts'].apiVersions | [0]" --out table ```
-Which returns:
+The command returns:
```output Result
To get the supported locations for a resource type, use.
az provider show --namespace Microsoft.Batch --query "resourceTypes[?resourceType=='batchAccounts'].locations | [0]" --out table ```
-Which returns:
+The command returns:
```output Result
azure-sql Private Endpoint Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/private-endpoint-overview.md
select client_net_address from sys.dm_exec_connections
where session_id=@@SPID ````
+## Limitations
+Connections to a private endpoint only support **Proxy** as the [connection policy](connectivity-architecture.md#connection-policy).
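To verify or set the policy, a sketch using the Azure CLI, assuming the `az sql server conn-policy` command group; names are placeholders.

```azurecli
# Show the current connection policy (Default, Proxy, or Redirect).
az sql server conn-policy show --resource-group <resource-group> --server <server-name>

# Explicitly set Proxy if needed.
az sql server conn-policy update --resource-group <resource-group> --server <server-name> --connection-type Proxy
```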
++ ## On-premises connectivity over private peering When customers connect to the public endpoint from on-premises machines, their IP address needs to be added to the IP-based firewall using a [Server-level firewall rule](firewall-create-server-level-portal-quickstart.md). While this model works well for allowing access to individual machines for dev or test workloads, it's difficult to manage in a production environment.
azure-video-analyzer Deploy Iot Edge Linux On Windows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-video-analyzer/video-analyzer-docs/deploy-iot-edge-linux-on-windows.md
Title: Deploy to an IoT Edge for Linux on Windows - Azure description: This article provides guidance on how to deploy to an IoT Edge for Linux on Windows device. Previously updated : 06/01/2021 Last updated : 08/25/2021
The following depicts the overall flow of the document and in 5 simple steps you
1. [Install EFLOW](../../iot-edge/how-to-install-iot-edge-on-windows.md) on your Windows device using PowerShell.
+ > [!NOTE]
+ > There are two ways to deploy EFLOW (PowerShell and Windows Admin Center) and two ways to provision the virtual machine (manual provisioning using the connection string and manual provisioning using X.509 certificates). Please follow the [PowerShell deployment](../../iot-edge/how-to-install-iot-edge-on-windows.md#create-a-new-deployment) and [provision the machine using the connection string from the IoT Hub](../../iot-edge/how-to-install-iot-edge-on-windows.md#manual-provisioning-using-the-connection-string).
1. Once EFLOW is set up, type the command `Connect-EflowVm` into PowerShell (with administrative privilege) to connect. This will bring up a bash terminal within PowerShell to control the EFLOW VM, where you can run Linux commands including utilities like Top and Nano.
The following depicts the overall flow of the document and in 5 simple steps you
`bash -c "$(curl -sL https://aka.ms/ava-edge/prep_device)"`
- Azure Video Analyzer module runs on the edge device with non-privileged local user accounts. Additionally, it needs certain local folders for storing application configuration data. Finally, for this how-to guide we are leveraging a [RTSP simulator](https://github.com/Azure/video-analyzer/tree/main/edge-modules/sources/rtspsim-live555) that relays a video feed in real time to AVA module for analysis. This simulator takes as input pre-recorded video files from an input directory.
+   The Azure Video Analyzer module needs certain local folders for storing application configuration data. In addition, for this how-to guide we are leveraging an [RTSP simulator](https://github.com/Azure/video-analyzer/tree/main/edge-modules/sources/rtspsim-live555) that relays a video feed in real time to the AVA module for analysis. This simulator takes pre-recorded video files from an input directory as input.
The prep-device script used above automates these tasks away, so you can run one command and have all relevant input and configuration folders, video input files, and user accounts with privileges created seamlessly. Once the command finishes successfully, you should see the following folders created on your edge device.
The following depicts the overall flow of the document and in 5 simple steps you
* `/var/media` Note the video files (*.mkv) in the /home/localedgeuser/samples/input folder, which serve as input files to be analyzed.
+
1. Now that you have the edge device set up, registered to the hub, and running successfully with the correct folder structures created, the next step is to set up the following additional Azure resources and deploy the AVA module. The following deployment template will take care of the resource creation: [![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://aka.ms/ava-click-to-deploy)
The following depicts the overall flow of the document and in 5 simple steps you
You should see the following four modules deployed and running on your edge device. Please note that the resource creation script deploys the AVA module along with IoT Edge modules (edgeAgent and edgeHub) and an RTSP simulator module to provide the simulated RTSP video feed. ![Deployed Modules](./media/vscode-common-screenshots/avaedge-module.png)
+
1. With the modules deployed and set up, you are ready to run your first AVA pipeline on EFLOW. You can run a simple motion detection pipeline as below and visualize the results by executing the following steps: ![Video Analyzer based on motion detection](./media/get-started-detect-motion-emit-events/motion-detection.svg)
The following depicts the overall flow of the document and in 5 simple steps you
> [!IMPORTANT] > Undeleted resources can still be active and incur Azure costs. Please ensure that you delete the resources you do not intend to use.
-
+
## Next steps
-* Try motion detection along with recording relevant videos in the cloud. Follow the steps from the [detect motion and record video clips](detect-motion-record-video-edge-devices.md) quickstart.
-* Run [AI on Live Video](analyze-live-video-use-your-model-http.md#overview) (you can skip the prerequisite setup as it has already been done above)
+* Try motion detection along with recording relevant videos in the cloud. Follow the steps from the [detect motion and record video clips](detect-motion-record-video-edge-devices.md) quickstart.
* Use our [VS Code extension](https://marketplace.visualstudio.com/items?itemName=ms-azuretools.live-video-analytics-edge) to view additional pipelines. * Use an [IP camera](https://en.wikipedia.org/wiki/IP_camera) that supports RTSP instead of using the RTSP simulator. You can find IP cameras that support RTSP on the [ONVIF conformant products](https://www.onvif.org/conformant-products/) page. Look for devices that conform with profiles G, S, or T.
+* Run [AI on Live Video](analyze-live-video-use-your-model-http.md#overview) (you can skip the prerequisite setup as it has already been done above).
+
+ > [!WARNING]
+ > If you want to run memory-intensive AI models like YOLO, you may have to increase the resources allotted to the EFLOW VM. First, exit the EFLOW VM and return to the Windows PowerShell terminal by typing `exit`. Then, run the `Set-EflowVM` command in an elevated PowerShell session. After running the command, enter your desired [parameters](../../iot-edge/reference-iot-edge-for-linux-on-windows-functions.md#set-eflowvm) by following the prompts in PowerShell, for example `cpuCount: 2`, `memoryInMB: 2048`. After a few minutes, redeploy the Edge module(s) and reactivate the live pipeline to view inferences. If you encounter connection issues (for example, error 137 or 255 reported in IoT Hub), you may have to rerun this step.
azure-video-analyzer Player Widget https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-video-analyzer/video-analyzer-docs/player-widget.md
The application builds and runs. After it builds, it creates a self-signed certi
- `Issuer`, `Audience`, `Key Type`, `Algorithm`, `Key Id`, `RSA Key Modulus`, `RSA Key Exponent`, `Token`
-Be sure to copy these values for later use.
+**Be sure to copy and save these values for later use.**
## Create an access policy
azure-vmware Concepts Private Clouds Clusters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/concepts-private-clouds-clusters.md
Title: Concepts - Private clouds and clusters description: Learn about the key capabilities of Azure VMware Solution software-defined data centers and vSphere clusters. Previously updated : 05/13/2021 Last updated : 08/25/2021 # Azure VMware Solution private cloud and cluster concepts
azure-vmware Create Placement Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/create-placement-policy.md
Additionally, you can monitor various DRS rule operations, such as recommendatio
## FAQs ### Are these the same as DRS affinity rules?
-Yes, and no. While vSphere DRS implements the current set of policies, we have simplified the experience. Tweaking VM groups and Host groups are a cumbersome operation, especially as hosts are ephemeral in nature and could be replaced in a cloud environment. As hosts are replaced in the vSphere inventory in an on-premises environment, the vSphere admin must modify the host group to ensure that the desired VM-Host placement constraints continue to stay in effect. Placement policies in Azure VMware Solution update the Host groups when a host is rotated or changed. Similarly, if you scale in a cluster, the Host Group is automatically updated, as applicable. This eliminates the overhead of managing the Host Groups for the customer.
+Yes, and no. While vSphere DRS implements the current set of policies, we have simplified the experience. Modifying VM groups and Host groups is a cumbersome operation, especially as hosts are ephemeral in nature and could be replaced in a cloud environment. As hosts are replaced in the vSphere inventory in an on-premises environment, the vSphere admin must modify the host group to ensure that the desired VM-Host placement constraints remain in effect. Placement policies in Azure VMware Solution update the Host groups when a host is rotated or changed. Similarly, if you scale in a cluster, the Host Group is automatically updated, as applicable. This eliminates the overhead of managing the Host Groups for the customer.
### As this is an existing functionality available in vCenter, why can't I use it directly? Azure VMware Solution provides a VMware private cloud in Azure. In this managed VMware infrastructure, Microsoft manages the clusters, hosts, datastores, and distributed virtual switches in the private cloud. At the same time, the tenant is responsible for managing the workloads deployed on the private cloud. As a result, the tenant administering the private cloud [does not have the same set of privileges](concepts-identity.md) as available to the VMware administrator in an on-premises deployment.
-Further, the lack of the desired granularity in the vSphere privileges presents some challenges when managing the placement of the workloads on the private cloud. For example, vSphere DRS rules commonly used on-premises to define affinity and anti-affinity rules cannot be used as-is in a VMware Cloud environment, as some of those rules can block day-to-day operation the private cloud. Placement Policies offers you a way to define those rules via the AVS portal, thereby circumventing the need to use DRS rules. Coupled with a simplified experience, they also ensure that the rules do not impact the day-to-day infrastructure maintenance and operation activities.
+Further, the lack of the desired granularity in the vSphere privileges presents some challenges when managing the placement of the workloads on the private cloud. For example, vSphere DRS rules commonly used on-premises to define affinity and anti-affinity rules can't be used as-is in a VMware Cloud environment, as some of those rules can block day-to-day operation of the private cloud. Placement Policies provides a way to define those rules using the Azure VMware Solution portal, thereby circumventing the need to use DRS rules. Coupled with a simplified experience, they also ensure that the rules don't impact the day-to-day infrastructure maintenance and operation activities.
-### What caveats should I be aware of?
+### What caveats should I know about?
-The VM-Host MUST rules block maintenance operations and are not supported by Placement Policies.
+The VM-Host **MUST** rules aren't supported because they block maintenance operations.
-VM-Host SHOULD rules are preferential rules, where vSphere DRS tries to accommodate the rules to the extent possible. vSphere DRS may vMotion virtual machines subjected to the VM-Host SHOULD rules occasionally to ensure that the workloads get the resources they need. This is a standard vSphere DRS behavior, and the Placement Policies feature does not change the underlying vSphere DRS behavior.
+VM-Host **SHOULD** rules are preferential rules, where vSphere DRS tries to accommodate the rules to the extent possible. Occasionally, vSphere DRS may vMotion VMs subjected to the VM-Host **SHOULD** rules to ensure that the workloads get the resources they need. It's a standard vSphere DRS behavior, and the Placement policies feature does not change the underlying vSphere DRS behavior.
-If you create conflicting rules, those conflicts may show up on the vCenter, and the newly defined rules may not take effect. This is a standard vSphere DRS behavior, the logs for which can be observed in the vCenter.
+If you create conflicting rules, those conflicts may show up in vCenter, and the newly defined rules may not take effect. This is standard vSphere DRS behavior; the corresponding logs can be viewed in vCenter.
azure-web-pubsub Tutorial Serverless Notification https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-web-pubsub/tutorial-serverless-notification.md
+
+ Title: Tutorial - Create a serverless notification app using Azure Web PubSub service and Azure Functions
+description: A tutorial to walk through how to use Azure Web PubSub service and Azure Functions to build a serverless notification application.
++++ Last updated : 08/24/2021++
+# Tutorial: Create a serverless notification app with Azure Functions and Azure Web PubSub service
+
+The Azure Web PubSub service helps you build real-time messaging web applications using WebSockets. Azure Functions is a serverless platform that lets you run your code without managing any infrastructure. In this tutorial, you learn how to use Azure Web PubSub service and Azure Functions to build a serverless application with real-time messaging for notification scenarios.
+
+In this tutorial, you learn how to:
+
+> [!div class="checklist"]
+> * Build a serverless notification app
+> * Work with Web PubSub function input and output bindings
+> * Run the sample functions locally
+> * Deploy the function to Azure Function App
+
+## Prerequisites
+
+# [JavaScript](#tab/javascript)
+
+* A code editor, such as [Visual Studio Code](https://code.visualstudio.com/)
+
+* [Node.js](https://nodejs.org/en/download/), version 10.x.
+ > [!NOTE]
+ > For more information about the supported versions of Node.js, see [Azure Functions runtime versions documentation](../azure-functions/functions-versions.md#languages).
+
+* [Azure Functions Core Tools](https://github.com/Azure/azure-functions-core-tools#installing) (v3 or higher preferred) to run Azure Function apps locally and deploy to Azure.
+
+* [Azure command-line interface (Azure CLI)](/cli/azure) to manage Azure resources.
+
+# [C#](#tab/csharp)
+
+* A code editor, such as [Visual Studio Code](https://code.visualstudio.com/).
+
+* [Azure Functions Core Tools](https://github.com/Azure/azure-functions-core-tools#installing) (v3 or higher preferred) to run Azure Function apps locally and deploy to Azure.
+
+* [Azure command-line interface (Azure CLI)](/cli/azure) to manage Azure resources.
+++++
+## Create and run the functions locally
+
+1. Make sure you have [Azure Functions Core Tools](https://github.com/Azure/azure-functions-core-tools#installing) installed. Then create an empty directory for the project, and run the following command from that working directory.
+
+ # [JavaScript](#tab/javascript)
+ ```bash
+ func init --worker-runtime javascript
+ ```
+
+ # [C#](#tab/csharp)
+ ```bash
+ func init --worker-runtime dotnet
+ ```
+
+1. Install the `Microsoft.Azure.WebJobs.Extensions.WebPubSub` function extension package explicitly.
+
+    a. Remove the `extensionBundle` section from `host.json` so that you can install a specific extension package in the next step. Alternatively, make `host.json` as simple as shown below.
+ ```json
+ {
+ "version": "2.0"
+ }
+ ```
+    b. Run the following command to install the specific function extension package.
+ ```bash
+ func extensions install --package Microsoft.Azure.WebJobs.Extensions.WebPubSub --version 1.0.0-beta.3
+ ```
+
+1. Create an `index` function to read and host a static web page for clients.
+ ```bash
+ func new -n index -t HttpTrigger
+ ```
+ # [JavaScript](#tab/javascript)
+    - Update `index/function.json` and copy the following JSON code.
+ ```json
+ {
+ "bindings": [
+ {
+ "authLevel": "anonymous",
+ "type": "httpTrigger",
+ "direction": "in",
+ "name": "req",
+ "methods": [
+ "get",
+ "post"
+ ]
+ },
+ {
+ "type": "http",
+ "direction": "out",
+ "name": "res"
+ }
+ ]
+ }
+ ```
+    - Update `index/index.js` and copy the following code.
+ ```js
+ var fs = require('fs');
+ module.exports = function (context, req) {
+    fs.readFile('index.html', 'utf8', function (err, data) {
+ if (err) {
+ console.log(err);
+ context.done(err);
+ }
+ context.res = {
+ status: 200,
+ headers: {
+ 'Content-Type': 'text/html'
+ },
+ body: data
+ };
+ context.done();
+ });
+ }
+ ```
+
+ # [C#](#tab/csharp)
+    - Update `index.cs` and replace the `Run` function with the following code.
+ ```c#
+ [FunctionName("index")]
+ public static IActionResult Run([HttpTrigger(AuthorizationLevel.Anonymous)] HttpRequest req)
+ {
+ return new ContentResult
+ {
+        Content = File.ReadAllText("index.html"),
+ ContentType = "text/html",
+ };
+ }
+ ```
+
+2. Create a `negotiate` function to help clients get the service connection URL with an access token.
+ ```bash
+ func new -n negotiate -t HttpTrigger
+ ```
+ # [JavaScript](#tab/javascript)
+    - Update `negotiate/function.json` and copy the following JSON code.
+ ```json
+ {
+ "bindings": [
+ {
+ "authLevel": "anonymous",
+ "type": "httpTrigger",
+ "direction": "in",
+ "name": "req"
+ },
+ {
+ "type": "http",
+ "direction": "out",
+ "name": "res"
+ },
+ {
+ "type": "webPubSubConnection",
+ "name": "connection",
+ "hub": "notification",
+ "direction": "in"
+ }
+ ]
+ }
+ ```
+    - Update `negotiate/index.js` and copy the following code.
+ ```js
+ module.exports = function (context, req, connection) {
+ context.res = { body: connection };
+ context.done();
+ };
+ ```
+ # [C#](#tab/csharp)
+    - Update `negotiate.cs` and replace the `Run` function with the following code.
+ ```c#
+ [FunctionName("negotiate")]
+ public static WebPubSubConnection Run(
+ [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
+ [WebPubSubConnection(Hub = "notification")] WebPubSubConnection connection,
+ ILogger log)
+ {
+ log.LogInformation("Connecting...");
+
+ return connection;
+ }
+ ```
+
+3. Create a `notification` function to generate notifications with `TimerTrigger`.
+ ```bash
+ func new -n notification -t TimerTrigger
+ ```
+ # [JavaScript](#tab/javascript)
+    - Update `notification/function.json` and copy the following JSON code.
+ ```json
+ {
+ "bindings": [
+ {
+ "name": "myTimer",
+ "type": "timerTrigger",
+ "direction": "in",
+ "schedule": "*/10 * * * * *"
+ },
+ {
+ "type": "webPubSub",
+ "name": "webPubSubOperation",
+ "hub": "notification",
+ "direction": "out"
+ }
+ ]
+ }
+ ```
+    - Update `notification/index.js` and copy the following code.
+ ```js
+ module.exports = function (context, myTimer) {
+ context.bindings.webPubSubOperation = {
+ "operationKind": "sendToAll",
+ "message": `[DateTime: ${new Date()}] Temperature: ${getValue(22, 1)}\xB0C, Humidity: ${getValue(40, 2)}%`,
+ "dataType": "text"
+ }
+ context.done();
+ };
+
+ function getValue(baseNum, floatNum) {
+ return (baseNum + 2 * floatNum * (Math.random() - 0.5)).toFixed(3);
+ }
+ ```
+ # [C#](#tab/csharp)
+    - Update `notification.cs` and replace the `Run` function with the following code.
+ ```c#
+ [FunctionName("notification")]
+ public static async Task Run([TimerTrigger("*/10 * * * * *")]TimerInfo myTimer, ILogger log,
+ [WebPubSub(Hub = "notification")] IAsyncCollector<WebPubSubOperation> operations)
+ {
+ await operations.AddAsync(new SendToAll
+ {
+ Message = BinaryData.FromString($"[DateTime: {DateTime.Now}] Temperature: {GetValue(23, 1)}{'\xB0'}C, Humidity: {GetValue(40, 2)}%"),
+ DataType = MessageDataType.Text
+ });
+ }
+
+ private static string GetValue(double baseNum, double floatNum)
+ {
+ var rng = new Random();
+ var value = baseNum + floatNum * 2 * (rng.NextDouble() - 0.5);
+ return value.ToString("0.000");
+ }
+ ```
+
+4. Add the client single page `index.html` in the project root folder and copy the content below.
+ ```html
+ <html>
+ <body>
+ <h1>Azure Web PubSub Notification</h1>
+ <div id="messages"></div>
+ <script>
+ (async function () {
+ let messages = document.querySelector('#messages');
+ let res = await fetch(`${window.location.origin}/api/negotiate`);
+ let url = await res.json();
+ let ws = new WebSocket(url.url);
+ ws.onopen = () => console.log('connected');
+
+ ws.onmessage = event => {
+ let m = document.createElement('p');
+ m.innerText = event.data;
+ messages.appendChild(m);
+ };
+ })();
+ </script>
+ </body>
+ </html>
+ ```
+
+ # [JavaScript](#tab/javascript)
+
+ # [C#](#tab/csharp)
+    Since the C# project compiles files to a different output folder, you need to update your `*.csproj` so that the content page is copied to the output folder.
+ ```xml
+ <ItemGroup>
+      <None Update="index.html">
+ <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
+ </None>
+ </ItemGroup>
+ ```
+
+5. Configure and run the Azure Function app
+
+ - In the browser, open the **Azure portal** and confirm the Web PubSub Service instance you deployed earlier was successfully created. Navigate to the instance.
+ - Select **Keys** and copy out the connection string.
+
+ :::image type="content" source="media/quickstart-serverless/copy-connection-string.png" alt-text="Screenshot of copying the Web PubSub connection string.":::
+
+    Run the command below in the function folder to set the service connection string. Replace `<connection-string>` with your value as needed.
+
+ ```bash
+ func settings add WebPubSubConnectionString "<connection-string>"
+ ```
+
+ > [!NOTE]
+    > `TimerTrigger` used in the sample has a dependency on Azure Storage, but you can use the local storage emulator when the function is running locally. If you get an error like `There was an error performing a read operation on the Blob Storage Secret Repository. Please ensure the 'AzureWebJobsStorage' connection string is valid.`, you need to download and enable the [Storage Emulator](../storage/common/storage-use-emulator.md), as shown in the sketch below.
+
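+    If you use the emulator, here's a minimal sketch of pointing the Functions host at it, assuming Azurite or the Storage Emulator is running with its standard development connection string:
+
+    ```bash
+    # Local-only setting for the emulator; use a real storage account connection string otherwise.
+    func settings add AzureWebJobsStorage "UseDevelopmentStorage=true"
+    ```
+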
+    Now you can run your local function with the command below.
+
+ ```bash
+ func start
+ ```
+
+    Check the running logs, and then visit your local static page at `http://localhost:7071/api/index`.
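+
+    If you prefer the command line, a quick check of the `negotiate` endpoint might look like the following sketch, assuming the host is listening on the default local port:
+
+    ```bash
+    # Returns the Web PubSub client connection URL as JSON.
+    curl -s http://localhost:7071/api/negotiate
+    ```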
+
+## Deploy Function App to Azure
+
+Before you can deploy your function code to Azure, you need to create three resources:
+* A resource group, which is a logical container for related resources.
+* A storage account, which is used to maintain state and other information about your functions.
+* A function app, which provides the environment for executing your function code. A function app maps to your local function project and lets you group functions as a logical unit for easier management, deployment and sharing of resources.
+
+Use the following commands to create these items.
+
+1. If you haven't done so already, sign in to Azure:
+
+ ```bash
+ az login
+ ```
+
+1. Create a resource group, or skip this step by reusing the resource group of your Azure Web PubSub service:
+
+ ```bash
+ az group create -n WebPubSubFunction -l <REGION>
+ ```
+
+1. Create a general-purpose storage account in your resource group and region:
+
+ ```bash
+ az storage account create -n <STORAGE_NAME> -l <REGION> -g WebPubSubFunction
+ ```
+
+1. Create the function app in Azure:
+
+ # [JavaScript](#tab/javascript)
+
+ ```bash
+    az functionapp create --resource-group WebPubSubFunction --consumption-plan-location <REGION> --runtime node --runtime-version 12 --functions-version 3 --name <FUNCTIONAPP_NAME> --storage-account <STORAGE_NAME>
+ ```
+
+ # [C#](#tab/csharp)
+
+ ```bash
+    az functionapp create --resource-group WebPubSubFunction --consumption-plan-location <REGION> --runtime dotnet --functions-version 3 --name <FUNCTIONAPP_NAME> --storage-account <STORAGE_NAME>
+ ```
+
+1. Deploy the function project to Azure:
+
+ After you've successfully created your function app in Azure, you're now ready to deploy your local functions project by using the [func azure functionapp publish](/azure-functions/functions-run-local) command.
+
+ ```bash
+    func azure functionapp publish <FUNCTIONAPP_NAME> --publish-local-settings
+ ```
+
+ > [!NOTE]
+    > Here we deploy the local settings in `local.settings.json` together with the command parameter `--publish-local-settings`. If you're using the Microsoft Azure Storage Emulator, you can type `no` to skip overwriting this value in Azure when you see the prompt `App setting AzureWebJobsStorage is different between azure and local.settings.json, Would you like to overwrite value in azure? [yes/no/show]`. You can also update Function App settings in the **Azure portal** -> **Settings** -> **Configuration**.
+
+1. Now you can check your site in the Azure Function App by navigating to `https://<FUNCTIONAPP_NAME>.azurewebsites.net/api/index`.
+
+## Clean up resources
+
+If you're not going to continue to use this app, delete all resources created by this doc with the following steps so you don't incur any charges:
+
+1. In the Azure portal, select **Resource groups** on the far left, and then select the resource group you created. You may use the search box to find the resource group by its name instead.
+
+1. In the window that opens, select the resource group, and then select **Delete resource group**.
+
+1. In the new window, type the name of the resource group to delete, and then select **Delete**.
+
+## Next steps
+
+In this tutorial, you learned how to run a serverless notification application. Now you can start to build your own application.
+
+> [!div class="nextstepaction"]
+> [Tutorial: Create a simple chatroom with Azure Web PubSub](https://azure.github.io/azure-webpubsub/getting-started/create-a-chat-app/js-handle-events)
+
+> [!div class="nextstepaction"]
+> [Azure Web PubSub bindings for Azure Functions](https://azure.github.io/azure-webpubsub/references/functions-bindings)
+
+> [!div class="nextstepaction"]
+> [Explore more Azure Web PubSub samples](https://github.com/Azure/azure-webpubsub/tree/main/samples)
backup Backup Azure Arm Userestapi Restoreazurevms https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-arm-userestapi-restoreazurevms.md
Title: Restore Azure VMs using REST API description: In this article, learn how to manage restore operations of Azure Virtual Machine Backup using REST API. Previously updated : 09/12/2018 Last updated : 08/26/2021 ms.assetid: b8487516-7ac5-4435-9680-674d9ecf5642
Once the recovery point is obtained, we need to construct the request body for t
The triggering of any restore operation is an [asynchronous operation](../azure-resource-manager/management/async-operations.md). It means this operation creates another operation that needs to be tracked separately.
-It returns two responses: 202 (Accepted) when another operation is created and then 200 (OK) when that operation completes.
+It returns two responses: 202 (Accepted) when another operation is created, and then 200 (OK) when that operation completes.
|Name |Type |Description | ||||
Once you track the response as explained [above](#responses), and the long runni
### Replace disks in a backed-up virtual machine
-While restore disks creates disks from the recovery point, replace disks replaces the current disks of the backed up VM with the disks from the recovery point. As explained [above](#restore-operations), the relevant request body for replacing disks is provided below.
+While restore disks creates disks from the recovery point, replace disks replaces the current disks of the backed-up VM with the disks from the recovery point. As explained [above](#restore-operations), the relevant request body for replacing disks is provided below.
#### Create request body
As explained [above](#restore-operations), the following request body defines pr
The response should be handled in the same way as [explained above for restoring disks](#responses).
+## Cross Region Restore
+
+If Cross Region Restore (CRR) is enabled on the vault with which you've protected your VMs, the backup data is replicated to the secondary region. You can use the backup data to perform a restore operation. To trigger a restore operation in the secondary region using REST API, follow these steps:
+
+### Select recovery point in secondary region
+
+You can list the available recovery points of a backup item in the secondary region using the [list recovery points REST API for CRR](/rest/api/backup/recovery-points-crr/list). It's a simple GET operation with all the relevant values.
+
+```http
+GET https://management.azure.com/Subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.RecoveryServices/vaults/{vaultName}/backupFabrics/{fabricName}/protectionContainers/{containerName}/protectedItems/{protectedItemName}/recoveryPoints?api-version=2018-12-20
+
+```
+
+The `{containerName}` and `{protectedItemName}` are as constructed [here](backup-azure-arm-userestapi-backupazurevms.md#example-responses-to-get-operation). `{fabricName}` is "Azure".
+
+The *GET* URI has all the required parameters. An additional request body isn't required.
+
+>[!NOTE]
+>For getting recovery points in the secondary region, use API version 2018-12-20 as in the above example.
+
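+For illustration, here's a sketch of the same *GET* call from the command line, reusing the placeholder resource names from the examples in this article and assuming you're signed in to the Azure CLI:
+
+```bash
+# Placeholder subscription, vault, and VM names; replace them with your own values.
+TOKEN=$(az account get-access-token --query accessToken -o tsv)
+curl -s -H "Authorization: Bearer $TOKEN" \
+  "https://management.azure.com/Subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/testVaultRG/providers/Microsoft.RecoveryServices/vaults/testVault/backupFabrics/Azure/protectionContainers/IaasVMContainer;iaasvmcontainerv2;testRG1;testVM/protectedItems/VM;iaasvmcontainerv2;testRG1;testVM/recoveryPoints?api-version=2018-12-20"
+```
+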
+#### Responses
+
+|Name |Type |Description |
+||||
+|200 OK | [RecoveryPointResourceList](/rest/api/backup/recovery-points-crr/list#recoverypointresourcelist) | OK |
+
+#### Example response
+
+Once the *GET* URI is submitted, a 200 (OK) response is returned.
+
+```http
+Headers:
+Pragma : no-cache
+X-Content-Type-Options : nosniff
+x-ms-request-id : bfc4a4e6-c585-46e0-8e38-f11a86093701
+x-ms-client-request-id : 4344a9c2-70d8-482d-b200-0ca9cc498812,4344a9c2-70d8-482d-b200-0ca9cc498812
+Strict-Transport-Security : max-age=31536000; includeSubDomains
+x-ms-ratelimit-remaining-subscription-resource-requests: 149
+x-ms-correlation-request-id : bfc4a4e6-c585-46e0-8e38-f11a86093701
+x-ms-routing-request-id : SOUTHINDIA:20210731T112441Z:bfc4a4e6-c585-46e0-8e38-f11a86093701
+Cache-Control : no-cache
+Date : Sat, 31 Jul 2021 11:24:40 GMT
+Server : Microsoft-IIS/10.0
+X-Powered-By : ASP.NET
+
+Body:
+{
+ "value": [
+ {
+ "id":
+"/Subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/testVaultRG/providers/Microsoft.RecoveryServices/vaults/testVault/backupFabrics/Azure/protectionContainers/IaasVMContainer;iaasvmcontainerv2;testRG1;testVM/protectedItems/VM;iaasvmcontainerv2;testRG1;testVM/recoveryPoints/932895704780058094",
+ "name": "932895704780058094",
+ "type": "Microsoft.RecoveryServices/vaults/backupFabrics/protectionContainers/protectedItems/recoveryPoints",
+ "properties": {
+ "objectType": "IaasVMRecoveryPoint",
+ "recoveryPointType": "CrashConsistent",
+ "recoveryPointTime": "2021-07-31T09:24:34.687561Z",
+ "recoveryPointAdditionalInfo": "",
+ "sourceVMStorageType": "PremiumVMOnPartialPremiumStorage",
+ "isSourceVMEncrypted": false,
+ "isInstantIlrSessionActive": false,
+ "recoveryPointTierDetails": [
+ {
+ "type": 1,
+ "status": 1
+ }
+ ],
+ "isManagedVirtualMachine": true,
+ "virtualMachineSize": "Standard_D2s_v3",
+ "originalStorageAccountOption": false,
+ "osType": "Windows"
+ }
+ },
+ {
+ "id":
+"/Subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/testVaultRG/providers/Microsoft.RecoveryServices/vaults/testVault/backupFabrics/Azure/protectionContainers/IaasVMContainer;iaasvmcontainerv2;testRG1;testVM/protectedItems/VM;iaasvmcontainerv2;testRG1;testVM/recoveryPoints/932891484644960954",
+ "name": "932891484644960954",
+ "type": "Microsoft.RecoveryServices/vaults/backupFabrics/protectionContainers/protectedItems/recoveryPoints",
+ "properties": {
+ "objectType": "IaasVMRecoveryPoint",
+ "recoveryPointType": "CrashConsistent",
+ "recoveryPointTime": "2021-07-30T09:20:01.8355052Z",
+ "recoveryPointAdditionalInfo": "",
+ "sourceVMStorageType": "PremiumVMOnPartialPremiumStorage",
+ "isSourceVMEncrypted": false,
+ "isInstantIlrSessionActive": false,
+ "recoveryPointTierDetails": [
+ {
+ "type": 1,
+ "status": 1
+ },
+ {
+ "type": 2,
+ "status": 1
+ }
+ ],
+ "isManagedVirtualMachine": true,
+ "virtualMachineSize": "Standard_D2s_v3",
+ "originalStorageAccountOption": false,
+ "osType": "Windows"
+ }
+ },
+.....
+```
+
+The recovery point is identified with the `{name}` field in the above response.
+
+### Get access token
+
+To perform cross-region restore, you'll require an access token to authorize your request to access replicated restore points in the secondary region. To get an access token, follow these steps:
+
+#### Step 1:
+
+Use the [AAD Properties API](/rest/api/backup/aad-properties/get) to get AAD properties for the secondary region (*westus* in the below example):
+
+```http
+GET https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.RecoveryServices/locations/westus/backupAadProperties?api-version=2018-12-20
+```
+
+##### Response example
+
+The response is returned in the following format:
+
+```json
+{
+ "properties": {
+ "tenantId": "00000000-0000-0000-0000-000000000000",
+ "audience": "https://RecoveryServices/IaasCoord/aadmgmt/wus",
+ "servicePrincipalObjectId": "00000000-0000-0000-0000-000000000000"
+ }
+}
+```
+
+#### Step 2:
+
+Use the [Get Access Token API](/rest/api/backup/recovery-points-get-access-token-for-crr/get-access-token) to authorize your request to access replicated restore points in the secondary region:
+
+```http
+POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.RecoveryServices/vaults/{vaultName}/backupFabrics/{fabricName}/protectionContainers/{containerName}/protectedItems/{protectedItemName}/recoveryPoints/{recoveryPointId}/accessToken?api-version=2018-12-20
+```
+
+For the request body, paste the contents of the response returned by the AAD Properties API in the previous step.
+
+```json
+{
+ "properties": {
+ "tenantId": "00000000-0000-0000-0000-000000000000",
+ "audience": "https://RecoveryServices/IaasCoord/aadmgmt/wus",
+ "servicePrincipalObjectId": "00000000-0000-0000-0000-000000000000"
+ }
+}
+```
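+
+A sketch of sending this request with curl, assuming the AAD properties JSON above is saved to a local file (the file name is hypothetical):
+
+```bash
+# Placeholder IDs taken from the examples in this article; replace them with your own values.
+TOKEN=$(az account get-access-token --query accessToken -o tsv)
+curl -s -X POST \
+  -H "Authorization: Bearer $TOKEN" \
+  -H "Content-Type: application/json" \
+  -d @aad-properties.json \
+  "https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/testVaultRG/providers/Microsoft.RecoveryServices/vaults/testVault/backupFabrics/Azure/protectionContainers/IaasVMContainer;iaasvmcontainerv2;testRG1;testVM/protectedItems/VM;iaasvmcontainerv2;testRG1;testVM/recoveryPoints/932879774590051503/accessToken?api-version=2018-12-20"
+```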
+
+##### Response example
+
+The response is returned in the following format:
+
+```json
+{
+ "id": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/testVaultRG/providers/Microsoft.RecoveryServices/vaults/testVault/backupFabrics/Azure/protectionContainers/IaasVMContainer;iaasvmcontainerv2;testRG1;testVM/protectedItems/VM;iaasvmcontainerv2;testRG1;testVM/recoveryPoints/26083826328862",
+ "name": "932879774590051503",
+ "type": "Microsoft.RecoveryServices/vaults/backupFabrics/protectionContainers/protectedItems/recoveryPoints",
+ "properties": {
+ "objectType": "CrrAccessToken",
+ "accessTokenString": "<access-token-string>",
+ "subscriptionId": "00000000-0000-0000-0000-000000000000",
+ "resourceGroupName": "testVaultRG",
+ "resourceName": "testVault",
+ "resourceId": "000000000000000000",
+ "protectionContainerId": 000000,
+ "recoveryPointId": "932879774590051503",
+ "recoveryPointTime": "7/26/2021 3:35:36 PM",
+ "containerName": "iaasvmcontainerv2;testRG1;testVM",
+ "containerType": "IaasVMContainer",
+ "backupManagementType": "AzureIaasVM",
+ "datasourceType": "VM",
+ "datasourceName": "testvm1234",
+ "datasourceId": "000000000000000000",
+ "datasourceContainerName": "iaasvmcontainerv2;testRG1;testVM",
+ "coordinatorServiceStampUri": "https://pod01-coord1.eus.backup.windowsazure.com",
+ "protectionServiceStampId": "00000000-0000-0000-0000-000000000000",
+ "protectionServiceStampUri": "https://pod01-prot1h-int.eus.backup.windowsazure.com",
+ "tokenExtendedInformation": "<IaaSVMRecoveryPointMetadataBase xmlns:i=\"http://www.w3.org/2001/XMLSchema-instance\" i:type=\"IaaSVMRecoveryPointMetadata_V2015_09\" xmlns=\"http://windowscloudbackup.com/CloudCommon/V2011_09\"><MetadataVersion>V2015_09</MetadataVersion><ContainerType i:nil=\"true\" /><InstantRpGCId>ef4ab5a7-c2c0-4304-af80-af49f48af3d1;AzureBackup_testvm1234_932843259176972511;AzureBackup_20210726_033536;AzureBackupRG_eastus_1</InstantRpGCId><IsBlockBlobEnabled>true</IsBlockBlobEnabled><IsManagedVirtualMachine>true</IsManagedVirtualMachine><OriginalSAOption>false</OriginalSAOption><OsType>Windows</OsType><ReadMetadaFromConfigBlob i:nil=\"true\" /><RecoveryPointConsistencyType>CrashConsistent</RecoveryPointConsistencyType><RpDiskDetails i:nil=\"true\" /><SourceIaaSVMRPKeyAndSecret i:nil=\"true\" /><SourceIaaSVMStorageType>PremiumVMOnPartialPremiumStorage</SourceIaaSVMStorageType><VMSizeDescription>Standard_D2s_v3</VMSizeDescription><Zones xmlns:d2p1=\"http://schemas.microsoft.com/2003/10/Serialization/Arrays\" i:nil=\"true\" /></IaaSVMRecoveryPointMetadataBase>",
+ "rpTierInformation": {
+ "InstantRP": "Valid",
+ "HardenedRP": "Valid"
+ },
+ "rpOriginalSAOption": false,
+ "rpIsManagedVirtualMachine": true,
+ "rpVMSizeDescription": "Standard_D2s_v3",
+ "bMSActiveRegion": "EastUS"
+ }
+}
+```
+
+### Restore disks to the secondary region
+
+Use the [Cross-Region Restore Trigger API](/rest/api/backup/cross-region-restore/trigger) to restore an item to the secondary region.
+
+```http
+POST https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.RecoveryServices/locations/{azureRegion}/backupCrossRegionRestore?api-version=2018-12-20
+```
+
+The request body should have two parts:
+
+1. ***crossRegionRestoreAccessDetails***: Paste the *properties* block of the response from the Get Access Token API request performed in the previous step to fill this segment of the request body.
+
+1. ***restoreRequest***: To fill the *restoreRequest* segment of the request body, you'll need to pass the recovery point ID obtained earlier, along with the Azure Resource Manager (ARM) ID of the source VM, as well as the details of the storage account in the secondary region to be used as a staging location. To perform disk restore, specify *RestoreDisks* as the recovery type.
+
+The following is a sample request body to restore the disks of a VM to the secondary region:
+
+```json
+ {
+ "crossRegionRestoreAccessDetails": {
+ "objectType": "CrrAccessToken",
+ "accessTokenString": "<access-token-string>",
+ "subscriptionId": "00000000-0000-0000-0000-000000000000",
+ "resourceGroupName": "azurefiles",
+ "resourceName": "azurefilesvault",
+ "resourceId": "000000000000000000",
+ "protectionContainerId": 000000,
+ "recoveryPointId": "932879774590051503",
+ "recoveryPointTime": "7/26/2021 3:35:36 PM",
+ "containerName": "iaasvmcontainerv2;testRG1;testVM",
+ "containerType": "IaasVMContainer",
+ "backupManagementType": "AzureIaasVM",
+ "datasourceType": "VM",
+ "datasourceName": "testvm1234",
+ "datasourceId": "000000000000000000",
+ "datasourceContainerName": "iaasvmcontainerv2;testRG1;testVM",
+ "coordinatorServiceStampUri": "https://pod01-coord1.eus.backup.windowsazure.com",
+ "protectionServiceStampId": "00000000-0000-0000-0000-000000000000",
+ "protectionServiceStampUri": "https://pod01-prot1h-int.eus.backup.windowsazure.com",
+ "tokenExtendedInformation": "<IaaSVMRecoveryPointMetadataBase xmlns:i=\"http://www.w3.org/2001/XMLSchema-instance\" i:type=\"IaaSVMRecoveryPointMetadata_V2015_09\" xmlns=\"http://windowscloudbackup.com/CloudCommon/V2011_09\"><MetadataVersion>V2015_09</MetadataVersion><ContainerType i:nil=\"true\" /><InstantRpGCId>ef4ab5a7-c2c0-4304-af80-af49f48af3d1;AzureBackup_testvm1234_932843259176972511;AzureBackup_20210726_033536;AzureBackupRG_eastus_1</InstantRpGCId><IsBlockBlobEnabled>true</IsBlockBlobEnabled><IsManagedVirtualMachine>true</IsManagedVirtualMachine><OriginalSAOption>false</OriginalSAOption><OsType>Windows</OsType><ReadMetadaFromConfigBlob i:nil=\"true\" /><RecoveryPointConsistencyType>CrashConsistent</RecoveryPointConsistencyType><RpDiskDetails i:nil=\"true\" /><SourceIaaSVMRPKeyAndSecret i:nil=\"true\" /><SourceIaaSVMStorageType>PremiumVMOnPartialPremiumStorage</SourceIaaSVMStorageType><VMSizeDescription>Standard_D2s_v3</VMSizeDescription><Zones xmlns:d2p1=\"http://schemas.microsoft.com/2003/10/Serialization/Arrays\" i:nil=\"true\" /></IaaSVMRecoveryPointMetadataBase>",
+ "rpTierInformation": {
+ "InstantRP": "Valid",
+ "HardenedRP": "Valid"
+ },
+ "rpOriginalSAOption": false,
+ "rpIsManagedVirtualMachine": true,
+ "rpVMSizeDescription": "Standard_D2s_v3",
+ "bMSActiveRegion": "EastUS"
+ },
+ "restoreRequest": {
+ "affinityGroup": "",
+ "createNewCloudService": false,
+ "encryptionDetails": {
+ "encryptionEnabled": false
+ },
+ "objectType": "IaasVMRestoreRequest",
+ "recoveryPointId": "932879774590051503",
+ "recoveryType": "RestoreDisks",
+ "sourceResourceId":"/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/testRG1/providers/Microsoft.Compute/virtualMachines/testVM",
+ "targetResourceGroupId": "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/testRG1",
+ "storageAccountId":"/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/testRG1/providers/Microsoft.Storage/storageAccounts/testStorageAccount",
+ "region": "westus",
+ "originalStorageAccountOption": false,
+ "restoreDiskLunList": []
+ }
+}
+```
+
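+A sketch of submitting this request with the Azure CLI's generic REST helper, assuming the body above is saved to a local file (the file name is hypothetical):
+
+```bash
+# Trigger the restore in the secondary region (westus in this example).
+az rest --method post \
+  --url "https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.RecoveryServices/locations/westus/backupCrossRegionRestore?api-version=2018-12-20" \
+  --body @crr-restore-request.json
+```
+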
+Similar to the primary region restore operation, this is an asynchronous operation and needs to be [tracked separately](/azure/backup/backup-azure-arm-userestapi-restoreazurevms#restore-response).
+++ ## Next steps For more information on the Azure Backup REST APIs, see the following documents:
backup Backup Azure Restore Files From Vm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-restore-files-from-vm.md
If you run the script on a computer with restricted access, ensure there's acces
- `https://pod01-rec2.GEO-NAME.backup.windowsazure.de` (For Azure Germany) or `AzureBackup` service tag in NSG - Public DNS resolution on port 53 (outbound)
+> [!NOTE]
+> Proxies may not support the iSCSI protocol or give access to port 3260. Hence, we strongly recommend running this script on machines that have the direct access required above, and not on machines that redirect traffic to a proxy.
+ > [!NOTE] > > If the backed-up VM is Windows, the geo-name will be mentioned in the generated password.<br><br>
If the file recovery process hangs after you run the file-restore script (for ex
![Registry key changes](media/backup-azure-restore-files-from-vm/iscsi-reg-key-changes.png) ```registry-- HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Disk\TimeOutValue ΓÇô change this from 60 to 1200-- HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e97b-e325-11ce-bfc1-08002be10318}\0003\Parameters\SrbTimeoutDelta ΓÇô change this from 15 to 1200
+- HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Disk\TimeOutValue - change this from 60 to 1200 secs.
+- HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e97b-e325-11ce-bfc1-08002be10318}\0003\Parameters\SrbTimeoutDelta - change this from 15 to 1200 secs.
- HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e97b-e325-11ce-bfc1-08002be10318}\0003\Parameters\EnableNOPOut ΓÇô change this from 0 to 1-- HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e97b-e325-11ce-bfc1-08002be10318}\0003\Parameters\MaxRequestHoldTime - change this from 60 to 1200
+- HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\{4d36e97b-e325-11ce-bfc1-08002be10318}\0003\Parameters\MaxRequestHoldTime - change this from 60 to 1200 secs.
``` ### For Linux
backup Backup Sql Server Database Azure Vms https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-sql-server-database-azure-vms.md
Before you back up a SQL Server database, check the following criteria:
1. Identify or create a [Recovery Services vault](backup-sql-server-database-azure-vms.md#create-a-recovery-services-vault) in the same region and subscription as the VM hosting the SQL Server instance. 1. Verify that the VM has [network connectivity](backup-sql-server-database-azure-vms.md#establish-network-connectivity).
+1. Make sure that the [Azure Virtual Machine Agent](../virtual-machines/extensions/agent-windows.md) is installed on the VM.
+1. Make sure that .NET version 4.5.2 or above is installed on the VM.
1. Make sure that the SQL Server databases follow the [database naming guidelines for Azure Backup](#database-naming-guidelines-for-azure-backup). 1. Ensure that the combined length of the SQL Server VM name and the resource group name doesn't exceed 84 characters for Azure Resource Manager VMs (or 77 characters for classic VMs). This limitation is because some characters are reserved by the service. 1. Check that you don't have any other backup solutions enabled for the database. Disable all other SQL Server backups before you back up the database.
batch Large Number Tasks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/large-number-tasks.md
Title: Submit a large number of tasks to a Batch job description: Learn how to efficiently submit a very large number of tasks in a single Azure Batch job. Previously updated : 12/30/2020 Last updated : 08/25/2021 # Submit a large number of tasks to a Batch job
cloud-services-extended-support Swap Cloud Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cloud-services-extended-support/swap-cloud-service.md
To save compute costs, you can delete one of the cloud services (designated as a
## REST API
-To use the REST API to swap to a new cloud services deployment in Azure Cloud Services (extended support), use the following command and JSON configuration:
+To use the [REST API](/rest/api/compute/load-balancers/swap-public-ip-addresses) to swap to a new cloud services deployment in Azure Cloud Services (extended support), use the following command and JSON configuration:
```http
-POST https://management.azure.com/subscriptions/subId/providers/Microsoft.Network/locations/region/setLoadBalancerFrontendPublicIpAddresses?api-version=2020-11-01
+POST https://management.azure.com/subscriptions/subid/providers/Microsoft.Network/locations/westus/setLoadBalancerFrontendPublicIpAddresses?api-version=2021-02-01
``` ```json { "frontendIPConfigurations": [
- {
- "id": "#LBFE1#",
- "properties": {
- "publicIPAddress": {
- "id": "#PIP2#"
- }
+ {
+ "id": "/subscriptions/subid/resourceGroups/rg1/providers/Microsoft.Network/loadBalancers/lb1/frontendIPConfigurations/lbfe1",
+ "properties": {
+ "publicIPAddress": {
+ "id": "/subscriptions/subid/resourceGroups/rg2/providers/Microsoft.Network/publicIPAddresses/pip2"
+ }
} },
- {
- "id": "#LBFE2#",
- "properties": {
- "publicIPAddress": {
- "id": "#PIP1#"
- }
- }
+ {
+ "id": "/subscriptions/subid/resourceGroups/rg2/providers/Microsoft.Network/loadBalancers/lb2/frontendIPConfigurations/lbfe2",
+ "properties": {
+ "publicIPAddress": {
+ "id": "/subscriptions/subid/resourceGroups/rg1/providers/Microsoft.Network/publicIPAddresses/pip1"
+ }
+ }
} ]
- }
} ```
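+
+For reference, a sketch of invoking the swap with the Azure CLI's generic REST helper, assuming the JSON above is saved to a file named `swap.json` (a hypothetical name):
+
+```bash
+# Placeholder subscription ID and region; replace them with your own values.
+az rest --method post \
+  --url "https://management.azure.com/subscriptions/subid/providers/Microsoft.Network/locations/westus/setLoadBalancerFrontendPublicIpAddresses?api-version=2021-02-01" \
+  --body @swap.json
+```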
cloud-shell Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cloud-shell/troubleshooting.md
PowerShell:
```powershell $token= (Get-AzAccessToken -Resource https://management.azure.com/).Token
- Invoke-WebRequest -Method Delete -Uri https://management.azure.com?api-version=2017-12-01-preview -Headers @{Authorization = "Bearer $token"}
+ Invoke-WebRequest -Method Delete -Uri https://management.azure.com/providers/Microsoft.Portal/usersettings/cloudconsole?api-version=2017-12-01-preview -Headers @{Authorization = "Bearer $token"}
``` ## Azure Government limitations Azure Cloud Shell in Azure Government is only accessible through the Azure portal.
cognitive-services Overview Multivariate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Anomaly-Detector/overview-multivariate.md
keywords: anomaly detection, machine learning, algorithms
The new **multivariate anomaly detection** APIs further enable developers by easily integrating advanced AI for detecting anomalies from groups of metrics, without the need for machine learning knowledge or labeled data. Dependencies and inter-correlations between up to 300 different signals are now automatically counted as key factors. This new capability helps you to proactively protect your complex systems such as software applications, servers, factory machines, spacecraft, or even your business, from failures.
-![Multiple time series line graphs for variables of: vibration, temperature, pressure, velocity, rotation speed with anomalies highlighted in orange](./media/multivariate-graph.png)
+![Multiple time series line graphs for variables of: rotation, optical filter, pressure, bearing with anomalies highlighted in orange](./media/multivariate-graph.png)
-Imagine 20 sensors from an auto engine generating 20 different signals like vibration, temperature, fuel pressure, etc. The readings of those signals individually may not tell you much about system level issues, but together they can represent the health of the engine. When the interaction of those signals deviates outside the usual range, the multivariate anomaly detection feature can sense the anomaly like a seasoned expert. The underlying AI models are trained and customized using your data such that it understands the unique needs of your business. With the new APIs in Anomaly Detector, developers can now easily integrate the multivariate time series anomaly detection capabilities into predictive maintenance solutions, AIOps monitoring solutions for complex enterprise software, or business intelligence tools.
+Imagine 20 sensors from an auto engine generating 20 different signals like rotation, fuel pressure, bearing, etc. The readings of those signals individually may not tell you much about system level issues, but together they can represent the health of the engine. When the interaction of those signals deviates outside the usual range, the multivariate anomaly detection feature can sense the anomaly like a seasoned expert. The underlying AI models are trained and customized using your data such that it understands the unique needs of your business. With the new APIs in Anomaly Detector, developers can now easily integrate the multivariate time series anomaly detection capabilities into predictive maintenance solutions, AIOps monitoring solutions for complex enterprise software, or business intelligence tools.
## When to use **multivariate** versus **univariate**
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Anomaly-Detector/whats-new.md
We've also added links to some user-generated content. Those items will be marke
## Release notes
+### August 2021
+
+* Multivariate anomaly detection APIs deployed in five more regions: West US 3, Japan East, Brazil South, Central US, Norway East. Now in total 15 regions are supported.
+ ### July 2021 * Multivariate anomaly detection APIs deployed in four more regions: Australia East, Canada Central, North Europe, and Southeast Asia. Now in total 10 regions are supported.
cognitive-services How To Custom Speech Test And Train https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/how-to-custom-speech-test-and-train.md
This table lists accepted data types, when each data type should be used, and th
| Data type | Used for testing | Recommended quantity | Used for training | Recommended quantity | |--|--|-|-|-| | [Audio](#audio-data-for-testing) | Yes<br>Used for visual inspection | 5+ audio files | No | N/A |
+| [Audio + Human-labeled transcripts](#audio--human-labeled-transcript-data-for-trainingtesting) | Yes<br>Used to evaluate accuracy | 0.5-5 hours of audio | Yes | 1-20 hours of audio |
| [Plain text](#plain-text-data-for-training) | No | N/a | Yes | 1-200 MB of related text | | [Pronunciation](#pronunciation-data-for-training) | No | N/a | Yes | 1 KB - 1 MB of pronunciation text |
-| [Audio + Human-labeled transcripts](#audio-and-human-labeled-transcript-data) | Yes<br>Used to evaluate accuracy | 0.5-5 hours of audio | Yes | 1-20 hours of audio |
Files should be grouped by type into a dataset and uploaded as a .zip file. Each dataset can only contain a single data type. > [!TIP]
-> When you train a new model, start with [text](#plain-text-data-for-training). This data will already improve the recognition of special terms and phrases. Training with text is much faster than training with audio (minutes vs. days).
+> When you train a new model, start with plain text. This data will already improve the recognition of special terms and phrases. Training with text is much faster than training with audio (minutes vs. days).
> [!NOTE] > Not all base models support training with audio. If a base model does not support it, the Speech service will only use the text from the transcripts and ignore the audio. See [Language support](language-support.md#speech-to-text) for a list of base models that support training with audio data. Even if a base model supports training with audio data, the service might use only part of the audio. Still it will use all the transcripts.
Files should be grouped by type into a dataset and uploaded as a .zip file. Each
## Upload data
-To upload your data, navigate to <a href="https://speech.microsoft.com/customspeech" target="_blank">Custom Speech portal</a>. After creating a project, navigate to **Speech datasets** tab, and click **Upload data** to launch the wizard and create your first dataset. You'll be asked to select a speech data type for your dataset, before allowing you to upload your data.
+To upload your data, navigate to the <a href="https://speech.microsoft.com/customspeech" target="_blank">Custom Speech portal</a>. After creating a project, navigate to the **Speech datasets** tab, and select **Upload data** to launch the wizard and create your first dataset. Select a speech data type for your dataset, and upload your data.
-Firstly you need to specify whether the dataset is to be used for **Training** or **Testing**. And there are multiple types of data that can be uploaded and used for **Training** or **Testing**. Each dataset you upload must meet the requirements for the data type that you choose. Your data must be correctly formatted before it's uploaded. Correctly formatted data ensures it will be accurately processed by the Custom Speech service. Requirements are listed in the following sections.
+First, you need to specify whether the dataset is to be used for **Training** or **Testing**. There are many types of data that can be uploaded and used for **Training** or **Testing**. Each dataset you upload must be correctly formatted before uploading, and must meet the requirements for the data type that you choose. Requirements are listed in the following sections.
After your dataset is uploaded, you have a few options:
After your dataset is uploaded, you have a few options:
* You can navigate to the **Test models** tab to visually inspect quality with audio only data or evaluate accuracy with audio + human-labeled transcription data.
+## Audio + human-labeled transcript data for training/testing
+
+Audio + human-labeled transcript data can be used for both training and testing purposes. To improve the acoustic aspects like slight accents, speaking styles, and background noises, or to measure Microsoft's speech-to-text accuracy when processing your audio files, you must provide human-labeled transcriptions (word-by-word) for comparison. While human-labeled transcription is often time-consuming, it's necessary to evaluate accuracy and to train the model for your use cases. Keep in mind, the improvements in recognition will only be as good as the data provided. For that reason, it's important that only high-quality transcripts are uploaded.
+
+Audio files can have silence at the beginning and end of the recording. If possible, include at least a half-second of silence before and after speech in each sample file. While audio with low recording volume or disruptive background noise is not helpful, it should not hurt your custom model. Always consider upgrading your microphones and signal processing hardware before gathering audio samples.
+
+| Property | Value |
+|--|-|
+| File format | RIFF (WAV) |
+| Sample rate | 8,000 Hz or 16,000 Hz |
+| Channels | 1 (mono) |
+| Maximum length per audio | 2 hours (testing) / 60 s (training) |
+| Sample format | PCM, 16-bit |
+| Archive format | .zip |
+| Maximum zip size | 2 GB |
++
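+If your recordings are in another format, here's a sketch of converting them to match the table above, assuming `ffmpeg` is available and using hypothetical file names:
+
+```bash
+# Convert to 16 kHz, mono, 16-bit PCM RIFF (WAV) audio.
+ffmpeg -i source-recording.m4a -ar 16000 -ac 1 -c:a pcm_s16le speech01.wav
+```
+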
+> [!NOTE]
+> When uploading training and testing data, the .zip file size cannot exceed 2 GB. You can only test from a *single* dataset, so be sure to keep it within the appropriate file size. Additionally, each training file cannot exceed 60 seconds, otherwise it will error out.
+
+To address issues like word deletion or substitution, a significant amount of data is required to improve recognition. Generally, it's recommended to provide word-by-word transcriptions for 1 to 20 hours of audio. However, even as little as 30 minutes can help to improve recognition results. The transcriptions for all WAV files should be contained in a single plain-text file. Each line of the transcription file should contain the name of one of the audio files, followed by the corresponding transcription. The file name and transcription should be separated by a tab (\t).
+
+For example:
+
+<!-- The following example contains tabs. Don't accidentally convert these into spaces. -->
+
+```input
+speech01.wav speech recognition is awesome
+speech02.wav the quick brown fox jumped all over the place
+speech03.wav the lazy dog was not amused
+```
+
+> [!IMPORTANT]
+> Transcription should be encoded as UTF-8 byte order mark (BOM).
+
+The transcriptions are text-normalized so they can be processed by the system. However, there are some important normalizations that must be done before uploading the data to the Speech Studio. For the appropriate language to use when you prepare your transcriptions, see [How to create a human-labeled transcription](how-to-custom-speech-human-labeled-transcriptions.md).
+
+After you've gathered your audio files and corresponding transcriptions, package them as a single .zip file before uploading to the <a href="https://speech.microsoft.com/customspeech" target="_blank">Speech Studio </a>. Below is an example dataset with three audio files and a human-labeled transcription file:
+
+> [!div class="mx-imgBorder"]
+> ![Select audio from the Speech Portal](./media/custom-speech/custom-speech-audio-transcript-pairs.png)
+
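+A sketch of packaging such a dataset from the command line, assuming the transcription file is named `trans.txt` (a hypothetical name):
+
+```bash
+# Zip the audio files and the transcription file into a single archive for upload.
+zip dataset.zip speech01.wav speech02.wav speech03.wav trans.txt
+```
+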
+See [Set up your Azure account](custom-speech-overview.md#set-up-your-azure-account) for a list of recommended regions for your Speech service subscriptions. Setting up the Speech subscriptions in one of these regions will reduce the time it takes to train the model. In these regions, training can process about 10 hours of audio per day compared to just 1 hour per day in other regions. If model training cannot be completed within a week, the model will be marked as failed.
+
+Not all base models support training with audio data. If the base model does not support it, the service will ignore the audio and just train with the text of the transcriptions. In this case, training will be the same as training with related text. See [Language support](language-support.md#speech-to-text) for a list of base models that support training with audio data.
+ ## Plain text data for training
-Domain related sentences can be used to improve accuracy when recognizing product names, or industry-specific jargon. Sentences can be provided as a single text file. To improve accuracy, use text data that is closer to the expected spoken utterances.
+You can use domain-related sentences to improve accuracy when recognizing product names or industry-specific jargon. Provide the sentences in a single text file. To improve accuracy, use text data that is closer to the expected spoken utterances.
Training with plain text usually completes within a few minutes.
If there are uncommon terms without standard pronunciations that your users will
> [!IMPORTANT] > It is not recommended to use custom pronunciation files to alter the pronunciation of common words.
-Pronunciations should be provided as a single text file. This includes examples of a spoken utterance, and a custom pronunciation for each:
+Provide pronunciations in a single text file. This includes examples of a spoken utterance, and a custom pronunciation for each:
| Recognized/displayed form | Spoken form | |--|--|
Use the following table to ensure that your related data file for pronunciations
| # of pronunciations per line | 1 | | Maximum file size | 1 MB (1 KB for free tier) |
-## Audio and human-labeled transcript data
-
-Audio + human-labeled transcript data can be used for both training and testing purposes. To improve the acoustic aspects like slight accents, speaking styles, background noises, or to measure the accuracy of Microsoft's speech-to-text accuracy when processing your audio files, you must provide human-labeled transcriptions (word-by-word) for comparison. While human-labeled transcription is often time consuming, it's necessary to evaluate accuracy and to train the model for your use cases. Keep in mind, the improvements in recognition will only be as good as the data provided. For that reason, it's important that only high-quality transcripts are uploaded.
-
-Audio files can have silence at the beginning and end of the recording. If possible, include at least a half-second of silence before and after speech in each sample file. While audio with low recording volume or disruptive background noise is not helpful, it should not hurt your custom model. Always consider upgrading your microphones and signal processing hardware before gathering audio samples.
-
-| Property | Value |
-|--|-|
-| File format | RIFF (WAV) |
-| Sample rate | 8,000 Hz or 16,000 Hz |
-| Channels | 1 (mono) |
-| Maximum length per audio | 2 hours (testing) / 60 s (training) |
-| Sample format | PCM, 16-bit |
-| Archive format | .zip |
-| Maximum zip size | 2 GB |
--
-> [!TIP]
-> DonΓÇÖt even have any real audio? You can also upload a text (.txt) file by selecting type **Transcript (automatic audio synthesis)** as **Testing** data to get a basic sense of current accuracy levels, and audio pair for each spoken utterance will be automatically synthesized using [Text-to-speech](text-to-speech.md).
->
-> Note that the synthesized audios are typically **NOT** recommended to use as **Training** data.
->
-> The maximum file size is 500KB. We will synthesize one audio for each line, and the maximum size of each line is 65535 bytes.
-
-> [!NOTE]
-> When uploading training and testing data, the .zip file size cannot exceed 2 GB. You can only test from a *single* dataset, be sure to keep it within the appropriate file size. Additionally, each training file cannot exceed 60 seconds otherwise it will error out.
-
-To address issues like word deletion or substitution, a significant amount of data is required to improve recognition. Generally, it's recommended to provide word-by-word transcriptions for 1 to 20 hours of audio. However, even as little as 30 minutes can help to improve recognition results. The transcriptions for all WAV files should be contained in a single plain-text file. Each line of the transcription file should contain the name of one of the audio files, followed by the corresponding transcription. The file name and transcription should be separated by a tab (\t).
-
-For example:
-
-<!-- The following example contains tabs. Don't accidentally convert these into spaces. -->
-
-```input
-speech01.wav speech recognition is awesome
-speech02.wav the quick brown fox jumped all over the place
-speech03.wav the lazy dog was not amused
-```
-
-> [!IMPORTANT]
-> Transcription should be encoded as UTF-8 byte order mark (BOM).
-
-The transcriptions are text-normalized so they can be processed by the system. However, there are some important normalizations that must be done before uploading the data to the Speech Studio. For the appropriate language to use when you prepare your transcriptions, see [How to create a human-labeled transcription](how-to-custom-speech-human-labeled-transcriptions.md)
-
-After you've gathered your audio files and corresponding transcriptions, package them as a single .zip file before uploading to the <a href="https://speech.microsoft.com/customspeech" target="_blank">Speech Studio </a>. Below is an example dataset with three audio files and a human-labeled transcription file:
-
-> [!div class="mx-imgBorder"]
-> ![Select audio from the Speech Portal](./media/custom-speech/custom-speech-audio-transcript-pairs.png)
-
-See [Set up your Azure account](custom-speech-overview.md#set-up-your-azure-account) for a list of recommended regions for your Speech service subscriptions. Setting up the Speech subscriptions in one of these regions will reduce the time it takes to train the model. In these regions, training can process about 10 hours of audio per day compared to just 1 hour per day in other regions. If model training cannot be completed within a week, the model will be marked as failed.
-
-Not all base models support training with audio data. If the base model does not support it, the service will ignore the audio and just train with the text of the transcriptions. In this case, training will be the same as training with related text. See [Language support](language-support.md#speech-to-text) for a list of base models that support training with audio data.
- ## Audio data for testing
-Audio data is optimal for testing the accuracy of Microsoft's baseline speech-to-text model or a custom model. Keep in mind, audio data is used to inspect the accuracy of speech with regard to a specific model's performance. If you're looking to quantify the accuracy of a model, use [audio + human-labeled transcription data](#audio-and-human-labeled-transcript-data).
+Audio data is optimal for testing the accuracy of Microsoft's baseline speech-to-text model or a custom model. Keep in mind that audio data is used to inspect the accuracy of speech recognition with regard to a specific model's performance. If you want to quantify the accuracy of a model, use [audio + human-labeled transcripts](#audio--human-labeled-transcript-data-for-trainingtesting).
-Use this table to ensure that your audio files are formatted correctly for use with Custom Speech:
+Custom Speech requires audio files with these properties:
| Property | Value |
|--|--|
Use this table to ensure that your audio files are formatted correctly for use w
[!INCLUDE [supported-audio-formats](includes/supported-audio-formats.md)]
-> [!TIP]
+> [!NOTE]
> When uploading training and testing data, the .zip file size cannot exceed 2 GB. If you require more data for training, divide it into several .zip files and upload them separately. Later, you can choose to train from *multiple* datasets. However, you can only test from a *single* dataset.
-Use <a href="http://sox.sourceforge.net" target="_blank" rel="noopener">SoX </a> to verify audio properties or convert existing audio to the appropriate formats. Below are some examples of how each of these activities can be done through the SoX command line:
+Use <a href="http://sox.sourceforge.net" target="_blank" rel="noopener">SoX</a> to verify audio properties or convert existing audio to the appropriate formats. Below are some example SoX commands:
-| Activity | Description | SoX command |
-|-|-|-|
-| Check audio format | Use this command to check<br>the audio file format. | `sox --i <filename>` |
-| Convert audio format | Use this command to convert<br>the audio file to single channel, 16-bit, 16 KHz. | `sox <input> -b 16 -e signed-integer -c 1 -r 16k -t wav <output>.wav` |
+| Activity | SoX command |
+|--|--|
+| Check the audio file format. | `sox --i <filename>` |
+| Convert the audio file to single channel, 16-bit, 16 kHz. | `sox <input> -b 16 -e signed-integer -c 1 -r 16k -t wav <output>.wav` |
## Next steps

* [Inspect your data](how-to-custom-speech-inspect-data.md)
* [Evaluate your data](how-to-custom-speech-evaluate-data.md)
* [Train custom model](how-to-custom-speech-train-model.md)
-* [Deploy model](./how-to-custom-speech-train-model.md)
+* [Deploy model](./how-to-custom-speech-train-model.md)
communication-services Authentication https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/authentication.md
Title: Authenticate to Azure Communication Services description: Learn about the various ways an app or service can authenticate to Communication Services.-+ -+ -+ Last updated 06/30/2021
communication-services Call Flows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/call-flows.md
Title: Call flows in Azure Communication Services description: Learn about call flows in Azure Communication Services.--++ -+ Last updated 06/30/2021
communication-services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/chat/concepts.md
Title: Chat concepts in Azure Communication Services
description: Learn about Communication Services Chat concepts. -+ -+ Last updated 06/30/2021
communication-services Sdk Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/chat/sdk-features.md
Title: Chat SDK overview for Azure Communication Services
description: Learn about the Azure Communication Services Chat SDK. -+ -+ Last updated 06/30/2021
communication-services Client And Server Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/client-and-server-architecture.md
Title: Client and server architecture description: Learn about Communication Services' architecture.-+ -+ Last updated 06/30/2021
communication-services Known Issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/known-issues.md
Title: Azure Communication Services - known issues description: Learn more about Azure Communication Services -+ -+ Last updated 06/30/2021
communication-services Logging And Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/logging-and-diagnostics.md
Title: Communication Services Logs description: Learn about logging in Azure Communication Services--++ -+ Last updated 06/30/2021
communication-services Metrics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/metrics.md
Title: Metric definitions for Azure Communication Service description: This document covers definitions of metrics available in the Azure portal.--++ -+ Last updated 06/30/2021
communication-services Notifications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/notifications.md
Title: Notifications in Azure Communication Services description: Send notifications to users of apps built on Azure Communication Services.--++ -+ Last updated 06/30/2021
communication-services Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/reference.md
Title: Reference documentation overview for Azure Communication Services description: Learn about Communication Services' reference documentation.--++ -+ Last updated 06/30/2021
communication-services Sdk Options https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/sdk-options.md
Title: SDKs and REST APIs for Azure Communication Services description: Learn more about Azure Communication Services SDKs and REST APIs.--++ -+ Last updated 06/30/2021
communication-services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony-sms/concepts.md
Title: SMS concepts in Azure Communication Services description: Learn about Communication Services SMS concepts.--++ -+ Last updated 06/30/2021
communication-services Plan Solution https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony-sms/plan-solution.md
The following table shows you where you can acquire different types of phone num
| Local (Geographic) | US | US, Canada, United Kingdom, Germany, France, +more** | US, Canada, United Kingdom, Germany, France, +more** | Not available | Not available |
| Toll-Free | US | US | US | US | US |
-*Currently, you can receive calls only to a Microsoft number that is assigned to a Telephony Channel bot. Read more about Telephony Channel [here](/azure/bot-service/bot-service-channel-connect-telephony.md)
+*Currently, you can receive calls only to a Microsoft number that is assigned to a Telephony Channel bot. Read more about Telephony Channel [here](/azure/bot-service/bot-service-channel-connect-telephony)
**For more details about call destinations and pricing, refer to the [pricing page](../pricing.md).
communication-services Sdk Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony-sms/sdk-features.md
Title: SMS SDK overview for Azure Communication Services description: Provides an overview of the SMS SDK and its offerings.--++ Last updated 06/30/2021
communication-services Troubleshooting Info https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/troubleshooting-info.md
Title: Troubleshooting in Azure Communication Services description: Learn how to gather the information you need to troubleshoot your Communication Services solution. -+
chat_client = ChatClient(
```
-## Access your call ID
+## Access your server call ID
+When troubleshooting issues with the Call Automation SDK, such as call recording and call management problems, you'll need to collect the server call ID. You can retrieve this ID by using the `getServerCallId` method.
+
+#### JavaScript
+```
+callAgent.on('callsUpdated', (e: { added: Call[]; removed: Call[] }): void => {
+ e.added.forEach((addedCall) => {
+ addedCall.on('stateChanged', (): void => {
+ if (addedCall.state === 'Connected') {
+ addedCall.info.getServerCallId().then(result => {
+ dispatch(setServerCallId(result));
+ }).catch(err => {
+ console.log(err);
+ });
+ }
+ });
+ });
+});
+```
++
+## Access your client call ID
When troubleshooting voice or video calls, you may be asked to provide a `call ID`. This can be accessed via the `id` property of the `call` object:
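For example, a minimal JavaScript sketch, assuming a `callAgent` has already been created and using a placeholder callee ID:

```javascript
// Assumes `callAgent` was created earlier from the Calling SDK.
// Start a call; the returned `call` object exposes its ID through the `id` property.
const call = callAgent.startCall([{ communicationUserId: '<ACS_USER_ID>' }]);

// Capture the call ID so you can share it with support while troubleshooting.
console.log(`call ID: ${call.id}`);
```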
The Azure Communication Services Chat SDK uses the following error codes to help
## Related information - [Logs and diagnostics](logging-and-diagnostics.md)-- [Metrics](metrics.md)
+- [Metrics](metrics.md)
communication-services About Call Types https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/voice-video-calling/about-call-types.md
Title: Voice and video concepts in Azure Communication Services description: Learn about Communication Services call types.--++ -+ Last updated 06/30/2021
communication-services Calling Sdk Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/voice-video-calling/calling-sdk-features.md
Title: Azure Communication Services Calling SDK overview description: Provides an overview of the Calling SDK.--++ -+ Last updated 06/30/2021
The following timeouts apply to the Communication Services Calling SDKs:
## JavaScript Calling SDK support by OS and browser
-The following table represents the set of supported browsers which are currently available. We support the most recent three versions of the browser unless otherwise indicated.
+The following table lists the currently supported browsers. **We support the most recent three versions of each browser** unless otherwise indicated.
-| Platform | Chrome | Safari | Edge (Chromium) | Notes |
-| | | | | -- |
-| Android | ✔️ | ❌ | ❌ | Outgoing Screen Sharing is not supported. |
-| iOS | ❌ | ✔️ | ❌ | [An iOS app on Safari can't enumerate/select mic and speaker devices](../known-issues.md#enumerating-devices-isnt-possible-in-safari-when-the-application-runs-on-ios-or-ipados) (for example, Bluetooth); this is a limitation of the OS, and there's always only one device, OS controls default device selection. Outgoing screen sharing is not supported. |
-| macOS | ✔️ | ✔️ | ❌ | Safari 14+/macOS 11+ needed for outgoing video support. |
-| Windows | ✔️ | ❌ | ✔️ | |
-| Ubuntu/Linux | ✔️ | ❌ | ❌ | |
+| Platform | Chrome | Safari | Edge (Chromium) |
+| | | | -- |
+| Android | ✔️ | ❌ | ❌ |
+| iOS | ❌ | ✔️ | ❌ |
+| macOS | ✔️ | ✔️ | ❌ |
+| Windows | ✔️ | ❌ | ✔️ |
+| Ubuntu/Linux | ✔️ | ❌ | ❌ |
-* For Safari versions 13.1+ are supported, 1:1 calls are not supported on Safari.
-* Unless otherwise specified, the past 3 versions of each browser are supported.
+* 1:1 calls are not supported on Safari.
+* Outgoing Screen Sharing is not supported on iOS or Android.
+* [An iOS app on Safari can't enumerate/select mic and speaker devices](../known-issues.md#enumerating-devices-isnt-possible-in-safari-when-the-application-runs-on-ios-or-ipados) (for example, Bluetooth); this is a limitation of the OS, and there's always only one device, OS controls default device selection.
## Android Calling SDK support
For example, this iframe allows both camera and microphone access:
For more information, see the following articles: - Familiarize yourself with general [call flows](../call-flows.md) - Learn about [call types](../voice-video-calling/about-call-types.md)-- [Plan your PSTN solution](../telephony-sms/plan-solution.md)
+- [Plan your PSTN solution](../telephony-sms/plan-solution.md)
communication-services Network Requirements https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/voice-video-calling/network-requirements.md
Title: Prepare your organization's network for Azure Communication Services
description: Learn about the network requirements for Azure Communication Services voice and video calling. -+
communication-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/overview.md
Title: What is Azure Communication Services? description: Learn how Azure Communication Services helps you develop rich user experiences with real-time communications.--++ -+ Last updated 06/30/2021
communication-services Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/chat/get-started.md
-+ Last updated 06/30/2021
communication-services Create Communication Resource https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/create-communication-resource.md
Title: Quickstart - Create and manage resources in Azure Communication Services description: In this quickstart, you'll learn how to create and manage your first Azure Communication Services resource.--++ -+ Last updated 06/30/2021
communication-services Quick Create Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/identity/quick-create-identity.md
Title: Quickstart - Quickly create Azure Communication Services identities for t
description: Learn how to use the Identities & Access Tokens tool in the Azure portal to use with samples and for troubleshooting. -+ Last updated 07/19/2021
communication-services Handle Sms Events https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/telephony-sms/handle-sms-events.md
Title: Quickstart - Handle SMS events for Delivery Reports and Inbound Messages description: Learn how to handle SMS events using Azure Communication Services.--++ -+ Last updated 06/30/2021
communication-services Port Phone Number https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/telephony-sms/port-phone-number.md
Title: Quickstart - Port a phone number into Azure Communication Services description: Learn how to port a phone number into your Communication Services resource-+ -+ Last updated 06/30/2021
communication-services Send https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/telephony-sms/send.md
Title: Quickstart - Send an SMS message description: Learn how to send an SMS message using Azure Communication Services.--++ -+ Last updated 06/30/2021
communication-services Get Started With Video Calling https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/get-started-with-video-calling.md
Title: Quickstart - Add video calling to your app (JavaScript)
description: In this quickstart, you'll learn how to add video calling capabilities to your app using Azure Communication Services. -+ Last updated 06/30/2021
communication-services Getting Started With Calling https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/getting-started-with-calling.md
Title: Quickstart - Add voice calling to your app description: In this quickstart, you'll learn how to add calling capabilities to your app using Azure Communication Services.--++ Last updated 06/30/2021
confidential-computing Confidential Containers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/confidential-computing/confidential-containers.md
Occlum supports AKS deployments. Follow the deployment instructions with various
In a nutshell, Marblerun extends the confidentiality, integrity, and verifiability properties of a single enclave to a Kubernetes cluster.
-Marblerun supports confidential containers created with Graphene, Occlum, and EGo. Examples for each SDK are given [here](https://www.marblerun.sh/docs/examples/). Marblerun is built to run on Kubernetes and alongside your existing cloud-native tooling. It comes with an easy-to-use CLI and helm charts. It has first-class support for confidential computing nodes on AKS. Information on how to deploy Marblerun on AKS can be found [here](https://www.marblerun.sh/docs/deployment/cloud/).
+Marblerun supports confidential containers created with Graphene, Occlum, and EGo. Examples for each SDK are given [here](https://docs.edgeless.systems/marblerun/#/examples?id=examples). Marblerun is built to run on Kubernetes and alongside your existing cloud-native tooling. It comes with an easy-to-use CLI and helm charts. It has first-class support for confidential computing nodes on AKS. Information on how to deploy Marblerun on AKS can be found [here](https://docs.edgeless.systems/marblerun/#/deployment/cloud?id=cloud-deployment).
## Confidential Containers Demo View the confidential healthcare demo with confidential containers. Sample is available [here](/azure/architecture/example-scenario/confidential/healthcare-inference).
cosmos-db Automated Recommendations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/automated-recommendations.md
Previously updated : 07/28/2020 Last updated : 08/26/2021
In this category, the advisor detects the query execution and identifies that th
* [Tuning query performance in Azure Cosmos DB](sql-api-query-metrics.md) * [Troubleshoot query issues](troubleshoot-query-performance.md) when using Azure Cosmos DB
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Best Practice Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/best-practice-dotnet.md
Previously updated : 07/08/2021 Last updated : 08/26/2021
For workloads that have heavy create payloads, set the `EnableContentResponseOnW
## Next steps For a sample application that's used to evaluate Azure Cosmos DB for high-performance scenarios on a few client machines, see [Performance and scale testing with Azure Cosmos DB](performance-testing.md).
-To learn more about designing your application for scale and high performance, see [Partitioning and scaling in Azure Cosmos DB](partitioning-overview.md).
+To learn more about designing your application for scale and high performance, see [Partitioning and scaling in Azure Cosmos DB](partitioning-overview.md).
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Change Feed Design Patterns https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/change-feed-design-patterns.md
Previously updated : 04/08/2020 Last updated : 08/26/2021 # Change feed design patterns in Azure Cosmos DB [!INCLUDE[appliesto-sql-api](includes/appliesto-sql-api.md)]
Here are some real-world change feed code examples that extend beyond the scope
* [Change feed overview](change-feed.md) * [Options to read change feed](read-change-feed.md)
-* [Using change feed with Azure Functions](change-feed-functions.md)
+* [Using change feed with Azure Functions](change-feed-functions.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Choose Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/choose-api.md
This API stores data in document format. It offers the best end-to-end experienc
If you are migrating from other databases such as Oracle, DynamoDB, HBase etc. and if you want to use the modernized technologies to build your apps, SQL API is the recommended option. SQL API supports analytics and offers performance isolation between operational and analytical workloads.
+### Capacity planning for migration to API for MongoDB
+
+Trying to do capacity planning for a migration to Azure Cosmos DB SQL API from an existing database cluster? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing sharded and replicated database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ ## API for MongoDB This API stores data in a document structure, via BSON format. It is compatible with the MongoDB wire protocol; however, it does not use any native MongoDB-related code. This API is a great choice if you want to use the broader MongoDB ecosystem and skills, without compromising on using Azure Cosmos DB's features such as scaling, high availability, geo-replication, multiple write locations, automatic and transparent shard management, transparent replication between operational and analytical stores, and more.
You can use your existing MongoDB apps with API for MongoDB by just changing the
API for MongoDB is compatible with the 4.0, 3.6, and 3.2 MongoDB server versions. Server version 4.0 is recommended as it offers the best performance and full feature support. To learn more, see [API for MongoDB](mongodb/mongodb-introduction.md) article.
+### Capacity planning for migration to API for MongoDB
+
+Trying to do capacity planning for a migration to Azure Cosmos DB API for MongoDB from an existing database cluster? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](./mongodb/estimate-ru-capacity-planner.md)
+ ## Cassandra API This API stores data in a column-oriented schema. Apache Cassandra offers a highly distributed, horizontally scaling approach to storing large volumes of data while offering a flexible approach to a column-oriented schema. Cassandra API in Azure Cosmos DB aligns with this philosophy of approaching distributed NoSQL databases. Cassandra API is wire-protocol compatible with Apache Cassandra. You should consider Cassandra API if you want to benefit from the elasticity and fully managed nature of Azure Cosmos DB and still use most of the native Apache Cassandra features, tools, and ecosystem. This means on Cassandra API you don't need to manage the OS, Java VM, garbage collector, read/write performance, nodes, clusters, etc.
Applications written for Azure Table storage can migrate to the Table API with l
* [Get started with Azure Cosmos DB's API for MongoDB](mongodb/create-mongodb-nodejs.md) * [Get started with Azure Cosmos DB Cassandra API](cassandr) * [Get started with Azure Cosmos DB Gremlin API](create-graph-dotnet.md)
-* [Get started with Azure Cosmos DB Table API](create-table-dotnet.md)
+* [Get started with Azure Cosmos DB Table API](create-table-dotnet.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Cli Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/cli-samples.md
Previously updated : 10/13/2020 Last updated : 08/26/2021
These samples apply to all Azure Cosmos DB APIs
| [Throughput operations](scripts/cli/sql/throughput.md?toc=%2fcli%2fazure%2ftoc.json) | Read, update and migrate between autoscale and standard throughput on a database and container.| | [Lock resources from deletion](scripts/cli/sql/lock.md?toc=%2fcli%2fazure%2ftoc.json)| Prevent resources from being deleted with resource locks.| |||+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Concepts Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/concepts-limits.md
Previously updated : 04/07/2021 Last updated : 08/26/2021 # Azure Cosmos DB service quotas
Get started with Azure Cosmos DB with one of our quickstarts:
* [Get started with Azure Cosmos DB Cassandra API](cassandr) * [Get started with Azure Cosmos DB Gremlin API](create-graph-dotnet.md) * [Get started with Azure Cosmos DB Table API](table/create-table-dotnet.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
> [!div class="nextstepaction"] > [Try Azure Cosmos DB for free](https://azure.microsoft.com/try/cosmosdb/)
cosmos-db Convert Vcore To Request Unit https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/convert-vcore-to-request-unit.md
Previously updated : 08/20/2021 Last updated : 08/26/2021 # Convert the number of vCores or vCPUs in your nonrelational database to Azure Cosmos DB RU/s [!INCLUDE[appliesto-sql-api](includes/appliesto-sql-api.md)]
Azure Cosmos DB interop APIs run on top of the SQL API and implement their own u
## Worked example: estimate RU/s for single replica set migration
-![Migrate a replica set with 3 replicas of a four-core SKU to Azure Cosmos DB](media/tutorial-vcore-pricing/one-replica-set.png)
+![Migrate a replica set with 3 replicas of a four-core SKU to Azure Cosmos DB](media/convert-vcore-to-request-unit/one-replica-set.png)
Consider a single replica set with a replication factor of *R=3* based on a four-core server SKU. Then * *T* = 12 vCores
Provisioned RU/s, API for MongoDB = (1,000 RU/s/vCore) * (12 vCores) / (3) = 4,0
## Worked example: estimate RU/s when migrating a cluster of homogeneous replica sets
-![Migrate a homogeneous sharded replica set with 3 shards, each with three replicas of a four-core SKU, to Azure Cosmos DB](media/tutorial-vcore-pricing/homogeneous-sharded-replica-sets.png)
+![Migrate a homogeneous sharded replica set with 3 shards, each with three replicas of a four-core SKU, to Azure Cosmos DB](media/convert-vcore-to-request-unit/homogeneous-sharded-replica-sets.png)
Consider a sharded and replicated cluster comprising three replica sets each with a replication factor three, where each server is a four-core SKU. Then * *T* = 36 vCores
Provisioned RU/s, API for MongoDB = (1,000 RU/s/vCore) * (36 vCores) / (3) = 12,
## Worked example: estimate RU/s when migrating a cluster of heterogeneous replica sets
-![Migrate a heterogeneous sharded replica set with 3 shards, each with different numbers of replicas of a four-core SKU, to Azure Cosmos DB](media/tutorial-vcore-pricing/heterogeneous-sharded-replica-sets.png)
+![Migrate a heterogeneous sharded replica set with 3 shards, each with different numbers of replicas of a four-core SKU, to Azure Cosmos DB](media/convert-vcore-to-request-unit/heterogeneous-sharded-replica-sets.png)
Consider a sharded and replicated cluster comprising three replica sets, in which each server is based on a four-core SKU. The replica sets are "heterogeneous" in the sense that each has a different replication factor: 3x, 1x, and 5x, respectively. The recommended approach is to use the average replication factor when calculating request units. Then * *T* = 36 vCores
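As a quick cross-check of the worked examples above, the following hypothetical helper (not part of any SDK) reproduces the same arithmetic in JavaScript:

```javascript
// Estimate provisioned RU/s for API for MongoDB from vCore counts, using the formula
// from the worked examples: (1,000 RU/s per vCore) * (T vCores) / (replication factor).
function estimateRequestUnits(totalVCores, averageReplicationFactor) {
  const ruPerVCore = 1000;
  return (ruPerVCore * totalVCores) / averageReplicationFactor;
}

console.log(estimateRequestUnits(12, 3)); // single replica set: 4,000 RU/s
console.log(estimateRequestUnits(36, 3)); // homogeneous sharded cluster: 12,000 RU/s
```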
cosmos-db Cosmos Db Reserved Capacity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/cosmos-db-reserved-capacity.md
description: Learn how to buy Azure Cosmos DB reserved capacity to save on your
Previously updated : 02/18/2020 Last updated : 08/26/2021
The reservation discount is applied automatically to the Azure Cosmos DB resourc
* [Understand reservation usage for your Pay-As-You-Go subscription](../cost-management-billing/reservations/understand-reserved-instance-usage.md) * [Azure reservations in the Partner Center CSP program](/partner-center/azure-reservations)
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ ## Need help? Contact us. If you have questions or need help, [create a support request](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/newsupportrequest).
cosmos-db Cosmosdb Migrationchoices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/cosmosdb-migrationchoices.md
Previously updated : 09/01/2020 Last updated : 08/26/2021 # Options to migrate your on-premises or cloud data to Azure Cosmos DB
For APIs other than the SQL API, Mongo API and the Cassandra API, there are vari
## Next steps
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
* Learn more by trying out the sample applications consuming the bulk executor library in [.NET](bulk-executor-dot-net.md) and [Java](bulk-executor-java.md). * The bulk executor library is integrated into the Cosmos DB Spark connector, to learn more, see [Azure Cosmos DB Spark connector](./create-sql-api-spark.md) article. * Contact the Azure Cosmos DB product team by opening a support ticket under the "General Advisory" problem type and "Large (TB+) migrations" problem subtype for additional help with large scale migrations.
cosmos-db Cosmosdb Sql Api Migrate Data Striim https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/cosmosdb-sql-api-migrate-data-striim.md
Previously updated : 07/22/2019 Last updated : 08/26/2021
By using the Striim solution in Azure, you can continuously migrate data to Azur
## Next steps
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
* If you are migrating data to Azure Cosmos DB SQL API, see [how to migrate data to Cassandra API account using Striim](cassandr)- * [Monitor and debug your data with Azure Cosmos DB metrics](use-metrics.md)
cosmos-db Create Cosmosdb Resources Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/create-cosmosdb-resources-portal.md
ms.devlang: dotnet Previously updated : 05/19/2021 Last updated : 08/26/2021 # Quickstart: Create an Azure Cosmos account, database, container, and items from the Azure portal [!INCLUDE[appliesto-sql-api](includes/appliesto-sql-api.md)]
If you wish to delete just the database and use the Azure Cosmos account in futu
In this quickstart, you learned how to create an Azure Cosmos DB account, create a database and container using the Data Explorer. You can now import additional data to your Azure Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import data into Azure Cosmos DB](import-data.md)
cosmos-db Create Sql Api Dotnet V4 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/create-sql-api-dotnet-v4.md
ms.devlang: dotnet Previously updated : 04/07/2021 Last updated : 08/26/2021 # Quickstart: Build a console app by using the .NET V4 SDK (preview) to manage Azure Cosmos DB SQL API account resources
az group delete -g "myResourceGroup"
In this quickstart, you learned how to create an Azure Cosmos account, create a database, and create a container by using a .NET Core app. You can now import more data to your Azure Cosmos account by using the instructions in the following article:
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import data into Azure Cosmos DB](import-data.md)
cosmos-db Create Sql Api Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/create-sql-api-dotnet.md
ms.devlang: dotnet Previously updated : 03/07/2021 Last updated : 08/26/2021
az group delete -g "myResourceGroup"
In this quickstart, you learned how to create an Azure Cosmos account, create a database and a container using a .NET Core app. You can now import additional data to your Azure Cosmos account with the instructions in the following article.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import data into Azure Cosmos DB](import-data.md)
cosmos-db Create Sql Api Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/create-sql-api-java.md
ms.devlang: java Previously updated : 03/07/2021 Last updated : 08/26/2021
Now go back to the Azure portal to get your connection string information and la
In this quickstart, you've learned how to create an Azure Cosmos DB SQL API account, create a document database and container using the Data Explorer, and run a Java app to do the same thing programmatically. You can now import additional data into your Azure Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import data into Azure Cosmos DB](import-data.md)
cosmos-db Create Sql Api Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/create-sql-api-nodejs.md
ms.devlang: nodejs Previously updated : 03/07/2021 Last updated : 08/26/2021
You can continue to experiment with this sample application or go back to Data E
In this quickstart, you've learned how to create an Azure Cosmos DB account, create a container using the Data Explorer, and run a Node.js app. You can now import additional data to your Azure Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [import data into azure cosmos db](import-data.md)
cosmos-db Create Sql Api Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/create-sql-api-python.md
ms.devlang: python Previously updated : 04/06/2021 Last updated : 08/26/2021
The following snippets are all taken from the *cosmos_get_started.py* file.
In this quickstart, you've learned how to create an Azure Cosmos DB account, create a container using the Data Explorer, and run a Python app in Visual Studio Code. You can now import additional data to your Azure Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import data into Azure Cosmos DB for the SQL API](import-data.md)
cosmos-db Create Sql Api Spring Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/create-sql-api-spring-data.md
ms.devlang: java Previously updated : 03/07/2021 Last updated : 08/26/2021
Now go back to the Azure portal to get your connection string information and la
In this quickstart, you've learned how to create an Azure Cosmos DB SQL API account, create a document database and container using the Data Explorer, and run a Spring Data app to do the same thing programmatically. You can now import additional data into your Azure Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import data into Azure Cosmos DB](import-data.md)
cosmos-db Dedicated Gateway https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/dedicated-gateway.md
Previously updated : 08/20/2021 Last updated : 08/26/2021
Read more about dedicated gateway usage in the following articles:
- [Integrated cache](integrated-cache.md) - [Configure the integrated cache](how-to-configure-integrated-cache.md)-- [Integrated cache FAQ](integrated-cache-faq.md)
+- [Integrated cache FAQ](integrated-cache-faq.md)
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Estimate Ru With Capacity Planner https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/estimate-ru-with-capacity-planner.md
Previously updated : 04/20/2021 Last updated : 08/26/2021
# Estimate RU/s using the Azure Cosmos DB capacity planner - SQL API [!INCLUDE[appliesto-sql-api](includes/appliesto-sql-api.md)]
+> [!NOTE]
+> If you are planning a data migration to Azure Cosmos DB and all that you know is the number of vcores and servers in your existing sharded and replicated database cluster, please also read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+>
+ Configuring your Azure Cosmos databases and containers with the right amount of provisioned throughput, or [Request Units (RU/s)](request-units.md), for your workload is essential to optimizing cost and performance. This article describes how to use the Azure Cosmos DB [capacity planner](https://cosmos.azure.com/capacitycalculator/) to get an estimate of the required RU/s and cost of your workload when using the SQL API. If you are using API for MongoDB, see how to [use capacity calculator with MongoDB](mongodb/estimate-ru-capacity-planner.md) article. [!INCLUDE [capacity planner modes](includes/capacity-planner-modes.md)]
The prices shown in the Azure Cosmos DB capacity planner are estimates based on
## Next steps
+* If all you know is the number of vcores and servers in your existing sharded and replicated database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
* Learn more about [Azure Cosmos DB's pricing model](how-pricing-works.md). * Create a new [Cosmos account, database, and container](create-cosmosdb-resources-portal.md). * Learn how to [optimize provisioned throughput cost](optimize-cost-throughput.md). * Learn how to [optimize cost with reserved capacity](cosmos-db-reserved-capacity.md).+
cosmos-db Global Dist Under The Hood https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/global-dist-under-the-hood.md
Next learn how to configure global distribution by using the following articles:
* [Add/remove regions from your database account](how-to-manage-database-account.md#addremove-regions-from-your-database-account) * [How to create a custom conflict resolution policy](how-to-manage-conflicts.md#create-a-custom-conflict-resolution-policy)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db How Pricing Works https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-pricing-works.md
Previously updated : 08/19/2020 Last updated : 08/26/2021 # Pricing model in Azure Cosmos DB
Reserved capacity provides a billing discount and does not affect the runtime st
You can learn more about optimizing the costs for your Azure Cosmos DB resources in the following articles:
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
* Learn about [Optimizing for development and testing](optimize-dev-test.md) * Learn more about [Understanding your Azure Cosmos DB bill](understand-your-bill.md) * Learn more about [Optimizing throughput cost](optimize-cost-throughput.md)
cosmos-db How To Choose Offer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-choose-offer.md
When using autoscale, use Azure Monitor to see the provisioned autoscale max RU/
* Use [RU calculator](https://cosmos.azure.com/capacitycalculator/) to estimate throughput for new workloads. * Use [Azure Monitor](monitor-cosmos-db.md#view-operation-level-metrics-for-azure-cosmos-db) to monitor your existing workloads. * Learn how to [provision autoscale throughput on an Azure Cosmos database or container](how-to-provision-autoscale-throughput.md).
-* Review the [autoscale FAQ](autoscale-faq.yml).
+* Review the [autoscale FAQ](autoscale-faq.yml).
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db How To Migrate From Bulk Executor Library https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-migrate-from-bulk-executor-library.md
Previously updated : 04/24/2020 Last updated : 08/26/2021
Using stream APIs is only possible if the nature of the data you use matches tha
* To learn more about the .NET SDK releases, see the [Azure Cosmos DB SDK](sql-api-sdk-dotnet.md) article. * Get the complete [migration source code](https://github.com/Azure/azure-cosmos-dotnet-v3/tree/master/Microsoft.Azure.Cosmos.Samples/Usage/BulkExecutorMigration) from GitHub.
-* [Additional bulk samples on GitHub](https://github.com/Azure/azure-cosmos-dotnet-v3/tree/master/Microsoft.Azure.Cosmos.Samples/Usage/BulkSupport)
+* [Additional bulk samples on GitHub](https://github.com/Azure/azure-cosmos-dotnet-v3/tree/master/Microsoft.Azure.Cosmos.Samples/Usage/BulkSupport)
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db How To Migrate From Change Feed Library https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-migrate-from-change-feed-library.md
Previously updated : 09/17/2019 Last updated : 08/26/2021
You can now proceed to learn more about change feed processor in the following a
* [Overview of change feed processor](change-feed-processor.md) * [Using the change feed estimator](how-to-use-change-feed-estimator.md)
-* [Change feed processor start time](./change-feed-processor.md#starting-time)
+* [Change feed processor start time](./change-feed-processor.md#starting-time)
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db How To Model Partition Example https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-model-partition-example.md
Previously updated : 05/23/2019 Last updated : 08/26/2021
After this introduction to practical data modeling and partitioning, you may wan
- [Work with databases, containers, and items](account-databases-containers-items.md) - [Partitioning in Azure Cosmos DB](partitioning-overview.md) - [Change feed in Azure Cosmos DB](change-feed.md)+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Import Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/import-data.md
Previously updated : 10/23/2020 Last updated : 08/26/2021
In this tutorial, you've done the following tasks:
You can now proceed to the next tutorial and learn how to query data using Azure Cosmos DB.
+Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] >[How to query data?](../cosmos-db/tutorial-query-sql-api.md)
cosmos-db Index Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/index-overview.md
Previously updated : 05/04/2021 Last updated : 08/26/2021
Read more about indexing in the following articles:
- [Indexing policy](index-policy.md) - [How to manage indexing policy](how-to-manage-indexing-policy.md)
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Integrated Cache Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/integrated-cache-faq.md
Previously updated : 05/25/2021 Last updated : 08/26/2021
Expanding the integrated cache beyond SQL API is planned on the long-term roadma
- [Integrated cache](integrated-cache.md) - [Configure the integrated cache](how-to-configure-integrated-cache.md)-- [Dedicated gateway](dedicated-gateway.md)
+- [Dedicated gateway](dedicated-gateway.md)
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Integrated Cache https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/integrated-cache.md
Previously updated : 05/26/2021 Last updated : 08/26/2021
In some cases, if latency is unexpectedly high, you may need more dedicated gate
- [Integrated cache FAQ](integrated-cache-faq.md) - [Configure the integrated cache](how-to-configure-integrated-cache.md)-- [Dedicated gateway](dedicated-gateway.md)
+- [Dedicated gateway](dedicated-gateway.md)
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/introduction.md
Previously updated : 06/04/2021 Last updated : 08/26/2021
Get started with Azure Cosmos DB with one of our quickstarts:
- [Get started with Azure Cosmos DB Gremlin API](create-graph-dotnet.md) - [Get started with Azure Cosmos DB Table API](table/create-table-dotnet.md) - [A whitepaper on next-gen app development with Azure Cosmos DB](https://azure.microsoft.com/resources/microsoft-azure-cosmos-db-flexible-reliable-cloud-nosql-at-any-scale/)
+- Trying to do capacity planning for a migration to Azure Cosmos DB?
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
> [!div class="nextstepaction"] > [Try Azure Cosmos DB for free](https://azure.microsoft.com/try/cosmosdb/)
cosmos-db Migrate Containers Partitioned To Nonpartitioned https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/migrate-containers-partitioned-to-nonpartitioned.md
Previously updated : 09/25/2019 Last updated : 08/26/2021
If new items are inserted with different values for the partition key, querying
* [Request Units in Azure Cosmos DB](request-units.md) * [Provision throughput on containers and databases](set-throughput.md) * [Work with Azure Cosmos account](./account-databases-containers-items.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
[1]: https://github.com/Azure/azure-cosmos-dotnet-v3/tree/master/Microsoft.Azure.Cosmos.Samples/Usage/NonPartitionContainerMigration
cosmos-db Migrate Continuous Backup https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/migrate-continuous-backup.md
description: Azure Cosmos DB currently supports a one-way migration from periodi
Previously updated : 08/17/2021 Last updated : 08/26/2021
To learn more about continuous backup mode, see the following articles:
* [Continuous backup mode resource model.](continuous-backup-restore-resource-model.md) * Restore an account using [Azure portal](restore-account-continuous-backup.md#restore-account-portal), [PowerShell](restore-account-continuous-backup.md#restore-account-powershell), [CLI](restore-account-continuous-backup.md#restore-account-cli), or [Azure Resource Manager](restore-account-continuous-backup.md#restore-arm-template).+
+Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Migrate Cosmosdb Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/migrate-cosmosdb-data.md
Previously updated : 10/23/2019 Last updated : 08/26/2021
Once the migration is completed, you can validate that the document count in Azu
* Learn more by trying out the sample applications consuming the bulk executor library in [.NET](bulk-executor-dot-net.md) and [Java](bulk-executor-java.md). * The bulk executor library is integrated into the Cosmos DB Spark connector; to learn more, see the [Azure Cosmos DB Spark connector](./create-sql-api-spark.md) article.
-* Contact the Azure Cosmos DB product team by opening a support ticket under the "General Advisory" problem type and "Large (TB+) migrations" problem subtype for additional help with large scale migrations.
+* Contact the Azure Cosmos DB product team by opening a support ticket under the "General Advisory" problem type and "Large (TB+) migrations" problem subtype for additional help with large scale migrations.
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Migrate Dotnet V2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/migrate-dotnet-v2.md
Previously updated : 10/15/2020 Last updated : 08/26/2021 # Migrate your application to use the Azure Cosmos DB .NET SDK v2
For more information, see the [Azure Cosmos DB bulk executor library overview](b
## Next steps * Read about [additional performance tips](sql-api-get-started.md) using Azure Cosmos DB for SQL API v2 for optimizing your application to achieve maximum performance
-* Learn more about [what you can do with the v2 SDK](sql-api-dotnet-samples.md)
+* Learn more about [what you can do with the v2 SDK](sql-api-dotnet-samples.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Migrate Dotnet V3 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/migrate-dotnet-v3.md
Previously updated : 08/19/2021 Last updated : 08/26/2021 # Migrate your application to use the Azure Cosmos DB .NET SDK v3
private static async Task DeleteItemAsync(DocumentClient client)
* [Build a Console app](sql-api-get-started.md) to manage Azure Cosmos DB SQL API data using the v3 SDK * Learn more about [what you can do with the v3 SDK](sql-api-dotnet-v3sdk-samples.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Migrate Java V4 Sdk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/migrate-java-v4-sdk.md
Previously updated : 06/15/2021 Last updated : 08/26/2021
Document responseDocument = documentResourceResponse.getResource();
* [Build a Java app](create-sql-api-java.md) to manage Azure Cosmos DB SQL API data using the V4 SDK * Learn about the [Reactor-based Java SDKs](https://github.com/Azure-Samples/azure-cosmos-java-sql-api-samples/blob/main/reactor-pattern-guide.md)
-* Learn about converting RxJava async code to Reactor async code with the [Reactor vs RxJava Guide](https://github.com/Azure-Samples/azure-cosmos-java-sql-api-samples/blob/main/reactor-rxjava-guide.md)
+* Learn about converting RxJava async code to Reactor async code with the [Reactor vs RxJava Guide](https://github.com/Azure-Samples/azure-cosmos-java-sql-api-samples/blob/main/reactor-rxjava-guide.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Modeling Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/modeling-data.md
Previously updated : 07/12/2021 Last updated : 08/26/2021 # Data modeling in Azure Cosmos DB
Just as there is no single way to represent a piece of data on a screen, there i
Data Modeling and Partitioning - a Real-World Example](how-to-model-partition-example.md). * See the learn module on how to [Model and partition your data in Azure Cosmos DB.](/learn/modules/model-partition-data-azure-cosmos-db/)+
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Cli Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/cli-samples.md
Previously updated : 10/13/2020 Last updated : 08/26/2021
These samples apply to all Azure Cosmos DB APIs
| [Throughput operations](../scripts/cli/mongodb/throughput.md?toc=%2fcli%2fazure%2ftoc.json) | Read, update and migrate between autoscale and standard throughput on a database and collection.| | [Lock resources from deletion](../scripts/cli/mongodb/lock.md?toc=%2fcli%2fazure%2ftoc.json)| Prevent resources from being deleted with resource locks.| |||+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Connect Mongodb Account https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/connect-mongodb-account.md
Previously updated : 03/02/2021 Last updated : 08/26/2021 adobe-target: true adobe-target-activity: DocsExp-A/B-384740-MongoDB-2.8.2021
Specifically, client drivers must support the Service Name Identification (SNI)
- Learn how to [use Studio 3T](connect-using-mongochef.md) with Azure Cosmos DB's API for MongoDB. - Learn how to [use Robo 3T](connect-using-robomongo.md) with Azure Cosmos DB's API for MongoDB. - Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB.
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Connect Using Compass https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/connect-using-compass.md
description: Learn how to use MongoDB Compass to store and manage data in Azure
Previously updated : 06/05/2020 Last updated : 08/26/2021
To connect your Cosmos DB account to Compass, follow the steps below:
- Learn how to [use Studio 3T](connect-using-mongochef.md) with Azure Cosmos DB's API for MongoDB. - Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB.
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Connect Using Mongochef https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/connect-using-mongochef.md
description: Learn how to connect to an Azure Cosmos DB's API for MongoDB using
Previously updated : 03/20/2020 Last updated : 08/26/2021
To create a database, collection, and documents using Studio 3T, perform the fol
- Learn how to [use Robo 3T](connect-using-robomongo.md) with Azure Cosmos DB's API for MongoDB. - Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB.
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Connect Using Mongoose https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/connect-using-mongoose.md
ms.devlang: nodejs Previously updated : 03/20/2020 Last updated : 08/26/2021
As you can see, it is easy to work with Mongoose discriminators. So, if you have
- Learn how to [use Studio 3T](connect-using-mongochef.md) with Azure Cosmos DB's API for MongoDB. - Learn how to [use Robo 3T](connect-using-robomongo.md) with Azure Cosmos DB's API for MongoDB. - Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB.
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
[dbleveltp]: ./media/connect-using-mongoose/db-level-throughput.png
cosmos-db Connect Using Robomongo https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/connect-using-robomongo.md
description: Learn how to connect to Azure Cosmos DB using Robo 3T and Azure Cos
Previously updated : 03/23/2020 Last updated : 08/26/2021
Both **User Name** and **Password** can be found in your connection information
- Learn how to [use Studio 3T](connect-using-mongochef.md) with Azure Cosmos DB's API for MongoDB. - Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB.
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Create Mongodb Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/create-mongodb-dotnet.md
ms.devlang: dotnet Previously updated : 8/13/2021 Last updated : 8/26/2021
Enter any necessary parameters and select "Execute."
In this quickstart, you've learned how to create an API for MongoDB account, create a database and a collection with code, and run a web API app. You can now import additional data to your database.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import MongoDB data into Azure Cosmos DB](../../dms/tutorial-mongodb-cosmos-db.md?toc=%2fazure%2fcosmos-db%2ftoc.json%253ftoc%253d%2fazure%2fcosmos-db%2ftoc.json)
cosmos-db Create Mongodb Go https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/create-mongodb-go.md
ms.devlang: go Previously updated : 04/24/2020 Last updated : 08/26/2021 # Quickstart: Connect a Go application to Azure Cosmos DB's API for MongoDB
The `todo` you just deleted should not be present
In this quickstart, you learned how to create an Azure Cosmos DB MongoDB API account using the Azure Cloud Shell, and create and run a Go command-line app to manage `todo`s. You can now import additional data to your Azure Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import MongoDB data into Azure Cosmos DB](../../dms/tutorial-mongodb-cosmos-db.md?toc=%2fazure%2fcosmos-db%2ftoc.json%253ftoc%253d%2fazure%2fcosmos-db%2ftoc.json)
cosmos-db Create Mongodb Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/create-mongodb-java.md
ms.devlang: java Previously updated : 12/26/2018 Last updated : 08/26/2021
You can now use [Robomongo](connect-using-robomongo.md) / [Studio 3T](connect-us
In this quickstart, you learned how to create an Azure Cosmos DB API for Mongo DB account, add a database and container using Data Explorer, and add data using a Java console app. You can now import additional data to your Cosmos database.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import MongoDB data into Azure Cosmos DB](../../dms/tutorial-mongodb-cosmos-db.md?toc=%2fazure%2fcosmos-db%2ftoc.json%253ftoc%253d%2fazure%2fcosmos-db%2ftoc.json)
cosmos-db Create Mongodb Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/create-mongodb-nodejs.md
ms.devlang: nodejs Previously updated : 05/21/2019 Last updated : 08/26/2021
git commit -m "configured MongoDB connection string"
In this quickstart, you learned how to create an Azure Cosmos DB MongoDB API account using the Azure Cloud Shell, and create and run a MEAN.js app to add users to the account. You can now import additional data to your Azure Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import MongoDB data into Azure Cosmos DB](../../dms/tutorial-mongodb-cosmos-db.md?toc=%2fazure%2fcosmos-db%2ftoc.json%253ftoc%253d%2fazure%2fcosmos-db%2ftoc.json)
cosmos-db Create Mongodb Rust https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/create-mongodb-rust.md
ms.devlang: rust Previously updated : 01/12/2021 Last updated : 08/26/2021 # Quickstart: Connect a Rust application to Azure Cosmos DB's API for MongoDB
fn delete_todo(self, todo_id: &str) {
In this quickstart, you learned how to create an Azure Cosmos DB MongoDB API account using the Azure Cloud Shell, and create and run a Rust command-line app to manage `todo`s. You can now import additional data to your Azure Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import MongoDB data into Azure Cosmos DB](../../dms/tutorial-mongodb-cosmos-db.md?toc=%2fazure%2fcosmos-db%2ftoc.json%253ftoc%253d%2fazure%2fcosmos-db%2ftoc.json)
cosmos-db Create Mongodb Xamarin https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/create-mongodb-xamarin.md
ms.devlang: dotnet Previously updated : 10/09/2020 Last updated : 08/26/2021
You've now updated your app with all the info it needs to communicate with Azure
In this quickstart, you've learned how to create an Azure Cosmos DB account and run a Xamarin.Forms app using the API for MongoDB. You can now import additional data to your Cosmos DB account.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Import data into Azure Cosmos DB configured with Azure Cosmos DB's API for MongoDB](../../dms/tutorial-mongodb-cosmos-db.md?toc=%2fazure%2fcosmos-db%2ftoc.json%253ftoc%253d%2fazure%2fcosmos-db%2ftoc.json)
cosmos-db Estimate Ru Capacity Planner https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/estimate-ru-capacity-planner.md
Previously updated : 04/28/2021 Last updated : 08/26/2021
# Estimate RU/s using the Azure Cosmos DB capacity planner - Azure Cosmos DB API for MongoDB [!INCLUDE[appliesto-sql-api](../includes/appliesto-mongodb-api.md)]
+> [!NOTE]
+> If you are planning a data migration to Azure Cosmos DB and all that you know is the number of vcores and servers in your existing sharded and replicated database cluster, please also read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+>
+ Configuring your databases and collections with the right amount of provisioned throughput, or [Request Units (RU/s)](../request-units.md), for your workload is essential to optimizing cost and performance. This article describes how to use the Azure Cosmos DB [capacity planner](https://cosmos.azure.com/capacitycalculator/) to get an estimate of the required RU/s and cost of your workload when using the Azure Cosmos DB API for MongoDB. If you are using the SQL API, see the [use capacity calculator with SQL API](../estimate-ru-with-capacity-planner.md) article. [!INCLUDE [capacity planner modes](../includes/capacity-planner-modes.md)]
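If you already know your typical request rates, a rough back-of-the-envelope figure can help you sanity-check what you enter into the planner. The following is only an illustrative sketch, assuming the commonly cited approximations of about 1 RU for a 1-KB point read and roughly 5 RU for a 1-KB write; the capacity planner itself also accounts for item size, indexing, consistency level, and regional replication.

```python
# Illustrative only: the per-operation costs below are approximations for ~1-KB items
# and do not replace the Azure Cosmos DB capacity planner.

def rough_ru_estimate(reads_per_sec: float, writes_per_sec: float,
                      ru_per_read: float = 1.0, ru_per_write: float = 5.0) -> float:
    """Return a back-of-the-envelope RU/s figure for a steady workload."""
    return reads_per_sec * ru_per_read + writes_per_sec * ru_per_write

# Example: roughly 500 point reads/s and 100 writes/s of ~1-KB items.
print(rough_ru_estimate(500, 100))  # -> 1000.0 RU/s as a very rough starting point
```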
The prices shown in the capacity planner are estimates based on the public prici
## Next steps
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
* Learn more about [Azure Cosmos DB's pricing model](../how-pricing-works.md). * Create a new [Cosmos account, database, and container](../create-cosmosdb-resources-portal.md). * Learn how to [optimize provisioned throughput cost](../optimize-cost-throughput.md). * Learn how to [optimize cost with reserved capacity](../cosmos-db-reserved-capacity.md).
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
cosmos-db Feature Support 40 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/feature-support-40.md
description: Learn about Azure Cosmos DB's API for MongoDB 4.0 server version su
Previously updated : 03/02/2021 Last updated : 08/26/2021
Some applications rely on a [Write Concern](https://docs.mongodb.com/manual/refe
- Learn how to [use Studio 3T](connect-using-mongochef.md) with Azure Cosmos DB's API for MongoDB. - Learn how to [use Robo 3T](connect-using-robomongo.md) with Azure Cosmos DB's API for MongoDB. - Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB.
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Find Request Unit Charge Mongodb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/find-request-unit-charge-mongodb.md
Previously updated : 03/19/2021 Last updated : 08/26/2021
To learn about optimizing your RU consumption, see these articles:
* [Request units and throughput in Azure Cosmos DB](../request-units.md) * [Optimize provisioned throughput cost in Azure Cosmos DB](../optimize-cost-throughput.md)
-* [Optimize query cost in Azure Cosmos DB](../optimize-cost-reads-writes.md)
+* [Optimize query cost in Azure Cosmos DB](../optimize-cost-reads-writes.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
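Beyond the links above, the charge for an individual API for MongoDB operation can also be read back from application code. The following is a minimal sketch in Python (pymongo), assuming the `getLastRequestStatistics` command exposed by the API for MongoDB; the connection string, database, and field names are placeholders.

```python
from pymongo import MongoClient

# Placeholder connection string -- substitute your API for MongoDB account's string.
client = MongoClient("<your-cosmos-db-api-for-mongodb-connection-string>")
db = client["inventory"]

# Run an operation first...
db["products"].find_one({"productId": "p-1001"})

# ...then request the statistics for the last operation on this connection.
# The response reports the request charge (RU) that the operation consumed.
stats = db.command("getLastRequestStatistics")
print(stats)
```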
cosmos-db How To Create Container Mongodb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/how-to-create-container-mongodb.md
description: Learn how to create a container in Azure Cosmos DB API for MongoDB
Previously updated : 10/16/2020 Last updated : 08/26/2021
If you encounter a timeout exception when creating a collection, do a read operati
* [Partitioning in Azure Cosmos DB](../partitioning-overview.md) * [Request Units in Azure Cosmos DB](../request-units.md) * [Provision throughput on containers and databases](../set-throughput.md)
-* [Work with Azure Cosmos account](../account-databases-containers-items.md)
+* [Work with Azure Cosmos account](../account-databases-containers-items.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db How To Provision Throughput Mongodb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/how-to-provision-throughput-mongodb.md
description: Learn how to provision container, database, and autoscale throughpu
Previously updated : 10/15/2020 Last updated : 08/26/2021
Azure PowerShell can be used to provision autoscale throughput on a database or
See the following articles to learn about throughput provisioning in Azure Cosmos DB:
-* [Request units and throughput in Azure Cosmos DB](../request-units.md)
+* [Request units and throughput in Azure Cosmos DB](../request-units.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Migrate Databricks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/migrate-databricks.md
Previously updated : 06/29/2021 Last updated : 08/26/2021 # Migrate data from MongoDB to an Azure Cosmos DB API for MongoDB account by using Azure Databricks [!INCLUDE[appliesto-mongodb-api](../includes/appliesto-mongodb-api.md)]
You might see a 16500 error code for operations against the Cosmos DB API for Mo
After you migrate the data, you can connect to Azure Cosmos DB and manage the data. You can also follow other post-migration steps such as optimizing the indexing policy, updating the default consistency level, or configuring global distribution for your Azure Cosmos DB account. For more information, see the [Post-migration optimization](post-migration-optimization.md) article.
+## Additional resources
+
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+ ## Next steps * [Manage indexing in Azure Cosmos DB's API for MongoDB](mongodb-indexing.md)- * [Find the request unit charge for operations](find-request-unit-charge-mongodb.md)
cosmos-db Mongodb Indexing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/mongodb-indexing.md
ms.devlang: nodejs Previously updated : 03/02/2021 Last updated : 08/26/2021
If you want to create a wildcard index, [upgrade to version 4.0 or 3.6](upgrade-
* [Indexing in Azure Cosmos DB](../index-policy.md) * [Expire data in Azure Cosmos DB automatically with time to live](../time-to-live.md) * To learn about the relationship between partitioning and indexing, see how to [Query an Azure Cosmos container](../how-to-query-container.md) article.
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Mongodb Introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/mongodb-introduction.md
description: Learn how you can use Azure Cosmos DB to store and query massive am
Previously updated : 04/22/2021 Last updated : 08/26/2021
All the API for MongoDB versions run on the same codebase, making upgrades a sim
* Sharded cluster performance is dependent on the shard key you choose when creating a collection. Choose a shard key carefully to ensure that your data is evenly distributed across shards.
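As a concrete illustration of declaring a shard key, here is a minimal sketch using the Python driver (pymongo). The connection string, database, collection, and `productId` field are placeholders, and it assumes the account accepts the standard MongoDB `shardCollection` command, as the API for MongoDB documentation describes.

```python
from pymongo import MongoClient

# Placeholder connection string -- substitute your API for MongoDB account's string.
client = MongoClient("<your-cosmos-db-api-for-mongodb-connection-string>")
db = client["inventory"]

# Declare the shard (partition) key when creating the collection so data is spread
# evenly; a high-cardinality field such as productId is a reasonable choice.
db.command("shardCollection", "inventory.products", key={"productId": "hashed"})

# Queries that include the shard key can be served by a single partition.
product = db["products"].find_one({"productId": "p-1001"})
```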
+### Capacity planning
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](../estimate-ru-with-capacity-planner.md)
+ ## Quickstart * [Migrate an existing MongoDB Node.js web app](create-mongodb-nodejs.md). * [Build a web app using Azure Cosmos DB's API for MongoDB and .NET SDK](create-mongodb-dotnet.md) * [Build a console app using Azure Cosmos DB's API for MongoDB and Java SDK](create-mongodb-java.md)
+* [Estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* [Estimating request units using Azure Cosmos DB capacity planner](../estimate-ru-with-capacity-planner.md)
## Next steps
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
* Follow the [Connect a MongoDB application to Azure Cosmos DB](connect-mongodb-account.md) tutorial to learn how to get your account connection string information. * Follow the [Use Studio 3T with Azure Cosmos DB](connect-using-mongochef.md) tutorial to learn how to create a connection between your Cosmos database and MongoDB app in Studio 3T. * Follow the [Import MongoDB data into Azure Cosmos DB](../../dms/tutorial-mongodb-cosmos-db.md?toc=%2fazure%2fcosmos-db%2ftoc.json%253ftoc%253d%2fazure%2fcosmos-db%2ftoc.json) tutorial to import your data to a Cosmos database.
cosmos-db Nodejs Console App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/nodejs-console-app.md
ms.devlang: nodejs Previously updated : 12/26/2018 Last updated : 08/26/2021
To use this example, you must:
- Learn how to [use Studio 3T](connect-using-mongochef.md) with Azure Cosmos DB's API for MongoDB. - Learn how to [use Robo 3T](connect-using-robomongo.md) with Azure Cosmos DB's API for MongoDB.-- Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB.
+- Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB.
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Optimize Write Performance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/optimize-write-performance.md
Previously updated : 06/25/2021 Last updated : 08/26/2021
If you are writing more than 1,000 documents at a time per process/thread, clien
* Learn more about [indexing in the API for MongoDB](../mongodb-indexing.md). * Learn more about [Azure Cosmos DB's sharding/partitioning](../partitioning-overview.md). * Learn more about [troubleshooting common issues](error-codes-solutions.md).
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Post Migration Optimization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/post-migration-optimization.md
description: This doc provides the post-migration optimization techniques from M
Previously updated : 05/19/2021 Last updated : 08/26/2021
One convenient fact about [indexing](#optimize-the-indexing-policy), [global dis
## Next steps
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
* [Connect a MongoDB application to Azure Cosmos DB](connect-mongodb-account.md) * [Connect to Azure Cosmos DB account using Studio 3T](connect-using-mongochef.md) * [How to globally distribute reads using Azure Cosmos DB's API for MongoDB](readpreference-global-distribution.md)
cosmos-db Powershell Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/powershell-samples.md
Previously updated : 01/20/2021 Last updated : 08/26/2021
The following table includes links to commonly used Azure PowerShell scripts for
|[Throughput operations](../scripts/powershell/mongodb/throughput.md?toc=%2fpowershell%2fmodule%2ftoc.json)| Throughput operations for a database or collection including get, update and migrate between autoscale and standard throughput. | |[Lock resources from deletion](../scripts/powershell/mongodb/lock.md?toc=%2fpowershell%2fmodule%2ftoc.json)| Prevent resources from being deleted with resource locks. | |||+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Pre Migration Steps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/pre-migration-steps.md
description: This doc provides an overview of the prerequisites for a data migra
Previously updated : 05/17/2021 Last updated : 08/26/2021
With the discovery and assessment steps complete, you are done with the MongoDB
More detail is provided in the following sections.
+### Capacity planning
+
+Trying to do capacity planning for a migration to Azure Cosmos DB?
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+ ### Considerations when using Azure Cosmos DB's API for MongoDB Before you plan your Azure Cosmos DB data estate, make sure you understand the following Azure Cosmos DB concepts:
In the pre-migration phase, spend some time to plan what steps you will take tow
## Next steps
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
* Migrate to Azure Cosmos DB API for MongoDB * [Offline migration using MongoDB native tools](tutorial-mongotools-cosmos-db.md) * [Offline migration using Azure database migration service (DMS)](../../dms/tutorial-mongodb-cosmos-db.md) * [Online migration using Azure database migration service (DMS)](../../dms/tutorial-mongodb-cosmos-db-online.md) * [Offline/online migration using Azure Databricks and Spark](migrate-databricks.md)
- * [Migrate your MongoDB data using Azure database migration service (DMS).](../../dms/tutorial-mongodb-cosmos-db.md)
* [Post-migration guide](post-migration-optimization.md) - optimize steps once you have migrated to Azure Cosmos DB API for MongoDB * [Provision throughput on Azure Cosmos containers and databases](../set-throughput.md) * [Partitioning in Azure Cosmos DB](../partitioning-overview.md)
cosmos-db Prevent Rate Limiting Errors https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/prevent-rate-limiting-errors.md
Previously updated : 01/13/2021 Last updated : 08/26/2021
No, server-side retry only affects rate limiting errors (429) by retrying them s
To learn more about troubleshooting common errors, see this article: * [Troubleshoot common issues in Azure Cosmos DB's API for MongoDB](error-codes-solutions.md)+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Resource Manager Template Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/resource-manager-template-samples.md
Previously updated : 10/14/2020 Last updated : 08/26/2021
Here are some additional resources:
* [Azure Resource Manager documentation](../../azure-resource-manager/index.yml) * [Azure Cosmos DB resource provider schema](/azure/templates/microsoft.documentdb/allversions) * [Azure Cosmos DB Quickstart templates](https://azure.microsoft.com/resources/templates/?resourceType=Microsoft.DocumentDB&pageNumber=1&sort=Popular)
-* [Troubleshoot common Azure Resource Manager deployment errors](../../azure-resource-manager/templates/common-deployment-errors.md)
+* [Troubleshoot common Azure Resource Manager deployment errors](../../azure-resource-manager/templates/common-deployment-errors.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Troubleshoot Query Performance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/troubleshoot-query-performance.md
description: Learn how to identify, diagnose, and troubleshoot Azure Cosmos DB's
Previously updated : 03/02/2021 Last updated : 08/26/2021
The value `estimatedDelayFromRateLimitingInMilliseconds` gives a sense of the po
* [Troubleshoot query performance (SQL API)](troubleshoot-query-performance.md) * [Manage indexing in Azure Cosmos DB's API for MongoDB](mongodb-indexing.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Tutorial Develop Mongodb React https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-develop-mongodb-react.md
ms.devlang: nodejs Previously updated : 09/05/2018 Last updated : 08/26/2021
You can proceed to the next tutorial and learn how to import MongoDB data into A
> [!div class="nextstepaction"] > [Import MongoDB data into Azure Cosmos DB](../../dms/tutorial-mongodb-cosmos-db.md?toc=%2fazure%2fcosmos-db%2ftoc.json%253ftoc%253d%2fazure%2fcosmos-db%2ftoc.json)+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Tutorial Develop Nodejs Part 1 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-develop-nodejs-part-1.md
ms.devlang: nodejs Previously updated : 12/26/2018 Last updated : 08/26/2021
You can proceed to the next part of the tutorial to create the Node.js Express a
> [!div class="nextstepaction"] > [Create a Node.js Express app with the Angular CLI](tutorial-develop-nodejs-part-2.md)+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Tutorial Develop Nodejs Part 2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-develop-nodejs-part-2.md
ms.devlang: nodejs Previously updated : 12/26/2018 Last updated : 08/26/2021
You can proceed to the next part of the tutorial to build the UI.
> [!div class="nextstepaction"] > [Build the UI with Angular](tutorial-develop-nodejs-part-3.md)+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Tutorial Develop Nodejs Part 3 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-develop-nodejs-part-3.md
ms.devlang: nodejs Previously updated : 12/26/2018 Last updated : 08/26/2021
You can proceed to the next part of the tutorial to create an Azure Cosmos DB ac
> [!div class="nextstepaction"] > [Create an Azure Cosmos DB account using the Azure CLI](tutorial-develop-nodejs-part-4.md)+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Tutorial Develop Nodejs Part 4 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-develop-nodejs-part-4.md
ms.devlang: nodejs Previously updated : 12/06/2018 Last updated : 08/26/2021
In this part of the tutorial, you've done the following:
You can proceed to the next part of the tutorial to connect Azure Cosmos DB to your app using Mongoose. > [!div class="nextstepaction"]
-> [Use Mongoose to connect to Azure Cosmos DB](tutorial-develop-nodejs-part-5.md)
+> [Use Mongoose to connect to Azure Cosmos DB](tutorial-develop-nodejs-part-5.md)
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Tutorial Develop Nodejs Part 5 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-develop-nodejs-part-5.md
ms.devlang: nodejs Previously updated : 12/26/2018 Last updated : 08/26/2021
When you no longer need the resources, you can delete the resource group, Azure
Continue to Part 6 of the tutorial to add Post, Put, and Delete functions to the app: > [!div class="nextstepaction"]
-> [Part 6: Add Post, Put, and Delete functions to the app](tutorial-develop-nodejs-part-6.md)
+> [Part 6: Add Post, Put, and Delete functions to the app](tutorial-develop-nodejs-part-6.md)
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Tutorial Develop Nodejs Part 6 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-develop-nodejs-part-6.md
ms.devlang: nodejs Previously updated : 12/26/2018 Last updated : 08/26/2021
In this part of the tutorial, you've done the following:
Check back soon for additional videos in this tutorial series.
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
+
cosmos-db Tutorial Global Distribution Mongodb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-global-distribution-mongodb.md
Previously updated : 12/26/2018 Last updated : 08/26/2021
You can now proceed to the next tutorial to learn how to develop locally using t
> [!div class="nextstepaction"] > [Develop locally with the Azure Cosmos DB emulator](../local-emulator.md)+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Tutorial Mongotools Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/tutorial-mongotools-cosmos-db.md
Previously updated : 05/19/2021 Last updated : 08/26/2021
After you migrate the data stored in a MongoDB database to Azure Cosmos DB's API
* [Cosmos DB service information](https://azure.microsoft.com/services/cosmos-db/) * [MongoDB database tools documentation](https://docs.mongodb.com/database-tools/)
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
## Next steps
cosmos-db Upgrade Mongodb Version https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb/upgrade-mongodb-version.md
description: How to upgrade the MongoDB wire-protocol version for your existing
Previously updated : 03/19/2021 Last updated : 08/26/2021
If you upgraded from 3.2 to (4.0 or 3.6) and wish to downgrade back to 3.2, you
- Learn about the supported and unsupported [features of MongoDB version 4.0](feature-support-40.md). - Learn about the supported and unsupported [features of MongoDB version 3.6](feature-support-36.md). - For further information check [Mongo 3.6 version features](https://devblogs.microsoft.com/cosmosdb/azure-cosmos-dbs-api-for-mongodb-now-supports-server-version-3-6/)
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Optimize Cost Reads Writes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/optimize-cost-reads-writes.md
Previously updated : 10/14/2020 Last updated : 08/26/2021 # Optimize request cost in Azure Cosmos DB
Next you can proceed to learn more about cost optimization in Azure Cosmos DB wi
* Learn more about [Optimizing storage cost](optimize-cost-storage.md) * Learn more about [Optimizing the cost of multi-region Azure Cosmos accounts](optimize-cost-regions.md) * Learn more about [Azure Cosmos DB reserved capacity](cosmos-db-reserved-capacity.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Optimize Cost Regions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/optimize-cost-regions.md
Previously updated : 10/23/2020 Last updated : 08/26/2021 # Optimize multi-region cost in Azure Cosmos DB
Next you can proceed to learn more about cost optimization in Azure Cosmos DB wi
* Learn more about [Optimizing storage cost](optimize-cost-storage.md) * Learn more about [Optimizing the cost of reads and writes](optimize-cost-reads-writes.md) * Learn more about [Optimizing the cost of queries](./optimize-cost-reads-writes.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Optimize Cost Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/optimize-cost-storage.md
Previously updated : 05/21/2019 Last updated : 08/26/2021
Next you can proceed to learn more about cost optimization in Azure Cosmos DB wi
* Learn more about [Optimizing the cost of reads and writes](optimize-cost-reads-writes.md) * Learn more about [Optimizing the cost of queries](./optimize-cost-reads-writes.md) * Learn more about [Optimizing the cost of multi-region Azure Cosmos accounts](optimize-cost-regions.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Optimize Cost Throughput https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/optimize-cost-throughput.md
Previously updated : 02/07/2020 Last updated : 08/26/2021
The following steps help you to make your solutions highly scalable and cost-eff
Next you can proceed to learn more about cost optimization in Azure Cosmos DB with the following articles:
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
* Learn more about [Optimizing for development and testing](optimize-dev-test.md) * Learn more about [Understanding your Azure Cosmos DB bill](understand-your-bill.md) * Learn more about [Optimizing storage cost](optimize-cost-storage.md)
cosmos-db Optimize Dev Test https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/optimize-dev-test.md
Previously updated : 05/25/2021 Last updated : 08/26/2021 # Optimize development and testing cost in Azure Cosmos DB
You can get started with using the emulator or the free Azure Cosmos DB accounts
* Learn more about [Optimizing storage cost](optimize-cost-storage.md) * Learn more about [Optimizing the cost of reads and writes](optimize-cost-reads-writes.md) * Learn more about [Optimizing the cost of queries](./optimize-cost-reads-writes.md)
-* Learn more about [Optimizing the cost of multi-region Azure Cosmos accounts](optimize-cost-regions.md)
+* Learn more about [Optimizing the cost of multi-region Azure Cosmos accounts](optimize-cost-regions.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Partitioning Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/partitioning-overview.md
Previously updated : 07/12/2021 Last updated : 08/26/2021
Some things to consider when selecting the *item ID* as the partition key includ
* Learn how to [provision throughput on an Azure Cosmos container](how-to-provision-container-throughput.md). * Learn how to [provision throughput on an Azure Cosmos database](how-to-provision-database-throughput.md). * See the learn module on how to [Model and partition your data in Azure Cosmos DB.](/learn/modules/model-partition-data-azure-cosmos-db/)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Partners Migration Cosmosdb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/partners-migration-cosmosdb.md
Previously updated : 05/28/2019 Last updated : 08/26/2021 # Azure Cosmos DB NoSQL migration and application development partners
From NoSQL migration to application development, you can choose from a variety o
To learn more about some of Microsoft's other partners, see the [Microsoft Partner site](https://partner.microsoft.com/).
+Trying to do capacity planning for a migration to Azure Cosmos DB?
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ <!--Image references--> [2]: ./media/partners-migration-cosmosdb/striim_logo.png [3]: ./media/partners-migration-cosmosdb/altoros_logo.png
cosmos-db Performance Testing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/performance-testing.md
Previously updated : 05/23/2019 Last updated : 08/26/2021
In this article, we looked at how you can perform performance and scale testing
* [Azure Cosmos DB performance testing sample](https://github.com/Azure/azure-cosmos-dotnet-v2/tree/master/samples/documentdb-benchmark) * [Client configuration options to improve Azure Cosmos DB performance](performance-tips.md) * [Server-side partitioning in Azure Cosmos DB](partitioning-overview.md)-
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Performance Tips Java Sdk V4 Sql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/performance-tips-java-sdk-v4-sql.md
ms.devlang: java Previously updated : 10/13/2020 Last updated : 08/26/2021
The request charge (the request processing cost) of a given operation is directl
## Next steps To learn more about designing your application for scale and high performance, see [Partitioning and scaling in Azure Cosmos DB](partitioning-overview.md).+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Plan Manage Costs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/plan-manage-costs.md
Previously updated : 04/05/2021 Last updated : 08/26/2021 # Plan and manage costs for Azure Cosmos DB
Cost analysis in Cost Management supports most Azure account types, but not all
Azure Cosmos DB is available in two different capacity modes: provisioned throughput and serverless. You can perform the exact same database operations in both modes, but the way you get billed for these operations is different.
+### Capacity planning
+
+As an aid for estimating costs, it can be helpful to do capacity planning for a migration to Azure Cosmos DB. If you're planning a migration from an existing database cluster, you can use information about that cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+
+![Migrate a replica set with 3 replicas of a four-core SKU to Azure Cosmos DB](media/convert-vcore-to-request-unit/one-replica-set.png)
+
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ ### Estimate provisioned throughput costs If you plan to use Azure Cosmos DB in provisioned throughput mode, use the [Azure Cosmos DB capacity calculator](https://cosmos.azure.com/capacitycalculator/) to estimate costs before you create the resources in an Azure Cosmos account. The capacity calculator is used to get an estimate of the required throughput and cost of your workload. Configuring your Azure Cosmos databases and containers with the right amount of provisioned throughput, or [Request Units (RU/s)](request-units.md), for your workload is essential to optimize the cost and performance. You have to input details such as API type, number of regions, item size, read/write requests per second, and total data stored to get a cost estimate. To learn more about the capacity calculator, see the [estimate](estimate-ru-with-capacity-planner.md) article.
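+
+To make that estimate concrete, the following sketch shows the kind of back-of-the-envelope arithmetic involved. The per-operation RU costs used here are illustrative assumptions for items of roughly 1 KB, not the calculator's actual figures; rely on the capacity calculator and the linked articles for real planning numbers.
+
+```java
+// Illustrative only: a rough RU/s estimate from expected request rates.
+// The per-operation RU costs are assumptions for ~1-KB items; use the
+// Azure Cosmos DB capacity calculator for authoritative planning numbers.
+public class RuEstimateSketch {
+    public static void main(String[] args) {
+        double readsPerSecond = 500;   // expected point reads per second
+        double writesPerSecond = 100;  // expected writes per second
+        double ruPerRead = 1.0;        // assumed cost of a 1-KB point read
+        double ruPerWrite = 5.0;       // assumed cost of a 1-KB write
+
+        double estimatedRuPerSecond =
+                readsPerSecond * ruPerRead + writesPerSecond * ruPerWrite;
+
+        System.out.printf("Estimated provisioned throughput: %.0f RU/s%n",
+                estimatedRuPerSecond);
+    }
+}
+```
+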
The following are some best practices you can use to reduce the costs:
See the following articles to learn more on how pricing works in Azure Cosmos DB:
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
* [Pricing model in Azure Cosmos DB](how-pricing-works.md) * Learn [how to optimize your cloud investment with Azure Cost Management](../cost-management-billing/costs/cost-mgt-best-practices.md?WT.mc_id=costmanagementcontent_docsacmhorizontal_-inproduct-learn). * Learn more about managing costs with [cost analysis](../cost-management-billing/costs/quick-acm-cost-analysis.md?WT.mc_id=costmanagementcontent_docsacmhorizontal_-inproduct-learn).
cosmos-db Powershell Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/powershell-samples.md
Previously updated : 01/20/2021 Last updated : 08/26/2021
For PowerShell cmdlets for other APIs see [PowerShell Samples for Cassandra](cas
|[Throughput operations](scripts/powershell/sql/throughput.md?toc=%2fpowershell%2fmodule%2ftoc.json)| Throughput operations for a database or container including get, update and migrate between autoscale and standard throughput. | |[Lock resources from deletion](scripts/powershell/sql/lock.md?toc=%2fpowershell%2fmodule%2ftoc.json)| Prevent resources from being deleted with resource locks. | |||+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Provision Throughput Autoscale https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/provision-throughput-autoscale.md
For more detail, see this [documentation](how-to-choose-offer.md) on how to choo
* Learn how to [choose between manual and autoscale throughput](how-to-choose-offer.md). * Learn how to [provision autoscale throughput on an Azure Cosmos database or container](how-to-provision-autoscale-throughput.md). * Learn more about [partitioning](partitioning-overview.md) in Azure Cosmos DB.
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Quick Create Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/quick-create-template.md
tags: azure-resource-manager
Previously updated : 06/01/2020 Last updated : 08/26/2021 #Customer intent: As a database admin who is new to Azure, I want to use Azure Cosmos DB to store and manage my data.
In this quickstart, you created an Azure Cosmos account, a database and a contai
- Read an [Overview of Azure Cosmos DB](introduction.md) - Learn more about [Azure Resource Manager](../azure-resource-manager/management/overview.md) - Get other [Azure Cosmos DB Resource Manager templates](./templates-samples-sql.md)
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Rate Limiting Requests https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/rate-limiting-requests.md
description: This article provides developers with a methodology to rate limit r
Previously updated : 05/07/2021 Last updated : 08/26/2021
For more information, see [Materialized View pattern](/azure/architecture/patter
* Learn more about [Partitioning and horizontal scaling](partitioning-overview.md) in Azure Cosmos DB. * Learn about [Indexing policies](index-policy.md) in Azure Cosmos DB. * Learn about [Autoscaling](provision-throughput-autoscale.md) in Azure Cosmos DB.
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Request Units https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/request-units.md
Previously updated : 10/23/2020 Last updated : 08/26/2021
Your choice of [consistency model](consistency-levels.md) also affects the throu
- Learn how to [optimize reads and writes cost in Azure Cosmos DB](optimize-cost-reads-writes.md). - Learn how to [optimize query cost in Azure Cosmos DB](./optimize-cost-reads-writes.md). - Learn how to [use metrics to monitor throughput](use-metrics.md).
+- Trying to do capacity planning for a migration to Azure Cosmos DB?
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Set Throughput https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/set-throughput.md
This table shows a comparison between provisioning standard (manual) throughput
* Learn how to [provision standard (manual) on an Azure Cosmos container](how-to-provision-container-throughput.md). * Learn how to [provision standard (manual) throughput on an Azure Cosmos database](how-to-provision-database-throughput.md). * Learn how to [provision autoscale throughput on an Azure Cosmos database or container](how-to-provision-autoscale-throughput.md).
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Sql Api Dotnet Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-dotnet-application.md
ms.devlang: dotnet Previously updated : 05/08/2020 Last updated : 08/26/2021
In this tutorial, you've learned how to build an ASP.NET Core MVC web applicatio
* [Partitioning in Azure Cosmos DB](./partitioning-overview.md) * [Getting started with SQL queries](./sql-query-getting-started.md) * [How to model and partition data on Azure Cosmos DB using a real-world example](./how-to-model-partition-example.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
[Visual Studio Express]: https://www.visualstudio.com/products/visual-studio-express-vs.aspx [Microsoft Web Platform Installer]: https://www.microsoft.com/web/downloads/platform.aspx
cosmos-db Sql Api Dotnet Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-dotnet-samples.md
Previously updated : 07/23/2019 Last updated : 08/26/2021
The [RunDemoAsync](https://github.com/Azure/azure-cosmos-dotnet-v2/tree/master/s
| [Create a user](https://github.com/Azure/azure-documentdb-net/blob/master/samples/code-samples/UserManagement/Program.cs#L93) |[DocumentClient.CreateUserAsync](/dotnet/api/microsoft.azure.documents.client.documentclient.createuserasync) | | [Set permissions on a collection or document](https://github.com/Azure/azure-documentdb-net/blob/master/samples/code-samples/UserManagement/Program.cs#L97) |[DocumentClient.CreatePermissionAsync](/dotnet/api/microsoft.azure.documents.client.documentclient.createpermissionasync) | | [Get a list of a user's permissions](https://github.com/Azure/azure-documentdb-net/blob/master/samples/code-samples/UserManagement/Program.cs#L241) |[DocumentClient.ReadUserAsync](/dotnet/api/microsoft.azure.documents.client.documentclient.readuserasync)<br>[DocumentClient.ReadPermissionFeedAsync](/dotnet/api/microsoft.azure.documents.client.documentclient.readpermissionfeedasync) |+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Sql Api Dotnet V3sdk Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-dotnet-v3sdk-samples.md
Previously updated : 10/07/2019 Last updated : 08/26/2021
The [RunDemoAsync](https://github.com/Azure/azure-cosmos-dotnet-v3/blob/master/M
| [Create a stored procedure](https://github.com/Azure/azure-cosmos-dotnet-v3/blob/master/Microsoft.Azure.Cosmos.Samples/Usage/ServerSideScripts/Program.cs#L116) |[Scripts.CreateStoredProcedureAsync](/dotnet/api/microsoft.azure.cosmos.scripts.scripts.createstoredprocedureasync) | | [Execute a stored procedure](https://github.com/Azure/azure-cosmos-dotnet-v3/blob/master/Microsoft.Azure.Cosmos.Samples/Usage/ServerSideScripts/Program.cs#L135) |[Scripts.ExecuteStoredProcedureAsync](/dotnet/api/microsoft.azure.cosmos.scripts.scripts.executestoredprocedureasync) | | [Delete a stored procedure](https://github.com/Azure/azure-cosmos-dotnet-v3/blob/master/Microsoft.Azure.Cosmos.Samples/Usage/ServerSideScripts/Program.cs#L351) |[Scripts.DeleteStoredProcedureAsync](/dotnet/api/microsoft.azure.cosmos.scripts.scripts.deletestoredprocedureasync) |+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Sql Api Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-get-started.md
ms.devlang: dotnet Previously updated : 08/12/2021 Last updated : 08/26/2021
That's it, build it, and you're on your way!
* Want to do scale and performance testing with Azure Cosmos DB? See [Performance and scale testing with Azure Cosmos DB](performance-testing.md). * To learn how to monitor Azure Cosmos DB requests, usage, and storage, see [Monitor performance and storage metrics in Azure Cosmos DB](./monitor-cosmos-db.md). * To learn more about Azure Cosmos DB, see [Welcome to Azure Cosmos DB](./introduction.md).
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
[cosmos-db-create-account]: create-sql-api-java.md#create-a-database-account
cosmos-db Sql Api Java Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-java-application.md
ms.devlang: java Previously updated : 02/10/2021 Last updated : 08/26/2021
All the samples in this tutorial are included in the [todo](https://github.com/A
## Next steps
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Build a node.js application with Azure Cosmos DB](sql-api-nodejs-application.md)
cosmos-db Sql Api Java Sdk Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-java-sdk-samples.md
Previously updated : 06/08/2021 Last updated : 08/26/2021
The User Management Sample file shows how to do the following tasks:
| Create a user | - | | Set permissions on a collection or document | - | | Get a list of a user's permissions |- |+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Sql Api Nodejs Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-nodejs-application.md
ms.devlang: nodejs Previously updated : 05/19/2021 Last updated : 08/26/2021 #Customer intent: As a developer, I want to build a Node.js web application to access and manage SQL API account resources in Azure Cosmos DB, so that customers can better use the service.
When these resources are no longer needed, you can delete the resource group, Az
## Next steps
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Build mobile applications with Xamarin and Azure Cosmos DB](mobile-apps-with-xamarin.md)
cosmos-db Sql Api Nodejs Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-nodejs-get-started.md
ms.devlang: nodejs Previously updated : 04/20/2020 Last updated : 08/26/2021 #Customer intent: As a developer, I want to build a Node.js console application to access and manage SQL API account resources in Azure Cosmos DB, so that customers can better use the service.
When these resources are no longer needed, you can delete the resource group, Az
## Next steps
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ > [!div class="nextstepaction"] > [Monitor an Azure Cosmos DB account](./monitor-cosmos-db.md)
cosmos-db Sql Api Nodejs Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-nodejs-samples.md
Previously updated : 08/23/2019 Last updated : 08/26/2021
The [index.ts](https://github.com/Azure/azure-cosmos-js/blob/master/samples/Serv
| [Execute a stored procedure](https://github.com/Azure/azure-cosmos-js/blob/master/samples/ServerSideScripts/index.ts) |[StoredProcedure.execute](/javascript/api/%40azure/cosmos/storedprocedure) | For more information about server-side programming, see [Azure Cosmos DB server-side programming: Stored procedures, database triggers, and UDFs](stored-procedures-triggers-udfs.md).+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Sql Api Python Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-python-samples.md
ms.devlang: python Previously updated : 08/11/2020 Last updated : 08/26/2021
The [index_management.py](https://github.com/Azure/azure-sdk-for-python/blob/mas
| [Use range indexes on strings](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/cosmos/azure-cosmos/samples/index_management.py#L401-L485) | Define indexing policy with range indexes on string data type. `'kind': documents.IndexKind.Range`, `'dataType': documents.DataType.String`| | [Perform an index transformation](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/cosmos/azure-cosmos/samples/index_management.py#L488-L544) |database.replace_container (use the updated indexing policy)| | [Use scans when only hash index exists on the path](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/cosmos/azure-cosmos/samples/index_management.py#L339-L398) | set the `enable_scan_in_query=True` and `enable_cross_partition_query=True` when querying the items |+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Sql Api Spring Data Sdk Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-spring-data-sdk-samples.md
Previously updated : 09/23/2020 Last updated : 08/26/2021
The [samples](https://github.com/Azure-Samples/azure-spring-data-cosmos-java-sql
| [Query with aggregate functions](https://github.com/Azure-Samples/azure-spring-data-cosmos-java-sql-api-samples/blob/main/src/main/java/com/azure/cosmos/springexamples/quickstart/sync/UserRepository.java#L56-L62) | @Query annotation | | [Work with subdocuments](https://github.com/Azure-Samples/azure-spring-data-cosmos-java-sql-api-samples/blob/main/src/main/java/com/azure/cosmos/springexamples/quickstart/sync/UserRepository.java#L64-L66) | @Query annotation | | [Query with intra-document Joins](https://github.com/Azure-Samples/azure-spring-data-cosmos-java-sql-api-samples/blob/main/src/main/java/com/azure/cosmos/springexamples/quickstart/sync/UserRepository.java#L68-L85) | @Query annotation |
-| [Query with string, math, and array operators](https://github.com/Azure-Samples/azure-spring-data-cosmos-java-sql-api-samples/blob/main/src/main/java/com/azure/cosmos/springexamples/quickstart/sync/UserRepository.java#L87-L97) | @Query annotation |
+| [Query with string, math, and array operators](https://github.com/Azure-Samples/azure-spring-data-cosmos-java-sql-api-samples/blob/main/src/main/java/com/azure/cosmos/springexamples/quickstart/sync/UserRepository.java#L87-L97) | @Query annotation |
+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Sql Sdk Connection Modes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-sdk-connection-modes.md
Previously updated : 10/14/2020 Last updated : 08/26/2021
For specific SDK platform performance optimizations:
* [.NET V3 SDK performance tips](performance-tips-dotnet-sdk-v3-sql.md)
-* [Java V4 SDK performance tips](performance-tips-java-sdk-v4-sql.md)
+* [Java V4 SDK performance tips](performance-tips-java-sdk-v4-sql.md)
+
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Sql Query Getting Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql/sql-query-getting-started.md
Previously updated : 04/09/2021 Last updated : 08/26/2021
The preceding examples show several aspects of the Cosmos DB query language:
- [Introduction to Azure Cosmos DB](../introduction.md) - [Azure Cosmos DB .NET samples](https://github.com/Azure/azure-cosmos-dotnet-v3) - [SELECT clause](sql-query-select.md)
+- Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](../estimate-ru-with-capacity-planner.md)
cosmos-db Stored Procedures Triggers Udfs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/stored-procedures-triggers-udfs.md
Previously updated : 04/09/2020 Last updated : 08/26/2021
Learn how to write and use stored procedures, triggers, and user-defined functio
* [How to use stored procedures, triggers, and user-defined functions](how-to-use-stored-procedures-triggers-udfs.md)
-* [Working with JavaScript language integrated query API](javascript-query-api.md)
+* [Working with JavaScript language integrated query API](javascript-query-api.md)
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Synthetic Partition Keys https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/synthetic-partition-keys.md
description: Learn how to use synthetic partition keys in your Azure Cosmos cont
Previously updated : 12/03/2019 Last updated : 08/26/2021
You can learn more about the partitioning concept in the following articles:
* Learn more about how to [provision throughput on Azure Cosmos containers and databases](set-throughput.md). * Learn how to [provision throughput on an Azure Cosmos container](how-to-provision-container-throughput.md). * Learn how to [provision throughput on an Azure Cosmos database](how-to-provision-database-throughput.md).
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Templates Samples Sql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/templates-samples-sql.md
Previously updated : 03/24/2021 Last updated : 08/26/2021
This article only shows Azure Resource Manager template examples for Core (SQL)
|[Create a free-tier Azure Cosmos account](manage-with-templates.md#free-tier) | This template creates an Azure Cosmos DB Core (SQL) API account on free-tier. | See [Azure Resource Manager reference for Azure Cosmos DB](/azure/templates/microsoft.documentdb/allversions) page for the reference documentation.+
+## Next steps
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Total Cost Ownership https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/total-cost-ownership.md
Previously updated : 08/01/2019 Last updated : 08/26/2021
The serverless provisioning model of Azure Cosmos DB eliminates the need to over
* **You can save up to 65% of costs with reserved capacity:** Azure Cosmos DB [reserved capacity](cosmos-db-reserved-capacity.md) helps you save money by pre-paying for Azure Cosmos DB resources for either one year or three years. You can significantly reduce your costs with one-year or three-year upfront commitments and save between 20-65% discounts when compared to the regular pricing. On your mission-critical workloads you can get better SLAs in terms of provisioning capacity.
+## Capacity planning
+
+As an aid for estimating TCO, it can be helpful to start with capacity planning. If you're planning a migration from an existing database cluster to Azure Cosmos DB, you can use information about that cluster to size the workload, as in the rough sketch after the following list.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
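+
+As a loose illustration of the vCore-based path, the sketch below applies a placeholder RU-per-vCore factor and a simple replication adjustment. Both numbers are assumptions for illustration only; the linked article on estimating request units from vCores or vCPUs describes the actual conversion guidance.
+
+```java
+// Illustrative only: a placeholder vCore-to-RU/s estimate.
+// The RU-per-vCore factor and the replication handling are assumptions;
+// follow the linked vCore/vCPU estimation article for real guidance.
+public class VcoreEstimateSketch {
+    public static void main(String[] args) {
+        int vCoresPerServer = 4;        // size of each server in the cluster
+        int servers = 3;                // for example, a three-node replica set
+        int replicationFactor = 3;      // copies of the data kept by the cluster
+        double assumedRuPerVcore = 600; // placeholder factor, not a documented value
+
+        // Assumption: only one copy of the data needs to be provisioned in
+        // Azure Cosmos DB, so divide total vCores by the replication factor.
+        double dataBearingVcores =
+                (double) (vCoresPerServer * servers) / replicationFactor;
+        double estimatedRuPerSecond = dataBearingVcores * assumedRuPerVcore;
+
+        System.out.printf("Estimated throughput to provision: %.0f RU/s%n",
+                estimatedRuPerSecond);
+    }
+}
+```
+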
+ ## Next steps
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
* Learn more about [How Azure Cosmos DB pricing model is cost-effective for customers](total-cost-ownership.md) * Learn more about [Optimizing for development and testing](optimize-dev-test.md) * Learn more about [Optimizing throughput cost](optimize-cost-throughput.md)
cosmos-db Troubleshoot Service Unavailable Java Sdk V4 Sql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/troubleshoot-service-unavailable-java-sdk-v4-sql.md
The following list contains known causes and solutions for service unavailable e
### The required ports are being blocked Verify that all the [required ports](sql-sdk-connection-modes.md#service-port-ranges) are enabled.
+### Client initialization failure
+The following exception occurs when the SDK can't reach the Azure Cosmos DB instance. This usually indicates that a security control, such as a firewall rule, is blocking the requests.
+
+```java
+ java.lang.RuntimeException: Client initialization failed. Check if the endpoint is reachable and if your auth token is valid
+```
+
+To validate that the SDK can reach the Azure Cosmos DB account, run the following command from the machine where the application is hosted. If it fails, a firewall rule or other security feature is blocking the request. If it succeeds, the SDK should be able to communicate with the Azure Cosmos DB account.
+```
+telnet myCosmosDbAccountName.documents.azure.com 443
+```
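+
+If the telnet test succeeds but initialization still fails, it can help to reproduce the failure with a minimal client. The following is a rough sketch, assuming the Java SDK v4 (`azure-cosmos`) and placeholder endpoint and key values; it simply builds a client and surfaces any initialization error.
+
+```java
+import com.azure.cosmos.CosmosClient;
+import com.azure.cosmos.CosmosClientBuilder;
+
+// Minimal sketch: build a client to reproduce initialization failures.
+// The endpoint and key below are placeholders, not real values.
+public class ClientInitCheck {
+    public static void main(String[] args) {
+        try (CosmosClient client = new CosmosClientBuilder()
+                .endpoint("https://myCosmosDbAccountName.documents.azure.com:443/")
+                .key("<your-account-key>")
+                .buildClient()) {
+            System.out.println("Client initialized successfully.");
+        } catch (RuntimeException e) {
+            // Initialization failures typically surface here; the message and
+            // cause indicate whether connectivity or authorization is the issue.
+            System.err.println("Client initialization failed: " + e.getMessage());
+        }
+    }
+}
+```
+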
+ ### Client-side transient connectivity issues Service unavailable exceptions can surface when there are transient connectivity problems that are causing timeouts. Typically, the stack trace related to this scenario will contain a `ServiceUnavailableException` error with diagnostic details. For example:
cosmos-db Tutorial Global Distribution Sql Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/tutorial-global-distribution-sql-api.md
Previously updated : 11/05/2019 Last updated : 08/26/2021
You can now proceed to the next tutorial to learn how to develop locally using t
> [!div class="nextstepaction"] > [Develop locally with the emulator](local-emulator.md)
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
+ [regions]: https://azure.microsoft.com/regions/
cosmos-db Tutorial Query Sql Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/tutorial-query-sql-api.md
Previously updated : 08/12/2021 Last updated : 08/26/2021
In this tutorial, you've done the following tasks:
You can now proceed to the next tutorial to learn how to distribute your data globally. > [!div class="nextstepaction"]
-> [Distribute your data globally](tutorial-global-distribution-sql-api.md)
+> [Distribute your data globally](tutorial-global-distribution-sql-api.md)
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Tutorial Sql Api Dotnet Bulk Import https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/tutorial-sql-api-dotnet-bulk-import.md
Previously updated : 03/15/2021 Last updated : 08/26/2021
In this tutorial, you've done the following steps:
You can now proceed to the next tutorial: > [!div class="nextstepaction"]
->[Query Azure Cosmos DB by using the SQL API](tutorial-query-sql-api.md)
+>[Query Azure Cosmos DB by using the SQL API](tutorial-query-sql-api.md)
+
+Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+* If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+* If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Understand Your Bill https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/understand-your-bill.md
Previously updated : 05/25/2021 Last updated : 08/26/2021
Next you can proceed to learn about cost optimization in Azure Cosmos DB with th
* Learn more about [Optimizing the cost of reads and writes](optimize-cost-reads-writes.md) * Learn more about [Optimizing the cost of queries](./optimize-cost-reads-writes.md) * Learn more about [Optimizing the cost of multi-region Azure Cosmos accounts](optimize-cost-regions.md)
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cosmos-db Unique Keys https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/unique-keys.md
Previously updated : 07/23/2020 Last updated : 08/26/2021
You can define unique keys only when you create an Azure Cosmos container. A uni
* Learn more about [logical partitions](partitioning-overview.md) * Explore [how to define unique keys](how-to-define-unique-keys.md) when creating a container
+* Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-with-capacity-planner.md)
cost-management-billing Programmatically Create Subscription Microsoft Partner Agreement https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/programmatically-create-subscription-microsoft-partner-agreement.md
Pass the optional *resellerId* copied from the second step in the request body o
To install the latest version of the module that contains the `New-AzSubscriptionAlias` cmdlet, run `Install-Module Az.Subscription`. To install a recent version of PowerShellGet, see [Get PowerShellGet Module](/powershell/scripting/gallery/installing-psget).
-Run the following [New-AzSubscriptionAlias](/powershell/module/az.subscription/new-azsubscription) command, using the billing scope `"/providers/Microsoft.Billing/billingAccounts/99a13315-xxxx-xxxx-xxxx-xxxxxxxxxxxx:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx_xxxx-xx-xx/customers/2281f543-xxxx-xxxx-xxxx-xxxxxxxxxxxx"`.
+Run the following New-AzSubscriptionAlias command, using the billing scope `"/providers/Microsoft.Billing/billingAccounts/99a13315-xxxx-xxxx-xxxx-xxxxxxxxxxxx:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx_xxxx-xx-xx/customers/2281f543-xxxx-xxxx-xxxx-xxxxxxxxxxxx"`.
```azurepowershell New-AzSubscriptionAlias -AliasName "sampleAlias" -SubscriptionName "Dev Team Subscription" -BillingScope "/providers/Microsoft.Billing/billingAccounts/99a13315-xxxx-xxxx-xxxx-xxxxxxxxxxxx:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx_xxxx-xx-xx/customers/2281f543-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -Workload "Production"
data-factory Concepts Data Flow Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-overview.md
Previously updated : 05/20/2021 Last updated : 08/26/2021 # Mapping data flows in Azure Data Factory
Mapping data flows are available in the following regions in ADF:
| South Central US | | | South India | | | Southeast Asia | ✓ |
-| Switzerland North | |
+| Switzerland North | ✓ |
| Switzerland West | | | UAE Central | | | UAE North | ✓ |
data-factory Connect Data Factory To Azure Purview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connect-data-factory-to-azure-purview.md
Previously updated : 08/10/2021 Last updated : 08/24/2021 # Connect Data Factory to Azure Purview (Preview)
You have two options to connect data factory to Azure Purview:
### Connect to Azure Purview account in Data Factory
-To establish the connection, you need to have **Owner** or **Contributor** role on your data factory.
+You need to have the **Owner** or **Contributor** role on your data factory to connect to an Azure Purview account.
+
+To establish the connection in the Data Factory authoring UI:
1. In the ADF authoring UI, go to **Manage** -> **Azure Purview**, and select **Connect to a Purview account**.
To establish the connection, you need to have **Owner** or **Contributor** role
3. Once connected, you can see the name of the Purview account in the tab **Purview account**.
-When connecting data factory to Purview, ADF UI also tries to grant the data factory's managed identity **Purview Data Curator** role on your Purview account. Managed identity is used to authenticate lineage push operations from data factory to Purview. If you have **Owner** or **User Access Administrator** role on the Purview account, this operation will be done automatically. If you see warning like the following, it means the needed role is not granted:
+The Purview connection information is stored in the data factory resource as shown in the following example. To establish the connection programmatically, update the data factory and add the `purviewConfiguration` settings.
+
+```json
+{
+ "name": "ContosoDataFactory",
+ "type": "Microsoft.DataFactory/factories",
+ "location": "<region>",
+ "properties": {
+ ...
+ "purviewConfiguration": {
+ "purviewResourceId": "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupname>/providers/Microsoft.Purview/accounts/<PurviewAccountName>"
+ }
+ },
+ "identity": {...},
+ ...
+}
+```
+### Register Data Factory in Azure Purview
-To fix the issue, go to Azure portal -> your Purview account -> Access control (IAM), check if **Purview Data Curator** role is granted to the data factory's managed identity. If not, manually add the role assignment.
+For how to register Data Factory in Azure Purview, see [How to connect Azure Data Factory and Azure Purview](../purview/how-to-link-azure-data-factory.md).
-### Register Data Factory in Azure Purview
+## Set up authentication
+
+Data factory's managed identity is used to authenticate lineage push operations from data factory to Purview.
+
+- For a Purview account created **on or after August 18, 2021**, grant the data factory's managed identity the **Data Curator** role on your Purview **root collection**. Learn more about [Access control in Azure Purview](../purview/catalog-permissions.md) and [Add roles and restrict access through collections](../purview/how-to-create-and-manage-collections.md#add-roles-and-restrict-access-through-collections).
-For how to register Data Factory in Azure Purview, see [How to connect Azure Data Factory and Azure Purview](../purview/how-to-link-azure-data-factory.md).
+ When you connect the data factory to Purview in the authoring UI, ADF tries to add this role assignment automatically. If you have the **Collection admins** role on the Purview root collection, the operation succeeds.
+
+- For a Purview account created **before August 18, 2021**, grant the data factory's managed identity the Azure built-in [**Purview Data Curator**](../role-based-access-control/built-in-roles.md#purview-data-curator) role on your Purview account. Learn more about [Access control in Azure Purview - legacy permissions](../purview/catalog-permissions.md#legacy-permission-guide).
+
+ When you connect the data factory to Purview in the authoring UI, ADF tries to add this role assignment automatically. If you have the Azure built-in **Owner** or **User Access Administrator** role on the Purview account, the operation succeeds.
+
+You may see the following warning if you have the privilege to read Purview role assignment information but the needed role is not granted. To make sure the connection is set up properly for pipeline lineage push, go to your Purview account and check whether the **Purview Data Curator** role is granted to the data factory's managed identity. If not, manually add the role assignment.
+ ## Report lineage data to Azure Purview
-After you connect the data factory to a Purview account, when you run Copy, Data flow or Execute SSIS package activity, you can get the lineage between datasets created by data processes and have a high-level overview of whole workflow process among data sources and destination. For detailed supported capabilities, see [Supported Azure Data Factory activities](../purview/how-to-link-azure-data-factory.md#supported-azure-data-factory-activities). For an end to end walkthrough, refer to [Tutorial: Push Data Factory lineage data to Azure Purview](tutorial-push-lineage-to-purview.md).
+Once you connect the data factory to a Purview account, Data Factory pushes lineage information to the Purview account when you execute pipelines. For detailed supported capabilities, see [Supported Azure Data Factory activities](../purview/how-to-link-azure-data-factory.md#supported-azure-data-factory-activities). For an end-to-end walkthrough, refer to [Tutorial: Push Data Factory lineage data to Azure Purview](tutorial-push-lineage-to-purview.md).
## Discover and explore data using Purview
-After you connect the data factory to a Purview account, you can use the search bar at the top center of Azure Data Factory autoring UI to search for data and perform actions. Learn more from [Discover and explore data in ADF using Purview](how-to-discover-explore-purview-data.md).
+Once you connect the data factory to a Purview account, you can use the search bar at the top center of the Data Factory authoring UI to search for data and perform actions. Learn more from [Discover and explore data in ADF using Purview](how-to-discover-explore-purview-data.md).
## Next steps
defender-for-iot Agent Based Security Alerts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/agent-based-security-alerts.md
Title: Agent based security alerts
-description: Learn about security alerts and recommended remediation using Defender for IoT device's features and service.
+ Title: Classic agent based security alerts
+description: Learn about the classic version of Defender for IoT security alerts, and the recommended remediation using Defender for IoT device features and services.
Previously updated : 2/16/2021 Last updated : 08/25/2021
-# Defender for IoT devices security alerts
+# Classic Defender for IoT devices security alerts
Defender for IoT continuously analyzes your IoT solution using advanced analytics and threat intelligence to alert you to malicious activity. In addition, you can create custom alerts based on your knowledge of expected device behavior.
defender-for-iot Concept Agent Based Security Alerts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/concept-agent-based-security-alerts.md
+
+ Title: Agent based security alerts
+description: Learn about security alerts and recommended remediation using Defender for IoT device's features and service.
+ Last updated : 08/25/2021++
+# Defender for IoT devices security alerts
+
+Defender for IoT continuously analyzes your IoT solution using advanced analytics and threat intelligence to alert you to malicious activity.
+In addition, you can create custom alerts based on your knowledge of expected device behavior.
+An alert acts as an indicator of potential compromise, and should be investigated and remediated.
+
+In this article, you will find a list of built-in alerts, which can be triggered on your IoT devices.
+In addition to built-in alerts, Defender for IoT allows you to define custom alerts based on expected IoT Hub and/or device behavior.
+For more information, see [customizable alerts](concept-customizable-security-alerts.md).
+
+## Agent based security alerts
+
+| Name | Severity | Data Source | Description | Suggested remediation steps |
+|--|--|--|--|--|
+| **High** severity | | | | |
+| Binary Command Line | High | Defender-IoT-micro-agent | A Linux binary being called/executed from the command line was detected. This process may be legitimate activity, or an indication that your device is compromised. | Review the command with the user that ran it and check if this is something legitimately expected to run on the device. If not, escalate the alert to your information security team. |
+| Disable firewall | High | Defender-IoT-micro-agent | Possible manipulation of on-host firewall detected. Malicious actors often disable the on-host firewall in an attempt to exfiltrate data. | Review with the user that ran the command to confirm if this was legitimate expected activity on the device. If not, escalate the alert to your information security team. |
+| Port forwarding detection | High | Defender-IoT-micro-agent | Initiation of port forwarding to an external IP address detected. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Possible attempt to disable Auditd logging detected | High | Defender-IoT-micro-agent | The Linux Auditd system provides a way to track security-relevant information on the system. The system records as much information about the events that are happening on your system as possible. This information is crucial for mission-critical environments to determine who violated the security policy and the actions they performed. Disabling Auditd logging may prevent your ability to discover violations of security policies used on the system. | Check with the device owner if this was legitimate activity with business reasons. If not, this event may be hiding activity by malicious actors. Immediately escalate the incident to your information security team. |
+| Reverse shells | High | Defender-IoT-micro-agent | Analysis of host data on a device detected a potential reverse shell. Reverse shells are often used to get a compromised machine to call back into a machine controlled by a malicious actor. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Successful local login | High | Defender-IoT-micro-agent | Successful local sign in to the device detected | Make sure the signed in user is an authorized party. |
+| Web shell | High | Defender-IoT-micro-agent | Possible web shell detected. Malicious actors commonly upload a web shell to a compromised machine to gain persistence or for further exploitation. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Behavior similar to ransomware detected | High | Defender-IoT-micro-agent | Execution of files similar to known ransomware that may prevent users from accessing their system, or personal files, and may demand ransom payment to regain access. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Crypto coin miner image | High | Defender-IoT-micro-agent | Execution of a process normally associated with digital currency mining detected. | Verify with the user that ran the command if this was legitimate activity on the device. If not, escalate the alert to the information security team. |
+| **Medium** severity | | | | |
+| Behavior similar to common Linux bots detected | Medium | Defender-IoT-micro-agent | Execution of a process normally associated with common Linux botnets detected. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Behavior similar to Fairware ransomware detected | Medium | Defender-IoT-micro-agent | Execution of rm -rf commands applied to suspicious locations detected using analysis of host data. Because rm -rf recursively deletes files, it is normally only used on discrete folders. In this case, it is being used in a location that could remove a large amount of data. Fairware ransomware is known to execute rm -rf commands in this folder. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Crypto coin miner container image detected | Medium | Defender-IoT-micro-agent | A container running a known digital currency mining image was detected. | 1. If this behavior is not intended, delete the relevant container image.<br> 2. Make sure that the Docker daemon is not accessible via an unsafe TCP socket. <br> 3. Escalate the alert to the information security team. |
+| Detected suspicious use of the nohup command | Medium | Defender-IoT-micro-agent | Suspicious use of the nohup command on host detected. Malicious actors commonly run the nohup command from a temporary directory, effectively allowing their executables to run in the background. Seeing this command run on files located in a temporary directory is not expected or usual behavior. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Detected suspicious use of the useradd command | Medium | Defender-IoT-micro-agent | Suspicious use of the useradd command detected on the device. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Exposed Docker daemon by TCP socket | Medium | Defender-IoT-micro-agent | Machine logs indicate that your Docker daemon (dockerd) exposes a TCP socket. By default, the Docker configuration does not use encryption or authentication when a TCP socket is enabled, which allows anyone with access to the relevant port full access to the Docker daemon. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Failed local login | Medium | Defender-IoT-micro-agent | A failed local login attempt to the device was detected. | Make sure no unauthorized party has physical access to the device. |
+| Detected file download from a malicious source | Medium | Defender-IoT-micro-agent | Download of a file from a known malware source detected. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| htaccess file access detected | Medium | Defender-IoT-micro-agent | Analysis of host data detected possible manipulation of a htaccess file. Htaccess is a powerful configuration file that allows you to make multiple changes to a web server running Apache Web software, including basic redirect functionality, and more advanced functions, such as basic password protection. Malicious actors often modify htaccess files on compromised machines to gain persistence. | Confirm this is legitimate expected activity on the host. If not, escalate the alert to your information security team. |
+| Known attack tool | Medium | Defender-IoT-micro-agent | A tool often associated with malicious users attacking other machines in some way was detected. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Local host reconnaissance detected | Medium | Defender-IoT-micro-agent | Execution of a command normally associated with common Linux bot reconnaissance detected. | Review the suspicious command line to confirm that it was executed by a legitimate user. If not, escalate the alert to your information security team. |
+| Mismatch between script interpreter and file extension | Medium | Defender-IoT-micro-agent | Mismatch between the script interpreter and the extension of the script file provided as input detected. This type of mismatch is commonly associated with attacker script executions. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Possible backdoor detected | Medium | Defender-IoT-micro-agent | A suspicious file was downloaded and then run on a host in your subscription. This type of activity is commonly associated with the installation of a backdoor. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Possible loss of data detected | Medium | Defender-IoT-micro-agent | Possible data egress condition detected using analysis of host data. Malicious actors often egress data from compromised machines. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Privileged container detected | Medium | Defender-IoT-micro-agent | Machine logs indicate that a privileged Docker container is running. A privileged container has full access to host resources. If compromised, a malicious actor can use the privileged container to gain access to the host machine. | If the container doesn't need to run in privileged mode, remove the privileges from the container. |
+| Removal of system logs files detected | Medium | Defender-IoT-micro-agent | Suspicious removal of log files on the host detected. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Space after filename | Medium | Defender-IoT-micro-agent | Execution of a process with a suspicious extension detected using analysis of host data. Suspicious extensions may trick users into thinking files are safe to be opened and can indicate the presence of malware on the system. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Tools commonly used for malicious credentials access detected | Medium | Defender-IoT-micro-agent | Detected usage of a tool commonly associated with malicious attempts to access credentials. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Suspicious compilation detected | Medium | Defender-IoT-micro-agent | Suspicious compilation detected. Malicious actors often compile exploits on a compromised machine to escalate privileges. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Suspicious file download followed by file run activity | Medium | Defender-IoT-micro-agent | Analysis of host data detected a file that was downloaded and run in the same command. This technique is commonly used by malicious actors to get infected files onto victim machines. | Review with the user that ran the command if this was legitimate activity that you expect to see on the device. If not, escalate the alert to the information security team. |
+| Suspicious IP address communication | Medium | Defender-IoT-micro-agent | Communication with a suspicious IP address detected. | Verify if the connection is legitimate. Consider blocking communication with the suspicious IP. |
+| **Low** severity | | | | |
+| Bash history cleared | Low | Defender-IoT-micro-agent | Bash history log cleared. Malicious actors commonly erase bash history to hide their own commands from appearing in the logs. | Review the activity in this alert with the user that ran the command to see if you recognize it as legitimate administrative activity. If not, escalate the alert to the information security team. |
+
+## Next steps
+
+- Defender for IoT service [Overview](overview.md)
+- Learn how to [Access your security data](how-to-security-data-access.md)
+- Learn more about [Investigating a device](how-to-investigate-device.md)
defender-for-iot Concept Micro Agent Linux Dependencies https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/concept-micro-agent-linux-dependencies.md
Title: Micro agent Linux dependencies (Preview) description: This article describes the different Linux OS dependencies for the Defender for IoT micro agent. Previously updated : 07/19/2021 Last updated : 08/26/2021 # Micro agent Linux dependencies (Preview)
The table below shows the Linux dependencies for each component.
| | libpcap | Library | | |
| | CONFIG_PACKET=y | Kernel config | | |
| | CONFIG_NETFILTER=y | Kernel config | | Optional – Performance improvement |
+| **Login collector** | | | | |
+| | Wtmp, btmp | Log files | | [utmp](https://en.wikipedia.org/wiki/Utmp) |
## Next steps
defender-for-iot Quickstart Standalone Agent Binary Installation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/device-builders/quickstart-standalone-agent-binary-installation.md
Title: 'Quickstart: Install Defender for IoT micro agent (Preview)' description: In this quickstart, learn how to install, and authenticate the Defender Micro Agent. Previously updated : 06/27/2021 Last updated : 08/26/2021
Before you install the Defender for IoT module, you must create a module identit
sudo cp ./microsoft-prod.list /etc/apt/sources.list.d/ ```
-1. Update the list of packages from the repository that you added with the following command:
+1. Install the Microsoft GPG public key:
```bash
- sudo apt-get update
+ curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
+ sudo cp ./microsoft.gpg /etc/apt/trusted.gpg.d/
``` To install the Defender micro agent package on Debian, and Ubuntu based Linux distributions, use the following command:
To install the Defender micro agent package on Debian, and Ubuntu based Linux di
sudo apt-get install defender-iot-micro-agent ```
-## Micro agent authentication methods
+## Micro agent authentication methods
-The two options used to authenticate the Defender for IoT micro agent are:
+The two options used to authenticate the Defender for IoT micro agent are:
-- Module identity connection string.
+- Module identity connection string.
- Certificate. ### Authenticate using a module identity connection string
-Ensure the [Prerequisites](#prerequisites) for this article are met, and that you create a module identity before starting these steps.
+Ensure the [Prerequisites](#prerequisites) for this article are met, and that you create a module identity before starting these steps.
#### Get the module identity connection string
-To get the module identity connection string from the IoT Hub:
+To get the module identity connection string from the IoT Hub:
1. Navigate to the IoT Hub, and select your hub.
To authenticate using a certificate:
1. Procure a certificate by following [these instructions](../../iot-hub/tutorial-x509-scripts.md).
-1. Place the PEM-encoded public part of the certificate, and the private key, in to the Defender Agent Directory in to the file called `certificate_public.pem`, and `certificate_private.pem`.
+1. Place the PEM-encoded public part of the certificate and the private key into the Defender agent directory, in the files `certificate_public.pem` and `certificate_private.pem`.
-1. Place the appropriate connection string in to the `connection_string.txt` file. the connection string should look like this:
+1. Place the appropriate connection string into the `connection_string.txt` file. The connection string should look like this:
- `HostName=<the host name of the iot hub>;DeviceId=<the id of the device>;ModuleId=<the id of the module>;x509=true`
+ `HostName=<the host name of the iot hub>;DeviceId=<the id of the device>;ModuleId=<the id of the module>;x509=true`
- This string alerts the defender agent, to expect a certificate be provided for authentication.
+    This string instructs the Defender agent to expect a certificate to be provided for authentication.
1. Restart the service using the following command:
To validate your installation:
1. Ensure that the service is stable by making sure it is `active` and that the uptime of the process is appropriate :::image type="content" source="media/quickstart-standalone-agent-binary-installation/active-running.png" alt-text="Check to make sure your service is stable and active.":::
-
-## Testing the system end-to-end
-You can test the system from end to end by creating a trigger file on the device. The trigger file will cause the baseline scan in the agent to detect the file as a baseline violation.
+## Testing the system end-to-end
+
+You can test the system from end to end by creating a trigger file on the device. The trigger file will cause the baseline scan in the agent to detect the file as a baseline violation.
Create a file on the file system with the following command:
Create a file on the file system with the following command:
sudo touch /tmp/DefenderForIoTOSBaselineTrigger.txt ```
-A baseline validation failure recommendation will occur in the hub, with a `CceId` of CIS-debian-9-DEFENDER_FOR_IOT_TEST_CHECKS-0.0:
+A baseline validation failure recommendation will occur in the hub, with a `CceId` of CIS-debian-9-DEFENDER_FOR_IOT_TEST_CHECKS-0.0:
:::image type="content" source="media/quickstart-standalone-agent-binary-installation/validation-failure.png" alt-text="The baseline validation failure recommendation that occurs in the hub." lightbox="media/quickstart-standalone-agent-binary-installation/validation-failure-expanded.png":::
-Allow up to one hour for the recommendation to appear in the hub.
+Allow up to one hour for the recommendation to appear in the hub.
-## Micro agent versioning
+## Micro agent versioning
-To install a specific version of the Defender IoT micro agent, run the following command:
+To install a specific version of the Defender IoT micro agent, run the following command:
```bash sudo apt-get install defender-iot-micro-agent=<version>
defender-for-iot Integration Cisco Ise Pxgrid https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/integration-cisco-ise-pxgrid.md
- Title: About the Cisco ISE pxGrid integration
-description: Bridging the capabilities of Defender for IoT with Cisco ISE pxGrid, provides security teams unprecedented device visibility to the enterprise ecosystem.
Previously updated : 12/28/2020---
-# About the Cisco ISE pxGrid integration
-
-Defender for IoT delivers the only ICS and IoT cybersecurity platform built by blue-team experts with a track record defending critical national infrastructure, and the only platform with patented ICS-aware threat analytics and machine learning. Defender for IoT provides:
--- Immediate insights about ICS devices, vulnerabilities, and known and zero-day threats.--- ICS-aware deep embedded knowledge of OT protocols, devices, applications, and their behaviors.--- An automated ICS threat modeling technology to predict the most likely paths of targeted ICS attacks via proprietary analytics.-
-## Powerful device visibility and control
-
-The Defender for IoT integration with Cisco ISE pxGrid provides a new level of centralized visibility, monitoring, and control for the OT landscape.
-
-These bridged platforms enable automated device visibility and protection to previously unreachable ICS and IIoT devices.
-
-### ICS and IIoT device visibility: comprehensive and deep
-
-Patented Defender for IoT technologies ensures comprehensive and deep ICS and IIoT device discovery and inventory management for enterprise data.
-
-Device types, manufacturers, open ports, serial numbers, firmware revisions, IP addresses, and MAC address data and more are updated in real time. Defender for IoT can further enhance visibility, discovery, and analysis from this baseline by integrating with critical enterprise data sources. For example, CMDBs, DNS, Firewalls, and Web APIs.
-
-In addition, the Defender for IoT platform combines passive monitoring and optional selective probing techniques to provide the most accurate and detailed inventory of devices in industrial and critical infrastructure organizations.
-
-### Bridged capabilities
-
-Bridging these capabilities with Cisco ISE pxGrid, provides security teams unprecedented device visibility to the enterprise ecosystem.
-
-Seamless, robust integration with Cisco ISE pxGrid ensures no OT device goes undiscovered and no device information is missed.
----
-### Use case coverage: ISE policies based on Defender for IoT attributes
-
-Use powerful ISE policies based on Defender for IoT deep learning to handle ICS and IoT use case requirements.
-
-Working with policies lets you close the security cycle and bring your network to compliance in real time.
-
-For example, customers can use Defender for IoT ICS and IoT attributes to create policies that handle the following use cases, such as:
--- Create an authorization policy to allow known and authorized devices into a sensitive zone based on allowed attributes - for example, specific firmware version for Rockwell Automation PLCs.--- Notify security teams when an ICS device is detected in a non-OT zone.--- Remediate machines with outdated or noncompliant vendors.--- Quarantine and block devices as required.--- Generate reports on PLCs or RTUs running firmware with known vulnerabilities (CVEs).-
-### About this article
-
-This article describes how to set up pxGrid and the Defender for IoT platform to ensure that Defender for IoT injects OT attributes to Cisco ISE.
-
-### Getting more information
-
-For more information about Cisco ISE pxGrid integration requirements, see <https://www.cisco.com/c/en/us/products/security/pxgrid.html>
-
-## Integration system requirements
-
-This article describes integration system requirements:
-
-Defender for IoT requirements
--- Defender for IoT version 2.5-
-Cisco requirements
--- pxGrid version 2.0--- Cisco ISE version 2.4-
-Network requirements
--- Verify that the Defender for IoT appliance has access to the Cisco ISE management interface.--- Verify that you have CLI access to all Defender for IoT appliances in your enterprise.-
-> [!NOTE]
-> Only devices with MAC addresses are synced with Cisco ISE pxGrid.
-
-## Cisco ISE pxGrid setup
-
-This article describes how to:
--- Set up communication with pxGrid--- Subscribe to the endpoint device topic--- Generate certificates--- Define pxGrid settings-
-## Set up communication with pxGrid
-
-This article describes how to set up communication with pxGrid.
-
-To set up communication:
-
-1. Sign in to Cisco ISE.
-
-1. Select **Administration**>**System** and **Deployment**.
-
-1. Select the required node. In the General Settings tab, select the **pxGrid checkbox**.
-
- :::image type="content" source="media/integration-cisco-isepxgrid-integration/pxgrid.png" alt-text="Ensure the pxgrid checkbox is selected.":::
-
-1. Select the **Profiling Configuration** tab.
-
-1. Select the **pxGrid checkbox**. Add a description.
-
- :::image type="content" source="media/integration-cisco-isepxgrid-integration/profile-configuration.png" alt-text="Screenshot of the add description":::
-
-## Subscribe to the endpoint device topic
-
-Verify that the ISE pxGrid node has subscribed to the endpoint device topic. Navigate to **Administration**>**pxGrid Services**>**Web Clients**. There, you can verify that ISE has subscribed to the endpoint device topic.
-
-## Generate certificates
-
-This article describes how to generate certificates.
-
-To generate:
-
-1. Select **Administration** > **pxGrid Services**, and then select **Certificates**.
-
- :::image type="content" source="media/integration-cisco-isepxgrid-integration/certificates.png" alt-text="Select the certificates tab in order to generate a certificate.":::
-
-1. In the **I Want To** field, select **Generate a single certificate (without a certificate signing request)**.
-
-1. Fill in the remaining fields and select **Create**.
-
-1. Create the client certificate key and the server certificate, and then convert them to java keystore format. You'll need to copy these to each Defender for IoT sensor in your network.
-
-## Define pxGrid settings
-
-To define settings:
-
-1. Select **Administration** > **pxGrid Services** and then select **Settings**.
-
-1. Enable the **Automatically approve new certificate-based accounts** and **Allow password-based account creation.**
-
- :::image type="content" source="media/integration-cisco-isepxgrid-integration/allow-these.png" alt-text="Ensure both checkboxes are selected.":::
-
-## Set up the Defender for IoT appliances
-
-This article describes how to set up Defender for IoT appliances to communicate with pxGrid. The configuration should be carried out on all Defender for IoT appliances that will inject data to Cisco ISE.
-
-To set up appliances:
-
-1. Sign in to the sensor CLI.
-
-1. Copy `trust.jks`, and , which were previously created on the sensor. Copy them to: `/var/cyberx/properties/`.
-
-1. Edit `/var/cyberx/properties/pxgrid.properties`:
-
- 1. Add a key and trust, then store filenames and passwords.
-
- 2. Add the hostname of the pxGrid instance.
-
- 3. `Enabled=true`
-
-1. Restart the appliance.
-
-1. Repeat these steps for each appliance in your enterprise that will inject data.
-
-## View and manage detections in Cisco ISE
-
-1. Endpoint detections are displayed in the ISE Context **Visibility** > **Endpoints** tab.
-
-1. Select **Policy** > **Profiling** > **Add** > **Rules** > **+ Condition** > **Create New Condition**.
-
-1. Select **Attribute** and use the IOT device dictionaries to build a profiling rule based on the attributes injected. The following attributes are injected.
-
- - Device type
-
- - Hardware revision
-
- - IP address
-
- - MAC address
-
- - Name
-
- - Product ID
-
- - Protocol
-
- - Serial number
-
- - Software revision
-
- - Vendor
-
-Only devices with MAC addresses are synced.
-
-## Troubleshooting and logs
-
-Logs can be found in:
--- `/var/cyberx/logs/pxgrid.log`--- `/var/cyberx/logs/core.log`-
-## Next steps
-
-Learn how to [Forward alert information](how-to-forward-alert-information-to-partners.md).
devtest-labs Devtest Lab Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/devtest-lab-overview.md
Title: About Azure DevTest Labs | Microsoft Docs description: Learn how DevTest Labs can make it easy to create, manage, and monitor Azure virtual machines Previously updated : 06/20/2020 Last updated : 08/20/2021 # About Azure DevTest Labs
Use pre-made plug-ins or the API to provision development/testing environments d
See the following articles: - To learn more about DevTest Labs, see [DevTest Labs concepts](devtest-lab-concepts.md).-- For a walkthrough with step-by-step instructions, see [Tutorial: Set up a lab by using Azure DevTest Labs](tutorial-create-custom-lab.md).
+- For a walkthrough with step-by-step instructions, see [Tutorial: Set up a lab by using Azure DevTest Labs](tutorial-create-custom-lab.md).
devtest-labs Devtest Lab Troubleshoot Apply Artifacts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/devtest-labs/devtest-lab-troubleshoot-apply-artifacts.md
You can troubleshoot VMs created using DevTest Labs and the Resource Manager dep
An artifact appears to stop responding until a pre-defined timeout period expires, and the artifact is marked as **Failed**.
-When an artifact appears to hang, first determine where it's stuck. An artifact can be blocked at any of the following steps during execution:
+When an artifact appears to stop responding, first determine where it's stuck. An artifact can be blocked at any of the following steps during execution:
- **During the initial request**. DevTest Labs creates an Azure Resource Manager template to request the use of the Custom Script Extension (CSE). Therefore, behind the scenes, a resource group deployment is triggered. When an error at this level happens, you get details in the **Activity Logs** of the resource group for the VM in question. - You can access the activity log from the lab VM page navigation bar. When you select it, you see an entry for either **applying artifacts to virtual machine** (if the apply artifacts operation was triggered directly) or **Add or modify virtual machines** (if the applying artifacts operation was part of the VM creation process).
dms Tutorial Mongodb Cosmos Db Online https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dms/tutorial-mongodb-cosmos-db-online.md
Previously updated : 05/19/2021 Last updated : 08/26/2021 # Tutorial: Migrate MongoDB to Azure Cosmos DB's API for MongoDB online using DMS
After you migrate the data stored in MongoDB database to Azure Cosmos DB's API
## Additional resources * [Cosmos DB service information](https://azure.microsoft.com/services/cosmos-db/)
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../cosmos-db/convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](../cosmos-db/mongodb/estimate-ru-capacity-planner.md)
## Next steps
dms Tutorial Mongodb Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dms/tutorial-mongodb-cosmos-db.md
Previously updated : 05/19/2021 Last updated : 08/26/2021 # Tutorial: Migrate MongoDB to Azure Cosmos DB API for MongoDB offline
After the migration finishes, you can check your Azure Cosmos DB account to veri
After you migrate the data stored in MongoDB database to the Azure Cosmos DB API for MongoDB, you can connect to Azure Cosmos DB and manage the data. You can also perform other post-migration optimization steps. These might include optimizing the indexing policy, updating the default consistency level, or configuring global distribution for your Azure Cosmos DB account. For more information, see [Post-migration optimization](../cosmos-db/mongodb-post-migration.md).
+## Additional resources
+
+* Trying to do capacity planning for a migration to Azure Cosmos DB?
+ * If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../cosmos-db/convert-vcore-to-request-unit.md)
+ * If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](../cosmos-db/mongodb/estimate-ru-capacity-planner.md)
+ ## Next steps Review migration guidance for additional scenarios in the [Azure Database Migration Guide](https://datamigration.microsoft.com/).
dms Tutorial Sql Server Managed Instance Online https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dms/tutorial-sql-server-managed-instance-online.md
Previously updated : 08/04/2020 Last updated : 08/20/2021 # Tutorial: Migrate SQL Server to an Azure SQL Managed Instance online using DMS You can use Azure Database Migration Service to migrate the databases from a SQL Server instance to an [Azure SQL Managed Instance](../azure-sql/managed-instance/sql-managed-instance-paas-overview.md) with minimal downtime. For additional methods that may require some manual effort, see the article [SQL Server instance migration to Azure SQL Managed Instance](../azure-sql/migration-guides/managed-instance/sql-server-to-managed-instance-guide.md).
-In this tutorial, you migrate the **Adventureworks2012** database from an on-premises instance of SQL Server to a SQL Managed Instance with minimal downtime by using Azure Database Migration Service.
+In this tutorial, you migrate the [AdventureWorks2016](/sql/samples/adventureworks-install-configure#download-backup-files) database from an on-premises instance of SQL Server to a SQL Managed Instance with minimal downtime by using Azure Database Migration Service.
-In this tutorial, you learn how to:
+You will learn how to:
> [!div class="checklist"] >
+> * Register the Azure DataMigration resource provider.
> * Create an instance of Azure Database Migration Service. > * Create a migration project and start online migration by using Azure Database Migration Service. > * Monitor the migration.
This article describes an online migration from SQL Server to a SQL Managed Inst
To complete this tutorial, you need to:
+* Download and install [SQL Server 2016 or later](https://www.microsoft.com/sql-server/sql-server-downloads).
+* Enable the TCP/IP protocol, which is disabled by default during SQL Server Express installation, by following the instructions in the article [Enable or Disable a Server Network Protocol](/sql/database-engine/configure-windows/enable-or-disable-a-server-network-protocol#SSMSProcedure).
+* [Restore the AdventureWorks2016 database to the SQL Server instance.](/sql/samples/adventureworks-install-configure#restore-to-sql-server)
* Create a Microsoft Azure Virtual Network for Azure Database Migration Service by using the Azure Resource Manager deployment model, which provides site-to-site connectivity to your on-premises source servers by using either [ExpressRoute](../expressroute/expressroute-introduction.md) or [VPN](../vpn-gateway/vpn-gateway-about-vpngateways.md). [Learn network topologies for SQL Managed Instance migrations using Azure Database Migration Service](./resource-network-topologies.md). For more information about creating a virtual network, see the [Virtual Network Documentation](../virtual-network/index.yml), and especially the quickstart articles with step-by-step details. > [!NOTE]
To complete this tutorial, you need to:
## Create an Azure Database Migration Service instance
-1. In the Azure portal, select + **Create a resource**, search for **Azure Database Migration Service**, and then select **Azure Database Migration Service** from the drop-down list.
+1. In the Azure portal menu or on the **Home** page, select **Create a resource**. Search for and select **Azure Database Migration Service**.
- ![Azure Marketplace](media/tutorial-sql-server-to-managed-instance-online/portal-marketplace.png)
+ ![Azure Marketplace](media/tutorial-sql-server-to-managed-instance-online/portal-marketplace.png)
2. On the **Azure Database Migration Service** screen, select **Create**.
- ![Create Azure Database Migration Service instance](media/tutorial-sql-server-to-managed-instance-online/dms-create1.png)
+ ![Create Azure Database Migration Service instance](media/tutorial-sql-server-to-managed-instance-online/dms-create-service1.png)
-3. On the **Create Migration Service** screen, specify a name for the service, the subscription, and a new or existing resource group.
+3. On the **Create Migration Service** basics screen:
-4. Select the location in which you want to create the instance of DMS.
-
-5. Select an existing virtual network or create one.
-
- The virtual network provides Azure Database Migration Service with access to the source SQL Server and target SQL Managed Instance.
+ - Select the subscription.
+ - Create a new resource group or choose an existing one.
+ - Specify a name for the instance of the Azure Database Migration Service.
+ - Select the location in which you want to create the instance of Azure Database Migration Service.
+ - Choose **Azure** as the service mode.
+ - Select an SKU from the Premium pricing tier.
+
+ > [!NOTE]
+ > Online migrations are supported only when using the Premium tier.
- For more information on how to create a virtual network in Azure portal, see the article [Create a virtual network using the Azure portal](../virtual-network/quick-create-portal.md).
+ - For more information on costs and pricing tiers, see the [pricing page](https://aka.ms/dms-pricing).
- For additional detail, see the article [Network topologies for SQL Managed Instance migrations using Azure Database Migration Service](./resource-network-topologies.md).
+ ![Configure Azure Database Migration Service instance basics settings](media/tutorial-sql-server-to-managed-instance-online/dms-create-service2.png)
-6. Select a SKU from the Premium pricing tier.
+ - Select **Next: Networking**.
- > [!NOTE]
- > Online migrations are supported only when using the Premium tier.
+4. On the **Create Migration Service** networking screen:
- For more information on costs and pricing tiers, see the [pricing page](https://aka.ms/dms-pricing).
+ - Select an existing virtual network or create a new one. The virtual network provides Azure Database Migration Service with access to the source SQL Server and the target Azure SQL Managed Instance. (A CLI sketch for creating a virtual network appears after this list.)
+
+ - For more information about how to create a virtual network in the Azure portal, see the article [Create a virtual network using the Azure portal](../virtual-network/quick-create-portal.md).
+
+ - For additional detail, see the article [Network topologies for Azure SQL Managed Instance migrations using Azure Database Migration Service](./resource-network-topologies.md).
- ![Create DMS Service](media/tutorial-sql-server-to-managed-instance-online/dms-create-service3.png)
+ ![Configure Azure Database Migration Service instance networking settings](media/tutorial-sql-server-to-managed-instance-online/dms-create-service3.png)
-7. Select **Create** to create the service.
+ - Select **Review + Create** to review the details and then select **Create** to create the service.
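
If you prefer to prepare the virtual network from the command line before this step, the following Azure CLI sketch creates one with a single subnet. The names and address ranges are placeholder assumptions, not part of the tutorial, and the network still needs the ExpressRoute or VPN connectivity to your source server described in the prerequisites.

```bash
# Minimal sketch with placeholder names and address ranges; adjust to your environment.
az network vnet create \
  --name dms-vnet \
  --resource-group <resourceGroupName> \
  --location <region> \
  --address-prefixes 10.10.0.0/16 \
  --subnet-name dms-subnet \
  --subnet-prefixes 10.10.1.0/24
```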
## Create a migration project After an instance of the service is created, locate it within the Azure portal, open it, and then create a new migration project.
-1. In the Azure portal, select **All services**, search for Azure Database Migration Service, and then select **Azure Database Migration Services**.
+1. In the Azure portal menu, select **All services**. Search for and select **Azure Database Migration Services**.
![Locate all instances of Azure Database Migration Service](media/tutorial-sql-server-to-managed-instance-online/dms-search.png)
-2. On the **Azure Database Migration Service** screen, search for the name of the instance that you created, and then select the instance.
+2. On the **Azure Database Migration Services** screen, select the Azure Database Migration Service instance that you created.
-3. Select + **New Migration Project**.
+3. Select **New Migration Project**.
-4. On the **New migration project** screen, specify a name for the project, in the **Source server type** text box, select **SQL Server**, in the **Target server type** text box, select **Azure SQL Managed Instance**, and then for **Choose type of activity**, select **Online data migration**.
+ ![Locate your instance of Azure Database Migration Service](media/tutorial-sql-server-to-managed-instance-online/dms-create-project1.png)
- ![Create Azure Database Migration Service Project](media/tutorial-sql-server-to-managed-instance-online/dms-create-project3.png)
+4. On the **New migration project** screen, specify a name for the project. In the **Source server type** text box, select **SQL Server**; in the **Target server type** text box, select **Azure SQL Database Managed Instance**; and for **Choose type of activity**, select **Online data migration**.
-5. Select **Create and run activity** to create the project.
+ ![Create Database Migration Service Project](media/tutorial-sql-server-to-managed-instance-online/dms-create-project2.png)
+
+5. Select **Create and run activity** to create the project and run the migration activity.
## Specify source details
-1. On the **Migration source detail** screen, specify the connection details for the source SQL Server.
+1. On the **Select source** screen, specify the connection details for the source SQL Server instance.
+
+ Make sure to use a Fully Qualified Domain Name (FQDN) for the source SQL Server instance name. You can also use the IP Address for situations in which DNS name resolution isn't possible.
2. If you haven't installed a trusted certificate on your server, select the **Trust server certificate** check box.
After an instance of the service is created, locate it within the Azure portal,
> [!CAUTION] > TLS connections that are encrypted using a self-signed certificate does not provide strong security. They are susceptible to man-in-the-middle attacks. You should not rely on TLS using self-signed certificates in a production environment or on servers that are connected to the internet.
- ![Source Details](media/tutorial-sql-server-to-managed-instance-online/dms-source-details2.png)
+ ![Source Details](media/tutorial-sql-server-to-managed-instance-online/dms-source-details.png)
-3. Select **Save**.
-
-4. On the **Select source databases** screen, select the **Adventureworks2012** database for migration.
-
- ![Select Source Databases](media/tutorial-sql-server-to-managed-instance-online/dms-source-database1.png)
-
- > [!IMPORTANT]
- > If you use SQL Server Integration Services (SSIS), DMS does not currently support migrating the catalog database for your SSIS projects/packages (SSISDB) from SQL Server to SQL Managed Instance. However, you can provision SSIS in Azure Data Factory (ADF) and redeploy your SSIS projects/packages to the destination SSISDB hosted by SQL Managed Instance. For more information about migrating SSIS packages, see the article [Migrate SQL Server Integration Services packages to Azure](./how-to-migrate-ssis-packages.md).
-
-5. Select **Save**.
+3. Select **Next: Select target**.
## Specify target details
-1. On the **Migration target details** screen, specify the **Application ID** and **Key** that the DMS instance can use to connect to the target instance of SQL Managed Instance and the Azure Storage Account.
+1. On the **Select target** screen, specify the **Application ID** and **Key** that the DMS instance can use to connect to the target instance of SQL Managed Instance and the Azure Storage Account.
   For more information, see the article [Use portal to create an Azure Active Directory application and service principal that can access resources](../active-directory/develop/howto-create-service-principal-portal.md). (An Azure CLI sketch for creating such an application appears at the end of this section.)
-2. Select the **Subscription** containing the target instance of SQL Managed Instance, and then select the target instance.
+2. Select the **Subscription** containing the target instance of SQL Managed Instance, and then choose the target SQL Managed instance.
If you haven't already provisioned the SQL Managed Instance, select the [link](../azure-sql/managed-instance/instance-create-quickstart.md) to help you provision the instance. When the SQL Managed Instance is ready, return to this specific project to execute the migration. 3. Provide **SQL User** and **Password** to connect to the SQL Managed Instance.
- ![Select Target](media/tutorial-sql-server-to-managed-instance-online/dms-target-details3.png)
+ ![Select Target](media/tutorial-sql-server-to-managed-instance-online/dms-target-details.png)
+
+4. Select **Next: Select databases**.
-4. Select **Save**.
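
As a hedged alternative to the portal steps in the linked article, the following Azure CLI sketch creates an application registration with a client secret; the application name, role, and scope are placeholder assumptions. In the command output, `appId` corresponds to the Application ID and `password` to the Key used in step 1.

```bash
# Minimal sketch with placeholder values; grant the application only the access your scenario requires.
az ad sp create-for-rbac \
  --name "dms-sqlmi-migration" \
  --role Contributor \
  --scopes "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>"
```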
+## Specify source databases
-## Select source databases
+1. On the **Select databases** screen, select the source databases that you want to migrate.
-1. On the **Select source databases** screen, select the source database that you want to migrate.
+ ![Select Source Databases](media/tutorial-sql-server-to-managed-instance-online/dms-source-database.png)
- ![Select source databases](media/tutorial-sql-server-to-managed-instance-online/dms-select-source-databases2.png)
+> [!IMPORTANT]
+> If you use SQL Server Integration Services (SSIS), DMS does not currently support migrating the catalog database for your SSIS projects/packages (SSISDB) from SQL Server to SQL Managed Instance. However, you can provision SSIS in Azure Data Factory (ADF) and redeploy your SSIS projects/packages to the destination SSISDB hosted by SQL Managed Instance. For more information about migrating SSIS packages, see the article [Migrate SQL Server Integration Services packages to Azure](./how-to-migrate-ssis-packages.md).
-2. Select **Save**.
+2. Select **Next: Configure migration settings**.
## Configure migration settings
-1. On the **Configure migration settings** screen, provide the following detail:
+1. On the **Configure migration settings** screen, provide the following details:
| Parameter | Description | |--||
- |**SMB Network location share** | The local SMB network share or Azure file share that contains the Full database backup files and transaction log backup files that Azure Database Migration Service can use for migration. The service account running the source SQL Server instance must have read\write privileges on this network share. Provide an FQDN or IP addresses of the server in the network share, for example, '\\\servername.domainname.com\backupfolder' or '\\\IP address\backupfolder'. For improved performance, it's recommended to use separate folder for each database to be migrated. You can provide the database level file share path by using the **Advanced Settings** option. If you are running into issues connecting to the SMB share, see [SMB share](known-issues-azure-sql-db-managed-instance-online.md#smb-file-share-connectivity). |
+ |**SMB Network location share** | The local SMB network share or Azure file share that contains the full database backup files and transaction log backup files that Azure Database Migration Service can use for migration. The service account running the source SQL Server instance must have read\write privileges on this network share. Provide the FQDN or IP address of the server in the network share, for example, '\\\servername.domainname.com\backupfolder' or '\\\IP address\backupfolder'. For improved performance, it's recommended to use a separate folder for each database to be migrated. You can provide the database-level file share path by using the **Advanced Settings** option. If you are running into issues connecting to the SMB share, see [SMB share](known-issues-azure-sql-db-managed-instance-online.md#smb-file-share-connectivity). |
|**User name** | Make sure that the Windows user has full control privilege on the network share that you provided above. Azure Database Migration Service will impersonate the user credential to upload the backup files to Azure Storage container for restore operation. If using Azure File share, use the storage account name pre-pended with AZURE\ as the username. | |**Password** | Password for the user. If using Azure file share, use a storage account key as the password. | |**Subscription of the Azure Storage Account** | Select the subscription that contains the Azure Storage Account. | |**Azure Storage Account** | Select the Azure Storage Account that DMS can upload the backup files from the SMB network share to and use for database migration. We recommend selecting the Storage Account in the same region as the DMS service for optimal file upload performance. |
- ![Configure Migration Settings](media/tutorial-sql-server-to-managed-instance-online/dms-configure-migration-settings4.png)
+ ![Configure Migration Settings](media/tutorial-sql-server-to-managed-instance-online/dms-configure-migration-settings.png)
 > [!NOTE] > If Azure Database Migration Service shows error 'System Error 53' or 'System Error 57', the cause might result from an inability of Azure Database Migration Service to access Azure file share. If you encounter one of these errors, please grant access to the storage account from the virtual network using the instructions [here](../storage/common/storage-network-security.md?toc=%2fazure%2fvirtual-network%2ftoc.json#grant-access-from-a-virtual-network).
After an instance of the service is created, locate it within the Azure portal,
 > [!IMPORTANT] > If loopback check functionality is enabled and the source SQL Server and file share are on the same computer, then the source won't be able to access the file share using its FQDN. To fix this issue, disable loopback check functionality using the instructions [here](https://support.microsoft.com/help/926642/error-message-when-you-try-to-access-a-server-locally-by-using-its-fqd).
-2. Select **Save**.
+2. Select **Next: Summary**.
## Review the migration summary
-1. On the **Migration summary** screen, in the **Activity name** text box, specify a name for the migration activity.
+1. On the **Summary** screen, in the **Activity name** text box, specify a name for the migration activity.
2. Review and verify the details associated with the migration project.
- ![Migration project summary](media/tutorial-sql-server-to-managed-instance-online/dms-project-summary3.png)
+ ![Migration project summary](media/tutorial-sql-server-to-managed-instance-online/dms-project-summary.png)
## Run and monitor the migration
-1. Select **Run migration**.
+1. Select **Start migration**.
-2. On the migration activity screen, select **Refresh** to update the display.
+2. The migration activity window appears, displaying the current database migration status. Select **Refresh** to update the display.
- ![Migration activity in progress](media/tutorial-sql-server-to-managed-instance-online/dms-monitor-migration2.png)
+ ![Migration activity in progress](media/tutorial-sql-server-to-managed-instance-online/dms-monitor-migration.png)
You can further expand the databases and logins categories to monitor the migration status of the respective server objects.
- ![Migration activity status](media/tutorial-sql-server-to-managed-instance-online/dms-monitor-migration-extend2.png)
+ ![Migration activity status](media/tutorial-sql-server-to-managed-instance-online/dms-monitor-migration-extend.png)
## Performing migration cutover
After the full database backup is restored on the target instance of SQL Managed
![Cutover complete](media/tutorial-sql-server-to-managed-instance-online/dms-cutover-complete.png)
-## Next steps
+## Additional resources
* For a tutorial showing you how to migrate a database to SQL Managed Instance using the T-SQL RESTORE command, see [Restore a backup to SQL Managed Instance using the restore command](../azure-sql/managed-instance/restore-sample-database-quickstart.md). * For information about SQL Managed Instance, see [What is SQL Managed Instance](../azure-sql/managed-instance/sql-managed-instance-paas-overview.md).
dns Private Dns Autoregistration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dns/private-dns-autoregistration.md
To enable auto registration, select the checkbox for "Enable auto registration"
* DNS records are created automatically only for the primary virtual machine NIC. If your virtual machines have more than one NIC, you can manually create the DNS records for other network interfaces. * DNS records are created automatically only if the primary virtual machine NIC is using DHCP. If you're using static IPs, such a configuration with [multiple IP addresses in Azure](../virtual-network/virtual-network-multiple-ip-addresses-portal.md#os-config)), auto registration won't create records for that virtual machine. * Autoregistration for IPv6 (AAAA records) isn't supported.
+* Auto registration for a private DNS zone is limited to a single virtual network.
## Next steps
To enable auto registration, select the checkbox for "Enable auto registration"
* Read about some common [private zone scenarios](./private-dns-scenarios.md) that can be realized with private zones in Azure DNS.
-* For common questions and answers about private zones in Azure DNS, including specific behavior you can expect for certain kinds of operations, see [Private DNS FAQ](./dns-faq-private.yml).
+* For common questions and answers about private zones in Azure DNS, including specific behavior you can expect for certain kinds of operations, see [Private DNS FAQ](./dns-faq-private.yml).
event-grid Authenticate With Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/authenticate-with-active-directory.md
Regardless of the security principal used, a managed identity or an application
If you're using the Event Grid SDK, you don't need to worry about the details on how to implement the acquisition of access tokens and how to include it with every request to Event Grid because the [Event Grid data plane SDKs](#publish-events-using-event-grids-client-sdks) do that for you.
-### high-level steps
-Perform the following steps to ready your client to use Azure AD authentication when sending events to a topic, domain, or partner namespace.
+### Client configuration steps to use Azure AD authentication
+Perform the following steps to configure your client to use Azure AD authentication when sending events to a topic, domain, or partner namespace.
1. Create or use a security principal you want to use to authenticate. You can use a [managed identity](#authenticate-using-a-managed-identity) or an [application security principal](#authenticate-using-a-security-principal-of-a-client-application). 2. [Grant permission to a security principal to publish events](#assign-permission-to-a-security-principal-to-publish-events) by assigning the **EventGrid Data Sender** role to the security principal.
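
As a hedged illustration of step 2, the following Azure CLI sketch assigns the **EventGrid Data Sender** role to a security principal; the object ID and the topic used as the scope are placeholders, and the same pattern applies to a domain or partner namespace.

```bash
# Minimal sketch with placeholder values; scope the assignment to the resource you publish events to.
az role assignment create \
  --assignee <security-principal-object-id> \
  --role "EventGrid Data Sender" \
  --scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>/providers/Microsoft.EventGrid/topics/<topicName>"
```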
expressroute Expressroute Erdirect About https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/expressroute/expressroute-erdirect-about.md
Once enrolled, verify that **Microsoft.Network** resource provider is registered
1. In your subscription, for **Resource Providers**, verify **Microsoft.Network** provider shows a **Registered** status. If the Microsoft.Network resource provider isn't present in the list of registered providers, add it.
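If you prefer to check from the command line, the following sketch shows the equivalent verification and registration with the Azure CLI:

```bash
# Check whether the Microsoft.Network resource provider is registered in the subscription
az provider show --namespace Microsoft.Network --query registrationState --output tsv

# Register the provider if the state isn't "Registered"
az provider register --namespace Microsoft.Network
```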
-If you begin to use ExpressRoute Direct and notice that there are no available ports in your chosen peering location, email ExpressRouteDirect@microsoft.com to request more inventory.
+If you begin to use ExpressRoute Direct and notice that there are no available ports in your chosen peering location, open a support request to ask for more inventory.
## ExpressRoute using a service provider and ExpressRoute Direct
expressroute Expressroute Locations Providers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/expressroute/expressroute-locations-providers.md
Previously updated : 08/12/2021 Last updated : 08/26/2021 # ExpressRoute partners and peering locations
The following table shows connectivity locations and the service providers for e
| **Dallas** | [Equinix DA3](https://www.equinix.com/locations/americas-colocation/united-states-colocation/dallas-data-centers/da3/) | 1 | n/a | 10G, 100G | Aryaka Networks, AT&T NetBond, Cologix, Equinix, Internet2, Level 3 Communications, Megaport, Neutrona Networks, PacketFabric, Telmex Uninet, Telia Carrier, Transtelco, Verizon, Zayo| | **Denver** | [CoreSite DE1](https://www.coresite.com/data-centers/locations/denver/de1) | 1 | West Central US | 10G, 100G | CoreSite, Megaport, PacketFabric, Zayo | | **Dubai** | [PCCS](https://www.pacificcontrols.net/cloudservices/https://docsupdatetracker.net/index.html) | 3 | UAE North | n/a | Etisalat UAE |
-| **Dubai2** | [du datamena](http://datamena.com/solutions/data-centre) | 3 | UAE North | n/a | DE-CIX, du datamena, Equinix, Megaport, Orange, Orixcom |
+| **Dubai2** | [du datamena](http://datamena.com/solutions/data-centre) | 3 | UAE North | n/a | DE-CIX, du datamena, Equinix, GBI, Megaport, Orange, Orixcom |
| **Dublin** | [Equinix DB3](https://www.equinix.com/locations/europe-colocation/ireland-colocation/dublin-data-centers/db3/) | 1 | North Europe | 10G, 100G | CenturyLink Cloud Connect, Colt, eir, Equinix, GEANT, euNetworks, Interxion, Megaport | | **Dublin2** | [Interxion DUB2](https://www.interxion.com/locations/europe/dublin) | 1 | North Europe | 10G, 100G | |
-| **Frankfurt** | [Interxion FRA11](https://www.interxion.com/Locations/frankfurt/) | 1 | Germany West Central | 10G, 100G | AT&T NetBond, British Telecom, CenturyLink Cloud Connect, Colt, DE-CIX, Equinix, euNetworks, GEANT, InterCloud, Interxion, Megaport, NTT Global DataCenters EMEA, Orange, Telia Carrier, T-Systems |
+| **Frankfurt** | [Interxion FRA11](https://www.interxion.com/Locations/frankfurt/) | 1 | Germany West Central | 10G, 100G | AT&T NetBond, British Telecom, CenturyLink Cloud Connect, Colt, DE-CIX, Equinix, euNetworks, GBI, GEANT, InterCloud, Interxion, Megaport, NTT Global DataCenters EMEA, Orange, Telia Carrier, T-Systems |
| **Frankfurt2** | [Equinix FR7](https://www.equinix.com/locations/europe-colocation/germany-colocation/frankfurt-data-centers/fr7/) | 1 | Germany West Central | 10G, 100G | Deutsche Telekom AG, Equinix | | **Geneva** | [Equinix GV2](https://www.equinix.com/locations/europe-colocation/switzerland-colocation/geneva-data-centers/gv2/) | 1 | Switzerland West | 10G, 100G | Colt, Equinix, Megaport, Swisscom | | **Hong Kong** | [Equinix HK1](https://www.equinix.com/data-centers/asia-pacific-colocation/hong-kong-colocation/hong-kong-data-centers/hk1) | 2 | East Asia | 10G | Aryaka Networks, British Telecom, CenturyLink Cloud Connect, Chief Telecom, China Telecom Global, China Unicom, Colt, Equinix, InterCloud, Megaport, NTT Communications, Orange, PCCW Global Limited, Tata Communications, Telia Carrier, Verizon |
expressroute Expressroute Locations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/expressroute/expressroute-locations.md
Previously updated : 08/12/2021 Last updated : 08/26/2021
The following table shows locations by service provider. If you want to view ava
| **[FarEasTone](https://www.fetnet.net/corporate/en/Enterprise.html)** |Supported |Supported |Taipei|
| **[Fastweb](https://www.fastweb.it/grandi-aziende/cloud/scheda-prodotto/fastcloud-interconnect/)** | Supported |Supported |Milan|
| **[Fibrenoire](https://fibrenoire.ca/en/services/cloudextn-2/)** |Supported |Supported |Montreal|
+| **[GBI](https://www.gbiinc.com/microsoft-azure/)** |Supported |Supported |Dubai2, Frankfurt|
| **[GÉANT](https://www.geant.org/Networks)** |Supported |Supported |Amsterdam, Amsterdam2, Dublin, Frankfurt, Marseille |
| **[GlobalConnect](https://www.globalconnect.no/tjenester/nettverk/cloud-access)** | Supported |Supported | Oslo, Stavanger |
| **GTT** |Supported |Supported |London2 |
firewall Quick Create Ipgroup Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/firewall/quick-create-ipgroup-template.md
In the Azure portal, review the deployed resources, especially the firewall rule
:::image type="content" source="media/quick-create-ipgroup-template/network-rule.png" alt-text="Network rules.":::
-To learn about the JSON syntax and properties for a firewall in a template, see [Microsoft.Network azureFirewalls template reference](/azure/templates/Microsoft.Network/2019-11-01/azureFirewalls).
+To learn about the JSON syntax and properties for a firewall in a template, see [Microsoft.Network azureFirewalls template reference](/azure/templates/microsoft.network/azurefirewalls).
## Clean up resources
firewall Tutorial Firewall Deploy Portal Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/firewall/tutorial-firewall-deploy-portal-policy.md
Previously updated : 05/03/2021 Last updated : 08/26/2021 #Customer intent: As an administrator new to this service, I want to control outbound network access from resources located in an Azure subnet.
You can keep your firewall resources for the next tutorial, or if no longer need
## Next steps > [!div class="nextstepaction"]
-> [Tutorial: Monitor Azure Firewall logs](./firewall-diagnostics.md)
+> [Deploy and configure Azure Firewall Premium](premium-deploy.md)
firewall Tutorial Firewall Dnat Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/firewall/tutorial-firewall-dnat-policy.md
Previously updated : 04/29/2021 Last updated : 08/26/2021 #Customer intent: As an administrator, I want to deploy and configure Azure Firewall policy DNAT so that I can control inbound Internet access to resources located in a subnet.
You can keep your firewall resources for the next tutorial, or if no longer need
## Next steps
-Next, you can monitor the Azure Firewall logs.
- > [!div class="nextstepaction"]
-> [Tutorial: Monitor Azure Firewall logs](./firewall-diagnostics.md)
+> [Deploy and configure Azure Firewall Premium](premium-deploy.md)
firewall Tutorial Hybrid Portal Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/firewall/tutorial-hybrid-portal-policy.md
Previously updated : 04/27/2021 Last updated : 08/26/2021 customer intent: As an administrator, I want to control network access from an on-premises network to an Azure virtual network.
You can keep your firewall resources for the next tutorial, or if no longer need
## Next steps
-Next, you can monitor the Azure Firewall logs.
- > [!div class="nextstepaction"]
-> [Tutorial: Monitor Azure Firewall logs](./firewall-diagnostics.md)
+> [Deploy and configure Azure Firewall Premium](premium-deploy.md)
frontdoor Quickstart Create Front Door https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/quickstart-create-front-door.md
Finally, add a routing rule. A routing rule maps your frontend host to the backe
:::image type="content" source="media/quickstart-create-front-door/front-door-add-a-rule.png" alt-text="Add a rule to your Front Door"::: >[!WARNING]
- > You **must** ensure that each of the frontend hosts in your Front Door has a routing rule with a default path (`\*`) associated with it. That is, across all of your routing rules there must be at least one routing rule for each of your frontend hosts defined at the default path (`\*`). Failing to do so may result in your end-user traffic not getting routed correctly.
+ > You **must** ensure that each of the frontend hosts in your Front Door has a routing rule with a default path (`/*`) associated with it. That is, across all of your routing rules there must be at least one routing rule for each of your frontend hosts defined at the default path (`/*`). Failing to do so may result in your end-user traffic not getting routed correctly.
1. Select **Review + Create**, and then **Create**.
frontdoor How To Configure Https Custom Domain https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/how-to-configure-https-custom-domain.md
Previously updated : 06/10/2021 Last updated : 08/26/2021 #Customer intent: As a website owner, I want to add a custom domain to my Front Door configuration so that my users can use my custom domain to access my content.
Register the service principal for Azure Front Door as an app in your Azure Acti
1. In PowerShell, run the following command:
- `New-AzADServicePrincipal -ApplicationId "ad0e1c7e-6d38-4ba4-9efd-0bc77ba9f037"`
+ `New-AzADServicePrincipal -ApplicationId "205478c0-bd83-4e1b-a9d6-db63a3e1e1c8"`
#### Grant Azure Front Door access to your key vault
Grant Azure Front Door permission to access the certificates in your Azure Key
1. In your key vault account, under SETTINGS, select **Access policies**. Then select **Add new** to create a new policy.
-1. In **Select principal**, search for **ad0e1c7e-6d38-4ba4-9efd-0bc77ba9f037**, and choose ** Microsoft.AzureFrontDoor-Cdn**. Click **Select**.
+1. In **Select principal**, search for **205478c0-bd83-4e1b-a9d6-db63a3e1e1c8**, and choose **Microsoft.AzureFrontDoor-Cdn**. Click **Select**.
1. In **Secret permissions**, select **Get** to allow Front Door to retrieve the certificate.
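The registration and access policy can also be scripted. A minimal sketch with the Azure CLI, assuming a key vault named `myKeyVault` (the application ID is the Azure Front Door value shown above):

```bash
# Register the Azure Front Door service principal in your Azure AD tenant
az ad sp create --id 205478c0-bd83-4e1b-a9d6-db63a3e1e1c8

# Allow Front Door to retrieve certificates stored as secrets in the key vault
az keyvault set-policy \
  --name myKeyVault \
  --spn 205478c0-bd83-4e1b-a9d6-db63a3e1e1c8 \
  --secret-permissions get
```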
frontdoor How To Logs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/how-to-logs.md
Previously updated : 03/15/2021 Last updated : 08/26/2021
Azure Front Door currently provides individual API requests with each entry havi
| UserAgent | The browser type that the client used. |
| ClientIp | The IP address of the client that made the original request. If there was an X-Forwarded-For header in the request, then the client IP is taken from that header. |
| SocketIp | The IP address of the direct connection to AFD edge. If the client used an HTTP proxy or a load balancer to send the request, the value of SocketIp is the IP address of the proxy or load balancer. |
-| Latency | The length of time from the time AFD edge server receives a client's request to the time that AFD sends the last byte of response to client, in milliseconds. This field doesn't take into account network latency and TCP buffering. |
+| timeTaken | The length of time from the time AFD edge server receives a client's request to the time that AFD sends the last byte of response to client, in milliseconds. This field doesn't take into account network latency and TCP buffering. |
| RequestProtocol | The protocol that the client specified in the request: HTTP, HTTPS. |
| SecurityProtocol | The TLS/SSL protocol version used by the request or null if no encryption. Possible values include: SSLv3, TLSv1, TLSv1.1, TLSv1.2 |
| SecurityCipher | When the value for Request Protocol is HTTPS, this field indicates the TLS/SSL cipher negotiated by the client and AFD for encryption. |
governance Policy For Kubernetes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/concepts/policy-for-kubernetes.md
The add-on checks in with Azure Policy service for changes in policy assignments
During this refresh cycle, the add-on checks for changes. These changes trigger creates, updates, or deletes of the constraint templates and constraints.
-In a Kubernetes cluster, if a namespace has either of the following labels, the admission requests
+In a Kubernetes cluster, if a namespace has the cluster-appropriate label, the admission requests
with violations aren't denied. Compliance assessment results are still available.
-- `control-plane`
-- `admission.policy.azure.com/ignore`
+- Azure Arc-enabled Kubernetes cluster: `admission.policy.azure.com/ignore`
+- Azure Kubernetes Service cluster: `control-plane`
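For example, on an Azure Arc-enabled cluster you could exclude a namespace by applying the label with kubectl; the namespace name and label value here are illustrative only:

```bash
# Label a namespace so admission requests with violations aren't denied (compliance results still appear)
kubectl label namespace my-excluded-namespace admission.policy.azure.com/ignore=true
```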
> [!NOTE] > While a cluster admin may have permission to create and update constraint templates and
governance Export Resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/how-to/export-resources.md
the supported SDKs.
## Export with Azure portal
+> [!NOTE]
+> Exporting Azure Policy resources from the Azure portal isn't available for Azure sovereign clouds.
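Where the portal export isn't available, one workaround is to retrieve a definition's JSON with the Azure CLI and save it yourself. A minimal sketch, with a placeholder definition name:

```bash
# Write a policy definition's JSON to a local file
az policy definition show --name "<definition-name>" > my-policy-definition.json
```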
+ To export a policy definition from Azure portal, follow these steps: 1. Launch the Azure Policy service in the Azure portal by clicking **All services**, then searching
iot-edge How To Continuous Integration Continuous Deployment Classic https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/how-to-continuous-integration-continuous-deployment-classic.md
description: Set up continuous integration and continuous deployment using the c
Previously updated : 08/26/2020 Last updated : 08/26/2021
[!INCLUDE [iot-edge-version-all-supported](../../includes/iot-edge-version-all-supported.md)]
-You can easily adopt DevOps with your Azure IoT Edge applications with the built-in Azure IoT Edge tasks in Azure Pipelines. This article demonstrates how you can use the continuous integration and continuous deployment features of Azure Pipelines to build, test, and deploy applications quickly and efficiently to your Azure IoT Edge using the classic editor. Alternatively, you can [use YAML](how-to-continuous-integration-continuous-deployment.md).
+Azure Pipelines includes a built-in Azure IoT Edge task that helps you adopt DevOps with your Azure IoT Edge applications. This article demonstrates how to use the continuous integration and continuous deployment features of Azure Pipelines to build, test, and deploy applications quickly and efficiently to your Azure IoT Edge using the classic editor. Alternatively, you can [use YAML](how-to-continuous-integration-continuous-deployment.md).
![Diagram - CI and CD branches for development and production](./media/how-to-continuous-integration-continuous-deployment-classic/model.png)
In this section, you create a new build pipeline. You configure the pipeline to
In your pipeline description, choose the correct agent specification based on your target platform:
- * If you would like to build your modules in platform amd64 for Linux containers, choose **ubuntu-16.04**
+ * If you would like to build your modules in platform amd64 for Linux containers, choose **ubuntu-18.04**
* If you would like to build your modules in platform amd64 for Windows 1809 containers, you need to [set up self-hosted agent on Windows](/azure/devops/pipelines/agents/v2-windows). * If you would like to build your modules in platform arm32v7 or arm64 for Linux containers, you need to [set up self-hosted agent on Linux](https://devblogs.microsoft.com/iotdev/setup-azure-iot-edge-ci-cd-pipeline-with-arm-agent).
- ![Configure build agent specification](./media/how-to-continuous-integration-continuous-deployment-classic/configure-env.png)
+ :::image type="content" source="./media/how-to-continuous-integration-continuous-deployment-classic/configure-env.png" alt-text="Configure build agent specification.":::
6. Your pipeline comes preconfigured with a job called **Agent job 1**. Select the plus sign (**+**) to add four tasks to the job: **Azure IoT Edge** twice, **Copy Files** once, and **Publish Build Artifacts** once. Search for each task and hover over the task's name to see the **Add** button.
- ![Add Azure IoT Edge task](./media/how-to-continuous-integration-continuous-deployment-classic/add-iot-edge-task.png)
+ :::image type="content" source="./media/how-to-continuous-integration-continuous-deployment-classic/add-iot-edge-task.png" alt-text="Add Azure IoT Edge task.":::
When all four tasks are added, your Agent job looks like the following example:
- ![Four tasks in the build pipeline](./media/how-to-continuous-integration-continuous-deployment-classic/add-tasks.png)
+ :::image type="content" source="./media/how-to-continuous-integration-continuous-deployment-classic/add-tasks.png" alt-text="Four tasks in the build pipeline.":::
7. Select the first **Azure IoT Edge** task to edit it. This task builds all modules in the solution with the target platform that you specify. Edit the task with the following values:
In this section, you create a new build pipeline. You configure the pipeline to
For more information about this task and its parameters, see [Azure IoT Edge task](/azure/devops/pipelines/tasks/build/azure-iot-edge).
- These configurations use the image repository and tag that are defined in the `module.json` file to name and tag the module image. **Build module images** also helps replace the variables with the exact value you define in the `module.json` file. In Visual Studio or Visual Studio Code, you are specifying the actual value in a `.env` file. In Azure Pipelines, you set the value on the **Pipeline Variables** tab. Select the **Variables** tab on the pipeline editor menu and configure the name and value as following:
+ These configurations use the image repository and tag that are defined in the `module.json` file to name and tag the module image. **Build module images** also helps replace the variables with the exact value you define in the `module.json` file. In Visual Studio or Visual Studio Code, you specify the actual value in a `.env` file. In Azure Pipelines, you set the value on the **Pipeline Variables** tab. Select the **Variables** tab on the pipeline editor menu and configure the name and value as follows:
- * **ACR_ADDRESS**: Your Azure Container Registry **Login server** value. You can retrieve the login server value from the overview page of your container registry in the Azure portal.
+ * **ACR_ADDRESS**: Your Azure Container Registry **Login server** value. You can find the login server value on the container registry's overview page in the Azure portal.
If you have other variables in your project, you can specify the name and value on this tab. **Build module images** recognizes only variables that are in `${VARIABLE}` format. Make sure you use this format in your `**/module.json` files.
This pipeline is now configured to run automatically when you push new code to y
* IoT Edge DevOps sample in [Azure DevOps Starter for IoT Edge](how-to-devops-starter.md) * Understand the IoT Edge deployment in [Understand IoT Edge deployments for single devices or at scale](module-deployment-monitoring.md)
-* Walk through the steps to create, update, or delete a deployment in [Deploy and monitor IoT Edge modules at scale](how-to-deploy-at-scale.md).
+* Walk through the steps to create, update, or delete a deployment in [Deploy and monitor IoT Edge modules at scale](how-to-deploy-at-scale.md).
iot-edge How To Visual Studio Develop Module https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/how-to-visual-studio-develop-module.md
Previously updated : 07/19/2021 Last updated : 08/24/2021
The IoT Edge project template in Visual Studio creates a solution that can be de
1. On the **Create a new project** page, search for **Azure IoT Edge**. Select the project that matches the platform and architecture for your IoT Edge device, and click **Next**.
- ![Create New Project](./media/how-to-visual-studio-develop-csharp-module/create-new.png)
+ :::image type="content" source="./media/how-to-visual-studio-develop-module/create-new-project.png" alt-text="Create New Project":::
1. On the **Configure your new project** page, enter a name for your project and specify the location, then select **Create**.
The project folder contains a list of all the modules included in that project.
The project folder also contains a file named `deployment.template.json`. This file is a template of an IoT Edge deployment manifest, which defines all the modules that will run on a device along with how they will communicate with each other. For more information about deployment manifests, see [Learn how to deploy modules and establish routes](module-composition.md). If you open this deployment template, you see that the two runtime modules, **edgeAgent** and **edgeHub** are included, along with the custom module that you created in this Visual Studio project. A fourth module named **SimulatedTemperatureSensor** is also included. This default module generates simulated data that you can use to test your modules, or delete if it's not necessary. To see how the simulated temperature sensor works, view the [SimulatedTemperatureSensor.csproj source code](https://github.com/Azure/iotedge/tree/master/edge-modules/SimulatedTemperatureSensor).
+### Set IoT Edge runtime version
+
+The IoT Edge extension defaults to the latest stable version of the IoT Edge runtime when it creates your deployment assets. Currently, the latest stable version is version 1.2. If you're developing modules for devices running the 1.1 long-term support version or the earlier 1.0 version, update the IoT Edge runtime version in Visual Studio to match.
+
+1. In the Solution Explorer, right-click the name of your project and select **Set IoT Edge runtime version**.
+
+ :::image type="content" source="./media/how-to-visual-studio-develop-module/set-iot-edge-runtime-version.png" alt-text="Right-click your project name and select set IoT Edge runtime version.":::
+
+1. Use the drop-down menu to choose the runtime version that your IoT Edge devices are running, then select **OK** to save your changes.
+
+1. Re-generate your deployment manifest with the new runtime version. Right-click the name of your project and select **Generate deployment for IoT Edge**.
+ ## Develop your module When you add a new module, it comes with default code that is ready to be built and deployed to a device so that you can start testing without touching any code. The module code is located within the module folder in a file named `Program.cs` (for C#) or `main.c` (for C).
iot-edge How To Vs Code Develop Module https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/how-to-vs-code-develop-module.md
keywords:
Previously updated : 08/11/2021 Last updated : 08/24/2021
There are four items within the solution:
To see how the simulated temperature module works, view the [SimulatedTemperatureSensor.csproj source code](https://github.com/Azure/iotedge/tree/master/edge-modules/SimulatedTemperatureSensor).
+### Set IoT Edge runtime version
+
+The IoT Edge extension defaults to the latest stable version of the IoT Edge runtime when it creates your deployment assets. Currently, the latest stable version is version 1.2. If you're developing modules for devices running the 1.1 long-term support version or the earlier 1.0 version, update the IoT Edge runtime version in Visual Studio Code to match.
+
+1. Select **View** > **Command Palette**.
+
+1. In the command palette, enter and run the command **Azure IoT Edge: Set default IoT Edge runtime version**.
+
+1. Choose the runtime version that your IoT Edge devices are running from the list.
+
+After selecting a new runtime version, your deployment manifest is dynamically updated to reflect the change to the runtime module images.
+ ## Add additional modules To add additional modules to your solution, run the command **Azure IoT Edge: Add IoT Edge Module** from the command palette. You can also right-click the **modules** folder or the `deployment.template.json` file in the Visual Studio Code Explorer view and then select **Add IoT Edge Module**.
iot-edge Tutorial Deploy Machine Learning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/tutorial-deploy-machine-learning.md
+ # Tutorial: Deploy Azure Machine Learning as an IoT Edge module (preview)
iot-edge Tutorial Develop For Linux https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/tutorial-develop-for-linux.md
The following table lists the supported development scenarios for **Linux contai
>Support for Linux ARM64 devices is available in [public preview](https://azure.microsoft.com/support/legal/preview-supplemental-terms/). For more information, see [Develop and debug ARM64 IoT Edge modules in Visual Studio Code (preview)](https://devblogs.microsoft.com/iotdev/develop-and-debug-arm64-iot-edge-modules-in-visual-studio-code-preview). This tutorial teaches the development steps for Visual Studio Code. If you would rather use Visual Studio, refer to the instructions in [Use Visual Studio 2019 to develop and debug modules for Azure IoT Edge](how-to-visual-studio-develop-module.md).
+
## Install container engine

IoT Edge modules are packaged as containers, so you need a container engine on your development machine to build and manage them. We recommend Docker Desktop for development because of its feature support and popularity. Docker Desktop on Windows lets you switch between Linux containers and Windows containers so that you can easily develop modules for different types of IoT Edge devices.
Once your new solution loads in the Visual Studio Code window, take a moment to
* In the registry credentials section, the address is autofilled from the information you provided when you created the solution. However, the username and password reference the variables stored in the .env file. This configuration is for security, as the .env file is git ignored, but the deployment template is not. * In the SampleModule section, the container image isn't filled in even though you provided the image repository when you created the solution. This placeholder points to the **module.json** file inside the SampleModule folder. If you go to that file, you'll see that the image field does contain the repository, but also a tag value that is made up of the version and the platform of the container. You can iterate the version manually as part of your development cycle, and you select the container platform using a switcher that we introduce later in this section.
+### Set IoT Edge runtime version
+
+The IoT Edge extension defaults to the latest stable version of the IoT Edge runtime when it creates your deployment assets. Currently, the latest stable version is version 1.2. If you're developing modules for devices running the 1.1 long-term support version or the earlier 1.0 version, update the IoT Edge runtime version in Visual Studio Code to match.
+
+1. Select **View** > **Command Palette**.
+
+1. In the command palette, enter and run the command **Azure IoT Edge: Set default IoT Edge runtime version**.
+
+1. Choose the runtime version that your IoT Edge devices are running from the list.
+
+After selecting a new runtime version, your deployment manifest is dynamically updated to reflect the change to the runtime module images.
+ ### Provide your registry credentials to the IoT Edge agent The environment file stores the credentials for your container registry and shares them with the IoT Edge runtime. The runtime needs these credentials to pull your container images onto the IoT Edge device.
iot-edge Tutorial Develop For Windows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-edge/tutorial-develop-for-windows.md
Once your new project loads in the Visual Studio window, take a moment to famili
* The **program.cs** file contains the default C# module code that comes with the project template. The default module takes input from a source and passes it along to IoT Hub. * The **module.json** file hold details about the module, including the full image repository, image version, and which Dockerfile to use for each supported platform.
+### Set IoT Edge runtime version
+
+The IoT Edge extension defaults to the latest stable version of the IoT Edge runtime when it creates your deployment assets. Currently, the latest stable version is version 1.2.
+
+Windows containers are only supported in the 1.1 long-term support version or the earlier 1.0 version. To develop modules for devices using Windows containers, update the IoT Edge runtime version in Visual Studio to match the IoT Edge version on those devices.
+
+1. In the Solution Explorer, right-click the name of your project and select **Set IoT Edge runtime version**.
+
+ :::image type="content" source="./media/how-to-visual-studio-develop-module/set-iot-edge-runtime-version.png" alt-text="Right-click your project name and select set IoT Edge runtime version.":::
+
+1. Use the drop-down menu to choose the runtime version that your IoT Edge devices are running, then select **OK** to save your changes.
+
+1. Re-generate your deployment manifest with the new runtime version. Right-click the name of your project and select **Generate deployment for IoT Edge**.
+ ### Provide your registry credentials to the IoT Edge agent The IoT Edge runtime needs your registry credentials to pull your container images onto the IoT Edge device. The IoT Edge extension tries to pull your container registry information from Azure and populate it in the deployment template.
iot-hub Tutorial Routing Config Message Routing RM Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/iot-hub/tutorial-routing-config-message-routing-RM-template.md
Previously updated : 03/25/2019 Last updated : 08/24/2021 #Customer intent: As a developer, I want to be able to route messages sent to my IoT hub to different destinations based on properties stored in the message. This step of the tutorial needs to show me how to set up my resources using an Azure Resource Manager template.
Most of these parameters have default values. The ones ending with **_in** are c
**subscriptionId**: This field is set for you to the subscription into which you are deploying the template. This field is not in the parameters file since it is set for you.
-**IoTHubName_in**: This field is the base IoT Hub name, which is concatenated with the randomValue to be globally unique.
+**IoTHubName_in**: This field is the base IoT Hub name, which is concatenated with the randomValue so it is globally unique.
**location**: This field is the Azure region into which you are deploying, such as "westus".
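To see how these parameters are supplied at deployment time, here's a hedged sketch of a resource group deployment with the Azure CLI; the file name and values are placeholders rather than the tutorial's exact ones:

```bash
# Deploy the Resource Manager template, overriding the base IoT hub name and region
az deployment group create \
  --resource-group ContosoResources \
  --template-file template_iothub.json \
  --parameters IoTHubName_in=ContosoTestHub location=westus
```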
lab-services Class Type Rstudio Linux https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lab-services/class-type-rstudio-linux.md
+
+ Title: Set up a lab with R and RStudio on Linux using Azure Lab Services
+description: Learn how to set up labs to teach R using RStudio on Linux
++ Last updated : 08/25/2021+++
+# Set up a lab to teach R
+
+[R](https://www.r-project.org/about.html) is an open-source language used for statistical computing and graphics. It's used in fields ranging from the statistical analysis of genetics to natural language processing to financial data. R provides an [interactive command line](https://cran.r-project.org/doc/manuals/r-release/R-intro.html#Invoking-R-from-the-command-line) experience. [RStudio](https://www.rstudio.com/products/rstudio/) is an interactive development environment (IDE) available for the R language. The free version provides code editing tools, an integrated debugging experience, and package development tools.
+
+This article focuses solely on RStudio and R as a building block for a class that requires the use of statistical computing. The [deep learning](class-type-deep-learning-natural-language-processing.md) and [Python and Jupyter Notebooks](class-type-jupyter-notebook.md)
+class types set up RStudio differently. Each article describes how to use the [Data Science Virtual Machine for Linux (Ubuntu)](https://azuremarketplace.microsoft.com/en-US/marketplace/apps/microsoft-dsvm.ubuntu-1804) marketplace image, which has many [data science related tools](/azure/machine-learning/data-science-virtual-machine/tools-included), including RStudio, pre-installed.
+
+## Lab Account configuration
+
+To set up this lab, you need an Azure subscription and lab account to get started. If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/) before you begin. Once you get an Azure subscription, you can create a new lab account in Azure Lab Services. For more information about creating a new lab account, see the tutorial on [how to setup a lab account](./tutorial-setup-lab-account.md). You can also use an existing lab account.
+
+### Lab account settings
+
+Enable your lab account settings as described in the following table. For more information about how to enable Azure Marketplace images, see [Specify the Azure Marketplace images available to lab creators](./specify-marketplace-images.md).
+
+| Lab account setting | Instructions |
+| -- | -- |
+| Marketplace images | Ubuntu Server 18.04 LTS |
+| [Enable peer virtual network](how-to-connect-peer-virtual-network.md) | Enable if:<ul><li>Class requires a shared R Server.</li><li>Class requires large data files that you want to store externally and not on the student VM.</li></ul> |
+
+> [!IMPORTANT]
+> If you choose to enable peer virtual network, this must be done before the lab is created.
+
+## Lab configuration
+
+For instructions to create a new lab and apply needed settings, see [Tutorial: Set up a classroom lab](tutorial-setup-classroom-lab.md). When creating the lab, apply the following settings:
+
+| Lab setting | Value and description |
+| | |
+| Virtual Machine Size | Small GPU (Compute)|
+| VM image | Ubuntu Server 18.04 LTS |
+| Enable remote desktop connection | This setting should be enabled if you choose to use RDP. This setting isn't needed if you choose [X2Go to connect to lab machines](how-to-use-remote-desktop-linux-student.md). You'll need to connect to the Linux VM using SSH the first time and install the RDP/X2Go and GUI packages. For more information, see [enable graphical remote desktop for Linux VMs](how-to-enable-remote-desktop-linux.md). |
+
+## External resource configuration
+
+Some classes require files, such as large data files, to be stored externally. See [use external file storage in Azure Lab Services](how-to-attach-external-storage.md) for options and setup instructions.
+
+If you choose to have a shared R Server for the students, the server should be set up before the lab is created. For more information on how to set up a shared server, see [how to create a lab with a shared resource in Azure Lab Services](how-to-create-a-lab-with-shared-resource.md). For instructions to create an RStudio Server, see [Download RStudio Server for Debian & Ubuntu](https://www.rstudio.com/products/rstudio/download-server/debian-ubuntu/) and [Accessing RStudio Server Open-Source](https://support.rstudio.com/hc/en-us/articles/200552306-Getting-Started).
+
+## Template configuration
+
+After the template machine is created, start the machine, and connect to it to [install R](https://docs.rstudio.com/resources/install-r/), [RStudio Desktop](https://www.rstudio.com/products/rstudio/download/) and optionally [X2Go Server](https://wiki.x2go.org/doku.php/doc:installation:x2goserver).
+
+First, let's update apt and upgrade existing packages on the machine.
+
+```bash
+sudo apt update
+sudo apt upgrade
+```
+
+### Install X2Go Server
+
+If you choose to use X2Go, [install the server](https://github.com/Azure/azure-devtestlab/tree/master/samples/ClassroomLabs/Scripts/X2GoRemoteDesktop#install-x2go-server). You'll first need to [connect using ssh](how-to-use-remote-desktop-linux-student.md#connect-to-the-student-vm-using-ssh) to install the server component. Once that is done, the rest of the setup can be completed after [connecting using the X2Go client](how-to-use-remote-desktop-linux-student.md).
+
+The default installation of X2Go isn't compatible with RStudio. To work around this issue, update the x2goagent options file.
+
+1. Edit the `/etc/x2go/x2goagent.options` file. Don't forget to edit the file as sudo.
+ 1. Uncomment the line that states: `X2GO_NXAGENT_DEFAULT_OPTIONS+=" -extension GLX"`
+ 1. Comment the line that states: `X2GO_NXAGENT_DEFAULT_OPTIONS+=" +extension GLX"`
+2. Restart the X2Go server so the new options are used.
+
+ ```bash
+ sudo systemctl restart x2goserver
+ ```
+
+Alternatively, you can build the required libraries by following instructions at [Glx Xlib workaround for X2Go](https://wiki.x2go.org/doku.php/wiki:development:glx-xlib-workaround).
+
+### Install R
+
+There are a few ways to install R on the VM. We'll install R from the Comprehensive R Archive Network (CRAN) repository. It provides the most up-to-date versions of R. Once this repository is added to our machine, we can install R and many other related packages.
+
+We need to add the CRAN repository. Commands are modified from instructions available at [Ubuntu Packages for R brief instructions](https://cran.rstudio.com/bin/linux/ubuntu/).
+
+```bash
+#download helper packages
+sudo apt install --no-install-recommends software-properties-common dirmngr
+# download and add the signing key (by Michael Rutter) for these repos
+sudo wget -q "https://cloud.r-project.org/bin/linux/ubuntu/marutter_pubkey.asc" -O /etc/apt/trusted.gpg.d/cran_ubuntu_key.asc
+#add repository
+sudo add-apt-repository "deb https://cloud.r-project.org/bin/linux/ubuntu bionic-cran40/"
+```
+
+Now we can install R by running the following command:
+
+```bash
+sudo apt install r-base
+```
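A quick way to confirm the installation before moving on is to check the version that was installed:

```bash
# Confirm that R is installed and on the PATH (exact version output will vary)
R --version
```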
+
+### Install RStudio
+
+Now that we have R installed locally, we can install the RStudio IDE. We'll install the free version of RStudio Desktop. For all available versions, see [RStudio downloads](https://www.rstudio.com/products/rstudio/download/).
+
+1. [Import the code signing key](https://www.rstudio.com/code-signing/) for RStudio.
+
+ ```bash
+ sudo gpg --keyserver keyserver.ubuntu.com --recv-keys 3F32EE77E331692F
+ ```
+
+2. Download the [Debian Linux Package file (.deb) for RStudio](https://www.rstudio.com/products/rstudio/download/#download) for Ubuntu. The file will be in the format `rstudio-{version}-amd64.deb`. For example:
+
+ ```bash
+ export rstudiover="1.4.1717"
+ wget --quiet -O rstudio.deb https://download1.rstudio.org/desktop/bionic/amd64/rstudio-$rstudiover-amd64.deb
+ ```
+
+3. Use gdebi to install RStudio. Make sure to use the file path to indicate that we're installing a local file.
+
+ ```bash
+ sudo apt install gdebi-core
+ echo "y" | sudo gdebi rstudio.deb --quiet
+ ```
+
+### CRAN packages
+
+Now it's time to install any [CRAN packages](https://cloud.r-project.org/web/packages/available_packages_by_name.html) you want. First, add the [current R 4.0 or later 'c2d4u' repository](https://cran.rstudio.com/bin/linux/ubuntu/#get-5000-cran-packages).
+
+```bash
+sudo add-apt-repository ppa:c2d4u.team/c2d4u4.0+
+```
+
+Use the `install.packages("package name")` command in an R interactive session, as shown in the [quick list of useful R packages](https://support.rstudio.com/hc/articles/201057987-Quick-list-of-useful-R-packages) article. Alternatively, use the **Tools** > **Install Packages** menu item in RStudio.
+
+If you need help with finding a package, see a [list of packages by task](https://cran.r-project.org/web/views/) or [alphabetic list of packages](https://cloud.r-project.org/web/packages/available_packages_by_name.html).
+
+## Cost
+
+Let's cover an example cost estimate for this class. Suppose you have a class of 25 students. Each student has 20 hours of scheduled class time. Another 10 quota hours for homework or assignments outside of scheduled class time is given to each student. The virtual machine size we chose was **Small GPU (Compute)**, which is 139 lab units.
+
+25 students &times; (20 scheduled hours + 10 quota hours) &times; 139 Lab Units &times; 0.01 USD per hour = 1042.5 USD
+
+> [!IMPORTANT]
+> The cost estimate is for example purposes only. For current pricing information, see [Azure Lab Services pricing](https://azure.microsoft.com/pricing/details/lab-services/).
+
+## Next steps
+
+The template image can now be published to the lab. See [publish the template VM](how-to-create-manage-template.md#publish-the-template-vm) for further instructions.
+
+As you set up your lab, see the following articles:
+
+- [Add users](tutorial-setup-classroom-lab.md#add-users-to-the-lab)
+- [Set quotas](how-to-configure-student-usage.md#set-quotas-for-users)
+- [Set a schedule](tutorial-setup-classroom-lab.md#set-a-schedule-for-the-lab)
+- [Email registration links to students](how-to-configure-student-usage.md#send-invitations-to-users)
lab-services Classroom Labs Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lab-services/classroom-labs-overview.md
Title: About labs in Azure Lab Services | Microsoft Docs description: Learn how to quickly set up a classroom lab environment in the cloud - configure a lab with a template VM with the software required for the class and make a copy of the VM available to each student in the class. Previously updated : 06/26/2020 Last updated : 08/26/2021 # Introduction to labs
lab-services How To Attach External Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lab-services/how-to-attach-external-storage.md
If you're using a private endpoint to the Azure Files share, it's important to r
Follow these steps to create a VM connected to an Azure file share. 1. Create an [Azure Storage account](../storage/files/storage-how-to-create-file-share.md). On the **Connectivity method** page, choose **public endpoint** or **private endpoint**.
-2. If you've chosen the private method, create a [private endpoint](../private-link/tutorial-private-endpoint-storage-portal.md) in order for the file shares to be accessible from the virtual network. Create a [private DNS zone](../dns/private-dns-privatednszone.md), or use an existing one. Private Azure DNS zones provide name resolution within a virtual network.
-3. Create an [Azure file share](../storage/files/storage-how-to-create-file-share.md). The file share is reachable by the public host name of the storage account.
+2. If you've chosen the private method, create a [private endpoint](../private-link/tutorial-private-endpoint-storage-portal.md) in order for the file shares to be accessible from the virtual network.
+3. Create an [Azure file share](../storage/files/storage-how-to-create-file-share.md). The file share is reachable by the public host name of the storage account if using a public endpoint. The file share is reachable by private IP address if using a private endpoint.
4. Mount the Azure file share in the template VM: - [Windows](../storage/files/storage-how-to-use-files-windows.md)
- - [Linux](../storage/files/storage-how-to-use-files-linux.md). To avoid mounting issues on student VMs, see the next section.
+ - [Linux](../storage/files/storage-how-to-use-files-linux.md). To avoid mounting issues on student VMs, see the [use Azure Files with Linux](#use-azure-files-with-linux) section.
5. [Publish](how-to-create-manage-template.md#publish-the-template-vm) the template VM. > [!IMPORTANT]
Follow these steps to create a VM connected to an Azure file share.
If you use the default instructions to mount an Azure Files share, the file share will seem to disappear on student VMs after the template is published. The following modified script addresses this issue.
+For a file share with a public endpoint:
```bash
#!/bin/bash
sudo bash -c "echo ""//$storage_account_name.file.core.windows.net/$fileshare_na
sudo mount -t cifs //$storage_account_name.file.core.windows.net/$fileshare_name /$mount_directory/$fileshare_name -o vers=3.0,credentials=/etc/smbcredentials/$storage_account_name.cred,dir_mode=0777,file_mode=0777,serverino
```
+For a file share with a private endpoint:
+```bash
+#!/bin/bash
+
+# Assign variables values for your storage account and file share
+storage_account_name=""
+storage_account_ip=""
+storage_account_key=""
+fileshare_name=""
+
+# Do not use 'mnt' for mount directory.
+# Using 'mnt' will cause issues on student VMs.
+mount_directory="prm-mnt"
+
+sudo mkdir /$mount_directory/$fileshare_name
+if [ ! -d "/etc/smbcredentials" ]; then
+ sudo mkdir /etc/smbcredentials
+fi
+if [ ! -f "/etc/smbcredentials/$storage_account_name.cred" ]; then
+ sudo bash -c "echo ""username=$storage_account_name"" >> /etc/smbcredentials/$storage_account_name.cred"
+ sudo bash -c "echo ""password=$storage_account_key"" >> /etc/smbcredentials/$storage_account_name.cred"
+fi
+sudo chmod 600 /etc/smbcredentials/$storage_account_name.cred
+
+sudo bash -c "echo ""//$storage_account_ip/$fileshare_name /$mount_directory/$fileshare_name cifs nofail,vers=3.0,credentials=/etc/smbcredentials/$storage_account_name.cred,dir_mode=0777,file_mode=0777,serverino"" >> /etc/fstab"
+sudo mount -t cifs //$storage_account_ip/$fileshare_name /$mount_directory/$fileshare_name -o vers=3.0,credentials=/etc/smbcredentials/$storage_account_name.cred,dir_mode=0777,file_mode=0777,serverino
+```
+ If the template VM that mounts the Azure Files share to the `/mnt` directory is already published, the student can either: - Move the instruction to mount `/mnt` to the top of the `/etc/fstab` file.
lighthouse Create Eligible Authorizations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lighthouse/how-to/create-eligible-authorizations.md
Title: Create eligible authorizations description: When onboarding customers to Azure Lighthouse, you can let users in your managing tenant elevate their role on a just-in-time basis. Previously updated : 07/15/2021 Last updated : 08/26/2021
The **subscription-managing-tenant-approvers.json** template, which can be used
"type": "string", "metadata": { "description": "Specify a unique name for your offer"
- },
- "defaultValue": "<to be filled out by MSP> Specify a title for your offer"
+ }
}, "mspOfferDescription": { "type": "string", "metadata": { "description": "Name of the Managed Service Provider offering"
- },
- "defaultValue": "<to be filled out by MSP> Provide a brief description of your offer"
+ }
}, "managedByTenantId": { "type": "string", "metadata": { "description": "Specify the tenant id of the Managed Service Provider"
- },
- "defaultValue": "<to be filled out by MSP> Provide your tenant id"
+ }
}, "authorizations": { "type": "array", "metadata": { "description": "Specify an array of objects, containing tuples of Azure Active Directory principalId, a Azure roleDefinitionId, and an optional principalIdDisplayName. The roleDefinition specified is granted to the principalId in the provider's Active Directory and the principalIdDisplayName is visible to customers."
- },
- "defaultValue": [
- {
- "principalId": "00000000-0000-0000-0000-000000000000",
- "roleDefinitionId": "acdd72a7-3385-48ef-bd42-f606fba81ae7",
- "principalIdDisplayName": "PIM_Group"
- },
- {
- "principalId": "00000000-0000-0000-0000-000000000000",
- "roleDefinitionId": "91c1777a-f3dc-4fae-b103-61d183457e46",
- "principalIdDisplayName": "PIM_Group"
- }
- ]
- },
- "eligibleAuthorizations": {
- "type": "array",
- "metadata": {
- "description": "Provide the authorizations that will have just-in-time role assignments on customer environments"
- },
- "defaultValue": [
- {
- "justInTimeAccessPolicy": {
- "multiFactorAuthProvider": "Azure",
- "maximumActivationDuration": "PT8H",
- "managedByTenantApprovers": [
- {
- "principalId": "00000000-0000-0000-0000-000000000000",
- "principalIdDisplayName": "PIM-Approvers"
- }
- ]
- },
- "principalId": "00000000-0000-0000-0000-000000000000",
- "principalIdDisplayName": "PIM_Group",
- "roleDefinitionId": "b24988ac-6180-42a0-ab88-20f7382dd24c"
-
- }
- ]
-
- }
- },
- "variables": {
- "mspRegistrationName": "[guid(parameters('mspOfferName'))]",
- "mspAssignmentName": "[guid(parameters('mspOfferName'))]"
- },
- "resources": [
- {
- "type": "Microsoft.ManagedServices/registrationDefinitions",
- "apiVersion": "2020-02-01-preview",
- "name": "[variables('mspRegistrationName')]",
- "properties": {
- "registrationDefinitionName": "[parameters('mspOfferName')]",
- "description": "[parameters('mspOfferDescription')]",
- "managedByTenantId": "[parameters('managedByTenantId')]",
- "authorizations": "[parameters('authorizations')]",
- "eligibleAuthorizations": "[parameters('eligibleAuthorizations')]"
} },
- {
- "type": "Microsoft.ManagedServices/registrationAssignments",
- "apiVersion": "2020-02-01-preview",
- "name": "[variables('mspAssignmentName')]",
- "dependsOn": [
- "[resourceId('Microsoft.ManagedServices/registrationDefinitions/', variables('mspRegistrationName'))]"
- ],
- "properties": {
- "registrationDefinitionId": "[resourceId('Microsoft.ManagedServices/registrationDefinitions/', variables('mspRegistrationName'))]"
+ "eligibleAuthorizations": {
+ "type": "array",
+ "metadata": {
+ "description": "Provide the authorizations that will have just-in-time role assignments on customer environments with support for approvals from the managing tenant"
} }
- ],
-
- "outputs": {
- "mspOfferName": {
- "type": "string",
- "value": "[concat('Managed by', ' ', parameters('mspOfferName'))]"
+ },
+ "variables": {
+ "mspRegistrationName": "[guid(parameters('mspOfferName'))]",
+ "mspAssignmentName": "[guid(parameters('mspOfferName'))]"
},
- "authorizations": {
- "type": "array",
- "value": "[parameters('authorizations')]"
- },
- "eligibleAuthorizations": {
- "type": "array",
- "value": "[parameters('eligibleAuthorizations')]"
-
- }
+ "resources": [
+ {
+ "type": "Microsoft.ManagedServices/registrationDefinitions",
+ "apiVersion": "2020-02-01-preview",
+ "name": "[variables('mspRegistrationName')]",
+ "properties": {
+ "registrationDefinitionName": "[parameters('mspOfferName')]",
+ "description": "[parameters('mspOfferDescription')]",
+ "managedByTenantId": "[parameters('managedByTenantId')]",
+ "authorizations": "[parameters('authorizations')]",
+ "eligibleAuthorizations": "[parameters('eligibleAuthorizations')]"
+ }
+ },
+ {
+ "type": "Microsoft.ManagedServices/registrationAssignments",
+ "apiVersion": "2020-02-01-preview",
+ "name": "[variables('mspAssignmentName')]",
+ "dependsOn": [
+ "[resourceId('Microsoft.ManagedServices/registrationDefinitions/', variables('mspRegistrationName'))]"
+ ],
+ "properties": {
+ "registrationDefinitionId": "[resourceId('Microsoft.ManagedServices/registrationDefinitions/', variables('mspRegistrationName'))]"
+ }
+ }
+ ],
+ "outputs": {
+ "mspOfferName": {
+ "type": "string",
+ "value": "[concat('Managed by', ' ', parameters('mspOfferName'))]"
+ },
+ "authorizations": {
+ "type": "array",
+ "value": "[parameters('authorizations')]"
+ },
+ "eligibleAuthorizations": {
+ "type": "array",
+ "value": "[parameters('eligibleAuthorizations')]"
+ }
+ }
}
-}
``` ### Define eligible authorizations in your parameters file
lighthouse Onboard Customer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lighthouse/how-to/onboard-customer.md
Title: Onboard a customer to Azure Lighthouse description: Learn how to onboard a customer to Azure Lighthouse, allowing their resources to be accessed and managed by users in your tenant. Previously updated : 08/25/2021 Last updated : 08/26/2021
The template you choose will depend on whether you are onboarding an entire subs
|To onboard this |Use this Azure Resource Manager template |And modify this parameter file | ||||
-|Subscription |[delegatedResourceManagement.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/rg.json) |[delegatedResourceManagement.parameters.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/rg.parameters.json) |
-|Resource group |[rgDelegatedResourceManagement.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/rg.json) |[rgDelegatedResourceManagement.parameters.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/rg.parameters.json) |
-|Multiple resource groups within a subscription |[multipleRgDelegatedResourceManagement.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/multi-rg.json) |[multipleRgDelegatedResourceManagement.parameters.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/multiple-rg.parameters.json) |
+|Subscription |[subscription.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/subscription/subscription.json) |[subscription.parameters.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/subscription/subscription.parameters.json) |
+|Resource group |[rg.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/rg.json) |[rg.parameters.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/rg.parameters.json) |
+|Multiple resource groups within a subscription |[multi-rg.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/multi-rg.json) |[multiple-rg.parameters.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/delegated-resource-management/rg/multiple-rg.parameters.json) |
|Subscription (when using an offer published to Azure Marketplace) |[marketplaceDelegatedResourceManagement.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/marketplace-delegated-resource-management/marketplaceDelegatedResourceManagement.json) |[marketplaceDelegatedResourceManagement.parameters.json](https://github.com/Azure/Azure-Lighthouse-samples/blob/master/templates/marketplace-delegated-resource-management/marketplaceDelegatedResourceManagement.parameters.json) | If you want to include [eligible authorizations](create-eligible-authorizations.md#create-eligible-authorizations-using-azure-resource-manager-templates) (currently in public preview), select the corresponding template from the [delegated-resource-management-eligible-authorizations section of our samples repo](https://github.com/Azure/Azure-Lighthouse-samples/tree/master/templates/delegated-resource-management-eligible-authorizations).
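After you've filled in the parameter file, the onboarding template is deployed at subscription scope from the customer's tenant. The following is a minimal sketch with the Azure CLI; the deployment name, region, and file names are placeholders:

```bash
# Deploy the onboarding template at subscription scope in the customer's tenant
az deployment sub create \
  --name LighthouseOnboarding \
  --location westus \
  --template-file subscription.json \
  --parameters subscription.parameters.json
```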
If you want to include [eligible authorizations](create-eligible-authorizations.
> [!TIP] > While you can't onboard an entire management group in one deployment, you can deploy a policy to [onboard each subscription in a management group](onboard-management-group.md). You'll then have access to all of the subscriptions in the management group, although you'll have to work on them as individual subscriptions (rather than taking actions on the management group resource directly).
-The following example shows a modified **delegatedResourceManagement.parameters.json** file that can be used to onboard a subscription. The resource group parameter files (located in the [rg-delegated-resource-management](https://github.com/Azure/Azure-Lighthouse-samples/tree/master/templates/delegated-resource-management/rg) folder) are similar, but also include an **rgName** parameter to identify the specific resource group(s) to be onboarded.
+The following example shows a modified **subscription.parameters.json** file that can be used to onboard a subscription. The resource group parameter files (located in the [rg-delegated-resource-management](https://github.com/Azure/Azure-Lighthouse-samples/tree/master/templates/delegated-resource-management/rg) folder) are similar, but also include an **rgName** parameter to identify the specific resource group(s) to be onboarded.
```json {
lighthouse Index https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/lighthouse/samples/index.md
Title: Azure Lighthouse samples and templates description: These samples and Azure Resource Manager templates help you onboard customers and support Azure Lighthouse scenarios. Previously updated : 08/13/2021 Last updated : 08/26/2021 # Azure Lighthouse samples
machine-learning How To Attach Arc Kubernetes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-attach-arc-kubernetes.md
else:
- [Train a TensorFlow model](how-to-train-tensorflow.md)
- [Train a PyTorch model](how-to-train-pytorch.md)
- [Train using Azure Machine Learning pipelines](how-to-create-machine-learning-pipelines.md)
-- [Train model on-premise with outbound proxy server](../azure-arc/kubernetes/quickstart-connect-cluster.md#5-connect-using-an-outbound-proxy-server)
+- [Train model on-premises with outbound proxy server](../azure-arc/kubernetes/quickstart-connect-cluster.md#4a-connect-using-an-outbound-proxy-server)
machine-learning How To Configure Private Link https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-configure-private-link.md
Azure Private Link enables you to connect to your workspace using a private endp
## Limitations
-* Using an Azure Machine Learning workspace with private endpoint is not available in the Azure Government regions.
* If you enable public access for a workspace secured with a private endpoint and use Azure Machine Learning studio over the public internet, some features such as the designer may fail to access your data. This problem happens when the data is stored on a service that is secured behind the VNet. For example, an Azure Storage Account. * You may encounter problems trying to access the private endpoint for your workspace if you are using Mozilla Firefox. This problem may be related to DNS over HTTPS in Mozilla. We recommend using Microsoft Edge or Google Chrome as a workaround. * Using a private endpoint does not affect Azure control plane (management operations) such as deleting the workspace or managing compute resources. For example, creating, updating, or deleting a compute target. These operations are performed over the public Internet as normal. Data plane operations, such as using Azure Machine Learning studio, APIs (including published pipelines), or the SDK use the private endpoint.
machine-learning How To Create Workspace Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-create-workspace-template.md
By setting the `vnetOption` parameter value to either `new` or `existing`, you a
If your associated resources are not behind a virtual network, you can set the **privateEndpointType** parameter to `AutoApproval` or `ManualApproval` to deploy the workspace behind a private endpoint. This can be done for both new and existing workspaces. When updating an existing workspace, fill in the template parameters with the information from the existing workspace.
-> [!IMPORTANT]
-> Using an Azure Machine Learning workspace with private endpoint is not available in the Azure Government regions.
- # [Azure CLI](#tab/azcli) ```azurecli
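As a rough sketch of how the **privateEndpointType** parameter described above might be passed at deployment time (the template file name, resource group, and the **workspaceName** parameter are assumptions for illustration, not taken from the article):

```azurecli
# Deploy the workspace template into an existing resource group and request
# a private endpoint that is approved automatically.
# "azuredeploy.json" and the parameter names below are illustrative placeholders.
az deployment group create \
    --resource-group my-ml-rg \
    --template-file azuredeploy.json \
    --parameters workspaceName=my-workspace privateEndpointType=AutoApproval
```

Passing `ManualApproval` instead would typically leave the private endpoint connection pending until it is approved on the workspace.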
machine-learning How To Deploy With Triton https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-deploy-with-triton.md
az ml service delete -n triton-densenet-onnx
[!notebook-python[] (~/Azureml-examples-main/python-sdk/experimental/deploy-triton/1.bidaf-ncd-local.ipynb?name=delete-service)] +
+## How to use Azure Machine Learning Triton Inference Server container image
+
+Learn how to use the Azure Machine Learning Triton Inference Server container image with the new [CLI (v2)](https://docs.microsoft.com/cli/azure/ml?view=azure-cli-latest). The examples below use the [online endpoints and deployments](concept-endpoints.md#what-are-online-endpoints-preview) concepts.
+
+1. [Deploy single Triton model](https://github.com/Azure/azureml-examples/blob/main/cli/deploy-triton-managed-online-endpoint.sh).
+1. [Deploy multiple Triton models](https://github.com/Azure/azureml-examples/blob/main/cli/deploy-triton-multiple-models-online-endpoint.sh).
+1. [Deploy Triton ensemble model](https://github.com/Azure/azureml-examples/blob/main/cli/deploy-triton-ensemble-managed-online-endpoint.sh).
+1. Check out [Triton examples](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/triton).
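The linked scripts contain the end-to-end commands; as a minimal hedged sketch of the CLI (v2) pattern they follow (the endpoint, deployment, and YAML file names below are placeholders, and exact syntax can vary between extension versions):

```azurecli
# Create a managed online endpoint, then add a deployment that serves a
# Triton-format model. Assumes a default workspace and resource group have
# already been set with "az configure"; file names are placeholders.
az ml online-endpoint create --name triton-endpoint --file endpoint.yml
az ml online-deployment create --name blue --endpoint-name triton-endpoint \
    --file deployment.yml --all-traffic
```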
+ ## Troubleshoot * [Troubleshoot a failed deployment](how-to-troubleshoot-deployment.md), learn how to troubleshoot and solve, or work around, common errors you may encounter when deploying a model.
machine-learning How To Manage Workspace Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-manage-workspace-cli.md
az ml workspace create -w <workspace-name> -g <resource-group-name> --file works
-> [!IMPORTANT]
-> Using an Azure Machine Learning workspace with private endpoint is not available in the Azure Government regions.
- ### Customer-managed key and high business impact workspace By default, metadata for the workspace is stored in an Azure Cosmos DB instance that Microsoft maintains. This data is encrypted using Microsoft-managed keys. Instead of using the Microsoft-managed key, you can also provide your own key. Doing so creates an additional set of resources in your Azure subscription to store your data.
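As a hedged sketch of how those options surface on the CLI command referenced above (the Key Vault resource ID and key URI are placeholders, and the flag names should be verified against the installed `azure-cli-ml` extension version):

```azurecli
# Create a workspace whose metadata is encrypted with a customer-managed key
# from your own Key Vault, and mark it as high business impact.
# The Key Vault ID and key URI below are illustrative placeholders.
az ml workspace create -w my-workspace -g my-ml-rg \
    --cmk-keyvault "/subscriptions/<subscription-id>/resourceGroups/my-ml-rg/providers/Microsoft.KeyVault/vaults/my-kv" \
    --resource-cmk-uri "https://my-kv.vault.azure.net/keys/my-key/<key-version>" \
    --hbi-workspace
```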
machine-learning How To Manage Workspace https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-manage-workspace.md
In the [Azure portal](https://portal.azure.com/), select **Delete** at the top
* If you go directly to your workspace from a share link from the SDK or the Azure portal, you can't view the standard **Overview** page that has subscription information in the extension. In this scenario, you also can't switch to another workspace. To view another workspace, go directly to [Azure Machine Learning studio](https://ml.azure.com) and search for the workspace name. * All assets (Datasets, Experiments, Computes, and so on) are available only in [Azure Machine Learning studio](https://ml.azure.com). They're *not* available from the Azure portal.
+### Workspace diagnostics
++ ### Resource provider errors [!INCLUDE [machine-learning-resource-provider](../../includes/machine-learning-resource-provider.md)]
machine-learning How To Network Security Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-network-security-overview.md
For detailed instructions on how to complete these steps, see [Secure an Azure M
### Limitations Securing your workspace and associated resources within a virtual network have the following limitations:-- Using an Azure Machine Learning workspace with private endpoint is not available in the Azure Government or Azure China 21Vianet regions.
+- Using an Azure Machine Learning workspace with private endpoint is not available in the Azure China 21Vianet regions.
- All resources must be behind the same VNet. However, subnets within the same VNet are allowed. ## Secure the training environment
machine-learning How To Secure Workspace Vnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/how-to-secure-workspace-vnet.md
The following methods can be used to connect to the secure workspace:
> [!IMPORTANT] > When using a __VPN gateway__ or __ExpressRoute__, you will need to plan how name resolution works between your on-premises resources and those in the VNet. For more information, see [Use a custom DNS server](how-to-custom-dns.md).
+## Workspace diagnostics
++ ## Next steps This article is part of a series on securing an Azure Machine Learning workflow. See the other articles in this series:
machine-learning Quickstart Create Resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/quickstart-create-resources.md
You could install Azure Machine Learning on your own computer. But in this quic
Create a *compute instance* to use this development environment for the rest of the tutorials and quickstarts. 1. If you didn't select **Go to workspace** in the previous section, sign in to [Azure Machine Learning studio](https://ml.azure.com) now, and select your workspace.
-1. On the left side, under **Manage**, select **Compute**.
+1. On the left side, select **Compute**.
1. Select **+New** to create a new compute instance.
-1. Keep all the defaults on the first page, select **Next**.
-1. Supply a name and select **Create**.
+1. Supply a name and keep all the defaults on the first page.
+1. Select **Create**.
In about two minutes, you'll see the **State** of the compute instance change from *Creating* to *Running*. It's now ready to go.
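For readers who prefer the command line over the studio, roughly the same compute instance can be created with the `azure-cli-ml` extension; this is a sketch under the assumption that a workspace already exists, and the instance name and VM size are placeholders:

```azurecli
# Create a compute instance in an existing workspace.
# Name and VM size are illustrative placeholders.
az ml computetarget create computeinstance \
    --name my-compute-instance \
    --vm-size STANDARD_DS3_V2 \
    -w my-workspace -g my-ml-rg
```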
Review the parts of the studio on the left-hand navigation bar:
* You already used the **Manage** section of the studio to create your compute resources. This section also lets you create and manage data and external services you link to your workspace.
+### Workspace diagnostics
++ ## <a name="clean-up"></a>Clean up resources If you plan to continue now to the next tutorial, skip to [Next steps](#next-steps).
machine-learning Reference Machine Learning Cloud Parity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/machine-learning/reference-machine-learning-cloud-parity.md
Previously updated : 07/16/2021 Last updated : 08/24/2021
The information in the rest of this document provides information on what featur
| Virtual Network (VNet) support for training | GA | YES | YES | | Virtual Network (VNet) support for inference | GA | YES | YES | | Scoring endpoint authentication | Public Preview | YES | YES |
-| Workspace private endpoint | GA | Public Preview | Public Preview |
+| Workspace private endpoint | GA | GA | GA |
| ACI behind VNet | Public Preview | NO | NO | | ACR behind VNet | GA | YES | YES | | Private IP of AKS cluster | Public Preview | NO | NO |
marketplace Gtm Offer Listing Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/marketplace/gtm-offer-listing-best-practices.md
For an analysis of how your offers are performing, go to the [Marketplace Insigh
## Link to your offer page from your website
-To easily direct users to your offer in the commercial marketplace, leverage our **Get It Now** badges on your website or in your digital marketing collateral. Find these badges in our [Marketplace Marketing Toolkit](/asset/collection/azure-marketplace-and-appsource-publisher-toolkit#/).
+To easily direct users to your offer in the commercial marketplace, leverage our **Get It Now** badges on your website or in your digital marketing collateral. Find these badges in our [Marketplace Marketing Toolkit](https://partner.microsoft.com/asset/collection/azure-marketplace-and-appsource-publisher-toolkit#/).
When you link from the AppSource or Azure Marketplace badge on your site to your listing in the commercial marketplace, support strong analytics and reporting by including the following query parameters at the end of the URL: * **src**: Include the source from which the traffic is routed to AppSource (for example, website, LinkedIn, or Faceb