Updates from: 06/18/2021 03:12:49
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Application Types https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/application-types.md
Previously updated : 07/24/2019 Last updated : 06/17/2021
Many architectures include a web API that needs to call another downstream web A
This chained web API scenario can be supported by using the OAuth 2.0 JWT bearer credential grant, also known as the on-behalf-of flow. However, the on-behalf-of flow is not currently implemented in Azure AD B2C.
-### Faulted apps
-
-Do not edit Azure AD B2C applications in these ways:
--- On other application management portals such as the [Application Registration Portal](https://portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/ApplicationsListBlade).
-- Using Graph API or PowerShell.
-If you edit the Azure AD B2C application outside of the Azure portal, it becomes a faulted application and is no longer usable with Azure AD B2C. Delete the application and create it again.
-
-To delete the application, go to the [Application Registration Portal](https://portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/ApplicationsListBlade) and delete the application there. In order for the application to be visible, you need to be the owner of the application (and not just an admin of the tenant).
- ## Next steps
-Find out more about the built-in policies provided by [User flows in Azure Active Directory B2C](user-flow-overview.md).
+Find out more about the built-in policies provided by [User flows in Azure Active Directory B2C](user-flow-overview.md).
active-directory-b2c Configure Authentication Sample Spa App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/configure-authentication-sample-spa-app.md
Previously updated : 06/11/2021 Last updated : 06/17/2021
This sample demonstrates how a single-page application can use Azure AD B2C for
### 3.1 Update the SPA sample
-Now that you've obtained the sample, update the code with your Azure AD B2C tenant name and the application ID of *myApp* you recorded in step 2.3.
+Now that you've obtained the SPA app sample, update the code with your Azure AD B2C and web API values. In the sample folder, under the `App` folder, open the following JavaScript files, and update with the corresponding value:
-Open the *authConfig.js* file inside the *App* folder.
-1. In the `msalConfig` object, find the assignment for `clientId` and replace it with the **Application (client) ID** you recorded in step 2.3.
-
-Open the `policies.js` file.
-1. Find the entries under `names` and replace their assignment with the name of the user-flows you created in an earlier step, for example `b2c_1_susi`.
-1. Find the entries under `authorities` and replace them as appropriate with the names of the user-flows you created in an earlier step, for example `https://<your-tenant-name>.b2clogin.com/<your-tenant-name>.onmicrosoft.com/<your-sign-in-sign-up-policy>`.
-1. Find the assignment for `authorityDomain` and replace it with `<your-tenant-name>.b2clogin.com`.
-
-Open the `apiConfig.js` file.
-1. Find the assignment for `b2cScopes` and replace the URL with the scope URL you created for the Web API, for example `b2cScopes: ["https://<your-tenant-name>.onmicrosoft.com/tasks-api/tasks.read"]`.
-1. Find the assignment for `webApi` and replace the current URL with `http://localhost:5000/tasks`.
+|File |Key |Value |
+||||
+|authConfig.js|clientId| The SPA application ID from [step 2.1](#21-register-the-web-api-application).|
+|policies.js|names|The user flow or custom policy you created in [step 1](#step-1-configure-your-user-flow).|
+|policies.js|authorities|Your Azure AD B2C [tenant name](tenant-management.md#get-your-tenant-name) and the user flow or custom policy you created in [step 1](#step-1-configure-your-user-flow). For example, `https://<your-tenant-name>.b2clogin.com/<your-tenant-name>.onmicrosoft.com/<your-sign-in-sign-up-policy>`.|
+|policies.js|authorityDomain|Your Azure AD B2C [tenant name](tenant-management.md#get-your-tenant-name) in the format `<your-tenant-name>.b2clogin.com`.|
+|apiConfig.js|b2cScopes|The scopes you [created for the web API](#22-configure-scopes). For example, `b2cScopes: ["https://<your-tenant-name>.onmicrosoft.com/tasks-api/tasks.read"]`.|
+|apiConfig.js|webApi|The URL of the web API, `http://localhost:5000/tasks`.|
Your resulting code should look similar to the following sample:
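As a point of reference, here's a minimal sketch of the shape the updated configuration objects take; the tenant name, policy name, client ID, and scope below are placeholder values, not values from this article:

```javascript
// authConfig.js (sketch) -- all values below are placeholders
const msalConfig = {
  auth: {
    clientId: "00000000-0000-0000-0000-000000000000", // SPA Application (client) ID
    authority: "https://contoso.b2clogin.com/contoso.onmicrosoft.com/b2c_1_susi",
    knownAuthorities: ["contoso.b2clogin.com"], // authorityDomain
  },
};

// apiConfig.js (sketch) -- web API scope and endpoint
const apiConfig = {
  b2cScopes: ["https://contoso.onmicrosoft.com/tasks-api/tasks.read"],
  webApi: "http://localhost:5000/tasks",
};
```

The `knownAuthorities` entry must list the B2C authority domain so MSAL will trust it during redirects.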
active-directory-b2c Identity Provider Azure Ad Multi Tenant https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-azure-ad-multi-tenant.md
Previously updated : 06/15/2021 Last updated : 06/17/2021
This article shows you how to enable sign-in for users using the multi-tenant en
[!INCLUDE [active-directory-b2c-customization-prerequisites](../../includes/active-directory-b2c-customization-prerequisites.md)]
-## Register an application
+## Register an Azure AD app
To enable sign-in for users with an Azure AD account in Azure Active Directory B2C (Azure AD B2C), you need to create an application in [Azure portal](https://portal.azure.com). For more information, see [Register an application with the Microsoft identity platform](../active-directory/develop/quickstart-register-app.md).
To enable sign-in for users with an Azure AD account in Azure Active Directory B
1. Select **Certificates & secrets**, and then select **New client secret**.
1. Enter a **Description** for the secret, select an expiration, and then select **Add**. Record the **Value** of the secret for use in a later step.
-## Configuring optional claims
+### Configuring optional claims
If you want to get the `family_name`, and `given_name` claims from Azure AD, you can configure optional claims for your application in the Azure portal UI or application manifest. For more information, see [How to provide optional claims to your Azure AD app](../active-directory/develop/active-directory-optional-claims.md).
If you want to get the `family_name`, and `given_name` claims from Azure AD, you
1. Select the optional claims to add: `family_name` and `given_name`.
1. Click **Add**.
+## [Optional] Verify your app authenticity
+
+[Publisher verification](../active-directory/develop/publisher-verification-overview.md) helps your users understand the authenticity of the app you [registered](#register-an-azure-ad-app). A verified app means that the publisher of the app has [verified](/partner-center/verification-responses) their identity using their Microsoft Partner Network (MPN). Learn how to [mark your app as publisher verified](../active-directory/develop/mark-app-as-publisher-verified.md).
+ ## Create a policy key

You need to store the application key that you created in your Azure AD B2C tenant.
If the sign-in process is successful, your browser is redirected to `https://jwt
## Next steps
-[Publisher verification](../active-directory/develop/publisher-verification-overview.md) helps your users understand the authenticity of the app you [registered](#register-an-application). A verified app means that the publisher of the app has [verified](/partner-center/verification-responses) their identity using their Microsoft Partner Network (MPN). Learn how to [mark your app as publisher verified](../active-directory/develop/mark-app-as-publisher-verified.md).
+Learn how to [pass the Azure AD token to your application](idp-pass-through-user-flow.md).
::: zone-end
active-directory-b2c Identity Provider Azure Ad Single Tenant https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-azure-ad-single-tenant.md
Previously updated : 06/15/2021 Last updated : 06/17/2021
If you want to get the `family_name` and `given_name` claims from Azure AD, you
1. Select the optional claims to add: `family_name` and `given_name`.
1. Click **Add**.
+## [Optional] Verify your app authenticity
+
+[Publisher verification](../active-directory/develop/publisher-verification-overview.md) helps your users understand the authenticity of the app you [registered](#register-an-azure-ad-app). A verified app means that the publisher of the app has [verified](/partner-center/verification-responses) their identity using their Microsoft Partner Network (MPN). Learn how to [mark your app as publisher verified](../active-directory/develop/mark-app-as-publisher-verified.md).
+ ::: zone pivot="b2c-user-flow"

## Configure Azure AD as an identity provider
If the sign-in process is successful, your browser is redirected to `https://jwt
## Next steps
-[Publisher verification](../active-directory/develop/publisher-verification-overview.md) helps your users understand the authenticity of the app you [registered](#register-an-azure-ad-app). A verified app means that the publisher of the app has [verified](/partner-center/verification-responses) their identity using their Microsoft Partner Network (MPN). Learn how to [mark your app as publisher verified](../active-directory/develop/mark-app-as-publisher-verified.md).
+Learn how to [pass the Azure AD token to your application](idp-pass-through-user-flow.md).
active-directory-domain-services Join Centos Linux Vm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-domain-services/join-centos-linux-vm.md
Previously updated : 07/13/2020 Last updated : 06/17/2021
When done, save and exit the *hosts* file using the `:wq` command of the editor.
The VM needs some additional packages to join the VM to the managed domain. To install and configure these packages, update and install the domain-join tools using `yum`:

```console
-sudo yum install realmd sssd krb5-workstation krb5-libs oddjob oddjob-mkhomedir samba-common-tools
+sudo yum install adcli realmd sssd krb5-workstation krb5-libs oddjob oddjob-mkhomedir samba-common-tools
```

## Join VM to the managed domain
Now that the required packages are installed on the VM, join the VM to the manag
1. Finally, join the VM to the managed domain using the `realm join` command. Use the same user account that's a part of the managed domain that you specified in the previous `kinit` command, such as `contosoadmin@AADDSCONTOSO.COM`:

    ```console
- sudo realm join --verbose AADDSCONTOSO.COM -U 'contosoadmin@AADDSCONTOSO.COM'
+ sudo realm join --verbose AADDSCONTOSO.COM -U 'contosoadmin@AADDSCONTOSO.COM' --membership-software=adcli
```

It takes a few moments to join the VM to the managed domain. The following example output shows the VM has successfully joined to the managed domain:
active-directory Concept Authentication Methods https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/concept-authentication-methods.md
Previously updated : 06/16/2021 Last updated : 06/17/2021
# What authentication and verification methods are available in Azure Active Directory?
-Microsoft recommends passwordless authentication methods such as Windows Hello, FIDO2 security keys, and the Microsoft Authenticator app because they provide the most secure sign-in experience. As part of the sign-in experience for accounts in Azure Active Directory (Azure AD), there are different ways that Although a user can sign-in using other common methods such as a username and password, passwords should be replaced with more secure authentication methods.
+Microsoft recommends passwordless authentication methods such as Windows Hello, FIDO2 security keys, and the Microsoft Authenticator app because they provide the most secure sign-in experience. Although a user can sign-in using other common methods such as a username and password, passwords should be replaced with more secure authentication methods.
![Table of the strengths and preferred authentication methods in Azure AD](media/concept-authentication-methods/authentication-methods.png)
active-directory Concept Registration Mfa Sspr Combined https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/authentication/concept-registration-mfa-sspr-combined.md
Previously updated : 01/27/2021 Last updated : 06/16/2021
A user who has previously set up at least one method navigates to [https://aka.m
A user who has previously set up at least one method that can be used for Multi-Factor Authentication navigates to [https://aka.ms/mysecurityinfo](https://aka.ms/mysecurityinfo). The user changes the current default method to a different default method. When finished, the user sees the new default method on the Security info page.
+An external identity, such as a B2B collaboration user, may need to switch directories to change the security registration information for a third-party tenant. In the Azure portal, select the user account name in the upper-right corner, and then select **Switch directory**.
+
+![External users can switch directory.](media/concept-registration-mfa-sspr-combined/switch-directory.png)
+ ## Next steps

To get started, see the tutorials to [enable self-service password reset](tutorial-enable-sspr.md) and [enable Azure AD Multi-Factor Authentication](tutorial-enable-azure-mfa.md).
active-directory Active Directory Claims Mapping https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/active-directory-claims-mapping.md
Previously updated : 06/10/2021 Last updated : 06/16/2021

# Customize claims emitted in tokens for a specific app in a tenant
-Claims customization is used by tenant admins to customize the claims emitted in tokens for a specific application in their tenant. You can use claims-mapping policies to:
+A claim is information that an identity provider states about a user inside the token they issue for that user. Claims customization is used by tenant admins to customize the claims emitted in tokens for a specific application in their tenant. You can use claims-mapping policies to:
- select which claims are included in tokens. - create claim types that do not already exist.
Claims customization supports configuring claim-mapping policies for the WS-Fed,
> [!NOTE]
> This feature replaces and supersedes the [claims customization](active-directory-saml-claims-customization.md) offered through the Azure portal. On the same application, if you customize claims using the portal in addition to the Microsoft Graph/PowerShell method detailed in this document, tokens issued for that application will ignore the configuration in the portal. Configurations made through the methods detailed in this document will not be reflected in the portal.
-In this article, we walk through a few common scenarios that can help you grasp how to use the [claims-mapping policy type](reference-claims-mapping-policy-type.md).
+In this article, we walk through a few common scenarios that can help you understand how to use the [claims-mapping policy type](reference-claims-mapping-policy-type.md).
-When creating a claims-mapping policy, you can also emit a claim from a directory schema extension attribute in tokens. Use *ExtensionID* for the extension attribute instead of *ID* in the `ClaimsSchema` element. For more info on extension attributes, see [Using directory schema extension attributes](active-directory-schema-extensions.md).
+## Get started
-## Prerequisites
+In the following examples, you create, update, link, and delete policies for service principals. Claims-mapping policies can only be assigned to service principal objects. If you are new to Azure AD, we recommend that you [learn about how to get an Azure AD tenant](quickstart-create-new-tenant.md) before you proceed with these examples.
-In the following examples, you create, update, link, and delete policies for service principals. claims-mapping policies can only be assigned to service principal objects. If you are new to Azure AD, we recommend that you [learn about how to get an Azure AD tenant](quickstart-create-new-tenant.md) before you proceed with these examples.
+When creating a claims-mapping policy, you can also emit a claim from a directory schema extension attribute in tokens. Use *ExtensionID* for the extension attribute instead of *ID* in the `ClaimsSchema` element. For more info on extension attributes, see [Using directory schema extension attributes](active-directory-schema-extensions.md).
> [!NOTE]
-> The [Azure AD PowerShell Module public preview release](https://www.powershellgallery.com/packages/AzureADPreview) is required to configure claims-mapping policies. The PowerShell module is in preview, be prepared to revert or remove any changes.
+> The [Azure AD PowerShell Module public preview release](https://www.powershellgallery.com/packages/AzureADPreview) is required to configure claims-mapping policies. The PowerShell module is in preview, while the claims mapping and token creation runtime in Azure is generally available. Updates to the preview PowerShell module could require you to update or change your configuration scripts.
To get started, do the following steps:

1. Download the latest [Azure AD PowerShell Module public preview release](https://www.powershellgallery.com/packages/AzureADPreview).
-1. Run the Connect command to sign in to your Azure AD admin account. Run this command each time you start a new session.
+1. Run the [Connect-AzureAD](/powershell/module/azuread/connect-azuread?view=azureadps-2.0-preview) command to sign in to your Azure AD admin account. Run this command each time you start a new session.
```powershell
Connect-AzureAD -Confirm
To get started, do the following steps:
Get-AzureADPolicy
```
+Next, create a claims mapping policy and assign it to a service principal. See these examples for common scenarios:
+- [Omit the basic claims from tokens](#omit-the-basic-claims-from-tokens)
+- [Include the EmployeeID and TenantCountry as claims in tokens](#include-the-employeeid-and-tenantcountry-as-claims-in-tokens)
+- [Use a claims transformation in tokens](#use-a-claims-transformation-in-tokens)
+
+After creating a claims mapping policy, configure your application to acknowledge that tokens will contain customized claims. For more information, read [security considerations](#security-considerations).
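As a rough sketch of the shape these steps take (the display name, claim values, and service principal ID below are illustrative placeholders, not values from this article), a claims-mapping policy is created from a JSON definition and then linked to a service principal using the preview-module cmdlets:

```powershell
# Sketch only: illustrative policy definition; replace the claim values and IDs with your own.
$definition = @('{
  "ClaimsMappingPolicy": {
    "Version": 1,
    "IncludeBasicClaimSet": "true",
    "ClaimsSchema": [
      { "Source": "user", "ID": "employeeid", "JwtClaimType": "employeeid" }
    ]
  }
}')

$policy = New-AzureADPolicy -Definition $definition -DisplayName "MyClaimsPolicy" -Type "ClaimsMappingPolicy"

# Link the policy to the service principal of the target application.
Add-AzureADServicePrincipalPolicy -Id "<service-principal-object-id>" -RefObjectId $policy.Id
```

The scenario sections below walk through concrete `ClaimsSchema` definitions for each case.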
+ ## Omit the basic claims from tokens In this example, you create a policy that removes the [basic claim set](reference-claims-mapping-policy-type.md#claim-sets) from tokens issued to linked service principals.
In this example, you create a policy that emits a custom claim "JoinedData" to J
## Security considerations
-Applications that receive tokens rely on the fact that the claim values are authoritatively issued by Azure AD and cannot be tampered with. However, when you modify the token contents through claims-mapping policies, these assumptions may no longer be correct. Applications must explicitly acknowledge that tokens have been modified by the creator of the claims-mapping policy to protect themselves from claims-mapping policies created by malicious actors. This can be done in the following ways:
+Applications that receive tokens rely on the fact that the claim values are authoritatively issued by Azure AD and cannot be tampered with. However, when you modify the token contents through claims-mapping policies, these assumptions may no longer be correct. Applications must explicitly acknowledge that tokens have been modified by the creator of the claims-mapping policy to protect themselves from claims-mapping policies created by malicious actors. This can be done in one of the following ways:
-- Configure a custom signing key
-- Update the application manifest to accept mapped claims.
+- [Configure a custom signing key](#configure-a-custom-signing-key)
+- Or, [update the application manifest](#update-the-application-manifest) to accept mapped claims.
Without this, Azure AD will return an [`AADSTS50146` error code](reference-aadsts-error-codes.md#aadsts-error-codes).
-### Custom signing key
+### Configure a custom signing key
+
+For multi-tenant apps, a custom signing key should be used. Do not set `acceptMappedClaims` in the app manifest. If you set up an app in the Azure portal, you get an app registration object and a service principal in your tenant. That app uses the Azure global signing key, which cannot be used for customizing claims in tokens. To get custom claims in tokens, create a custom signing key from a certificate and add it to the service principal. For testing purposes, you can use a self-signed certificate. After configuring the custom signing key, your application code needs to [validate the token signing key](#validate-token-signing-key).
+
+Add the following information to the service principal:
+
+- Private key (as a [key credential](/graph/api/resources/keycredential))
+- Password (as a [password credential](/graph/api/resources/passwordcredential))
+- Public key (as a [key credential](/graph/api/resources/keycredential))
+
+Extract the private and public key base-64 encoded from the PFX file export of your certificate. Make sure that the `keyId` for the `keyCredential` used for "Sign" matches the `keyId` of the `passwordCredential`. You can generate the `customkeyIdentifier` by getting the hash of the cert's thumbprint.
-In order to add a custom signing key to the service principal object, you can use the Azure PowerShell cmdlet [`New-AzureADApplicationKeyCredential`](/powerShell/module/Azuread/New-AzureADApplicationKeyCredential) to create a certificate key credential for your Application object.
+#### Request
+
+The following shows the format of the HTTP PATCH request to add a custom signing key to a service principal. The "key" value in the `keyCredentials` property is shortened for readability. The value is base-64 encoded. For the private key, the property usage is "Sign". For the public key, the property usage is "Verify".
+
+```
+PATCH https://graph.microsoft.com/v1.0/servicePrincipals/f47a6776-bca7-4f2e-bc6c-eec59d058e3e
+
+Content-type: application/json
+Authorization: Bearer {token}
+
+{
+ "keyCredentials":[
+ {
+ "customKeyIdentifier": "lY85bR8r6yWTW6jnciNEONwlVhDyiQjdVLgPDnkI5mA=",
+ "endDateTime": "2021-04-22T22:10:13Z",
+ "keyId": "4c266507-3e74-4b91-aeba-18a25b450f6e",
+ "startDateTime": "2020-04-22T21:50:13Z",
+ "type": "X509CertAndPassword",
+ "usage": "Sign",
+ "key":"MIIKIAIBAz.....HBgUrDgMCERE20nuTptI9MEFCh2Ih2jaaLZBZGeZBRFVNXeZmAAgIH0A==",
+ "displayName": "CN=contoso"
+ },
+ {
+ "customKeyIdentifier": "lY85bR8r6yWTW6jnciNEONwlVhDyiQjdVLgPDnkI5mA=",
+ "endDateTime": "2021-04-22T22:10:13Z",
+ "keyId": "e35a7d11-fef0-49ad-9f3e-aacbe0a42c42",
+ "startDateTime": "2020-04-22T21:50:13Z",
+ "type": "AsymmetricX509Cert",
+ "usage": "Verify",
+ "key": "MIIDJzCCAg+gAw......CTxQvJ/zN3bafeesMSueR83hlCSyg==",
+ "displayName": "CN=contoso"
+ }
+
+ ],
+ "passwordCredentials": [
+ {
+ "customKeyIdentifier": "lY85bR8r6yWTW6jnciNEONwlVhDyiQjdVLgPDnkI5mA=",
+ "keyId": "4c266507-3e74-4b91-aeba-18a25b450f6e",
+ "endDateTime": "2022-01-27T19:40:33Z",
+ "startDateTime": "2020-04-20T19:40:33Z",
+ "secretText": "mypassword"
+ }
+ ]
+}
+```
+
+#### Configure a custom signing key using PowerShell
+
+Use PowerShell to [instantiate an MSAL Public Client Application](msal-net-initializing-client-applications.md#initializing-a-public-client-application-from-code) and use the [Authorization Code Grant](v2-oauth2-auth-code-flow.md) flow to obtain a delegated permission access token for Microsoft Graph. Use the access token to call Microsoft Graph and configure a custom signing key for the service principal. After configuring the custom signing key, your application code needs to [validate the token signing key](#validate-token-signing-key).
+
+To run this script you need:
+1. The object ID of your application's service principal, found in the **Overview** blade of your application's entry in [Enterprise Applications](https://portal.azure.com/#blade/Microsoft_AAD_IAM/StartboardApplicationsMenuBlade/AllApps/menuId/) in the Azure portal.
+2. An app registration to sign in a user and get an access token to call Microsoft Graph. Get the application (client) ID of this app in the **Overview** blade of the application's entry in [App registrations](https://portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/RegisteredApps) in the Azure portal. The app registration should have the following configuration:
+ - A redirect URI of "http://localhost" listed in the **Mobile and desktop applications** platform configuration
+ - In **API permissions**, Microsoft Graph delegated permissions **Application.ReadWrite.All** and **User.Read** (make sure you grant Admin consent to these permissions)
+3. A user who logs in to get the Microsoft Graph access token. The user should be one of the following Azure AD administrative roles (required to update the service principal):
+ - Cloud Application Administrator
+ - Application Administrator
+ - Global Administrator
+4. A certificate to configure as a custom signing key for our application. You can either create a self-signed certificate or obtain one from your trusted certificate authority. The following certificate components are used in the script:
+ - public key (typically a .cer file)
+ - private key in PKCS#12 format (in .pfx file)
+ - password for the private key (pfx file)
+
+> [!IMPORTANT]
+> The private key must be in PKCS#12 format since Azure AD does not support other format types. Using the wrong format can result in the error "Invalid certificate: Key value is invalid certificate" when using Microsoft Graph to PATCH the service principal with a `keyCredentials` containing the certificate info.
+
+```powershell
+
+$fqdn="fourthcoffeetest.onmicrosoft.com" # this is used for the 'issued to' and 'issued by' field of the certificate
+$pwd="mypassword" # password for exporting the certificate private key
+$location="C:\\temp" # path to folder where both the pfx and cer file will be written to
+
+# Create a self-signed cert
+$cert = New-SelfSignedCertificate -certstorelocation cert:\currentuser\my -DnsName $fqdn
+$pwdSecure = ConvertTo-SecureString -String $pwd -Force -AsPlainText
+$path = 'cert:\currentuser\my\' + $cert.Thumbprint
+$cerFile = $location + "\\" + $fqdn + ".cer"
+$pfxFile = $location + "\\" + $fqdn + ".pfx"
+
+# Export the public and private keys
+Export-PfxCertificate -cert $path -FilePath $pfxFile -Password $pwdSecure
+Export-Certificate -cert $path -FilePath $cerFile
+
+$ClientID = "<app-id>"
+$loginURL = "https://login.microsoftonline.com"
+$tenantdomain = "fourthcoffeetest.onmicrosoft.com"
+$redirectURL = "http://localhost" # this reply URL is needed for PowerShell Core
+[string[]] $Scopes = "https://graph.microsoft.com/.default"
+$pfxpath = $pfxFile # path to pfx file
+$cerpath = $cerFile # path to cer file
+$SPOID = "<service-principal-id>"
+$graphuri = "https://graph.microsoft.com/v1.0/serviceprincipals/$SPOID"
+$password = $pwd # password for the pfx file
+
+
+# choose the correct folder name for MSAL based on PowerShell version 5.1 (.Net) or PowerShell Core (.Net Core)
+
+if ($PSVersionTable.PSVersion.Major -gt 5)
+ {
+ $core = $true
+ $foldername = "netcoreapp2.1"
+ }
+else
+ {
+ $core = $false
+ $foldername = "net45"
+ }
+
+# Load the MSAL/microsoft.identity/client assembly -- needed once per PowerShell session
+[System.Reflection.Assembly]::LoadFrom((Get-ChildItem C:/Users/<username>/.nuget/packages/microsoft.identity.client/4.32.1/lib/$foldername/Microsoft.Identity.Client.dll).fullname) | out-null
+
+$global:app = $null
+
+$ClientApplicationBuilder = [Microsoft.Identity.Client.PublicClientApplicationBuilder]::Create($ClientID)
+[void]$ClientApplicationBuilder.WithAuthority($("$loginURL/$tenantdomain"))
+[void]$ClientApplicationBuilder.WithRedirectUri($redirectURL)
+
+$global:app = $ClientApplicationBuilder.Build()
+
+Function Get-GraphAccessTokenFromMSAL {
+ [Microsoft.Identity.Client.AuthenticationResult] $authResult = $null
+ $AquireTokenParameters = $global:app.AcquireTokenInteractive($Scopes)
+ [IntPtr] $ParentWindow = [System.Diagnostics.Process]::GetCurrentProcess().MainWindowHandle
+ if ($ParentWindow)
+ {
+ [void]$AquireTokenParameters.WithParentActivityOrWindow($ParentWindow)
+ }
+ try {
+ $authResult = $AquireTokenParameters.ExecuteAsync().GetAwaiter().GetResult()
+ }
+ catch {
+ $ErrorMessage = $_.Exception.Message
+ Write-Host $ErrorMessage
+ }
+
+ return $authResult
+}
+
+$myvar = Get-GraphAccessTokenFromMSAL
+if ($myvar)
+{
+ $GraphAccessToken = $myvar.AccessToken
+ Write-Host "Access Token: " $myvar.AccessToken
+ #$GraphAccessToken = "eyJ0eXAiOiJKV1QiL ... iPxstltKQ"
+
+
+ # this is for PowerShell Core
+ $Secure_String_Pwd = ConvertTo-SecureString $password -AsPlainText -Force
+
+ # reading certificate files and creating Certificate Object
+ if ($core)
+ {
+ $pfx_cert = get-content $pfxpath -AsByteStream -Raw
+ $cer_cert = get-content $cerpath -AsByteStream -Raw
+ $cert = Get-PfxCertificate -FilePath $pfxpath -Password $Secure_String_Pwd
+ }
+ else
+ {
+ $pfx_cert = get-content $pfxpath -Encoding Byte
+ $cer_cert = get-content $cerpath -Encoding Byte
+ # Write-Host "Enter password for the pfx file..."
+ # calling Get-PfxCertificate in PowerShell 5.1 prompts for password
+ # $cert = Get-PfxCertificate -FilePath $pfxpath
+ $cert = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new($pfxpath, $password)
+ }
+
+ # base 64 encode the private key and public key
+ $base64pfx = [System.Convert]::ToBase64String($pfx_cert)
+ $base64cer = [System.Convert]::ToBase64String($cer_cert)
+
+ # getting id for the keyCredential object
+ $guid1 = New-Guid
+ $guid2 = New-Guid
+
+ # get the custom key identifier from the certificate thumbprint:
+ $hasher = [System.Security.Cryptography.HashAlgorithm]::Create('sha256')
+ $hash = $hasher.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($cert.Thumbprint))
+ $customKeyIdentifier = [System.Convert]::ToBase64String($hash)
+
+ # get end date and start date for our keycredentials
+ $endDateTime = ($cert.NotAfter).ToUniversalTime().ToString( "yyyy-MM-ddTHH:mm:ssZ" )
+ $startDateTime = ($cert.NotBefore).ToUniversalTime().ToString( "yyyy-MM-ddTHH:mm:ssZ" )
+
+ # building our json payload
+ $object = [ordered]@{
+ keyCredentials = @(
+ [ordered]@{
+ customKeyIdentifier = $customKeyIdentifier
+ endDateTime = $endDateTime
+ keyId = $guid1
+ startDateTime = $startDateTime
+ type = "X509CertAndPassword"
+ usage = "Sign"
+ key = $base64pfx
+ displayName = "CN=fourthcoffeetest"
+ },
+ [ordered]@{
+ customKeyIdentifier = $customKeyIdentifier
+ endDateTime = $endDateTime
+ keyId = $guid2
+ startDateTime = $startDateTime
+ type = "AsymmetricX509Cert"
+ usage = "Verify"
+ key = $base64cer
+ displayName = "CN=fourthcoffeetest"
+ }
+ )
+ passwordCredentials = @(
+ [ordered]@{
+ customKeyIdentifier = $customKeyIdentifier
+ keyId = $guid1
+ endDateTime = $endDateTime
+ startDateTime = $startDateTime
+ secretText = $password
+ }
+ )
+ }
+
+ $json = $object | ConvertTo-Json -Depth 99
+ Write-Host "JSON Payload:"
+ Write-Output $json
+
+ # Request Header
+ $Header = @{}
+ $Header.Add("Authorization","Bearer $($GraphAccessToken)")
+ $Header.Add("Content-Type","application/json")
+
+ try
+ {
+ Invoke-RestMethod -Uri $graphuri -Method "PATCH" -Headers $Header -Body $json
+ }
+ catch
+ {
+ # Dig into the exception to get the Response details.
+ # Note that value__ is not a typo.
+ Write-Host "StatusCode:" $_.Exception.Response.StatusCode.value__
+ Write-Host "StatusDescription:" $_.Exception.Response.StatusDescription
+ }
+
+ Write-Host "Complete Request"
+}
+else
+{
+ Write-Host "Fail to get Access Token"
+}
+```
+#### Validate token signing key
Apps that have claims mapping enabled must validate their token signing keys by appending `appid={client_id}` to their [OpenID Connect metadata requests](v2-protocols-oidc.md#fetch-the-openid-connect-metadata-document). Below is the format of the OpenID Connect metadata document you should use:

```
https://login.microsoftonline.com/{tenant}/v2.0/.well-known/openid-configuration
### Update the application manifest
-Alternatively, you can set the `acceptMappedClaims` property to `true` in the [application manifest](reference-app-manifest.md). As documented on the [apiApplication resource type](/graph/api/resources/apiapplication#properties), this allows an application to use claims mapping without specifying a custom signing key.
+For single tenant apps, you can set the `acceptMappedClaims` property to `true` in the [application manifest](reference-app-manifest.md). As documented on the [apiApplication resource type](/graph/api/resources/apiapplication#properties), this allows an application to use claims mapping without specifying a custom signing key.
+
+> [!WARNING]
+> Do not set the `acceptMappedClaims` property to `true` for multi-tenant apps, because doing so can allow malicious actors to create claims-mapping policies for your app.
This does require the requested token audience to use a verified domain name of your Azure AD tenant, which means you should set the `Application ID URI` (represented by `identifierUris` in the application manifest) to, for example, `https://contoso.com/my-api` or (using the default tenant name) `https://contoso.onmicrosoft.com/my-api`.
active-directory Howto Create Service Principal Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/howto-create-service-principal-portal.md
Previously updated : 06/26/2020 Last updated : 06/16/2021 -+ # How to: Use the portal to create an Azure AD application and service principal that can access resources
You can set the scope at the level of the subscription, resource group, or resou
If you don't see the subscription you're looking for, select **global subscriptions filter**. Make sure the subscription you want is selected for the portal. 1. Select **Access control (IAM)**.
-1. Select **Add role assignment**.
-1. Select the role you wish to assign to the application. For example, to allow the application to execute actions like **reboot**, **start** and **stop** instances, select the **Contributor** role. Read more about the [available roles](../../role-based-access-control/built-in-roles.md) By default, Azure AD applications aren't displayed in the available options. To find your application, search for the name and select it.
+1. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
+1. Select the role you wish to assign to the application. For example, to allow the application to execute actions like **reboot**, **start**, and **stop** instances, select the **Contributor** role. Read more about the [available roles](../../role-based-access-control/built-in-roles.md). By default, Azure AD applications aren't displayed in the available options. To find your application, search for the name and select it.
- ![Select the role to assign to the application](./media/howto-create-service-principal-portal/select-role.png)
-
-1. Select **Save** to finish assigning the role. You see your application in the list of users with a role for that scope.
+ Assign the Contributor role to the application at the subscription scope. For detailed steps, see [Assign Azure roles using the Azure portal](../../role-based-access-control/role-assignments-portal.md).
Your service principal is set up. You can start using it to run your scripts or apps. To manage your service principal (review permissions, see which users have consented, view sign-in information, and more), go to **Enterprise applications**.
Keep in mind, you might need to configure additional permissions on resources th
* Learn how to [use Azure PowerShell to create a service principal](howto-authenticate-service-principal-powershell.md). * To learn about specifying security policies, see [Azure role-based access control (Azure RBAC)](../../role-based-access-control/role-assignments-portal.md). * For a list of available actions that can be granted or denied to users, see [Azure Resource Manager Resource Provider operations](../../role-based-access-control/resource-provider-operations.md).
-* For information about working with app registrations by using **Microsoft Graph**, see the [Applications](/graph/api/resources/application) API reference.
+* For information about working with app registrations by using **Microsoft Graph**, see the [Applications](/graph/api/resources/application) API reference.
active-directory V2 Oauth2 Auth Code Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/v2-oauth2-auth-code-flow.md
A successful token response will look like:
| Parameter | Description | |||
-| `access_token` | The requested access token. The app can use this token to authenticate to the secured resource, such as a web API. |
+| `access_token` | The requested access token. The app can use this token to authenticate to the secured resource, such as a web API. |
| `token_type` | Indicates the token type value. The only type that Azure AD supports is Bearer. | | `expires_in` | How long the access token is valid (in seconds). | | `scope` | The scopes that the access_token is valid for. Optional - this is non-standard, and if omitted the token will be for the scopes requested on the initial leg of the flow. |
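As a minimal client-side sketch (not an official sample), the fields in the table above can be consumed like this; the token value is a placeholder:

```python
import json
import time

# Hypothetical token response body, shaped like the table above.
raw = json.dumps({
    "access_token": "eyJ0eXAi...",  # opaque to the client; sent as a Bearer credential
    "token_type": "Bearer",
    "expires_in": 3599,             # lifetime in seconds, relative to now
    "scope": "https://graph.microsoft.com/mail.read",
})

def parse_token_response(body: str, now: float) -> dict:
    """Extract what a client needs, turning the relative expires_in
    into an absolute expiry timestamp."""
    t = json.loads(body)
    return {
        "authorization_header": f"{t['token_type']} {t['access_token']}",
        "expires_at": now + t["expires_in"],
        # scope is optional and non-standard; fall back to empty if omitted
        "scopes": t.get("scope", "").split(),
    }

creds = parse_token_response(raw, now=time.time())
```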
POST /{tenant}/oauth2/v2.0/token HTTP/1.1
Host: https://login.microsoftonline.com Content-Type: application/x-www-form-urlencoded
-client_id=6731de76-14a6-49ae-97bc-6eba6914391e
+client_id=535fb089-9ff3-47b6-9bfb-4f1264799865
&scope=https%3A%2F%2Fgraph.microsoft.com%2Fmail.read &refresh_token=OAAABAAAAiL9Kn2Z27UubvWFPbm0gLWQJVzCTE9UkP3pSx1aXxUjq... &grant_type=refresh_token
-&client_secret=JqQX2PNo9bpM0uEihUPzyrh // NOTE: Only required for web apps. This secret needs to be URL-Encoded
+&client_secret=sampleCredentia1s // NOTE: Only required for web apps. This secret needs to be URL-Encoded
``` > [!TIP]
active-directory V2 Oauth2 Client Creds Grant Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/v2-oauth2-client-creds-grant-flow.md
Content-Type: application/x-www-form-urlencoded
client_id=535fb089-9ff3-47b6-9bfb-4f1264799865 &scope=https%3A%2F%2Fgraph.microsoft.com%2F.default
-&client_secret=5ampl3Cr3dentia1s
+&client_secret=sampleCredentia1s
&grant_type=client_credentials ```
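The client credentials request above can be sketched the same way — the `.default` scope asks for every application permission already granted to the app; the client ID and secret are placeholders:

```python
import urllib.parse

def client_credentials_body(client_id: str, client_secret: str) -> str:
    """Form body for the client credentials grant against the v2.0
    token endpoint."""
    return urllib.parse.urlencode({
        "client_id": client_id,
        "scope": "https://graph.microsoft.com/.default",
        "client_secret": client_secret,
        "grant_type": "client_credentials",
    })

body = client_credentials_body("535fb089-9ff3-47b6-9bfb-4f1264799865",
                               "sampleCredentia1s")
```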
active-directory V2 Oauth2 On Behalf Of Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/v2-oauth2-on-behalf-of-flow.md
Host: login.microsoftonline.com/<tenant>
Content-Type: application/x-www-form-urlencoded grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer
-&client_id=2846f71b-a7a4-4987-bab3-760035b2f389
-&client_secret=BYyVnAt56JpLwUcyo47XODd
+client_id=535fb089-9ff3-47b6-9bfb-4f1264799865
+&client_secret=sampleCredentia1s
&assertion=eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsImtpZCI6InowMzl6ZHNGdWl6cEJmQlZLMVRuMjVRSFlPMCJ9.eyJhdWQiOiIyO{a lot of characters here} &scope=https://graph.microsoft.com/user.read+offline_access &requested_token_use=on_behalf_of
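As a non-authoritative sketch of the on-behalf-of request above: the middle-tier API passes the access token it received as the `assertion` and exchanges it for a downstream token; every value below is a placeholder:

```python
import urllib.parse

def obo_body(client_id: str, client_secret: str,
             incoming_access_token: str) -> str:
    """Form body for the on-behalf-of (jwt-bearer) grant. The incoming
    access token becomes the assertion; requested_token_use signals OBO."""
    return urllib.parse.urlencode({
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "client_id": client_id,
        "client_secret": client_secret,
        "assertion": incoming_access_token,
        "scope": "https://graph.microsoft.com/user.read offline_access",
        "requested_token_use": "on_behalf_of",
    })

body = obo_body("535fb089-9ff3-47b6-9bfb-4f1264799865",
                "sampleCredentia1s", "eyJ0eXAi")
```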
active-directory Concept Primary Refresh Token https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/devices/concept-primary-refresh-token.md
The PRT is issued during user authentication on a Windows 10 device in two scena
* **Azure AD joined** or **Hybrid Azure AD joined**: A PRT is issued during Windows logon when a user signs in with their organization credentials. A PRT is issued with all Windows 10 supported credentials, for example, password and Windows Hello for Business. In this scenario, Azure AD CloudAP plugin is the primary authority for the PRT. * **Azure AD registered device**: A PRT is issued when a user adds a secondary work account to their Windows 10 device. Users can add an account to Windows 10 in two different ways -
- * Adding an account via the **Use this account everywhere on this device** prompt after signing in to an app (for example, Outlook)
+ * Adding an account via the **Allow my organization to manage my device** prompt after signing in to an app (for example, Outlook)
* Adding an account from **Settings** > **Accounts** > **Access Work or School** > **Connect** In Azure AD registered device scenarios, the Azure AD WAM plugin is the primary authority for the PRT since Windows logon is not happening with this Azure AD account.
active-directory Customize Branding https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/customize-branding.md
Use your organization's logo and custom color schemes to provide a consistent look-and-feel on your Azure Active Directory (Azure AD) sign-in pages. Your sign-in pages appear when users sign in to your organization's web-based apps, such as Microsoft 365, which uses Azure AD as your identity provider. >[!NOTE]
->Adding custom branding requires you to have Azure Active Directory Premium 1 or Premium 2 licenses. For more information about licensing and editions, see [Sign up for Azure AD Premium](active-directory-get-started-premium.md).<br><br>Azure AD Premium editions are available for customers in China using the worldwide instance of Azure Active Directory. Azure AD Premium editions aren't currently supported in the Azure service operated by 21Vianet in China. For more information, talk to us using the [Azure Active Directory Forum](https://feedback.azure.com/forums/169401-azure-active-directory/).
+>Adding custom branding requires you to have either Azure Active Directory Premium 1, Premium 2, or Office 365 (for Office 365 apps) licenses. For more information about licensing and editions, see [Sign up for Azure AD Premium](active-directory-get-started-premium.md).<br><br>Azure AD Premium editions are available for customers in China using the worldwide instance of Azure Active Directory. Azure AD Premium editions aren't currently supported in the Azure service operated by 21Vianet in China. For more information, talk to us using the [Azure Active Directory Forum](https://feedback.azure.com/forums/169401-azure-active-directory/).
## Customize your Azure AD sign-in page You can customize your Azure AD sign-in pages, which appear when users sign in to your organization's tenant-specific apps, such as `https://outlook.com/contoso.com`, or when passing a domain variable, such as `https://passwordreset.microsoftonline.com/?whr=contoso.com`.
active-directory Users Default Permissions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/users-default-permissions.md
Users can register application | Setting this option to No prevents users from c
Allow users to connect work or school account with LinkedIn | Setting this option to No prevents users from connecting their work or school account with their LinkedIn account. For more information, see [LinkedIn account connections data sharing and consent](../enterprise-users/linkedin-user-consent.md). Ability to create security groups | Setting this option to No prevents users from creating security groups. Global administrators and User administrators can still create security groups. See [Azure Active Directory cmdlets for configuring group settings](../enterprise-users/groups-settings-cmdlets.md) to learn how. Ability to create Microsoft 365 groups | Setting this option to No prevents users from creating Microsoft 365 groups. Setting this option to Some allows a select set of users to create Microsoft 365 groups. Global administrators and User administrators will still be able to create Microsoft 365 groups. See [Azure Active Directory cmdlets for configuring group settings](../enterprise-users/groups-settings-cmdlets.md) to learn how.
-Restrict access to Azure AD administration portal | Setting this option to No lets non-administrators use the Azure AD administration portal to read and manage Azure AD resources. Yes restricts all non-administrators from accessing any Azure AD data in the administration portal.<p>**Note**: this setting does not restrict access to Azure AD data using PowerShell or other clients such as Visual Studio.When set to Yes, to grant a specific non-admin user the ability to use the Azure AD administration portal assign any administrative role such as the Directory Readers role.<p>This role allows reading basic directory information, which member users have by default (guests and service principals do not).
+Restrict access to Azure AD administration portal | <p>Setting this option to No lets non-administrators use the Azure AD administration portal to read and manage Azure AD resources. Yes restricts all non-administrators from accessing any Azure AD data in the administration portal.</p><p>**Note**: This setting does not restrict access to Azure AD data using PowerShell or other clients such as Visual Studio. When set to Yes, to grant a specific non-admin user the ability to use the Azure AD administration portal, assign any administrative role, such as the Directory Readers role.</p><p>**Note**: This setting blocks non-admin users who are owners of groups or applications from using the Azure portal to manage their owned resources.</p><p>The Directory Readers role allows reading basic directory information, which member users have by default (guests and service principals do not).</p>
Ability to read other users | This setting is available in PowerShell only. Setting this flag to $false prevents all non-admins from reading user information from the directory. This flag does not prevent reading user information in other Microsoft services like Exchange Online. This setting is meant for special circumstances, and setting this flag to $false is not recommended. ## Restrict guest users default permissions
active-directory Application Sign In Unexpected User Consent Prompt https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/application-sign-in-unexpected-user-consent-prompt.md
This results in a consent prompt being shown the first time an application is us
Additional prompts can be expected in various scenarios:
+* The application has been configured to require assignment. User consent is not currently supported for apps that require assignment. If you configure an application to require assignment, be sure to also grant tenant-wide admin consent so that assigned users can sign in.
+ * The set of permissions required by the application has changed.
-* The user who originally consented to the application was not an administrator, and now a different (non-admin) User is using the application for the first time.
+* The user who originally consented to the application was not an administrator, and now a different (non-admin) user is using the application for the first time.
* The user who originally consented to the application was an administrator, but they did not consent on-behalf of the entire organization.
Additional prompts can be expected in various scenarios:
* The developer has configured the application to require a consent prompt every time it is used (note: this is not best practice).
+ > [!NOTE]
+ > Following Microsoft's recommendations and best practices, many organizations have disabled or limited users' permission to grant consent to apps. If an application forces users to grant consent every time they sign in, most users will be blocked from using these applications even if an administrator grants tenant-wide admin consent. If you encounter an application that requires user consent even after admin consent has been granted, check with the app publisher to see if they have a setting or option to stop forcing user consent on every sign-in.
++ ## Next steps - [Apps, permissions, and consent in Azure Active Directory (v1.0 endpoint)](../develop/quickstart-register-app.md)
active-directory How Manage User Assigned Managed Identities https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/how-manage-user-assigned-managed-identities.md
Title: Manage user assigned managed identities - Azure AD
-description: Create user assigned managed identities
+ Title: Manage user-assigned managed identities - Azure AD
+description: Create user-assigned managed identities.
zone_pivot_groups: identity-mi-methods
# Manage user-assigned managed identities
-Managed identities for Azure resources eliminate the need to manage credentials in code. They allow you to get an Azure active directory token your applications can use when accessing resources that support Azure Active Directory authentication. Azure manages the identity so you don't have to. There are two types of managed identities – system-assigned and user-assigned. The main difference between the two types is that system assigned managed identities have their lifecycle linked to the resource where they are used. User assigned managed identities may be used on multiple resources. You can learn more about managed identities in the managed identities [overview](overview.md).
+Managed identities for Azure resources eliminate the need to manage credentials in code. You can use them to get an Azure Active Directory (Azure AD) token your applications can use when you access resources that support Azure AD authentication. Azure manages the identity so you don't have to.
+
+There are two types of managed identities: system-assigned and user-assigned. The main difference between them is that system-assigned managed identities have their lifecycle linked to the resource where they're used. User-assigned managed identities can be used on multiple resources. To learn more about managed identities, see [What are managed identities for Azure resources?](overview.md).
::: zone pivot="identity-mi-methods-azp"
-In this article, you learn how to create, list, delete, or assign a role to a user-assigned managed identity using the Azure portal
+In this article, you learn how to create, list, delete, or assign a role to a user-assigned managed identity by using the Azure portal.
## Prerequisites -- If you're unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md). **Be sure to review the [difference between a system-assigned and user-assigned managed identity](overview.md#managed-identity-types)**.-- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/) before continuing.
+- If you're unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md). *Be sure to review the [difference between a system-assigned and user-assigned managed identity](overview.md#managed-identity-types)*.
+- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/) before you continue.
## Create a user-assigned managed identity To create a user-assigned managed identity, your account needs the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-1. Sign in to the [Azure portal](https://portal.azure.com) using an account associated with the Azure subscription to create the user-assigned managed identity.
-2. In the search box, type *Managed Identities*, and under **Services**, click **Managed Identities**.
-3. Click **Add** and enter values in the following fields under **Create user assigned managed** identity pane:
+1. Sign in to the [Azure portal](https://portal.azure.com) by using an account associated with the Azure subscription to create the user-assigned managed identity.
+1. In the search box, enter **Managed Identities**. Under **Services**, select **Managed Identities**.
+1. Select **Add**, and enter values in the following boxes in the **Create User Assigned Managed Identity** pane:
- **Subscription**: Choose the subscription to create the user-assigned managed identity under.
- - **Resource group**: Choose a resource group to create the user-assigned managed identity in or click **Create new** to create a new resource group.
- - **Region**: Choose a region to deploy the user-assigned managed identity, for example **West US**.
- - **Name**: This is the name for your user-assigned managed identity, for example UAI1.
- ![Create a user-assigned managed identity](media/how-to-manage-ua-identity-portal/create-user-assigned-managed-identity-portal.png)
-4. Click **Review + create** to review the changes.
-5. Click **Create**.
+ - **Resource group**: Choose a resource group to create the user-assigned managed identity in, or select **Create new** to create a new resource group.
+ - **Region**: Choose a region to deploy the user-assigned managed identity, for example, **West US**.
+ - **Name**: Enter the name for your user-assigned managed identity, for example, UAI1.
+
+ ![Screenshot that shows the Create User Assigned Managed Identity pane.](media/how-to-manage-ua-identity-portal/create-user-assigned-managed-identity-portal.png)
+1. Select **Review + create** to review the changes.
+1. Select **Create**.
## List user-assigned managed identities
-To list/read a user-assigned managed identity, your account needs the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) or [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
+To list or read a user-assigned managed identity, your account needs the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) or [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-1. Sign in to the [Azure portal](https://portal.azure.com) using an account associated with the Azure subscription to list the user-assigned managed identities.
-2. In the search box, type *Managed Identities*, and under Services, click **Managed Identities**.
-3. A list of the user-assigned managed identities for your subscription is returned. To see the details of a user-assigned managed identity click its name.
+1. Sign in to the [Azure portal](https://portal.azure.com) by using an account associated with the Azure subscription to list the user-assigned managed identities.
+1. In the search box, enter **Managed Identities**. Under **Services**, select **Managed Identities**.
+1. A list of the user-assigned managed identities for your subscription is returned. To see the details of a user-assigned managed identity, select its name.
-![List user-assigned managed identity](media/how-to-manage-ua-identity-portal/list-user-assigned-managed-identity-portal.png)
+ ![Screenshot that shows the list of user-assigned managed identity.](media/how-to-manage-ua-identity-portal/list-user-assigned-managed-identity-portal.png)
## Delete a user-assigned managed identity To delete a user-assigned managed identity, your account needs the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-Deleting a user assigned identity does not remove it from the VM or resource it was assigned to. To remove the user assigned identity from a VM see, [Remove a user-assigned managed identity from a VM](qs-configure-portal-windows-vm.md#remove-a-user-assigned-managed-identity-from-a-vm).
+Deleting a user-assigned identity doesn't remove it from the VM or resource it was assigned to. To remove the user-assigned identity from a VM, see [Remove a user-assigned managed identity from a VM](qs-configure-portal-windows-vm.md#remove-a-user-assigned-managed-identity-from-a-vm).
-1. Sign in to the [Azure portal](https://portal.azure.com) using an account associated with the Azure subscription to delete a user-assigned managed identity.
-2. Select the user-assigned managed identity and click **Delete**.
-3. Under the confirmation box choose, **Yes**.
+1. Sign in to the [Azure portal](https://portal.azure.com) by using an account associated with the Azure subscription to delete a user-assigned managed identity.
+1. Select the user-assigned managed identity, and select **Delete**.
+1. Under the confirmation box, select **Yes**.
-![Delete user-assigned managed identity](media/how-to-manage-ua-identity-portal/delete-user-assigned-managed-identity-portal.png)
+ ![Screenshot that shows the Delete user-assigned managed identities.](media/how-to-manage-ua-identity-portal/delete-user-assigned-managed-identity-portal.png)
## Assign a role to a user-assigned managed identity To assign a role to a user-assigned managed identity, your account needs the [User Access Administrator](../../role-based-access-control/built-in-roles.md#user-access-administrator) role assignment.
-1. Sign in to the [Azure portal](https://portal.azure.com) using an account associated with the Azure subscription to list the user-assigned managed identities.
-2. In the search box, type *Managed Identities*, and under Services, click **Managed Identities**.
-3. A list of the user-assigned managed identities for your subscription is returned. Select the user-assigned managed identity that you want to assign a role.
-4. Select **Access control (IAM)**, and then select **Add role assignment**.
+1. Sign in to the [Azure portal](https://portal.azure.com) by using an account associated with the Azure subscription to list the user-assigned managed identities.
+1. In the search box, enter **Managed Identities**. Under **Services**, select **Managed Identities**.
+1. A list of the user-assigned managed identities for your subscription is returned. Select the user-assigned managed identity that you want to assign a role.
+1. Select **Access control (IAM)**, and then select **Add role assignment**.
- ![User-assigned managed identity start](media/how-to-manage-ua-identity-portal/assign-role-screenshot1.png)
+ ![Screenshot that shows the user-assigned managed identity start.](media/how-to-manage-ua-identity-portal/assign-role-screenshot1.png)
-5. In the Add role assignment blade, configure the following values, and then click **Save**:
- - **Role** - the role to assign
- - **Assign access to** - the resource to assign the user-assigned managed identity
- - **Select** - the member to assign access
+1. In the **Add role assignment** pane, configure the following values, and then select **Save**:
+ - **Role**: The role to assign.
+ - **Assign access to**: The resource to assign the user-assigned managed identity.
+ - **Select**: The member to assign access.
- ![User-assigned managed identity IAM](media/how-to-manage-ua-identity-portal/assign-role-screenshot2.png)
+ ![Screenshot that shows the user-assigned managed identity IAM.](media/how-to-manage-ua-identity-portal/assign-role-screenshot2.png)
To assign a role to a user-assigned managed identity, your account needs the [Us
::: zone pivot="identity-mi-methods-azcli"
-In this article, you learn how to create, list, delete, or assign a role to a user-assigned managed identity using the Azure CLI.
+In this article, you learn how to create, list, delete, or assign a role to a user-assigned managed identity by using the Azure CLI.
## Prerequisites [!INCLUDE [azure-cli-prepare-your-environment-no-header.md](../../../includes/azure-cli-prepare-your-environment-no-header.md)]
-> [!IMPORTANT]
-> To modify user permissions when using an app service principal using CLI you must provide the service principal additional permissions in Azure AD Graph API as portions of CLI perform GET requests against the Graph API. Otherwise, you may end up receiving a 'Insufficient privileges to complete the operation' message. To do this you will need to go into the App registration in Azure Active Directory, select your app, click on API permissions, scroll down and select Azure Active Directory Graph. From there select Application permissions, and then add the appropriate permissions.
+> [!IMPORTANT]
+> To modify user permissions when you use an app service principal by using the CLI, you must provide the service principal more permissions in the Azure Active Directory Graph API because portions of the CLI perform GET requests against the Graph API. Otherwise, you might end up receiving an "Insufficient privileges to complete the operation" message. To do this step, go into the **App registration** in Azure AD, select your app, select **API permissions**, and scroll down and select **Azure Active Directory Graph**. From there, select **Application permissions**, and then add the appropriate permissions.
## Create a user-assigned managed identity To create a user-assigned managed identity, your account needs the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-Use the [az identity create](/cli/azure/identity#az_identity_create) command to create a user-assigned managed identity. The `-g` parameter specifies the resource group where to create the user-assigned managed identity, and the `-n` parameter specifies its name. Replace the `<RESOURCE GROUP>` and `<USER ASSIGNED IDENTITY NAME>` parameter values with your own values:
+Use the [az identity create](/cli/azure/identity#az_identity_create) command to create a user-assigned managed identity. The `-g` parameter specifies the resource group where to create the user-assigned managed identity. The `-n` parameter specifies its name. Replace the `<RESOURCE GROUP>` and `<USER ASSIGNED IDENTITY NAME>` parameter values with your own values.
[!INCLUDE [ua-character-limit](~/includes/managed-identity-ua-character-limits.md)]
az identity create -g <RESOURCE GROUP> -n <USER ASSIGNED IDENTITY NAME>
``` ## List user-assigned managed identities
-To list/read a user-assigned managed identity, your account needs the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) or [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
+To list or read a user-assigned managed identity, your account needs the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) or [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-To list user-assigned managed identities, use the [az identity list](/cli/azure/identity#az_identity_list) command. Replace the `<RESOURCE GROUP>` with your own value:
+To list user-assigned managed identities, use the [az identity list](/cli/azure/identity#az_identity_list) command. Replace the `<RESOURCE GROUP>` value with your own value.
```azurecli-interactive az identity list -g <RESOURCE GROUP> ```
-In the json response, user-assigned managed identities have `"Microsoft.ManagedIdentity/userAssignedIdentities"` value returned for key, `type`.
+In the JSON response, user-assigned managed identities have the `"Microsoft.ManagedIdentity/userAssignedIdentities"` value returned for the key `type`.
`"type": "Microsoft.ManagedIdentity/userAssignedIdentities"`
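A small sketch (assuming a trimmed, hypothetical `az identity list` response) of filtering the JSON output on that `type` key:

```python
import json

# Trimmed, hypothetical output from `az identity list`.
response = json.loads("""
[
  {"name": "UAI1", "type": "Microsoft.ManagedIdentity/userAssignedIdentities"},
  {"name": "vm1-identity", "type": "Microsoft.Compute/virtualMachines"}
]
""")

# Keep only the user-assigned managed identities.
user_assigned = [
    item["name"] for item in response
    if item["type"] == "Microsoft.ManagedIdentity/userAssignedIdentities"
]
```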
In the json response, user-assigned managed identities have `"Microsoft.ManagedI
To delete a user-assigned managed identity, your account needs the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-To delete a user-assigned managed identity, use the [az identity delete](/cli/azure/identity#az_identity_delete) command. The -n parameter specifies its name and the -g parameter specifies the resource group where the user-assigned managed identity was created. Replace the `<USER ASSIGNED IDENTITY NAME>` and `<RESOURCE GROUP>` parameters values with your own values:
+To delete a user-assigned managed identity, use the [az identity delete](/cli/azure/identity#az_identity_delete) command. The -n parameter specifies its name. The -g parameter specifies the resource group where the user-assigned managed identity was created. Replace the `<USER ASSIGNED IDENTITY NAME>` and `<RESOURCE GROUP>` parameter values with your own values.
```azurecli-interactive
az identity delete -n <USER ASSIGNED IDENTITY NAME> -g <RESOURCE GROUP>
```

> [!NOTE]
-> Deleting a user-assigned managed identity will not remove the reference, from any resource it was assigned to. Please remove those from VM/VMSS using the `az vm/vmss identity remove` command
+> Deleting a user-assigned managed identity won't remove the reference from any resource it was assigned to. Remove those from a VM or virtual machine scale set by using the `az vm/vmss identity remove` command.
## Next steps

For a full list of Azure CLI identity commands, see [az identity](/cli/azure/identity).
-For information on how to assign a user-assigned managed identity to an Azure VM see, [Configure managed identities for Azure resources on an Azure VM using Azure CLI](qs-configure-cli-windows-vm.md#user-assigned-managed-identity)
+For information on how to assign a user-assigned managed identity to an Azure VM, see [Configure managed identities for Azure resources on an Azure VM using Azure CLI](qs-configure-cli-windows-vm.md#user-assigned-managed-identity).
::: zone-end

::: zone pivot="identity-mi-methods-powershell"
-In this article, you learn how to create, list, and delete a user-assigned managed identity using PowerShell.
+In this article, you learn how to create, list, and delete a user-assigned managed identity by using PowerShell.
## Prerequisites -- If you're unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md). **Be sure to review the [difference between a system-assigned and user-assigned managed identity](overview.md#managed-identity-types)**.-- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/) before continuing.
+- If you're unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md). *Be sure to review the [difference between a system-assigned and user-assigned managed identity](overview.md#managed-identity-types)*.
+- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/) before you continue.
- To run the example scripts, you have two options:
- - Use the [Azure Cloud Shell](../../cloud-shell/overview.md), which you can open using the **Try It** button on the top right corner of code blocks.
+ - Use [Azure Cloud Shell](../../cloud-shell/overview.md), which you can open by using the **Try It** button in the upper-right corner of code blocks.
- Run scripts locally with Azure PowerShell, as described in the next section.

### Configure Azure PowerShell locally
-To use Azure PowerShell locally for this article (rather than using Cloud Shell), complete the following steps:
+To use Azure PowerShell locally for this article instead of using Cloud Shell:
1. Install [the latest version of Azure PowerShell](/powershell/azure/install-az-ps) if you haven't already.
-1. Sign in to Azure:
+1. Sign in to Azure.
```azurepowershell
Connect-AzAccount
```
To use Azure PowerShell locally for this article (rather than using Cloud Shell)
```azurepowershell
Install-Module -Name PowerShellGet -AllowPrerelease
```
- You may need to `Exit` out of the current PowerShell session after you run this command for the next step.
+ You might need to `Exit` out of the current PowerShell session after you run this command for the next step.
-1. Install the prerelease version of the `Az.ManagedServiceIdentity` module to perform the user-assigned managed identity operations in this article:
+1. Install the prerelease version of the `Az.ManagedServiceIdentity` module to perform the user-assigned managed identity operations in this article.
```azurepowershell
Install-Module -Name Az.ManagedServiceIdentity -AllowPrerelease
```
To use Azure PowerShell locally for this article (rather than using Cloud Shell)
To create a user-assigned managed identity, your account needs the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-To create a user-assigned managed identity, use the `New-AzUserAssignedIdentity` command. The `ResourceGroupName` parameter specifies the resource group where to create the user-assigned managed identity, and the `-Name` parameter specifies its name. Replace the `<RESOURCE GROUP>` and `<USER ASSIGNED IDENTITY NAME>` parameter values with your own values:
+To create a user-assigned managed identity, use the `New-AzUserAssignedIdentity` command. The `ResourceGroupName` parameter specifies the resource group where to create the user-assigned managed identity. The `-Name` parameter specifies its name. Replace the `<RESOURCE GROUP>` and `<USER ASSIGNED IDENTITY NAME>` parameter values with your own values.
[!INCLUDE [ua-character-limit](~/includes/managed-identity-ua-character-limits.md)]
New-AzUserAssignedIdentity -ResourceGroupName <RESOURCEGROUP> -Name <USER ASSIGN
## List user-assigned managed identities
-To list/read a user-assigned managed identity, your account needs the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) or [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
+To list or read a user-assigned managed identity, your account needs the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) or [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-To list user-assigned managed identities, use the [Get-AzUserAssigned] command. The `-ResourceGroupName` parameter specifies the resource group where the user-assigned managed identity was created. Replace the `<RESOURCE GROUP>` with your own value:
+To list user-assigned managed identities, use the `Get-AzUserAssignedIdentity` command. The `-ResourceGroupName` parameter specifies the resource group where the user-assigned managed identity was created. Replace the `<RESOURCE GROUP>` value with your own value.
```azurepowershell-interactive
Get-AzUserAssignedIdentity -ResourceGroupName <RESOURCE GROUP>
```
-In the response, user-assigned managed identities have `"Microsoft.ManagedIdentity/userAssignedIdentities"` value returned for key, `Type`.
+In the response, user-assigned managed identities have the `"Microsoft.ManagedIdentity/userAssignedIdentities"` value returned for the key `Type`.
`Type :Microsoft.ManagedIdentity/userAssignedIdentities`
In the response, user-assigned managed identities have `"Microsoft.ManagedIdenti
To delete a user-assigned managed identity, your account needs the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-To delete a user-assigned managed identity, use the `Remove-AzUserAssignedIdentity` command. The `-ResourceGroupName` parameter specifies the resource group where the user-assigned identity was created and the `-Name` parameter specifies its name. Replace the `<RESOURCE GROUP>` and the `<USER ASSIGNED IDENTITY NAME>` parameters values with your own values:
+To delete a user-assigned managed identity, use the `Remove-AzUserAssignedIdentity` command. The `-ResourceGroupName` parameter specifies the resource group where the user-assigned identity was created. The `-Name` parameter specifies its name. Replace the `<RESOURCE GROUP>` and the `<USER ASSIGNED IDENTITY NAME>` parameter values with your own values.
```azurepowershell-interactive
Remove-AzUserAssignedIdentity -ResourceGroupName <RESOURCE GROUP> -Name <USER ASSIGNED IDENTITY NAME>
```

> [!NOTE]
-> Deleting a user-assigned managed identity will not remove the reference, from any resource it was assigned to. Identity assignments need to be removed separately.
+> Deleting a user-assigned managed identity won't remove the reference from any resource it was assigned to. Identity assignments must be removed separately.
## Next steps
For a full list and more details of the Azure PowerShell managed identities for
::: zone pivot="identity-mi-methods-arm"
-In this article, you create a user-assigned managed identity using an Azure Resource Manager.
+In this article, you create a user-assigned managed identity by using Azure Resource Manager.
-It is not possible to list and delete a user-assigned managed identity using an Azure Resource Manager template. See the following articles to create and list a user-assigned managed identity:
+You can't list or delete a user-assigned managed identity by using a Resource Manager template. See the following articles to list and delete a user-assigned managed identity:
- [List user-assigned managed identity](how-to-manage-ua-identity-cli.md#list-user-assigned-managed-identities) - [Delete user-assigned managed identity](how-to-manage-ua-identity-cli.md#delete-a-user-assigned-managed-identity) ## Prerequisites -- If you are unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md). **Be sure to review the [difference between a system-assigned and user-assigned managed identity](overview.md#managed-identity-types)**.-- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/) before continuing.
+- If you're unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md). *Be sure to review the [difference between a system-assigned and user-assigned managed identity](overview.md#managed-identity-types)*.
+- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/) before you continue.
## Template creation and editing
-As with the Azure portal and scripting, Azure Resource Manager templates provide the ability to deploy new or modified resources defined by an Azure resource group. Several options are available for template editing and deployment, both local and portal-based, including:
+As with the Azure portal and scripting, Resource Manager templates provide the ability to deploy new or modified resources defined by an Azure resource group. Several options are available for template editing and deployment, both local and portal-based. You can:
-- Using a [custom template from the Azure Marketplace](../../azure-resource-manager/templates/deploy-portal.md#deploy-resources-from-custom-template), which allows you to create a template from scratch, or base it on an existing common or [quickstart template](https://azure.microsoft.com/documentation/templates/).-- Deriving from an existing resource group, by exporting a template from either [the original deployment](../../azure-resource-manager/management/manage-resource-groups-portal.md#export-resource-groups-to-templates), or from the [current state of the deployment](../../azure-resource-manager/management/manage-resource-groups-portal.md#export-resource-groups-to-templates).-- Using a local [JSON editor (such as VS Code)](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md), and then uploading and deploying by using PowerShell or CLI.-- Using the Visual Studio [Azure Resource Group project](../../azure-resource-manager/templates/create-visual-studio-deployment-project.md) to both create and deploy a template.
+- Use a [custom template from Azure Marketplace](../../azure-resource-manager/templates/deploy-portal.md#deploy-resources-from-custom-template) to create a template from scratch or base it on an existing common or [quickstart template](https://azure.microsoft.com/resources/templates/).
+- Derive from an existing resource group by exporting a template from either [the original deployment](../../azure-resource-manager/management/manage-resource-groups-portal.md#export-resource-groups-to-templates) or from the [current state of the deployment](../../azure-resource-manager/management/manage-resource-groups-portal.md#export-resource-groups-to-templates).
+- Use a local [JSON editor (such as VS Code)](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md), and then upload and deploy by using PowerShell or the Azure CLI.
+- Use the Visual Studio [Azure Resource Group project](../../azure-resource-manager/templates/create-visual-studio-deployment-project.md) to create and deploy a template.
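As a sketch of what such a template might contain, the fragment below assembles the JSON for a minimal user-assigned identity resource in code. The resource type is the one shown elsewhere in this article; the `apiVersion`, schema URL, and parameter name are illustrative assumptions, not values taken from the article's template:

```python
import json

# Illustrative sketch only: apiVersion, schema, and parameter names
# are assumptions, not taken from this article's template include.
template = {
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "identityName": {"type": "string"}
    },
    "resources": [
        {
            "type": "Microsoft.ManagedIdentity/userAssignedIdentities",
            "apiVersion": "2018-11-30",  # assumed API version
            "name": "[parameters('identityName')]",
            "location": "[resourceGroup().location]"
        }
    ]
}

print(json.dumps(template, indent=2))
```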
## Create a user-assigned managed identity

To create a user-assigned managed identity, your account needs the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
-To create a user-assigned managed identity, use the following template. Replace the `<USER ASSIGNED IDENTITY NAME>` value with your own values:
+To create a user-assigned managed identity, use the following template. Replace the `<USER ASSIGNED IDENTITY NAME>` value with your own value.
[!INCLUDE [ua-character-limit](~/includes/managed-identity-ua-character-limits.md)]
To create a user-assigned managed identity, use the following template. Replace
```

## Next steps
-For information on how to assign a user-assigned managed identity to an Azure VM using an Azure Resource Manager template see, [Configure managed identities for Azure resources on an Azure VM using a templates](qs-configure-template-windows-vm.md).
+For information on how to assign a user-assigned managed identity to an Azure VM by using a Resource Manager template, see [Configure managed identities for Azure resources on an Azure VM using a template](qs-configure-template-windows-vm.md).
For information on how to assign a user-assigned managed identity to an Azure VM
::: zone pivot="identity-mi-methods-rest"
-In this article, you learn how to create, list, and delete a user-assigned managed identity using CURL to make REST API calls.
+In this article, you learn how to create, list, and delete a user-assigned managed identity by using CURL to make REST API calls.
## Prerequisites -- If you're unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md). **Be sure to review the [difference between a system-assigned and user-assigned managed identity](overview.md#managed-identity-types)**.-- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/) before continuing.
+- If you're unfamiliar with managed identities for Azure resources, check out the [overview section](overview.md). *Be sure to review the [difference between a system-assigned and user-assigned managed identity](overview.md#managed-identity-types)*.
+- If you don't already have an Azure account, [sign up for a free account](https://azure.microsoft.com/free/) before you continue.
- You can run all the commands in this article either in the cloud or locally:
- - To run in the cloud, use the [Azure Cloud Shell](../../cloud-shell/overview.md).
+ - To run in the cloud, use [Azure Cloud Shell](../../cloud-shell/overview.md).
- To run locally, install [curl](https://curl.haxx.se/download.html) and the [Azure CLI](/cli/azure/install-azure-cli).

## Obtain a bearer access token
-1. If running locally, sign into Azure through the Azure CLI:
+1. If you're running locally, sign in to Azure through the Azure CLI.
```
az login
```
-1. Obtain an access token using [az account get-access-token](/cli/azure/account#az_account_get_access_token)
+1. Obtain an access token by using [az account get-access-token](/cli/azure/account#az_account_get_access_token).
```azurecli-interactive
az account get-access-token
```
s/<RESOURCE GROUP>/providers/Microsoft.ManagedIdentity/userAssignedIdentities/<U
|Name |Description |
|---|---|
-|location | Required. Resource location. |
+|Location | Required. Resource location. |
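To make the shape of the create request concrete, the sketch below assembles the request URL and body from placeholder values. The subscription ID, resource group, identity name, and location are hypothetical, and the `api-version` is assumed to be the same preview version used by the list call later in this article:

```python
# Placeholder values; substitute your own subscription, group, and name.
subscription_id = "00000000-0000-0000-0000-000000000000"
resource_group = "myResourceGroup"
identity_name = "myUserAssignedIdentity"
api_version = "2015-08-31-preview"  # assumption: same version as the list call

# Build the userAssignedIdentities resource URL for the create request.
url = (
    "https://management.azure.com"
    f"/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    "/providers/Microsoft.ManagedIdentity/userAssignedIdentities"
    f"/{identity_name}?api-version={api_version}"
)

# The request body carries the required resource location.
body = {"location": "eastus"}

print(url)
```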
## List user-assigned managed identities
-To list/read a user-assigned managed identity, your account needs the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) or [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
+To list or read a user-assigned managed identity, your account needs the [Managed Identity Operator](../../role-based-access-control/built-in-roles.md#managed-identity-operator) or [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.
```bash
curl 'https://management.azure.com/subscriptions/<SUBSCRIPTION ID>/resourceGroups/<RESOURCE GROUP>/providers/Microsoft.ManagedIdentity/userAssignedIdentities?api-version=2015-08-31-preview' -H "Authorization: Bearer <ACCESS TOKEN>"
```
GET https://management.azure.com/subscriptions/<SUBSCRIPTION ID>/resourceGroups/
To delete a user-assigned managed identity, your account needs the [Managed Identity Contributor](../../role-based-access-control/built-in-roles.md#managed-identity-contributor) role assignment.

> [!NOTE]
-> Deleting a user-assigned managed identity will not remove the reference from any resource it was assigned to. To remove a user-assigned managed identity from a VM using CURL see [Remove a user-assigned identity from an Azure VM](qs-configure-rest-vm.md#remove-a-user-assigned-managed-identity-from-an-azure-vm).
+> Deleting a user-assigned managed identity won't remove the reference from any resource it was assigned to. To remove a user-assigned managed identity from a VM by using CURL, see [Remove a user-assigned identity from an Azure VM](qs-configure-rest-vm.md#remove-a-user-assigned-managed-identity-from-an-azure-vm).
```bash curl 'https://management.azure.com/subscriptions/<SUBSCRIPTION ID>/resourceGroup
DELETE https://management.azure.com/subscriptions/80c696ff-5efa-4909-a64d-f1b616
## Next steps
-For information on how to assign a user-assigned managed identity to an Azure VM/VMSS using CURL see, [Configure managed identities for Azure resources on an Azure VM using REST API calls](qs-configure-rest-vm.md#user-assigned-managed-identity) and [Configure managed identities for Azure resources on a virtual machine scale set using REST API calls](qs-configure-rest-vmss.md#user-assigned-managed-identity).
+For information on how to assign a user-assigned managed identity to an Azure VM or virtual machine scale set by using CURL, see:
+- [Configure managed identities for Azure resources on an Azure VM using REST API calls](qs-configure-rest-vm.md#user-assigned-managed-identity)
+- [Configure managed identities for Azure resources on a virtual machine scale set using REST API calls](qs-configure-rest-vmss.md#user-assigned-managed-identity)
::: zone-end
active-directory How To Manage Ua Identity Arm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-arm.md
It is not possible to list and delete a user-assigned managed identity using an
As with the Azure portal and scripting, Azure Resource Manager templates provide the ability to deploy new or modified resources defined by an Azure resource group. Several options are available for template editing and deployment, both local and portal-based, including: -- Using a [custom template from the Azure Marketplace](../../azure-resource-manager/templates/deploy-portal.md#deploy-resources-from-custom-template), which allows you to create a template from scratch, or base it on an existing common or [quickstart template](https://azure.microsoft.com/documentation/templates/).
+- Using a [custom template from the Azure Marketplace](../../azure-resource-manager/templates/deploy-portal.md#deploy-resources-from-custom-template), which allows you to create a template from scratch, or base it on an existing common or [quickstart template](https://azure.microsoft.com/resources/templates/).
- Deriving from an existing resource group, by exporting a template from either [the original deployment](../../azure-resource-manager/management/manage-resource-groups-portal.md#export-resource-groups-to-templates), or from the [current state of the deployment](../../azure-resource-manager/management/manage-resource-groups-portal.md#export-resource-groups-to-templates). - Using a local [JSON editor (such as VS Code)](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md), and then uploading and deploying by using PowerShell or CLI. - Using the Visual Studio [Azure Resource Group project](../../azure-resource-manager/templates/create-visual-studio-deployment-project.md) to both create and deploy a template.
active-directory Qs Configure Template Windows Vm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/qs-configure-template-windows-vm.md
In this article, using the Azure Resource Manager deployment template, you learn
As with the Azure portal and scripting, [Azure Resource Manager](../../azure-resource-manager/management/overview.md) templates provide the ability to deploy new or modified resources defined by an Azure resource group. Several options are available for template editing and deployment, both local and portal-based, including:
- - Using a [custom template from the Azure Marketplace](../../azure-resource-manager/templates/deploy-portal.md#deploy-resources-from-custom-template), which allows you to create a template from scratch, or base it on an existing common or [quickstart template](https://azure.microsoft.com/documentation/templates/).
+ - Using a [custom template from the Azure Marketplace](../../azure-resource-manager/templates/deploy-portal.md#deploy-resources-from-custom-template), which allows you to create a template from scratch, or base it on an existing common or [quickstart template](https://azure.microsoft.com/resources/templates/).
- Deriving from an existing resource group, by exporting a template from either [the original deployment](../../azure-resource-manager/templates/export-template-portal.md), or from the [current state of the deployment](../../azure-resource-manager/templates/export-template-portal.md). - Using a local [JSON editor (such as VS Code)](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md), and then uploading and deploying by using PowerShell or CLI. - Using the Visual Studio [Azure Resource Group project](../../azure-resource-manager/templates/create-visual-studio-deployment-project.md) to both create and deploy a template.
active-directory Qs Configure Template Windows Vmss https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/managed-identities-azure-resources/qs-configure-template-windows-vmss.md
In this article, you learn how to perform the following managed identities for A
As with the Azure portal and scripting, [Azure Resource Manager](../../azure-resource-manager/management/overview.md) templates provide the ability to deploy new or modified resources defined by an Azure resource group. Several options are available for template editing and deployment, both local and portal-based, including:
- - Using a [custom template from the Azure Marketplace](../../azure-resource-manager/templates/deploy-portal.md#deploy-resources-from-custom-template), which allows you to create a template from scratch, or base it on an existing common or [quickstart template](https://azure.microsoft.com/documentation/templates/).
+ - Using a [custom template from the Azure Marketplace](../../azure-resource-manager/templates/deploy-portal.md#deploy-resources-from-custom-template), which allows you to create a template from scratch, or base it on an existing common or [quickstart template](https://azure.microsoft.com/resources/templates/).
- Deriving from an existing resource group, by exporting a template from either [the original deployment](../../azure-resource-manager/templates/export-template-portal.md), or from the [current state of the deployment](../../azure-resource-manager/templates/export-template-portal.md). - Using a local [JSON editor (such as VS Code)](../../azure-resource-manager/templates/quickstart-create-templates-use-the-portal.md), and then uploading and deploying by using PowerShell or CLI. - Using the Visual Studio [Azure Resource Group project](../../azure-resource-manager/templates/create-visual-studio-deployment-project.md) to both create and deploy a template.
active-directory Delegate App Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/delegate-app-roles.md
In some cases, enterprise applications created from the application gallery incl
1. Select **Owners** to see the list of the owners for the app. 1. Select **Add** to select one or more owners to add to the app.
+> [!NOTE]
+> If the user setting "[Restrict access to Azure AD administration portal](../fundamentals/users-default-permissions.md)" is set to Yes, non-admin users will not be able to use the Azure portal to manage the applications they own.
+ > [!IMPORTANT] > Users and service principals can be owners of application registrations. Only users can be owners of enterprise applications. Groups cannot be assigned as owners of either. >
This separation allows you to create a single role definition and then assign it
Tips when creating and using custom roles for delegating application management: - Custom roles only grant access in the most current app registration blades of the Azure portal. They do not grant access in the legacy app registrations blades.-- Custom roles do not grant access to the Azure portal when the "Restrict access to Azure AD administration portal" user setting is set to Yes.
+- Custom roles do not grant access to the Azure portal when the "[Restrict access to Azure AD administration portal](../fundamentals/users-default-permissions.md)" user setting is set to Yes.
- App registrations the user has access to using role assignments only show up in the 'All applications' tab on the App registration page. They do not show up in the 'Owned applications' tab. For more information on the basics of custom roles, see the [custom roles overview](custom-overview.md), as well as how to [create a custom role](custom-create.md) and how to [assign a role](custom-assign-powershell.md).
active-directory Amazon Business Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/amazon-business-tutorial.md
Previously updated : 07/16/2019 Last updated : 06/16/2021 # Tutorial: Integrate Amazon Business with Azure Active Directory
-In this tutorial, you'll learn how to integrate Amazon Business with Azure Active Directory (Azure AD). When you integrate [Amazon Business](https://www.amazon.com/b2b/info/amazon-business?layout=landing) with Azure AD, you can:
+In this tutorial, you'll learn how to integrate Amazon Business with Azure Active Directory (Azure AD). When you integrate Amazon Business with Azure AD, you can:
* Control in Azure AD who has access to Amazon Business.
* Enable your users to be automatically signed-in to Amazon Business with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
- ## Prerequisites To get started, you need the following items:
-* An Azure AD subscription. If you don't have a subscription, you can get one-month free trial [here](https://azure.microsoft.com/pricing/free-trial/).
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
* An Amazon Business single sign-on (SSO) enabled subscription. Go to the [Amazon Business](https://www.amazon.com/business/register/org/landing?ref_=ab_reg_mlp) page to create an Amazon Business account. ## Scenario description In this tutorial, you configure and test Azure AD SSO in an existing Amazon Business account.
-* Amazon Business supports **SP and IDP** initiated SSO
-* Amazon Business supports **Just In Time** user provisioning
+* Amazon Business supports **SP and IDP** initiated SSO.
+* Amazon Business supports **Just In Time** user provisioning.
+
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-## Adding Amazon Business from the gallery
+## Add Amazon Business from the gallery
To configure the integration of Amazon Business into Azure AD, you need to add Amazon Business from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **Amazon Business** in the search box.
1. Select **Amazon Business** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on
+## Configure and test Azure AD SSO for Amazon Business
-Configure and test Azure AD SSO with Amazon Business using a test user called **B.Simon**.
+Configure and test Azure AD SSO with Amazon Business using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Amazon Business.
-To configure and test Azure AD SSO with Amazon Business, complete the following building steps:
+To configure and test Azure AD SSO with Amazon Business, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
-2. **[Configure Amazon Business SSO](#configure-amazon-business-sso)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
-5. **[Create Amazon Business test user](#create-amazon-business-test-user)** - to have a counterpart of B.Simon in Amazon Business that is linked to the Azure AD representation of user.
-6. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Amazon Business SSO](#configure-amazon-business-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Amazon Business test user](#create-amazon-business-test-user)** - to have a counterpart of B.Simon in Amazon Business that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Amazon Business** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **Amazon Business** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)

1. On the **Basic SAML Configuration** section, if you wish to configure in **IDP** initiated mode, perform the following steps:
- 1. In the **Identifier (Entity ID)** text box, type a URL using one of the following patterns:
+ 1. In the **Identifier (Entity ID)** text box, type one of the following URLs:
   | URL | Region |
   |-|-|
Follow these steps to enable Azure AD SSO in the Azure portal.
1. If you want to configure the application in **SP** initiated mode, you will need to add the full URL provided in the Amazon Business configuration to the **Sign-on URL** in the **Set additional URLs** section.
-1. The following screenshot shows the list of default attributes. Edit the attributes by clicking on the **Edit** icon in the **User Attributes & Claims** section.
+1. The following screenshot shows the list of default attributes. Edit the attributes by clicking on the **pencil** icon in the **User Attributes & Claims** section.
- ![Screenshot shows User Attributes & Claims with default values such as Givenname user.givenname and Emailaddress user.mail.](media/amazon-business-tutorial/map-attribute3.png)
+ ![Screenshot shows User Attributes & Claims with default values such as Givenname user.givenname and Emailaddress user.mail.](media/amazon-business-tutorial/map-attribute.png)
1. Edit the attributes and copy the **Namespace** value of each attribute into Notepad.
- ![Screenshot shows User Attributes & Claims with columns for Claim name and value.](media/amazon-business-tutorial/map-attribute4.png)
+ ![Screenshot shows User Attributes & Claims with columns for Claim name and value.](media/amazon-business-tutorial/attribute.png)
1. In addition to the above, the Amazon Business application expects a few more attributes to be passed back in the SAML response. In the **User Attributes & Claims** section, on the **Group Claims** dialog, perform the following steps:

   1. Click the pencil icon next to **Groups returned in claim**.
- ![Screenshot shows User Attributes & Claims with the icon for Groups returned in claim selected.](./media/amazon-business-tutorial/config04.png)
+ ![Screenshot shows User Attributes & Claims with the icon for Groups returned in claim selected.](./media/amazon-business-tutorial/claim.png)
- ![Screenshot shows Group Claims with values as described in this procedure.](./media/amazon-business-tutorial/config05.png)
+ ![Screenshot shows Group Claims with values as described in this procedure.](./media/amazon-business-tutorial/group-claim.png)
1. Select **All Groups** from the radio list.
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)
-### Configure Amazon Business SSO
-
-1. In a different web browser window, sign in to your Amazon Business company site as an administrator.
-
-1. Click on the **User Profile** and select **Business Settings**.
-
- ![User Profile](media/amazon-business-tutorial/user-profile.png)
-
-1. On the **System integrations** wizard, select **Single Sign-On (SSO)**.
-
- ![Single Sign-On (SSO)](media/amazon-business-tutorial/sso-settings.png)
-
-1. On the **Set up SSO** wizard, select the provider according to your Organizational requirements and click **Next**.
-
- ![Screenshot shows Set up S S O, with Microsoft Azure A D and Next selected.](media/amazon-business-tutorial/default-group1.png)
-
- > [!NOTE]
- > Although Microsoft ADFS is a listed option, it won't work with Azure AD SSO.
-
-1. On the **New user account defaults** wizard, select the **Default Group** and then select **Default Buying Role** according to user role in your Organization and click **Next**.
-
- ![Screenshot shows New user account defaults with Microsoft S S O, Requisitioner, and Next selected.](media/amazon-business-tutorial/dafault-group2.png)
-
-1. On the **Upload your metadata file** wizard, click **Browse** to upload the **Metadata XML** file, which you have downloaded from the Azure portal and click **Upload**.
-
- ![Screenshot shows Upload your metadata file, which allows you to browse to an x m l file and upload it.](media/amazon-business-tutorial/connection-data1.png)
-
-1. After uploading the downloaded metadata file, the fields in the **Connection data** section will populate automatically. After that click **Next**.
-
- ![Screenshot shows Connection data, where you can specify an Azure A D Identifier, Login U R L, and SAML Signing Certificate.](media/amazon-business-tutorial/connection-data2.png)
-
-1. On the **Upload your Attribute statement** wizard, click **Skip**.
-
- ![Screenshot shows Upload your Attribute statement, which allows you to browse to an attribute statement, but in this case, select Skip.](media/amazon-business-tutorial/map-attribute1.png)
-
-1. On the **Attribute mapping** wizard, add the requirement fields by clicking the **+ Add a field** option. Add the attribute values including the namespace, which you have copied from the **User Attributes & Claims** section of Azure portal into the **SAML AttributeName** field, and click **Next**.
-
- ![Screenshot shows Attribute mapping, where you can edit your Amazon data SAML attribute names.](media/amazon-business-tutorial/map-attribute2.png)
-
-1. On the **Amazon connection data** wizard, click **Next**.
-
- ![Screenshot shows Amazon connection data, where you can click next to continue.](media/amazon-business-tutorial/amazon-connect.png)
-
-1. Please check the **Status** of the steps which have been configured and click **Start testing**.
-
- ![Screenshot shows S S O Connection Details with the option to Start testing.](media/amazon-business-tutorial/sso-connection1.png)
-
-1. On the **Test SSO Connection** wizard, click **Test**.
-
- ![Screenshot shows Test S S O Connection with the Test button.](media/amazon-business-tutorial/sso-connection2.png)
-
-1. On the **IDP initiated URL** wizard, before you click **Activate**, copy the value which is assigned to **idpid** and paste into the **idpid** parameter in the **Reply URL** in the **Basic SAML Configuration** section in the Azure portal.
-
- ![Screenshot shows I D P initiated U R L where you can get a U R L necessary for testing and then select Activate.](media/amazon-business-tutorial/sso-connection3.png)
-
-1. On the **Are you ready to switch to active SSO?** wizard, check **I have fully tested SSO and am ready to go live** checkbox and click on **Switch to active**.
-
- ![Screenshot shows the Are you ready to switch to active S S O confirmation where you can select Switch to active.](media/amazon-business-tutorial/sso-connection4.png)
-
-1. Finally in the **SSO Connection details** section the **Status** is shown as **Active**.
-
- ![Screenshot shows S S O Connection Details with a status of Active.](media/amazon-business-tutorial/sso-connection5.png)
-
- > [!NOTE]
- > If you want to configure the application in **SP** initiated mode, complete the following step, paste the sign-on URL from the screenshot above in the **Sign-on URL** text box of the **Set additional URLs** section in the Azure portal. Use the following format:
- >
- > `https://www.amazon.<TLD>/bb/feature/sso/action/start?domain_hint=<uniqueid>`
-
### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Amazon Business**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.-
- ![Screenshot shows Add user button.](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
### Assign the Azure AD Security Group in the Azure portal

1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Amazon Business**.
+2. In the applications list, type **Amazon Business** in the search box, and then select it.
+3. In the menu on the left, select **Users and groups**.
+4. Click **Add user**.
+5. Search for the security group you want to use, then click the group to add it to the **Select members** section. Click **Select**, and then click **Assign**.
- ![Enterprise applications blade](common/enterprise-applications.png)
+ ![Search Security Group](./media/amazon-business-tutorial/assign-group.png)
-2. In the applications list, type and select **Amazon Business**.
+ > [!NOTE]
+ > Check the notifications in the menu bar to confirm that the group was successfully assigned to the enterprise application in the Azure portal.
- ![The Amazon Business link in the Applications list](common/all-applications.png)
+## Configure Amazon Business SSO
-3. In the menu on the left, select **Users and groups**.
+1. In a different web browser window, sign in to your Amazon Business company site as an administrator.
- ![The "Users and groups" link](common/users-groups-blade.png)
+1. Click on the **User Profile** and select **Business Settings**.
-4. Click the **Add user**.
+ ![User Profile](media/amazon-business-tutorial/user-profile.png)
- ![The Add Assignment pane](common/add-assign-user.png)
+1. On the **System integrations** wizard, select **Single Sign-On (SSO)**.
-5. Search for the Security Group you want to use, then click on the group to add it to the Select members section. Click **Select**, then click **Assign**.
+ ![Single Sign-On (SSO)](media/amazon-business-tutorial/sso-settings.png)
- ![Search Security Group](./media/amazon-business-tutorial/assign-group.png)
+1. On the **Set up SSO** wizard, select the provider according to your organizational requirements, and then click **Next**.
+
+ ![Screenshot shows Set up S S O, with Microsoft Azure A D and Next selected.](media/amazon-business-tutorial/default-group.png)
> [!NOTE]
- > Check the notifications in the menu bar to be notified that the Group was successfully assigned to the Enterprise application in the Azure portal.
+ > Although Microsoft ADFS is a listed option, it won't work with Azure AD SSO.
+
+1. On the **New user account defaults** wizard, select the **Default Group**, and then select the **Default Buying Role** according to the user's role in your organization. Click **Next**.
+
+ ![Screenshot shows New user account defaults with Microsoft S S O, Requisitioner, and Next selected.](media/amazon-business-tutorial/group.png)
+
+1. On the **Upload your metadata file** wizard, click **Browse** to select the **Metadata XML** file that you downloaded from the Azure portal, and then click **Upload**.
+
+ ![Screenshot shows Upload your metadata file, which allows you to browse to an x m l file and upload it.](media/amazon-business-tutorial/connection-data.png)
+
+1. After you upload the metadata file, the fields in the **Connection data** section are populated automatically. Click **Next**.
+
+ ![Screenshot shows Connection data, where you can specify an Azure A D Identifier, Login U R L, and SAML Signing Certificate.](media/amazon-business-tutorial/connection.png)
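The auto-populated **Connection data** fields map directly onto elements of the federation metadata file downloaded from the Azure portal. As a rough illustration only (the entity ID, URL, and certificate below are placeholders, not real tenant values), this sketch pulls the three fields out of a minimal metadata document:

```python
# Illustration: the Connection data fields (Azure AD Identifier, Login URL,
# SAML Signing Certificate) correspond to elements of the federation metadata.
# All identifiers below are placeholders, not real tenant values.
import xml.etree.ElementTree as ET

NS = {
    "md": "urn:oasis:names:tc:SAML:2.0:metadata",
    "ds": "http://www.w3.org/2000/09/xmldsig#",
}

metadata = """\
<md:EntityDescriptor xmlns:md="urn:oasis:names:tc:SAML:2.0:metadata"
    xmlns:ds="http://www.w3.org/2000/09/xmldsig#"
    entityID="https://sts.windows.net/11111111-2222-3333-4444-555555555555/">
  <md:IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <md:KeyDescriptor use="signing">
      <ds:KeyInfo><ds:X509Data>
        <ds:X509Certificate>MIICplaceholder</ds:X509Certificate>
      </ds:X509Data></ds:KeyInfo>
    </md:KeyDescriptor>
    <md:SingleSignOnService
        Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"
        Location="https://login.microsoftonline.com/11111111-2222-3333-4444-555555555555/saml2"/>
  </md:IDPSSODescriptor>
</md:EntityDescriptor>
"""

root = ET.fromstring(metadata)
azure_ad_identifier = root.get("entityID")                              # Azure AD Identifier
login_url = root.find(".//md:SingleSignOnService", NS).get("Location")  # Login URL
signing_cert = root.find(".//ds:X509Certificate", NS).text.strip()      # SAML Signing Certificate

print(azure_ad_identifier)
print(login_url)
print(signing_cert)
```

If a field fails to populate, inspecting the metadata file for these three elements is a quick way to spot a truncated or malformed download.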
+
+1. On the **Upload your Attribute statement** wizard, click **Skip**.
+
+ ![Screenshot shows Upload your Attribute statement, which allows you to browse to an attribute statement, but in this case, select Skip.](media/amazon-business-tutorial/upload-attribute.png)
+
+1. On the **Attribute mapping** wizard, add the required fields by clicking **+ Add a field**. In the **SAML AttributeName** field, enter the attribute values (including the namespace) that you copied from the **User Attributes & Claims** section of the Azure portal, and then click **Next**.
+
+ ![Screenshot shows Attribute mapping, where you can edit your Amazon data SAML attribute names.](media/amazon-business-tutorial/attribute-mapping.png)
+
+1. On the **Amazon connection data** wizard, click **Next**.
+
+ ![Screenshot shows Amazon connection data, where you can click next to continue.](media/amazon-business-tutorial/amazon-connect.png)
+
+1. Check the **Status** of the configured steps, and then click **Start testing**.
+
+ ![Screenshot shows S S O Connection Details with the option to Start testing.](media/amazon-business-tutorial/status.png)
+
+1. On the **Test SSO Connection** wizard, click **Test**.
+
+ ![Screenshot shows Test S S O Connection with the Test button.](media/amazon-business-tutorial/test.png)
+
+1. On the **IDP initiated URL** wizard, before you click **Activate**, copy the value assigned to **idpid** and paste it into the **idpid** parameter of the **Reply URL** in the **Basic SAML Configuration** section in the Azure portal.
+
+ ![Screenshot shows I D P initiated U R L where you can get a U R L necessary for testing and then select Activate.](media/amazon-business-tutorial/activate.png)
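The idpid substitution in this step is just a query-string edit on the Reply URL. A minimal sketch, where the base URL and idpid values are illustrative placeholders rather than the real Amazon endpoint:

```python
# Sketch of the idpid substitution: replace the idpid query parameter of the
# configured Reply URL with the value copied from Amazon Business.
# The URLs and ids here are illustrative placeholders.
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

def set_idpid(reply_url: str, idpid: str) -> str:
    parts = urlsplit(reply_url)
    query = parse_qs(parts.query)
    query["idpid"] = [idpid]  # overwrite (or add) the idpid parameter
    return urlunsplit(parts._replace(query=urlencode(query, doseq=True)))

print(set_idpid("https://www.amazon.com/bb/sso/saml?idpid=PLACEHOLDER", "abc123"))
```

In practice you would paste the resulting URL back into the **Reply URL** field in the Azure portal.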
+
+1. On the **Are you ready to switch to active SSO?** wizard, select the **I have fully tested SSO and am ready to go live** checkbox, and then click **Switch to active**.
+
+ ![Screenshot shows the Are you ready to switch to active S S O confirmation where you can select Switch to active.](media/amazon-business-tutorial/switch-active.png)
+
+1. Finally, in the **SSO Connection details** section, the **Status** is shown as **Active**.
+
+ ![Screenshot shows S S O Connection Details with a status of Active.](media/amazon-business-tutorial/details.png)
+
+ > [!NOTE]
+ > If you want to configure the application in **SP** initiated mode, paste the sign-on URL from the screenshot above into the **Sign-on URL** text box of the **Set additional URLs** section in the Azure portal. Use the following format:
+ >
+ > `https://www.amazon.<TLD>/bb/feature/sso/action/start?domain_hint=<UNIQUE_ID>`
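Assembled programmatically, the SP-initiated URL format in the note above looks like this; the TLD and unique ID are placeholders you would replace with your own values:

```python
# Sketch: build the SP-initiated sign-on URL in the documented format
# https://www.amazon.<TLD>/bb/feature/sso/action/start?domain_hint=<UNIQUE_ID>
# The tld and unique_id arguments are placeholders, not real values.
from urllib.parse import urlencode

def sp_sign_on_url(tld: str, unique_id: str) -> str:
    query = urlencode({"domain_hint": unique_id})  # URL-encode the unique id
    return f"https://www.amazon.{tld}/bb/feature/sso/action/start?{query}"

print(sp_sign_on_url("com", "example-unique-id"))
```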
### Create Amazon Business test user

In this section, a user called B.Simon is created in Amazon Business. Amazon Business supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Amazon Business, a new one is created after authentication.
-### Test SSO
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect to the Amazon Business sign-on URL, where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the Amazon Business sign-on URL directly and initiate the login flow from there.
-When you click the Amazon Business tile in the Access Panel, you should be automatically signed in to the Amazon Business for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the Amazon Business instance for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Amazon Business tile in My Apps, if the application is configured in SP mode, you are redirected to the application sign-on page to initiate the login flow; if it's configured in IDP mode, you should be automatically signed in to the Amazon Business instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Amazon Business you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Amplitude Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/amplitude-tutorial.md
Previously updated : 10/14/2019 Last updated : 06/16/2021
In this tutorial, you'll learn how to integrate Amplitude with Azure Active Dire
* Enable your users to be automatically signed-in to Amplitude with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Amplitude supports **SP and IDP** initiated SSO
-* Amplitude supports **Just In Time** user provisioning
+* Amplitude supports **SP and IDP** initiated SSO.
+* Amplitude supports **Just In Time** user provisioning.
+
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-## Adding Amplitude from the gallery
+## Add Amplitude from the gallery
To configure the integration of Amplitude into Azure AD, you need to add Amplitude from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **Amplitude** in the search box.
1. Select **Amplitude** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for Amplitude
+## Configure and test Azure AD SSO for Amplitude
Configure and test Azure AD SSO with Amplitude using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Amplitude.
-To configure and test Azure AD SSO with Amplitude, complete the following building blocks:
+To configure and test Azure AD SSO with Amplitude, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure Amplitude SSO](#configure-amplitude-sso)** - to configure the single sign-on settings on application side.
- * **[Create Amplitude test user](#create-amplitude-test-user)** - to have a counterpart of B.Simon in Amplitude that is linked to the Azure AD representation of user.
+ 1. **[Create Amplitude test user](#create-amplitude-test-user)** - to have a counterpart of B.Simon in Amplitude that is linked to the Azure AD representation of the user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works.

## Configure Azure AD SSO

Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Amplitude** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Amplitude** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
- a. In the **Identifier** text box, type a URL:
+ a. In the **Identifier** text box, type the URL:
   `https://amplitude.com/saml/sso/metadata`

   b. In the **Reply URL** text box, type a URL using the following pattern:
- `https://analytics.amplitude.com/saml/sso/<uniqueid>`
+ `https://analytics.amplitude.com/saml/sso/<UNIQUE_ID>`
   > [!NOTE]
   > The Reply URL value is not real. You will get the Reply URL value later in this tutorial.

1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- In the **Sign-on URL** text box, type a URL:
+ In the **Sign-on URL** text box, type the URL:
   `https://analytics.amplitude.com/sso`

1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
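Before saving, the Reply URL pattern noted earlier (`https://analytics.amplitude.com/saml/sso/<UNIQUE_ID>`) can be sanity-checked. A minimal sketch, assuming the unique ID is a single non-empty path segment:

```python
# Sketch: validate that a Reply URL matches the Amplitude pattern
# https://analytics.amplitude.com/saml/sso/<UNIQUE_ID>. This assumes the
# unique ID is one non-empty path segment; adjust if your value differs.
import re

REPLY_URL_RE = re.compile(r"^https://analytics\.amplitude\.com/saml/sso/[^/?#]+$")

def is_valid_reply_url(url: str) -> bool:
    return REPLY_URL_RE.match(url) is not None

print(is_valid_reply_url("https://analytics.amplitude.com/saml/sso/abc123"))
```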
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Amplitude**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.-
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. Click on the **Plan Admin** from the left navigation bar.
- ![Screenshot shows the Amplitude menu with Plan Admin selected.](./media/amplitude-tutorial/configure1.png)
+ ![Screenshot shows the Amplitude menu with Plan Admin selected.](./media/amplitude-tutorial/plan-tab.png)
1. Select **Microsoft Azure Active Directory Metadata** from the **SSO Integration**.
- ![Screenshot shows the Plan Admin pane with the Microsoft Azure Active Directory Metadata link called out.](./media/amplitude-tutorial/configure2.png)
+ ![Screenshot shows the Plan Admin pane with the Microsoft Azure Active Directory Metadata link called out.](./media/amplitude-tutorial/metadata.png)
1. On the **Set Up Single Sign-On** section, perform the following steps:
- ![Screenshot shows the Set Up Single Sign-on section with values described in this step.](./media/amplitude-tutorial/configure3.png)
+ ![Screenshot shows the Set Up Single Sign-on section with values described in this step.](./media/amplitude-tutorial/configuration.png)
   a. Open the downloaded **Metadata XML** file from the Azure portal in Notepad, and paste the content into the **Microsoft Azure Active Directory Metadata** textbox.
In this section, a user called B.Simon is created in Amplitude. Amplitude suppor
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect to the Amplitude sign-on URL, where you can initiate the login flow.
-When you click the Amplitude tile in the Access Panel, you should be automatically signed in to the Amplitude for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Go to the Amplitude sign-on URL directly and initiate the login flow from there.
-## Additional resources
+#### IDP initiated:
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the Amplitude instance for which you set up SSO.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Amplitude tile in My Apps, if the application is configured in SP mode, you are redirected to the application sign-on page to initiate the login flow; if it's configured in IDP mode, you should be automatically signed in to the Amplitude instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try Amplitude with Azure AD](https://aad.portal.azure.com/)
+Once you configure Amplitude you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Beeline Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/beeline-tutorial.md
Previously updated : 02/06/2019 Last updated : 06/15/2021

# Tutorial: Azure Active Directory integration with Beeline
-In this tutorial, you learn how to integrate Beeline with Azure Active Directory (Azure AD).
-Integrating Beeline with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Beeline with Azure Active Directory (Azure AD). When you integrate Beeline with Azure AD, you can:
-* You can control in Azure AD who has access to Beeline.
-* You can enable your users to be automatically signed-in to Beeline (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Beeline.
+* Enable your users to be automatically signed-in to Beeline with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Beeline, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Beeline single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Beeline single sign-on (SSO) enabled subscription.
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Beeline only supports **IDP** initiated SSO
+* Beeline only supports **IDP** initiated SSO.
-## Adding Beeline from the gallery
+## Add Beeline from the gallery
To configure the integration of Beeline into Azure AD, you need to add Beeline from the gallery to your list of managed SaaS apps.
-**To add Beeline from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Beeline**, select **Beeline** from result panel then click **Add** button to add the application.
-
- ![Beeline in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Beeline** in the search box.
+1. Select **Beeline** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you configure and test Azure AD single sign-on with Beeline based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Beeline needs to be established.
+## Configure and test Azure AD SSO for Beeline
-To configure and test Azure AD single sign-on with Beeline, you need to complete the following building blocks:
+Configure and test Azure AD SSO with Beeline using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Beeline.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Beeline Single Sign-On](#configure-beeline-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Beeline test user](#create-beeline-test-user)** - to have a counterpart of Britta Simon in Beeline that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure and test Azure AD SSO with Beeline, perform the following steps:
-### Configure Azure AD single sign-on
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Beeline SSO](#configure-beeline-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Beeline test user](#create-beeline-test-user)** - to have a counterpart of B.Simon in Beeline that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure Azure AD SSO
-To configure Azure AD single sign-on with Beeline, perform the following steps:
+Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Beeline** application integration page, select **Single sign-on**.
+1. In the Azure portal, on the **Beeline** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Configure single sign-on link](common/select-sso.png)
-
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Set up Single Sign-On with SAML** page, perform the following steps:
- ![BeeLine Domain and URLs single sign-on information](common/idp-intiated.png)
- a. In the **Identifier** text box, type a URL using the following pattern:
- `https://projects.beeline.com/<ProjInstanceName>`
+ `https://projects.beeline.com/<ProjInstance_Name>`
- b. In the **Reply URL** text box, type a URL using the following pattern:
-
- ```https
- https://projects.beeline.com/<ProjInstanceName>/SSO_External.ashx
- ```
+ b. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://projects.beeline.com/<ProjInstance_Name>/SSO_External.ashx`
> [!NOTE]
> These values are not real. Update these values with the actual Identifier and Reply URL. Contact [Beeline Client support team](https://www.beeline.com/contact-support/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
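The Identifier and Reply URL differ only in the `/SSO_External.ashx` suffix. As a minimal sketch, assuming a hypothetical instance name `myinstance` (your real value comes from the Beeline support team), both values can be derived from one place:

```python
# Sketch: build the Beeline Identifier and Reply URL from a project
# instance name. "myinstance" is a placeholder, not a real instance.
def beeline_saml_urls(proj_instance_name: str) -> dict:
    base = f"https://projects.beeline.com/{proj_instance_name}"
    return {
        "identifier": base,
        "reply_url": f"{base}/SSO_External.ashx",
    }

urls = beeline_saml_urls("myinstance")
print(urls["identifier"])  # https://projects.beeline.com/myinstance
print(urls["reply_url"])   # https://projects.beeline.com/myinstance/SSO_External.ashx
```

Deriving both from one base value avoids the two boxes drifting apart when the instance name changes.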
To configure Azure AD single sign-on with Beeline, perform the following steps:
![Copy User Access URL](media/beeline-tutorial/client-access-url.png)
-### Configure Beeline Single Sign-On
-
-To configure single sign-on on **Beeline** side, you need to send the downloaded **Federation Metadata XML** and the User Access URL from the Azure portal properties to [Beeline support team](https://www.beeline.com/contact-support/). They require the metadata and User Access URL so that the SAML SSO connection is configured properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
+In this section, you'll create a test user in the Azure portal called B.Simon.
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
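The portal steps above can also be scripted through Microsoft Graph (`POST /v1.0/users`). A sketch that only builds the request payload — sending it requires an access token with `User.ReadWrite.All`, and the UPN and password below are placeholders:

```python
# Sketch: build a Microsoft Graph create-user payload matching the
# portal steps above. Builds JSON only; no request is sent here.
import json

def new_user_payload(display_name: str, upn: str, password: str) -> str:
    return json.dumps({
        "accountEnabled": True,
        "displayName": display_name,
        # mailNickname is required by Graph; derive it from the UPN.
        "mailNickname": upn.split("@")[0].replace(".", ""),
        "userPrincipalName": upn,
        "passwordProfile": {
            "forceChangePasswordNextSignIn": True,
            "password": password,  # placeholder, not a real password
        },
    })

payload = new_user_payload("B.Simon", "B.Simon@contoso.com", "placeholder-password")
print(payload)
```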
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Beeline.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Beeline**.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Beeline.
- ![Enterprise applications blade](common/enterprise-applications.png)
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Beeline**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-2. In the applications list, select **Beeline**.
+## Configure Beeline SSO
- ![The Beeline link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on the **Beeline** side, you need to send the downloaded **Federation Metadata XML** and the User Access URL from the Azure portal properties to the [Beeline support team](https://www.beeline.com/contact-support/). They require the metadata and User Access URL so that the SAML SSO connection is configured properly on both sides.
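Before sending the file, it can help to confirm the metadata actually contains your tenant's entity ID and sign-on endpoint. A standard-library sketch; the sample document below is a trimmed, hypothetical stand-in for real Azure AD metadata:

```python
# Sketch: summarize the Federation Metadata XML downloaded from the
# Azure portal before handing it to the Beeline support team.
import xml.etree.ElementTree as ET

MD_NS = "urn:oasis:names:tc:SAML:2.0:metadata"

def summarize_metadata(xml_text: str) -> dict:
    """Return the entityID and the first SingleSignOnService location."""
    root = ET.fromstring(xml_text)
    sso = root.find(f".//{{{MD_NS}}}SingleSignOnService")
    return {
        "entity_id": root.get("entityID"),
        "sso_url": sso.get("Location") if sso is not None else None,
    }

# Hypothetical, heavily trimmed example (a real file also carries
# signing certificates and additional endpoints):
sample = (
    f'<EntityDescriptor xmlns="{MD_NS}" '
    'entityID="https://sts.windows.net/contoso/">'
    '<IDPSSODescriptor protocolSupportEnumeration='
    '"urn:oasis:names:tc:SAML:2.0:protocol">'
    '<SingleSignOnService '
    'Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect" '
    'Location="https://login.microsoftonline.com/contoso/saml2"/>'
    '</IDPSSODescriptor></EntityDescriptor>'
)
print(summarize_metadata(sample))
```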
### Create Beeline test user

In this section, you create a user called B.Simon in Beeline. The Beeline application needs all users to be provisioned in the application before doing single sign-on, so work with the [Beeline support team](https://www.beeline.com/contact-support/) to provision these users into the application.
-### Test single sign-on
-
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+## Test SSO
-When you click the Beeline tile in the Access Panel, you should be automatically signed in to the Beeline instance in which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+In this section, you test your Azure AD single sign-on configuration with the following options.
-## Additional Resources
+* Click on **Test this application** in the Azure portal. You should be automatically signed in to the Beeline instance for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Beeline tile in My Apps, you should be automatically signed in to the Beeline instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
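If a test sign-in fails, inspecting the `SAMLResponse` form field that the browser posts to the app often shows the cause. A hedged troubleshooting sketch — the demo response below is hypothetical and heavily trimmed (a real response is signed and much larger):

```python
# Sketch: decode a Base64 SAMLResponse (captured from browser dev
# tools) and pull out the NameID that Azure AD sent for the user.
import base64
import xml.etree.ElementTree as ET

ASSERT_NS = "urn:oasis:names:tc:SAML:2.0:assertion"

def name_id_from_response(saml_response_b64: str) -> str:
    xml_text = base64.b64decode(saml_response_b64).decode("utf-8")
    root = ET.fromstring(xml_text)
    node = root.find(f".//{{{ASSERT_NS}}}NameID")
    return node.text if node is not None else ""

# Hypothetical, trimmed response for illustration only:
demo_xml = (
    f'<Response xmlns:saml="{ASSERT_NS}">'
    '<saml:Assertion><saml:Subject>'
    '<saml:NameID>B.Simon@contoso.com</saml:NameID>'
    '</saml:Subject></saml:Assertion></Response>'
)
demo = base64.b64encode(demo_xml.encode()).decode()
print(name_id_from_response(demo))  # B.Simon@contoso.com
```

Checking that the NameID matches the user expected by the application is a common first step when the app rejects an otherwise successful sign-in.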
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Beeline, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Bonus Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/bonus-tutorial.md
Previously updated : 04/14/2019 Last updated : 06/15/2021

# Tutorial: Azure Active Directory integration with Bonusly
-In this tutorial, you learn how to integrate Bonusly with Azure Active Directory (Azure AD).
-Integrating Bonusly with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Bonusly with Azure Active Directory (Azure AD). When you integrate Bonusly with Azure AD, you can:
-* You can control in Azure AD who has access to Bonusly.
-* You can enable your users to be automatically signed-in to Bonusly (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Bonusly.
+* Enable your users to be automatically signed-in to Bonusly with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites

To configure Azure AD integration with Bonusly, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/)
-* Bonusly single sign-on enabled subscription
+* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/free/).
+* Bonusly single sign-on enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Bonusly supports **IDP** initiated SSO
-
-## Adding Bonusly from the gallery
-
-To configure the integration of Bonusly into Azure AD, you need to add Bonusly from the gallery to your list of managed SaaS apps.
-
-**To add Bonusly from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Bonusly**, select **Bonusly** from result panel then click **Add** button to add the application.
-
- ![Bonusly in the results list](common/search-new-app.png)
+* Bonusly supports **IDP** initiated SSO.
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with Bonusly based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Bonusly needs to be established.
-
-To configure and test Azure AD single sign-on with Bonusly, you need to complete the following building blocks:
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Bonusly Single Sign-On](#configure-bonusly-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Bonusly test user](#create-bonusly-test-user)** - to have a counterpart of Britta Simon in Bonusly that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+## Add Bonusly from the gallery
-### Configure Azure AD single sign-on
+To configure the integration of Bonusly into Azure AD, you need to add Bonusly from the gallery to your list of managed SaaS apps.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Bonusly** in the search box.
+1. Select **Bonusly** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-To configure Azure AD single sign-on with Bonusly, perform the following steps:
+## Configure and test Azure AD SSO for Bonusly
-1. In the [Azure portal](https://portal.azure.com/), on the **Bonusly** application integration page, select **Single sign-on**.
+Configure and test Azure AD SSO with Bonusly using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Bonusly.
- ![Configure single sign-on link](common/select-sso.png)
+To configure and test Azure AD SSO with Bonusly, perform the following steps:
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Bonusly SSO](#configure-bonusly-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Bonusly test user](#create-bonusly-test-user)** - to have a counterpart of B.Simon in Bonusly that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
- ![Single sign-on select mode](common/select-saml-option.png)
+## Configure Azure AD SSO
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+Follow these steps to enable Azure AD SSO in the Azure portal.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+1. In the Azure portal, on the **Bonusly** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
-4. On the **Basic SAML Configuration** section, perform the following steps:
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
- ![Bonusly Domain and URLs single sign-on information](common/idp-reply.png)
+4. On the **Basic SAML Configuration** section, perform the following step:
In the **Reply URL** text box, type a URL using the following pattern:
- `https://Bonus.ly/saml/<tenant-name>`
+ `https://Bonus.ly/saml/<TENANT_NAME>`
> [!NOTE]
> The value is not real. Update the value with the actual Reply URL. Contact [Bonusly Client support team](https://bonus.ly/contact) to get the value. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
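A quick sanity check of the Reply URL against the documented pattern can catch typos before the configuration is saved. A sketch assuming a hypothetical tenant name `contoso`:

```python
# Sketch: validate a Bonusly Reply URL against the documented pattern
# https://Bonus.ly/saml/<TENANT_NAME> before pasting it into the
# Basic SAML Configuration section.
import re

REPLY_URL_PATTERN = re.compile(r"^https://Bonus\.ly/saml/[\w-]+$")

def is_valid_reply_url(url: str) -> bool:
    return REPLY_URL_PATTERN.match(url) is not None

print(is_valid_reply_url("https://Bonus.ly/saml/contoso"))  # True
print(is_valid_reply_url("https://Bonus.ly/saml/"))         # False
```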
To configure Azure AD single sign-on with Bonusly, perform the following steps:
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
- b. Azure AD Identifier
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
- c. Logout URL
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Bonusly.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Bonusly**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure Bonusly Single Sign-On
+## Configure Bonusly SSO
1. In a different browser window, sign in to your **Bonusly** tenant.

1. In the toolbar on the top, click **Settings** and then select **Integrations and apps**.
- ![Bonusly Social Section](./media/bonus-tutorial/ic773686.png "Bonusly")
+ ![Bonusly Social Section](./media/bonus-tutorial/settings.png "Bonusly")
1. Under **Single Sign-On**, select **SAML**.

1. On the **SAML** dialog page, perform the following steps:
- ![Bonusly Saml Dialog page](./media/bonus-tutorial/ic773687.png "Bonusly")
+ ![Bonusly Saml Dialog page](./media/bonus-tutorial/dialog-page.png "Bonusly")
a. In the **IdP SSO target URL** textbox, paste the value of **Login URL**, which you have copied from Azure portal.
To configure Azure AD single sign-on with Bonusly, perform the following steps:
e. Click **Save**.
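SAML service providers typically take the identity provider's signing certificate either as its full Base64 text or as a fingerprint. In case a fingerprint form is needed, a generic (not Bonusly-specific) sketch of computing one from the downloaded **Certificate (Base64)**:

```python
# Sketch: compute the SHA-1 fingerprint of a SAML signing certificate
# from its PEM/Base64 text, in the colon-separated form many SAML
# dialogs expect. Generic helper; works on any DER payload.
import base64
import hashlib

def cert_fingerprint(pem_text: str) -> str:
    # Strip the -----BEGIN/END CERTIFICATE----- lines, keep the body.
    body = "".join(
        line for line in pem_text.splitlines()
        if line and not line.startswith("-----")
    )
    der = base64.b64decode(body)
    digest = hashlib.sha1(der).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

# Dummy payload for illustration; a real run reads the downloaded .cer:
demo_pem = (
    "-----BEGIN CERTIFICATE-----\n"
    + base64.b64encode(b"demo").decode()
    + "\n-----END CERTIFICATE-----"
)
print(cert_fingerprint(demo_pem))
```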
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type `brittasimon@yourcompanydomain.extension`. For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Bonusly.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Bonusly**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Bonusly**.
-
- ![The Bonusly link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
### Create Bonusly test user

In order to enable Azure AD users to sign in to Bonusly, they must be provisioned into Bonusly. In the case of Bonusly, provisioning is a manual task.
In order to enable Azure AD users to sign in to Bonusly, they must be provisioned into Bonusly.
1. Click **Settings**.
- ![Settings](./media/bonus-tutorial/ic781041.png "Settings")
+ ![Settings](./media/bonus-tutorial/users.png "Settings")
1. Click the **Users and bonuses** tab.
- ![Users and bonuses](./media/bonus-tutorial/ic781042.png "Users and bonuses")
+ ![Users and bonuses](./media/bonus-tutorial/manage-user.png "Users and bonuses")
1. Click **Manage Users**.
- ![Manage Users](./media/bonus-tutorial/ic781043.png "Manage Users")
+ ![Manage Users](./media/bonus-tutorial/new-users.png "Manage Users")
1. Click **Add User**.
- ![Screenshot shows Manage Users where you can select Add User.](./media/bonus-tutorial/ic781044.png "Add User")
+ ![Screenshot shows Manage Users where you can select Add User.](./media/bonus-tutorial/add-tab.png "Add User")
1. On the **Add User** dialog, perform the following steps:
- ![Screenshot shows the Add User dialog box where you can enter this information.](./media/bonus-tutorial/ic781045.png "Add User")
+ ![Screenshot shows the Add User dialog box where you can enter this information.](./media/bonus-tutorial/select-user.png "Add User")
a. In the **First name** textbox, enter the first name of the user, such as **Britta**.
In order to enable Azure AD users to sign in to Bonusly, they must be provisioned into Bonusly.
> [!NOTE]
> The Azure AD account holder receives an email that includes a link to confirm the account before it becomes active.
-### Test single sign-on
-
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+## Test SSO
-When you click the Bonusly tile in the Access Panel, you should be automatically signed in to the Bonusly for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+In this section, you test your Azure AD single sign-on configuration with the following options.
-## Additional Resources
+* Click on **Test this application** in the Azure portal. You should be automatically signed in to the Bonusly instance for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Bonusly tile in My Apps, you should be automatically signed in to the Bonusly instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Bonusly, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Citrix Gotomeeting Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/citrix-gotomeeting-tutorial.md
Previously updated : 01/16/2020 Last updated : 06/16/2021

# Tutorial: Azure Active Directory single sign-on (SSO) integration with GoToMeeting
In this tutorial, you'll learn how to integrate GoToMeeting with Azure Active Directory (Azure AD). When you integrate GoToMeeting with Azure AD, you can:

* Enable your users to be automatically signed-in to GoToMeeting with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.

* GoToMeeting supports **IDP** initiated SSO.
-* Once you configure the GoToMeeting you can enforce session controls, which protect exfiltration and infiltration of your organization's sensitive data in real-time. Session controls extend from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad)
-## Adding GoToMeeting from the gallery
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
+
+## Add GoToMeeting from the gallery
To configure the integration of GoToMeeting into Azure AD, you need to add GoToMeeting from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
1. In the **Add from the gallery** section, type **GoToMeeting** in the search box.
1. Select **GoToMeeting** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for GoToMeeting
+## Configure and test Azure AD SSO for GoToMeeting
Configure and test Azure AD SSO with GoToMeeting using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in GoToMeeting.
-To configure and test Azure AD SSO with GoToMeeting, complete the following building blocks:
+To configure and test Azure AD SSO with GoToMeeting, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure GoToMeeting SSO](#configure-gotomeeting-sso)** - to configure the single sign-on settings on application side.
- * **[Create GoToMeeting test user](#create-gotomeeting-test-user)** - to have a counterpart of B.Simon in GoToMeeting that is linked to the Azure AD representation of user.
+ 1. **[Create GoToMeeting test user](#create-gotomeeting-test-user)** - to have a counterpart of B.Simon in GoToMeeting that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works.

## Configure Azure AD SSO

Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **GoToMeeting** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **GoToMeeting** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, perform the following steps:
- a. In the **Identifier** text box, type a URL using the following pattern:
+ a. In the **Identifier** text box, type the URL:
`https://authentication.logmeininc.com/saml/sp`
- b. In the **Reply URL** text box, type a URL using the following pattern:
+ b. In the **Reply URL** text box, type the URL:
`https://authentication.logmeininc.com/saml/acs`

    c. Click **Set additional URLs** and configure the following URLs.

    d. **Sign on URL** (keep this blank)
- e. In the **RelayState** text box, type a URL using the following pattern:
+ e. In the **RelayState** text box, type one of the following URLs:
- For GoToMeeting App, use `https://global.gotomeeting.com`
Follow these steps to enable Azure AD SSO in the Azure portal.
- For GoToAssist, use `https://app.gotoassist.com`
- > [!NOTE]
- > These values are not real. Update these values with the actual Identifier and Reply URL. Contact [GoToMeeting Client support team](../manage-apps/view-applications-portal.md) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
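The RelayState values above can be kept in a small lookup so the right one lands in the **RelayState** box; as a sketch, only the two products shown in this excerpt are included:

```python
# Sketch: map the LogMeIn products documented above to their
# RelayState URLs. Products not listed here raise an error rather
# than silently returning a guessed URL.
RELAY_STATES = {
    "GoToMeeting": "https://global.gotomeeting.com",
    "GoToAssist": "https://app.gotoassist.com",
}

def relay_state(product: str) -> str:
    try:
        return RELAY_STATES[product]
    except KeyError:
        raise ValueError(f"No documented RelayState for {product!r}")

print(relay_state("GoToMeeting"))  # https://global.gotomeeting.com
```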
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.

   ![The Certificate download link](common/certificatebase64.png)
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure Ad Identifier
-
- c. Logout URL
### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **GoToMeeting**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
There is no action item for you in this section. If a user doesn't already exist
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the GoToMeeting tile in the Access Panel, you should be automatically signed in to the GoToMeeting for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
--- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the GoToMeeting for which you set up the SSO.
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+* You can use Microsoft My Apps. When you click the GoToMeeting tile in the My Apps, you should be automatically signed in to the GoToMeeting for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [Try GoToMeeting with Azure AD](https://aad.portal.azure.com/)
+## Next steps
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+Once you configure GoToMeeting you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
-- [How to protect GoToMeeting with advanced visibility and controls](/cloud-app-security/proxy-intro-aad)
active-directory Hackerone Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/hackerone-tutorial.md
Previously updated : 02/15/2019 Last updated : 06/15/2021

# Tutorial: Azure Active Directory integration with HackerOne
-In this tutorial, you learn how to integrate HackerOne with Azure Active Directory (Azure AD).
-Integrating HackerOne with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate HackerOne with Azure Active Directory (Azure AD). When you integrate HackerOne with Azure AD, you can:
-* You can control in Azure AD who has access to HackerOne.
-* You can enable your users to be automatically signed-in to HackerOne (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to HackerOne.
+* Enable your users to be automatically signed-in to HackerOne with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with HackerOne, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* HackerOne single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* HackerOne single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* HackerOne supports **SP** initiated SSO
-* HackerOne supports **Just In Time** user provisioning
-
-## Adding HackerOne from the gallery
-
-To configure the integration of HackerOne into Azure AD, you need to add HackerOne from the gallery to your list of managed SaaS apps.
-
-**To add HackerOne from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **HackerOne**, select **HackerOne** from result panel then click **Add** button to add the application.
-
- ![HackerOne in the results list](common/search-new-app.png)
+* HackerOne supports **SP** initiated SSO.
+* HackerOne supports **Just In Time** user provisioning.
-## Configure and test Azure AD single sign-on
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
-In this section, you configure and test Azure AD single sign-on with HackerOne based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in HackerOne needs to be established.
+## Add HackerOne from the gallery
-To configure and test Azure AD single sign-on with HackerOne, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure HackerOne Single Sign-On](#configure-hackerone-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create HackerOne test user](#create-hackerone-test-user)** - to have a counterpart of Britta Simon in HackerOne that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of HackerOne into Azure AD, you need to add HackerOne from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **HackerOne** in the search box.
+1. Select **HackerOne** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for HackerOne
-To configure Azure AD single sign-on with HackerOne, perform the following steps:
+Configure and test Azure AD SSO with HackerOne using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in HackerOne.
-1. In the [Azure portal](https://portal.azure.com/), on the **HackerOne** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with HackerOne, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure HackerOne SSO](#configure-hackerone-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create HackerOne test user](#create-hackerone-test-user)** - to have a counterpart of B.Simon in HackerOne that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **HackerOne** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![HackerOne Domain and URLs single sign-on information](common/sp-identifier.png)
+ a. In the **Identifier (Entity ID)** text box, type the value:
+ `hackerone.com`
- a. In the **Sign on URL** text box, enter the following:
- `https://hackerone.com/users/saml/sign_in?email=<configured domain>`
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
+ `https://hackerone.com/users/saml/sign_in?email=<CONFIGURED_DOMAIN>`
- b. In the **Identifier (Entity ID)** text box, enter the following:
- `hackerone.com`
+ > [!Note]
+ > The Sign-on URL value is not real. Update this value with the actual Sign-on URL. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
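The Sign-on URL pattern above can also be assembled programmatically if you manage several tenants. This is an illustrative Python sketch only; `contoso.com` is a placeholder, not your real configured domain:

```python
from urllib.parse import urlencode

def hackerone_sign_on_url(email_domain: str) -> str:
    """Build the SP-initiated Sign on URL following the pattern
    https://hackerone.com/users/saml/sign_in?email=<CONFIGURED_DOMAIN>."""
    return "https://hackerone.com/users/saml/sign_in?" + urlencode({"email": email_domain})

# "contoso.com" is a placeholder; substitute your registered email domain.
print(hackerone_sign_on_url("contoso.com"))
```

The `urlencode` call keeps the domain safe to use as a query-string value.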
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with HackerOne, perform the following steps
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
- b. Azure Ad Identifier
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
- c. Logout URL
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to HackerOne.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **HackerOne**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure HackerOne Single Sign-On
+## Configure HackerOne SSO
1. Sign on to your HackerOne tenant as an administrator.

2. In the menu on the top, click **Settings**.
- ![Screenshot that shows "Settings" selected in the menu.](./media/hackerone-tutorial/tutorial_hackerone_001.png)
+ ![Screenshot that shows "Settings" selected in the menu.](./media/hackerone-tutorial/menu.png)
3. Navigate to **Authentication** and click **Add SAML settings**.
- ![Screenshot that shows the "Authentication Settings" page with the "Add S A M L settings" button selected.](./media/hackerone-tutorial/tutorial_hackerone_003.png)
+ ![Screenshot that shows the "Authentication Settings" page with the "Add S A M L settings" button selected.](./media/hackerone-tutorial/settings.png)
4. On the **SAML Settings** dialog, perform the following steps:
- ![Configure Single Sign-On](./media/hackerone-tutorial/tutorial_hackerone_004.png)
+ ![Configure Single Sign-On](./media/hackerone-tutorial/certificate.png)
a. In the **Email Domain** textbox, type a registered domain.
To configure Azure AD single sign-on with HackerOne, perform the following steps
5. On the Authentication Settings dialog, perform the following steps:
- ![Screenshot that shows the "Authentication Settings" dialog with the "Run test" button selected.](./media/hackerone-tutorial/tutorial_hackerone_005.png)
+ ![Screenshot that shows the "Authentication Settings" dialog with the "Run test" button selected.](./media/hackerone-tutorial/test.png)
    a. Click **Run test**.

6. When the test finishes successfully and the **Status** field shows **Last test status: success**, select the **Request Verification** button to submit to HackerOne for approval.
- ![Submit to HackerOne for approval](./media/hackerone-tutorial/tutorial-hackerone-006.png)
+ ![Submit to HackerOne for approval](./media/hackerone-tutorial/verification.png)
7. After HackerOne approves the settings, you can select the **Migrate Users** button to require SSO authentication for all users.
- ![Enable SAML](./media/hackerone-tutorial/tutorial-hackerone-007.png)
-
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to HackerOne.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **HackerOne**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **HackerOne**.
-
- ![The HackerOne link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+ ![Enable SAML](./media/hackerone-tutorial/users.png)
### Create HackerOne test user

In this section, a user called Britta Simon is created in HackerOne. HackerOne supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in HackerOne, a new one is created after authentication.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the HackerOne tile in the Access Panel, you should be automatically signed in to the HackerOne for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in the Azure portal. This will redirect to the HackerOne Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to the HackerOne Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the HackerOne tile in the My Apps, this will redirect to HackerOne Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure HackerOne you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Ibmid Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/ibmid-tutorial.md
Previously updated : 02/11/2021 Last updated : 06/11/2021
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* IBMid supports **SP and IDP** initiated SSO
-* IBMid supports **Just In Time** user provisioning
+* IBMid supports **SP and IDP** initiated SSO.
+* IBMid supports **Just In Time** user provisioning.
-## Adding IBMid from the gallery
+## Add IBMid from the gallery
To configure the integration of IBMid into Azure AD, you need to add IBMid from the gallery to your list of managed SaaS apps.
To configure the integration of IBMid into Azure AD, you need to add IBMid from
1. In the **Add from the gallery** section, type **IBMid** in the search box.
1. Select **IBMid** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-
## Configure and test Azure AD SSO for IBMid

Configure and test Azure AD SSO with IBMid using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in IBMid.
Follow these steps to enable Azure AD SSO in the Azure portal.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
    a. In the **Identifier** text box, type one of the following URLs:

    | Identifier |
    | - |
- | `https://idaas.iam.ibm.com/idaas/mtfim/sps/idaas/saml20` |
    | `https://ibmlogin.ice.ibmcloud.com/saml/sps/saml20sp/saml20` |
    | `https://prepiam.ice.ibmcloud.com/saml/sps/saml20sp/saml20` |
- a. In the **Reply URL** text box, type one of the following URLs:
+ b. In the **Reply URL** text box, type one of the following URLs:
    | Reply URL |
    | - |
- | `https://idaas.iam.ibm.com/idaas/mtfim/sps/idaas/saml20/login` |
    | `https://login.ibm.com/saml/sps/saml20sp/saml20/login` |
    | `https://prepiam.ice.ibmcloud.com/saml/sps/saml20sp/saml20/login` |
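For reference, the Identifier and Reply URL values from the tables above can be kept together per environment. The environment labels and the pairing of the production endpoints below are assumptions made here for illustration (the preproduction pair is clearly matched by hostname); confirm the correct pair for your IBMid environment:

```python
# Pair each IBMid environment's Identifier with its matching Reply URL,
# taken from the Basic SAML Configuration tables above. The labels
# "production" and "preproduction" are illustrative, not IBMid terms,
# and the production pairing is an assumption.
IBMID_SAML_ENDPOINTS = {
    "production": {
        "identifier": "https://ibmlogin.ice.ibmcloud.com/saml/sps/saml20sp/saml20",
        "reply_url": "https://login.ibm.com/saml/sps/saml20sp/saml20/login",
    },
    "preproduction": {
        "identifier": "https://prepiam.ice.ibmcloud.com/saml/sps/saml20sp/saml20",
        "reply_url": "https://prepiam.ice.ibmcloud.com/saml/sps/saml20sp/saml20/login",
    },
}

print(IBMID_SAML_ENDPOINTS["preproduction"]["reply_url"])
```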
Follow these steps to enable Azure AD SSO in the Azure portal.
    In the **Sign-on URL** text box, type the URL:
    `https://myibm.ibm.com/`
-
1. Click **Save**.

1. IBMid application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
In this section, you test your Azure AD single sign-on configuration with follow
#### IDP initiated:
-* Click on **Test this application** in Azure portal and you should be automatically signed in to the IBMid for which you set up the SSO
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the IBMid for which you set up the SSO.
You can also use Microsoft My Apps to test the application in any mode. When you click the IBMid tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the IBMid for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
active-directory Myworkdrive Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/myworkdrive-tutorial.md
Previously updated : 06/27/2019 Last updated : 06/15/2021

# Tutorial: Integrate MyWorkDrive with Azure Active Directory
In this tutorial, you'll learn how to integrate MyWorkDrive with Azure Active Di
* Enable your users to be automatically signed-in to MyWorkDrive with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
-* An Azure AD subscription. If you don't have a subscription, you can get one-month free trial [here](https://azure.microsoft.com/pricing/free-trial/).
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
* MyWorkDrive single sign-on (SSO) enabled subscription.

## Scenario description
-In this tutorial, you configure and test Azure AD SSO in a test environment. MyWorkDrive supports **SP** and **IDP** initiated SSO
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+* MyWorkDrive supports **SP** and **IDP** initiated SSO.
+
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
-## Adding MyWorkDrive from the gallery
+## Add MyWorkDrive from the gallery
To configure the integration of MyWorkDrive into Azure AD, you need to add MyWorkDrive from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **MyWorkDrive** in the search box.
1. Select **MyWorkDrive** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on
+## Configure and test Azure AD SSO for MyWorkDrive
-Configure and test Azure AD SSO with MyWorkDrive using a test user called **Britta Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in MyWorkDrive.
+Configure and test Azure AD SSO with MyWorkDrive using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in MyWorkDrive.
-To configure and test Azure AD SSO with MyWorkDrive, complete the following building blocks:
+To configure and test Azure AD SSO with MyWorkDrive, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
-2. **[Configure MyWorkDrive SSO](#configure-myworkdrive-sso)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create MyWorkDrive test user](#create-myworkdrive-test-user)** - to have a counterpart of Britta Simon in MyWorkDrive that is linked to the Azure AD representation of user.
-6. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure MyWorkDrive SSO](#configure-myworkdrive-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create MyWorkDrive test user](#create-myworkdrive-test-user)** - to have a counterpart of B.Simon in MyWorkDrive that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **MyWorkDrive** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **MyWorkDrive** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** page, If you wish to configure the application in **IDP** initiated mode, enter the values for the following field:
+1. On the **Basic SAML Configuration** page, if you wish to configure the application in **IDP** initiated mode, perform the following step:
    In the **Reply URL** text box, type a URL using the following pattern:
    `https://<SERVER.DOMAIN.COM>/SAML/AssertionConsumerService.aspx`
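The `<SERVER.DOMAIN.COM>` placeholder in the Reply URL pattern is your MyWorkDrive server's FQDN. A small Python sketch of the expansion, using a made-up FQDN (`files.contoso.com` is not a real server):

```python
def myworkdrive_reply_url(server_fqdn: str) -> str:
    """Expand the Reply URL (Assertion Consumer Service) pattern
    https://<SERVER.DOMAIN.COM>/SAML/AssertionConsumerService.aspx
    for a given MyWorkDrive server FQDN."""
    return f"https://{server_fqdn}/SAML/AssertionConsumerService.aspx"

# "files.contoso.com" is a placeholder FQDN for illustration only.
print(myworkdrive_reply_url("files.contoso.com"))
```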
Follow these steps to enable Azure AD SSO in the Azure portal.
![The Certificate download link](common/copy-metadataurl.png)
-### Configure MyWorkDrive SSO
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called Britta Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `Britta Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `BrittaSimon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable Britta Simon to use Azure single sign-on by granting access to MyWorkDrive.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **MyWorkDrive**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **Britta Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure MyWorkDrive SSO
1. To automate the configuration within MyWorkDrive, you need to install **My Apps Secure Sign-in browser extension** by clicking **Install the extension**.
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the MyWorkDrive Server in the admin panel, click on **ENTERPRISE** and perform the following steps:
- ![The Admin](./media/myworkdrive-tutorial/tutorial_myworkdrive_admin.png)
+ ![The Admin](./media/myworkdrive-tutorial/admin.png)
a. Enable **SAML/ADFS SSO**.
- b. Select **SAML - Azure AD**
+ b. Select **SAML - Azure AD**.
c. In the **Azure App Federation Metadata Url** textbox, paste the value of **App Federation Metadata Url** which you have copied from the Azure portal.
- d. Click **Save**
+ d. Click **Save**.
    > [!NOTE]
    > For additional information, review the [MyWorkDrive Azure AD support article](https://www.myworkdrive.com/support/saml-single-sign-on-azure-ad/).
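The document served at the **App Federation Metadata Url** is standard SAML 2.0 federation metadata XML. If you want to inspect it, this Python sketch extracts the entityID from a metadata document; the fragment shown is illustrative only, with an all-zeros tenant ID rather than real metadata:

```python
import xml.etree.ElementTree as ET

MD_NS = "urn:oasis:names:tc:SAML:2.0:metadata"

def entity_id(metadata_xml: str) -> str:
    """Extract the entityID attribute from a SAML federation
    metadata document (the XML served at the metadata URL)."""
    root = ET.fromstring(metadata_xml)
    return root.attrib["entityID"]

# Illustrative fragment; real metadata comes from the App Federation
# Metadata Url copied from the Azure portal, and the tenant ID here
# is a placeholder.
sample = (
    f'<EntityDescriptor xmlns="{MD_NS}" '
    'entityID="https://sts.windows.net/00000000-0000-0000-0000-000000000000/"/>'
)
print(entity_id(sample))
```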
-### Create an Azure AD test user
-
-In this section, you'll create a test user in the Azure portal called Britta Simon.
-
-1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
-1. Select **New user** at the top of the screen.
-1. In the **User** properties, follow these steps:
- 1. In the **Name** field, enter `Britta Simon`.
- 1. In the **User name** field, enter the username@companydomain.extension. For example, `BrittaSimon@contoso.com`.
- 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
- 1. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you'll enable Britta Simon to use Azure single sign-on by granting access to MyWorkDrive.
-
-1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
-1. In the applications list, select **MyWorkDrive**.
-1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
+### Create MyWorkDrive test user
-1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+In this section, you create a user called Britta Simon in MyWorkDrive. Work with [MyWorkDrive support team](mailto:support@myworkdrive.com) to add the users in the MyWorkDrive platform. Users must be created and activated before you use single sign-on.
- ![The Add User link](common/add-assign-user.png)
+## Test SSO
-1. In the **Users and groups** dialog, select **Britta Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
-1. In the **Add Assignment** dialog, click the **Assign** button.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-### Create MyWorkDrive test user
+#### SP initiated:
-In this section, you create a user called Britta Simon in MyWorkDrive. Work with [MyWorkDrive support team](mailto:support@myworkdrive.com) to add the users in the MyWorkDrive platform. Users must be created and activated before you use single sign-on.
+* Click on **Test this application** in the Azure portal. This will redirect you to the MyWorkDrive Sign-on URL, where you can initiate the login flow.
-### Test SSO
+* Go to the MyWorkDrive Sign-on URL directly and initiate the login flow from there.
-When you select the MyWorkDrive tile in the Access Panel, you should be automatically signed in to the MyWorkDrive for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the MyWorkDrive instance for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in either mode. When you click the MyWorkDrive tile in My Apps, if the app is configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if it is configured in IDP mode, you should be automatically signed in to the MyWorkDrive instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
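The practical difference between the two modes is only where the browser starts the flow: SP-initiated sign-in begins at the application's own sign-on page, while IDP-initiated sign-in begins at Azure AD (the portal's **Test this application** button or the My Apps tile). A toy sketch of that distinction; the application URL below is a placeholder, not a real MyWorkDrive address:

```python
def sso_start_url(mode, app_sign_on_url, my_apps_url="https://myapps.microsoft.com"):
    """Return where the browser begins the login flow for a given SSO mode."""
    if mode == "SP":
        return app_sign_on_url  # user starts at the application's sign-on page
    if mode == "IDP":
        return my_apps_url      # user starts at Azure AD / My Apps
    raise ValueError(f"unknown SSO mode: {mode!r}")

# Placeholder application URL for illustration only.
print(sso_start_url("SP", "https://app.example.com/signin"))
```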
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure MyWorkDrive, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Proprofs Classroom Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/proprofs-classroom-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with ProProfs Classroom | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and ProProfs Classroom.
+Last updated : 06/17/2021
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with ProProfs Classroom
+
+In this tutorial, you'll learn how to integrate ProProfs Classroom with Azure Active Directory (Azure AD). When you integrate ProProfs Classroom with Azure AD, you can:
+
+* Control in Azure AD who has access to ProProfs Classroom.
+* Enable your users to be automatically signed-in to ProProfs Classroom with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* ProProfs Classroom single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* ProProfs Classroom supports **IDP** initiated SSO.
+
+## Add ProProfs Classroom from the gallery
+
+To configure the integration of ProProfs Classroom into Azure AD, you need to add ProProfs Classroom from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **ProProfs Classroom** in the search box.
+1. Select **ProProfs Classroom** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for ProProfs Classroom
+
+Configure and test Azure AD SSO with ProProfs Classroom using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in ProProfs Classroom.
+
+To configure and test Azure AD SSO with ProProfs Classroom, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure ProProfs Classroom SSO](#configure-proprofs-classroom-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create ProProfs Classroom test user](#create-proprofs-classroom-test-user)** - to have a counterpart of B.Simon in ProProfs Classroom that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **ProProfs Classroom** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, the application is pre-configured and the necessary URLs are already pre-populated within Azure. Save the configuration by clicking the **Save** button.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/certificatebase64.png)
+
+1. On the **Set up ProProfs Classroom** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
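The values copied in this step (Login URL, Azure AD Identifier, Logout URL) are also published in the tenant's SAML federation metadata. As a hedged sketch using only the standard library, the same endpoints can be pulled out of a downloaded metadata file; the XML below is a trimmed, made-up sample with a zeroed tenant ID, not real metadata:

```python
import xml.etree.ElementTree as ET

# Trimmed, fabricated sample of Azure AD federation metadata.
METADATA = """<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata"
    entityID="https://sts.windows.net/00000000-0000-0000-0000-000000000000/">
  <IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <SingleSignOnService
        Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"
        Location="https://login.microsoftonline.com/00000000-0000-0000-0000-000000000000/saml2"/>
  </IDPSSODescriptor>
</EntityDescriptor>"""

NS = {"md": "urn:oasis:names:tc:SAML:2.0:metadata"}

def extract_sso_endpoints(xml_text):
    """Return (Azure AD Identifier, Login URL) from federation metadata XML."""
    root = ET.fromstring(xml_text)
    identifier = root.get("entityID")
    sso = root.find(".//md:SingleSignOnService", NS)
    return identifier, sso.get("Location")

identifier, login_url = extract_sso_endpoints(METADATA)
```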
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
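The same test user can also be created programmatically via Microsoft Graph's `POST /users` endpoint. The sketch below only builds and prints the JSON request body that endpoint expects; it makes no network call, and the domain and password shown are placeholders, not values from this article:

```python
import json

def build_test_user(display_name, upn, password):
    """Build the request body shape for Microsoft Graph POST /users."""
    return {
        "accountEnabled": True,
        "displayName": display_name,
        "mailNickname": upn.split("@")[0].replace(".", ""),
        "userPrincipalName": upn,
        "passwordProfile": {
            "forceChangePasswordNextSignIn": True,
            "password": password,  # placeholder; use a generated secret in practice
        },
    }

body = build_test_user("B.Simon", "B.Simon@contoso.com", "Placeholder-P@ss1")
print(json.dumps(body, indent=2))
```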
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to ProProfs Classroom.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **ProProfs Classroom**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure ProProfs Classroom SSO
+
+To configure single sign-on on **ProProfs Classroom** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [ProProfs Classroom support team](mailto:support@proprofs.com). They set this setting to have the SAML SSO connection set properly on both sides.
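Before sending the downloaded **Certificate (Base64)** file to the support team, it can help to record its thumbprint so both sides can confirm they received the same certificate. A minimal standard-library sketch; the PEM content below is fabricated filler, not a real certificate:

```python
import base64, hashlib, textwrap

def thumbprint_from_pem(pem_text):
    """SHA-1 thumbprint (upper-case hex) of a Base64/PEM-encoded certificate."""
    body = "".join(
        line for line in pem_text.splitlines()
        if line and not line.startswith("-----")
    )
    der = base64.b64decode(body)
    return hashlib.sha1(der).hexdigest().upper()

# Fabricated stand-in for the downloaded certificate file's contents.
fake_der = b"not-a-real-certificate"
pem = (
    "-----BEGIN CERTIFICATE-----\n"
    + "\n".join(textwrap.wrap(base64.b64encode(fake_der).decode(), 64))
    + "\n-----END CERTIFICATE-----\n"
)
print(thumbprint_from_pem(pem))
```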
+
+### Create ProProfs Classroom test user
+
+In this section, you create a user called Britta Simon in ProProfs Classroom. Work with [ProProfs Classroom support team](mailto:support@proprofs.com) to add the users in the ProProfs Classroom platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the ProProfs Classroom instance for which you set up SSO.
+
+* You can use Microsoft My Apps. When you click the ProProfs Classroom tile in My Apps, you should be automatically signed in to the ProProfs Classroom instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure ProProfs Classroom, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Quickhelp Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/quickhelp-tutorial.md
Previously updated : 03/27/2019 Last updated : 06/15/2021

# Tutorial: Azure Active Directory integration with QuickHelp
-In this tutorial, you learn how to integrate QuickHelp with Azure Active Directory (Azure AD).
-Integrating QuickHelp with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate QuickHelp with Azure Active Directory (Azure AD). When you integrate QuickHelp with Azure AD, you can:
-* You can control in Azure AD who has access to QuickHelp.
-* You can enable your users to be automatically signed-in to QuickHelp (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to QuickHelp.
+* Enable your users to be automatically signed-in to QuickHelp with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with QuickHelp, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* QuickHelp single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* QuickHelp single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* QuickHelp supports **SP** initiated SSO
-
-* QuickHelp supports **Just In Time** user provisioning
-
-## Adding QuickHelp from the gallery
-
-To configure the integration of QuickHelp into Azure AD, you need to add QuickHelp from the gallery to your list of managed SaaS apps.
-
-**To add QuickHelp from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **QuickHelp**, select **QuickHelp** from result panel then click **Add** button to add the application.
+* QuickHelp supports **SP** initiated SSO.
- ![QuickHelp in the results list](common/search-new-app.png)
+* QuickHelp supports **Just In Time** user provisioning.
-## Configure and test Azure AD single sign-on
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
-In this section, you configure and test Azure AD single sign-on with QuickHelp based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in QuickHelp needs to be established.
+## Add QuickHelp from the gallery
-To configure and test Azure AD single sign-on with QuickHelp, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure QuickHelp Single Sign-On](#configure-quickhelp-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create QuickHelp test user](#create-quickhelp-test-user)** - to have a counterpart of Britta Simon in QuickHelp that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of QuickHelp into Azure AD, you need to add QuickHelp from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **QuickHelp** in the search box.
+1. Select **QuickHelp** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for QuickHelp
-To configure Azure AD single sign-on with QuickHelp, perform the following steps:
+Configure and test Azure AD SSO with QuickHelp using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in QuickHelp.
-1. In the [Azure portal](https://portal.azure.com/), on the **QuickHelp** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with QuickHelp, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure QuickHelp SSO](#configure-quickhelp-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create QuickHelp test user](#create-quickhelp-test-user)** - to have a counterpart of B.Simon in QuickHelp that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **QuickHelp** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![QuickHelp Domain and URLs single sign-on information](common/sp-identifier.png)
-
- a. In the **Sign on URL** text box, type a URL using the following pattern:
- `https://quickhelp.com/<ROUTEURL>`
-
- b. In the **Identifier (Entity ID)** text box, type a URL:
+ a. In the **Identifier (Entity ID)** text box, type the URL:
`https://auth.quickhelp.com`
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
+ `https://quickhelp.com/<ROUTE_URL>`
+ > [!NOTE]
+ > The Sign-on URL value is not real. Update the value with the actual Sign-on URL. Contact your organization's QuickHelp administrator or your BrainStorm Client Success Manager to get the value. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
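Only the route segment of the Sign-on URL pattern `https://quickhelp.com/<ROUTE_URL>` varies per organization. As a trivial sketch, assuming a hypothetical route value rather than a real one, the pattern can be filled in and sanity-checked like this:

```python
import re

def quickhelp_sign_on_url(route_url):
    """Fill https://quickhelp.com/<ROUTE_URL>; the route must be a single slug."""
    if not re.fullmatch(r"[A-Za-z0-9_-]+", route_url):
        raise ValueError(f"route should be a single path segment: {route_url!r}")
    return f"https://quickhelp.com/{route_url}"

# "contoso" is a hypothetical route, not a real QuickHelp value.
print(quickhelp_sign_on_url("contoso"))
```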
To configure Azure AD single sign-on with QuickHelp, perform the following steps
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
- b. Azure AD Identifier
+In this section, you'll create a test user in the Azure portal called B.Simon.
- c. Logout URL
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
-### Configure QuickHelp Single Sign-On
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to QuickHelp.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **QuickHelp**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure QuickHelp SSO
1. Sign in to your QuickHelp company site as administrator.

2. In the menu on the top, click **Admin**.
- ![Screenshot shows the Admin menu item for Brainstorm.][21]
+ ![Screenshot shows the Admin menu item for Brainstorm](./media/quickhelp-tutorial/admin.png "Admin menu")
3. In the **QuickHelp Admin** menu, click **Settings**.
- ![Screenshot shows Settings selected from the QuickHelp Admin menu.][22]
+ ![Screenshot shows Settings selected from the QuickHelp Admin menu](./media/quickhelp-tutorial/menu.png "Settings")
4. Click **Authentication Settings**.
-5. On the **Authentication Settings** page, perform the following steps
+5. On the **Authentication Settings** page, perform the following steps.
- ![Screenshot shows the Authentication Settings page where you can enter the values described.][23]
+ ![Screenshot shows the Authentication Settings page where you can enter the values described](./media/quickhelp-tutorial/authenticate.png "Authentication Settings")
a. As **SSO Type**, select **WSFederation**.
To configure Azure AD single sign-on with QuickHelp, perform the following steps
f. In the **Action Bar**, click **Save**.
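Because the SSO Type selected above is **WSFederation**, the sign-in that QuickHelp triggers is a standard WS-Federation request: the browser is sent to the identity provider's sign-in endpoint with `wa=wsignin1.0` and a `wtrealm` parameter identifying the application. A sketch of how such a request URL is assembled; the endpoint and realm values are placeholders, not values from this tutorial:

```python
from urllib.parse import urlencode

def wsfed_signin_url(idp_signin_endpoint, realm, reply_to=None):
    """Build a WS-Federation sign-in request URL (wa / wtrealm / wreply)."""
    params = {"wa": "wsignin1.0", "wtrealm": realm}
    if reply_to:
        params["wreply"] = reply_to  # optional post-sign-in return address
    return f"{idp_signin_endpoint}?{urlencode(params)}"

url = wsfed_signin_url(
    "https://login.microsoftonline.com/common/wsfed",  # placeholder endpoint
    "https://auth.example.com/",                       # placeholder realm
)
print(url)
```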
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type brittasimon@yourcompanydomain.extension. For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to QuickHelp.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **QuickHelp**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **QuickHelp**.
-
- ![The QuickHelp link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
### Create QuickHelp test user

In this section, a user called Britta Simon is created in QuickHelp. QuickHelp supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in QuickHelp, a new one is created after authentication.
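Just-in-time provisioning means the service provider creates the account the first time a federated sign-in succeeds, instead of requiring it to exist up front. A minimal in-memory sketch of that get-or-create pattern (an illustration of the general technique, not QuickHelp's actual implementation):

```python
users = {}  # in-memory stand-in for the application's user store

def jit_provision(name_id, display_name):
    """Return the user matching this SAML NameID, creating one on first sign-in."""
    if name_id not in users:
        users[name_id] = {"name_id": name_id, "display_name": display_name}
    return users[name_id]

first = jit_provision("BrittaSimon@contoso.com", "Britta Simon")   # created on first login
second = jit_provision("BrittaSimon@contoso.com", "Britta Simon")  # reused afterwards
```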
-### Test single sign-on
-
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the QuickHelp tile in the Access Panel, you should be automatically signed in to the QuickHelp for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+## Test SSO
-## Additional Resources
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* Click on **Test this application** in the Azure portal. This will redirect you to the QuickHelp Sign-on URL, where you can initiate the login flow.
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+* Go to the QuickHelp Sign-on URL directly and initiate the login flow from there.
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+* You can use Microsoft My Apps. When you click the QuickHelp tile in My Apps, you are redirected to the QuickHelp Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-<!--Image references-->
+## Next steps
-[21]: ./media/quickhelp-tutorial/tutorial_quickhelp_05.png
-[22]: ./media/quickhelp-tutorial/tutorial_quickhelp_06.png
-[23]: ./media/quickhelp-tutorial/tutorial_quickhelp_07.png
+Once you configure QuickHelp, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Rally Software Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/rally-software-tutorial.md
Previously updated : 03/27/2019 Last updated : 06/16/2021

# Tutorial: Azure Active Directory integration with Rally Software
-In this tutorial, you learn how to integrate Rally Software with Azure Active Directory (Azure AD).
-Integrating Rally Software with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Rally Software with Azure Active Directory (Azure AD). When you integrate Rally Software with Azure AD, you can:
-* You can control in Azure AD who has access to Rally Software.
-* You can enable your users to be automatically signed-in to Rally Software (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Rally Software.
+* Enable your users to be automatically signed-in to Rally Software with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Rally Software, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Rally Software single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Rally Software single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Rally Software supports **SP** initiated SSO
+* Rally Software supports **SP** initiated SSO.
-## Adding Rally Software from the gallery
+## Add Rally Software from the gallery
To configure the integration of Rally Software into Azure AD, you need to add Rally Software from the gallery to your list of managed SaaS apps.
-**To add Rally Software from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Rally Software**, select **Rally Software** from result panel then click **Add** button to add the application.
-
- ![Rally Software in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with Rally Software based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Rally Software needs to be established.
-
-To configure and test Azure AD single sign-on with Rally Software, you need to complete the following building blocks:
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Rally Software** in the search box.
+1. Select **Rally Software** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Rally Software Single Sign-On](#configure-rally-software-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Rally Software test user](#create-rally-software-test-user)** - to have a counterpart of Britta Simon in Rally Software that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+## Configure and test Azure AD SSO for Rally Software
-### Configure Azure AD single sign-on
+Configure and test Azure AD SSO with Rally Software using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Rally Software.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+To configure and test Azure AD SSO with Rally Software, perform the following steps:
-To configure Azure AD single sign-on with Rally Software, perform the following steps:
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Rally Software SSO](#configure-rally-software-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Rally Software test user](#create-rally-software-test-user)** - to have a counterpart of B.Simon in Rally Software that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-1. In the [Azure portal](https://portal.azure.com/), on the **Rally Software** application integration page, select **Single sign-on**.
+## Configure Azure AD SSO
- ![Configure single sign-on link](common/select-sso.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+1. In the Azure portal, on the **Rally Software** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![Rally Software Domain and URLs single sign-on information](common/sp-identifier.png)
-
- a. In the **Sign on URL** text box, type a URL using the following pattern:
- `https://<tenant-name>.rally.com`
+ a. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ `https://<TENANT_NAME>.rally.com`
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
- `https://<tenant-name>.rally.com`
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
+ `https://<TENANT_NAME>.rally.com`
> [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [Rally Software Client support team](https://help.rallydev.com/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [Rally Software Client support team](https://help.rallydev.com/) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
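As a quick aside (not part of the official steps), both values above share the `https://<TENANT_NAME>.rally.com` shape, so a simple pattern check can catch copy-paste mistakes before you save the configuration. The tenant name `contoso` below is a placeholder, not a real Rally tenant:

```python
# Hypothetical sanity check for Basic SAML Configuration values.
# "contoso" is a placeholder tenant name, not a real Rally tenant.
import re

RALLY_URL = re.compile(r"^https://[A-Za-z0-9][A-Za-z0-9-]*\.rally\.com$")

print(bool(RALLY_URL.match("https://contoso.rally.com")))  # True
print(bool(RALLY_URL.match("https://rally.com")))          # False
```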
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Federation Metadata XML** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with Rally Software, perform the following
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
- b. Azure AD Identifier
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
- c. Logout URL
+### Assign the Azure AD test user
-### Configure Rally Software Single Sign-On
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Rally Software.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Rally Software**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Rally Software SSO
1. Sign in to your **Rally Software** tenant.

2. In the toolbar at the top, click **Setup**, and then select **Subscription**.
- ![Subscription](./media/rally-software-tutorial/ic769531.png "Subscription")
+ ![Subscription](./media/rally-software-tutorial/toolbar.png "Subscription")
3. Click the **Action** button. Select **Edit Subscription** at the top right of the toolbar.

4. On the **Subscription** dialog page, perform the following steps, and then click **Save & Close**:
- ![Authentication](./media/rally-software-tutorial/ic769542.png "Authentication")
+ ![Authentication](./media/rally-software-tutorial/configuration.png "Authentication")
    a. Select **Rally or SSO authentication** from the **Authentication** dropdown.
To configure Azure AD single sign-on with Rally Software, perform the following
    c. In the **SSO Logout** textbox, paste the value of **Logout URL**, which you copied from the Azure portal.
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type brittasimon@yourcompanydomain.extension. For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Rally Software.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Rally Software**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Rally Software**.
-
- ![The Rally Software link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
-
### Create Rally Software test user

For Azure AD users to be able to sign in, they must be provisioned to the Rally Software application using their Azure Active Directory user names.
For Azure AD users to be able to sign in, they must be provisioned to the Rally
2. Go to **Setup \> USERS**, and then click **+ Add New**.
- ![Users](./media/rally-software-tutorial/ic781039.png "Users")
+ ![Users](./media/rally-software-tutorial/add-user.png "Users")
3. Type the name in the **New User** textbox, and then click **Add with Details**.

4. In the **Create User** section, perform the following steps:
- ![Create User](./media/rally-software-tutorial/ic781040.png "Create User")
+ ![Create User](./media/rally-software-tutorial/new-user.png "Create User")
    a. In the **User Name** textbox, type the name of the user, such as **B.Simon**.
For Azure AD users to be able to sign in, they must be provisioned to the Rally
> [!NOTE]
> You can use any other Rally Software user account creation tools or APIs provided by Rally Software to provision Azure AD user accounts.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Rally Software tile in the Access Panel, you should be automatically signed in to the Rally Software for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click **Test this application** in the Azure portal. You'll be redirected to the Rally Software Sign-on URL, where you can initiate the login flow.
-## Additional Resources
+* Go to the Rally Software Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Rally Software tile in My Apps, you'll be redirected to the Rally Software Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Rally Software, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Sansan Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/sansan-tutorial.md
Previously updated : 08/27/2020 Last updated : 06/16/2021
In this tutorial, you'll learn how to integrate Sansan with Azure Active Directo
* Enable your users to be automatically signed-in to Sansan with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.

* Sansan supports **SP** initiated SSO.
-* Once you configure Sansan you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
-## Adding Sansan from the gallery
+## Add Sansan from the gallery
To configure the integration of Sansan into Azure AD, you need to add Sansan from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **Sansan** in the search box.
1. Select **Sansan** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD SSO
+## Configure and test Azure AD SSO for Sansan
Configure and test Azure AD SSO with Sansan using a test user called **Britta Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Sansan.
-To configure and test Azure AD SSO with Sansan, complete the following building blocks:
+To configure and test Azure AD SSO with Sansan, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with Britta Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable Britta Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with Britta Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable Britta Simon to use Azure AD single sign-on.
1. **[Configure Sansan](#configure-sansan)** to configure the SSO settings on the application side.
- * **[Create Sansan test user](#create-sansan-test-user)** to have a counterpart of Britta Simon in Sansan that is linked to the Azure AD representation of user.
+    1. **[Create Sansan test user](#create-sansan-test-user)** to have a counterpart of Britta Simon in Sansan that is linked to the Azure AD representation of the user.
1. **[Test SSO](#test-sso)** to verify whether the configuration works.

## Configure Azure AD SSO

Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Sansan** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **Sansan** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** page, enter the values for the following fields:
-
- 1. In the **Sign-on URL** text box, type the URL:
- `https://ap.sansan.com/`
-
- 1. In the **Identifier (Entity ID)** text box, type the URL:
- `https://ap.sansan.com/saml2/<company name>`
+1. On the **Basic SAML Configuration** page, perform the following steps:
- 1. In the **Reply URL** text box, type any one of the URLs using the following pattern:
+ 1. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ `https://ap.sansan.com/saml2/<COMPANY_NAME>`
+ 1. In the **Reply URL** text box, type a URL using one of the following patterns:
    | Environment | URL |
    |:--- |:--- |
- | PC |`https://ap.sansan.com/v/saml2/<company name>/acs` |
- | Smartphone App |`https://internal.api.sansan.com/<company name>/acs` |
- | Smartphone Web |`https://ap.sansan.com/s/saml2/<company name>/acs` |
+ | PC |`https://ap.sansan.com/v/saml2/<COMPANY_NAME>/acs` |
+ | Smartphone App |`https://internal.api.sansan.com/<COMPANY_NAME>/acs` |
+ | Smartphone Web |`https://ap.sansan.com/s/saml2/<COMPANY_NAME>/acs` |
+
+ 1. In the **Sign-on URL** text box, type the URL:
+ `https://ap.sansan.com/`
> [!NOTE]
- > These values are not real. Check the actual values on the **Sansan admin settings**.
+ > These values are not real. Check the actual Identifier and Reply URL values on the **Sansan admin settings**.
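Since the Reply URL differs per environment, a small helper (illustrative only; `contoso` stands in for your real company name) makes the three patterns from the table explicit:

```python
# Illustrative helper mirroring the Reply URL table above.
# "contoso" is a placeholder for your real <COMPANY_NAME>.
def sansan_reply_urls(company_name: str) -> dict:
    return {
        "PC": f"https://ap.sansan.com/v/saml2/{company_name}/acs",
        "Smartphone App": f"https://internal.api.sansan.com/{company_name}/acs",
        "Smartphone Web": f"https://ap.sansan.com/s/saml2/{company_name}/acs",
    }

print(sansan_reply_urls("contoso")["PC"])  # https://ap.sansan.com/v/saml2/contoso/acs
```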
1. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
In this section, you'll enable Britta Simon to use Azure single sign-on by grant
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Sansan**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **Britta Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
To perform the **Single Sign-On settings** on the **Sansan** side, please follow
### Create Sansan test user
-In this section, you create a user called Britta Simon in Sansan. For more information on how to create a user, please refer [these](https://jp-help.sansan.com/hc/en-us/articles/206508997-Adding-users) steps.
+In this section, you create a user called Britta Simon in Sansan. For more information on how to create a user, refer to [these steps](https://jp-help.sansan.com/hc/articles/206508997-Adding-users).
## Test SSO
-When you select the Sansan tile in the Access Panel, you should be automatically signed in to the Sansan for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click **Test this application** in the Azure portal. You'll be redirected to the Sansan Sign-on URL, where you can initiate the login flow.
-## Additional Resources
+* Go to the Sansan Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Sansan tile in My Apps, you'll be redirected to the Sansan Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Sansan, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Sciforma Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/sciforma-tutorial.md
Previously updated : 03/07/2019 Last updated : 06/16/2021

# Tutorial: Azure Active Directory integration with Sciforma
-In this tutorial, you learn how to integrate Sciforma with Azure Active Directory (Azure AD).
-Integrating Sciforma with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Sciforma with Azure Active Directory (Azure AD). When you integrate Sciforma with Azure AD, you can:
-* You can control in Azure AD who has access to Sciforma.
-* You can enable your users to be automatically signed-in to Sciforma (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Sciforma.
+* Enable your users to be automatically signed-in to Sciforma with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Sciforma, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Sciforma single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Sciforma single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Sciforma supports **SP** initiated SSO
-* Sciforma supports **Just In Time** user provisioning
+* Sciforma supports **SP** initiated SSO.
+* Sciforma supports **Just In Time** user provisioning.
-## Adding Sciforma from the gallery
+## Add Sciforma from the gallery
To configure the integration of Sciforma into Azure AD, you need to add Sciforma from the gallery to your list of managed SaaS apps.
-**To add Sciforma from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Sciforma**, select **Sciforma** from result panel then click **Add** button to add the application.
-
- ![Sciforma in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Sciforma** in the search box.
+1. Select **Sciforma** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you configure and test Azure AD single sign-on with Sciforma based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Sciforma needs to be established.
+## Configure and test Azure AD SSO for Sciforma
-To configure and test Azure AD single sign-on with Sciforma, you need to complete the following building blocks:
+Configure and test Azure AD SSO with Sciforma using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Sciforma.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Sciforma Single Sign-On](#configure-sciforma-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Sciforma test user](#create-sciforma-test-user)** - to have a counterpart of Britta Simon in Sciforma that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure and test Azure AD SSO with Sciforma, perform the following steps:
-### Configure Azure AD single sign-on
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Sciforma SSO](#configure-sciforma-sso)** - to configure the single sign-on settings on the application side.
+    1. **[Create Sciforma test user](#create-sciforma-test-user)** - to have a counterpart of B.Simon in Sciforma that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure Azure AD SSO
-To configure Azure AD single sign-on with Sciforma, perform the following steps:
+Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Sciforma** application integration page, select **Single sign-on**.
+1. In the Azure portal, on the **Sciforma** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Configure single sign-on link](common/select-sso.png)
-
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![Sciforma Domain and URLs single sign-on information](common/sp-identifier.png)
-
- a. In the **Sign on URL** text box, type a URL using the following pattern:
- `https://<subdomain>.sciforma.net/sciforma/main.html`
+ a. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ `https://<SUBDOMAIN>.sciforma.net/sciforma/saml`
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
- `https://<subdomain>.sciforma.net/sciforma/saml`
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
+ `https://<SUBDOMAIN>.sciforma.net/sciforma/main.html`
> [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [Sciforma Client support team](https://www.sciforma.com/about/contact) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [Sciforma Client support team](https://www.sciforma.com/about/contact) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Federation Metadata XML** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with Sciforma, perform the following steps:
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure AD Identifier
-
- c. Logout URL
-
-### Configure Sciforma Single Sign-On
-
-To configure single sign-on on **Sciforma** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Sciforma support team](https://www.sciforma.com/about/contact). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Sciforma.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Sciforma.
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Sciforma**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Sciforma**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![Enterprise applications blade](common/enterprise-applications.png)
+## Configure Sciforma SSO
-2. In the applications list, select **Sciforma**.
-
- ![The Sciforma link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on the **Sciforma** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [Sciforma support team](https://www.sciforma.com/about/contact). They configure this setting so that the SAML SSO connection is set properly on both sides.
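If you want to double-check what you're sending, the Login URL and Azure AD Identifier that the earlier steps ask you to copy are also embedded in the Federation Metadata XML itself. A minimal stdlib sketch shows where they live; the inline XML below is an illustrative stand-in, not real tenant metadata:

```python
# Parse a (sample) Azure AD federation metadata document and pull out
# the entityID (Azure AD Identifier) and the SSO Location (Login URL).
# The XML below is an illustrative stand-in, not real tenant metadata.
import xml.etree.ElementTree as ET

MD = "urn:oasis:names:tc:SAML:2.0:metadata"

sample = """<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata"
    entityID="https://sts.windows.net/00000000-0000-0000-0000-000000000000/">
  <IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <SingleSignOnService
        Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"
        Location="https://login.microsoftonline.com/00000000-0000-0000-0000-000000000000/saml2"/>
  </IDPSSODescriptor>
</EntityDescriptor>"""

root = ET.fromstring(sample)
entity_id = root.attrib["entityID"]  # Azure AD Identifier
sso = root.find(f"{{{MD}}}IDPSSODescriptor/{{{MD}}}SingleSignOnService")
login_url = sso.attrib["Location"]   # Login URL
print(entity_id)
print(login_url)
```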
### Create Sciforma test user

In this section, a user called Britta Simon is created in Sciforma. Sciforma supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Sciforma, a new one is created after authentication.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Sciforma tile in the Access Panel, you should be automatically signed in to the Sciforma for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click **Test this application** in the Azure portal. This redirects to the Sciforma Sign-on URL, where you can initiate the login flow.
-## Additional Resources
+* Go to the Sciforma Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Sciforma tile in My Apps, you are redirected to the Sciforma Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Sciforma, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Sd Elements Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/sd-elements-tutorial.md
Previously updated : 10/17/2019 Last updated : 06/15/2021
In this tutorial, you'll learn how to integrate SD Elements with Azure Active Di
* Enable your users to be automatically signed-in to SD Elements with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* SD Elements supports **IDP** initiated SSO
+* SD Elements supports **IDP** initiated SSO.
-## Adding SD Elements from the gallery
+## Add SD Elements from the gallery
To configure the integration of SD Elements into Azure AD, you need to add SD Elements from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
1. In the **Add from the gallery** section, type **SD Elements** in the search box.
1. Select **SD Elements** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-
-## Configure and test Azure AD single sign-on for SD Elements
+## Configure and test Azure AD SSO for SD Elements
Configure and test Azure AD SSO with SD Elements using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in SD Elements.
-To configure and test Azure AD SSO with SD Elements, complete the following building blocks:
+To configure and test Azure AD SSO with SD Elements, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure SD Elements SSO](#configure-sd-elements-sso)** - to configure the single sign-on settings on application side.
- * **[Create SD Elements test user](#create-sd-elements-test-user)** - to have a counterpart of B.Simon in SD Elements that is linked to the Azure AD representation of user.
+ 1. **[Create SD Elements test user](#create-sd-elements-test-user)** - to have a counterpart of B.Simon in SD Elements that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works.

## Configure Azure AD SSO

Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **SD Elements** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **SD Elements** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Set up single sign-on with SAML** page, enter the values for the following fields:
+1. On the **Set up single sign-on with SAML** page, perform the following steps:
a. In the **Identifier** text box, type a URL using the following pattern:
- `https://<tenantname>.sdelements.com/sso/saml2/metadata`
+ `https://<TENANT_NAME>.sdelements.com/sso/saml2/metadata`
b. In the **Reply URL** text box, type a URL using the following pattern:
- `https://<tenantname>.sdelements.com/sso/saml2/acs/`
+ `https://<TENANT_NAME>.sdelements.com/sso/saml2/acs/`
> [!NOTE]
> These values are not real. Update these values with the actual Identifier and Reply URL. Contact [SD Elements Client support team](mailto:support@sdelements.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
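The tenant-specific values above follow a fixed pattern, so they can be derived from the tenant name alone. The snippet below is a minimal sketch; the tenant name `contoso` is a hypothetical placeholder, and the real values should be confirmed with the SD Elements support team:

```python
# Sketch: derive the SD Elements SAML values from a tenant name.
# "contoso" is a hypothetical placeholder, not a real tenant.
def sd_elements_saml_values(tenant_name: str) -> dict:
    base = f"https://{tenant_name}.sdelements.com"
    return {
        "identifier": f"{base}/sso/saml2/metadata",  # Identifier (Entity ID)
        "reply_url": f"{base}/sso/saml2/acs/",       # Reply URL (ACS endpoint)
    }

print(sd_elements_saml_values("contoso"))
```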
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **SD Elements**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.

## Configure SD Elements SSO
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the menu at the top, click **System**, and then **Single Sign-on**.
- ![Screenshot that shows "System" selected and "Single Sign-on" selected from the drop-down.](./media/sd-elements-tutorial/tutorial_sd-elements_09.png)
+ ![Screenshot that shows "System" selected and "Single Sign-on" selected from the drop-down.](./media/sd-elements-tutorial/system.png)
1. On the **Single Sign-On Settings** dialog, perform the following steps:
- ![Configure Single Sign-On](./media/sd-elements-tutorial/tutorial_sd-elements_10.png)
+ ![Configure Single Sign-On](./media/sd-elements-tutorial/settings.png)
a. As **SSO Type**, select **SAML**.
The objective of this section is to create a user called B.Simon in SD Elements.
1. In the menu at the top, click **User Management**, and then **Users**.
- ![Screenshot that shows "Users" selected from the "User Management" drop-down.](./media/sd-elements-tutorial/tutorial_sd-elements_11.png)
+ ![Screenshot that shows "Users" selected from the "User Management" drop-down.](./media/sd-elements-tutorial/users.png)
1. Click **Add New User**.
- ![Screenshot that shows the "Add New User" button selected.](./media/sd-elements-tutorial/tutorial_sd-elements_12.png)
+ ![Screenshot that shows the "Add New User" button selected.](./media/sd-elements-tutorial/add-user.png)
1. On the **Add New User** dialog, perform the following steps:
- ![Creating a SD Elements test user](./media/sd-elements-tutorial/tutorial_sd-elements_13.png)
+ ![Creating a SD Elements test user](./media/sd-elements-tutorial/new-user.png)
a. In the **E-mail** textbox, enter the email of the user, such as **b.simon@contoso.com**.
The objective of this section is to create a user called B.Simon in SD Elements.
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the SD Elements tile in the Access Panel, you should be automatically signed in to the SD Elements for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Click **Test this application** in the Azure portal, and you should be automatically signed in to the SD Elements application for which you set up SSO.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* You can use Microsoft My Apps. When you click the SD Elements tile in My Apps, you should be automatically signed in to the SD Elements application for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+## Next steps
-- [Try SD Elements with Azure AD](https://aad.portal.azure.com/)
+Once you configure SD Elements, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Serraview Space Utilization Software Solutions Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/serraview-space-utilization-software-solutions-tutorial.md
Previously updated : 08/11/2020 Last updated : 06/15/2021
In this tutorial, you'll learn how to integrate Serraview Space Utilization Soft
* Enable your users to be automatically signed-in to Serraview Space Utilization Software Solutions with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Serraview Space Utilization Software Solutions supports **SP and IDP** initiated SSO
-
-* Once you configure Serraview Space Utilization Software Solutions you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+* Serraview Space Utilization Software Solutions supports **SP and IDP** initiated SSO.
-## Adding Serraview Space Utilization Software Solutions from the gallery
+## Add Serraview Space Utilization Software Solutions from the gallery
To configure the integration of Serraview Space Utilization Software Solutions into Azure AD, you need to add Serraview Space Utilization Software Solutions from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
1. In the **Add from the gallery** section, type **Serraview Space Utilization Software Solutions** in the search box.
1. Select **Serraview Space Utilization Software Solutions** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-
## Configure and test Azure AD SSO for Serraview Space Utilization Software Solutions

Configure and test Azure AD SSO with Serraview Space Utilization Software Solutions using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Serraview Space Utilization Software Solutions.
-To configure and test Azure AD SSO with Serraview Space Utilization Software Solutions, complete the following building blocks:
+To configure and test Azure AD SSO with Serraview Space Utilization Software Solutions, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with Serraview Space Utilization Software Sol
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Serraview Space Utilization Software Solutions** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Serraview Space Utilization Software Solutions** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
- a. In the **Identifier** text box, type a URL using the following pattern:
+ a. In the **Identifier** text box, type a value using the following pattern:
`urn:Serraview:<SERRAVIEW_IDENTIFIER>`

   b. In the **Reply URL** text box, type a URL using the following pattern:
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Serraview Space Utilization Software Solutions**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
In this section, you create a user called B.Simon in Serraview Space Utilization
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
-When you click the Serraview Space Utilization Software Solutions tile in the Access Panel, you should be automatically signed in to the Serraview Space Utilization Software Solutions for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click **Test this application** in the Azure portal. This redirects to the Serraview Space Utilization Software Solutions Sign-on URL, where you can initiate the login flow.
-## Additional resources
+* Go to the Serraview Space Utilization Software Solutions Sign-on URL directly and initiate the login flow from there.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+#### IDP initiated:
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* Click **Test this application** in the Azure portal, and you should be automatically signed in to the Serraview Space Utilization Software Solutions application for which you set up SSO.
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Serraview Space Utilization Software Solutions tile in My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow, and if configured in IDP mode, you should be automatically signed in to the Serraview Space Utilization Software Solutions application for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [Try Serraview Space Utilization Software Solutions with Azure AD](https://aad.portal.azure.com/)
+## Next steps
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+Once you configure Serraview Space Utilization Software Solutions, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Tas Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/tas-tutorial.md
Previously updated : 03/19/2019 Last updated : 06/16/2021

# Tutorial: Azure Active Directory integration with TAS
-In this tutorial, you learn how to integrate TAS with Azure Active Directory (Azure AD).
-Integrating TAS with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate TAS with Azure Active Directory (Azure AD). When you integrate TAS with Azure AD, you can:
-* You can control in Azure AD who has access to TAS.
-* You can enable your users to be automatically signed-in to TAS (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to TAS.
+* Enable your users to be automatically signed-in to TAS with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with TAS, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* TAS single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* TAS single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* TAS supports **SP and IDP** initiated SSO
+* TAS supports **SP and IDP** initiated SSO.
-## Adding TAS from the gallery
+## Add TAS from the gallery
To configure the integration of TAS into Azure AD, you need to add TAS from the gallery to your list of managed SaaS apps.
-**To add TAS from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **TAS**, select **TAS** from result panel then click **Add** button to add the application.
-
- ![TAS in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with TAS based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in TAS needs to be established.
-
-To configure and test Azure AD single sign-on with TAS, you need to complete the following building blocks:
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **TAS** in the search box.
+1. Select **TAS** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure TAS Single Sign-On](#configure-tas-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create TAS test user](#create-tas-test-user)** - to have a counterpart of Britta Simon in TAS that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+## Configure and test Azure AD SSO for TAS
-### Configure Azure AD single sign-on
+Configure and test Azure AD SSO with TAS using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in TAS.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+To configure and test Azure AD SSO with TAS, perform the following steps:
-To configure Azure AD single sign-on with TAS, perform the following steps:
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure TAS SSO](#configure-tas-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create TAS test user](#create-tas-test-user)** - to have a counterpart of B.Simon in TAS that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-1. In the [Azure portal](https://portal.azure.com/), on the **TAS** application integration page, select **Single sign-on**.
+## Configure Azure AD SSO
- ![Configure single sign-on link](common/select-sso.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+1. In the Azure portal, on the **TAS** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
- ![Screenshot shows the Basic SAML Configuration, where you can enter Identifier, Reply U R L, and select Save.](common/idp-intiated.png)
-
   a. In the **Identifier** text box, type a URL using the following pattern:
   `https://taseu.combtas.com/<DOMAIN>`

   b. In the **Reply URL** text box, type a URL using the following pattern:
- `https://taseu.combtas.com/<ENVIRONMENTNAME>/AssertionService.aspx`
+ `https://taseu.combtas.com/<ENVIRONMENT_NAME>/AssertionService.aspx`
5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- ![Screenshot shows Set additional U R Ls where you can enter a Sign on U R L.](common/metadata-upload-additional-signon.png)
-
   In the **Sign-on URL** text box, type a URL using the following pattern:
   `https://taseu.combtas.com/<DOMAIN>`
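The TAS patterns above combine a fixed host with two tenant-specific segments: the Identifier and Sign-on URL use the domain, while the Reply URL uses the environment name. The snippet below is a minimal sketch; `contoso` and `production` are hypothetical placeholders, and the real values come from the TAS support team:

```python
# Sketch: build the TAS SAML URLs from placeholder values.
# "contoso" and "production" are hypothetical placeholders.
TAS_HOST = "https://taseu.combtas.com"

def tas_saml_values(domain: str, environment_name: str) -> dict:
    return {
        "identifier": f"{TAS_HOST}/{domain}",
        "reply_url": f"{TAS_HOST}/{environment_name}/AssertionService.aspx",
        "sign_on_url": f"{TAS_HOST}/{domain}",  # needed for SP-initiated mode only
    }

print(tas_saml_values("contoso", "production"))
```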
To configure Azure AD single sign-on with TAS, perform the following steps:
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
- b. Azure AD Identifier
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
- c. Logout URL
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to TAS.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **TAS**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure TAS Single Sign-On
+## Configure TAS SSO
1. In a different web browser window, log in to TAS as an Administrator.

2. On the left side of the menu, click **Settings**, navigate to **Administrator**, and then click **Manage Single sign on**.
- ![Screenshot shows Manage Single sign on selected.](./media/tas-tutorial/configure01.png)
+ ![Screenshot shows Manage Single sign on selected.](./media/tas-tutorial/settings.png)
3. On the **Manage Single sign on** page, perform the following steps:
- ![Screenshot shows the Manage Single sign on page where you can enter the values described.](./media/tas-tutorial/configure02.png)
+ ![Screenshot shows the Manage Single sign on page where you can enter the values described.](./media/tas-tutorial/configure.png)
a. In the **Name** textbox, type your environment name.
To configure Azure AD single sign-on with TAS, perform the following steps:
h. Click **Insert SSO row**.
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to TAS.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **TAS**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **TAS**.
-
- ![The TAS link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
+### Create TAS test user
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
+In this section, you create a user called Britta Simon in TAS. Work with the [TAS support team](mailto:support@combtas.com) to add the users to the TAS platform. Users must be created and activated before you use single sign-on.
-7. In the **Add Assignment** dialog click the **Assign** button.
+## Test SSO
-### Create TAS test user
+In this section, you test your Azure AD single sign-on configuration with the following options.
-In this section, you create a user called Britta Simon in TAS. Work with [TAS support team](mailto:support@combtas.com) to add the users in the TAS platform. Users must be created and activated before you use single sign-on.
+#### SP initiated:
-### Test single sign-on
+* Click on **Test this application** in the Azure portal. This will redirect you to the TAS Sign-on URL, where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the TAS Sign-on URL directly and initiate the login flow from there.
-When you click the TAS tile in the Access Panel, you should be automatically signed in to the TAS for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the TAS for which you set up SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the TAS tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode you should be automatically signed in to the TAS for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure TAS, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Tigertext Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/tigertext-tutorial.md
Previously updated : 08/21/2020 Last updated : 06/16/2021

# Tutorial: Azure Active Directory integration with TigerConnect Secure Messenger
-In this tutorial, you learn how to integrate TigerConnect Secure Messenger with Azure Active Directory (Azure AD).
+In this tutorial, you'll learn how to integrate TigerConnect Secure Messenger with Azure Active Directory (Azure AD). When you integrate TigerConnect Secure Messenger with Azure AD, you can:
-Integrating TigerConnect Secure Messenger with Azure AD provides you with the following benefits:
-
-* You can control in Azure AD who has access to TigerConnect Secure Messenger.
-* You can enable your users to be automatically signed in to TigerConnect Secure Messenger (single sign-on) with their Azure AD accounts.
-* You can manage your accounts in one central location: the Azure portal.
-
-For details about software as a service (SaaS) app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md).
+* Control in Azure AD who has access to TigerConnect Secure Messenger.
+* Enable your users to be automatically signed-in to TigerConnect Secure Messenger with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
To configure Azure AD integration with TigerConnect Secure Messenger, you need t
In this tutorial, you configure and test Azure AD single sign-on in a test environment and integrate TigerConnect Secure Messenger with Azure AD.
-* TigerConnect Secure Messenger supports **SP** initiated SSO
-* Once you configure TigerConnect Secure Messenger you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+* TigerConnect Secure Messenger supports **SP** initiated SSO.
-## Adding TigerConnect Secure Messenger from the gallery
+## Add TigerConnect Secure Messenger from the gallery
To configure the integration of TigerConnect Secure Messenger into Azure AD, you need to add TigerConnect Secure Messenger from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **TigerConnect Secure Messenger** in the search box.
1. Select **TigerConnect Secure Messenger** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD SSO
+## Configure and test Azure AD SSO for TigerConnect Secure Messenger
In this section, you configure and test Azure AD single sign-on with TigerConnect Secure Messenger based on a test user named **Britta Simon**. For single sign-on to work, you must establish a link between an Azure AD user and the related user in TigerConnect Secure Messenger.
-To configure and test Azure AD single sign-on with TigerConnect Secure Messenger, you need to complete the following building blocks:
+To configure and test Azure AD single sign-on with TigerConnect Secure Messenger, you need to perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with Britta Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable Britta Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on with Britta Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable Britta Simon to use Azure AD single sign-on.
1. **[Configure TigerConnect Secure Messenger SSO](#configure-tigerconnect-secure-messenger-sso)** to configure the single sign-on settings on the application side.
- * **[Create a TigerConnect Secure Messenger test user](#create-a-tigerconnect-secure-messenger-test-user)** so that there's a user named Britta Simon in TigerConnect Secure Messenger who's linked to the Azure AD user named Britta Simon.
+ 1. **[Create a TigerConnect Secure Messenger test user](#create-a-tigerconnect-secure-messenger-test-user)** so that there's a user named Britta Simon in TigerConnect Secure Messenger who's linked to the Azure AD user named Britta Simon.
1. **[Test SSO](#test-sso)** to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
In this section, you enable Azure AD single sign-on in the Azure portal.

To configure Azure AD single sign-on with TigerConnect Secure Messenger, take the following steps:
-1. In the [Azure portal](https://portal.azure.com/), on the **TigerConnect Secure Messenger** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **TigerConnect Secure Messenger** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. In the **Basic SAML Configuration** section, enter the values for the following fields:
+1. In the **Basic SAML Configuration** section, perform the following steps:
- 1. In the **Sign on URL** box, enter a URL:
+ 1. In the **Sign on URL** box, type the URL:
   `https://home.tigertext.com`

1. In the **Identifier (Entity ID)** box, type a URL by using the following pattern:
- `https://saml-lb.tigertext.me/v1/organization/<instance ID>`
+ `https://saml-lb.tigertext.me/v1/organization/<INSTANCE_ID>`
   > [!NOTE]
   > The **Identifier (Entity ID)** value isn't real. Update this value with the actual identifier. To get the value, contact the [TigerConnect Secure Messenger support team](mailto:prosupport@tigertext.com). You can also refer to the patterns shown in the **Basic SAML Configuration** pane in the Azure portal.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **TigerConnect Secure Messenger**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
In this section, you create a user called Britta Simon in TigerConnect Secure Me
## Test SSO
-In this section, you test your Azure AD single sign-on configuration by using the My Apps portal.
-
-When you select **TigerConnect Secure Messenger** in the My Apps portal, you should be automatically signed in to the TigerConnect Secure Messenger subscription for which you set up single sign-on. For more information about the My Apps portal, see [Access and use apps on the My Apps portal](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [List of tutorials for integrating SaaS apps with Azure Active Directory](./tutorial-list.md)
+* Click on **Test this application** in the Azure portal. This will redirect you to the TigerConnect Secure Messenger Sign-on URL, where you can initiate the login flow.
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+* Go to the TigerConnect Secure Messenger Sign-on URL directly and initiate the login flow from there.
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+* You can use Microsoft My Apps. When you click the TigerConnect Secure Messenger tile in My Apps, this will redirect you to the TigerConnect Secure Messenger Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [Try TigerConnect Secure Messenger with Azure AD](https://aad.portal.azure.com/)
+## Next steps
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+Once you configure TigerConnect Secure Messenger, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Titanfile Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/titanfile-tutorial.md
Previously updated : 07/21/2020 Last updated : 06/15/2021
In this tutorial, you'll learn how to integrate Titanfile with Azure Active Dire
* Enable your users to be automatically signed-in to Titanfile with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Titanfile supports **IDP** initiated SSO
-* Once you configure Titanfile you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real-time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+* Titanfile supports **IDP** initiated SSO.
-## Adding Titanfile from the gallery
+## Add Titanfile from the gallery
To configure the integration of Titanfile into Azure AD, you need to add Titanfile from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **Titanfile** in the search box.
1. Select **Titanfile** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-
## Configure and test Azure AD SSO for Titanfile

Configure and test Azure AD SSO with Titanfile using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Titanfile.
-To configure and test Azure AD SSO with Titanfile, complete the following building blocks:
+To configure and test Azure AD SSO with Titanfile, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature. 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with Titanfile, complete the following buildi
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Titanfile** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Titanfile** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Set up single sign-on with SAML** page, enter the values for the following fields:
+1. On the **Set up single sign-on with SAML** page, perform the following steps:
a. In the **Identifier** text box, type a URL using the following pattern: `https://<SUBDOMAIN>.titanfile.com/saml2/metadata/`
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Titanfile**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
In this section, you create a user called B.Simon in Titanfile. Work with [Tita
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the Titanfile tile in the Access Panel, you should be automatically signed in to the Titanfile for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
-
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
-
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the Titanfile for which you set up SSO.
-- [Try Titanfile with Azure AD](https://aad.portal.azure.com/)
+* You can use Microsoft My Apps. When you click the Titanfile tile in My Apps, you should be automatically signed in to the Titanfile for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+## Next steps
-- [How to protect Titanfile with advanced visibility and controls](/cloud-app-security/proxy-intro-aad)
+Once you configure Titanfile, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Vida Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/vida-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with VIDA | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and VIDA.
++++++++ Last updated : 06/16/2021++++
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with VIDA
+
+In this tutorial, you'll learn how to integrate VIDA with Azure Active Directory (Azure AD). When you integrate VIDA with Azure AD, you can:
+
+* Control in Azure AD who has access to VIDA.
+* Enable your users to be automatically signed-in to VIDA with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* VIDA single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* VIDA supports **SP** initiated SSO.
+
+* VIDA supports **Just In Time** user provisioning.
+
+## Adding VIDA from the gallery
+
+To configure the integration of VIDA into Azure AD, you need to add VIDA from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **VIDA** in the search box.
+1. Select **VIDA** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
++
+## Configure and test Azure AD SSO for VIDA
+
+Configure and test Azure AD SSO with VIDA using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in VIDA.
+
+To configure and test Azure AD SSO with VIDA, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure VIDA SSO](#configure-vida-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create VIDA test user](#create-vida-test-user)** - to have a counterpart of B.Simon in VIDA that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **VIDA** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. In the **Basic SAML Configuration** section, enter the values for the following fields:
+
+ In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://vitruevida.com/?teamid=<ID>&idp=<IDP_NAME>`
+
+ > [!NOTE]
+ > The value is not real. Update the value with the actual Sign-On URL. Contact [VIDA Client support team](mailto:support@vitruehealth.com) to get the value. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. The VIDA application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![image](common/default-attributes.png)
+
+1. In addition to the above, the VIDA application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also pre-populated, but you can review them as per your requirements.
+
+ | Name | Source Attribute|
+ | - | |
+ | assignedroles | user.assignedroles |
+
+ > [!NOTE]
+ > VIDA expects roles for users assigned to the application. Please set up these roles in Azure AD so that users can be assigned the appropriate roles. To understand how to configure roles in Azure AD, see [here](../develop/howto-add-app-roles-in-azure-ad-apps.md#app-roles-ui).
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/metadataxml.png)
+
+1. On the **Set up VIDA** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
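The note above says that VIDA expects app roles to be set up in Azure AD. Besides the portal UI, app roles can also be attached to the registration with the Azure CLI's `az ad app update --app-roles` parameter. The sketch below is illustrative, not part of the official procedure: the role display name, `value`, and GUID are placeholders I've invented, not values prescribed by VIDA, and the `az` command is shown commented because it requires a signed-in Azure CLI session and your app's real object ID.

```shell
# Write an illustrative app-role definition. displayName, value, and id are
# placeholders; adapt them to the roles VIDA actually expects.
cat > roles.json <<'EOF'
[
  {
    "allowedMemberTypes": [ "User" ],
    "description": "Standard VIDA user",
    "displayName": "VIDA User",
    "id": "d1c2ade8-98f8-45fd-aa4a-6d06b947c66f",
    "isEnabled": true,
    "value": "vidauser"
  }
]
EOF

# Attach the roles to the app registration (requires `az login`; replace
# <application-object-id> with your app's object id):
# az ad app update --id <application-object-id> --app-roles @roles.json
```

Once roles exist on the app, they should appear in the **Select a role** dropdown when you assign users in the next section.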
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to VIDA.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **VIDA**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure VIDA SSO
+
+To configure single sign-on on the **VIDA** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [VIDA support team](mailto:support@vitruehealth.com). They configure this setting so that the SAML SSO connection is set properly on both sides.
+
+### Create VIDA test user
+
+In this section, a user called Britta Simon is created in VIDA. VIDA supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in VIDA, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click on **Test this application** in the Azure portal. This will redirect you to the VIDA Sign-on URL, where you can initiate the login flow.
+
+* Go to the VIDA Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the VIDA tile in My Apps, this will redirect you to the VIDA Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure VIDA, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
++
app-service Configure Language Dotnetcore https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-language-dotnetcore.md
For more information, see [Configure ASP.NET Core to work with proxy servers and
::: zone pivot="platform-linux"

> [!div class="nextstepaction"]
-> [App Service Linux FAQ](faq-app-service-linux.md)
+> [App Service Linux FAQ](faq-app-service-linux.yml)
::: zone-end
app-service Configure Language Java https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-language-java.md
Otherwise, your deployment method will depend on your archive type:
To deploy .jar files to Java SE, use the `/api/zipdeploy/` endpoint of the Kudu site. For more information on this API, please see [this documentation](./deploy-zip.md#rest).

> [!NOTE]
-> Your .jar application must be named `app.jar` for App Service to identify and run your application. The Maven Plugin (mentioned above) will automatically rename your application for you during deployment. If you do not wish to rename your JAR to *app.jar*, you can upload a shell script with the command to run your .jar app. Paste the absolute path to this script in the [Startup File](faq-app-service-linux.md#built-in-images) textbox in the Configuration section of the Portal. The startup script does not run from the directory into which it is placed. Therefore, always use absolute paths to reference files in your startup script (for example: `java -jar /home/myapp/myapp.jar`).
+> Your .jar application must be named `app.jar` for App Service to identify and run your application. The Maven Plugin (mentioned above) will automatically rename your application for you during deployment. If you do not wish to rename your JAR to *app.jar*, you can upload a shell script with the command to run your .jar app. Paste the absolute path to this script in the [Startup File](/azure/app-service/faq-app-service-linux#built-in-images) textbox in the Configuration section of the Portal. The startup script does not run from the directory into which it is placed. Therefore, always use absolute paths to reference files in your startup script (for example: `java -jar /home/myapp/myapp.jar`).
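For illustration, a deployment against the `/api/zipdeploy/` endpoint described above can be sketched as follows. The app name `contoso-java-app` is an assumed placeholder, the credential placeholders come from your app's Deployment Center, and the zip/curl commands are shown commented so you can review them before running:

```shell
# Placeholder app name: substitute the name of your App Service app.
APP_NAME="contoso-java-app"

# Kudu's zipdeploy endpoint lives on the app's SCM site.
KUDU_URL="https://${APP_NAME}.scm.azurewebsites.net/api/zipdeploy"
echo "$KUDU_URL"   # prints https://contoso-java-app.scm.azurewebsites.net/api/zipdeploy

# Package the JAR produced by your build (named app.jar, per the note above),
# then push the archive with your deployment credentials:
# zip -q deploy.zip app.jar
# curl -X POST -u '<username>:<password>' --data-binary @deploy.zip "$KUDU_URL"
```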
### Tomcat
Next, determine if the data source should be available to one application or to
#### Shared server-level resources
-Adding a shared, server-level data source will require you to edit Tomcat's server.xml. First, upload a [startup script](faq-app-service-linux.md#built-in-images) and set the path to the script in **Configuration** > **Startup Command**. You can upload the startup script using [FTP](deploy-ftp.md).
+Adding a shared, server-level data source will require you to edit Tomcat's server.xml. First, upload a [startup script](/azure/app-service/faq-app-service-linux#built-in-images) and set the path to the script in **Configuration** > **Startup Command**. You can upload the startup script using [FTP](deploy-ftp.md).
Your startup script will make an [xsl transform](https://www.w3schools.com/xml/xsl_intro.asp) to the server.xml file and output the resulting xml file to `/usr/local/tomcat/conf/server.xml`. The startup script should install libxslt via apk. Your xsl file and startup script can be uploaded via FTP. Below is an example startup script.
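A minimal sketch of such a startup script, under the assumption that the XSL transform was uploaded via FTP to `/home/tomcat/conf/transform.xsl` (that path is illustrative; adjust it to wherever you uploaded the file):

```shell
#!/bin/sh
# Install the libxslt tools via apk, as described above.
apk add --update libxslt

# Transform the stock server.xml with the uploaded stylesheet, then move the
# result to the location Tomcat reads at startup.
xsltproc /home/tomcat/conf/transform.xsl /usr/local/tomcat/conf/server.xml \
    > /tmp/server.xml
mv /tmp/server.xml /usr/local/tomcat/conf/server.xml
```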
Product support for the [Azure-supported Azul Zulu JDK](https://www.azul.com/dow
Visit the [Azure for Java Developers](/java/azure/) center to find Azure quickstarts, tutorials, and Java reference documentation.
-General questions about using App Service for Linux that aren't specific to the Java development are answered in the [App Service Linux FAQ](faq-app-service-linux.md).
+General questions about using App Service for Linux that aren't specific to the Java development are answered in the [App Service Linux FAQ](faq-app-service-linux.yml).
app-service Configure Language Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-language-nodejs.md
When a working Node.js app behaves differently in App Service or has errors, try
::: zone pivot="platform-linux"

> [!div class="nextstepaction"]
-> [App Service Linux FAQ](faq-app-service-linux.md)
+> [App Service Linux FAQ](faq-app-service-linux.yml)
::: zone-end
app-service Configure Language Php https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-language-php.md
When a working PHP app behaves differently in App Service or has errors, try the
::: zone pivot="platform-linux"

> [!div class="nextstepaction"]
-> [App Service Linux FAQ](faq-app-service-linux.md)
+> [App Service Linux FAQ](faq-app-service-linux.yml)
::: zone-end
app-service Configure Language Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-language-python.md
If you're encountering this error with the sample in [Tutorial: Deploy a Django
> [Tutorial: Deploy from private container repository](tutorial-custom-container.md?pivots=container-linux)

> [!div class="nextstepaction"]
-> [App Service Linux FAQ](faq-app-service-linux.md)
+> [App Service Linux FAQ](faq-app-service-linux.yml)
app-service Configure Language Ruby https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-language-ruby.md
az webapp config appsettings set --name <app-name> --resource-group <resource-gr
> [Tutorial: Rails app with PostgreSQL](tutorial-ruby-postgres-app.md)

> [!div class="nextstepaction"]
-> [App Service Linux FAQ](faq-app-service-linux.md)
+> [App Service Linux FAQ](faq-app-service-linux.yml)
app-service Configure Linux Open Ssh Session https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/configure-linux-open-ssh-session.md
For more information on Web App for Containers, see:
* [Introducing remote debugging of Node.js apps on Azure App Service from VS Code](https://medium.com/@auchenberg/introducing-remote-debugging-of-node-js-apps-on-azure-app-service-from-vs-code-in-public-preview-9b8d83a6e1f0)
* [Quickstart: Run a custom container on App Service](quickstart-custom-container.md?pivots=container-linux)
* [Using Ruby in Azure App Service on Linux](quickstart-ruby.md)
-* [Azure App Service Web App for Containers FAQ](faq-app-service-linux.md)
+* [Azure App Service Web App for Containers FAQ](faq-app-service-linux.yml)
app-service Deploy Ci Cd Custom Container https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/deploy-ci-cd-custom-container.md
az acr webhook create --name <webhook-name> --registry <registry-name> --resourc
* [Azure Container Registry](https://azure.microsoft.com/services/container-registry/)
* [Create a .NET Core web app in App Service on Linux](quickstart-dotnetcore.md)
* [Quickstart: Run a custom container on App Service](quickstart-custom-container.md)
-* [App Service on Linux FAQ](faq-app-service-linux.md)
+* [App Service on Linux FAQ](faq-app-service-linux.yml)
* [Configure custom containers](configure-custom-container.md)
* [Actions workflows to deploy to Azure](https://github.com/Azure/actions-workflow-samples)
app-service Deploy Resource Manager Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/deploy-resource-manager-template.md
In your Key Vault, select **Certificates** and **Generate/Import** to upload the
In your template, provide the name of the certificate for the `keyVaultSecretName`.
-For an example template, see [Deploy a Web App certificate from Key Vault secret and use it for creating SSL binding](https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-certificate-from-key-vault).
+For an example template, see [Deploy a Web App certificate from Key Vault secret and use it for creating SSL binding](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-certificate-from-key-vault).
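As a rough sketch of what that resource looks like (parameter names and the API version here are illustrative; confirm the exact shape against the linked quickstart template), the certificate resource references the Key Vault secret like this:

```json
{
  "type": "Microsoft.Web/certificates",
  "apiVersion": "2019-08-01",
  "name": "[parameters('certificateName')]",
  "location": "[parameters('location')]",
  "properties": {
    "keyVaultId": "[parameters('existingKeyVaultId')]",
    "keyVaultSecretName": "[parameters('existingKeyVaultSecretName')]",
    "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('appServicePlanName'))]"
  }
}
```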
## Next steps
app-service Deploy Zip https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/deploy-zip.md
To complete the steps in this article, [create an App Service app](./index.yml),
[!INCLUDE [Create a project ZIP file](../../includes/app-service-web-deploy-zip-prepare.md)]

[!INCLUDE [Deploy ZIP file](../../includes/app-service-web-deploy-zip.md)]
-The above endpoint does not work for Linux App Services at this time. Consider using FTP or the [ZIP deploy API](faq-app-service-linux.md#continuous-integration-and-deployment) instead.
+The above endpoint does not work for Linux App Services at this time. Consider using FTP or the [ZIP deploy API](/azure/app-service/faq-app-service-linux#continuous-integration-and-deployment) instead.
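A sketch of that workflow follows; the final `curl` call is shown commented out because it needs a real app name and deployment credentials, which are placeholders here:

```shell
# Package the app contents into a ZIP using Python's stdlib zipfile CLI.
mkdir -p /tmp/site && echo "hello" > /tmp/site/index.html
cd /tmp/site && python3 -m zipfile -c /tmp/site.zip index.html
ls -l /tmp/site.zip
# Push it with the Kudu ZIP deploy API (placeholder app name and user):
# curl -X POST -u <deployment-user> --data-binary @/tmp/site.zip \
#   https://<app-name>.scm.azurewebsites.net/api/zipdeploy
```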
## Deploy ZIP file with Azure CLI
app-service App Service App Service Environment Create Ilb Ase Resourcemanager https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/environment/app-service-app-service-environment-create-ilb-ase-resourcemanager.md
To get started with App Service Environments, see [Introduction to App Service E
[!INCLUDE [app-service-web-try-app-service](../../../includes/app-service-web-try-app-service.md)]

<!-- LINKS -->
-[quickstartilbasecreate]: https://azure.microsoft.com/documentation/templates/201-web-app-ase-ilb-create/
+[quickstartilbasecreate]: https://azure.microsoft.com/resources/templates/web-app-ase-ilb-create/
[examplebase64encoding]: https://powershellscripts.blogspot.com/2007/02/base64-encode-file.html
-[configuringDefaultSSLCertificate]: https://azure.microsoft.com/documentation/templates/201-web-app-ase-ilb-configure-default-ssl/
+[configuringDefaultSSLCertificate]: https://azure.microsoft.com/resources/templates/web-app-ase-ilb-configure-default-ssl/
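The `[examplebase64encoding]` reference above covers producing the base64 string for the certificate parameter; a minimal equivalent with standard Linux tools (file names below are illustrative) looks like:

```shell
# Create a stand-in for the certificate file, then base64-encode it on one line.
printf 'dummy certificate bytes' > /tmp/cert.pfx
base64 -w0 /tmp/cert.pfx > /tmp/cert.b64
cat /tmp/cert.b64
```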
app-service Faq App Service Linux https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/faq-app-service-linux.md
- Title: Run built-in containers FAQ
-description: Find answers to the frequently asked questions about the built-in Linux containers in Azure App Service.
-keywords: azure app service, web app, faq, linux, oss, web app for containers, multi-container, multicontainer
-
-Previously updated : 10/30/2018
-
-
-# Azure App Service on Linux FAQ
-
-With the release of App Service on Linux, we're working on adding features and making improvements to our platform. This article provides answers to questions that our customers have been asking us recently.
-
-If you have a question, comment on this article.
-
-## Built-in images
-
-**I want to fork the built-in Docker containers that the platform provides. Where can I find those files?**
-
-You can find all Docker files on [GitHub](https://github.com/azure-app-service). You can find all Docker containers on [Docker Hub](https://hub.docker.com/u/appsvc/).
-
-<a id="startup-file"></a>
-
-**What are the expected values for the Startup File section when I configure the runtime stack?**
-
-| Stack | Expected Value |
-|--|-|
-| Java SE | the command to start your JAR app (for example, `java -jar /home/site/wwwroot/app.jar --server.port=80`) |
-| Tomcat | the location of a script to perform any necessary configurations (for example, `/home/site/deployments/tools/startup_script.sh`) |
-| Node.js | the PM2 configuration file or your script file |
-| .NET Core | the compiled DLL name as `dotnet <myapp>.dll` |
-| Ruby | the Ruby script that you want to initialize your app with |
-
-These commands or scripts are executed after the built-in Docker container is started, but before your application code is started.
-
-## Management
-
-**What happens when I press the restart button in the Azure portal?**
-
-This action is the same as a Docker restart.
-
-**Can I use Secure Shell (SSH) to connect to the app container virtual machine (VM)?**
-
-Yes, you can do that through the source control management (SCM) site.
-
-> [!NOTE]
-> You can also connect to the app container directly from your local development machine using SSH, SFTP, or Visual Studio Code (for live debugging Node.js apps). For more information, see [Remote debugging and SSH in App Service on Linux](https://azure.github.io/AppService/2018/05/07/New-SSH-Experience-and-Remote-Debugging-for-Linux-Web-Apps.html).
->
-
-**How can I create a Linux App Service plan through an SDK or an Azure Resource Manager template?**
-
-Set the **reserved** field of the app service to *true*.
-
-## Continuous integration and deployment
-
-**My web app still uses an old Docker container image after I've updated the image on Docker Hub. Do you support continuous integration and deployment of custom containers?**
-
-Yes. To set up continuous integration/deployment for Azure Container Registry or Docker Hub, follow [Continuous Deployment with Web App for Containers](./deploy-ci-cd-custom-container.md). For private registries, you can refresh the container by stopping and then starting your web app. Or you can change or add a dummy application setting to force a refresh of your container.
-
-**Do you support staging environments?**
-
-Yes.
-
-**Can I use *WebDeploy/MSDeploy* to deploy my web app?**
-
-Yes, you need to set an app setting called `WEBSITE_WEBDEPLOY_USE_SCM` to *false*.
-
-**Git deployment of my application fails when using Linux web app. How can I work around the issue?**
-
-If Git deployment fails to your Linux web app, choose one of the following options to deploy your application code:
-
-- Use the Continuous Delivery (Preview) feature: You can store your app's source code in an Azure DevOps Git repo or GitHub repo to use Azure Continuous Delivery. For more information, see [How to configure Continuous Delivery for Linux web app](https://blogs.msdn.microsoft.com/devops/2017/05/10/use-azure-portal-to-setup-continuous-delivery-for-web-app-on-linux/).
-
-- Use the [ZIP deploy API](https://github.com/projectkudu/kudu/wiki/Deploying-from-a-zip-file): To use this API, [SSH into your web app](configure-linux-open-ssh-session.md) and go to the folder where you want to deploy your code. Run the following code:
-
- ```bash
- curl -X POST -u <user> --data-binary @<zipfile> https://{your-sitename}.scm.azurewebsites.net/api/zipdeploy
- ```
-
- If you get an error that the `curl` command is not found, make sure you install curl by using `apt-get install curl` before you run the previous `curl` command.
-
-## Language support
-
-**I want to use web sockets in my Node.js application. Are there any special settings or configurations to set?**
-
-Yes, disable `perMessageDeflate` in your server-side Node.js code. For example, if you are using socket.io, use the following code:
-
-```nodejs
-const io = require('socket.io')(server,{
-  perMessageDeflate :false
-});
-```
-
-**Do you support uncompiled .NET Core apps?**
-
-Yes.
-
-**Do you support Composer as a dependency manager for PHP apps?**
-
-Yes, during a Git deployment, Kudu should detect that you're deploying a PHP application (thanks to the presence of a composer.lock file), and Kudu will then trigger a composer install.
-
-## Custom containers
-
-**I'm using my own custom container. I want the platform to mount an SMB share to the `/home/` directory.**
-
-If `WEBSITES_ENABLE_APP_SERVICE_STORAGE` setting is **unspecified** or set to *false*, the `/home/` directory **will not be shared** across scale instances, and files written **will not persist** across restarts. Explicitly setting `WEBSITES_ENABLE_APP_SERVICE_STORAGE` to *true* will enable the mount.
-
-**My custom container takes a long time to start, and the platform restarts the container before it finishes starting up.**
-
-You can configure the amount of time the platform will wait before it restarts your container. To do so, set the `WEBSITES_CONTAINER_START_TIME_LIMIT` app setting to the value you want. The default value is 230 seconds, and the maximum value is 1800 seconds.
-
-**What is the format for the private registry server URL?**
-
-Provide the full registry URL, including `http://` or `https://`.
-
-**What is the format for the image name in the private registry option?**
-
-Add the full image name, including the private registry URL (for example, myacr.azurecr.io/dotnet:latest). Image names that use a custom port [cannot be entered through the portal](https://feedback.azure.com/forums/169385-web-apps/suggestions/31304650). To set `docker-custom-image-name`, use the [`az` command-line tool](/cli/azure/webapp/config/container#az_webapp_config_container_set).
-
-**Can I expose more than one port on my custom container image?**
-
-We don't support exposing more than one port.
-
-**Can I bring my own storage?**
-
-Yes, [bring your own storage](./configure-connect-to-azure-storage.md) is in preview.
-
-**Why can't I browse my custom container's file system or running processes from the SCM site?**
-
-The SCM site runs in a separate container. You can't check the file system or running processes of the app container.
-
-**My custom container listens to a port other than port 80. How can I configure my app to route requests to that port?**
-
-We have automatic port detection. You can also specify an app setting called *WEBSITES_PORT* and give it the value of the expected port number. Previously, the platform used the *PORT* app setting. We are planning to deprecate this app setting and to use *WEBSITES_PORT* exclusively.
-
-**Do I need to implement HTTPS in my custom container?**
-
-No, the platform handles HTTPS termination at the shared front ends.
-
-**Do I need to use PORT variable in code for built-in containers?**
-
-No, the PORT variable is not necessary due to automatic port detection. If no port is detected, it defaults to 80.
-To manually configure a custom port, use the EXPOSE instruction in the Dockerfile and the WEBSITES_PORT app setting with the port value to bind on the container.
-
-**Do I need to use WEBSITES_PORT for custom containers?**
-
-Yes, this is required for custom containers. To manually configure a custom port, use the EXPOSE instruction in the Dockerfile and the app setting, WEBSITES_PORT, with a port value to bind on the container.
-
-**Can I use ASPNETCORE_URLS in the Docker image?**
-
-Yes, overwrite the environment variable before the .NET Core app starts.
-For example, in the init.sh script: `export ASPNETCORE_URLS={Your value}`
-
-## Multi-container with Docker Compose
-
-**How do I configure Azure Container Registry (ACR) to use with multi-container?**
-
-In order to use ACR with multi-container, **all container images** need to be hosted on the same ACR registry server. Once they are on the same registry server, you will need to create application settings and then update the Docker Compose configuration file to include the ACR image name.
-
-Create the following application settings:
-
-- DOCKER_REGISTRY_SERVER_USERNAME
-- DOCKER_REGISTRY_SERVER_URL (full URL, ex: `https://<server-name>.azurecr.io`)
-- DOCKER_REGISTRY_SERVER_PASSWORD (enable admin access in ACR settings)
-
-Within the configuration file, reference your ACR image like the following example:
-
-```yaml
-image: <server-name>.azurecr.io/<image-name>:<tag>
-```
-
-**How do I know which container is internet accessible?**
-
-- Only one container can be open for access
-- Only ports 80 and 8080 are accessible (exposed ports)
-
-Here are the rules for determining which container is accessible - in the order of precedence:
-
-- Application setting `WEBSITES_WEB_CONTAINER_NAME` set to the container name
-- The first container to define port 80 or 8080
-- If neither of the above is true, the first container defined in the file will be accessible (exposed)
-
-
-## Web Sockets
-
-Web Sockets are supported on Linux apps.
-
-> [!IMPORTANT]
-> Web Sockets are not currently supported for Linux apps on Free App Service Plans. We are working on removing this limitation and plan to support up to 5 web socket connections on Free App Service plans.
-
-## Pricing and SLA
-
-**What is the pricing, now that the service is generally available?**
-
-Pricing varies by SKU and region but you can see more details at our pricing page: [App Service Pricing](https://azure.microsoft.com/pricing/details/app-service/linux/).
-
-## Other questions
-
-**What does "Requested feature is not available in resource group" mean?**
-
-You may see this message when creating a web app by using Azure Resource Manager (ARM). Because of a current limitation, you cannot mix Windows and Linux apps in the same resource group and region.
-
-**What are the supported characters in application settings names?**
-
-You can use only letters (A-Z, a-z), numbers (0-9), and the underscore character (_) for application settings.
-
-**Where can I request new features?**
-
-You can submit your idea at the [Web Apps feedback forum](https://aka.ms/webapps-uservoice). Add "[Linux]" to the title of your idea.
-
-## Next steps
-
-- [What is Azure App Service on Linux?](overview.md#app-service-on-linux)
-- [Set up staging environments in Azure App Service](deploy-staging-slots.md)
-- [Continuous Deployment with Web App for Containers](./deploy-ci-cd-custom-container.md)
-- [Things You Should Know: Web Apps and Linux](https://techcommunity.microsoft.com/t5/apps-on-azure/things-you-should-know-web-apps-and-linux/ba-p/392472)
app-service Overview Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/overview-diagnostics.md
To access App Service diagnostics, navigate to your App Service web app or App S
For Azure Functions, navigate to your function app, and in the top navigation, click on **Platform features**, and select **Diagnose and solve problems** from the **Resource management** section.
-In the App Service diagnostics homepage, you can choose the category that best describes the issue with your app by using the keywords in each homepage tile. Also, this page is where you can find **Diagnostic Tools** for Windows apps. See [Diagnostic tools (only for Windows app)](#diagnostic-tools-only-for-windows-app).
+In the App Service diagnostics homepage, you can choose the category that best describes the issue with your app by using the keywords in each homepage tile. Also, this page is where you can find **Diagnostic Tools**. See [Diagnostic tools](#diagnostic-tools).
![Homepage](./media/app-service-diagnostics/app-service-diagnostics-homepage-1.png)
If an issue is detected with a specific problem category within the last 24 hour
![Application Insights and Troubleshooting and Next Steps](./media/app-service-diagnostics/troubleshooting-and-next-steps-8.png)
-## Diagnostic tools (only for Windows app)
+## Diagnostic tools
Diagnostic Tools include more advanced diagnostic tools that help you investigate application code issues, slowness, connection strings, and more, as well as proactive tools that help you mitigate issues with CPU usage, requests, and memory.
-### Proactive CPU monitoring
+### Proactive CPU monitoring (only for Windows app)
Proactive CPU monitoring provides an easy, proactive way to take action when your app or a child process of your app is consuming high CPU resources. You can set your own CPU threshold rules to temporarily mitigate a high CPU issue until the real cause for the unexpected issue is found. For more information, see [Mitigate your CPU problems before they happen](https://azure.github.io/AppService/2019/10/07/Mitigate-your-CPU-problems-before-they-even-happen.html).

![Proactive CPU monitoring](./media/app-service-diagnostics/proactive-cpu-monitoring-9.png)
-### Auto-healing and proactive auto-healing
+### Auto-healing
-Auto-healing is a mitigation action you can take when your app is having unexpected behavior. You can set your own rules based on request count, slow request, memory limit, and HTTP status code to trigger mitigation actions. Use the tool to temporarily mitigate an unexpected behavior until you find the root cause. For more information, see [Announcing the new auto healing experience in app service diagnostics](https://azure.github.io/AppService/2018/09/10/Announcing-the-New-Auto-Healing-Experience-in-App-Service-Diagnostics.html).
+Auto-healing is a mitigation action you can take when your app is having unexpected behavior. You can set your own rules based on request count, slow request, memory limit, and HTTP status code to trigger mitigation actions. Use the tool to temporarily mitigate an unexpected behavior until you find the root cause. The tool is currently available for Windows Web Apps, Linux Web Apps, and Linux Custom Containers. Supported conditions and mitigation vary depending on the type of the web app. For more information, see [Announcing the new auto healing experience in app service diagnostics](https://azure.github.io/AppService/2018/09/10/Announcing-the-New-Auto-Healing-Experience-in-App-Service-Diagnostics.html) and [Announcing Auto Heal for Linux](https://azure.github.io/AppService/2021/04/21/Announcing-Autoheal-for-Azure-App-Service-Linux.html).
![Auto-healing](./media/app-service-diagnostics/auto-healing-10.png)
+### Proactive auto-healing (only for Windows app)
+
+Like proactive CPU monitoring, proactive auto-healing is a turn-key solution to mitigating unexpected behavior of your app. Proactive auto-healing restarts your app when App Service determines that your app is in an unrecoverable state. For more information, see [Introducing Proactive Auto Heal](https://azure.github.io/AppService/2017/08/17/Introducing-Proactive-Auto-Heal.html).

## Navigator and change analysis (only for Windows app)
In a large team with continuous integration and where your app has many dependen
Change analysis for app changes can be accessed through tile shortcuts, **Application Changes** and **Application Crashes** in **Availability and Performance** so you can use it concurrently with other metrics. Before using the feature, you must first enable it. For more information, see [Announcing the new change analysis experience in App Service Diagnostics](https://azure.github.io/AppService/2019/05/07/Announcing-the-new-change-analysis-experience-in-App-Service-Diagnostics-Analysis.html).
-Post your questions or feedback at [UserVoice](https://feedback.azure.com/forums/169385-web-apps) by adding "[Diag]" in the title.
+Post your questions or feedback at [UserVoice](https://feedback.azure.com/forums/169385-web-apps) by adding "[Diag]" in the title.
app-service Samples Resource Manager Templates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/samples-resource-manager-templates.md
To learn about the JSON syntax and properties for App Services resources, see [M
| [App with custom deployment slots](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/webapp-custom-deployment-slots)| Deploys an App Service app with custom deployment slots/environments. |
| [App with Private Endpoint](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/private-endpoint-webapp)| Deploys an App Service app with a Private Endpoint. |
|**Configuring an app**| **Description** |
-| [App certificate from Key Vault](https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-certificate-from-key-vault)| Deploys an App Service app certificate from an Azure Key Vault secret and uses it for TLS/SSL binding. |
-| [App with a custom domain and SSL](https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-custom-domain-and-ssl)| Deploys an App Service app with a custom host name, and gets an app certificate from Key Vault for TLS/SSL binding. |
-| [App with a GoLang extension](https://github.com/Azure/azure-quickstart-templates/tree/master/101-webapp-with-golang)| Deploys an App Service app with the Golang site extension. You can then run web applications developed on Golang on Azure. |
+| [App certificate from Key Vault](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-certificate-from-key-vault)| Deploys an App Service app certificate from an Azure Key Vault secret and uses it for TLS/SSL binding. |
+| [App with a custom domain and SSL](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-custom-domain-and-ssl)| Deploys an App Service app with a custom host name, and gets an app certificate from Key Vault for TLS/SSL binding. |
+| [App with a GoLang extension](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/webapp-with-golang)| Deploys an App Service app with the Golang site extension. You can then run web applications developed on Golang on Azure. |
| [App with Java 8 and Tomcat 8](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-java-tomcat)| Deploys an App Service app with Java 8 and Tomcat 8 enabled. You can then run Java applications in Azure. |
| [App with regional VNet integration](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/app-service-regional-vnet-integration)| Deploys an App Service app with regional VNet integration enabled. |
|**Protecting an app**| **Description** |
To learn about the JSON syntax and properties for App Services resources, see [M
| [App with MySQL](https://github.com/Azure/azure-quickstart-templates/tree/master/101-webapp-managed-mysql)| Deploys an App Service app on Windows with Azure Database for MySQL. |
| [App with PostgreSQL](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/webapp-managed-postgresql)| Deploys an App Service app on Windows with Azure Database for PostgreSQL. |
| [App with a database in Azure SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-sql-database)| Deploys an App Service app and a database in Azure SQL Database at the Basic service level. |
-| [App with a Blob storage connection](https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-blob-connection)| Deploys an App Service app with an Azure Blob storage connection string. You can then use Blob storage from the app. |
-| [App with an Azure Cache for Redis](https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-with-redis-cache)| Deploys an App Service app with an Azure Cache for Redis. |
+| [App with a Blob storage connection](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-blob-connection)| Deploys an App Service app with an Azure Blob storage connection string. You can then use Blob storage from the app. |
+| [App with an Azure Cache for Redis](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-with-redis-cache)| Deploys an App Service app with an Azure Cache for Redis. |
| [App connected to a backend webapp](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/webapp-privateendpoint-vnet-injection)| Deploys two web apps (frontend and backend) securely connected together with VNet injection and Private Endpoint. |
|**App Service Environment**| **Description** |
| [Create an App Service environment v2](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-asev2-create) | Creates an App Service environment v2 in your virtual network. |
| [Create an App Service environment v2 with an ILB address](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-asev2-ilb-create) | Creates an App Service environment v2 in your virtual network with a private internal load balancer address. |
-| [Configure the default SSL certificate for an ILB App Service environment or an ILB App Service environment v2](https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-ase-ilb-configure-default-ssl) | Configures the default TLS/SSL certificate for an ILB App Service environment or an ILB App Service environment v2. |
+| [Configure the default SSL certificate for an ILB App Service environment or an ILB App Service environment v2](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-ase-ilb-configure-default-ssl) | Configures the default TLS/SSL certificate for an ILB App Service environment or an ILB App Service environment v2. |
| | |
app-service Scenario Secure App Access Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/scenario-secure-app-access-storage.md
Previously updated : 11/30/2020 Last updated : 06/16/2021 -+ #Customer intent: As an application developer, I want to learn how to access Azure Storage for an app by using managed identities.
You need to grant your web app access to the storage account before you can crea
# [Portal](#tab/azure-portal)
-In the [Azure portal](https://portal.azure.com), go into your storage account to grant your web app access. Select **Access control (IAM)** in the left pane, and then select **Role assignments**. You'll see a list of who has access to the storage account. Now you want to add a role assignment to a robot, the app service that needs access to the storage account. Select **Add** > **Add role assignment**.
+In the [Azure portal](https://portal.azure.com), go into your storage account to grant your web app access. Select **Access control (IAM)** in the left pane, and then select **Role assignments**. You'll see a list of who has access to the storage account. Now you want to add a role assignment to a robot, the app service that needs access to the storage account. Select **Add** > **Add role assignment** to open the **Add role assignment** page.
-In **Role**, select **Storage Blob Data Contributor** to give your web app access to read storage blobs. In **Assign access to**, select **App Service**. In **Subscription**, select your subscription. Then select the app service you want to provide access to. Select **Save**.
-
+Assign the **Storage Blob Data Contributor** role to the **App Service** at subscription scope. For detailed steps, see [Assign Azure roles using the Azure portal](/azure/role-based-access-control/role-assignments-portal).
Your web app now has access to your storage account.
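The same role assignment can also be scripted with Azure CLI. The sketch below only assembles and prints the command, since actually running `az role assignment create` requires an authenticated session, and the principal ID, subscription ID, and scope shown are placeholders:

```shell
# Assemble the role-assignment command; nothing is sent to Azure here.
ROLE="Storage Blob Data Contributor"
SCOPE="/subscriptions/<subscription-id>"
CMD="az role assignment create --assignee <principal-id> --role \"$ROLE\" --scope $SCOPE"
echo "$CMD"
```

In a real script you would replace the placeholders and execute the command directly instead of echoing it.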
application-gateway Ingress Controller Install New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/ingress-controller-install-new.md
Kubernetes. We will leverage it to install the `application-gateway-kubernetes-i
```

Values:
- - `verbosityLevel`: Sets the verbosity level of the AGIC logging infrastructure. See [Logging Levels](https://github.com/Azure/application-gateway-kubernetes-ingress/blob/463a87213bbc3106af6fce0f4023477216d2ad78/docs/troubleshooting.md#logging-levels) for possible values.
+ - `verbosityLevel`: Sets the verbosity level of the AGIC logging infrastructure. See [Logging Levels](https://github.com/Azure/application-gateway-kubernetes-ingress/blob/463a87213bbc3106af6fce0f4023477216d2ad78/docs/troubleshooting.yml#logging-levels) for possible values.
- `appgw.subscriptionId`: The Azure Subscription ID in which Application Gateway resides. Example: `a123b234-a3b4-557d-b2df-a0bc12de1234`
- `appgw.resourceGroup`: Name of the Azure Resource Group in which Application Gateway was created. Example: `app-gw-resource-group`
- `appgw.name`: Name of the Application Gateway. Example: `applicationgatewayd0f0`
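Putting those values together, a helm-config file might look like the following sketch. Every value below is a placeholder drawn from the examples above, and `verbosityLevel: 3` is only one of the documented logging levels:

```yaml
# Illustrative helm-config.yaml for the AGIC chart (placeholder values).
verbosityLevel: 3
appgw:
  subscriptionId: a123b234-a3b4-557d-b2df-a0bc12de1234
  resourceGroup: app-gw-resource-group
  name: applicationgatewayd0f0
```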
automation Automation Role Based Access Control https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/automation-role-based-access-control.md
As a result of this security risk, we recommend you don't use the Log Analytics
### Create using the Azure portal
-Perform the following steps to create the Azure Automation custom role in the Azure portal. If you would like to learn more, see [Azure custom roles](/role-based-access-control/custom-roles.md).
+Perform the following steps to create the Azure Automation custom role in the Azure portal. If you would like to learn more, see [Azure custom roles](./../role-based-access-control/custom-roles.md).
1. Copy and paste the following JSON syntax into a file. Save the file on your local machine or in an Azure storage account. In the JSON file, replace the value for the **assignableScopes** property with the subscription GUID.
Perform the following steps to create the Azure Automation custom role in the Az
} ```
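The JSON body itself is elided in this change view. As a hypothetical sketch only (not the article's actual role definition), an Azure custom role JSON file generally takes this shape, with the **assignableScopes** value replaced by your subscription GUID:

```json
{
  "Name": "Automation account Contributor (custom)",
  "IsCustom": true,
  "Description": "Allows access to manage Azure Automation and its resources.",
  "Actions": [
    "Microsoft.Automation/automationAccounts/*"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/subscriptions/<subscriptionGUID>"
  ]
}
```

The `Actions` entry shown is a placeholder; the article's JSON lists the specific operations the custom role grants.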
-1. Complete the remaining steps as outlined in [Create or update Azure custom roles using the Azure portal](/role-based-access-control/custom-roles-portal.md#start-from-json). For [Step 3:Basics](/role-based-access-control/custom-roles-portal.md#step-3-basics), note the following:
+1. Complete the remaining steps as outlined in [Create or update Azure custom roles using the Azure portal](./../role-based-access-control/custom-roles-portal.md#start-from-json). For [Step 3: Basics](/role-based-access-control/custom-roles-portal.md#step-3-basics), note the following:
- In the **Custom role name** field, enter **Automation account Contributor (custom)** or a name matching your naming standards.
- For **Baseline permissions**, select **Start from JSON**. Then select the custom JSON file you saved earlier.
Perform the following steps to create the Azure Automation custom role in the Az
### Create using PowerShell
-Perform the following steps to create the Azure Automation custom role with PowerShell. If you would like to learn more, see [Azure custom roles](/role-based-access-control/custom-roles.md).
+Perform the following steps to create the Azure Automation custom role with PowerShell. If you would like to learn more, see [Azure custom roles](./../role-based-access-control/custom-roles.md).
1. Copy and paste the following JSON syntax into a file. Save the file on your local machine or in an Azure storage account. In the JSON file, replace the value for the **AssignableScopes** property with the subscription GUID.
Perform the following steps to create the Azure Automation custom role with Powe
} ```
-1. Complete the remaining steps as outlined in [Create or update Azure custom roles using Azure PowerShell](/role-based-access-control/custom-roles-powershell.md#create-a-custom-role-with-json-template). It can take a few minutes for your custom role to appear everywhere.
+1. Complete the remaining steps as outlined in [Create or update Azure custom roles using Azure PowerShell](./../role-based-access-control/custom-roles-powershell.md#create-a-custom-role-with-json-template). It can take a few minutes for your custom role to appear everywhere.
## Update Management permissions
azure-arc Tutorial Gitops Ci Cd https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/tutorial-gitops-ci-cd.md
To avoid having to set an imagePullSecret for every Pod, consider adding the ima
| AZURE_VOTE_IMAGE_REPO | The full path to the Azure Vote App repo, for example azurearctest.azurecr.io/azvote |
| ENVIRONMENT_NAME | Dev |
| MANIFESTS_BRANCH | `master` |
-| MANIFESTS_REPO | The Git connection string for your GitOps repo |
+| MANIFESTS_FOLDER | `azure-vote-manifests` |
+| MANIFESTS_REPO | `azure-cicd-demo-gitops` |
| ORGANIZATION_NAME | Name of Azure DevOps organization |
| PROJECT_NAME | Name of GitOps project in Azure DevOps |
| REPO_URL | Full URL for GitOps repo |
azure-arc Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/overview.md
Today, Azure Arc allows you to manage the following resource types hosted outsid
* Servers - both physical and virtual machines running Windows or Linux.
* Kubernetes clusters - supporting multiple Kubernetes distributions.
-* Azure data services - Azure SQL Database and PostgreSQL Hyperscale services.
+* Azure data services - Azure SQL Managed Instance and PostgreSQL Hyperscale services.
+* SQL Server - enroll instances from any location.
## What does Azure Arc deliver?
In the current preview phase, Azure Arc enabled data services are offered at no
* To learn more about Arc enabled data services, see the following [overview](https://azure.microsoft.com/services/azure-arc/hybrid-data-services/)
-* Experience Arc enabled services from the [Jumpstart proof of concept](https://azurearcjumpstart.io/azure_arc_jumpstart/)
+* Experience Arc enabled services from the [Jumpstart proof of concept](https://azurearcjumpstart.io/azure_arc_jumpstart/)
azure-cache-for-redis Cache Aspnet Session State Provider https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-aspnet-session-state-provider.md
The NuGet package downloads and adds the required assembly references and adds t
The commented section provides an example of the attributes and sample settings for each attribute.
-Configure the attributes with the values from your cache in the Microsoft Azure portal, and configure the other values as desired. For instructions on accessing your cache properties, see [Configure Azure Cache for Redis settings](cache-configure.md#configure-azure-cache-for-redis-settings).
+Configure the attributes with the values on the left from your cache in the Microsoft Azure portal, and configure the other values as desired. For instructions on accessing your cache properties, see [Configure Azure Cache for Redis settings](cache-configure.md#configure-azure-cache-for-redis-settings).
* **host** – specify your cache endpoint.
* **port** – use either your non-TLS/SSL port or your TLS/SSL port, depending on the TLS settings.
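For illustration, the `sessionState` section that the NuGet package adds to `web.config` typically looks like the following (all attribute values below are placeholders; copy the real host, port, and access key from your cache's properties in the Azure portal):

```xml
<sessionState mode="Custom" customProvider="MySessionStateStore">
  <providers>
    <!-- Placeholder values; replace with your cache's endpoint, TLS port,
         and access key from the Azure portal. -->
    <add name="MySessionStateStore"
         type="Microsoft.Web.Redis.RedisSessionStateProvider"
         host="contoso.redis.cache.windows.net"
         port="6380"
         accessKey="..."
         ssl="true" />
  </providers>
</sessionState>
```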
For more information about session state and other best practices, see [Web Deve
## Next steps

Check out the [ASP.NET Output Cache Provider for Azure Cache for Redis](cache-aspnet-output-cache-provider.md).
azure-cache-for-redis Cache Event Grid Quickstart Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-event-grid-quickstart-portal.md
# Quickstart: Route Azure Cache for Redis events to web endpoint with the Azure portal
-Azure Event Grid is an eventing service for the cloud. In this quickstart, you'll use the Azure portal to create an Azure Cache for Redis instance, subscribe to events for that instance, trigger an event, and view the results. Typically, you send events to an endpoint that processes the event data and takes actions. However, to simplify this quickstart, you'll send events to a web app that will collect and display the messages.
+Azure Event Grid is an eventing service for the cloud. In this quickstart, you'll use the Azure portal to create an Azure Cache for Redis instance, subscribe to events for that instance, trigger an event, and view the results. Typically, you send events to an endpoint that processes the event data and takes actions. However, to simplify this quickstart, you'll send events to a web app that will collect and display the messages.
[!INCLUDE [quickstarts-free-trial-note.md](../../includes/quickstarts-free-trial-note.md)]
When you're finished, you'll see that the event data has been sent to the web ap
:::image type="content" source="media/cache-event-grid-portal/event-grid-scaling.png" alt-text="Azure Event Grid Viewer scaling in JSON format.":::
-## Create an Azure Cache for Redis cache instance
+## Create an Azure Cache for Redis cache instance
[!INCLUDE [redis-cache-create](../../includes/redis-cache-create.md)]
When you're finished, you'll see that the event data has been sent to the web ap
Before subscribing to the events for the cache instance, let's create the endpoint for the event message. Typically, the endpoint takes actions based on the event data. To simplify this quickstart, you'll deploy a [pre-built web app](https://github.com/Azure-Samples/azure-event-grid-viewer) that displays the event messages. The deployed solution includes an App Service plan, an App Service web app, and source code from GitHub.
-1. Select **Deploy to Azure** in GitHub README to deploy the solution to your subscription.
+1. Select **Deploy to Azure** in GitHub README to deploy the solution to your subscription.
:::image type="content" source="media/cache-event-grid-portal/deploy-to-azure.png" alt-text="Deploy to Azure button.":::
-2. On the **Custom deployment** page, do the following steps:
+1. On the **Custom deployment** page, do the following steps:
   1. For **Resource group**, select the resource group that you created when creating the cache instance. It will be easier for you to clean up after you are done with the tutorial by deleting the resource group.
   2. For **Site Name**, enter a name for the web app.
   3. For **Hosting plan name**, enter a name for the App Service plan to use for hosting the web app.
- 4. Select the check box for **I agree to the terms and conditions stated above**.
- 5. Select **Purchase**.
-
+ 4. Select the check box for **I agree to the terms and conditions stated above**.
+ 5. Select **Purchase**.
+
+ | Setting | Suggested value | Description |
+ | ------- | --------------- | ----------- |
- | **Subscription** | Drop down and select your subscription. | The subscription under which to create this web app. |
- | **Resource group** | Drop down and select a resource group, or select **Create new** and enter a new resource group name. | By putting all your app resources in one resource group, you can easily manage or delete them together. |
- | **Site Name** | Enter a name for your web app. | This value cannot be empty. |
- | **Hosting plan name** | Enter a name for the App Service plan to use for hosting the web app. | This value cannot be empty. |
+ | **Subscription** | Drop down and select your subscription. | The subscription under which to create this web app. |
+ | **Resource group** | Drop down and select a resource group, or select **Create new** and enter a new resource group name. | By putting all your app resources in one resource group, you can easily manage or delete them together. |
+ | **Site Name** | Enter a name for your web app. | This value cannot be empty. |
+ | **Hosting plan name** | Enter a name for the App Service plan to use for hosting the web app. | This value cannot be empty. |
-1. Select Alerts (bell icon) in the portal, and then select **Go to resource group**.
+1. Select Alerts (bell icon) in the portal, and then select **Go to resource group**.
:::image type="content" source="media/cache-event-grid-portal/deployment-notification.png" alt-text="Azure portal deployment notification.":::
-4. On the **Resource group** page, in the list of resources, select the web app that you created. You'll also see the App Service plan and the cache instance in this list.
+1. On the **Resource group** page, in the list of resources, select the web app that you created. You'll also see the App Service plan and the cache instance in this list.
-5. On the **App Service** page for your web app, select the URL to navigate to the web site. The URL should be in this format: `https://<your-site-name>.azurewebsites.net`.
+1. On the **App Service** page for your web app, select the URL to navigate to the web site. The URL should be in this format: `https://<your-site-name>.azurewebsites.net`.
-6. Confirm that you see the site but no events have been posted to it yet.
+1. Confirm that you see the site but no events have been posted to it yet.
:::image type="content" source="media/cache-event-grid-portal/blank-event-grid-viewer.png" alt-text="Empty Event Grid Viewer site.":::
Before subscribing to the events for the cache instance, let's create the endpoi
In this step, you'll subscribe to a topic to tell Event Grid which events you want to track, and where to send the events.
-1. In the portal, navigate to your cache instance that you created earlier.
-2. On the **Azure Cache for Redis** page, select **Events** on the left menu.
-3. Select **Web Hook**. You are sending events to your viewer app using a web hook for the endpoint.
+1. In the portal, navigate to your cache instance that you created earlier.
+1. On the **Azure Cache for Redis** page, select **Events** on the left menu.
+1. Select **Web Hook**. You are sending events to your viewer app using a web hook for the endpoint.
:::image type="content" source="media/cache-event-grid-portal/event-grid-web-hook.png" alt-text="Azure portal Events page.":::
-4. On the **Create Event Subscription** page, enter the following:
+1. On the **Create Event Subscription** page, enter the following:
| Setting | Suggested value | Description |
| ------- | --------------- | ----------- |
- | **Name** | Enter a name for the event subscription. | The value must be between 3 and 64 characters long. It can only contain letters, numbers, and dashes. |
- | **Event Types** | Drop down and select which event type(s) you want to get pushed to your destination. For this quickstart, we'll be scaling our cache instance. | Patching, scaling, import and export are the available options. |
- | **Endpoint Type** | Select **Web Hook**. | Event handler to receive your events. |
- | **Endpoint** | Select **Select an endpoint**, and enter the URL of your web app and add `api/updates` to the home page URL (for example: `https://cache.azurewebsites.net/api/updates`), and then select **Confirm Selection**. | This is the URL of your web app that you created earlier. |
+ | **Name** | Enter a name for the event subscription. | The value must be between 3 and 64 characters long. It can only contain letters, numbers, and dashes. |
+ | **Event Types** | Drop down and select which event type(s) you want to get pushed to your destination. For this quickstart, we'll be scaling our cache instance. | Patching, scaling, import and export are the available options. |
+ | **Endpoint Type** | Select **Web Hook**. | Event handler to receive your events. |
+ | **Endpoint** | Select **Select an endpoint**, and enter the URL of your web app and add `api/updates` to the home page URL (for example: `https://cache.azurewebsites.net/api/updates`), and then select **Confirm Selection**. | This is the URL of your web app that you created earlier. |
-5. Now, on the **Create Event Subscription** page, select **Create** to create the event subscription.
+1. Now, on the **Create Event Subscription** page, select **Create** to create the event subscription.
-6. View your web app again, and notice that a subscription validation event has been sent to it. Select the eye icon to expand the event data. Event Grid sends the validation event so the endpoint can verify that it wants to receive event data. The web app includes code to validate the subscription.
+1. View your web app again, and notice that a subscription validation event has been sent to it. Select the eye icon to expand the event data. Event Grid sends the validation event so the endpoint can verify that it wants to receive event data. The web app includes code to validate the subscription.
:::image type="content" source="media/cache-event-grid-portal/subscription-event.png" alt-text="Azure Event Grid Viewer.":::
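The viewer app's validation code isn't shown in this change view. As a rough sketch only (a hypothetical illustration, not the sample app's actual code), a webhook endpoint completes the handshake by echoing back the validation code from the `SubscriptionValidationEvent`:

```python
# Hypothetical sketch of Event Grid webhook validation (not the sample
# app's actual code). Event Grid POSTs a batch of events; a subscription
# validation event carries a validationCode that the endpoint must echo
# back in a validationResponse field to confirm it wants to receive events.

VALIDATION_EVENT = "Microsoft.EventGrid.SubscriptionValidationEvent"

def handle_events(events):
    """Handle a JSON-decoded batch of Event Grid events."""
    for event in events:
        if event.get("eventType") == VALIDATION_EVENT:
            # Echo the code back so Event Grid activates the subscription.
            return {"validationResponse": event["data"]["validationCode"]}
    # Ordinary events (for example, a scaling-completed event) land here.
    return {"received": len(events)}
```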
Now, let's trigger an event to see how Event Grid distributes the message to you
1. In the Azure portal, navigate to your Azure Cache for Redis instance and select **Scale** on the left menu.
-1. Select the desired pricing tier from the **Scale** page and select **Select**.
+1. Select the desired pricing tier from the **Scale** page and select **Select**.
You can scale to a different pricing tier with the following restrictions:
-
+ * You can't scale from a higher pricing tier to a lower pricing tier.
+ * You can't scale from a **Premium** cache down to a **Standard** or a **Basic** cache.
+ * You can't scale from a **Standard** cache down to a **Basic** cache.
+ * You can scale from a **Basic** cache to a **Standard** cache but you can't change the size at the same time. If you need a different size, you can do a subsequent scaling operation to the desired size.
+ * You can't scale from a **Basic** cache directly to a **Premium** cache. First, scale from **Basic** to **Standard** in one scaling operation, and then from **Standard** to **Premium** in a subsequent scaling operation.
+ * You can't scale from a larger size down to the **C0 (250 MB)** size.
-
- While the cache is scaling to the new pricing tier, a **Scaling** status is displayed on the left in **Azure Cache for Redis**. When scaling is complete, the status changes from **Scaling** to **Running**.
-1. You've triggered the event, and Event Grid sent the message to the endpoint you configured when subscribing. The message is in the JSON format and it contains an array with one or more events. In the following example, the JSON message contains an array with one event. View your web app and notice that a **ScalingCompleted** event was received.
+ While the cache is scaling to the new pricing tier, a **Scaling** status is displayed using **Azure Cache for Redis** on the left. When scaling is complete, the status changes from **Scaling** to **Running**.
+
+1. You've triggered the event, and Event Grid sent the message to the endpoint you configured when subscribing. The message is in the JSON format and it contains an array with one or more events. In the following example, the JSON message contains an array with one event. View your web app and notice that a **ScalingCompleted** event was received.
:::image type="content" source="media/cache-event-grid-portal/event-grid-scaling.png" alt-text="Azure Event Grid Viewer scaling in JSON format.":::
Select the resource group, and select **Delete resource group**.
Now that you know how to create custom topics and event subscriptions, learn more about what Event Grid can help you do:
-- [Reacting to Azure Cache for Redis events](cache-event-grid.md)
-- [About Event Grid](../event-grid/overview.md)
+* [Reacting to Azure Cache for Redis events](cache-event-grid.md)
+* [About Event Grid](../event-grid/overview.md)
azure-cache-for-redis Cache How To Geo Replication https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-how-to-geo-replication.md
To configure geo-replication between two caches, the following prerequisites mus
Some features aren't supported with geo-replication:
+- Zone Redundancy isn't supported with geo-replication.
- Persistence isn't supported with geo-replication.
- Clustering is supported if both caches have clustering enabled and have the same number of shards.
- Caches in the same VNET are supported.
Some features aren't supported with geo-replication:
After geo-replication is configured, the following restrictions apply to your linked cache pair:
-- The secondary linked cache is read-only; you can read from it, but you can't write any data to it. If you choose to read from the Geo-Secondary instance, it is important to note that whenever a full data sync is happening between the Geo-Primary and the Geo-Secondary (happens when either Geo-Primary or Geo-Secondary is updated and on some reboot scenarios as well), the Geo-Secondary instance will throw errors (stating that a full data sync is in progress) on any Redis operation against it until the full data sync between Geo-Primary and Geo-Secondary is complete. Applications reading from Geo-Secondary should be built to fall back to the Geo-Primary whenever the Geo-Secondary is throwing such errors.
+- The secondary linked cache is read-only; you can read from it, but you can't write any data to it. If you choose to read from the Geo-Secondary instance, it is important to note that whenever a full data sync is happening between the Geo-Primary and the Geo-Secondary (happens when either Geo-Primary or Geo-Secondary is updated and on some reboot scenarios as well), the Geo-Secondary instance will throw errors (stating that a full data sync is in progress) on any Redis operation against it until the full data sync between Geo-Primary and Geo-Secondary is complete. Applications reading from Geo-Secondary should be built to fall back to the Geo-Primary whenever the Geo-Secondary is throwing such errors.
- Any data that was in the secondary linked cache before the link was added is removed. If the geo-replication is later removed however, the replicated data remains in the secondary linked cache.
- You can't [scale](cache-how-to-scale.md) either cache while the caches are linked.
- You can't [change the number of shards](cache-how-to-premium-clustering.md) if the cache has clustering enabled.
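The fall-back guidance above can be sketched as follows (a minimal illustration; the two callables stand in for bound reads against real Redis clients and are assumptions, not a prescribed client API):

```python
# Minimal sketch of the read-fallback pattern described above. The
# read_secondary/read_primary callables represent GETs bound to the
# Geo-Secondary and Geo-Primary caches (for example via redis-py);
# they are illustrative assumptions, not a specific library's API.

def read_with_fallback(key, read_secondary, read_primary):
    """Try the read-only Geo-Secondary first; fall back to the
    Geo-Primary when the secondary errors (e.g. during a full data sync)."""
    try:
        return read_secondary(key)
    except Exception:
        # Secondary is mid-sync or unavailable; serve from the primary.
        return read_primary(key)
```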
After geo-replication is configured, the following restrictions apply to your li
## Add a geo-replication link
-1. To link two caches together for geo-replication, fist select **Geo-replication** from the Resource menu of the cache that you intend to be the primary linked cache. Next, select **Add cache replication link** from the **Geo-replication** on the left.
+1. To link two caches together for geo-replication, first click **Geo-replication** from the Resource menu of the cache that you intend to be the primary linked cache. Next, click **Add cache replication link** from **Geo-replication** on the left.
![Add link](./media/cache-how-to-geo-replication/cache-geo-location-menu.png)
-2. Select the name of your intended secondary cache from the **Compatible caches** list. If your secondary cache isn't displayed in the list, verify that the [Geo-replication prerequisites](#geo-replication-prerequisites) for the secondary cache are met. To filter the caches by region, select the region in the map to display only those caches in the **Compatible caches** list.
+1. Select the name of your intended secondary cache from the **Compatible caches** list. If your secondary cache isn't displayed in the list, verify that the [Geo-replication prerequisites](#geo-replication-prerequisites) for the secondary cache are met. To filter the caches by region, select the region in the map to display only those caches in the **Compatible caches** list.
![Geo-replication compatible caches](./media/cache-how-to-geo-replication/cache-geo-location-select-link.png)
-
+ You can also start the linking process or view details about the secondary cache by using the context menu.

 ![Geo-replication context menu](./media/cache-how-to-geo-replication/cache-geo-location-select-link-context-menu.png)
-3. Select **Link** to link the two caches together and begin the replication process.
+1. Select **Link** to link the two caches together and begin the replication process.
![Link caches](./media/cache-how-to-geo-replication/cache-geo-location-confirm-link.png)
-4. You can view the progress of the replication process on the **Geo-replication** on the left.
+1. You can view the progress of the replication process using **Geo-replication** on the left.
![Linking status](./media/cache-how-to-geo-replication/cache-geo-location-linking.png)
- You can also view the linking status using **Overview** on the left for both the primary and secondary caches.
+ You can also view the linking status on the left, using **Overview**, for both the primary and secondary caches.
![Screenshot that highlights how to view the linking status for the primary and secondary caches.](./media/cache-how-to-geo-replication/cache-geo-location-link-status.png)
After geo-replication is configured, the following restrictions apply to your li
## Remove a geo-replication link
-1. To remove the link between two caches and stop geo-replication, select **Unlink caches** from the **Geo-replication** on the left.
-
+1. To remove the link between two caches and stop geo-replication, click **Unlink caches** from the **Geo-replication** on the left.
+ ![Unlink caches](./media/cache-how-to-geo-replication/cache-geo-location-unlink.png)

 When the unlinking process completes, the secondary cache is available for both reads and writes.
Yes, you can configure a [firewall](./cache-configure.md#firewall) with geo-repl
Learn more about Azure Cache for Redis features.
-* [Azure Cache for Redis service tiers](cache-overview.md#service-tiers)
-* [High availability for Azure Cache for Redis](cache-high-availability.md)
+- [Azure Cache for Redis service tiers](cache-overview.md#service-tiers)
+- [High availability for Azure Cache for Redis](cache-high-availability.md)
azure-cache-for-redis Cache Redis Cache Arm Provision https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-redis-cache-arm-provision.md
Resource Manager templates for the new [Premium tier](cache-overview.md#service-
* [Create Premium Azure Cache for Redis with data persistence](https://azure.microsoft.com/resources/templates/redis-premium-persistence/)
* [Create Premium Redis Cache deployed into a Virtual Network](https://azure.microsoft.com/resources/templates/redis-premium-vnet/)
-To check for the latest templates, see [Azure Quickstart Templates](https://azure.microsoft.com/documentation/templates/) and search for _Azure Cache for Redis_.
+To check for the latest templates, see [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/) and search for _Azure Cache for Redis_.
## Deploy the template
azure-functions Functions Continuous Deployment https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-continuous-deployment.md
To configure continuous deployment for an existing function app, complete these
When the process is finished, all code from the specified source is deployed to your app. At that point, changes in the deployment source trigger a deployment of those changes to your function app in Azure.

> [!NOTE]
-> After you configure continuous integration, you can no longer edit your source files in the Functions portal.
+> After you configure continuous integration, you can no longer edit your source files in the Functions portal. If you originally published your code from your local computer, you may need to change the `WEBSITE_RUN_FROM_PACKAGE` setting in your function app to a value of `0`.
## Next steps
azure-monitor Agents Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/agents/agents-overview.md
Use the Azure Monitor agent if you need to:
- Send data to third-party tools using [Azure Event Hubs](./diagnostics-extension-stream-event-hubs.md).
- Manage the security of your machines using [Azure Security Center](../../security-center/security-center-introduction.md) or [Azure Sentinel](../../sentinel/overview.md). (Not available in preview.)
-See [current feature gaps](../faq.md#is-the-new-azure-monitor-agent-at-parity-with-existing-agents) when compared to existing agents.
+See [current feature gaps](/azure/azure-monitor/faq#is-the-new-azure-monitor-agent-at-parity-with-existing-agents) when compared to existing agents.
## Log Analytics agent
azure-monitor Azure Monitor Agent Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/agents/azure-monitor-agent-overview.md
The Azure Monitor agent (AMA) collects monitoring data from the guest operating system of Azure virtual machines and delivers it to Azure Monitor. This article provides an overview of the Azure Monitor agent, including how to install it and how to configure data collection.

## Relationship to other agents
-The Azure Monitor Agent replaces the following agents that are currently used by Azure Monitor to collect guest data from virtual machines ([view known gaps](../faq.md#is-the-new-azure-monitor-agent-at-parity-with-existing-agents)):
+The Azure Monitor Agent replaces the following agents that are currently used by Azure Monitor to collect guest data from virtual machines ([view known gaps](/azure/azure-monitor/faq#is-the-new-azure-monitor-agent-at-parity-with-existing-agents)):
- [Log Analytics agent](./log-analytics-agent.md) - Sends data to Log Analytics workspace and supports VM insights and monitoring solutions. - [Diagnostic extension](./diagnostics-extension-overview.md) - Sends data to Azure Monitor Metrics (Windows only), Azure Event Hubs, and Azure Storage.
Azure Monitor agent uses [Data Collection Rules (DCR)](data-collection-rule-over
Azure Monitor agent coexists with the [generally available agents for Azure Monitor](agents-overview.md), but you may consider transitioning your VMs off the current agents during the Azure Monitor agent public preview period. Consider the following factors when making this determination.

- **Environment requirements.** Azure Monitor agent supports [these operating systems](./agents-overview.md#supported-operating-systems) today; support for future environments, such as new operating system versions and types of networking requirements, will most likely be provided only in this new agent. You should assess whether your environment is supported by Azure Monitor agent. If not, then you may need to stay with the current agent. If Azure Monitor agent supports your current environment, then you should consider transitioning to it.
-- **Current and new feature requirements.** Azure Monitor agent introduces several new capabilities such as filtering, scoping, and multi-homing, but it isn't at parity yet with the current agents for other functionality such as custom log collection and integration with all solutions ([see solutions in preview](../faq.md#which-log-analytics-solutions-are-supported-on-the-new-azure-monitor-agent)). Most new capabilities in Azure Monitor will only be made available with Azure Monitor agent, so over time more functionality will only be available in the new agent. You should consider whether Azure Monitor agent has the features you require and if there are some features that you can temporarily do without to get other important features in the new agent. If Azure Monitor agent has all the core capabilities you require then consider transitioning to it. If there are critical features that you require then continue with the current agent until Azure Monitor agent reaches parity.
+- **Current and new feature requirements.** Azure Monitor agent introduces several new capabilities such as filtering, scoping, and multi-homing, but it isn't at parity yet with the current agents for other functionality such as custom log collection and integration with all solutions ([see solutions in preview](/azure/azure-monitor/faq#which-log-analytics-solutions-are-supported-on-the-new-azure-monitor-agent)). Most new capabilities in Azure Monitor will only be made available with Azure Monitor agent, so over time more functionality will only be available in the new agent. You should consider whether Azure Monitor agent has the features you require and if there are some features that you can temporarily do without to get other important features in the new agent. If Azure Monitor agent has all the core capabilities you require then consider transitioning to it. If there are critical features that you require then continue with the current agent until Azure Monitor agent reaches parity.
- **Tolerance for rework.** If you're setting up a new environment with resources such as deployment scripts and onboarding templates, assess the effort involved. If it will take a significant amount of work, then consider setting up your new environment with the new agent as it is now generally available.

A deprecation date was published for the Log Analytics agents in August 2021. The current agents will be supported for several years once deprecation begins.
azure-monitor Api Custom Events Metrics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/api-custom-events-metrics.md
To determine how long data is kept, see [Data retention and privacy](./data-rete
## <a name="next"></a>Next steps * [Search events and logs](./diagnostic-search.md)
-* [Troubleshooting](../faq.md)
+* [Troubleshooting](../faq.yml)
azure-monitor Api Filtering Sampling https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/api-filtering-sampling.md
What's the difference between telemetry processors and telemetry initializers?
## <a name="next"></a>Next steps * [Search events and logs](./diagnostic-search.md) * [sampling](./sampling.md)
-* [Troubleshooting](../faq.md)
+* [Troubleshooting](../faq.yml)
azure-monitor App Insights Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/app-insights-overview.md
Get started at development time with:
[knowUsers]: app-insights-web-track-usage.md
[platforms]: ./platforms.md
[portal]: https://portal.azure.com/
-[qna]: ../faq.md
+[qna]: ../faq.yml
[redfield]: ./monitor-performance-live-website-now.md
azure-monitor Asp Net Trace Logs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/asp-net-trace-logs.md
If your application sends voluminous amounts of data and you're using the Applic
[diagnostic]: ./diagnostic-search.md
[exceptions]: asp-net-exceptions.md
[portal]: https://portal.azure.com/
-[qna]: ../faq.md
+[qna]: ../faq.yml
[start]: ./app-insights-overview.md
azure-monitor Cloudservices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/cloudservices.md
Did you build for .NET 4.6? .NET 4.6 is not automatically supported in Azure clo
[diagnostic]: ./diagnostic-search.md
[netlogs]: ./asp-net-trace-logs.md
[portal]: https://portal.azure.com/
-[qna]: ../faq.md
+[qna]: ../faq.yml
[redfield]: ./monitor-performance-live-website-now.md
[start]: ./app-insights-overview.md
azure-monitor Diagnostic Search https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/diagnostic-search.md
We don't log the POST data automatically, but you can use [TrackTrace or log cal
* [Write complex queries in Analytics](../logs/log-analytics-tutorial.md)
* [Send logs and custom telemetry to Application Insights](./asp-net-trace-logs.md)
* [Set up availability and responsiveness tests](./monitor-web-app-availability.md)
-* [Troubleshooting](../faq.md)
+* [Troubleshooting](../faq.yml)
azure-monitor Monitor Performance Live Website Now https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/monitor-performance-live-website-now.md
Add more telemetry:
[client]: ./javascript.md
[diagnostic]: ./diagnostic-search.md
[greenbrown]: ./asp-net.md
-[qna]: ../faq.md
+[qna]: ../faq.yml
[roles]: ./resources-roles-access-control.md
[usage]: ./javascript.md
azure-monitor Nodejs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/nodejs.md
These properties are client specific, so you can configure `appInsights.defaultC
<!--references-->
[portal]: https://portal.azure.com/
-[FAQ]: ../faq.md
+[FAQ]: ../faq.yml
azure-monitor Sdk Connection String https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/sdk-connection-string.md
Customer scenarios where we expect this to have the most impact:
- Firewall exceptions or proxy redirects
- In cases where monitoring for intranet web server is required, our earlier solution asked customers to add individual service endpoints to your configuration. For more information, see [here](../faq.md#can-i-monitor-an-intranet-web-server).
+ In cases where monitoring for intranet web server is required, our earlier solution asked customers to add individual service endpoints to your configuration. For more information, see [here](../faq.yml#can-i-monitor-an-intranet-web-server-).
Connection strings offer a better alternative by reducing this effort to a single setting. Amending a simple prefix or suffix allows automatic population and redirection of all endpoints to the right services.

- Sovereign or Hybrid cloud environments
azure-monitor Status Monitor V2 Detailed Instructions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/status-monitor-v2-detailed-instructions.md
When you monitor a computer on your private intranet, you'll need to route HTTP
The PowerShell commands to download and install Az.ApplicationMonitor from the PowerShell Gallery support a `-Proxy` parameter. Review the preceding instructions when you write your installation scripts.
-The Application Insights SDK will need to send your app's telemetry to Microsoft. We recommend that you configure proxy settings for your app in your web.config file. For more information, see [Application Insights FAQ: Proxy passthrough](../faq.md#proxy-passthrough).
+The Application Insights SDK will need to send your app's telemetry to Microsoft. We recommend that you configure proxy settings for your app in your web.config file. For more information, see [Application Insights FAQ: Proxy passthrough](../faq.yml).
## Enable monitoring
azure-monitor Resource Logs Categories https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/essentials/resource-logs-categories.md
If you think something is missing, you can open a GitHub comment at the
|AuditEvent|Audit Logs|No|
-## Microsoft.Kusto/Clusters
-
-|Category|Category Display Name|Costs To Export|
-||||
-|Command|Command|No|
-|FailedIngestion|Failed ingest operations|No|
-|IngestionBatching|Ingestion batching|No|
-|Query|Query|No|
-|SucceededIngestion|Successful ingest operations|No|
-|TableDetails|Table details|No|
-|TableUsageStatistics|Table usage statistics|No|
--

## Microsoft.Logic/integrationAccounts

|Category|Category Display Name|Costs To Export|
azure-monitor Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/faq.md
- Title: Azure Monitor FAQ | Microsoft Docs
-description: Answers to frequently asked questions about Azure Monitor.
---- Previously updated : 10/08/2020---
-# Azure Monitor Frequently Asked Questions
-
-This Microsoft FAQ is a list of commonly asked questions about Azure Monitor. If you have any additional questions, go to the [discussion forum](/answers/questions/topics/single/24223.html) and post your questions. When a question is frequently asked, we add it to this article so that it can be found quickly and easily.
-
-## General
-
-### What is Azure Monitor?
-
-[Azure Monitor](overview.md) is a service in Azure that provides performance and availability monitoring for applications and services in Azure, other cloud environments, or on-premises. Azure Monitor collects data from multiple sources into a common data platform where it can be analyzed for trends and anomalies. Rich features in Azure Monitor assist you in quickly identifying and responding to critical situations that may affect your application.
-
-### What's the difference between Azure Monitor, Log Analytics, and Application Insights?
-
-In September 2018, Microsoft combined Azure Monitor, Log Analytics, and Application Insights into a single service to provide powerful end-to-end monitoring of your applications and the components they rely on. Features in Log Analytics and Application Insights have not changed, although some features have been rebranded to Azure Monitor in order to better reflect their new scope. The log data engine and query language of Log Analytics is now referred to as Azure Monitor Logs. See [Azure Monitor terminology updates](terminology.md).
-
-### What does Azure Monitor cost?
-
-Features of Azure Monitor that are automatically enabled such as collection of metrics and activity logs are provided at no cost. There is a cost associated with other features such as log queries and alerting. See the [Azure Monitor pricing page](https://azure.microsoft.com/pricing/details/monitor/) for detailed pricing information.
-
-### How do I enable Azure Monitor?
-
-Azure Monitor is enabled the moment that you create a new Azure subscription, and [Activity log](./essentials/platform-logs-overview.md) and platform [metrics](essentials/data-platform-metrics.md) are automatically collected. Create [diagnostic settings](essentials/diagnostic-settings.md) to collect more detailed information about the operation of your Azure resources, and add [monitoring solutions](insights/solutions.md) and [insights](./monitor-reference.md) to provide additional analysis on collected data for particular services.
-
-### How do I access Azure Monitor?
-
-Access all Azure Monitor features and data from the **Monitor** menu in the Azure portal. The **Monitoring** section of the menu for different Azure services provides access to the same tools with data filtered to a particular resource. Azure Monitor data is also accessible for a variety of scenarios using CLI, PowerShell, and a REST API.
-
-### Is there an on-premises version of Azure Monitor?
-
-No. Azure Monitor is a scalable cloud service that processes and stores large amounts of data, although Azure Monitor can monitor resources that are on-premises and in other clouds.
-
-### Can Azure Monitor monitor on-premises resources?
-
-Yes, in addition to collecting monitoring data from Azure resources, Azure Monitor can collect data from virtual machines and applications in other clouds and on-premises. See [Sources of monitoring data for Azure Monitor](agents/data-sources.md).
-
-### Does Azure Monitor integrate with System Center Operations Manager?
-
-You can connect your existing System Center Operations Manager management group to Azure Monitor to collect data from agents into Azure Monitor Logs. This allows you to use log queries and solutions to analyze data collected from agents. You can also configure existing System Center Operations Manager agents to send data directly to Azure Monitor. See [Connect Operations Manager to Azure Monitor](agents/om-agents.md).
-
-### What IP addresses does Azure Monitor use?
-
-See [IP addresses used by Application Insights and Log Analytics](app/ip-addresses.md) for a listing of the IP addresses and ports required for agents and other external resources to access Azure Monitor.
-
-## Monitoring data
-
-### Where does Azure Monitor get its data?
-
-Azure Monitor collects data from a variety of sources including logs and metrics from Azure platform and resources, custom applications, and agents running on virtual machines. Other services such as Azure Security Center and Network Watcher collect data into a Log Analytics workspace so it can be analyzed with Azure Monitor data. You can also send custom data to Azure Monitor using the REST API for logs or metrics. See [Sources of monitoring data for Azure Monitor](agents/data-sources.md).
-
-### What data is collected by Azure Monitor?
-
-Azure Monitor collects data from a variety of sources into [logs](logs/data-platform-logs.md) or [metrics](essentials/data-platform-metrics.md). Each type of data has its own relative advantages, and each supports a particular set of features in Azure Monitor. There is a single metrics database for each Azure subscription, while you can create multiple Log Analytics workspaces to collect logs depending on your requirements. See [Azure Monitor data platform](data-platform.md).
-
-### Is there a maximum amount of data that I can collect in Azure Monitor?
-
-There is no limit to the amount of metric data you can collect, but this data is stored for a maximum of 93 days. See [Retention of Metrics](essentials/data-platform-metrics.md#retention-of-metrics). There is no limit on the amount of log data that you can collect, but it may be affected by the pricing tier you choose for the Log Analytics workspace. See [pricing details](https://azure.microsoft.com/pricing/details/monitor/).
-
-### How do I access data collected by Azure Monitor?
-
-Insights and solutions provide a custom experience for working with data stored in Azure Monitor. You can work directly with log data using a log query written in Kusto Query Language (KQL). In the Azure portal, you can write and run queries and interactively analyze data using Log Analytics. Analyze metrics in the Azure portal with the Metrics Explorer. See [Analyze log data in Azure Monitor](logs/log-query-overview.md) and [Getting started with Azure Metrics Explorer](essentials/metrics-getting-started.md).
-
-## Solutions and insights
-
-### What is an insight in Azure Monitor?
-
-Insights provide a customized monitoring experience for particular Azure services. They use the same metrics and logs as other features in Azure Monitor but may collect additional data and provide a unique experience in the Azure portal. See [Insights in Azure Monitor](./monitor-reference.md).
-
-To view insights in the Azure portal, see the **Insights** section of the **Monitor** menu or the **Monitoring** section of the service's menu.
-
-### What is a solution in Azure Monitor?
-
-Monitoring solutions are packaged sets of logic for monitoring a particular application or service based on Azure Monitor features. They collect log data in Azure Monitor and provide log queries and views for their analysis using a common experience in the Azure portal. See [Monitoring solutions in Azure Monitor](insights/solutions.md).
-
-To view solutions in the Azure portal, click **More** in the **Insights** section of the **Monitor** menu. Click **Add** to add additional solutions to the workspace.
-
-## Logs
-
-### What's the difference between Azure Monitor Logs and Azure Data Explorer?
-
-Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. Azure Monitor Logs is built on top of Azure Data Explorer and uses the same Kusto Query Language (KQL) with some minor differences. See [Azure Monitor log query language differences](/azure/data-explorer/kusto/query/).
-
-### How do I retrieve log data?
-
-All data is retrieved from a Log Analytics workspace using a log query written using Kusto Query Language (KQL). You can write your own queries or use solutions and insights that include log queries for a particular application or service. See [Overview of log queries in Azure Monitor](logs/log-query-overview.md).
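-
-As a minimal sketch (the table and the one-hour window are illustrative examples, not recommendations), a query like the following lists which computers reported a heartbeat in the last hour:
-
-```kusto
-// Example: count heartbeat records per computer over the last hour
-Heartbeat
-| where TimeGenerated > ago(1h)
-| summarize HeartbeatCount = count() by Computer
-| sort by HeartbeatCount desc
-```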
-
-### Can I delete data from a Log Analytics workspace?
-
-Data is removed from a workspace according to its [retention period](logs/manage-cost-storage.md#change-the-data-retention-period). You can delete specific data for privacy or compliance reasons. See [How to export and delete private data](logs/personal-data-mgmt.md#how-to-export-and-delete-private-data) for more information.
-
-### Is Log Analytics storage immutable?
-
-Data in database storage cannot be altered once ingested but can be deleted via [*purge* API path for deleting private data](./logs/personal-data-mgmt.md#delete). Although data cannot be altered, some certifications require that data is kept immutable and cannot be changed or deleted in storage. Data immutability can be achieved using [data export](./logs/logs-data-export.md) to a storage account that is configured as [immutable storage](../storage/blobs/storage-blob-immutability-policies-manage.md).
-
-### What is a Log Analytics workspace?
-
-All log data collected by Azure Monitor is stored in a Log Analytics workspace. A workspace is essentially a container where log data is collected from a variety of sources. You may have a single Log Analytics workspace for all your monitoring data or may have requirements for multiple workspaces. See [Designing your Azure Monitor Logs deployment](logs/design-logs-deployment.md).
-
-### Can you move an existing Log Analytics workspace to another Azure subscription?
-
-You can move a workspace between resource groups or subscriptions but not to a different region. See [Move a Log Analytics workspace to different subscription or resource group](logs/move-workspace.md).
-
-### Why can't I see Query Explorer and Save buttons in Log Analytics?
-
-The **Query Explorer**, **Save**, and **New alert rule** buttons are not available when the [query scope](logs/scope.md) is set to a specific resource. To create alerts, or to save or load a query, Log Analytics must be scoped to a workspace. To open Log Analytics in workspace context, select **Logs** from the **Azure Monitor** menu. The last used workspace is selected, but you can select any other workspace. See [Log query scope and time range in Azure Monitor Log Analytics](logs/scope.md).
-
-### Why am I getting the error: "Register resource provider 'Microsoft.Insights' for this subscription to enable this query" when opening Log Analytics from a VM?
-
-Many resource providers are automatically registered, but you may need to manually register some resource providers. The scope for registration is always the subscription. See [Resource providers and types](../azure-resource-manager/management/resource-providers-and-types.md#azure-portal) for more information.
-
-### Why am I getting no access error message when opening Log Analytics from a VM?
-
-To view VM logs, you need read permission on the workspaces that store the VM logs. In these cases, your administrator must grant you these permissions in Azure.
-
-## Metrics
-
-### Why are metrics from the guest OS of my Azure virtual machine not showing up in Metrics explorer?
-
-[Platform metrics](essentials/monitor-azure-resource.md#monitoring-data) are collected automatically for Azure resources. You must perform some configuration though to collect metrics from the guest OS of a virtual machine. For a Windows VM, install the diagnostic extension and configure the Azure Monitor sink as described in [Install and configure Windows Azure diagnostics extension (WAD)](agents/diagnostics-extension-windows-install.md). For Linux, install the Telegraf agent as described in [Collect custom metrics for a Linux VM with the InfluxData Telegraf agent](essentials/collect-custom-metrics-linux-telegraf.md).
-
-## Alerts
-
-### What is an alert in Azure Monitor?
-
-Alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues before the users of your system notice them. There are multiple kinds of alerts:
-
-- Metric - Metric value exceeds a threshold.
-- Log query - Results of a log query match defined criteria.
-- Activity log - Activity log event matches defined criteria.
-- Web test - Results of availability test match defined criteria.
-
-See [Overview of alerts in Microsoft Azure](alerts/alerts-overview.md).
-
-### What is an action group?
-
-An action group is a collection of notifications and actions that can be triggered by an alert. Multiple alerts can use a single action group allowing you to leverage common sets of notifications and actions. See [Create and manage action groups in the Azure portal](alerts/action-groups.md).
-
-### What is an action rule?
-
-An action rule allows you to modify the behavior of a set of alerts that match certain criteria. For example, you can disable alert actions during a maintenance window. You can also apply an action group to a set of alerts rather than applying it directly to the alert rules. See [Action rules](alerts/alerts-action-rules.md).
-
-## Agents
-
-### Does Azure Monitor require an agent?
-
-An agent is only required to collect data from the operating system and workloads in virtual machines. The virtual machines can be located in Azure, another cloud environment, or on-premises. See [Overview of the Azure Monitor agents](agents/agents-overview.md).
-
-### What's the difference between the Azure Monitor agents?
-
-Azure Diagnostic extension is for Azure virtual machines and collects data to Azure Monitor Metrics, Azure Storage, and Azure Event Hubs. The Log Analytics agent is for virtual machines in Azure, another cloud environment, or on-premises, and collects data to Azure Monitor Logs. The Dependency agent requires the Log Analytics agent and collects process details and dependencies. See [Overview of the Azure Monitor agents](agents/agents-overview.md).
-
-### Does my agent traffic use my ExpressRoute connection?
-
-Traffic to Azure Monitor uses the Microsoft peering ExpressRoute circuit. See [ExpressRoute documentation](../expressroute/expressroute-faqs.md#supported-services) for a description of the different types of ExpressRoute traffic.
-
-### How can I confirm that the Log Analytics agent is able to communicate with Azure Monitor?
-
-From Control Panel on the agent computer, select **Security & Settings**, **Microsoft Monitoring Agent**. Under the **Azure Log Analytics (OMS)** tab, a green check mark icon confirms that the agent is able to communicate with Azure Monitor. A yellow warning icon means the agent is having issues. One common reason is the **Microsoft Monitoring Agent** service has stopped. Use service control manager to restart the service.
-
-### How do I stop the Log Analytics agent from communicating with Azure Monitor?
-
-For agents connected to Log Analytics directly, open the Control Panel and select **Security & Settings**, **Microsoft Monitoring Agent**. Under the **Azure Log Analytics (OMS)** tab, remove all workspaces listed. In System Center Operations Manager, remove the computer from the Log Analytics managed computers list. Operations Manager updates the configuration of the agent to no longer report to Log Analytics.
-
-### How much data is sent per agent?
-
-The amount of data sent per agent depends on:
-
-* The solutions you have enabled
-* The number of logs and performance counters being collected
-* The volume of data in the logs
-
-See [Manage usage and costs with Azure Monitor Logs](logs/manage-cost-storage.md) for details.
-
-For computers that are able to run the WireData agent, use the following query to see how much data is being sent:
-
-```kusto
-WireData
-| where ProcessName == "C:\\Program Files\\Microsoft Monitoring Agent\\Agent\\MonitoringHost.exe"
-| where Direction == "Outbound"
-| summarize sum(TotalBytes) by Computer
-```
-
-### How much network bandwidth is used by the Microsoft Management Agent (MMA) when sending data to Azure Monitor?
-
-Bandwidth is a function of the amount of data sent. Data is compressed as it is sent over the network.
-
-### How can I be notified when data collection from the Log Analytics agent stops?
-
-Use the steps described in [create a new log alert](alerts/alerts-metric.md) to be notified when data collection stops. Use the following settings for the alert rule:
-
-- **Define alert condition**: Specify your Log Analytics workspace as the resource target.
-- **Alert criteria**
- - **Signal Name**: *Custom log search*
- - **Search query**: `Heartbeat | summarize LastCall = max(TimeGenerated) by Computer | where LastCall < ago(15m)`
- - **Alert logic**: **Based on** *number of results*, **Condition** *Greater than*, **Threshold value** *0*
- - **Evaluated based on**: **Period (in minutes)** *30*, **Frequency (in minutes)** *10*
-- **Define alert details**
- - **Name**: *Data collection stopped*
- - **Severity**: *Warning*
-
-Specify an existing or new [Action Group](alerts/action-groups.md) so that when the log alert matches criteria, you are notified if you have a heartbeat missing for more than 15 minutes.
-
-### What are the firewall requirements for Azure Monitor agents?
-
-See [Network firewall requirements](agents/log-analytics-agent.md#network-requirements) for details on firewall requirements.
--
-## Azure Monitor Agent (preview)
-
-### What is the upgrade path from Log Analytics agents to Azure Monitor Agent? How do we migrate?
-There's currently no "auto-update" path, since this involves completely uninstalling the existing agents on your machines and installing the new agent. As such, we expect customers to plan for migration at the most suitable time based on feature parity.
-The more important reason is that with the new agent, data collection setup is much simpler, centralized, and configurable for subsets of machines connected to the same or multiple destinations **using the power of Data Collection Rules, which overcome a lot of limitations of the existing Log Analytics agent setup**.
-
-**As such, we recommend looking at your data collection needs holistically, and creating the right DCRs first, and then performing the agent update to go along with this new setup**. To do so, you can follow the guidance on how to deploy at scale.
--
-### What's the upgrade path from Log Analytics Agent (MMA) to Azure Monitor Agent (AMA) for monitoring SCOM? Can we use AMA for SCOM scenarios?
-Here's how AMA impacts the two SCOM-related monitoring scenarios:
-
-- **Scenario 1**: For monitoring the Windows operating system of SCOM, the upgrade path is the same as for any other machine: you can migrate from MMA (versions 2016, 2019) to AMA as soon as your required parity features are available on AMA.
-- **Scenario 2**: For onboarding/connecting SCOM to Log Analytics workspaces, since this is enabled via a SCOM connector for Log Analytics/Azure Monitor, neither MMA nor AMA needs to be installed on the SCOM management server. As such, there is no impact to this use case from an AMA perspective.
-
-> [!NOTE]
-> You can run both scenarios above with MMA and AMA side-by-side without any impact.
--
-### Will the new Azure Monitor agent support data collection for the various Log Analytics solutions?
-What you're familiar with as solutions or management packs now become VM extensions that use the Azure Monitor Agent extension to send data to Azure Monitor. When you enable a solution on AMA, you therefore see an additional VM extension installed, when applicable.
-Over time, AMA will be the only agent that uploads data to Azure Monitor (or additional supported destinations).
-The solution specific VM extensions exist to collect scenario specific data or perform transformation/processing as required, and then use AMA to route the final data to Azure Monitor.
-
-Here's a diagram explaining the **new extensibility architecture**:
-
-![Extensions architecture](agents/media/azure-monitor-agent/extensibility-arch-new.png)
--
-### Which Log Analytics solutions are supported on the new Azure Monitor Agent?
-Log Analytics solutions can be enabled using the new Azure Monitor Agent either as natively supported or by installing the additional VM extension for the solution.
-
-| **Solution (VM extension)** | **Availability on Azure Monitor Agent (AMA)** |
-|:|:|
-| **Azure Security Center** | Private preview on AMA |
-| **Sentinel** | Private preview on AMA |
-| **Change Tracking** | Supported as File Integrity Monitoring (FIM) in ASC private preview on AMA |
| **Update Management** | You can use the Update Management v2 (private preview) that doesn't need any agent |
-| **VM Insights with metrics support** | Private preview on AMA |
-| **VM Insights guest health (new)** | Public preview: [VM insights guest health (preview)](vm/vminsights-health-overview.md) |
-| **SQL Monitoring (new)** | Public preview exclusively on AMA: [SQL insights (preview)](insights/sql-insights-overview.md) |
--
-### How can I collect Windows security events using the new Azure Monitor Agent?
-There are two ways you can collect Security events using the new agent when sending to Log Analytics workspace(s):
-- You can use AMA to natively collect Security Events, the same as other Windows Events. These flow to the ['Event'](/azure/azure-monitor/reference/tables/Event) table in your Log Analytics workspace.
-- If you have Sentinel enabled on the workspace, the Security Events flow via AMA into the ['SecurityEvent'](/azure/azure-monitor/reference/tables/SecurityEvent) table instead (same as using the Log Analytics Agent). This always requires the solution to be enabled first.
--
-### Can the new Azure Monitor Agent and Log Analytics Agent co-exist side-by-side?
-Yes, they can, but with certain considerations. Read more [here](agents/azure-monitor-agent-overview.md#coexistence-with-other-agents).
-
-### Is the new Azure Monitor Agent at parity with existing agents?
-It does not have full parity yet with existing agents. Here are some high-level gaps:
-
-- **Comparison with Log Analytics Agents (MMA/OMS)**
- - Not all Log Analytics solutions are supported today. See the table above.
- - No support for Private Links
- - No support for proxy servers or Log Analytics (OMS) gateway
- - No support for collecting custom logs or IIS logs
- - No support for Hybrid Runbook workers
-
-- **Comparison with Azure Diagnostic Extensions (WAD/LAD)**
- - No support for Event Hubs and Storage accounts as destinations
--
-### Does the new Azure Monitor Agent support non-Azure environments (other clouds, on-premises)?
-Both on-premises machines and machines connected to other clouds are currently supported for servers, once you have the Azure Arc agent installed. For purposes of running AMA and DCR, the Arc requirement comes at **no additional cost or resource consumption**, since the Arc agent is only used as an installation mechanism and doesn't perform any operations unless you enable them.
--
-### What types of machines does the new Azure Monitor Agent support?
-You can directly install it on Virtual Machines, Virtual Machine Scale Sets, and Arc-enabled Servers only.
--
-### Can we filter events using event ID (that is, more granular event filtering) using the new Azure Monitor Agent?
-Yes. You can use **XPath queries** for filtering events on Windows machines. [Learn more](agents/data-collection-rule-azure-monitor-agent.md#limit-data-collection-with-custom-xpath-queries)
-For performance counters, you can specify the specific counters you wish to collect and exclude the ones you don't need.
-For syslog on Linux, you can choose Facilities and log level for each facility to collect.
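-
-As an illustration (the channels and event IDs below are arbitrary examples), XPath filters in a data collection rule take the form `Channel!XPathQuery`:
-
-```
-Security!*[System[(EventID=4624 or EventID=4625)]]
-Application!*[System[(Level=1 or Level=2)]]
-```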
--
-### Does the new Azure Monitor agent support sending data to EventHubs and Azure Storage Accounts?
-Not yet, but the new agent along with Data Collection Rules will support sending data to both Event Hubs as well as Azure Storage accounts in the future. Watch out for announcements in Azure Updates or join the [Teams channel](https://teams.microsoft.com/l/team/19%3af3f168b782f64561b52abe75e59e83bc%40thread.tacv2/conversations?groupId=770d6aa5-c2f7-4794-98a0-84fd6ae7f193&tenantId=72f988bf-86f1-41af-91ab-2d7cd011db47) for frequent updates, support, news and more!
-
-### Does the new Azure Monitor agent have hardening support for Linux?
-This is not available currently for the agent in preview, and is planned to be added post GA.
--
-## Visualizations
-
-### Why can't I see View Designer?
-
-View Designer is only available for users assigned with Contributor permissions or higher in the Log Analytics workspace.
-
-## Application Insights
-
-### Configuration problems
-
-*I'm having trouble setting up my:*
-
-* [.NET app](app/asp-net-troubleshoot-no-data.md)
-* [Monitoring an already-running app](app/monitor-performance-live-website-now.md#troubleshoot)
-* [Azure diagnostics](agents/diagnostics-extension-to-application-insights.md)
-* [Java web app](app/java-2x-troubleshoot.md)
-
-*I get no data from my server:*
-
-* [Set firewall exceptions](app/ip-addresses.md)
-* [Set up an ASP.NET server](app/monitor-performance-live-website-now.md)
-* [Set up a Java server](app/java-2x-agent.md)
-
-*How many Application Insights resources should I deploy:*
-
-* [How to design your Application Insights deployment: One versus many Application Insights resources?](app/separate-resources.md)
-
-### Can I use Application Insights with ...?
-
-* [Web apps on an IIS server in Azure VM or Azure virtual machine scale set](app/azure-vm-vmss-apps.md)
-* [Web apps on an IIS server - on-premises or in a VM](app/asp-net.md)
-* [Java web apps](app/java-in-process-agent.md)
-* [Node.js apps](app/nodejs.md)
-* [Web apps on Azure](app/azure-web-apps.md)
-* [Cloud Services on Azure](app/cloudservices.md)
-* [App servers running in Docker](./azure-monitor-app-hub.yml)
-* [Single-page web apps](app/javascript.md)
-* [SharePoint](app/sharepoint.md)
-* [Windows desktop app](app/windows-desktop.md)
-* [Other platforms](app/platforms.md)
-
-### Is it free?
-
-Yes, for experimental use. In the basic pricing plan, your application can send a certain allowance of data each month free of charge. The free allowance is large enough to cover development and publishing an app for a small number of users. You can set a cap to prevent more than a specified amount of data from being processed.
-
-Larger volumes of telemetry are charged by the GB. We provide some tips on how to [limit your charges](app/pricing.md).
-
-The Enterprise plan incurs a charge for each day that each web server node sends telemetry. It is suitable if you want to use Continuous Export on a large scale.
-
-[Read the pricing plan](https://azure.microsoft.com/pricing/details/application-insights/).
-
-### How much does it cost?
-
-* Open the **Usage and estimated costs page** in an Application Insights resource. There's a chart of recent usage. You can set a data volume cap, if you want.
-* Open the [Azure Billing blade](https://portal.azure.com/#blade/Microsoft_Azure_Billing/BillingBlade/Overview) to see your bills across all resources.
-
-### <a name="q14"></a>What does Application Insights modify in my project?
-
-The details depend on the type of project. For a web application:
-
-* Adds these files to your project:
- * ApplicationInsights.config
- * ai.js
-* Installs these NuGet packages:
- * *Application Insights API* - the core API
- * *Application Insights API for Web Applications* - used to send telemetry from the server
- * *Application Insights API for JavaScript Applications* - used to send telemetry from the client
-* The packages include these assemblies:
- * Microsoft.ApplicationInsights
- * Microsoft.ApplicationInsights.Platform
-* Inserts items into:
- * Web.config
- * packages.config
-* (New projects only - if you [add Application Insights to an existing project][start], you have to do this manually.) Inserts snippets into the client and server code to initialize them with the Application Insights resource ID. For example, in an MVC app, code is inserted into the master page Views/Shared/\_Layout.cshtml
-
-### How do I upgrade from older SDK versions?
-
-See the [release notes](app/release-notes.md) for the SDK appropriate to your type of application.
-
-### <a name="update"></a>How can I change which Azure resource my project sends data to?
-
-In Solution Explorer, right-click `ApplicationInsights.config` and choose **Update Application Insights**. You can send the data to an existing or new resource in Azure. The update wizard changes the instrumentation key in ApplicationInsights.config, which determines where the server SDK sends your data. Unless you deselect "Update all," it will also change the key where it appears in your web pages.
-
-### Do new Azure regions require the use of connection strings?
-
-New Azure regions **require** the use of connection strings instead of instrumentation keys. A [connection string](./app/sdk-connection-string.md) identifies the resource that you want to associate your telemetry data with. It also allows you to modify the endpoints your resource will use as a destination for your telemetry. You will need to copy the connection string and add it to your application's code or to an environment variable.
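To make the difference concrete, a connection string is a set of semicolon-separated `Key=Value` pairs. The following sketch (not part of any SDK; the `parseConnectionString` helper and sample value are hypothetical illustrations) shows how such a string splits into an instrumentation key plus endpoint information:

```javascript
// Hypothetical illustration: split an Application Insights connection string
// into its semicolon-separated "Key=Value" parts.
function parseConnectionString(connectionString) {
  const parts = {};
  for (const pair of connectionString.split(";")) {
    if (!pair) continue;
    const idx = pair.indexOf("=");
    parts[pair.slice(0, idx)] = pair.slice(idx + 1);
  }
  return parts;
}

// Sample value with a placeholder key; real applications typically supply this
// via configuration or an environment variable.
const sample =
  "InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://example.invalid/";
const parsed = parseConnectionString(sample);
console.log(parsed.IngestionEndpoint);
```

Unlike a bare instrumentation key, the parsed result also carries the ingestion endpoint, which is why new regions can be addressed without hard-coded endpoint lists.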
-
-### Should I use connection strings or instrumentation keys?
-
-[Connection Strings](./app/sdk-connection-string.md) are recommended over instrumentation keys.
-
-### Can I use `providers('Microsoft.Insights', 'components').apiVersions[0]` in my Azure Resource Manager deployments?
-
-We do not recommend using this method of populating the API version. The newest version can represent preview releases, which may contain breaking changes. Even with newer non-preview releases, the API versions are not always backward compatible with existing templates, and in some cases the API version may not be available to all subscriptions.
-
-### What is Status Monitor?
-
-A desktop app that you can run on your IIS web server to help configure Application Insights in web apps. It doesn't collect telemetry; you can stop it when you are not configuring an app.
-
-[Learn more](app/monitor-performance-live-website-now.md#questions).
-
-### What telemetry is collected by Application Insights?
-
-From server web apps:
-
-* HTTP requests
-* [Dependencies](app/asp-net-dependencies.md) - calls to SQL databases, HTTP calls to external services, and calls to Azure Cosmos DB, Table storage, Blob storage, and queues.
-* [Exceptions](app/asp-net-exceptions.md) and stack traces.
-* [Performance Counters](app/performance-counters.md) - If you use [Status Monitor](app/monitor-performance-live-website-now.md), [Azure monitoring for App Services](app/azure-web-apps.md), [Azure monitoring for VM or virtual machine scale set](app/azure-vm-vmss-apps.md), or the [Application Insights collectd writer](app/java-2x-collectd.md).
-* [Custom events and metrics](app/api-custom-events-metrics.md) that you code.
-* [Trace logs](app/asp-net-trace-logs.md) if you configure the appropriate collector.
-
-From [client web pages](app/javascript.md):
-
-* [Page view counts](app/usage-overview.md)
-* [AJAX calls](app/asp-net-dependencies.md) - requests made from a running script
-* Page view load data
-* User and session counts
-* [Authenticated user IDs](app/api-custom-events-metrics.md#authenticated-users)
-
-From other sources, if you configure them:
-
-* [Azure diagnostics](agents/diagnostics-extension-to-application-insights.md)
-* [Import to Analytics](logs/data-collector-api.md)
-* [Log Analytics](logs/data-collector-api.md)
-* [Logstash](logs/data-collector-api.md)
-
-### Can I filter out or modify some telemetry?
-
-Yes, in the server you can write:
-
-* Telemetry Processor to filter or add properties to selected telemetry items before they are sent from your app.
-* Telemetry Initializer to add properties to all items of telemetry.
-
-Learn more for [ASP.NET](app/api-filtering-sampling.md) or [Java](app/java-2x-filter-telemetry.md).
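The two concepts can be sketched outside any SDK. The snippet below is a conceptual illustration only, not the actual telemetry processor or initializer interfaces; the item shape, the 5 ms cutoff, and the `cloudRole` property are all invented for the example:

```javascript
// Conceptual sketch: a "processor" can drop items before they are sent,
// while an "initializer" stamps a property onto every item.
const dropFastDependencies = item =>
  !(item.type === "dependency" && item.durationMs < 5); // filter out noise

const addRoleInitializer = item => ({
  ...item,
  properties: { ...item.properties, cloudRole: "frontend" },
});

function processTelemetry(items) {
  return items.filter(dropFastDependencies).map(addRoleInitializer);
}

const out = processTelemetry([
  { type: "dependency", durationMs: 2, properties: {} },
  { type: "request", durationMs: 120, properties: {} },
]);
// The fast dependency call is filtered out; the surviving request gains cloudRole.
```

The real SDKs apply the same two-stage idea inside their pipelines before transmission.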
-
-### How are city, country/region, and other geo location data calculated?
-
-We look up the IP address (IPv4 or IPv6) of the web client using [GeoLite2](https://dev.maxmind.com/geoip/geoip2/geolite2/).
-
-* Browser telemetry: We collect the sender's IP address.
-* Server telemetry: The Application Insights module collects the client IP address. It is not collected if `X-Forwarded-For` is set.
-* To learn more about how IP address and geolocation data are collected in Application Insights refer to this [article](./app/ip-collection.md).
-
-You can configure the `ClientIpHeaderTelemetryInitializer` to take the IP address from a different header. In some systems, for example, it is moved by a proxy, load balancer, or CDN to `X-Originating-IP`. [Learn more](https://apmtips.com/posts/2016-07-05-client-ip-address/).
-
-You can [use Power BI](app/export-power-bi.md ) to display your request telemetry on a map.
-
-### <a name="data"></a>How long is data retained in the portal? Is it secure?
-
-Take a look at [Data Retention and Privacy][data].
-
-### What happens to Application Insight's telemetry when a server or device loses connection with Azure?
-
-All of our SDKs, including the web SDK, include "reliable transport" or "robust transport". When the server or device loses connection with Azure, telemetry is [stored locally on the file system](./app/data-retention-privacy.md#does-the-sdk-create-temporary-local-storage) (server SDKs) or in HTML5 session storage (web SDK). The SDK periodically retries sending this telemetry until our ingestion service considers it "stale" (48 hours for logs, 30 minutes for metrics). Stale telemetry is dropped. In some cases, such as when local storage is full, the retry will not occur.
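The staleness rule above can be expressed as a small check. This is an illustrative sketch of the documented thresholds, not actual SDK or service code:

```javascript
// Staleness thresholds as described above: 48 hours for logs, 30 minutes
// for metrics. Items older than the threshold would be dropped on retry.
const STALE_AFTER_MS = {
  log: 48 * 60 * 60 * 1000,
  metric: 30 * 60 * 1000,
};

function isStale(item, nowMs) {
  return nowMs - item.createdMs > STALE_AFTER_MS[item.kind];
}

const now = Date.now();
console.log(isStale({ kind: "metric", createdMs: now - 31 * 60 * 1000 }, now)); // true
console.log(isStale({ kind: "log", createdMs: now - 60 * 60 * 1000 }, now));    // false
```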
-
-### Could personal data be sent in the telemetry?
-
-This is possible if your code sends such data. It can also happen if variables in stack traces include personal data. Your development team should conduct risk assessments to ensure that personal data is properly handled. [Learn more about data retention and privacy](app/data-retention-privacy.md).
-
-**All** octets of the client web address are always set to 0 after the geo location attributes are looked up.
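As an illustration of what zeroing all octets means for an IPv4 address (a sketch of the behavior described above, not the actual ingestion-service code):

```javascript
// Anonymize an IPv4 address by zeroing every octet, as the ingestion
// service does after the geolocation lookup.
function scrubIPv4(ip) {
  return ip.split(".").map(() => "0").join(".");
}

console.log(scrubIPv4("203.0.113.42")); // "0.0.0.0"
```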
-
-The [Application Insights JavaScript SDK](app/javascript.md) does not include any personal data in its auto-collection by default. However, some personal data used in your application may be picked up by the SDK (for example, full names in `window.title` or account IDs in XHR URL query parameters). For custom personal data masking, add a [telemetry initializer](app/api-filtering-sampling.md#javascript-web-applications).
-
-### My Instrumentation Key is visible in my web page source.
-
-* This is common practice in monitoring solutions.
-* It can't be used to steal your data.
-* It could be used to skew your data or trigger alerts.
-* We have not heard that any customer has had such problems.
-
-You could:
-
-* Use two separate Instrumentation Keys (separate Application Insights resources), for client and server data. Or
-* Write a proxy that runs in your server, and have the web client send data through that proxy.
-
-### <a name="post"></a>How do I see POST data in Diagnostic search?
-
-We don't log POST data automatically, but you can use a TrackTrace call: put the data in the message parameter. This has a longer size limit than the limits on string properties, though you can't filter on it.
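A rough sketch of that workaround follows. The `trackTrace` function here is a stub standing in for the real SDK call, and the 8 KB truncation limit is an arbitrary choice for illustration, not a documented limit:

```javascript
// Stub standing in for the SDK's TrackTrace call - illustration only.
const sentTraces = [];
const trackTrace = message => sentTraces.push({ message });

// Put the POST body in the trace message, truncated to a chosen limit
// (8192 bytes here is an assumed, illustrative cutoff).
function logPostBody(body, limit = 8192) {
  trackTrace(body.length > limit ? body.slice(0, limit) : body);
}

logPostBody('{"user":"alice","action":"checkout"}');
```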
-
-### Should I use single or multiple Application Insights resources?
-
-Use a single resource for all the components or roles in a single business system. Use separate resources for development, test, and release versions, and for independent applications.
-
-* [See the discussion here](app/separate-resources.md)
-* [Example - cloud service with worker and web roles](app/cloudservices.md)
-
-### How do I dynamically change the instrumentation key?
-
-* [Discussion here](app/separate-resources.md)
-* [Example - cloud service with worker and web roles](app/cloudservices.md)
-
-### What are the User and Session counts?
-
-* The JavaScript SDK sets a user cookie on the web client, to identify returning users, and a session cookie to group activities.
-* If there is no client-side script, you can [set cookies at the server](https://apmtips.com/posts/2016-07-09-tracking-users-in-api-apps/).
-* If one real user uses your site in different browsers, in in-private/incognito browsing, or on different machines, they will be counted more than once.
-* To identify a logged-in user across machines and browsers, add a call to [setAuthenticatedUserContext()](app/api-custom-events-metrics.md#authenticated-users).
-
-### How does Application Insights generate device information (Browser, OS, Language, Model)?
-
-The browser passes the User Agent string in the HTTP header of the request, and the Application Insights ingestion service uses [UA Parser](https://github.com/ua-parser/uap-core) to generate the fields you see in the data tables and experiences. As a result, Application Insights users are unable to change these fields.
-
-Occasionally this data may be missing or inaccurate if the user or enterprise disables sending User Agent in Browser settings. Additionally, the [UA Parser regexes](https://github.com/ua-parser/uap-core/blob/master/regexes.yaml) may not include all device information or Application Insights may not have adopted the latest updates.
-
-### <a name="q17"></a> Have I enabled everything in Application Insights?
-
-| What you should see | How to get it | Why you want it |
-| | | |
-| Availability charts |[Web tests](app/monitor-web-app-availability.md) |Know your web app is up |
-| Server app perf: response times, ... |[Add Application Insights to your project](app/asp-net.md) or [Install AI Status Monitor on server](app/monitor-performance-live-website-now.md) (or write your own code to [track dependencies](app/api-custom-events-metrics.md#trackdependency)) |Detect perf issues |
-| Dependency telemetry |[Install AI Status Monitor on server](app/monitor-performance-live-website-now.md) |Diagnose issues with databases or other external components |
-| Get stack traces from exceptions |[Insert TrackException calls in your code](app/asp-net-exceptions.md) (but some are reported automatically) |Detect and diagnose exceptions |
-| Search log traces |[Add a logging adapter](app/asp-net-trace-logs.md) |Diagnose exceptions, perf issues |
-| Client usage basics: page views, sessions, ... |[JavaScript initializer in web pages](app/javascript.md) |Usage analytics |
-| Client custom metrics |[Tracking calls in web pages](app/api-custom-events-metrics.md) |Enhance user experience |
-| Server custom metrics |[Tracking calls in server](app/api-custom-events-metrics.md) |Business intelligence |
-
-### Why are the counts in Search and Metrics charts unequal?
-
-[Sampling](app/sampling.md) reduces the number of telemetry items (requests, custom events, and so on) that are actually sent from your app to the portal. In Search, you see the number of items actually received. In metric charts that display a count of events, you see the number of original events that occurred.
-
-Each item that is transmitted carries an `itemCount` property that shows how many original events that item represents. To observe sampling in operation, you can run this query in Analytics:
-
-```kusto
-requests | summarize original_events = sum(itemCount), transmitted_events = count()
-```
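The same arithmetic the query performs can be sketched outside Kusto. Each transmitted item's `itemCount` records how many original events it stands for after sampling (the sample items below are invented for illustration):

```javascript
// After sampling, each transmitted item represents `itemCount` original events.
const transmitted = [
  { name: "GET /", itemCount: 4 },
  { name: "GET /api", itemCount: 2 },
  { name: "GET /", itemCount: 1 },
];

const transmittedEvents = transmitted.length; // what Search shows
const originalEvents = transmitted.reduce((sum, t) => sum + t.itemCount, 0); // what metric charts show

console.log(transmittedEvents, originalEvents); // 3 7
```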
-
-### How do I move an Application Insights resource to a new region?
-
-Moving existing Application Insights resources from one region to another is **currently not supported**. Historical data that you have collected **cannot be migrated** to a new region. The only partial workaround is to:
-
-1. Create a brand new Application Insights resource ([classic](app/create-new-resource.md) or [workspace-based](./app/create-workspace-resource.md)) in the new region.
-2. Recreate all unique customizations specific to the original resource in the new resource.
-3. Modify your application to use the new region resource's [instrumentation key](app/create-new-resource.md#copy-the-instrumentation-key) or [connection string](app/sdk-connection-string.md).
-4. Test to confirm that everything is continuing to work as expected with your new Application Insights resource.
-5. At this point, you can either delete the original resource (which results in **all historical data being lost**) or retain the original resource for historical reporting purposes for the duration of its data retention settings.
-
-Unique customizations that commonly need to be manually recreated or updated for the resource in the new region include but are not limited to:
-
-- Recreate custom dashboards and workbooks.
-- Recreate or update the scope of any custom log/metric alerts.
-- Recreate availability alerts.
-- Recreate any custom Azure role-based access control (Azure RBAC) settings that are required for your users to access the new resource.
-- Replicate settings involving ingestion sampling, data retention, daily cap, and custom metrics enablement. These settings are controlled via the **Usage and estimated costs** pane.
-- Any integration that relies on API keys, such as [release annotations](./app/annotations.md), [live metrics secure control channel](app/live-stream.md#secure-the-control-channel), etc. You will need to generate new API keys and update the associated integration.
-- Continuous export in classic resources would need to be configured again.
-- Diagnostic settings in workspace-based resources would need to be configured again.
-
-> [!NOTE]
-> If the resource you are creating in a new region is replacing a classic resource we recommend exploring the benefits of [creating a new workspace-based resource](app/create-workspace-resource.md) or alternatively [migrating your existing resource to workspace-based](app/convert-classic-resource.md).
-
-### Automation
-
-#### Configuring Application Insights
-
-You can [write PowerShell scripts](app/powershell.md) using Azure Resource Manager to:
-
-* Create and update Application Insights resources.
-* Set the pricing plan.
-* Get the instrumentation key.
-* Add a metric alert.
-* Add an availability test.
-
-You can't set up a Metric Explorer report or set up continuous export.
-
-#### Querying the telemetry
-
-Use the [REST API](https://dev.applicationinsights.io/) to run [Analytics](./logs/log-query-overview.md) queries.
-
-### How can I set an alert on an event?
-
-Azure alerts are only on metrics. Create a custom metric that crosses a value threshold whenever your event occurs, then set an alert on the metric. You'll get a notification whenever the metric crosses the threshold in either direction; you won't get a notification until the first crossing, regardless of whether the initial value is high or low; and there is always a latency of a few minutes.
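The "crosses the threshold in either direction" behavior can be sketched as follows. This is an illustration of the described semantics, not how Azure alerts are actually implemented:

```javascript
// Detect threshold crossings in a metric series: a notification fires
// whenever the value moves from one side of the threshold to the other.
function crossings(values, threshold) {
  const events = [];
  for (let i = 1; i < values.length; i++) {
    const before = values[i - 1] > threshold;
    const after = values[i] > threshold;
    if (before !== after) events.push(i); // direction changed at index i
  }
  return events;
}

console.log(crossings([0, 0, 5, 5, 1], 3)); // [ 2, 4 ] - one upward, one downward crossing
```

Note that no event is reported before the first crossing, matching the behavior described above.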
-
-### Are there data transfer charges between an Azure web app and Application Insights?
-
-* If your Azure web app is hosted in a data center where there is an Application Insights collection endpoint, there is no charge.
-* If there is no collection endpoint in your host data center, then your app's telemetry will incur [Azure outgoing charges](https://azure.microsoft.com/pricing/details/bandwidth/).
-
-This doesn't depend on where your Application Insights resource is hosted. It just depends on the distribution of our endpoints.
-
-### Can I send telemetry to the Application Insights portal?
-
-We recommend that you use our SDKs and the [SDK API](app/api-custom-events-metrics.md). There are variants of the SDK for various [platforms](app/platforms.md). These SDKs handle buffering, compression, throttling, retries, and so on. However, the [ingestion schema](https://github.com/microsoft/ApplicationInsights-dotnet/tree/master/BASE/Schem) is public.
-
-### Can I monitor an intranet web server?
-
-Yes, but you will need to allow traffic to our services through either firewall exceptions or proxy redirects:
-
-- QuickPulse `https://rt.services.visualstudio.com:443`
-- ApplicationIdProvider `https://dc.services.visualstudio.com:443`
-- TelemetryChannel `https://dc.services.visualstudio.com:443`
-
-Review our full list of services and IP addresses [here](app/ip-addresses.md).
-
-#### Firewall exception
-
-Allow your web server to send telemetry to our endpoints.
-
-#### Gateway redirect
-
-Route traffic from your server to a gateway on your intranet by overwriting Endpoints in your configuration. If these "Endpoint" properties are not present in your config, these classes will use the default values shown below in the example ApplicationInsights.config.
-
-Your gateway should route traffic to our endpoint's base address. In your configuration, replace the default values with `http://<your.gateway.address>/<relative path>`.
-
-##### Example ApplicationInsights.config with default endpoints:
-
-```xml
-<ApplicationInsights>
- ...
- <TelemetryModules>
- <Add Type="Microsoft.ApplicationInsights.Extensibility.PerfCounterCollector.QuickPulse.QuickPulseTelemetryModule, Microsoft.AI.PerfCounterCollector">
- <QuickPulseServiceEndpoint>https://rt.services.visualstudio.com/QuickPulseService.svc</QuickPulseServiceEndpoint>
- </Add>
- </TelemetryModules>
- ...
- <TelemetryChannel>
- <EndpointAddress>https://dc.services.visualstudio.com/v2/track</EndpointAddress>
- </TelemetryChannel>
- ...
- <ApplicationIdProvider Type="Microsoft.ApplicationInsights.Extensibility.Implementation.ApplicationId.ApplicationInsightsApplicationIdProvider, Microsoft.ApplicationInsights">
- <ProfileQueryEndpoint>https://dc.services.visualstudio.com/api/profiles/{0}/appId</ProfileQueryEndpoint>
- </ApplicationIdProvider>
- ...
-</ApplicationInsights>
-```
-
-> [!NOTE]
-> ApplicationIdProvider is available starting in v2.6.0.
-
-#### Proxy passthrough
-
-Proxy passthrough can be achieved by configuring either a machine-level or an application-level proxy.
-For more information, see the .NET article on [DefaultProxy](/dotnet/framework/configure-apps/file-schema/network/defaultproxy-element-network-settings).
-
-Example Web.config:
-
-```xml
-<system.net>
- <defaultProxy>
- <proxy proxyaddress="http://xx.xx.xx.xx:yyyy" bypassonlocal="true"/>
- </defaultProxy>
-</system.net>
-```
-
-### Can I run Availability web tests on an intranet server?
-
-Our [web tests](app/monitor-web-app-availability.md) run on points of presence that are distributed around the globe. There are two solutions:
-
-* Firewall door - Allow requests to your server from [the long and changeable list of web test agents](app/ip-addresses.md).
-* Write your own code to send periodic requests to your server from inside your intranet. You could run Visual Studio web tests for this purpose. The tester could send the results to Application Insights using the TrackAvailability() API.
-
-### How long does it take for telemetry to be collected?
-
-Most Application Insights data has a latency of under 5 minutes. Some data can take longer; typically larger log files. For more information, see the [Application Insights SLA](https://azure.microsoft.com/support/legal/sla/application-insights/v1_2/).
-
-<!--Link references-->
-
-[data]: app/data-retention-privacy.md
-[platforms]: app/platforms.md
-[start]: app/app-insights-overview.md
-[windows]: app/app-insights-windows-get-started.md
-
-### HTTP 502 and 503 responses are not always captured by Application Insights
-
-"502 bad gateway" and "503 service unavailable" errors are not always captured by Application Insights. If only client-side JavaScript is being used for monitoring, this is expected behavior: the error response is returned before the page containing the HTML header with the monitoring JavaScript snippet is rendered.
-
-If the 502 or 503 response was sent from a server with server-side monitoring enabled the errors would be collected by the Application Insights SDK.
-
-However, there are still cases where a 502 or 503 error is not captured by Application Insights even when server-side monitoring is enabled on an application's web server. Many modern web servers do not allow a client to communicate directly, but instead employ solutions like reverse proxies to pass information back and forth between the client and the front-end web servers.
-
-In this scenario, a 502 or 503 response could be returned to a client due to an issue at the reverse proxy layer and this would not be captured out-of-box by Application Insights. To help detect issues at this layer you may need to forward logs from your reverse proxy to Log Analytics and create a custom rule to check for 502/503 responses. To learn more about common causes of 502 and 503 errors consult the Azure App Service [troubleshooting article for "502 bad gateway" and "503 service unavailable"](../app-service/troubleshoot-http-502-http-503.md).
-
-## OpenTelemetry
-
-### What is OpenTelemetry?
-
-A new open-source standard for observability. Learn more at [https://opentelemetry.io/](https://opentelemetry.io/).
-
-### Why is Microsoft / Azure Monitor investing in OpenTelemetry?
-
-We believe it better serves our customers for three reasons:
- 1. Enable support for more customer scenarios.
- 2. Instrument without fear of vendor lock-in.
- 3. Increase customer transparency and engagement.
-
-It also aligns with Microsoft's strategy to [embrace open source](https://opensource.microsoft.com/).
-
-### What additional value does OpenTelemetry give me?
-
-In addition to the reasons above, OpenTelemetry is more efficient at-scale and provides consistent design/configurations across languages.
-
-### How can I test out OpenTelemetry?
-
-Sign up to join our Azure Monitor Application Insights early adopter community at [https://aka.ms/AzMonOtel](https://aka.ms/AzMonOtel).
-
-### What does GA mean in the context of OpenTelemetry?
-
-The OpenTelemetry community defines Generally Available (GA) [here](https://medium.com/opentelemetry/ga-planning-f0f6d7b5302). However, OpenTelemetry "GA" does not mean feature parity with the existing Application Insights SDKs. Azure Monitor will continue to recommend our current Application Insights SDKs for customers requiring features such as [pre-aggregated metrics](app/pre-aggregated-metrics-log-metrics.md#pre-aggregated-metrics), [live metrics](app/live-stream.md), [adaptive sampling](app/sampling.md#adaptive-sampling), [profiler](app/profiler-overview.md), and [snapshot debugger](app/snapshot-debugger.md) until the OpenTelemetry SDKs reach feature maturity.
-
-### Can I use Preview builds in production environments?
-
-It's not recommended. See [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) for more information.
-
-### What's the difference between OpenTelemetry SDK and auto-instrumentation?
-
-The OpenTelemetry specification defines [SDK](https://github.com/open-telemetry/opentelemetry-specification/blob/master/specification/glossary.md#telemetry-sdk). In short, "SDK" is a language-specific package that collects telemetry data across the various components of your application and sends the data to Azure Monitor via an exporter.
-
-The concept of auto-instrumentation (sometimes referred to as bytecode injection, codeless, or agent-based) refers to the capability to instrument your application without changing your code. For example, check out the [OpenTelemetry Java Auto-instrumentation Readme](https://github.com/open-telemetry/opentelemetry-java-instrumentation/blob/master/README.md) for more information.
-
-### What's the OpenTelemetry Collector?
-
-The OpenTelemetry Collector is described in its [GitHub readme](https://github.com/open-telemetry/opentelemetry-collector#opentelemetry-collector). Currently Microsoft does not utilize the OpenTelemetry Collector and depends on direct exporters that send to Azure Monitor's Application Insights.
-
-### What's the difference between OpenCensus and OpenTelemetry?
-
-[OpenCensus](https://opencensus.io/) is the precursor to [OpenTelemetry](https://opentelemetry.io/). Microsoft helped bring together [OpenTracing](https://opentracing.io/) and OpenCensus to create OpenTelemetry, a single observability standard for the world. Azure Monitor's current [production-recommended Python SDK](app/opencensus-python.md) is based on OpenCensus, but eventually all Azure Monitor's SDKs will be based on OpenTelemetry.
-
-## Container insights
-
-### What does *Other Processes* represent under the Node view?
-
-**Other processes** are intended to help you clearly understand the root cause of high resource usage on your node. This enables you to distinguish usage between containerized and non-containerized processes.
-
-What are these **Other Processes**?
-
-These are non-containerized processes that run on your node.
-
-How do we calculate this?
-
-**Other Processes** = *Total usage from CAdvisor* - *Usage from containerized process*
-
-The **Other processes** includes:
-
-- Self-managed or managed Kubernetes non-containerized processes
-
-- Container run-time processes
-
-- Kubelet
-
-- System processes running on your node
-
-- Other non-Kubernetes workloads running on node hardware or VM
-
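The calculation above is a simple subtraction, sketched here with invented numbers for illustration:

```javascript
// Other processes = total usage from cAdvisor - usage from containerized processes.
function otherProcessesUsage(totalFromCadvisor, containerUsages) {
  const containerized = containerUsages.reduce((sum, u) => sum + u, 0);
  return totalFromCadvisor - containerized;
}

console.log(otherProcessesUsage(100, [35, 25, 10])); // 30
```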
-### I don't see Image and Name property values populated when I query the ContainerLog table.
-
-For agent version ciprod12042019 and later, by default these two properties are not populated for every log line to minimize cost incurred on log data collected. There are two options to query the table that include these properties with their values:
-
-#### Option 1
-
-Join other tables to include these property values in the results.
-
-Modify your queries to include the Image and ImageTag properties from the `ContainerInventory` table by joining on the ContainerID property. You can include the Name property (as it previously appeared in the `ContainerLog` table) from the `KubePodInventory` table's ContainerName field by joining on the ContainerID property. This is the recommended option.
-
-The following example is a sample detailed query that explains how to get these field values with joins.
-
-```kusto
-//lets say we are querying an hour worth of logs
-let startTime = ago(1h);
-let endTime = now();
-//below gets the latest Image & ImageTag for every containerID, during the time window
-let ContainerInv = ContainerInventory | where TimeGenerated >= startTime and TimeGenerated < endTime | summarize arg_max(TimeGenerated, *) by ContainerID, Image, ImageTag | project-away TimeGenerated | project ContainerID1=ContainerID, Image1=Image ,ImageTag1=ImageTag;
-//below gets the latest Name for every containerID, during the time window
-let KubePodInv = KubePodInventory | where ContainerID != "" | where TimeGenerated >= startTime | where TimeGenerated < endTime | summarize arg_max(TimeGenerated, *) by ContainerID2 = ContainerID, Name1=ContainerName | project ContainerID2 , Name1;
-//now join the above two to get a joined table that has Name, Image & ImageTag. Left outer is safer in case there are no KubePodInventory records or they are latent
-let ContainerData = ContainerInv | join kind=leftouter (KubePodInv) on $left.ContainerID1 == $right.ContainerID2;
-//now join the ContainerLog table with the joined table above, project-away redundant fields/columns, and rename columns that were rewritten
-//Left outer is safer so you don't lose logs even if we cannot find container metadata for log lines (due to latency, time skew between data types, etc.)
-ContainerLog
-| where TimeGenerated >= startTime and TimeGenerated < endTime
-| join kind= leftouter (
- ContainerData
-) on $left.ContainerID == $right.ContainerID2 | project-away ContainerID1, ContainerID2, Name, Image, ImageTag | project-rename Name = Name1, Image=Image1, ImageTag=ImageTag1
-```
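The left-outer join semantics that make the query safe can be shown with plain JavaScript. This is a conceptual sketch (the container IDs, images, and log lines below are invented), not a replacement for the Kusto query:

```javascript
// Enrich log lines with container metadata by ContainerID, keeping any
// line with no match - the left-outer behavior the Kusto query relies on.
function enrichLogs(logs, inventory) {
  const byId = new Map(inventory.map(c => [c.containerId, c]));
  return logs.map(log => {
    const meta = byId.get(log.containerId);
    return { ...log, image: meta ? meta.image : null, name: meta ? meta.name : null };
  });
}

const enriched = enrichLogs(
  [{ containerId: "a1", line: "started" }, { containerId: "zz", line: "orphan" }],
  [{ containerId: "a1", image: "nginx:1.21", name: "web" }]
);
// enriched[1] keeps its log line even though no metadata was found for it.
```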
-
-#### Option 2
-
-Re-enable collection for these properties for every container log line.
-
-If the first option is not convenient due to query changes involved, you can re-enable collecting these fields by enabling the setting `log_collection_settings.enrich_container_logs` in the agent config map as described in the [data collection configuration settings](containers/container-insights-agent-config.md).
-
-> [!NOTE]
-> The second option is not recommended with large clusters that have more than 50 nodes because it generates API server calls from every node in the cluster to perform this enrichment. This option also increases data size for every log line collected.
-
-### Can I view metrics collected in Grafana?
-
-Container insights supports viewing metrics stored in your Log Analytics workspace in Grafana dashboards. We have provided a template that you can download from Grafana's [dashboard repository](https://grafana.com/grafana/dashboards?dataSource=grafana-azure-monitor-datasource&category=docker) to get you started and reference to help you learn how to query additional data from your monitored clusters to visualize in custom Grafana dashboards.
-
-### Can I monitor my AKS-engine cluster with Container insights?
-
-Container insights supports monitoring container workloads deployed to AKS-engine (formerly known as ACS-engine) cluster(s) hosted on Azure. For further details and an overview of steps required to enable monitoring for this scenario, see [Using Container insights for AKS-engine](https://github.com/microsoft/OMS-docker/tree/aks-engine).
-
-### Why don't I see data in my Log Analytics workspace?
-
-If you are unable to see any data in the Log Analytics workspace at a certain time every day, you may have reached the default 500-MB limit or the daily cap specified to control the amount of data collected daily. When the limit is met for the day, data collection stops and resumes only on the next day. To review your data usage and update to a different pricing tier based on your anticipated usage patterns, see [Log data usage and cost](logs/manage-cost-storage.md).
-
-### What are the container states specified in the ContainerInventory table?
-
-The ContainerInventory table contains information about both stopped and running containers. The table is populated by a workflow inside the agent that queries Docker for all the containers (running and stopped) and forwards that data to the Log Analytics workspace.
-
-### How do I resolve *Missing Subscription registration* error?
-
-If you receive the error **Missing Subscription registration for Microsoft.OperationsManagement**, you can resolve it by registering the resource provider **Microsoft.OperationsManagement** in the subscription where the workspace is defined. The documentation for how to do this can be found [here](../azure-resource-manager/templates/error-register-resource-provider.md).
-
-### Is there support for Kubernetes RBAC enabled AKS clusters?
-
-The Container Monitoring solution doesn't support Kubernetes RBAC, but it is supported with Container insights. The solution details page may not show the right information in the blades that show data for these clusters.
-
-### How do I enable log collection for containers in the kube-system namespace through Helm?
-
-The log collection from containers in the kube-system namespace is disabled by default. Log collection can be enabled by setting an environment variable on the omsagent. For more information, see the [Container insights](https://aka.ms/azuremonitor-containers-helm-chart) GitHub page.
-
-### How do I update the omsagent to the latest released version?
-
-To learn how to upgrade the agent, see [Agent management](containers/container-insights-manage-agent.md).
-
-### Why are log lines larger than 16KB split into multiple records in Log Analytics?
-
-The agent uses the [Docker JSON file logging driver](https://docs.docker.com/config/containers/logging/json-file/) to capture the stdout and stderr of containers. This logging driver splits log lines [larger than 16KB](https://github.com/moby/moby/pull/22982) into multiple lines when copied from stdout or stderr to a file.
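If you post-process exported logs yourself, split entries can be stitched back together. The sketch below is a minimal illustration, assuming the json-file driver's format, in which only the final chunk of a split line ends with a newline:

```python
import json

# Each line written by the json-file driver is a JSON object such as:
#   {"log": "...", "stream": "stdout", "time": "..."}
# Lines longer than 16KB are split into consecutive entries; only the
# final chunk's "log" value ends with a newline.
def reassemble(raw_lines):
    """Stitch split log entries back into whole logical lines."""
    merged, buffer = [], ""
    for raw in raw_lines:
        entry = json.loads(raw)
        buffer += entry["log"]
        if buffer.endswith("\n"):
            merged.append(buffer.rstrip("\n"))
            buffer = ""
    if buffer:  # trailing partial chunk, if any
        merged.append(buffer)
    return merged

# Hypothetical exported entries: one long line split in two, then a short line.
chunks = [
    '{"log": "first half of a long line ", "stream": "stdout", "time": "t0"}',
    '{"log": "second half\\n", "stream": "stdout", "time": "t1"}',
    '{"log": "a short line\\n", "stream": "stdout", "time": "t2"}',
]
print(reassemble(chunks))
```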
-
-### How do I enable multi-line logging?
-
-Currently Container insights doesn't support multi-line logging, but there are workarounds available. You can configure all the services to write in JSON format and then Docker/Moby will write them as a single line.
-
-For example, you can wrap your log as a JSON object as shown in the example below for a sample node.js application:
-
-```
-console.log(JSON.stringify({
- "Hello": "This example has multiple lines:",
- "Docker/Moby": "will not break this into multiple lines",
- "and you will receive":"all of them in log analytics",
- "as one": "log entry"
- }));
-```
-
-This data will look like the following example in Azure Monitor for logs when you query for it:
-
-```
-LogEntry : ({"Hello": "This example has multiple lines:","Docker/Moby": "will not break this into multiple lines", "and you will receive":"all of them in log analytics", "as one": "log entry"}
-```
-
-For a detailed look at the issue, review the following [GitHub link](https://github.com/moby/moby/issues/22920).
-
-### How do I resolve Azure AD errors when I enable live logs?
-
-You may see the following error: **The reply url specified in the request does not match the reply urls configured for the application: '<application ID\>'**. To resolve it, see [How to view container data in real time with Container insights](containers/container-insights-livedata-setup.md#configure-ad-integrated-authentication).
-
-### Why can't I upgrade cluster after onboarding?
-
-If you delete the Log Analytics workspace that an AKS cluster was sending its data to after Container insights was enabled, attempts to upgrade the cluster will fail. To work around this, disable monitoring and then re-enable it, referencing a different valid workspace in your subscription. When you try to perform the cluster upgrade again, it should complete successfully.
-
-### Which ports and domains do I need to open/allow for the agent?
-
-See the [Network firewall requirements](containers/container-insights-onboard.md#network-firewall-requirements) for the proxy and firewall configuration information required for the containerized agent with Azure, Azure US Government, and Azure China 21Vianet clouds.
-
-## VM insights
-
-### Can I onboard to an existing workspace?
-
-If your virtual machines are already connected to a Log Analytics workspace, you may continue to use that workspace when onboarding to VM insights, provided it is in one of the [supported regions](vm/vminsights-configure-workspace.md#supported-regions).
-
-### Can I onboard to a new workspace?
-
-If your VMs are not currently connected to an existing Log Analytics workspace, you need to create a new workspace to store your data. Creating a new default workspace is done automatically if you configure a single Azure VM for VM insights through the Azure portal.
-
-If you choose to use the script-based method, these steps are covered in the [Enable VM insights using Azure PowerShell or Resource Manager template](./vm/vminsights-enable-powershell.md) article.
-
-### What do I do if my VM is already reporting to an existing workspace?
-
-If you are already collecting data from your virtual machines, you may have already configured it to report data to an existing Log Analytics workspace. As long as that workspace is in one of our supported regions, you can enable VM insights to that pre-existing workspace. If the workspace you are already using is not in one of our supported regions, you won't be able to onboard to VM insights at this time. We are actively working to support additional regions.
-
-### Why did my VM fail to onboard?
-
-When onboarding an Azure VM from the Azure portal, the following steps occur:
-
-* A default Log Analytics workspace is created, if that option was selected.
-* The Log Analytics agent is installed on Azure VMs using a VM extension, if it's determined to be required.
-* The VM insights Map Dependency agent is installed on Azure VMs using an extension, if it's determined to be required.
-
-During the onboarding process, we check the status of each of the above steps to return a notification status to you in the portal. Configuration of the workspace and the agent installation typically takes 5 to 10 minutes. Viewing monitoring data in the portal takes an additional 5 to 10 minutes.
-
-If you have initiated onboarding and see messages indicating the VM needs to be onboarded, allow for up to 30 minutes for the VM to complete the process.
-
-### I don't see some or any data in the performance charts for my VM
-
-Our performance charts have been updated to use data stored in the *InsightsMetrics* table. To see data in these charts you will need to upgrade to use the new VM Insights solution. Please refer to our [GA FAQ](vm/vminsights-ga-release-faq.md) for additional information.
-
-If you don't see performance data in the disk table or in some of the performance charts then your performance counters may not be configured in the workspace. To resolve, run the following [PowerShell script](./vm/vminsights-enable-powershell.md).
-
-### How is VM insights Map feature different from Service Map?
-
-The VM insights Map feature is based on Service Map, but has the following differences:
-
-* The Map view can be accessed from the VM blade and from VM insights under Azure Monitor.
-* The connections in the Map are now clickable and display a view of the connection metric data in the side panel for the selected connection.
-* There is a new API that is used to create the maps to better support more complex maps.
-* Monitored VMs are now included in the client group node, and the donut chart shows the proportion of monitored vs unmonitored virtual machines in the group. It can also be used to filter the list of machines when the group is expanded.
-* Monitored virtual machines are now included in the server port group nodes, and the donut chart shows the proportion of monitored vs unmonitored machines in the group. It can also be used to filter the list of machines when the group is expanded.
-* The map style has been updated to be more consistent with App Map from Application insights.
-* The side panels have been updated, and do not have the full set of integrations that were supported in Service Map - Update Management, Change Tracking, Security, and Service Desk.
-* The option for choosing groups and machines to map has been updated and now supports Subscriptions, Resource Groups, Azure virtual machine scale sets, and Cloud services.
-* You cannot create new Service Map machine groups in the VM insights Map feature.
-
-### Why do my performance charts show dotted lines?
-
-This can occur for a few reasons. Where there is a gap in data collection, the lines are depicted as dotted. If you have modified the data sampling frequency for the enabled performance counters (the default setting is to collect data every 60 seconds), you can see dotted lines in the chart if you choose a narrow time range and your sampling interval is larger than the bucket size used in the chart (for example, the sampling interval is every 10 minutes and each bucket on the chart is 5 minutes). Choosing a wider time range to view should cause the chart lines to appear as solid lines rather than dots in this case.
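A small sketch of why gaps appear, assuming a 10-minute sampling interval charted into 5-minute buckets (both values are illustrative):

```python
from collections import defaultdict

SAMPLE_INTERVAL = 10   # minutes between collected samples (assumed setting)
BUCKET_SIZE = 5        # minutes per chart bucket (assumed chart setting)

# One hour of samples collected every 10 minutes.
samples = {t: 42.0 for t in range(0, 60, SAMPLE_INTERVAL)}

# Assign each sample to a chart bucket.
buckets = defaultdict(list)
for t, value in samples.items():
    buckets[t // BUCKET_SIZE].append(value)

total_buckets = 60 // BUCKET_SIZE
empty = [b for b in range(total_buckets) if b not in buckets]
print(f"{len(empty)} of {total_buckets} buckets have no data")
```

Half of the buckets end up with no data points, and those gaps are what get rendered as dotted segments.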
-
-### Are groups supported with VM insights?
-
-Yes, once you install the Dependency agent we collect information from the VMs to display groups based upon subscription, resource group, virtual machine scale sets, and cloud services. If you have been using Service Map and have created machine groups, these are displayed as well. Computer groups will also appear in the groups filter if you have created them for the workspace you are viewing.
-
-### How do I see the details for what is driving the 95th percentile line in the aggregate performance charts?
-
-By default, the list is sorted to show you the VMs that have the highest value for the 95th percentile for the selected metric, except for the Available memory chart, which shows the machines with the lowest value of the 5th percentile. Clicking on the chart will open the **Top N List** view with the appropriate metric selected.
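The sorting behavior can be sketched in a few lines; the VM names and values below are hypothetical, and the nearest-rank percentile used here is one of several common definitions:

```python
# Hypothetical per-VM metric samples.
vm_samples = {
    "vm-a": [10, 20, 30, 40, 95],
    "vm-b": [50, 55, 60, 65, 70],
    "vm-c": [1, 2, 3, 4, 5],
}

def percentile(values, p):
    """Nearest-rank percentile (one common definition)."""
    ordered = sorted(values)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Sort VMs by their 95th-percentile value, highest first,
# mirroring the default Top N ordering described above.
top_n = sorted(vm_samples, key=lambda vm: percentile(vm_samples[vm], 95), reverse=True)
print(top_n)
```

For the Available memory chart the same idea applies with the 5th percentile and ascending order.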
-
-### How does the Map feature handle duplicate IPs across different vnets and subnets?
-
-If you are duplicating IP ranges either with VMs or Azure virtual machine scale sets across subnets and vnets, it can cause VM insights Map to display incorrect information. This is a known issue and we are investigating options to improve this experience.
-
-### Does Map feature support IPv6?
-
-Map feature currently only supports IPv4 and we are investigating support for IPv6. We also support IPv4 that is tunneled inside IPv6.
-
-### When I load a map for a Resource Group or other large group the map is difficult to view
-
-While we have made improvements to Map to handle large and complex configurations, we realize a map can have many nodes, connections, and nodes working as a cluster. We are committed to continuing to enhance support to increase scalability.
-
-### Why does the network chart on the Performance tab look different than the network chart on the Azure VM Overview page?
-
-The overview page for an Azure VM displays charts based on the host's measurement of activity in the guest VM. For the network chart on the Azure VM Overview, it only displays network traffic that will be billed. This does not include inter-virtual network traffic. The data and charts shown for VM insights is based on data from the guest VM and the network chart displays all TCP/IP traffic that is inbound and outbound to that VM, including inter-virtual network.
-
-### How is response time measured for data stored in VMConnection and displayed in the connection panel and workbooks?
-
-Response time is an approximation. Since we do not instrument the code of the application, we do not really know when a request begins and when the response arrives. Instead we observe data being sent on a connection and then data coming back on that connection. Our agent keeps track of these sends and receives and attempts to pair them: a sequence of sends, followed by a sequence of receives is interpreted as a request/response pair. The timing between these operations is the response time. It will include the network latency and the server processing time.
-
-This approximation works well for protocols that are request/response based: a single request goes out on the connection, and a single response arrives. This is the case for HTTP(S) (without pipelining), but the assumption is not satisfied by other protocols.
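A minimal sketch of this pairing heuristic (a simplified illustration, not the agent's actual implementation):

```python
# Events observed on one connection: (timestamp_ms, direction).
events = [
    (0, "send"), (2, "send"),    # request goes out
    (48, "recv"), (50, "recv"),  # response comes back
    (100, "send"),               # next request
    (130, "recv"),
]

def response_times(events):
    """Pair each run of sends with the following run of receives."""
    times, request_start, prev = [], None, None
    for ts, direction in events:
        if direction == "send" and prev != "send":
            request_start = ts                 # first send of a new request
        if direction == "recv" and prev == "send":
            times.append(ts - request_start)   # first receive closes the pair
        prev = direction
    return times

print(response_times(events))  # each value includes network latency + server time
```

With the sample events above, the heuristic reports two request/response pairs.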
-
-### Are there limitations if I am on the Log Analytics Free pricing plan?
-
-If you have configured Azure Monitor with a Log Analytics workspace using the *Free* pricing tier, the VM insights Map feature supports only five machines connected to the workspace. If you have five VMs connected to a free workspace, disconnect one of the VMs, and then later connect a new VM, the new VM is not monitored or reflected on the Map page.
-
-Under this condition, you will be prompted with the **Try Now** option when you open the VM and select **Insights** from the left-hand pane, even after it has been installed already on the VM. However, you are not prompted with options as would normally occur if this VM were not onboarded to VM insights.
-
-## SQL insights (preview)
-
-### What versions of SQL Server are supported?
-
-We support SQL Server 2012 and all newer versions. See [Supported versions](insights/sql-insights-overview.md#supported-versions) for more details.
-
-### What SQL resource types are supported?
-
-- Azure SQL Database
-- Azure SQL Managed Instance
-- SQL Server on Azure Virtual Machines (SQL Server running on virtual machines registered with the [SQL virtual machine](../azure-sql/virtual-machines/windows/sql-agent-extension-manually-register-single-vm.md) provider)
-- Azure VMs (SQL Server running on virtual machines not registered with the [SQL virtual machine](../azure-sql/virtual-machines/windows/sql-agent-extension-manually-register-single-vm.md) provider)
-
-See [Supported versions](insights/sql-insights-overview.md#supported-versions) for more details and for details about scenarios with no support or limited support.
-
-### What operating systems for the virtual machine running SQL Server are supported?
-
-We support all operating systems specified by the [Windows](../azure-sql/virtual-machines/windows/sql-server-on-azure-vm-iaas-what-is-overview.md#get-started-with-sql-server-vms) and [Linux](../azure-sql/virtual-machines/linux/sql-server-on-linux-vm-what-is-iaas-overview.md#create) documentation for SQL Server on Azure Virtual Machines.
-
-### What operating system for the monitoring virtual machine are supported?
-
-Ubuntu 18.04 is currently the only operating system supported for the monitoring virtual machine.
-
-### Where will the monitoring data be stored in Log Analytics?
-
-All of the monitoring data is stored in the **InsightsMetrics** table. The **Origin** column has the value `solutions.azm.ms/telegraf/SqlInsights`. The **Namespace** column has values that start with `sqlserver_`.
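As a sketch, a Log Analytics query based on the column values above would return SQL insights records like this:

```kusto
InsightsMetrics
| where Origin == "solutions.azm.ms/telegraf/SqlInsights"
| where Namespace startswith "sqlserver_"
| take 10
```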
-
-### How often is data collected?
-
-The frequency of data collection is customizable. See [Data collected by SQL insights](../insights/../azure-monitor/insights/sql-insights-overview.md#data-collected-by-sql-insights) for details on the default frequencies and see [Create SQL monitoring profile](../insights/../azure-monitor/insights/sql-insights-enable.md#create-sql-monitoring-profile) for instructions on customizing frequencies.
-
-## Next steps
-
-If your question isn't answered here, you can refer to the following forums for additional questions and answers.
-
-- [Log Analytics](/answers/topics/azure-monitor.html)
-- [Application Insights](/answers/topics/azure-monitor.html)
-
-For general feedback on Azure Monitor please visit the [feedback forum](https://feedback.azure.com/forums/34192--general-feedback).
azure-monitor Solutions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/insights/solutions.md
Monitoring solutions from Microsoft and partners are available from the [Azure M
Members of the community can submit management solutions to Azure Quickstart Templates. You can install these solutions directly or download their templates for later installation. 1. Follow the process described in [Log Analytics workspace and Automation account](#log-analytics-workspace-and-automation-account) to link a workspace and account.
-2. Go to [Azure Quickstart Templates](https://azure.microsoft.com/documentation/templates/).
+2. Go to [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/).
3. Search for a solution that you're interested in. 4. Select the solution from the results to view its details. 5. Click the **Deploy to Azure** button.
azure-monitor Sql Insights Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/insights/sql-insights-overview.md
The tables below have the following columns:
## Next steps - See [Enable SQL insights](sql-insights-enable.md) for instructions on enabling SQL insights-- See [Frequently asked questions](../faq.md#sql-insights-preview) for frequently asked questions about SQL insights
+- See [Frequently asked questions](/azure/azure-monitor/faq#sql-insights-preview) for frequently asked questions about SQL insights
azure-monitor Query Optimization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/query-optimization.md
Optimized queries will:
You should give particular attention to queries that are used for recurrent and bursty usage such as dashboards, alerts, Logic Apps and Power BI. The impact of an ineffective query in these cases is substantial.
+Here is a detailed video walkthrough on optimizing queries.
+
+> [!VIDEO https://www.microsoft.com/en-us/videoplayer/embed/RE4NUH0]
+ ## Query performance pane After you run a query in Log Analytics, click the down arrow above the query results to view the query performance pane that shows the results of several performance indicators for the query. These performance indicators are each described in the following section.
Query behaviors that can reduce parallelism include:
## Next steps -- [Reference documentation for the Kusto query language](/azure/kusto/query/).
+- [Reference documentation for the Kusto query language](/azure/kusto/query/).
azure-monitor Service Map https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/vm/service-map.md
This article describes the details of onboarding and using Service Map. The prer
* The [Dependency agent](vminsights-enable-overview.md#agents) installed on the Windows computer or Linux server. >[!NOTE]
->If you have already deployed Service Map, you can now also view your maps in VM insights, which includes additional features to monitor VM health and performance. To learn more, see [VM insights overview](../vm/vminsights-overview.md). To learn about the differences between the Service Map solution and VM insights Map feature, see the following [FAQ](../faq.md#vm-insights).
+>If you have already deployed Service Map, you can now also view your maps in VM insights, which includes additional features to monitor VM health and performance. To learn more, see [VM insights overview](../vm/vminsights-overview.md). To learn about the differences between the Service Map solution and VM insights Map feature, see the following [FAQ](/azure/azure-monitor/faq#vm-insights).
## Sign in to Azure
azure-monitor Vminsights Ga Release Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/vm/vminsights-ga-release-faq.md
- Title: VM insights (GA) frequently asked questions | Microsoft Docs
-description: VM insights is a solution in Azure that combines health and performance monitoring of the Azure VM operating system, as well as automatically discovering application components and dependencies with other resources and maps the communication between them. This article answers common questions about the GA release.
--- Previously updated : 01/31/2020---
-# VM insights Generally Available (GA) Frequently Asked Questions
-This General Availability FAQ covers changes that were made in Q4 2019 and Q1 2020 as we prepared for GA.
-
-## Updates for VM insights
-We released a new version of VM insights in January 2020 ahead of our GA announcement. Customers enabling VM insights will now receive the GA version, but existing customers using the version of VM insights from Q4 2019 and earlier will be prompted to upgrade. This FAQ offers guidance to perform an upgrade at scale if you have large deployments across multiple workspaces.
--
-With this upgrade, VM insights performance data is stored in the same *InsightsMetrics* table as [Container insights](../containers/container-insights-overview.md), which makes it easier for you to query the two data sets. Also, you are able to store more diverse data sets that we could not store in the table previously used.
-
-Our performance views are now using the data we store in the *InsightsMetrics* table. If you have not yet upgraded to use the latest VMInsights solution on your workspace, your charts will no longer display information. You can upgrade from our **Get Started** page as described below.
--
-## What is changing?
-We have released a new solution, named VMInsights, that includes more capabilities for data collection along with a new location for storing this data in your Log Analytics workspace.
-
-In the past, we enabled the ServiceMap solution on your workspace and set up performance counters in your Log Analytics workspace to send the data to the *Perf* table. This new solution sends the data to a table named *InsightsMetrics* that is also used by Container insights. This table schema allows us to store more metrics and service data sets that are not compatible with the *Perf* table format.
-
-We have updated our Performance charts to use the data we store in the *InsightsMetrics* table. You can upgrade to use the *InsightsMetrics* table from our **Get Started** page as described below.
--
-## How do I upgrade?
-When a Log Analytics workspace is upgraded to the latest version of Azure Monitor for VMs, it will upgrade the dependency agent on each of the VMs attached to that workspace. Each VM requiring upgrade will be identified in the **Get Started** tab in VM insights in the Azure portal. When you choose to upgrade a VM, it will upgrade the workspace for that VM along with any other VMs attached to that workspace. You can select a single VM or multiple VMs, resource groups, or subscriptions.
-
-Use the following command to upgrade a workspace using PowerShell:
-
-```PowerShell
-Set-AzOperationalInsightsIntelligencePack -ResourceGroupName <resource-group-name> -WorkspaceName <workspace-name> -IntelligencePackName "VMInsights" -Enabled $True
-```
-
-## What should I do about the Performance counters in my workspace if I install the VMInsights solution?
-
-The previous method of enabling VM insights used performance counters in your workspace. The current version stores this data in a table named `InsightsMetrics`. You may choose to disable these performance counters in your workspace if you no longer need to use them.
-
->[!NOTE]
->If you have Alert Rules that reference these counters in the `Perf` table, you need to update them to reference new data stored in the `InsightsMetrics` table. Refer to our documentation for example log queries that you can use that refer to this table.
->
-
-If you decide to keep the performance counters enabled, you will be billed for the data ingested and stored in the `Perf` table based on [Log Analytics pricing](https://azure.microsoft.com/pricing/details/monitor/).
-
-## How will this change affect my alert rules?
-
-If you have created [Log alerts](../alerts/alerts-unified-log.md) that query the `Perf` table targeting performance counters that were enabled in the workspace, you should update these rules to refer to the `InsightsMetrics` table instead. This guidance also applies to any log search rules using `ServiceMapComputer_CL` and `ServiceMapProcess_CL`, because those data sets are moving to `VMComputer` and `VMProcess` tables.
-
-We will update this FAQ and our documentation to include example log search alert rules for the data sets we collect.
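As an illustration of the kind of rewrite involved (the counter and metric names below are illustrative, not a complete mapping), a rule query against the `Perf` table and its `InsightsMetrics` counterpart might look like:

```kusto
// Before: rule query against the Perf table
Perf
| where ObjectName == "Processor" and CounterName == "% Processor Time"
| summarize AggregatedValue = avg(CounterValue) by bin(TimeGenerated, 5m), Computer

// After: equivalent query against InsightsMetrics
InsightsMetrics
| where Namespace == "Processor" and Name == "UtilizationPercentage"
| summarize AggregatedValue = avg(Val) by bin(TimeGenerated, 5m), Computer
```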
-
-## How will this change affect my bill?
-
-Billing is still based on data ingested and retained in your Log Analytics workspace.
-
-The machine level performance data that we collect is the same, is of a similar size to the data we stored in the `Perf` table, and will cost approximately the same amount.
-
-## What if I only want to use Service Map?
-
-That is fine. You will see prompts in the Azure portal when viewing VM insights about the upcoming update. Once released, you will receive a prompt requesting that you update to the new version. If you prefer to only use the [Maps](vminsights-maps.md) feature, you can choose not to upgrade and continue to use the Maps feature in VM insights and the Service Map solution accessed from your workspace or dashboard tile.
-
-If you chose to manually enable the performance counters in your workspace, then you may be able to see data in some of our performance charts viewed from Azure Monitor. Once the new solution is released we will update our performance charts to query the data stored in the `InsightsMetrics` table. If you would like to see data from that table in these charts, you will need to upgrade to the new version of VM insights.
-
-The changes to move data from `ServiceMapComputer_CL` and `ServiceMapProcess_CL` will affect both Service Map and VM insights, so you still need to plan for this update.
-
-If you choose not to upgrade to the **VMInsights** solution, we will continue to provide legacy versions of our performance workbooks that refer to data in the `Perf` table.
-
-## Will the Service Map data sets also be stored in InsightsMetrics?
-
-The data sets will not be duplicated if you use both solutions. Both offerings share the data sets that will be stored in `VMComputer` (formerly ServiceMapComputer_CL), `VMProcess` (formerly ServiceMapProcess_CL), `VMConnection`, and `VMBoundPort` tables to store the map data sets that we collect.
-
-The `InsightsMetrics` table will store VM, process, and service data sets that we collect and will only be populated if you are using VM insights and the VM Insights solution. The Service Map solution will not collect or store data in the `InsightsMetrics` table.
-
-## Will I be double charged if I have the Service Map and VMInsights solutions in my workspace?
-
-No, the two solutions share the map data sets that we store in `VMComputer` (formerly ServiceMapComputer_CL), `VMProcess` (formerly ServiceMapProcess_CL), `VMConnection`, and `VMBoundPort`. You will not be double charged for this data if you have both solutions in your workspace.
-
-## If I remove either the Service Map or VMInsights solution, will it remove my data?
-
-No, the two solutions share the map data sets that we store in `VMComputer` (formerly ServiceMapComputer_CL), `VMProcess` (formerly ServiceMapProcess_CL), `VMConnection`, and `VMBoundPort`. If you remove one of the solutions, the remaining solution continues to use these data sets, and the data remains in the Log Analytics workspace. You need to remove both solutions from your workspace for the data to be removed from it.
-
-## Health feature is in limited public preview
-
-We have received a lot of great feedback from customers about our VM Health feature set. There is significant interest in this feature and excitement over its potential for supporting monitoring workflows. We are planning to make a series of changes to add functionality and address the feedback we have received.
-
-To minimize impact of these changes to new customers, we have moved this feature into a **limited public preview**. This update happened in October 2019.
-
-We plan to re-launch this Health feature in 2020, after VM insights is in GA.
-
-## How do existing customers access the Health feature?
-
-Existing customers that are using the Health feature will continue to have access to it, but it will not be offered to new customers.
-
-To access the feature, you can add the following feature flag `feature.vmhealth=true` to the Azure portal URL [https://portal.azure.com](https://portal.azure.com). Example `https://portal.azure.com/?feature.vmhealth=true`.
-
-You can also use this short url, which sets the feature flag automatically: [https://aka.ms/vmhealthpreview](https://aka.ms/vmhealthpreview).
-
-As an existing customer, you can continue to use the Health feature on VMs that are connected to an existing workspace setup with the health functionality.
-
-## I use VM Health now with one environment and would like to deploy it to a new one
-
-If you are an existing customer that is using the Health feature and want to use it for a new roll-out, contact us at vminsights@microsoft.com to request instructions.
-
-## Next steps
-
-To understand the requirements and methods that help you monitor your virtual machines, review [Deploy VM insights](./vminsights-enable-overview.md).
azure-monitor Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/whats-new.md
Welcome to what's new in the Azure Monitor docs from May, 2021. This article lis
**Updated articles** -- [Azure Monitor Frequently Asked Questions](faq.md)
+- [Azure Monitor Frequently Asked Questions](faq.yml)
- [Azure Monitor partner integrations](partners.md) ## Alerts
azure-netapp-files Configure Kerberos Encryption https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/configure-kerberos-encryption.md
na ms.devlang: na Previously updated : 05/06/2021 Last updated : 06/17/2021 # Configure NFSv4.1 Kerberos encryption for Azure NetApp Files
The following requirements apply to NFSv4.1 client encryption:
* Active Directory Domain Services (AD DS) or Azure Active Directory Domain Services (AADDS) connection to facilitate Kerberos ticketing * DNS A/PTR record creation for both the client and Azure NetApp Files NFS server IP addresses
-* A Linux client
- This article provides guidance for RHEL and Ubuntu clients. Other clients will work with similar configuration steps.
-* NTP server access
- You can use one of the commonly used Active Directory Domain Controller (AD DC) domain controllers.
+* A Linux client: This article provides guidance for RHEL and Ubuntu clients. Other clients will work with similar configuration steps.
+* NTP server access: You can use one of the commonly used Active Directory Domain Controller (AD DC) domain controllers.
+* Ensure that User Principal Names for user accounts do *not* end with a `$` symbol (for example, user$@REALM.COM).
+ At this time, Azure NetApp Files Kerberos does not support [Group managed service accounts](/windows-server/security/group-managed-service-accounts/getting-started-with-group-managed-service-accounts) (gMSA).
+ ## Create an NFS Kerberos Volume
azure-netapp-files Configure Nfs Clients https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/configure-nfs-clients.md
na ms.devlang: na Previously updated : 05/17/2021 Last updated : 06/17/2021 # Configure an NFS client for Azure NetApp Files
The following steps are optional. You need to perform the steps only if you wan
`base dc=contoso,dc=com uri ldap://10.20.0.4:389/ ldap_version 3 rootbinddn cn=admin,cn=Users,dc=contoso,dc=com pam_password ad`
-2. Run the following command to restart and enable the service:
+2. Ensure that your `/etc/nsswitch.conf` file has the following `ldap` entries:
+ `passwd: compat systemd ldap`
+ `group: compat systemd ldap`
+
+3. Run the following command to restart and enable the service:
`sudo systemctl restart nscd && sudo systemctl enable nscd`
NFSv4.x requires each client to identify itself to servers with a *unique* strin
* [Create an NFS volume for Azure NetApp Files](azure-netapp-files-create-volumes.md) * [Create a dual-protocol volume for Azure NetApp Files](create-volumes-dual-protocol.md)
-* [Mount or unmount a volume for Windows or Linux virtual machines](azure-netapp-files-mount-unmount-volumes-for-virtual-machines.md)
+* [Mount or unmount a volume for Windows or Linux virtual machines](azure-netapp-files-mount-unmount-volumes-for-virtual-machines.md)
azure-resource-manager Resource Declaration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/resource-declaration.md
resource stg 'Microsoft.Storage/storageAccounts@2019-06-01' existing = {
output blobEndpoint string = stg.properties.primaryEndpoints.blob ```
-The preceding example don't deploy the storage account, but the declaration provides access to properties on the existing resource. Using the 'stg' symbolic name, you can access properties on the storage account.
+The preceding example doesn't deploy the storage account, but the declaration provides access to properties on the existing resource. Using the 'stg' symbolic name, you can access properties on the storage account.
-The following examples shows how to specify the `scope` property:
+The following example shows how to specify the `scope` property:
+```bicep
resource stg 'Microsoft.Storage/storageAccounts@2019-06-01' existing = {
  name: 'exampleStorage'
  scope: resourceGroup(mySub, myRg)
}
+```
## Next steps -- To conditionally deploy a resource, see [Conditional deployment in Bicep](./conditional-resource-deployment.md).
+- To conditionally deploy a resource, see [Conditional deployment in Bicep](./conditional-resource-deployment.md).
azure-resource-manager Deployment Models https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/deployment-models.md
The following table describes changes in how Compute, Network, and Storage resou
| Cloud Service for Virtual Machines |Cloud Service was a container for holding the virtual machines that required Availability from the platform and Load Balancing. |Cloud Service is no longer an object required for creating a Virtual Machine using the new model. | | Virtual Networks |A virtual network is optional for the virtual machine. If included, the virtual network can't be deployed with Resource Manager. |Virtual machine requires a virtual network that has been deployed with Resource Manager. | | Storage Accounts |The virtual machine requires a storage account that stores the virtual hard disks for the operating system, temporary, and additional data disks. |The virtual machine requires a storage account to store its disks in blob storage. |
-| Availability Sets |Availability to the platform was indicated by configuring the same ΓÇ£AvailabilitySetNameΓÇ¥ on the Virtual Machines. The maximum count of fault domains was 2. |Availability Set is a resource exposed by Microsoft.Compute Provider. Virtual Machines that require high availability must be included in the Availability Set. The maximum count of fault domains is now 3. |
-| Affinity Groups |Affinity Groups were required for creating Virtual Networks. However, with the introduction of Regional Virtual Networks, that wasn't required anymore. |To simplify, the Affinity Groups concept doesnΓÇÖt exist in the APIs exposed through Azure Resource Manager. |
+| Availability Sets |Availability to the platform was indicated by configuring the same "AvailabilitySetName" on the Virtual Machines. The maximum count of fault domains was 2. |Availability Set is a resource exposed by Microsoft.Compute Provider. Virtual Machines that require high availability must be included in the Availability Set. The maximum count of fault domains is now 3. |
+| Affinity Groups |Affinity Groups were required for creating Virtual Networks. However, with the introduction of Regional Virtual Networks, that wasn't required anymore. |To simplify, the Affinity Groups concept doesn't exist in the APIs exposed through Azure Resource Manager. |
| Load Balancing |Creation of a Cloud Service provides an implicit load balancer for the Virtual Machines deployed. |The Load Balancer is a resource exposed by the Microsoft.Network provider. The primary network interface of the Virtual Machines that needs to be load balanced should be referencing the load balancer. Load Balancers can be internal or external. A load balancer instance references the backend pool of IP addresses that include the NIC of a virtual machine (optional) and references a load balancer public or private IP address (optional). | | Virtual IP Address |Cloud Services gets a default VIP (Virtual IP Address) when a VM is added to a cloud service. The Virtual IP Address is the address associated with the implicit load balancer. |Public IP address is a resource exposed by the Microsoft.Network provider. Public IP address can be static (reserved) or dynamic. Dynamic public IPs can be assigned to a Load Balancer. Public IPs can be secured using Security Groups. | | Reserved IP Address |You can reserve an IP Address in Azure and associate it with a Cloud Service to ensure that the IP Address is sticky. |Public IP Address can be created in static mode and it offers the same capability as a reserved IP address. |
All the automation and scripts that you've built continue to work for the existi
**Where can I find examples of Azure Resource Manager templates?**
-A comprehensive set of starter templates can be found on [Azure Resource Manager Quickstart Templates](https://azure.microsoft.com/documentation/templates/).
+A comprehensive set of starter templates can be found on [Azure Resource Manager Quickstart Templates](https://azure.microsoft.com/resources/templates/).
## Next steps
azure-resource-manager Move Support Resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/move-support-resources.md
Jump to a resource provider namespace:
> [!div class="mx-tableFixed"] > | Resource type | Resource group | Subscription | Region move | > | - | -- | - | -- |
-> | accounts | Yes | Yes | No. [Learn more](../../azure-monitor/faq.md#how-do-i-move-an-application-insights-resource-to-a-new-region). |
+> | accounts | Yes | Yes | No. [Learn more](../../azure-monitor/faq.yml#how-do-i-move-an-application-insights-resource-to-a-new-region-). |
> | actiongroups | Yes | Yes | No | > | activitylogalerts | No | No | No | > | alertrules | Yes | Yes | No |
azure-resource-manager Deployment Modes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/deployment-modes.md
To set the deployment mode when deploying with Azure CLI, use the `mode` paramet
```azurecli-interactive
az deployment group create \
- --name ExampleDeployment \
--mode Complete \
- --resource-group ExampleGroup \
- --template-file storage.json \
- --parameters storageAccountType=Standard_GRS
+ --name ExampleDeployment \
+ --resource-group ExampleResourceGroup \
+ --template-file storage.json
```

The following example shows a linked template set to incremental deployment mode:
The following example shows a linked template set to incremental deployment mode
* To learn about creating Resource Manager templates, see [Understand the structure and syntax of ARM templates](./syntax.md). * To learn about deploying resources, see [Deploy resources with ARM templates and Azure PowerShell](deploy-powershell.md).
-* To view the operations for a resource provider, see [Azure REST API](/rest/api/).
+* To view the operations for a resource provider, see [Azure REST API](/rest/api/).
azure-resource-manager Syntax https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/syntax.md
You can break a string into multiple lines. For example, see the `location` prop
## Next steps
-* To view complete templates for many different types of solutions, see the [Azure Quickstart Templates](https://azure.microsoft.com/documentation/templates/).
+* To view complete templates for many different types of solutions, see the [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/).
* For details about the functions you can use from within a template, see [ARM template functions](template-functions.md). * To combine several templates during deployment, see [Using linked and nested templates when deploying Azure resources](linked-templates.md). * For recommendations about creating templates, see [ARM template best practices](./best-practices.md).
azure-resource-manager Template Spec Convert https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/template-spec-convert.md
To simplify converting templates in the template gallery, use a PowerShell scrip
To learn how to deploy the template that creates the template spec, see [Quickstart: Create and deploy template spec](quickstart-create-template-specs.md).
-For more information about the script and its parameters, see [Create TemplateSpecs from Template Gallery Templates](https://github.com/Azure/azure-quickstart-templates/tree/master/201-templatespec-migrate-create).
+For more information about the script and its parameters, see [Create TemplateSpecs from Template Gallery Templates](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.resources/templatespec-migrate-create).
## Manually convert through portal
azure-sql Arm Templates Content Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/arm-templates-content-guide.md
The following table includes links to Azure Resource Manager templates for Azure
||| | [SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sql/sql-database-transparent-encryption-create) | This Azure Resource Manager template creates a single database in Azure SQL Database and configures server-level IP firewall rules. | | [Server](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sql/sql-logical-server) | This Azure Resource Manager template creates a server for Azure SQL Database. |
-| [Elastic pool](https://github.com/Azure/azure-quickstart-templates/tree/master/101-sql-elastic-pool-create) | This template allows you to deploy an elastic pool and to assign databases to it. |
+| [Elastic pool](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sql/sql-elastic-pool-create) | This template allows you to deploy an elastic pool and to assign databases to it. |
| [Failover groups](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sql/sql-with-failover-group) | This template creates two servers, a single database, and a failover group in Azure SQL Database.| | [Threat Detection](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sql/sql-threat-detection-db-policy-multiple-databases) | This template allows you to deploy a server and a set of databases with Threat Detection enabled, with an email address for alerts for each database. Threat Detection is part of the SQL Advanced Threat Protection (ATP) offering and provides a layer of security that responds to potential threats over servers and databases.| | [Auditing to Azure Blob storage](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sql/sql-auditing-server-policy-to-blob-storage) | This template allows you to deploy a server with auditing enabled to write audit logs to a Blob storage. Auditing for Azure SQL Database tracks database events and writes them to an audit log that can be placed in your Azure storage account, OMS workspace, or Event Hubs.| | [Auditing to Azure Event Hub](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sql/sql-auditing-server-policy-to-eventhub) | This template allows you to deploy a server with auditing enabled to write audit logs to an existing event hub. In order to send audit events to Event Hubs, set auditing settings with `Enabled` `State`, and set `IsAzureMonitorTargetEnabled` as `true`. Also, configure Diagnostic Settings with the `SQLSecurityAuditEvents` log category on the `master` database (for server-level auditing). 
Auditing tracks database events and writes them to an audit log that can be placed in your Azure storage account, OMS workspace, or Event Hubs.| | [Azure Web App with SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-sql-database) | This sample creates a free Azure web app and a database in Azure SQL Database at the "Basic" service level.|
-| [Azure Web App and Redis Cache with SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/201-web-app-redis-cache-sql-database) | This template creates a web app, Redis Cache, and database in the same resource group and creates two connection strings in the web app for the database and Redis Cache.|
-| [Import data from Blob storage using ADF V2](https://github.com/Azure/azure-quickstart-templates/tree/master/101-data-factory-v2-blob-to-sql-copy) | This Azure Resource Manager template creates an instance of Azure Data Factory V2 that copies data from Azure Blob storage to SQL Database.|
+| [Azure Web App and Redis Cache with SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.web/web-app-redis-cache-sql-database) | This template creates a web app, Redis Cache, and database in the same resource group and creates two connection strings in the web app for the database and Redis Cache.|
+| [Import data from Blob storage using ADF V2](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.datafactory/data-factory-v2-blob-to-sql-copy) | This Azure Resource Manager template creates an instance of Azure Data Factory V2 that copies data from Azure Blob storage to SQL Database.|
| [HDInsight cluster with a database](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.hdinsight/hdinsight-linux-with-sql-database) | This template allows you to create an HDInsight cluster, a logical SQL server, a database, and two tables. This template is used by the [Use Sqoop with Hadoop in HDInsight article](../../hdinsight/hadoop/hdinsight-use-sqoop.md). | | [Azure Logic App that runs a SQL Stored Procedure on a schedule](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.logic/logic-app-sql-proc) | This template allows you to create a logic app that will run a SQL stored procedure on schedule. Any arguments for the procedure can be put into the body section of the template.|
azure-sql Auditing Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/auditing-overview.md
Previously updated : 06/14/2021 Last updated : 06/18/2021 # Auditing for Azure SQL Database and Azure Synapse Analytics
Auditing of Microsoft Support operations for Azure SQL Server allows you to audi
To enable auditing of Microsoft Support operations, navigate to **Auditing** under the Security heading in your **Azure SQL server** pane, and switch **Auditing of Microsoft support operations** to **ON**.
- > [!IMPORTANT]
- > Auditing of Microsoft support operations does not support storage account destination. To enable the capability, a Log Analytics workspace or an Event Hub destination has to be configured.
- ![Screenshot of Microsoft Support Operations](./media/auditing-overview/support-operations.png) To review the audit logs of Microsoft Support operations in your Log Analytics workspace, use the following query:
azure-sql Connectivity Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/connectivity-architecture.md
If you are connecting from outside Azure, your connections have a connection pol
The table below lists the individual Gateway IP addresses and also Gateway IP address ranges per region.
-Periodically, we will retire Gateways using old hardware and migrate the traffic to new Gateways as per the process outlined at [Azure SQL Database traffic migration to newer Gateways](gateway-migration.md). We strongly encourage customers to use the **Gateway IP address ranges** in order to not be impacted by this activity in a region.
+Periodically, we will retire Gateways using old hardware and migrate the traffic to new Gateways as per the process outlined at [Azure SQL Database traffic migration to newer Gateways](gateway-migration.md). We strongly encourage customers to use the **Gateway IP address subnets** in order to not be impacted by this activity in a region.
> [!IMPORTANT]
-> Logins for SQL Database or Azure Synapse can land on **any of the Gateways in a region**. For consistent connectivity to SQL Database or Azure Synapse, allow network traffic to and from **ALL** Gateway IP addresses or Gateway IP address ranges for the region.
+> Logins for SQL Database or Azure Synapse can land on **any of the Gateways in a region**. For consistent connectivity to SQL Database or Azure Synapse, allow network traffic to and from **ALL** Gateway IP addresses and Gateway IP address subnets for the region.
-| Region name | Gateway IP addresses | Gateway IP address ranges |
+| Region name | Gateway IP addresses | Gateway IP address subnets |
| | | | | Australia Central | 20.36.105.0, 20.36.104.6, 20.36.104.7 | 20.36.105.32/29 | | Australia Central 2 | 20.36.113.0, 20.36.112.6 | 20.36.113.32/29 |
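Because logins can land on any gateway in a region, firewall rules should cover the whole gateway subnet rather than individual addresses. As an illustrative sketch (the `ip_to_int` and `in_subnet` helpers are hypothetical; the subnet comes from the Australia Central row above), the following checks whether an address falls inside a gateway subnet:

```shell
# Convert a dotted-quad IPv4 address to an integer.
ip_to_int() { local IFS=.; set -- $1; echo $(( ($1 << 24) + ($2 << 16) + ($3 << 8) + $4 )); }

# in_subnet <ip> <cidr> - succeed if <ip> is inside <cidr>.
in_subnet() {
  local ip net bits mask
  ip=$(ip_to_int "$1")
  net=$(ip_to_int "${2%/*}")
  bits=${2#*/}
  mask=$(( (0xFFFFFFFF << (32 - bits)) & 0xFFFFFFFF ))
  [ $(( ip & mask )) -eq $(( net & mask )) ]
}

# The Australia Central gateway subnet 20.36.105.32/29 covers 20.36.105.32-39.
in_subnet 20.36.105.33 20.36.105.32/29 && echo "in range"
in_subnet 20.36.104.6  20.36.105.32/29 || echo "out of range"
```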
azure-sql Database Copy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/database-copy.md
Start copying the source database with the [CREATE DATABASE ... AS COPY OF](/sql
> [!NOTE] > Terminating the T-SQL statement does not terminate the database copy operation. To terminate the operation, drop the target database.
->
+> [!NOTE]
> Database copy is not supported when the source and/or destination servers have a private endpoint configured and public network access is disabled. If private endpoint is configured but public network access is allowed, initiating database copy when connected to the destination server from a public IP address will succeed. To determine the source IP address of current connection, execute `SELECT client_net_address FROM sys.dm_exec_connections WHERE session_id = @@SPID;`
azure-sql Ledger Audit https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-audit.md
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in **public preview**.
+> Azure SQL Database ledger is currently in public preview and available in West Central US.
-When performing forensics activities with ledger-enabled tables, in addition to the data captured in the ledger view and database ledger, additional action IDs are added to the SQL audit logs. The following table outlines these new audit logging events along with the conditions that trigger the events.
+When you perform forensics activities with ledger-enabled tables, data is captured in the ledger view and database ledger. Other action IDs are added to the SQL audit logs, too. The following tables outline these new audit logging events. The conditions that trigger the events follow each table.
## Enable ledger
When performing forensics activities with ledger-enabled tables, in addition to
| **configuration_group_name** | LEDGER_OPERATION_GROUP | | **action_in_log** | 1 |
-**Condition triggering the event**: Creating a new ledger table or converting a regular table to a ledger table.
+**Conditions that trigger the event**: When you create a new ledger table or convert a regular table to a ledger table.
## Alter ledger
When performing forensics activities with ledger-enabled tables, in addition to
| **configuration_group_name** | LEDGER_OPERATION_GROUP | | **action_in_log** | 1 |
-**Condition triggering the event**: Dropping or renaming a ledger table, converting a ledger table to a normal table, adding, dropping or renaming a column in a ledger table.
+**Conditions that trigger the event**: When you drop or rename a ledger table, convert a ledger table to a normal table, or add, drop, or rename a column in a ledger table.
## Generate ledger digest
When performing forensics activities with ledger-enabled tables, in addition to
| **configuration_group_name** | LEDGER_OPERATION_GROUP | | **action_in_log** | 1 |
-**Condition triggering the event**: Generating a ledger digest.
+**Condition that triggers the event**: When you generate a ledger digest.
## Verify ledger
When performing forensics activities with ledger-enabled tables, in addition to
| **configuration_group_name** | LEDGER_OPERATION_GROUP | | **action_in_log** | 1 |
-**Condition triggering the event**: Verifying a ledger digest.
+**Condition that triggers the event**: When you verify a ledger digest.
-## Ledger operation Group
+## Ledger operation group
| Column | Value | |--|--|
When performing forensics activities with ledger-enabled tables, in addition to
| **configuration_group_name** | LEDGER_OPERATION_GROUP | | **action_in_log** | 0 |
-**Condition triggering the event**: N/A
+**Condition that triggers the event**: N/A
| Column | Value | |--|--|
When performing forensics activities with ledger-enabled tables, in addition to
| **configuration_group_name** | LEDGER_OPERATION_GROUP | | **action_in_log** | 0 |
-**Condition triggering the event**: N/A
+**Condition that triggers the event**: N/A
## Next steps - [Auditing for Azure SQL Database and Azure Synapse Analytics](auditing-overview.md)-- [Azure SQL Database ledger Overview](ledger-overview.md)-- [Quickstart: Create an Azure SQL Database with ledger enabled](ledger-create-a-single-database-with-ledger-enabled.md)
+- [Azure SQL Database ledger overview](ledger-overview.md)
+- [Quickstart: Create a database in Azure SQL Database with ledger enabled](ledger-create-a-single-database-with-ledger-enabled.md)
azure-sql Ledger Create A Single Database With Ledger Enabled https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-create-a-single-database-with-ledger-enabled.md
Title: Create a single database with ledger enabled
-description: Create a single database in Azure SQL Database with ledger enabled using the Azure portal.
+description: Create a single database in Azure SQL Database with ledger enabled by using the Azure portal.
ms.devlang:
Last updated 05/25/2021
-# Quickstart: Create an Azure SQL Database with ledger enabled
+# Quickstart: Create a database in Azure SQL Database with ledger enabled
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in **public preview**.
+> Azure SQL Database ledger is currently in public preview and available in West Central US.
-In this quickstart, you create a [ledger database](ledger-overview.md#ledger-database) in Azure SQL Database and configure [automatic digest storage with Azure Blob storage](ledger-digest-management-and-database-verification.md#automatic-generation-and-storage-of-database-digests) using the Azure portal. For more information about ledger, see [Azure SQL Database ledger](ledger-overview.md).
+In this quickstart, you create a [ledger database](ledger-overview.md#ledger-database) in Azure SQL Database and configure [automatic digest storage with Azure Blob Storage](ledger-digest-management-and-database-verification.md#automatic-generation-and-storage-of-database-digests) by using the Azure portal. For more information about ledger, see [Azure SQL Database ledger](ledger-overview.md).
## Prerequisite -- An active Azure subscription. If you don't have one, [create a free account](https://azure.microsoft.com/free/).
+You need an active Azure subscription. If you don't have one, [create a free account](https://azure.microsoft.com/free/).
## Create a ledger database and configure digest storage
-Create a single ledger database in the [serverless compute tier](serverless-tier-overview.md) and configure uploading ledger digests to an Azure Storage account.
+Create a single ledger database in the [serverless compute tier](serverless-tier-overview.md), and configure uploading ledger digests to an Azure Storage account.
-### Using the Azure portal
+### Use the Azure portal
To create a single database in the Azure portal, this quickstart starts at the Azure SQL page.
To create a single database in the Azure portal, this quickstart starts at the A
1. Under **SQL databases**, leave **Resource type** set to **Single database**, and select **Create**.
- ![Add to Azure SQL](./media/single-database-create-quickstart/select-deployment.png)
+ ![Screenshot that shows adding to Azure SQL.](./media/single-database-create-quickstart/select-deployment.png)
-1. On the **Basics** tab of the **Create SQL Database** form, under **Project details**, select the desired Azure **Subscription**.
+1. On the **Basics** tab of the **Create SQL Database** form, under **Project details**, select the Azure subscription you want to use.
-1. For **Resource group**, select **Create new**, enter *myResourceGroup*, and select **OK**.
+1. For **Resource group**, select **Create new**, enter **myResourceGroup**, and select **OK**.
-1. For **Database name**, enter *demo*.
+1. For **Database name**, enter **demo**.
-1. For **Server**, select **Create new**, and fill out the **New server** form with the following values:
- - **Server name**: Enter *mysqlserver*, and add some characters for uniqueness. We can't provide an exact server name to use because server names must be globally unique for all servers in Azure, not just unique within a subscription. So enter something like mysqlserver12345, and the portal lets you know if it's available or not.
- - **Server admin login**: Enter *azureuser*.
- - **Password**: Enter a password that meets requirements, and enter it again in the **Confirm password** field.
+1. For **Server**, select **Create new**. Fill out the **New server** form with the following values:
+ - **Server name**: Enter **mysqlserver**, and add some characters for uniqueness. We can't provide an exact server name to use because server names must be globally unique for all servers in Azure, not just unique within a subscription. Enter something like **mysqlserver12345**, and the portal lets you know if it's available or not.
+ - **Server admin login**: Enter **azureuser**.
+ - **Password**: Enter a password that meets requirements. Enter it again in the **Confirm password** box.
- **Location**: Select a location from the dropdown list.
- - Select **Allow Azure services to access this server** option to enable access to digest storage.
+ - **Allow Azure services to access this server**: Select this option to enable access to digest storage.
Select **OK**.
To create a single database in the Azure portal, this quickstart starts at the A
1. This quickstart uses a serverless database, so select **Serverless**, and then select **Apply**.
- ![configure serverless database](./media/single-database-create-quickstart/configure-database.png)
+ ![Screenshot that shows configuring a serverless database.](./media/single-database-create-quickstart/configure-database.png)
1. On the **Networking** tab, for **Connectivity method**, select **Public endpoint**. 1. For **Firewall rules**, set **Add current client IP address** to **Yes**. Leave **Allow Azure services and resources to access this server** set to **No**. 1. Select **Next: Security** at the bottom of the page.
- :::image type="content" source="media/ledger/ledger-create-database-networking-tab.png" alt-text="Networking tab of Create Database in Azure portal":::
+ :::image type="content" source="media/ledger/ledger-create-database-networking-tab.png" alt-text="Screenshot that shows the Networking tab of the Create SQL Database screen in the Azure portal.":::
1. On the **Security** tab, in the **Ledger** section, select the **Configure ledger** option.
- :::image type="content" source="media/ledger/ledger-configure-ledger-security-tab.png" alt-text="Configure ledger in Security tab of Azure portal":::
+ :::image type="content" source="media/ledger/ledger-configure-ledger-security-tab.png" alt-text="Screenshot that shows configuring a ledger on the Security tab of the Azure portal.":::
-1. On the **Configure ledger** pane, in the **Ledger** section, select the **Enable for all future tables in this database** checkbox. This setting ensures that all future tables in the database will be ledger tables, which means that all data in the database will be tamper evident. By default, new tables will be created as updatable ledger tables, even if you don't specify `LEDGER = ON` in [CREATE TABLE](/sql/t-sql/statements/create-table-transact-sql). Alternatively, you can leave this unselected, requiring you to enable ledger functionality on a per-table basis when creating new tables using Transact-SQL.
+1. On the **Configure ledger** pane, in the **Ledger** section, select the **Enable for all future tables in this database** checkbox. This setting ensures that all future tables in the database will be ledger tables, which makes all data in the database tamper evident. By default, new tables will be created as updatable ledger tables, even if you don't specify `LEDGER = ON` in [CREATE TABLE](/sql/t-sql/statements/create-table-transact-sql). You can also leave this option unselected. You're then required to enable ledger functionality on a per-table basis when you create new tables by using Transact-SQL.
-1. In the **Digest storage** section, **Enable automatic digest storage** will be automatically selected, subsequently creating a new Azure Storage account and container where your digests will be stored.
+1. In the **Digest Storage** section, **Enable automatic digest storage** is automatically selected. A new Azure Storage account and container are then created to store your digests.
-1. Click the **Apply** button.
+1. Select **Apply**.
- :::image type="content" source="media/ledger/ledger-configure-ledger-pane.png" alt-text="Configure ledger pane in Azure portal":::
+ :::image type="content" source="media/ledger/ledger-configure-ledger-pane.png" alt-text="Screenshot that shows the Configure ledger (preview) pane in the Azure portal.":::
-1. Select **Review + create** at the bottom of the page:
+1. Select **Review + create** at the bottom of the page.
- :::image type="content" source="media/ledger/ledger-review-security-tab.png" alt-text="Review and create ledger database in Security tab of Azure portal":::
+ :::image type="content" source="media/ledger/ledger-review-security-tab.png" alt-text="Screenshot that shows reviewing and creating a ledger database on the Security tab of the Azure portal.":::
-1. On the **Review + create** page, after reviewing, select **Create**.
+1. On the **Review + create** page, after you review, select **Create**.
## Clean up resources
-Keep the resource group, server, and single database to go on to the next steps, and learn how to use the ledger feature of your database with different methods.
+Keep the resource group, server, and single database for the next steps. You'll learn how to use the ledger feature of your database with different methods.
-When you're finished using these resources, you can delete the resource group you created, which will also delete the server and single database within it.
+When you're finished using these resources, delete the resource group you created. This action also deletes the server and single database within it.
-### Using the Azure portal
+### Use the Azure portal
-To delete **myResourceGroup** and all its resources using the Azure portal:
+To delete **myResourceGroup** and all its resources by using the Azure portal:
-1. In the portal, search for and select **Resource groups**, and then select **myResourceGroup** from the list.
+1. In the portal, search for and select **Resource groups**. Then select **myResourceGroup** from the list.
1. On the resource group page, select **Delete resource group**.
-1. Under **Type the resource group name**, enter *myResourceGroup*, and then select **Delete**.
+1. Under **Type the resource group name**, enter **myResourceGroup**, and then select **Delete**.
## Next steps
-Connect and query your database using different tools and languages:
+Connect and query your database by using different tools and languages:
- [Create and use updatable ledger tables](ledger-how-to-updatable-ledger-tables.md) - [Create and use append-only ledger tables](ledger-how-to-append-only-ledger-tables.md)
azure-sql Ledger How To Access Acl Digest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-how-to-access-acl-digest.md
Title: "How to access the digests stored in Azure Confidential Ledger (ACL)"
-description: How to access the digests stored in Azure Confidential Ledger (ACL) with Azure SQL Database ledger
+ Title: "Access the digests stored in Azure Confidential Ledger"
+description: Access the digests stored in Azure Confidential Ledger with an Azure SQL Database ledger.
Last updated "05/25/2021"
-# How to access the digests stored in ACL
+# Access the digests stored in Confidential Ledger
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in **public preview**.
+> Azure SQL Database ledger is currently in public preview and available in West Central US.
-This article shows you how to access an [Azure SQL Database ledger](ledger-overview.md) digest stored in [Azure Confidential Ledger (ACL)](../../confidential-ledger/index.yml) to get end-to-end security and integrity guarantees. Through this article, we'll explain how to access and verify integrity of the stored information.
+This article shows you how to access an [Azure SQL Database ledger](ledger-overview.md) digest stored in [Azure Confidential Ledger](../../confidential-ledger/index.yml) to get end-to-end security and integrity guarantees. Throughout this article, we'll explain how to access and verify integrity of the stored information.
## Prerequisites -- Python 2.7, 3.5.3, or later-- Have an existing Azure SQL Database with ledger enabled. See [Quickstart: Create an Azure SQL Database with ledger enabled](ledger-create-a-single-database-with-ledger-enabled.md) if you haven't already created an Azure SQL Database.-- [Azure Confidential Ledger client library for Python](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/confidentialledger/azure-confidentialledger)-- A running instance of [Azure Confidential Ledger](../../confidential-ledger/index.yml).
+- Python 2.7, 3.5.3, or later.
+- Azure SQL Database with ledger enabled. If you haven't already created a database in SQL Database, see [Quickstart: Create a database in SQL Database with ledger enabled](ledger-create-a-single-database-with-ledger-enabled.md).
+- [Azure Confidential Ledger client library for Python](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/confidentialledger/azure-confidentialledger).
+- A running instance of [Confidential Ledger](../../confidential-ledger/index.yml).
## How does the integration work?
-Azure SQL server calculates the digests of the [ledger database(s)](ledger-overview.md#ledger-database) periodically and stores them in Azure Confidential Ledger. At any time, a user can validate the integrity of the data by downloading the digests from Azure Confidential Ledger and comparing them to the digests stored in Azure SQL Database ledger. The following steps will explain it.
+Azure SQL Server calculates the digests of the [ledger databases](ledger-overview.md#ledger-database) periodically and stores them in Confidential Ledger. At any time, you can validate the integrity of the data. Download the digests from Confidential Ledger and compare them to the digests stored in a SQL Database ledger. The following steps explain the process.
-## 1. Find the Digest location
+## 1. Find the digest location
> [!NOTE]
-> The query will return more than one row if multiple Azure Confidential Ledger instances were used to store the digest. For each row, repeat steps 2 through 6 to download the digests from all instances of Azure Confidential Ledger.
+> The query returns more than one row if multiple Confidential Ledger instances were used to store the digest. For each row, repeat steps 2 through 6 to download the digests from all instances of Confidential Ledger.
-Using the [SQL Server Management Studio (SSMS)](/sql/ssms/download-sql-server-management-studio-ssms), run the following query. The output shows the endpoint of the Azure Confidential Ledger instance where the digests are stored.
+Use [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) to run the following query. The output shows the endpoint of the Confidential Ledger instance where the digests are stored.
```sql SELECT * FROM sys.database_ledger_digest_locations WHERE path like '%.confidential-ledger.azure.com%' ```
-## 2. Determine the Subledgerid
+## 2. Determine the subledgerid
-We're interested in the value in the path column from the query output. It consists of two parts, namely the `host name` and the `subledgerid`. As an example, in the Url `https://contoso-ledger.confidential-ledger.azure.com/sqldbledgerdigests/ledgersvr2/ledgerdb/2021-04-13T21:20:51.0000000`, the `host name` is `https://contoso-ledger.confidential-ledger.azure.com` and the `subledgerid` is `sqldbledgerdigests/ledgersvr2/ledgerdb/2021-04-13T21:20:51.0000000`. We'll use it in Step 4 to download the digests.
+We're interested in the value in the `path` column from the query output. It consists of two parts: the `host name` and the `subledgerid`. As an example, in the URL `https://contoso-ledger.confidential-ledger.azure.com/sqldbledgerdigests/ledgersvr2/ledgerdb/2021-04-13T21:20:51.0000000`, the `host name` is `https://contoso-ledger.confidential-ledger.azure.com` and the `subledgerid` is `sqldbledgerdigests/ledgersvr2/ledgerdb/2021-04-13T21:20:51.0000000`. We'll use these values in step 4 to download the digests.
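Splitting the `path` value into its two parts is simple string handling. The following Python sketch is illustrative only (the helper name is ours, not part of any SDK), using the example URL from this section:

```python
from urllib.parse import urlparse

def split_digest_path(path):
    """Split a digest location path into its host name and subledgerid parts."""
    parsed = urlparse(path)
    host_name = f"{parsed.scheme}://{parsed.netloc}"  # e.g. https://contoso-ledger.confidential-ledger.azure.com
    subledgerid = parsed.path.lstrip("/")             # everything after the host name
    return host_name, subledgerid

host, subledger = split_digest_path(
    "https://contoso-ledger.confidential-ledger.azure.com"
    "/sqldbledgerdigests/ledgersvr2/ledgerdb/2021-04-13T21:20:51.0000000"
)
print(host)       # https://contoso-ledger.confidential-ledger.azure.com
print(subledger)  # sqldbledgerdigests/ledgersvr2/ledgerdb/2021-04-13T21:20:51.0000000
```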
## 3. Obtain an Azure AD token
-The Azure Confidential Ledger API accepts an Azure Active Directory (Azure AD) Bearer token as the caller identity. This identity needs access to ACL via Azure Resource Manager during provisioning. The user who had enabled ledger in SQL Database is automatically given administrator access to Azure Confidential Ledger. To obtain a token, the user needs to authenticate using [Azure CLI](/cli/azure/install-azure-cli) with the same account that was used with Azure portal. Once the user has authenticated, they can use [AzureCliCredential](/python/api/azure-identity/azure.identity.azureclicredential) to retrieve a bearer token and call Azure Confidential Ledger API.
+The Confidential Ledger API accepts an Azure Active Directory (Azure AD) bearer token as the caller identity. This identity needs access to Confidential Ledger via Azure Resource Manager during provisioning. When you enable ledger in SQL Database, you're automatically given administrator access to Confidential Ledger. To obtain a token, you need to authenticate by using the [Azure CLI](/cli/azure/install-azure-cli) with the same account that was used with the Azure portal. After you've authenticated, you can use [AzureCliCredential](/python/api/azure-identity/azure.identity.azureclicredential) to retrieve a bearer token and call the Confidential Ledger API.
-Log in to Azure AD using the identity with access to ACL.
+Sign in to Azure AD by using the identity with access to Confidential Ledger.
```azure-cli az login ```
-Retrieve the Bearer token.
+Retrieve the bearer token.
```python from azure.identity import AzureCliCredential credential = AzureCliCredential() ```
-## 4. Download the digests from Azure Confidential Ledger
+## 4. Download the digests from Confidential Ledger
-The following Python script downloads the digests from Azure Confidential Ledger. The script uses the [Azure Confidential Ledger client library for Python.](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/confidentialledger/azure-confidentialledger)
+The following Python script downloads the digests from Confidential Ledger. The script uses the [Confidential Ledger client library for Python](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/confidentialledger/azure-confidentialledger).
```python from azure.identity import AzureCliCredential
else:
print("\n***No more digests were found for the supplied SubledgerID.") ```
-## 5. Download the Digests from the SQL Server
+## 5. Download the digests from the SQL server
> [!NOTE]
-> This is a way to confirm that the hashes stored in the Azure SQL Database ledger have not changed over time. For complete audit of the integrity of the Azure SQL Database ledger, see [How to verify a ledger table to detect tampering](ledger-verify-database.md).
+> This step is a way to confirm that the hashes stored in the SQL Database ledger haven't changed over time. For a complete audit of the integrity of the SQL Database ledger, see [Verify a ledger table to detect tampering](ledger-verify-database.md).
-Using [SSMS](/sql/ssms/download-sql-server-management-studio-ssms), run the following query. The query returns the digests of the blocks from Genesis.
+Use [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) to run the following query. The query returns the digests of all blocks, starting from the genesis block.
```sql SELECT * FROM sys.database_ledger_blocks
SELECT * FROM sys.database_ledger_blocks
## 6. Comparison
-Compare the digest retrieved from the Azure Confidential Ledger to the digest returned from your SQL database using the `block_id` as the key. For example, the digest of `block_id` = `1` is the value of the `previous_block_hash` column in the `block_id`= `2` row. Similarly, for `block_id` = `3`, it's the value of the `previous_block_id` column in the `block_id` = `4` row. A mismatch in the hash value is an indicator of a potential data tampering.
+Compare the digest retrieved from Confidential Ledger to the digest returned from your database in SQL Database by using `block_id` as the key. For example, the digest of `block_id` = `1` is the value of the `previous_block_hash` column in the `block_id` = `2` row. Similarly, for `block_id` = `3`, it's the value of the `previous_block_hash` column in the `block_id` = `4` row. A mismatch in the hash value is an indicator of potential data tampering.
-If data tampering is suspected, see [How to verify a ledger table to detect tampering](ledger-verify-database.md) to perform a full audit of the Azure SQL Database ledger.
+If you suspect data tampering, see [Verify a ledger table to detect tampering](ledger-verify-database.md) to perform a full audit of the SQL Database ledger.
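The block-by-block comparison in step 6 can be sketched in Python. This is an illustrative helper (not part of the product or SDK), assuming you've loaded the Confidential Ledger digests and the `previous_block_hash` values from `sys.database_ledger_blocks` into dictionaries keyed by `block_id`:

```python
def find_mismatched_blocks(acl_digests, sql_blocks):
    """Compare digests from Confidential Ledger against the database ledger.

    acl_digests: {block_id: digest_hash} downloaded from Confidential Ledger.
    sql_blocks:  {block_id: previous_block_hash} from sys.database_ledger_blocks.
    The digest of block N should equal the previous_block_hash of block N + 1.
    Returns the block IDs whose digests don't match.
    """
    mismatched = []
    for block_id, digest in acl_digests.items():
        expected = sql_blocks.get(block_id + 1)
        if expected is not None and expected != digest:
            mismatched.append(block_id)
    return mismatched

# Example with hypothetical hash values:
acl = {1: "0xAAA", 2: "0xBBB"}
sql = {2: "0xAAA", 3: "0xCCC"}  # previous_block_hash per row
print(find_mismatched_blocks(acl, sql))  # [2]
```

Any block ID the helper returns identifies a digest that no longer matches the database ledger, which warrants the full audit linked above.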
## Next steps -- [Azure SQL Database ledger Overview](ledger-overview.md)
+- [Azure SQL Database ledger overview](ledger-overview.md)
- [Database ledger](ledger-database-ledger.md) - [Digest management and database verification](ledger-digest-management-and-database-verification.md) - [Append-only ledger tables](ledger-append-only-ledger-tables.md) - [Updatable ledger tables](ledger-updatable-ledger-tables.md)-- [How to verify a ledger table to detect tampering](ledger-verify-database.md)
+- [Verify a ledger table to detect tampering](ledger-verify-database.md)
azure-sql Ledger Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-limits.md
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in **public preview**.
+> Azure SQL Database ledger is currently in public preview and available in West Central US.
-This article provides an overview of the limitations when using ledger tables with Azure SQL Database.
+This article provides an overview of the limitations of ledger tables used with Azure SQL Database.
## Limitations | Function | Limitation | | : | : |
-| Disabling [ledger database](ledger-database-ledger.md) | Once enabled, ledger database cannot be disabled. |
-| Maximum # of columns | When created, [updatable ledger tables](ledger-updatable-ledger-tables.md) adds four [GENERATED ALWAYS](/sql/t-sql/statements/create-table-transact-sql#generate-always-columns) columns to the ledger table and [append-only ledger tables](ledger-append-only-ledger-tables.md) add two columns to the ledger table. These new columns count against the maximum supported number of columns in Azure SQL Database (1024). |
+| Disabling [ledger database](ledger-database-ledger.md) | After a ledger database is enabled, it can't be disabled. |
+| Maximum number of columns | When an [updatable ledger table](ledger-updatable-ledger-tables.md) is created, it adds four [GENERATED ALWAYS](/sql/t-sql/statements/create-table-transact-sql#generate-always-columns) columns to the ledger table. An [append-only ledger table](ledger-append-only-ledger-tables.md) adds two columns to the ledger table. These new columns count against the maximum supported number of columns in SQL Database (1,024). |
| Restricted data types | XML, SqlVariant, User-defined type, and FILESTREAM data types aren't supported. | | In-memory tables | In-memory tables aren't supported. | | Sparse column sets | Sparse column sets aren't supported. |
-| Ledger truncation | Deleting older data in [append-only ledger tables](ledger-append-only-ledger-tables.md), or the history table of [updatable ledger tables](ledger-updatable-ledger-tables.md) aren't supported. |
-| Converting existing tables to ledger tables | Existing tables in a database that aren't ledger-enabled cannot be converted over to ledger tables. |
-|LRS support for [automated digest management](ledger-digest-management-and-database-verification.md) | Automated digest management with ledger tables using [Azure Storage immutable blobs](../../storage/blobs/storage-blob-immutable-storage.md) doesn't offer the ability for users to use [locally redundant storage (LRS)](../../storage/common/storage-redundancy.md#locally-redundant-storage) accounts.|
+| Ledger truncation | Deleting older data in [append-only ledger tables](ledger-append-only-ledger-tables.md) or the history table of [updatable ledger tables](ledger-updatable-ledger-tables.md) isn't supported. |
+| Converting existing tables to ledger tables | Existing tables in a database that aren't ledger-enabled can't be converted to ledger tables. |
+|Locally redundant storage (LRS) support for [automated digest management](ledger-digest-management-and-database-verification.md) | Automated digest management with ledger tables by using [Azure Storage immutable blobs](../../storage/blobs/storage-blob-immutable-storage.md) doesn't offer the ability for users to use [LRS](../../storage/common/storage-redundancy.md#locally-redundant-storage) accounts.|
## Remarks -- When a ledger database is created, all new tables created by default (without specifying the `APPEND_ONLY = ON` clause) in the database will be [updatable ledger tables](ledger-updatable-ledger-tables.md). [Append-only ledger tables](ledger-append-only-ledger-tables.md) can be created using [CREATE TABLE (Transact-SQL)](/sql/t-sql/statements/create-table-transact-sql) statements.
+- When a ledger database is created, all new tables created by default (without specifying the `APPEND_ONLY = ON` clause) in the database will be [updatable ledger tables](ledger-updatable-ledger-tables.md). To create [append-only ledger tables](ledger-append-only-ledger-tables.md), use [CREATE TABLE (Transact-SQL)](/sql/t-sql/statements/create-table-transact-sql) statements.
- Ledger tables can't be a FILETABLE. - Ledger tables can't have full-text indexes. - Ledger tables can't be renamed. - Ledger tables can't be moved to a different schema. - Only nullable columns can be added to ledger tables, and when they aren't specified WITH VALUES.-- Columns in ledger tables cannot be dropped.
+- Columns in ledger tables can't be dropped.
- Only deterministic-computed columns are allowed for ledger tables.-- Existing columns cannot be altered in a way that modifies the format for this column.
+- Existing columns can't be altered in a way that modifies the format for this column.
- We allow changing:
- - Nullability
- - Collation for nvarchar/ntext columns and when the code page isn't changing for char/text columns
- - Change the length of variable length columns
- - Sparseness
-- SWITCH IN/OUT isn't allowed for ledger tables-- Long-term backups (LTR) aren't supported for databases that have `LEDGER = ON`-- `LEDGER` or `SYSTEM_VERSIONING` cannot be disabled for ledger tables.-- The `UPDATETEXT` and `WRITETEXT` APIs cannot be used on ledger tables.
+ - Nullability.
+ - Collation for nvarchar/ntext columns and when the code page isn't changing for char/text columns.
+ - The length of variable length columns.
+ - Sparseness.
+- SWITCH IN/OUT isn't allowed for ledger tables.
+- Long-term backups (LTR) aren't supported for databases that have `LEDGER = ON`.
+- The `LEDGER` or `SYSTEM_VERSIONING` option can't be disabled for ledger tables.
+- The `UPDATETEXT` and `WRITETEXT` APIs can't be used on ledger tables.
- A transaction can update up to 200 ledger tables. - For updatable ledger tables, we inherit all of the limitations of temporal tables. - Change tracking isn't allowed on ledger tables.-- Ledger tables can't have a rowstore non-clustered index when they have a clustered Columnstore index.
+- Ledger tables can't have a rowstore non-clustered index when they have a clustered columnstore index.
## Next steps -- [Updatable ledger tables](ledger-updatable-ledger-tables.md) -- [Append-only ledger tables](ledger-append-only-ledger-tables.md) -- [Database ledger](ledger-database-ledger.md) -- [Digest management and database verification](ledger-digest-management-and-database-verification.md)
+- [Updatable ledger tables](ledger-updatable-ledger-tables.md)
+- [Append-only ledger tables](ledger-append-only-ledger-tables.md)
+- [Database ledger](ledger-database-ledger.md)
+- [Digest management and database verification](ledger-digest-management-and-database-verification.md)
azure-sql Ledger Verify Database https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/ledger-verify-database.md
Title: How to verify a ledger table to detect tampering
-description: This article discusses how to verify if an Azure SQL Database table has been tampered with
+ Title: Verify a ledger table to detect tampering
+description: This article discusses how to verify if an Azure SQL Database table was tampered with.
ms.devlang:
Last updated 05/25/2021
-# How to verify a ledger table to detect tampering
+# Verify a ledger table to detect tampering
[!INCLUDE[appliesto-sqldb](../includes/appliesto-sqldb.md)] > [!NOTE]
-> Azure SQL Database ledger is currently in **public preview**.
+> Azure SQL Database ledger is currently in public preview and available in West Central US.
-In this article, you'll verify the integrity of the data in your Azure SQL Database ledger tables. If you've checked **Enable automatic digest storage** when [creating your Azure SQL Database](ledger-create-a-single-database-with-ledger-enabled.md), follow the Azure portal instructions to automatically generate the Transact-SQL (T-SQL) script needed to verify the database ledger in [Query Editor](connect-query-portal.md). Otherwise, follow the T-SQL instructions using either [SQL Server Management Studio (SSMS)](/sql/ssms/download-sql-server-management-studio-ssms) or [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio).
+In this article, you'll verify the integrity of the data in your Azure SQL Database ledger tables. If you selected **Enable automatic digest storage** when you [created your database in SQL Database](ledger-create-a-single-database-with-ledger-enabled.md), follow the Azure portal instructions to automatically generate the Transact-SQL (T-SQL) script needed to verify the database ledger in the [query editor](connect-query-portal.md). Otherwise, follow the T-SQL instructions by using [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) or [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio).
-## Prerequisite
+## Prerequisites
-- An active Azure subscription. If you don't have one, [create a free account](https://azure.microsoft.com/free/).-- [Create an Azure SQL Database with ledger enabled.](ledger-create-a-single-database-with-ledger-enabled.md)-- [Create and use updatable ledger tables](ledger-how-to-updatable-ledger-tables.md) or [Create and use append-only ledger tables](ledger-how-to-append-only-ledger-tables.md)
+- Have an active Azure subscription. If you don't have one, [create a free account](https://azure.microsoft.com/free/).
+- [Create a database in SQL Database with ledger enabled](ledger-create-a-single-database-with-ledger-enabled.md).
+- [Create and use updatable ledger tables](ledger-how-to-updatable-ledger-tables.md) or [create and use append-only ledger tables](ledger-how-to-append-only-ledger-tables.md).
-## Run ledger verification for Azure SQL Database
+## Run ledger verification for SQL Database
# [Portal](#tab/azure-portal)
-1. Open the [Azure portal](https://portal.azure.com/), select **All resources** and locate the database you want to verify. Select that SQL database.
+1. Open the [Azure portal](https://portal.azure.com/), select **All resources**, and locate the database you want to verify. Select that database in SQL Database.
- :::image type="content" source="media/ledger/ledger-portal-all-resources.png" alt-text="Azure portal showing with All resources tab":::
+ :::image type="content" source="media/ledger/ledger-portal-all-resources.png" alt-text="Screenshot that shows the Azure portal with the All resources tab selected.":::
1. In **Security**, select the **Ledger** option.
- :::image type="content" source="media/ledger/ledger-portal-manage-ledger.png" alt-text="Azure portal ledger security tab":::
+ :::image type="content" source="media/ledger/ledger-portal-manage-ledger.png" alt-text="Screenshot that shows the Azure portal with the Security Ledger tab selected.":::
-1. In the **Ledger** pane, select the **</> Verify database** button, and select the **copy** icon in the pre-populated text in the window.
+1. In the **Ledger** pane, select **</> Verify database**, and select the **copy** icon in the pre-populated text in the window.
:::image type="content" source="media/ledger/ledger-portal-verify.png" alt-text="Azure portal verify database button":::
-1. Open **Query Editor** in the left menu.
+ > [!IMPORTANT]
+ > If you haven't configured automatic digest storage for your database digests and are instead manually managing digests, don't copy this script. Continue to step 6.
- :::image type="content" source="media/ledger/ledger-portal-open-query-editor.png" alt-text="Azure portal query editor button":::
+1. Open **Query editor** in the left menu.
-1. In **Query editor**, paste the T-SQL script you copied in Step 3, and select **Run**.
+ :::image type="content" source="media/ledger/ledger-portal-open-query-editor.png" alt-text="Screenshot that shows the Azure portal Query editor menu option.":::
- :::image type="content" source="media/ledger/ledger-portal-run-query-editor.png" alt-text="Azure portal run query editor to verify the database":::
+1. In the query editor, paste the T-SQL script you copied in step 3, and select **Run**. Continue to step 8.
-1. Successful verification will return the following in the **Results** window.
+ :::image type="content" source="media/ledger/ledger-portal-run-query-editor.png" alt-text="Screenshot that shows the Azure portal Run query editor to verify the database.":::
- - If there was no tampering in your database, the message will be as follows:
+1. If you're using manual digest storage, enter the following T-SQL into the query editor to retrieve your latest database digest. Copy the digest from the results returned for the next step.
- ```output
- Ledger verification successful
+ ```sql
+ EXECUTE sp_generate_database_ledger_digest
+ ```
+
+1. In the query editor, paste the following T-SQL, replacing `<database_digest>` with the digest you copied in step 6, and select **Run**.
+
+ ```sql
+ EXECUTE sp_verify_database_ledger N'<database_digest>'
```
- - If there was tampering in your database, the following error will be in the **Messages** window.
+1. Verification returns the following messages in the **Results** window.
+
+ - If there was no tampering in your database, the message is:
+
+ ```output
+ Ledger verification successful
+ ```
+
+ - If there was tampering in your database, the following error appears in the **Messages** window.
- ```output
- Failed to execute query. Error: The hash of block xxxx in the database ledger does not match the hash provided in the digest for this block.
+ ```output
+ Failed to execute query. Error: The hash of block xxxx in the database ledger does not match the hash provided in the digest for this block.
+ ```
+
+# [T-SQL using automatic digest storage](#tab/t-sql-automatic)
+
+1. Connect to your database by using [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) or [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio).
+
+1. Create a new query with the following T-SQL statement:
+
+ ```sql
+ DECLARE @digest_locations NVARCHAR(MAX) = (SELECT * FROM sys.database_ledger_digest_locations FOR JSON AUTO, INCLUDE_NULL_VALUES);
+ SELECT @digest_locations as digest_locations;
+ BEGIN TRY
+ EXEC sys.sp_verify_database_ledger_from_digest_storage @digest_locations;
+ SELECT 'Ledger verification succeeded.' AS Result;
+ END TRY
+ BEGIN CATCH
+ THROW;
+ END CATCH
```
-# [T-SQL](#tab/t-sql)
+1. Execute the query. You'll see that **digest_locations** returns the current location where your database digests are stored, along with any previous locations. **Result** returns the success or failure of ledger verification.
+
+ :::image type="content" source="media/ledger/verification_script_exectution.png" alt-text="Screenshot of running ledger verification by using Azure Data Studio.":::
+
+1. Open the **digest_locations** result set to view the locations of your digests. The following example shows two digest storage locations for this database:
+
+ - **path** indicates the location of the digests.
+ - **last_digest_block_id** indicates the block ID of the last digest stored in the **path** location.
+ - **is_current** indicates whether the location in **path** is the current (true) or previous (false) one.
+
+ ```json
+ [
+ {
+ "path": "https:\/\/digest1.blob.core.windows.net\/sqldbledgerdigests\/janderstestportal2server\/jandersnewdb\/2021-05-20T04:39:47.6570000",
+ "last_digest_block_id": 10016,
+ "is_current": true
+ },
+ {
+ "path": "https:\/\/jandersneweracl.confidential-ledger.azure.com\/sqldbledgerdigests\/janderstestportal2server\/jandersnewdb\/2021-05-20T04:39:47.6570000",
+ "last_digest_block_id": 1704,
+ "is_current": false
+ }
+ ]
+ ```
+
+ > [!IMPORTANT]
+ > When you run ledger verification, inspect the location of **digest_locations** to ensure digests used in verification are retrieved from the locations you expect. You want to make sure that a privileged user hasn't changed locations of digest storage to an unprotected storage location, such as Azure Storage, without a configured and locked immutability policy.
+
+1. Verification returns the following message in the **Results** window.
+
+ - If there was no tampering in your database, the message is:
+
+ ```output
+ Ledger verification successful
+ ```
+
+ - If there was tampering in your database, the following error appears in the **Messages** window:
+
+ ```output
+ Failed to execute query. Error: The hash of block xxxx in the database ledger does not match the hash provided in the digest for this block.
+ ```
+
+# [T-SQL using manual digest storage](#tab/t-sql-manual)
-1. Connect to your database using either [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) or [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio).
-1. Create a new query with the following T-SQL statement.
+1. Connect to your database by using [SQL Server Management Studio](/sql/ssms/download-sql-server-management-studio-ssms) or [Azure Data Studio](/sql/azure-data-studio/download-azure-data-studio).
+1. Create a new query with the following T-SQL statement:
```sql /****** This will retrieve the latest digest file ******/ EXECUTE sp_generate_database_ledger_digest ```
-1. Execute the query. The results contain the latest database digest, and represent the hash of the database at the current point in time. Copy the contents of the results to be used in the next step.
+1. Execute the query. The results contain the latest database digest and represent the hash of the database at the current point in time. Copy the contents of the results to be used in the next step.
- :::image type="content" source="media/ledger/ledger-retrieve-digest.png" alt-text="Retrieve digest results using Azure Data Studio":::
+ :::image type="content" source="media/ledger/ledger-retrieve-digest.png" alt-text="Screenshot that shows retrieving digest results by using Azure Data Studio.":::
1. Create a new query with the following T-SQL statement. Replace `<YOUR DATABASE DIGEST>` with the digest you copied in the previous step.
In this article, you'll verify the integrity of the data in your Azure SQL Datab
' ```
-1. Execute the query. The **Messages** will contain the success message below.
+1. Execute the query. The **Messages** window contains the following success message.
- :::image type="content" source="media/ledger/ledger-verify-message.png" alt-text="Message after running T-SQL query for ledger verification using Azure Data Studio":::
+ :::image type="content" source="media/ledger/ledger-verify-message.png" alt-text="Screenshot that shows the message after running T-SQL query for ledger verification by using Azure Data Studio.":::
> [!TIP]
- > Running ledger verification with the latest digest will only verify the database from the time the digest was generated until the time the verification was run. To verify that the historical data in your database was not tampered with, run verification using multiple database digest files. Start with the point in time which you want to verify the database. An example of a verification passing multiple digests would look similar to the below query:
+ > Running ledger verification with the latest digest will only verify the database from the time the digest was generated until the time the verification was run. To verify that the historical data in your database wasn't tampered with, run verification by using multiple database digest files. Start with the point in time for which you want to verify the database. An example of a verification passing multiple digests would look similar to the following query.
``` EXECUTE sp_verify_database_ledger N'
In this article, you'll verify the integrity of the data in your Azure SQL Datab
## Next steps -- [Azure SQL Database ledger Overview](ledger-overview.md)-- [Database ledger](ledger-database-ledger.md)
+- [Azure SQL Database ledger overview](ledger-overview.md)
+- [SQL Database ledger](ledger-database-ledger.md)
- [Digest management and database verification](ledger-digest-management-and-database-verification.md) - [Append-only ledger tables](ledger-append-only-ledger-tables.md) - [Updatable ledger tables](ledger-updatable-ledger-tables.md)-- [How to access the digests stored in Azure Confidential Ledger (ACL)](ledger-how-to-access-acl-digest.md)
+- [Access the digests stored in Azure Confidential Ledger](ledger-how-to-access-acl-digest.md)
azure-sql Sql Data Sync Data Sql Server Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/sql-data-sync-data-sql-server-sql-database.md
Provisioning and deprovisioning during sync group creation, update, and deletion
- If two primary keys are only different in case (e.g. Foo and foo), Data Sync won't support this scenario. - Truncating tables is not an operation supported by Data Sync (changes won't be tracked). - Hyperscale databases are not supported.
+- Memory-optimized tables are not supported.
#### Unsupported data types
azure-sql Vnet Subnet Determine Size https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/managed-instance/vnet-subnet-determine-size.md
ms.devlang:
- Previously updated : 02/22/2019+ Last updated : 06/14/2021 # Determine required subnet size & range for Azure SQL Managed Instance [!INCLUDE[appliesto-sqlmi](../includes/appliesto-sqlmi.md)]
By design, a managed instance needs a minimum of 32 IP addresses in a subnet. As
- service tier - hardware generation - number of vCores
+ - [maintenance window](../database/maintenance-window.md)
- Plans to scale up/down or change service tier > [!IMPORTANT]
Size your subnet according to the future instance deployment and scaling needs.
- Azure uses five IP addresses in the subnet for its own needs - Each virtual cluster allocates additional number of addresses - Each managed instance uses number of addresses that depends on pricing tier and hardware generation
+- Each scaling request temporarily allocates additional number of addresses
> [!IMPORTANT] > It is not possible to change the subnet address range if any resource exists in the subnet. It is also not possible to move managed instances from one subnet to another. Whenever possible, please consider using bigger subnets rather than smaller to prevent issues in the future.
VC = virtual cluster
\* Column total displays number of addresses that would be taken when one instance is deployed in subnet. Each additional instance in subnet adds number of addresses represented with instance usage column. Addresses represented with Azure usage column are shared across multiple virtual clusters while addresses represented with VC usage column are shared across instances placed in that virtual cluster.
-Update operation typically requires virtual cluster resize. In some circumstances, update operation will require virtual cluster creation (for more details check [management operations article](sql-managed-instance-paas-overview.md#management-operations)). In case of virtual cluster creation, number of additional addresses required is equal to number of addresses represented by VC usage column summed with addresses required for instances placed in that virtual cluster (instance usage column).
+Another input to consider when determining subnet size (especially when multiple instances will be deployed inside the same subnet) is the [maintenance window](../database/maintenance-window.md) feature. Specifying a maintenance window for a managed instance, either during its creation or afterwards, means that the instance must be placed in a virtual cluster with the corresponding maintenance window. If there is no such virtual cluster in the subnet, a new one must be created first to accommodate the instance.
-**Example 1**: You plan to have one general purpose managed instance (Gen4 hardware) and one business critical managed instance (Gen5 hardware). That means you need a minimum of 5 + 1 + 1 * 5 + 6 + 1 * 5 = 22 IP addresses to be able to deploy. As IP ranges are defined in power of 2, your subnet requires minimum IP range of 32 (2^5) for this deployment.<br><br>
-As mentioned above, in some circumstances, update operation will require virtual cluster creation. This means that, as an example, in case of an update to the Gen5 business critical managed instance that requires a virtual cluster creation, you will need to have additional 6 + 5 = 11 IP addresses available. Since you are already using 22 of the 32 addresses, there is no available addresses for this operation. Therefore, you need to reserve the subnet with subnet mask of /26 (64 addresses).
-
-**Example 2**: You plan to have three general purpose (Gen5 hardware) and two business critical managed instances (Gen5 hardware) placed in same subnet. That means you need 5 + 6 + 3 * 3 + 2 * 5 = 30 IP addresses. Therefore, you need to reserve the subnet with subnet mask of /26. Selecting a subnet mask of /27 would cause the remaining number of addresses to be 2 (32 – 30), this would prevent update operations for all instances as additional addresses are required in subnet for performing instance scaling.
-
-**Example 3**: You plan to have one general purpose managed instance (Gen5 hardware) and perform vCores update operation from time to time. That means you need 5 + 6 + 1 * 3 + 3 = 17 IP addresses. As IP ranges are defined in power of 2, you need the IP range of 32 (2^5) IP addresses. Therefore, you need to reserve the subnet with subnet mask of /27.
+An update operation typically requires a virtual cluster resize (for more details, check the [management operations article](management-operations-overview.md)). When a new create or update request comes in, the managed instance service asks the compute platform for the new nodes that need to be added. Based on the compute response, the deployment system either expands the existing virtual cluster or creates a new one. Even though in most cases the operation completes within the same virtual cluster, the compute side does not guarantee that a new one will not be spawned. Creating a new virtual cluster increases the number of IP addresses required to perform the create or update operation and also reserves additional IP addresses in the subnet for the newly created virtual cluster.
### Address requirements for update scenarios
During scaling operation instances temporarily require additional IP capacity th
| Gen5 | BC | Switching to GP | 3 | \* Gen4 hardware is being phased out and is no longer available for new deployments. Update hardware generation from Gen4 to Gen5 to take advantage of the capabilities specific to Gen5 hardware generation.
+
+## Recommended subnet calculator
+
+Taking into account the potential creation of a new virtual cluster during a subsequent create request or instance update, and the requirement of one virtual cluster per maintenance window, the recommended formula for calculating the total number of IP addresses required is:
+
+**Formula: 5 + a * 12 + b * 16 + c * 16**
+
+- a = number of GP instances
+- b = number of BC instances
+- c = number of different maintenance window configurations
+
+Explanation:
+- 5 = number of IP addresses reserved by Azure
+- 12 addresses per GP instance = 6 for virtual cluster, 3 for managed instance, 3 additional for scaling operation
+- 16 addresses per BC instance = 6 for virtual cluster, 5 for managed instance, 5 additional for scaling operation
+- 16 addresses as a backup = scenario where new virtual cluster is created
+
+Example:
+- You plan to have three general purpose and two business critical managed instances deployed in the same subnet. All instances will have the same maintenance window configured. That means you need 5 + 3 * 12 + 2 * 16 + 1 * 16 = 89 IP addresses. As IP ranges are defined in powers of 2, your subnet requires a minimum IP range of 128 (2^7) for this deployment. Therefore, you need to reserve a subnet with a subnet mask of /25.
+
+> [!NOTE]
+> Even though it is possible to deploy managed instances in a subnet with fewer IP addresses than the subnet calculator output, always consider using bigger subnets rather than smaller ones to avoid a lack of IP addresses in the future, including the inability to create new instances in the subnet or scale existing ones.
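As a rough sketch, the recommended formula above can be expressed as a small helper that also rounds up to the next power of 2 and derives the subnet mask (a hypothetical helper for illustration, not part of any Azure tooling):

```python
import math

def required_subnet(gp_instances, bc_instances, maintenance_windows):
    """Apply the recommended formula 5 + a * 12 + b * 16 + c * 16 and
    round up to the next power of 2, returning (subnet size, prefix length)."""
    addresses = 5 + gp_instances * 12 + bc_instances * 16 + maintenance_windows * 16
    # IP ranges are defined in powers of 2, so round the raw count up.
    size = 2 ** math.ceil(math.log2(addresses))
    return size, 32 - int(math.log2(size))
```

For the example above (three general purpose instances, two business critical instances, one maintenance window configuration), this returns a 128-address range, that is, a /25 subnet.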
## Next steps
azure-sql Automated Backup Sql 2014 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/windows/automated-backup-sql-2014.md
When finished, select the **Apply** button on the bottom of the **Backups** page
If you are enabling Automated Backup for the first time, Azure configures the SQL Server IaaS Agent in the background. During this time, the Azure portal might not show that Automated Backup is configured. Wait several minutes for the agent to be installed and configured. After that, the Azure portal will reflect the new settings. > [!NOTE]
-> You can also configure Automated Backup using a template. For more information, see [Azure quickstart template for Automated Backup](https://github.com/Azure/azure-quickstart-templates/tree/master/101-vm-sql-existing-autobackup-update).
+> You can also configure Automated Backup using a template. For more information, see [Azure quickstart template for Automated Backup](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.compute/vm-sql-existing-autobackup-update).
## Configure with PowerShell
azure-sql Availability Group Quickstart Template Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/windows/availability-group-quickstart-template-configure.md
This article describes how to use the Azure quickstart templates to partially au
| Template | Description | | | |
- | [101-sql-vm-ag-setup](https://github.com/Azure/azure-quickstart-templates/tree/master/101-sql-vm-ag-setup) | Creates the Windows failover cluster and joins the SQL Server VMs to it. |
- | [101-sql-vm-aglistener-setup](https://github.com/Azure/azure-quickstart-templates/tree/master/101-sql-vm-aglistener-setup) | Creates the availability group listener and configures the internal load balancer. This template can be used only if the Windows failover cluster was created with the **101-sql-vm-ag-setup** template. |
+ | [sql-vm-ag-setup](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sqlvirtualmachine/sql-vm-ag-setup) | Creates the Windows failover cluster and joins the SQL Server VMs to it. |
+ | [sql-vm-aglistener-setup](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sqlvirtualmachine/sql-vm-aglistener-setup) | Creates the availability group listener and configures the internal load balancer. This template can be used only if the Windows failover cluster was created with the **101-sql-vm-ag-setup** template. |
| &nbsp; | &nbsp; | Other parts of the availability group configuration must be done manually, such as creating the availability group and creating the internal load balancer. This article provides the sequence of automated and manual steps.
After your SQL Server VMs have been registered with the SQL IaaS Agent extension
Adding SQL Server VMs to the *SqlVirtualMachineGroups* resource group bootstraps the Windows Failover Cluster Service to create the cluster and then joins those SQL Server VMs to that cluster. This step is automated with the **101-sql-vm-ag-setup** quickstart template. You can implement it by using the following steps:
-1. Go to the [**101-sql-vm-ag-setup**](https://github.com/Azure/azure-quickstart-templates/tree/master/101-sql-vm-ag-setup) quickstart template. Then, select **Deploy to Azure** to open the quickstart template in the Azure portal.
+1. Go to the [**sql-vm-ag-setup**](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sqlvirtualmachine/sql-vm-ag-setup) quickstart template. Then, select **Deploy to Azure** to open the quickstart template in the Azure portal.
1. Fill out the required fields to configure the metadata for the Windows failover cluster. You can leave the optional fields blank. The following table shows the necessary values for the template:
Manually create the availability group as you normally would, by using [SQL Serv
[!INCLUDE [sql-ag-use-dnn-listener](../../includes/sql-ag-use-dnn-listener.md)]
-The Always On availability group listener requires an internal instance of Azure Load Balancer. The internal load balancer provides a “floating” IP address for the availability group listener that allows for faster failover and reconnection. If the SQL Server VMs in an availability group are part of the same availability set, you can use a Basic load balancer. Otherwise, you need to use a Standard load balancer.
+The Always On availability group listener requires an internal instance of Azure Load Balancer. The internal load balancer provides a "floating" IP address for the availability group listener that allows for faster failover and reconnection. If the SQL Server VMs in an availability group are part of the same availability set, you can use a Basic load balancer. Otherwise, you need to use a Standard load balancer.
> [!IMPORTANT] > The internal load balancer should be in the same virtual network as the SQL Server VM instances.
Create the availability group listener and configure the internal load balancer
To configure the internal load balancer and create the availability group listener, do the following:
-1. Go to the [101-sql-vm-aglistener-setup](https://github.com/Azure/azure-quickstart-templates/tree/master/101-sql-vm-aglistener-setup) quickstart template and select **Deploy to Azure** to start the quickstart template in the Azure portal.
+1. Go to the [sql-vm-aglistener-setup](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.sqlvirtualmachine/sql-vm-aglistener-setup) quickstart template and select **Deploy to Azure** to start the quickstart template in the Azure portal.
1. Fill out the required fields to configure the internal load balancer, and create the availability group listener. You can leave the optional fields blank. The following table shows the necessary values for the template:
azure-sql Azure Key Vault Integration Configure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/windows/azure-key-vault-integration-configure.md
When you're finished, select the **Apply** button on the bottom of the **Securit
> [!NOTE]
-> You can also configure Key Vault integration by using a template. For more information, see [Azure quickstart template for Azure Key Vault integration](https://github.com/Azure/azure-quickstart-templates/tree/master/101-vm-sql-existing-keyvault-update).
+> You can also configure Key Vault integration by using a template. For more information, see [Azure quickstart template for Azure Key Vault integration](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.compute/vm-sql-existing-keyvault-update).
[!INCLUDE [Key Vault integration next steps](../../../../includes/virtual-machines-sql-server-akv-next-steps.md)]
azure-sql Storage Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/virtual-machines/windows/storage-configuration.md
For a full walkthrough of how to create a SQL Server VM in the Azure portal, see
If you use the following Resource Manager templates, two premium data disks are attached by default, with no storage pool configuration. However, you can customize these templates to change the number of premium data disks that are attached to the virtual machine. * [Create VM with Automated Backup](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.compute/vm-sql-full-autobackup)
-* [Create VM with Automated Patching](https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-sql-full-autopatching)
-* [Create VM with AKV Integration](https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-sql-full-keyvault)
+* [Create VM with Automated Patching](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.compute/vm-sql-full-autopatching)
+* [Create VM with AKV Integration](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.compute/vm-sql-full-keyvault)
### Quickstart template
azure-video-analyzer Get Started Detect Motion Emit Events https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-video-analyzer/video-analyzer-docs/get-started-detect-motion-emit-events.md
After completing the setup steps, you'll be able to run the simulated live video
The deployment process will take about **20 minutes**. Upon completion, you will have certain Azure resources deployed in the Azure subscription, including: 1. **Video Analyzer account** - This [cloud service](overview.md) is used to register the Video Analyzer edge module, and for playing back recorded video and video analytics. 1. **Storage account** - For storing recorded video and video analytics.
-1. **Managed Identity** - This is the user assigned [managed identity]../../active-directory/managed-identities-azure-resources/overview.md) used to manage access to the above storage account.
+1. **Managed Identity** - This is the user assigned [managed identity](../../active-directory/managed-identities-azure-resources/overview.md) used to manage access to the above storage account.
1. **Virtual machine** - This is a virtual machine that will serve as your simulated edge device. 1. **IoT Hub** - This acts as a central message hub for bi-directional communication between your IoT application, IoT Edge modules and the devices it manages.
azure-vmware Configure Nsx Network Components Azure Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/configure-nsx-network-components-azure-portal.md
Title: Configure NSX network components using Azure VMware Solution description: Learn how to use the Azure VMware Solution to configure NSX-T network segments. Previously updated : 05/28/2021 Last updated : 06/17/2021 # Customer intent: As an Azure service administrator, I want to configure NSX network components using a simplified view of NSX-T operations a VMware administrator needs daily. The simplified view is targeted at users unfamiliar with NSX-T Manager.
You can create and configure an NSX-T segment from the Azure VMware Solution con
>[!NOTE] >If you plan to use DHCP, you'll need to [configure a DHCP server or DHCP relay](#create-a-dhcp-server-or-dhcp-relay-using-the-azure-portal) before you can create and configure an NSX-T segment.
-1. In your Azure VMware Solution private cloud, under **Workload Networking**, select **Segments** > **Add**. Provide the details for the new logical segment and select **OK**.
+1. In your Azure VMware Solution private cloud, under **Workload Networking**, select **Segments** > **Add**.
+
+2. Provide the details for the new logical segment and select **OK**.
:::image type="content" source="media/configure-nsx-network-components-azure-portal/add-new-nsxt-segment.png" alt-text="Screenshot showing how to add a new segment.":::
You can create and configure an NSX-T segment from the Azure VMware Solution con
- **Type** - Overlay segment supported by Azure VMware Solution.
- The segment is now visible in the Azure VMware Solution console, NSX-T Manger, and vCenter.
+The segment is now visible in the Azure VMware Solution console, NSX-T Manager, and vCenter.
## Create a DHCP server or DHCP relay using the Azure portal
backup Backup Azure Move Recovery Services Vault https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-move-recovery-services-vault.md
All public regions and sovereign regions are supported, except France Central, F
- During vault move across resource groups, both the source and target resource groups are locked preventing the write and delete operations. For more information, see this [article](../azure-resource-manager/management/move-resource-group-and-subscription.md). - Only admin subscription has the permissions to move a vault.-- For moving vaults across subscriptions, the target subscription must reside in the same tenant as the source subscription and its state must be enabled. To move a vault to a different Azure AD directory, see [Transfer subscription to a different directory](../role-based-access-control/transfer-subscription.md) and [Recovery Service vault FAQs](/azure/backup/backup-azure-backup-faq.yml#recovery-services-vault).
+- For moving vaults across subscriptions, the target subscription must reside in the same tenant as the source subscription and its state must be enabled. To move a vault to a different Azure AD directory, see [Transfer subscription to a different directory](../role-based-access-control/transfer-subscription.md) and [Recovery Service vault FAQs](/azure/backup/backup-azure-backup-faq#recovery-services-vault).
- You must have permission to perform write operations on the target resource group. - Moving the vault only changes the resource group. The Recovery Services vault will reside on the same location and it can't be changed. - You can move only one Recovery Services vault, per region, at a time.
If you need to keep the current protected data in the old vault and continue the
You can move many different types of resources between resource groups and subscriptions.
-For more information, see [Move resources to new resource group or subscription](../azure-resource-manager/management/move-resource-group-and-subscription.md).
+For more information, see [Move resources to new resource group or subscription](../azure-resource-manager/management/move-resource-group-and-subscription.md).
backup Backup Rm Template Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-rm-template-samples.md
The following table includes links to Azure Resource Manager templates for use w
|**Recovery Services vault** | | | [Create a Recovery Services vault](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.recoveryservices/recovery-services-vault-create)| Create a Recovery Services vault. The vault can be used for Azure Backup and Azure Site Recovery. | |**Back up virtual machines**| |
-| [Back up Resource Manager VMs](https://github.com/Azure/azure-quickstart-templates/tree/master/101-recovery-services-backup-vms) | Use the existing Recovery Services vault and Backup policy to back up Resource Manager-virtual machines in the same resource group.|
-| [Back up IaaS VMs to Recovery Services vault](https://github.com/Azure/azure-quickstart-templates/tree/master/201-recovery-services-backup-classic-resource-manager-vms) | Template to back up classic and Resource Manager-virtual machines. |
+| [Back up Resource Manager VMs](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.recoveryservices/recovery-services-backup-vms) | Use the existing Recovery Services vault and Backup policy to back up Resource Manager-virtual machines in the same resource group.|
+| [Back up IaaS VMs to Recovery Services vault](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.recoveryservices/recovery-services-backup-classic-resource-manager-vms) | Template to back up classic and Resource Manager-virtual machines. |
| [Create Weekly Backup policy for IaaS VMs](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.recoveryservices/recovery-services-weekly-backup-policy-create) | Template creates Recovery Services vault and a weekly backup policy, which is used to back up classic and Resource Manager-virtual machines.| | [Create Daily Backup policy for IaaS VMs](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.recoveryservices/recovery-services-daily-backup-policy-create) | Template creates Recovery Services vault and a daily backup policy, which is used to back up classic and Resource Manager-virtual machines.| | [Deploy Windows Server VM with backup enabled](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.recoveryservices/recovery-services-create-vm-and-configure-backup) | Template creates a Windows Server VM and Recovery Services vault with the default backup policy enabled.| |**Monitor Backup jobs** | |
-| [Use Azure Monitor logs with Azure Backup](https://github.com/Azure/azure-quickstart-templates/tree/master/101-backup-oms-monitoring) | Template deploys Azure Monitor logs with Azure Backup, which allows you to monitor backup and restore jobs, backup alerts, and the Cloud storage used in your Recovery Services vaults.|
+| [Use Azure Monitor logs with Azure Backup](https://github.com/Azure/azure-quickstart-templates/tree/master/demos/backup-oms-monitoring) | Template deploys Azure Monitor logs with Azure Backup, which allows you to monitor backup and restore jobs, backup alerts, and the Cloud storage used in your Recovery Services vaults.|
|**Back up SQL Server in Azure VM** | |
-| [Back up SQL Server in Azure VM](https://github.com/Azure/azure-quickstart-templates/tree/master/101-recovery-services-vm-workload-backup) | Template creates a Recovery Services vault and Workload specific Backup Policy. It Registers the VM with Azure Backup service and Configures Protection on that VM. Currently, it only works for SQL Gallery images. |
+| [Back up SQL Server in Azure VM](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.recoveryservices/recovery-services-vm-workload-backup) | Template creates a Recovery Services vault and Workload specific Backup Policy. It Registers the VM with Azure Backup service and Configures Protection on that VM. Currently, it only works for SQL Gallery images. |
|**Back up Azure file shares** | |
-| [Back up Azure file shares](https://github.com/Azure/azure-quickstart-templates/tree/master/101-recovery-services-backup-file-share) | This template configures protection for an existing Azure file share by specifying appropriate details for the Recovery Services vault and backup policy. It optionally creates a new Recovery Services vault and backup policy, and registers the storage account containing the file share to the Recovery Services vault. |
+| [Back up Azure file shares](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.recoveryservices/recovery-services-backup-file-share) | This template configures protection for an existing Azure file share by specifying appropriate details for the Recovery Services vault and backup policy. It optionally creates a new Recovery Services vault and backup policy, and registers the storage account containing the file share to the Recovery Services vault. |
| | |
cloud-services-extended-support Deploy Prerequisite https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cloud-services-extended-support/deploy-prerequisite.md
Deployments that utilized the old diagnostics plugins need the settings removed
```xml <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" /> ```
+## Subscription access level
+
+The subscription containing networking resources needs to have [network contributor](https://docs.microsoft.com/azure/role-based-access-control/built-in-roles#network-contributor) access or above for Cloud Services (extended support). For more details, refer to [RBAC built-in roles](https://docs.microsoft.com/azure/role-based-access-control/built-in-roles).
## Key Vault creation
cloud-services-extended-support Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cloud-services-extended-support/faq.md
Cloud Services (extended support) supports dynamic & static IP allocation method
### Why am I getting charged for IP addresses? Customers are billed for IP Address use on Cloud Services (extended support) just as users are billed for IP addresses associated with virtual machines.
+### Can the reserved IP be updated after a successful deployment?
+A reserved IP cannot be added, removed, or changed during a deployment update or upgrade. If the IP address needs to be changed, use a swappable Cloud Service, or deploy two Cloud Services with a CNAME record in Azure DNS/Traffic Manager so that the IP can point to either of them.
+ ### Can I use a DNS name with Cloud Services (extended support)? Yes. Cloud Services (extended support) can also be given a DNS name. With Azure Resource Manager, the DNS label is an optional property of the public IP address that is assigned to the Cloud Service. The format of the DNS name for Azure Resource Manager based deployments is `<userlabel>.<region>.cloudapp.azure.com`
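As a trivial illustration of the DNS name format above, the pieces assemble like this (the label and region values are made-up placeholders):

```python
def cloud_service_fqdn(user_label, region):
    """Compose the ARM-style public DNS name <userlabel>.<region>.cloudapp.azure.com."""
    return f"{user_label}.{region}.cloudapp.azure.com"

# cloud_service_fqdn("contoso-app", "westeurope")
# -> "contoso-app.westeurope.cloudapp.azure.com"
```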
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Computer-vision/language-support.md
Computer Vision's OCR APIs support several languages. They do not require you to
|:--|:-:|:--:|::|::|
|Afrikaans|`af`|✔ | | |
|Albanian |`sq`|✔ | | |
-|Arabic | `ar`| ✔ | | |
+|Arabic | `ar`| | ✔ | |
|Asturian |`ast`|✔ | | |
|Basque |`eu`| ✔ | | |
|Bislama |`bi`|✔ | | |
cognitive-services Luis Container Configuration https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-container-configuration.md
Logging:Console:LogLevel:Default=Information
## Next steps * Review [How to install and run containers](luis-container-howto.md)
-* Refer to [Troubleshooting](troubleshooting.md) to resolve issues related to LUIS functionality.
+* Refer to [Troubleshooting](troubleshooting.yml) to resolve issues related to LUIS functionality.
* Use more [Cognitive Services Containers](../cognitive-services-container-support.md)
cognitive-services Luis Container Howto https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-container-howto.md
In this article, you learned concepts and workflow for downloading, installing,
* Review [Configure containers](luis-container-configuration.md) for configuration settings. * See [LUIS container limitations](luis-container-limitations.md) for known capability restrictions.
-* Refer to [Troubleshooting](troubleshooting.md) to resolve issues related to LUIS functionality.
+* Refer to [Troubleshooting](troubleshooting.yml) to resolve issues related to LUIS functionality.
* Use more [Cognitive Services Containers](../cognitive-services-container-support.md) <!-- Links - external -->
cognitive-services Luis User Privacy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-user-privacy.md
Customer content is stored encrypted in Microsoft regional Azure storage and inc
- User account content collected at registration - Training data required to build the models - Logged user queries used by [active learning](luis-concept-review-endpoint-utterances.md) to help improve the model
- - Users can turn off query logging by appending `&log=false` to the request, details [here](troubleshooting.md#how-can-i-disable-the-logging-of-utterances)
+ - Users can turn off query logging by appending `&log=false` to the request, details [here](/azure/cognitive-services/luis/troubleshooting#how-can-i-disable-the-logging-of-utterances-)
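For illustration, disabling logging per request just means adding `log=false` to the endpoint query string. The sketch below assumes the classic v2.0 prediction URL shape; the endpoint, app ID, and key shown are placeholders:

```python
from urllib.parse import urlencode

def build_prediction_url(endpoint, app_id, subscription_key, utterance, log=False):
    """Build a LUIS prediction URL; log=false asks LUIS not to store the
    utterance, so it never appears in the active-learning review list."""
    params = {
        "subscription-key": subscription_key,
        "q": utterance,
        # str(False).lower() -> "false", which is what the query string expects.
        "log": str(log).lower(),
    }
    return f"{endpoint}/luis/v2.0/apps/{app_id}?{urlencode(params)}"
```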
## Deleting customer data LUIS users have full control to delete any user content, either through the LUIS web portal or the LUIS Authoring (also known as Programmatic) APIs. The following table displays links assisting with both:
cognitive-services Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/troubleshooting.md
- Title: Frequently asked questions (FAQ) - LUIS
-description: This article contains answers to frequently asked questions about Language Understanding (LUIS).
--- Previously updated : 05/17/2021--
-# Language Understanding Frequently Asked Questions (FAQ)
-
-This article contains answers to frequently asked questions about Language Understanding (LUIS).
-
-## What's new
-
-[Learn more](whats-new.md) about what's new in Language Understanding (LUIS).
-
-<a name="luis-authoring"></a>
-
-## Authoring
-
-### What are the LUIS best practices?
-Start with the [Authoring Cycle](luis-concept-app-iteration.md), then read the [best practices](luis-concept-best-practices.md).
-
-### What is the best way to start building my app in LUIS?
-
-The best way to build your app is through an [incremental process](luis-concept-app-iteration.md).
-
-### What is a good practice to model the intents of my app? Should I create more specific or more generic intents?
-
-Choose intents that are not so general as to be overlapping, but not so specific that it makes it difficult for LUIS to distinguish between similar intents. Creating discriminative specific intents is one of the best practices for LUIS modeling.
-
-### Is it important to train the None intent?
-
-Yes, it is good to train your **None** intent with more utterances as you add more labels to other intents. A good ratio is 1 or 2 labels added to **None** for every 10 labels added to an intent. This ratio boosts the discriminative power of LUIS.
-
-### How can I correct spelling mistakes in utterances?
-
-See the [Bing Spell Check API V7](luis-tutorial-bing-spellcheck.md) tutorial. LUIS enforces limits imposed by Bing Spell Check API V7.
-
-### How do I edit my LUIS app programmatically?
-To edit your LUIS app programmatically, use the [Authoring API](https://go.microsoft.com/fwlink/?linkid=2092087). See [Call LUIS authoring API](./get-started-get-model-rest-apis.md) and [Build a LUIS app programmatically using Node.js](./luis-tutorial-node-import-utterances-csv.md) for examples of how to call the Authoring API. The Authoring API requires that you use an [authoring key](luis-how-to-azure-subscription.md) rather than an endpoint key. Programmatic authoring allows up to 1,000,000 calls per month and five transactions per second. For more info on the keys you use with LUIS, see [Manage keys](./luis-how-to-azure-subscription.md).
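As a sketch of what a programmatic Authoring API call looks like, the following Python builds the URL and headers for one operation. The region, key, app ID, and the "list intents" path are illustrative placeholders; check the Authoring API reference for the exact operation you need, and send the request with any HTTP client.

```python
def build_list_intents_request(region, authoring_key, app_id, version_id="0.1"):
    """Build the URL and headers for a 'list intents' Authoring API call (illustrative)."""
    url = (f"https://{region}.api.cognitive.microsoft.com"
           f"/luis/api/v2.0/apps/{app_id}/versions/{version_id}/intents")
    # The authoring key goes in this header, not in the query string.
    headers = {"Ocp-Apim-Subscription-Key": authoring_key}
    return url, headers

url, headers = build_list_intents_request("westus", "<your-authoring-key>", "<your-app-id>")
```

Remember that this uses an authoring key, not an endpoint key, and is subject to the authoring rate limits described above.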
-
-### Where is the Pattern feature that provided regular expression matching?
-The previous **Pattern feature** has been deprecated and replaced by **[Patterns](luis-concept-patterns.md)**.
-
-### How do I use an entity to pull out the correct data?
-See [entities](luis-concept-entity-types.md) and [data extraction](luis-concept-data-extraction.md).
-
-### Should variations of an example utterance include punctuation?
-Use one of the following solutions:
-* Ignore [punctuation](luis-reference-application-settings.md#punctuation-normalization)
-* Add the different variations as example utterances to the intent
-* Add the pattern of the example utterance with the [syntax to ignore](luis-concept-patterns.md#pattern-syntax) the punctuation.
-
-### Does LUIS currently support Cortana?
-
-Cortana prebuilt apps were deprecated in 2017. They are no longer supported.
-
-### How do I transfer ownership of a LUIS app?
-To transfer a LUIS app to a different Azure subscription, export the LUIS app and import it using a new account. Update the LUIS app ID in the client application that calls it. The new app may return slightly different LUIS scores from the original app.
-
-### A prebuilt entity is tagged in an example utterance instead of my custom entity. How do I fix this?
-
-In the LUIS portal, you can label text for the exact entity you are interested in extracting. If the LUIS portal is not showing the correct entity prediction, you may need to add more utterances and label the entity within the text or add a feature.
-
-### I tried to import an app or version file but I got an error, what happened?
-
-Read more about [version import errors](luis-how-to-manage-versions.md#import-errors).
-
-<a name="luis-collaborating"></a>
-
-## Collaborating and contributing
-
-### How do I give collaborators access to LUIS with Azure Active Directory (Azure AD) or Azure role-based access control (Azure RBAC)?
-
-See [Azure Active Directory resources](luis-how-to-collaborate.md#azure-active-directory-resources) and [Azure Active Directory tenant user](luis-how-to-collaborate.md#azure-active-directory-tenant-user) to learn how to give collaborators access.
-
-<a name="luis-endpoint"></a>
-
-## Endpoint
-
-### I received an HTTP 403 error status code. How do I fix it?
-
-You get 403 and 429 error status codes when you exceed the transactions per second or transactions per month for your pricing tier. Increase your pricing tier, or use Language Understanding [containers](luis-container-howto.md).
-
-When you have used all 1,000 free endpoint queries, or you exceed your pricing tier's monthly transaction quota, you receive an HTTP 403 error status code.
-
-To fix this error, you need to either [change your pricing tier](luis-how-to-azure-subscription.md#change-the-pricing-tier) to a higher tier or [create a new resource](luis-get-started-create-app.md#sign-in-to-luis-portal) and assign it to your app.
-
-Solutions for this error include:
-
-* In the [Azure portal](https://portal.azure.com), on your Language Understanding resource, on the **Resource Management -> Pricing tier**, change your pricing tier to a higher TPS tier. You don't need to do anything in the Language Understanding portal if your resource is already assigned to your Language Understanding app.
-* If your usage exceeds the highest pricing tier, add more Language Understanding resources with a load balancer in front of them. The [Language Understanding container](luis-container-howto.md) with Kubernetes or Docker Compose can help with this.
-
-### I received an HTTP 429 error status code. How do I fix it?
-
-You get 403 and 429 error status codes when you exceed the transactions per second or transactions per month for your pricing tier. Increase your pricing tier, or use Language Understanding [containers](luis-container-howto.md).
-
-This status code is returned when your transactions per second exceed your pricing tier's limit.
-
-Solutions include:
-
-* You can [increase your pricing tier](luis-how-to-azure-subscription.md#change-the-pricing-tier), if you are not at the highest tier.
-* If your usage exceeds the highest pricing tier, add more Language Understanding resources with a load balancer in front of them. The [Language Understanding container](luis-container-howto.md) with Kubernetes or Docker Compose can help with this.
-* You can gate your client application requests with a [retry policy](/azure/architecture/best-practices/transient-faults#general-guidelines) you implement yourself when you get this status code.
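A retry policy can be as simple as exponential backoff with jitter. The sketch below is not tied to any particular HTTP client; `send` is a placeholder callable you supply that returns a `(status_code, body)` pair.

```python
import random
import time

def call_with_retry(send, max_attempts=5, base_delay=1.0):
    """Call `send()` until it stops returning HTTP 429, backing off exponentially.

    `send` is any zero-argument callable returning (status_code, body).
    """
    for attempt in range(max_attempts):
        status, body = send()
        if status != 429:
            return status, body
        # Back off 1s, 2s, 4s, ... plus jitter so concurrent clients don't retry in lockstep.
        time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    return status, body
```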
-
-### My endpoint query returned unexpected results. What should I do?
-
-Unexpected query prediction results are based on the state of the published model. To correct the model, you may need to change the model, train, and publish again.
-
-Correcting the model starts with [active learning](luis-how-to-review-endpoint-utterances.md).
-
-You can remove non-deterministic training by using the [application version settings API](https://westus.dev.cognitive.microsoft.com/docs/services/5890b47c39e2bb17b84a55ff/operations/versions-update-application-version-settings) to train with all data.
-
-Review the [best practices](luis-concept-best-practices.md) for other tips.
-
-### Why does LUIS add spaces to the query around or in the middle of words?
-LUIS [tokenizes](luis-glossary.md#token) the utterance based on the [culture](luis-language-support.md#tokenization). Both the original value and the tokenized value are available for [data extraction](luis-concept-data-extraction.md#tokenized-entity-returned).
-
-### How do I create and assign a LUIS endpoint key?
-[Create the endpoint key](luis-how-to-azure-subscription.md) in Azure for your [service](https://azure.microsoft.com/pricing/details/cognitive-services/language-understanding-intelligent-services/) level. [Assign the key](luis-how-to-azure-subscription.md) on the **[Azure Resources](luis-how-to-azure-subscription.md)** page. There is no corresponding API for this action. Then you must change the HTTP request to the endpoint to [use the new endpoint key](luis-how-to-azure-subscription.md).
-
-### How do I interpret LUIS scores?
-Your system should use the highest scoring intent regardless of its value. For example, a score below 0.5 (less than 50%) does not necessarily mean that LUIS has low confidence. Providing more training data can help increase the [score](luis-concept-prediction-score.md) of the most-likely intent.
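For example, a minimal helper that follows this rule against a LUIS v3-style response (the sample payload below is hypothetical):

```python
def top_intent(prediction_response):
    """Return (intent, score) for the highest-scoring intent in a LUIS v3-style response."""
    intents = prediction_response["prediction"]["intents"]
    name = max(intents, key=lambda n: intents[n]["score"])
    return name, intents[name]["score"]

# Hypothetical response: 0.42 is below 0.5, but BookFlight is still the intent to act on.
sample = {"prediction": {"intents": {
    "BookFlight": {"score": 0.42},
    "None": {"score": 0.31},
}}}
```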
-
-### Why don't I see my endpoint hits in my app's Dashboard?
-The total endpoint hits in your app's Dashboard are updated periodically, but the metrics associated with your LUIS endpoint key in the Azure portal are updated more frequently.
-
-If you don't see updated endpoint hits in the Dashboard, sign in to the Azure portal, find the resource associated with your LUIS endpoint key, and open **Metrics** to select the **Total Calls** metric. If the endpoint key is used for more than one LUIS app, the metric in the Azure portal shows the aggregate number of calls from all LUIS apps that use it.
-
-### Is there a PowerShell command to get the endpoint quota?
-
-You can use a PowerShell command to see the endpoint quota:
-
-```powershell
-Get-AzCognitiveServicesAccountUsage -ResourceGroupName <your-resource-group> -Name <your-resource-name>
-```
-
-### My LUIS app was working yesterday but today I'm getting 403 errors. I didn't change the app. How do I fix it?
-Follow these [instructions](#how-do-i-create-and-assign-a-luis-endpoint-key) to create a LUIS endpoint key and assign it to the app. Then you must change the client application's HTTP request to the endpoint to [use the new endpoint key](luis-how-to-azure-subscription.md). If you created a new resource in a different region, change the HTTP client request's region too.
-
-### How do I secure my LUIS endpoint?
-See [Securing the endpoint](luis-how-to-azure-subscription.md#securing-the-endpoint).
-
-## Working within LUIS limits
-
-### What is the maximum number of intents and entities that a LUIS app can support?
-See the [boundaries](luis-limits.md) reference.
-
-### I want to build a LUIS app with more than the maximum number of intents. What should I do?
-
-See [Best practices for intents](luis-concept-intent.md#if-you-need-more-than-the-maximum-number-of-intents).
-
-### What are the limits on the number and size of phrase lists?
-For the maximum length of a [phrase list](./luis-concept-feature.md), see the [boundaries](luis-limits.md) reference.
-
-### What are the limits on example utterances?
-See the [boundaries](luis-limits.md) reference.
-
-## Testing and training
-
-### I see some errors in the batch testing pane for some of the models in my app. How can I address this problem?
-
-The errors indicate that there is some discrepancy between your labels and the predictions from your models. To address the problem, do one or both of the following tasks:
-* To help LUIS improve discrimination among intents, add more labels.
-* To help LUIS learn faster, add phrase-list features that introduce domain-specific vocabulary.
-
-See the [Batch testing](./luis-how-to-batch-test.md) tutorial.
-
-### When an app is exported then reimported into a new app (with a new app ID), the LUIS prediction scores are different. Why does this happen?
-
-See [Prediction differences between copies of same app](luis-concept-prediction-score.md#review-intents-with-similar-scores).
-
-### Some utterances go to the wrong intent after I made changes to my app. The issue seems to disappear at random. How do I fix it?
-
-See [Train with all data](luis-how-to-train.md#train-with-all-data).
-
-## App publishing
-
-### What is the tenant ID in the "Add a key to your app" window?
-In Azure, a tenant represents the client or organization that is associated with a service. Find your tenant ID in the Azure portal in the **Directory ID** box by selecting **Azure Active Directory** > **Manage** > **Properties**.
-
-![Tenant ID in the Azure portal](./media/luis-manage-keys/luis-assign-key-tenant-id.png)
-
-<a name="why-are-there-more-subscription-keys-on-my-apps-publish-page-than-i-assigned-to-the-app"></a>
-<a name="why-are-there-more-endpoint-keys-on-my-apps-publish-page-than-i-assigned-to-the-app"></a>
--
-### Why are there more endpoint keys on my app's Publish page than I assigned to the app?
-Each LUIS app has the authoring/starter key in the endpoint list as a convenience. This key allows only a few endpoint hits so you can try out LUIS.
-
-If your app existed before LUIS was generally available (GA), LUIS endpoint keys in your subscription are assigned automatically. This was done to make GA migration easier. Any new LUIS endpoint keys in the Azure portal are _not_ automatically assigned to LUIS.
-
-## Key management
-
-### How do I know what key I need, where I get it, and what I do with it?
-
-See [Authoring and query prediction endpoint keys in LUIS](luis-how-to-azure-subscription.md) to learn about the differences between the authoring key and the prediction runtime key.
-
-### I got an error about being out of quota. How do I fix it?
-
-See fixes for HTTP status codes [403](#i-received-an-http-403-error-status-code-how-do-i-fix-it) and [429](#i-received-an-http-429-error-status-code-how-do-i-fix-it) to learn more.
-
-### I need to handle more endpoint queries. How do I do that?
-
-See fixes for HTTP status codes [403](#i-received-an-http-403-error-status-code-how-do-i-fix-it) and [429](#i-received-an-http-429-error-status-code-how-do-i-fix-it) to learn more.
-
-### I created an authoring key but it isn't showing in the LUIS portal. What happened?
-
-Authoring keys are available in the LUIS portal after [migrating to the authoring key experience](luis-migration-authoring.md).
-
-## App management
-
-### How do I download a log of user utterances?
-By default, your LUIS app logs utterances from users. To download a log of utterances that users send to your LUIS app, go to **My Apps**, and select the app. In the contextual toolbar, select **Export Endpoint Logs**. The log is formatted as a comma-separated value (CSV) file.
-
-### How can I disable the logging of utterances?
-You can turn off the logging of user utterances by setting `log=false` in the Endpoint URL that your client application uses to query LUIS. However, turning off logging disables your LUIS app's ability to suggest utterances or improve performance that's based on [active learning](luis-concept-review-endpoint-utterances.md#what-is-active-learning). If you set `log=false` because of data-privacy concerns, you can't download a record of those user utterances from LUIS or use those utterances to improve your app.
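For illustration, a small helper that builds the endpoint query string with the `log` parameter set explicitly. The parameter names follow the LUIS GET endpoint convention described above; treat the exact shape as an assumption to verify against your app's endpoint URL.

```python
from urllib.parse import urlencode

def endpoint_query_string(query, prediction_key, log_utterances=True):
    """Build a LUIS endpoint query string; log=false turns off utterance logging."""
    return urlencode({
        "subscription-key": prediction_key,
        "query": query,
        # Setting log=false also disables active-learning suggestions for these queries.
        "log": "true" if log_utterances else "false",
    })
```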
-
-Logging is the only storage of utterances.
-
-### Why don't I want all my endpoint utterances logged?
-If you are using your log for prediction analysis, do not capture test utterances in your log.
-
-## Data management
-
-### Can I delete data from LUIS?
-
-* You can always delete example utterances used for training LUIS. If you delete an example utterance from your LUIS app, it is removed from the LUIS web service and is unavailable for export.
-* You can delete utterances from the list of user utterances that LUIS suggests in the **Review endpoint utterances** page. Deleting utterances from this list prevents them from being suggested, but doesn't delete them from logs.
-* If you delete an account, all apps are deleted, along with their example utterances and logs. The data is retained on the servers for 60 days before it is deleted permanently.
-
-### How does Microsoft manage data I send to LUIS?
-
-The [Trust Center](https://www.microsoft.com/trustcenter) explains our commitments and your options for data management and access in Azure Services.
-
-## Language and translation support
-
-### I have an app in one language and want to create a parallel app in another language. What is the easiest way to do so?
-1. Export your app.
-2. Translate the labeled utterances in the JSON file of the exported app to the target language.
-3. You might need to change the names of the intents and entities or leave them as they are.
-4. Finally, import the app to have a LUIS app in the target language.
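The steps above can be sketched in Python. This assumes the standard export format with a top-level `utterances` array of `text`/`intent` objects; `translate` is any text-translation callable you supply (for example, a call to a translation service), and `rename_intents` optionally maps old intent names to new ones.

```python
import json

def translate_app_file(path, translate, rename_intents=None):
    """Sketch of steps 2-3: translate labeled utterances in an exported LUIS app file."""
    with open(path, encoding="utf-8") as f:
        app = json.load(f)
    for u in app.get("utterances", []):
        u["text"] = translate(u["text"])
        if rename_intents:
            u["intent"] = rename_intents.get(u["intent"], u["intent"])
    return app  # serialize this and import it as the new app (step 4)
```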
-
-## App notification
-
-### Why did I get an email saying I'm almost out of quota?
-Your authoring/starter key is only allowed 1000 endpoint queries a month. Create a LUIS endpoint key (free or paid) and use that key when making endpoint queries. If you are making endpoint queries from a bot or another client application, you need to change the LUIS endpoint key there.
-
-## Bots
-
-### My LUIS bot isn't working. What do I do?
-
-The first step is to isolate whether the issue is related to LUIS or occurs outside the LUIS middleware.
-
-#### Resolve issue in LUIS
-Pass the same utterance to LUIS from the [LUIS endpoint](luis-get-started-create-app.md#query-the-v3-api-prediction-endpoint). If you receive an error, resolve the issue in LUIS until the error is no longer returned. Common errors include:
-
-* `Out of call volume quota. Quota will be replenished in <time>.` - This issue indicates you either need to change from an authoring key to an [endpoint key](luis-how-to-azure-subscription.md) or you need to change [service tiers](luis-how-to-azure-subscription.md#change-the-pricing-tier).
-
-#### Resolve issue in Azure Bot Service
-
-If you are using the Azure Bot Service and the issue is that the **Test in Web Chat** returns `Sorry, my bot code is having an issue`, check your logs:
-
-1. In the Azure portal, for your bot, from the **Bot management** section, select **Build**.
-1. Open the online code editor.
-1. In the top, blue navigation bar, select the bot name (the second item to the right).
-1. In the resulting drop-down list, select **Open Kudu Console**.
-1. Select **LogFiles**, then select **Application**. Review all log files. If you don't see the error in the application folder, review all log files under **LogFiles**.
-1. Remember to rebuild your project if you are using a compiled language such as C#.
-
-> [!Tip]
-> The console can also install packages.
-
-#### Resolve issue while debugging on a local machine with the Bot Framework
-
-To learn more about local debugging of a bot, see [Debug a bot](/azure/bot-service/bot-service-debug-bot).
-
-## Integrating LUIS
-
-### Where is my LUIS app created during the Azure web app bot subscription process?
-If you select a LUIS template, and select the **Select** button in the template pane, the left-side pane changes to include the template type, and asks in what region to create the LUIS template. The web app bot process doesn't create a LUIS subscription though.
-
-![LUIS template web app bot region](./media/luis-faq/web-app-bot-location.png)
-
-### What LUIS regions support Bot Framework speech priming?
-[Speech priming](/bot-framework/bot-service-manage-speech-priming) is only supported for LUIS apps in the central (US) instance.
-
-## API Programming Strategies
-
-### How do I programmatically get the LUIS region of a resource?
-
-Use the LUIS sample to [find the region](https://github.com/Azure-Samples/cognitive-services-language-understanding/tree/master/documentation-samples/find-region) programmatically using C# or Node.js.
-
-## LUIS service
-
-### Is Language Understanding (LUIS) available on-premises or in private cloud?
-
-Yes, you can use the LUIS [container](luis-container-howto.md) for these scenarios if you have the necessary connectivity to meter usage.
-
-## Migrating to the next version
-
-### How do I migrate to preview V3 API?
-
-See [API v2 to v3 Migration guide for LUIS apps](luis-migration-api-v3.md).
-
-## Build 2019 Conference announcements
-
-The following features were released at the Build 2019 Conference:
-
-* [Preview of V3 API migration guide](luis-migration-api-v3.md)
-* [Improved analytics dashboard](luis-how-to-use-dashboard.md)
-* [Improved prebuilt domains](luis-reference-prebuilt-domains.md)
-* [Dynamic list entities](schema-change-prediction-runtime.md#dynamic-lists-passed-in-at-prediction-time)
-* [External entities](luis-migration-api-v3.md#external-entities-passed-in-at-prediction-time)
-
-Videos:
-
-* [How to use Azure Conversational AI to scale your business for the next generation](https://www.youtube.com/watch?v=_k97jd-csuk&feature=youtu.be)
-
-## Next steps
-
-To learn more about LUIS, see the following resources:
-* [Stack Overflow questions tagged with LUIS](https://stackoverflow.com/questions/tagged/luis)
-* [Microsoft Q&A question page for MSDN Language Understanding Intelligent Services (LUIS)](/answers/topics/azure-language-understanding.html)
cognitive-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/QnAMaker/Overview/overview.md
QnA Maker is a cloud-based Natural Language Processing (NLP) service that allows
QnA Maker is commonly used to build conversational client applications, which include social media applications, chat bots, and speech-enabled desktop applications.
-QnA Maker doesn't store customer data. All customer data (question answers and chatlogs) is stored in the region the customer deploys the dependent service instances in. For more details on dependent services see [here](../concepts/plan.md?tabs=v1).
+QnA Maker doesn't store customer data. All customer data (question answers and chat logs) is stored in the region the customer deploys the dependent service instances in. For more details on dependent services see [here](../concepts/plan.md?tabs=v1).
+
+This documentation contains the following article types:
+
+* The [quickstarts](../quickstarts/create-publish-knowledge-base.md) are step-by-step instructions that let you make calls to the service and get results in a short period of time.
+* The [how-to guides](../how-to/set-up-qnamaker-service-azure.md) contain instructions for using the service in more specific or customized ways.
+* The [conceptual articles](../concepts/plan.md) provide in-depth explanations of the service's functionality and features.
+* [**Tutorials**](../tutorials/create-faq-bot-with-azure-bot-service.md) are longer guides that show you how to use the service as a component in broader business solutions.
## When to use QnA Maker
cognitive-services Speech Container Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/speech-container-faq.md
- Title: Speech service containers frequently asked questions (FAQ)-
-description: Install and run speech containers. speech-to-text transcribes audio streams to text in real time that your applications, tools, or devices can consume or display. Text-to-speech converts input text into human-like synthesized speech.
------ Previously updated : 03/11/2021----
-# Speech service containers frequently asked questions (FAQ)
-
-When using the Speech service with containers, rely on this collection of frequently asked questions before escalating to support. This article captures questions of varying degree, from general to technical. To expand an answer, click on the question.
-
-## General questions
-
-<details>
-<summary>
-<b>How do Speech containers work and how do I set them up?</b>
-</summary>
-
-**Answer:** When setting up a production cluster, there are several things to consider. First, running multiple containers for a single language on the same machine should not be a large issue. If you are experiencing problems, it may be a hardware-related issue, so first look at resources, that is, CPU and memory specifications.
-
-Consider for a moment the `ja-JP` container and the latest model. The acoustic model is the most CPU-demanding piece, while the language model demands the most memory. In our benchmarks, it takes about 0.6 CPU cores to process a single speech-to-text request when audio flows in at real time (like from a microphone). If you are feeding audio faster than real time (like from a file), that usage can double (1.2 cores). Meanwhile, the memory listed below is operating memory for decoding speech. It does *not* take into account the full size of the language model, which resides in the file cache. For `ja-JP` that's an additional 2 GB; for `en-US`, it may be more (6-7 GB).
-
-If you have a machine where memory is scarce, and you are trying to deploy multiple languages on it, it is possible that file cache is full, and the OS is forced to page models in and out. For a running transcription, that could be disastrous, and may lead to slowdowns and other performance implications.
-
-Furthermore, we pre-package executables for machines with the [advanced vector extension (AVX2)](speech-container-howto.md#advanced-vector-extension-support) instruction set. A machine with the AVX512 instruction set will require code generation for that target, and starting 10 containers for 10 languages may temporarily exhaust CPU. A message like this one will appear in the docker logs:
-
-```console
-2020-01-16 16:46:54.981118943
-[W:onnxruntime:Default, tvm_utils.cc:276 LoadTVMPackedFuncFromCache]
-Cannot find Scan4_llvm__mcpu_skylake_avx512 in cache, using JIT...
-```
-
-You can set the number of decoders you want inside a *single* container using the `DECODER_MAX_COUNT` variable. So, basically, we start with your SKU (CPU/memory), and from there we can suggest how to get the best out of it. A great starting point is the recommended host machine resource specifications.
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Could you help with capacity planning and cost estimation of on-prem Speech-to-text containers?</b>
-</summary>
-
-**Answer:** For container capacity in batch processing mode, each decoder could handle 2-3x in real time, with two CPU cores, for a single recognition. We do not recommend keeping more than two concurrent recognitions per container instance, but recommend running more instances of containers for reliability/availability reasons, behind a load balancer.
-
-That said, each container instance could run more decoders. For example, you may be able to set up 7 decoders per container instance on an eight-core machine (at more than 2x each), yielding 15x throughput. There is a parameter, `DECODER_MAX_COUNT`, to be aware of. In the extreme case, reliability and latency issues arise while throughput increases significantly. For a microphone, recognition runs at 1x real time. Overall usage should be about one core for a single recognition.
-
-For a scenario of processing 1K hours/day in batch processing mode, in an extreme case 3 VMs could handle it within 24 hours, but this is not guaranteed. To handle spike days, failover, and updates, and to provide minimum backup/BCP, we recommend 4-5 machines instead of 3 per cluster, with 2+ clusters.
-
-For hardware, we use standard Azure VM `DS13_v2` as a reference (each core must be 2.6 GHz or better, with AVX2 instruction set enabled).
-
-| Instance | vCPU(s) | RAM | Temp storage | Pay-as-you-go with AHB | 1-year reserve with AHB (% Savings) | 3-year reserved with AHB (% Savings) |
-|--||--|--||-|--|
-| `DS13 v2` | 8 | 56 GiB | 112 GiB | $0.598/hour | $0.3528/hour (~41%) | $0.2333/hour (~61%) |
-
-Based on the design reference (two clusters of 5 VMs to handle 1 K hours/day audio batch processing), 1-year hardware cost will be:
-
-> 2 (clusters) * 5 (VMs per cluster) * $0.3528/hour * 365 (days) * 24 (hours) = $31K / year
-
-When mapping to a physical machine, a general estimate is 1 vCPU = 1 physical CPU core. In reality, 1 vCPU is more powerful than a single core.
-
-For on-prem, all of these additional factors come into play:
-- The type of physical CPU and how many cores it has
-- How many CPUs run together on the same box/machine
-- How VMs are set up
-- How hyper-threading / multi-threading is used
-- How memory is shared
-- The OS, etc.
-
-Normally an on-premises environment is not as well tuned as the Azure environment. Considering other overhead, a safe estimate is 10 physical CPU cores = 8 Azure vCPUs, though popular CPUs only have eight cores. With on-premises deployment, the cost will be higher than using Azure VMs. Also, consider the depreciation rate.
-
-Service cost is the same as the online service: $1/hour for speech-to-text. The Speech service cost is:
-
-> $1 * 1000 * 365 = $365K
-
-Maintenance cost paid to Microsoft depends on the service level and the content of the service. It varies from $29.99/month for the basic level to hundreds of thousands of dollars if onsite service is involved. A rough number is $300/hour for service/maintenance. People cost is not included. Other infrastructure costs (such as storage, networks, and load balancers) are not included.
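The arithmetic quoted above can be checked directly (figures taken from the VM table and the cost text):

```python
# Sanity-check of the cost estimates: reserved-rate hardware cost and service cost.
vm_hourly = 0.3528                         # DS13 v2, 1-year reserved with AHB, $/hour
hardware = 2 * 5 * vm_hourly * 365 * 24    # 2 clusters x 5 VMs, running all year
service = 1 * 1000 * 365                   # $1/hour speech-to-text x 1,000 audio hours/day

print(f"hardware: ${hardware:,.0f}/year")  # roughly $31K
print(f"service:  ${service:,}/year")      # $365K
```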
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Why is punctuation missing from the transcription?</b>
-</summary>
-
-**Answer:** The `speech_recognition_language=<YOUR_LANGUAGE>` parameter should be explicitly configured in the request if you are using the Carbon client.
-
-For example:
-
-```python
-if not recognize_once(
- speechsdk.SpeechRecognizer(
- speech_config=speechsdk.SpeechConfig(
- endpoint=template.format("interactive"),
- speech_recognition_language="ja-JP"),
- audio_config=audio_config)):
-
- print("Failed interactive endpoint")
- exit(1)
-```
-Here is the output:
-
-```cmd
-RECOGNIZED: SpeechRecognitionResult(
- result_id=2111117c8700404a84f521b7b805c4e7,
- text="まだ早いまだ早いは猫である名前はまだないどこで生まれたかとんと見当を検討をなつかぬ。
- 何でも薄暗いじめじめした所でながら泣いていた事だけは記憶している。
- まだは今ここで初めて人間と言うものを見た。
- しかも後で聞くと、それは書生という人間中で一番同額同額。",
- reason=ResultReason.RecognizedSpeech)
-```
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Can I use a custom acoustic model and language model with Speech container?</b>
-</summary>
-
-We are currently only able to pass one model ID, either custom language model or custom acoustic model.
-
-**Answer:** The decision was made to *not* support both acoustic and language models concurrently. This will remain in effect until a unified identifier is created to reduce API breaks. So, unfortunately, this is not supported right now.
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Could you explain these errors from the custom speech-to-text container?</b>
-</summary>
-
-**Error 1:**
-
-```cmd
-Failed to fetch manifest: Status: 400 Bad Request Body:
-{
- "code": "InvalidModel",
- "message": "The specified model is not supported for endpoint manifests."
-}
-```
-
-**Answer 1:** Training with the latest custom model is currently not supported. If you train with an older version, it should be possible to use. We are still working on supporting the latest versions.
-
-Essentially, the custom containers do not support Halide or ONNX-based acoustic models (which are the default in the custom training portal). This is because custom models are not encrypted and we don't want to expose ONNX models; language models, however, are fine. The customer needs to explicitly select an older, non-ONNX model for custom training. Accuracy will not be affected. The model size may be larger (by 100 MB).
-
-> Support model > 20190220 (v4.5 Unified)
-
-**Error 2:**
-
-```cmd
-HTTPAPI result code = HTTPAPI_OK.
-HTTP status code = 400.
-Reason: Synthesis failed.
-StatusCode: InvalidArgument,
-Details: Voice does not match.
-```
-
-**Answer 2:** You need to provide the correct voice name in the request, which is case-sensitive. Refer to the full service name mapping.
-
-**Error 3:**
-
-```json
-{
- "code": "InvalidProductId",
- "message": "The subscription SKU \"CognitiveServices.S0\" is not supported in this service instance."
-}
-```
-
-**Answer 3:** You need to create a Speech resource, not a Cognitive Services resource.
--
-<br>
-</details>
-
-<details>
-<summary>
-<b>What API protocols are supported, REST or WS?</b>
-</summary>
-
-**Answer:** For speech-to-text and custom speech-to-text containers, we currently support only the WebSocket-based protocol. The SDK supports calling only over WebSocket, not REST. There's a plan to add REST support, but no ETA at the moment. Always refer to the official documentation; see [query prediction endpoints](speech-container-howto.md#query-the-containers-prediction-endpoint).
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Is CentOS supported for Speech containers?</b>
-</summary>
-
-**Answer:** CentOS 7 is not yet supported by the Python SDK; Ubuntu 19.04 is also not supported.
-
-The Python Speech SDK package is available for these operating systems:
-- **Windows** - x64 and x86
-- **Mac** - macOS X version 10.12 or later
-- **Linux** - Ubuntu 16.04, Ubuntu 18.04, Debian 9 on x64
-
-For more information on environment setup, see [Python platform setup](quickstarts/setup-platform.md?pivots=programming-language-python). For now, Ubuntu 18.04 is the recommended version.
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Why am I getting errors when attempting to call LUIS prediction endpoints?</b>
-</summary>
-
-I am using the LUIS container in an IoT Edge deployment and am attempting to call the LUIS prediction endpoint from another container. The LUIS container is listening on port 5001, and the URL I'm using is this:
-
-```csharp
-var luisEndpoint =
- $"ws://192.168.1.91:5001/luis/prediction/v3.0/apps/{luisAppId}/slots/production/predict";
-var config = SpeechConfig.FromEndpoint(new Uri(luisEndpoint));
-```
-
-The error I'm getting is:
-
-```cmd
-WebSocket Upgrade failed with HTTP status code: 404 SessionId: 3cfe2509ef4e49919e594abf639ccfeb
-```
-
-I see the request in the LUIS container logs and the message says:
-
-```cmd
-The request path /luis//predict" does not match a supported file type.
-```
-
-What does this mean? What am I missing? I was following the example for the Speech SDK, from [here](https://github.com/Azure-Samples/cognitive-services-speech-sdk). The scenario is that we are detecting the audio directly from the PC microphone and trying to determine the intent, based on the LUIS app we trained. The example I linked to does exactly that. And it works well with the LUIS cloud-based service. Using the Speech SDK seemed to save us from having to make a separate explicit call to the speech-to-text API and then a second call to LUIS.
-
-So, all I am attempting to do is switch from the scenario of using LUIS in the cloud to using the LUIS container. I can't imagine if the Speech SDK works for one, it won't work for the other.
-
-**Answer:**
-The Speech SDK should not be used against a LUIS container. To use the LUIS container, use the LUIS SDK or LUIS REST API. The Speech SDK should be used against a Speech container.
-
-A cloud is different from a container. A cloud can be composed of multiple aggregated containers (sometimes called microservices). So there is a LUIS container and there is a Speech container - two separate containers. The Speech container only does speech. The LUIS container only does LUIS. The LUIS container has no concept of accepting audio (it would not make sense for the LUIS container to accept streaming audio - LUIS is a text-based service).
-
-In the cloud, because both containers are known to be deployed, and it is bad performance for a remote client to go to the cloud, do speech, come back, then go to the cloud again and do LUIS, we provide a feature that allows the client to go to Speech, stay in the cloud, go to LUIS, then come back to the client. Thus even in this scenario the Speech SDK goes to the Speech cloud container with audio, and the Speech cloud container talks to the LUIS cloud container with text.
-
-With on-prem, we have no certainty that our customer has deployed both containers, and we don't presume to orchestrate between containers on our customers' premises. If both containers are deployed on-prem, given they are more local to the client, it is not a burden to go to the speech recognition container first, come back to the client, and have the customer then take that text and go to LUIS.
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Why are we getting errors with macOS, Speech container and the Python SDK?</b>
-</summary>
-
-When we send a *.wav* file to be transcribed, the result comes back with:
-
-```cmd
-recognition is running....
-Speech Recognition canceled: CancellationReason.Error
-Error details: Timeout: no recognition result received.
-```
-
-When creating a websocket connection from the browser as a test, we get:
-
-```cmd
-wb = new WebSocket("ws://localhost:5000/speech/recognition/dictation/cognitiveservices/v1")
-WebSocket
-{
-    url: "ws://localhost:5000/speech/recognition/dictation/cognitiveservices/v1",
-    readyState: 0,
-    bufferedAmount: 0,
-    onopen: null,
-    onerror: null,
-    ...
-}
-```
-
-We know the websocket is set up correctly.
-
-**Answer:**
-If that is the case, then see [this GitHub issue](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues/310). We have a work-around, [proposed here](https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues/310#issuecomment-527542722).
-
-Carbon fixed this at version 1.8.
--
-<br>
-</details>
-
-<details>
-<summary>
-<b>What are the differences in the Speech container endpoints?</b>
-</summary>
-
-Could you help fill in the following test metrics, including what functions to test and how to test the SDK and REST APIs? In particular, what are the differences between "interactive" and "conversation"? I did not see this in the existing docs and samples.
-
-| Endpoint | Functional test | SDK | REST API |
-|--|--|--|--|
-| `/speech/synthesize/cognitiveservices/v1` | Synthesize Text (text-to-speech) | | Yes |
-| `/speech/recognition/dictation/cognitiveservices/v1` | Cognitive Services on-prem dictation v1 websocket endpoint | Yes | No |
-| `/speech/recognition/interactive/cognitiveservices/v1` | The Cognitive Services on-prem interactive v1 websocket endpoint | | |
-| `/speech/recognition/conversation/cognitiveservices/v1` | The cognitive services on-prem conversation v1 websocket endpoint | | |
-
-**Answer:**
-This is a fusion of:
-- People trying the dictation endpoint for containers (I'm not sure how they got that URL).
-- The 1<sup>st</sup> party endpoint being the one in a container.
-- The 1<sup>st</sup> party endpoint returning `speech.fragment` messages instead of the `speech.hypothesis` messages the 3<sup>rd</sup> party endpoints return for the dictation endpoint.
-- The Carbon quickstarts all using `RecognizeOnce` (interactive mode).
-- Carbon having an assert that `speech.fragment` messages aren't returned in interactive mode.
-- Carbon having the asserts fire in release builds (killing the process).
-
-The workaround is either to switch to using continuous recognition in your code, or (quicker) to connect to either the interactive or continuous endpoints in the container.
-For your code, set the endpoint to `host:port/speech/recognition/interactive/cognitiveservices/v1`.
-
-For the various modes, see the Speech modes section below.
-
-## Speech modes - Interactive, conversation, dictation
--
-The proper fix is coming with SDK 1.8, which has on-prem support (it will pick the right endpoint, so we will be no worse than the online service). In the meantime, there is a sample for continuous recognition:
-
-https://github.com/Azure-Samples/cognitive-services-speech-sdk/blob/6805d96bf69d9e95c9137fe129bc5d81e35f6309/samples/python/console/speech_sample.py#L196
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Which mode should I use for various audio files?</b>
-</summary>
-
-**Answer:** Here's a [quickstart using Python](./get-started-speech-to-text.md?pivots=programming-language-python). You can find the other languages linked on the docs site.
-
-To clarify interactive, conversation, and dictation: these are an advanced way of specifying the particular way in which our service will handle the speech request. Unfortunately, for the on-prem containers we have to specify the full URI (since it includes the local machine), so this information leaked from the abstraction. We are working with the SDK team to make this more usable in the future.
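As a rough sketch of how the full on-prem URI varies by mode, here is a hypothetical helper (the host, port, and function name are illustrative assumptions; only the three mode path segments come from the endpoints listed in this FAQ):

```python
# Minimal sketch: build the on-prem Speech container websocket URI for a
# given recognition mode. Host and port values are illustrative assumptions.
MODES = ("interactive", "conversation", "dictation")

def container_endpoint(host: str, port: int, mode: str) -> str:
    if mode not in MODES:
        raise ValueError(f"mode must be one of {MODES}, got {mode!r}")
    return f"ws://{host}:{port}/speech/recognition/{mode}/cognitiveservices/v1"

print(container_endpoint("localhost", 5000, "interactive"))
```

Pick `interactive` for short commands, `conversation` or `dictation` for longer audio, per the modes described above.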
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>How can we benchmark a rough measure of transactions/second/core?</b>
-</summary>
-
-**Answer:** Here are some of the rough numbers to expect from existing model (will change for the better in the one we will ship in GA):
-- For files, the throttling will be in the Speech SDK, at 2x. The first five seconds of audio are not throttled. The decoder is capable of doing about 3x real time. For this, the overall CPU usage will be close to 2 cores for a single recognition.
-- For the microphone, it will be at 1x real time. The overall usage should be at about 1 core for a single recognition.
-
-This can all be verified from the docker logs. We actually dump the line with session and phrase/utterance statistics, and that includes the RTF numbers.
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>How do I make multiple containers run on the same host?</b>
-</summary>
-
-The doc says to expose a different port, which I do, but the LUIS container is still listening on port 5000?
-
-**Answer:** Try `-p <outside_unique_port>:5000`. For example, `-p 5001:5000`.
--
-<br>
-</details>
-
-## Technical questions
-
-<details>
-<summary>
-<b>How can I get non-batch APIs to handle audio &gt;15 seconds long?</b>
-</summary>
-
-**Answer:** `RecognizeOnce()` in interactive mode only processes up to 15 seconds of audio, as the mode is intended for Speech Commanding where utterances are expected to be short. If you use `StartContinuousRecognition()` for dictation or conversation, there is no 15 second limit.
--
-<br>
-</details>
-
-<details>
-<summary>
-<b>What are the recommended resources (CPU and RAM) for 50 concurrent requests?</b>
-</summary>
-
-How many concurrent requests will 4 cores and 4 GB of RAM handle? If we have to serve, for example, 50 concurrent requests, how many cores and how much RAM are recommended?
-
-**Answer:**
-At real time, about 8 concurrent requests with our latest `en-US` model, so we recommend using more docker containers beyond 6 concurrent requests. Beyond 16 cores it gets trickier, and the machine becomes non-uniform memory access (NUMA) node sensitive. The following table describes the minimum and recommended allocation of resources for each Speech container.
-
-# [Speech-to-text](#tab/stt)
-
-| Container | Minimum | Recommended |
-|--|--|--|
-| Speech-to-text | 2 core, 2-GB memory | 4 core, 4-GB memory |
-
-# [Custom Speech-to-text](#tab/cstt)
-
-| Container | Minimum | Recommended |
-|--|--|--|
-| Custom Speech-to-text | 2 core, 2-GB memory | 4 core, 4-GB memory |
-
-# [Text-to-speech](#tab/tts)
-
-| Container | Minimum | Recommended |
-|--|--|--|
-| Text-to-speech | 1 core, 2-GB memory | 2 core, 3-GB memory |
-
-# [Custom Text-to-speech](#tab/ctts)
-
-| Container | Minimum | Recommended |
-|--|--|--|
-| Custom Text-to-speech | 1 core, 2-GB memory | 2 core, 3-GB memory |
-
-***
-- Each core must be at least 2.6 GHz or faster.
-- For files, the throttling will be in the Speech SDK, at 2x (the first 5 seconds of audio are not throttled).
-- The decoder is capable of doing about 2-3x real time. For this, the overall CPU usage will be close to two cores for a single recognition. That's why we do not recommend keeping more than two active connections per container instance. The extreme side would be to put about 10 decoders at 2x real time on an eight-core machine such as `DS13_V2`. For container version 1.3 and later, there's a parameter you could try setting: `DECODER_MAX_COUNT=20`.
-- For the microphone, it will be at 1x real time. The overall usage should be at about one core for a single recognition.
-
-Consider the total number of hours of audio you have. If the number is large, to improve reliability/availability, we suggest running more instances of containers, either on a single box or on multiple boxes, behind a load balancer. Orchestration could be done using Kubernetes (K8S) and Helm, or with Docker compose.
-
-As an example, to handle 1,000 hours of audio in 24 hours, we have tried setting up 3-4 VMs, with 10 instances/decoders per VM.
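The sizing arithmetic above can be sketched as a small helper (the 2x real-time factor and 10 decoders per VM come from this answer; the helper itself is an illustration, not an official sizing tool):

```python
import math

def vms_needed(audio_hours: float, window_hours: float,
               rtf: float = 2.0, decoders_per_vm: int = 10) -> int:
    """Estimate VMs needed to transcribe `audio_hours` of audio within
    `window_hours`, given each decoder runs at `rtf` times real time."""
    hours_per_decoder = window_hours * rtf  # audio-hours one decoder clears
    decoders = math.ceil(audio_hours / hours_per_decoder)
    return math.ceil(decoders / decoders_per_vm)

# 1,000 hours of audio in a 24-hour window, 10 decoders per VM at 2x:
print(vms_needed(1000, 24))  # -> 3, in line with the 3-4 VMs above
```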
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Does the Speech container support punctuation?</b>
-</summary>
-
-**Answer:** We have capitalization (ITN) available in the on-prem container. Punctuation is language-dependent, and not supported for some languages, including Chinese and Japanese.
-
-We *do* have implicit and basic punctuation support for the existing containers, but it is `off` by default. What that means is that you can get the `.` character in your example, but not the `。` character. To enable this implicit logic, here's an example of how to do so in Python using our Speech SDK (it would be similar in other languages):
-
-```python
-speech_config.set_service_property(
- name='punctuation',
- value='implicit',
- channel=speechsdk.ServicePropertyChannel.UriQueryParameter
-)
-```
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Why am I getting 404 errors when attempting to POST data to speech-to-text container?</b>
-</summary>
-
-Here is an example HTTP POST:
-
-```http
-POST /speech/recognition/conversation/cognitiveservices/v1?language=en-US&format=detailed HTTP/1.1
-Accept: application/json;text/xml
-Content-Type: audio/wav; codecs=audio/pcm; samplerate=16000
-Transfer-Encoding: chunked
-User-Agent: PostmanRuntime/7.18.0
-Cache-Control: no-cache
-Postman-Token: xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
-Host: 10.0.75.2:5000
-Accept-Encoding: gzip, deflate
-Content-Length: 360044
-Connection: keep-alive
-HTTP/1.1 404 Not Found
-Date: Tue, 22 Oct 2019 15:42:56 GMT
-Server: Kestrel
-Content-Length: 0
-```
-
-**Answer:** We do not support the REST API in either speech-to-text container; we only support WebSockets through the Speech SDK. Always refer to the official documentation, see [query prediction endpoints](speech-container-howto.md#query-the-containers-prediction-endpoint).
-
-<br>
-</details>
--
-<details>
-<summary>
-<b> Why is the container running as a non-root user? What issues might occur because of this?</b>
-</summary>
-
-**Answer:** Note that the default user inside the container is a non-root user. This provides protection against processes escaping the container and obtaining escalated permissions on the host node. By default, some platforms like the OpenShift Container Platform already do this by running containers using an arbitrarily assigned user ID. For these platforms, the non-root user needs permission to write to any externally mapped volume that requires writes, for example a logging folder or a custom model download folder.
-<br>
-</details>
-
-<details>
-<summary>
-<b>When using the speech-to-text service, why am I getting this error?</b>
-</summary>
-
-```cmd
-Error in STT call for file 9136835610040002161_413008000252496:
-{
- "reason": "ResultReason.Canceled",
- "error_details": "Due to service inactivity the client buffer size exceeded. Resetting the buffer. SessionId: xxxxx..."
-}
-```
-
-**Answer:** This typically happens when you feed the audio faster than the speech recognition container can take it. Client buffers fill up, and the cancellation is triggered. You need to control the concurrency and the real-time factor (RTF) at which you send the audio.
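One way to avoid outrunning the container is to pace audio chunks at (or near) real time before sending them. A minimal sketch, assuming 16 kHz, 16-bit mono PCM (the constants and function names are illustrative assumptions, not part of the Speech SDK):

```python
import time

# 16 kHz * 2 bytes per sample * 1 channel => 32,000 bytes of audio/second.
BYTES_PER_SECOND = 16_000 * 2 * 1

def chunk_duration_seconds(chunk: bytes) -> float:
    """How much audio time a raw PCM chunk represents."""
    return len(chunk) / BYTES_PER_SECOND

def paced_chunks(chunks, rtf: float = 1.0):
    """Yield chunks no faster than `rtf` times real time."""
    for chunk in chunks:
        yield chunk
        time.sleep(chunk_duration_seconds(chunk) / rtf)
```

Feeding the recognizer from `paced_chunks(...)` instead of a raw file read keeps the client buffers from filling faster than the container drains them.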
-
-<br>
-</details>
-
-<details>
-<summary>
-<b>Could you explain these text-to-speech container errors from the C++ examples?</b>
-</summary>
-
-**Answer:** If the container version is older than 1.3, then this code should be used:
-
-```cpp
-const auto endpoint = "http://localhost:5000/speech/synthesize/cognitiveservices/v1";
-auto config = SpeechConfig::FromEndpoint(endpoint);
-auto synthesizer = SpeechSynthesizer::FromConfig(config);
-auto result = synthesizer->SpeakTextAsync("{{{text1}}}").get();
-```
-
-Older containers don't have the required endpoint for Carbon to work with the `FromHost` API. If the container is version 1.3, then this code should be used:
-
-```cpp
-const auto host = "http://localhost:5000";
-auto config = SpeechConfig::FromHost(host);
-config->SetSpeechSynthesisVoiceName(
- "Microsoft Server Speech Text to Speech Voice (en-US, AriaRUS)");
-auto synthesizer = SpeechSynthesizer::FromConfig(config);
-auto result = synthesizer->SpeakTextAsync("{{{text1}}}").get();
-```
-
-Below is an example of using the `FromEndpoint` API:
-
-```cpp
-const auto endpoint = "http://localhost:5000/cognitiveservices/v1";
-auto config = SpeechConfig::FromEndpoint(endpoint);
-config->SetSpeechSynthesisVoiceName(
- "Microsoft Server Speech Text to Speech Voice (en-US, AriaRUS)");
-auto synthesizer = SpeechSynthesizer::FromConfig(config);
-auto result = synthesizer->SpeakTextAsync("{{{text2}}}").get();
-```
-
- The `SetSpeechSynthesisVoiceName` function is called because the containers with an updated text-to-speech engine require the voice name.
-
-<br>
-</details>
-
-## Next steps
-
-> [!div class="nextstepaction"]
-> [Cognitive Services containers](speech-container-howto.md)
cognitive-services Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/metrics-advisor/faq.md
- Title: Metrics Advisor frequently asked questions
-description: Frequently asked questions about the Metrics Advisor service.
- Previously updated : 11/05/2020
-# Metrics Advisor frequently asked questions
-
-### What is the cost of my instance?
-
-There currently isn't a cost to use your instance during the preview.
-
-### Why can't I create the resource? The "Pricing tier" is unavailable and it says "You have already created 1 S0 for this subscription"?
--
-During public preview, only one instance of Metrics Advisor can be created per region under a subscription.
-
-If you already have an instance created in the same region using the same subscription, you can try a different region or a different subscription to create a new instance. You can also delete an existing instance to create a new one.
-
-If you have already deleted the existing instance but still see the error, please wait for about 20 minutes after resource deletion before you create a new instance.
-
-## Basic concepts
-
-### What is multi-dimensional time-series data?
-
-See the [Multi-dimensional metric](glossary.md#multi-dimensional-metric) definition in the glossary.
-
-### How much data is needed for Metrics Advisor to start anomaly detection?
-
-At minimum, one data point can trigger anomaly detection. However, this doesn't bring the best accuracy. The service will assume a window of previous data points using the value you've specified as the "fill-gap" rule during data feed creation.
-
-We recommend having some data before the timestamp that you want detection on.
-Based on the granularity of your data, the recommended data amount varies as below.
-
-| Granularity | Recommended data amount for detection |
-| -- | - |
-| Less than 5 minutes | 4 days of data |
-| 5 minutes to 1 day | 28 days of data |
-| More than 1 day, to 31 days | 4 years of data |
-| Greater than 31 days | 48 years of data |
-
-### Why doesn't Metrics Advisor detect anomalies from historical data?
-
-Metrics Advisor is designed for detecting live streaming data. There's a limit on the maximum length of historical data that the service will look back on and run anomaly detection over. Only data points after a certain earliest timestamp will have anomaly detection results. That earliest timestamp depends on the granularity of your data.
-
-Based on the granularity of your data, the lengths of the historical data that will have anomaly detection results are as below.
-
-| Granularity | Maximum length of historical data for anomaly detection |
-| -- | - |
-| Less than 5 minutes | Onboard time - 13 hours |
-| 5 minutes to less than 1 hour | Onboard time - 4 days |
-| 1 hour to less than 1 day | Onboard time - 14 days |
-| 1 day | Onboard time - 28 days |
-| Greater than 1 day, less than 31 days | Onboard time - 2 years |
-| Greater than 31 days | Onboard time - 24 years |
-
-### More concepts and technical terms
-
-Also see the [Glossary](glossary.md) for more information.
-
-### How do I write a valid query for ingesting my data?
-
-For Metrics Advisor to ingest your data, you will need to create a query that returns the dimensions of your data at a single timestamp. Metrics Advisor will run this query multiple times to get the data from each timestamp.
-
-Note that the query should return at most one record for each dimension combination, at a given timestamp. All records returned must have the same timestamp. There should be no duplicate records returned by the query.
-
-For example, suppose you create the query below, for a daily metric:
-
-`select timestamp, city, category, revenue from sampledata where Timestamp >= @StartTime and Timestamp < dateadd(DAY, 1, @StartTime)`
-
-Be sure to use the correct granularity for your time series. For an hourly metric, you would use:
-
-`select timestamp, city, category, revenue from sampledata where Timestamp >= @StartTime and Timestamp < dateadd(hour, 1, @StartTime)`
-
-Note that these queries only return data at a single timestamp, and contain all of the dimension combinations to be ingested by Metrics Advisor.
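The two queries differ only in the `dateadd` unit. As a hypothetical illustration of keeping the unit in sync with the metric granularity (the table and column names follow the `sampledata` examples above; the helper is not part of Metrics Advisor):

```python
# Build the per-timestamp ingestion query for a given granularity.
# Assumes a SQL dialect with dateadd(), as in the examples above.
GRANULARITY_UNITS = {"daily": "DAY", "hourly": "HOUR", "per-minute": "MINUTE"}

def ingestion_query(granularity: str, table: str = "sampledata") -> str:
    unit = GRANULARITY_UNITS[granularity]
    return (
        f"select timestamp, city, category, revenue from {table} "
        f"where Timestamp >= @StartTime "
        f"and Timestamp < dateadd({unit}, 1, @StartTime)"
    )

print(ingestion_query("hourly"))
```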
---
-### How do I detect spikes & dips as anomalies?
-
-If you have hard thresholds predefined, you can manually set a "hard threshold" in [anomaly detection configurations](how-tos/configure-metrics.md#anomaly-detection-methods).
-If there are no predefined thresholds, you can use "smart detection", which is powered by AI. Refer to [tune the detecting configuration](how-tos/configure-metrics.md#tune-the-detecting-configuration) for details.
-
-### How do I detect inconformity with regular (seasonal) patterns as anomalies?
-
-"Smart detection" is able to learn the pattern of your data including seasonal patterns. It then detects those data points that don't conform to the regular patterns as anomalies. Please refer to [tune the detecting configuration](how-tos/configure-metrics.md#tune-the-detecting-configuration) for details.
-
-### How do I detect flat lines as anomalies?
-
-If your data is normally quite unstable and fluctuates a lot, and you want to be alerted when it becomes too stable or even turns into a flat line,
-"Change threshold" can be configured to detect such data points when the change is too small.
-Refer to [anomaly detection configurations](how-tos/configure-metrics.md#anomaly-detection-methods) for details.
-
-### How to set up email settings and enable alerting by email?
-
-1. A user with subscription administrator or resource group administrator privileges needs to navigate to the Metrics Advisor resource that was created in the Azure portal, and select the **Access control (IAM)** tab.
-2. Select **Add role assignments**.
-3. Pick the **Cognitive Services Metrics Advisor Administrator** role, and select your account as in the image below.
-4. Select the **Save** button. You have now been successfully added as an administrator of the Metrics Advisor resource. Note that all of the above actions need to be performed by a subscription administrator or resource group administrator.
---
-5. It might take up to one minute for the permissions to propagate. Then, select your Metrics Advisor workspace, and select the **Email setting** option in the left navigation panel. Fill in the required items, in particular the SMTP-related info.
-6. Select **Save**, then you're all set with the e-mail configuration. You can create new hooks and subscribe to metric anomalies for near real-time alerts.
-
-## Advanced concepts
-
-### How does Metric Advisor build an incident tree for multi-dimensional metrics?
-
-A metric can be split into multiple time series by dimensions. For example, the metric `Response latency` is monitored for all services owned by the team. The `Service` category could be used as a dimension to enrich the metric, so we get `Response latency` split by `Service1`, `Service2`, and so on. Each service could be deployed on different machines in multiple data centers, so the metric could be further split by `Machine` and `Data center`.
-
-|Service| Data center| Machine |
-|-||- |
-| S1 | DC1 | M1 |
-| S1 | DC1 | M2 |
-| S1 | DC2 | M3 |
-| S1 | DC2 | M4 |
-| S2 | DC1 | M1 |
-| S2 | DC1 | M2 |
-| S2 | DC2 | M5 |
-| S2 | DC2 | M6 |
-| ...| | |
-
-Starting from the total `Response latency`, we can drill down into the metric by `Service`, `Data center` and `Machine`. However, maybe it makes more sense for service owners to use the path `Service` -> `Data center` -> `Machine`, or maybe it makes more sense for infrastructure engineers to use the path `Data Center` -> `Machine` -> `Service`. It all depends on the individual business requirements of your users.
-
-In Metric Advisor, users can specify any path they want to drill down or rollup from one node of the hierarchical topology. More precisely, the hierarchical topology is a directed acyclic graph rather than a tree structure. There's a full hierarchical topology that consists of all potential dimension combinations, like this:
--
-In theory, if the dimension `Service` has `Ls` distinct values, dimension `Data center` has `Ldc` distinct values, and dimension `Machine` has `Lm` distinct values, then there could be `(Ls + 1) * (Ldc + 1) * (Lm + 1)` dimension combinations in the hierarchical topology.
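As a quick check of the `(Ls + 1) * (Ldc + 1) * (Lm + 1)` count (a small illustration, not part of the service):

```python
def combination_count(*distinct_values_per_dimension: int) -> int:
    """Number of nodes in the full hierarchical topology: each dimension
    contributes its distinct values plus one aggregated ("all") value."""
    total = 1
    for values in distinct_values_per_dimension:
        total *= values + 1
    return total

# 2 services, 2 data centers, 6 machines, as in the table above:
print(combination_count(2, 2, 6))  # (2+1) * (2+1) * (6+1) = 63
```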
-
-But usually not all dimension combinations are valid, which can significantly reduce the complexity. Currently, if users aggregate the metric themselves, we don't limit the number of dimensions. If you need to use the rollup functionality provided by Metrics Advisor, the number of dimensions shouldn't be more than 6. However, we limit the number of time series expanded by dimensions for a metric to fewer than 10,000.
-
-The **Incident tree** tool in the diagnostics page only shows nodes where an anomaly has been detected, rather than the whole topology. This is to help you focus on the current issue. It also may not show all anomalies within the metric, and instead displays the top anomalies based on contribution. In this way, we can quickly find the impact, scope, and spread path of the abnormal data, which significantly reduces the number of anomalies we need to focus on and helps users understand and locate their key issues.
-
-For example, when an anomaly occurs on `Service = S2 | Data Center = DC2 | Machine = M5`, the deviation of the anomaly impacts the parent node `Service= S2` which also has detected the anomaly, but the anomaly doesn't affect the entire data center at `DC2` and all services on `M5`. The incident tree would be built as in the below screenshot, the top anomaly is captured on `Service = S2`, and root cause could be analyzed in two paths which both lead to `Service = S2 | Data Center = DC2 | Machine = M5`.
-
- :::image type="content" source="media/root-cause-paths.png" alt-text="5 labeled vertices with two distinct paths connected by edges with a common node labeled S2. The top anomaly is captured on Service = S2, and root cause can be analyzed by the two paths which both lead to Service = S2 | Data Center = DC2 | Machine = M5" lightbox="media/root-cause-paths.png":::
-
-## Next Steps
-- [Metrics Advisor overview](overview.md)
-- [Use the web portal](quickstarts/web-portal.md)
cognitive-services Alerts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/metrics-advisor/how-tos/alerts.md
Metrics Advisor supports three different types of hooks: email hook, web hook an
### Email hook

> [!Note]
-> Metrics Advisor resource administrators need to configure the Email settings, and input SMTP related information into Metrics Advisor before anomaly alerts can be sent. The resource group admin or subscription admin needs to assign at least one *Cognitive Services Metrics Advisor Administrator* role in the Access control tab of the Metrics Advisor resource. [Learn more about e-mail settings configuration](../faq.md#how-to-set-up-email-settings-and-enable-alerting-by-email).
+> Metrics Advisor resource administrators need to configure the Email settings, and input SMTP related information into Metrics Advisor before anomaly alerts can be sent. The resource group admin or subscription admin needs to assign at least one *Cognitive Services Metrics Advisor Administrator* role in the Access control tab of the Metrics Advisor resource. [Learn more about e-mail settings configuration](/azure/cognitive-services/metrics-advisor/faq#how-to-set-up-email-settings-and-enable-alerting-by-email-).
To create an email hook, the following parameters are available:
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/metrics-advisor/whats-new.md
If you want to learn about the latest updates to Metrics Advisor client SDKs see
### Updated articles
-* [Update on how Metric Advisor builds an incident tree for multi-dimensional metrics](faq.md#how-does-metric-advisor-build-an-incident-tree-for-multi-dimensional-metrics)
+* [Update on how Metric Advisor builds an incident tree for multi-dimensional metrics](/azure/cognitive-services/metrics-advisor/faq#how-does-metric-advisor-build-an-incident-tree-for-multi-dimensional-metrics)
cognitive-services Model Versioning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/concepts/model-versioning.md
The [Text Analytics for Health](../how-tos/text-analytics-for-health.md) contain
| Endpoint | Container Image Tag | Model version | ||--||
-| `/entities/health` | `3.0.015370001-onprem-amd64` or latest | `2021-03-01` |
+| `/entities/health` | `3.0.016230002-onprem-amd64` or latest | `2021-05-15` |
+| `/entities/health` | `3.0.015370001-onprem-amd64` | `2021-03-01` |
| `/entities/health` | `1.1.013530001-amd64-preview` | `2020-09-03` |
| `/entities/health` | `1.1.013150001-amd64-preview` | `2020-07-24` |
| `/domains/health` | `1.1.012640001-amd64-preview` | `2020-05-08` |
cognitive-services Text Analytics For Health https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-for-health.md
Previously updated : 06/07/2021 Last updated : 06/16/2021
The latest prerelease of the Text Analytics client library enables you to call T
### Preparation
-Text Analytics for health produces a higher-quality result when you give it smaller amounts of text to work on. This is the opposite of some other Text Analytics features, such as key phrase extraction, which performs better on larger blocks of text. To get the best results from these operations, consider restructuring the inputs accordingly.
- You must have JSON documents in this format: ID, text, and language.
-Document size must be under 5,120 characters per document. For the maximum number of documents permitted in a collection, see the [data limits](../concepts/data-limits.md?tabs=version-3) article under Concepts. The collection is submitted in the body of the request.
+Document size must be under 5,120 characters per document. For the maximum number of documents permitted in a collection, see the [data limits](../concepts/data-limits.md?tabs=version-3) article under Concepts. The collection is submitted in the body of the request. If your text exceeds this limit, consider splitting the text into separate requests. For best results, split text between sentences.
### Structure the API request for the hosted asynchronous web API
cognitive-services Text Analytics How To Call Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-call-api.md
If successful, the GET request to the `/analyze` endpoint will return an object
* [Text Analytics overview](../overview.md)
* [Model versions](../concepts/model-versioning.md)
-* [Frequently asked questions (FAQ)](../text-analytics-resource-faq.md)</br>
+* [Frequently asked questions (FAQ)](../text-analytics-resource-faq.yml)</br>
* [Text Analytics product page](//go.microsoft.com/fwlink/?LinkID=759712)
* [Using the Text Analytics client library](../quickstarts/client-libraries-rest-api.md)
* [What's new](../whats-new.md)
cognitive-services Text Analytics How To Entity Linking https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-entity-linking.md
Previously updated : 06/10/2021 Last updated : 06/15/2021
Starting in `v3.1-preview.5`, The JSON response includes a `redactedText` proper
[Named Entity Recognition version 3.1-preview reference for `PII`](https://westus2.dev.cognitive.microsoft.com/docs/services/TextAnalytics-v3-1-Preview-5/operations/EntitiesRecognitionPii)
-The API will attempt to detect the [listed entity categories](../named-entity-types.md?tabs=personal) for a given document language. If you want to specify which entities will be detected and returned, use the optional pii-categories parameter with the appropriate entity categories. This parameter can also let you detect entities that aren't enabled by default for your document language. For example, a French driver's license number that might occur in English text.
+The API will attempt to detect the [listed entity categories](../named-entity-types.md?tabs=personal) for a given document language. If you want to specify which entities will be detected and returned, use the optional `piiCategories` parameter with the appropriate entity categories. This parameter can also let you detect entities that aren't enabled by default for your document language. The following example would detect a French driver's license number that might occur in English text, along with the default English entities.
-`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/entities/recognition/pii?piiCategories=[FRDriversLicenseNumber]`
+> [!TIP]
+> If you don't include `default` when specifying entity categories, the API will only return the entity categories you specify.
+
+`https://<your-custom-subdomain>.cognitiveservices.azure.com/text/analytics/v3.1-preview.5/entities/recognition/pii?piiCategories=default,FRDriversLicenseNumber`
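As a rough sketch, the query string that combines the default categories with an extra one can be assembled like this (the subdomain is a placeholder; this only builds the URL and does not call the service):

```python
from urllib.parse import urlencode

# Placeholder resource subdomain for illustration only.
subdomain = "my-text-analytics-resource"
endpoint = (
    f"https://{subdomain}.cognitiveservices.azure.com"
    "/text/analytics/v3.1-preview.5/entities/recognition/pii"
)

# Keep "default" in the list so the default entities for the document
# language are returned alongside the extra category.
query = urlencode({"piiCategories": "default,FRDriversLicenseNumber"}, safe=",")
url = f"{endpoint}?{query}"
```

A real request would send this URL with a POST body of documents and the resource key in the `Ocp-Apim-Subscription-Key` header.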
**Asynchronous operation**
cognitive-services Text Analytics How To Install Containers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-install-containers.md
In this article, you learned concepts and workflow for downloading, installing,
## Next steps * Review [Configure containers](../text-analytics-resource-container-config.md) for configuration settings
-* Refer to [Frequently asked questions (FAQ)](../text-analytics-resource-faq.md) to resolve issues related to functionality.
+* Refer to [Frequently asked questions (FAQ)](../text-analytics-resource-faq.yml) to resolve issues related to functionality.
cognitive-services Text Analytics How To Keyword Extraction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-keyword-extraction.md
In this article, you learned concepts and workflow for key phrase extraction by
## See also [Text Analytics overview](../overview.md)
- [Frequently asked questions (FAQ)](../text-analytics-resource-faq.md)</br>
+ [Frequently asked questions (FAQ)](../text-analytics-resource-faq.yml)</br>
[Text Analytics product page](//go.microsoft.com/fwlink/?LinkID=759712) ## Next steps
cognitive-services Text Analytics Resource Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/text-analytics-resource-faq.md
- Title: Frequently Asked Questions about the Text Analytics API-
-description: Find answers to commonly asked questions about concepts, code, and scenarios related to the Text Analytics API for Azure Cognitive Services.
------- Previously updated : 03/29/2021--
-# Frequently Asked Questions (FAQ) about the Text Analytics API
-
- Find answers to commonly asked questions about concepts, code, and scenarios related to the Text Analytics API in Azure Cognitive Services.
-
-## What is the maximum size and number of requests I can make to the API?
-
-See the [data limits](concepts/data-limits.md) article for information on the size and number of requests you can send per minute and second.
-
-## Can Text Analytics identify sarcasm?
-
-Analysis is for positive-negative sentiment rather than mood detection.
-
-There is always some degree of imprecision in sentiment analysis, but the model is most useful when there is no hidden meaning or subtext to the content. Irony, sarcasm, humor, and similarly nuanced content rely on cultural context and norms to convey intent. This type of content is among the most challenging to analyze. Typically, the greatest discrepancy between a given score produced by the analyzer and a subjective assessment by a human is for content with nuanced meaning.
-
-## Can I add my own training data or models?
-
-No, the models are pretrained. The only operations available on uploaded data are scoring, key phrase extraction, and language detection. We do not host custom models. If you want to create and host custom machine learning models, consider the [machine learning capabilities in Microsoft R Server](/r-server/r/concept-what-is-the-microsoftml-package).
-
-## Can I request additional languages?
-
-Sentiment analysis and key phrase extraction are available for a [select number of languages](./language-support.md). Natural language processing is complex and requires substantial testing before new functionality can be released. For this reason, we avoid pre-announcing support so that no one takes a dependency on functionality that needs more time to mature.
-
-To help us prioritize which languages to work on next, vote for specific languages using the [feedback tool](https://feedback.azure.com/forums/932041-azure-cognitive-services?category_id=395749).
-
-## Why does key phrase extraction return some words but not others?
-
-Key phrase extraction eliminates non-essential words and standalone adjectives. Adjective-noun combinations, such as "spectacular views" or "foggy weather" are returned together.
-
-Generally, output consists of nouns and objects of the sentence. Output is listed in order of importance, with the first phrase being the most important. Importance is measured by the number of times a particular concept is mentioned, or the relation of that element to other elements in the text.
-
-## Why does output vary, given identical inputs?
-
-Improvements to models and algorithms are announced if the change is major, or quietly slipstreamed into the service if the update is minor. Over time, you might find that the same text input results in a different sentiment score or key phrase output. This is a normal and intentional consequence of using managed machine learning resources in the cloud.
-
-## Service availability and redundancy
-
-### Is Text Analytics service zone resilient?
-
-Yes. The Text Analytics service is zone-resilient by default.
-
-### How do I configure the Text Analytics service to be zone-resilient?
-
-No customer configuration is necessary to enable zone-resiliency. Zone-resiliency for Text Analytics resources is available by default and managed by the service itself.
-
-## Next steps
-
-Is your question about a missing feature or functionality? Consider requesting or voting for it using the [feedback tool](https://feedback.azure.com/forums/932041-azure-cognitive-services?category_id=395749).
-
-## See also
-
- * [StackOverflow: Text Analytics API](https://stackoverflow.com/questions/tagged/text-analytics-api)
- * [StackOverflow: Cognitive Services](https://stackoverflow.com/questions/tagged/microsoft-cognitive)
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/whats-new.md
Previously updated : 06/03/2021 Last updated : 06/17/2021
The Text Analytics API is updated on an ongoing basis. To stay up-to-date with r
* 14 new relation types, * Assertion detection expanded for new entity types and * Linking support for ALLERGEN entity type-
+* A new image for the Text Analytics for health container with tag `3.0.016230002-onprem-amd64` and model version `2021-05-15`. This container is available for download from Microsoft Container Registry.
## May 2021
container-instances Container Instances Region Availability https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-instances/container-instances-region-availability.md
The following regions and maximum resources are available to container groups wi
| Korea Central | 4 | 16 | N/A | N/A | 50 | N/A | | North Central US | 2 | 3.5 | 4 | 16 | 50 | K80, P100, V100 | | North Europe | 4 | 16 | 4 | 16 | 50 | K80 |
+| Norway East | 4 | 16 | N/A | N/A | 50 | N/A |
| South Central US | 4 | 16 | 4 | 16 | 50 | V100 | | Southeast Asia | 4 | 16 | 4 | 16 | 50 | P100, V100 | | South India | 4 | 16 | N/A | N/A | 50 | K80 |
cosmos-db Analytical Store Introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/analytical-store-introduction.md
If you have a globally distributed Azure Cosmos DB account, after you enable ana
## Security
-Authentication with the analytical store is the same as the transactional store for a given database. You can use primary or read-only keys for authentication. You can leverage linked service in Synapse Studio to prevent pasting the Azure Cosmos DB keys in the Spark notebooks. Access to this Linked Service is available to anyone who has access into the workspace.
+* Authentication with the analytical store is the same as the transactional store for a given database. You can use primary or read-only keys for authentication. You can leverage linked service in Synapse Studio to prevent pasting the Azure Cosmos DB keys in the Spark notebooks. Access to this Linked Service is available to anyone who has access into the workspace.
+
+* **Network isolation using private endpoints** - You can control network access to the data in the transactional and analytical stores independently. Network isolation is done using separate managed private endpoints for each store, within managed virtual networks in Azure Synapse workspaces. To learn more, see how to [Configure private endpoints for analytical store](analytical-store-private-endpoints.md) article.
+
+* **Data encryption with customer-managed keys** - You can seamlessly encrypt the data across transactional and analytical stores using the same customer-managed keys in an automatic and transparent manner. Azure Synapse Link only supports configuring customer-managed keys using your Azure Cosmos DB account's managed identity. You must configure your account's managed identity in your Azure Key Vault access policy before [enabling Azure Synapse Link](configure-synapse-link.md#enable-synapse-link) on your account. To learn more, see how to [Configure customer-managed keys using Azure Cosmos DB accounts' managed identities](how-to-setup-cmk.md#using-managed-identity) article.
## Support for multiple Azure Synapse Analytics runtimes
cosmos-db Configure Synapse Link https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/configure-synapse-link.md
Azure Synapse Link is available for Azure Cosmos DB SQL API containers or for Az
## <a id="enable-synapse-link"></a>Enable Azure Synapse Link for Azure Cosmos DB accounts
+> [!NOTE]
+> If you want to use customer-managed keys with Azure Synapse Link, you must configure your account's managed identity in your Azure Key Vault access policy before enabling Synapse Link on your account. To learn more, see how to [Configure customer-managed keys using Azure Cosmos DB accounts' managed identities](how-to-setup-cmk.md#using-managed-identity) article.
+>
### Azure portal 1. Sign into the [Azure portal](https://portal.azure.com/).
After the analytical store is enabled with a particular TTL value, you may want
If you created an analytical store enabled container through the Azure portal, it contains a default analytical TTL of -1. Use the following steps to update this value: 1. Sign in to the [Azure portal](https://portal.azure.com/) or the [Azure Cosmos DB Explorer](https://cosmos.azure.com/).- 1. Navigate to your Azure Cosmos DB account and open the **Data Explorer** tab.- 1. Select an existing container that has analytical store enabled. Expand it and modify the following values:
+ 1. Open the **Scale & Settings** window.
+ 1. Under **Setting** find, **Analytical Storage Time to Live**.
+ 1. Select **On (no default)** or select **On** and set a TTL value.
+ 1. Click **Save** to save the changes.
- * Open the **Scale & Settings** window.
- * Under **Setting**, find **Analytical Storage Time to Live**.
- * Select **On (no default)** or select **On** and set a TTL value
- * Click **Save** to save the changes.
### .NET SDK
To learn more, see the following docs:
* [Apache Spark in Azure Synapse Analytics](../synapse-analytics/spark/apache-spark-concepts.md).
-* [Serverless SQL pool runtime support in Azure Synapse Analytics](../synapse-analytics/sql/on-demand-workspace-overview.md).
+* [Serverless SQL pool runtime support in Azure Synapse Analytics](../synapse-analytics/sql/on-demand-workspace-overview.md).
cosmos-db Conflict Resolution Policies https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/conflict-resolution-policies.md
Azure Cosmos DB offers a flexible policy-driven mechanism to resolve write confl
If you configure your container with the custom resolution option, and you fail to register a merge procedure on the container or the merge procedure throws an exception at runtime, the conflicts are written to the *conflicts feed*. Your application then needs to manually resolve the conflicts in the conflicts feed. To learn more, see [examples of how to use the custom resolution policy and how to use the conflicts feed](how-to-manage-conflicts.md). > [!NOTE]
- > Custom conflict resolution policy is available only for SQL API accounts.
+ > Custom conflict resolution policy is available only for SQL API accounts and can be set only at creation time. It is not possible to set a custom resolution policy on an existing container.
## Next steps
cosmos-db Continuous Backup Restore Resource Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/continuous-backup-restore-resource-model.md
This resource contains a database account instance that can be restored. The dat
| restorableLocations: creationTime | The time in UTC when the regional account was created.| | restorableLocations: deletionTime | The time in UTC when the regional account was deleted. This value is empty if the regional account is live.|
-To get a list of all restorable accounts, see [Restorable Database Accounts - list](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorabledatabaseaccounts/list) or [Restorable Database Accounts- list by location](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorabledatabaseaccounts/listbylocation) articles.
+To get a list of all restorable accounts, see [Restorable Database Accounts - list](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorable-database-accounts/list) or [Restorable Database Accounts- list by location](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorable-database-accounts/list-by-location) articles.
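As an illustrative sketch, the ARM request URL for the subscription-wide list operation could be assembled as follows (the subscription ID is a placeholder, and a real request would also need an Azure AD bearer token in the `Authorization` header):

```python
# Placeholder subscription ID; a real call authenticates with a bearer token.
subscription_id = "00000000-0000-0000-0000-000000000000"
api_version = "2021-04-01-preview"

list_url = (
    "https://management.azure.com"
    f"/subscriptions/{subscription_id}"
    "/providers/Microsoft.DocumentDB/restorableDatabaseAccounts"
    f"?api-version={api_version}"
)
```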
### Restorable SQL database
Each resource contains information of a mutation event such as creation and dele
| operationType | The operation type of this database event. Here are the possible values:<br/><ul><li>Create: database creation event</li><li>Delete: database deletion event</li><li>Replace: database modification event</li><li>SystemOperation: database modification event triggered by the system. This event is not initiated by the user</li></ul> | | database |The properties of the SQL database at the time of the event|
-To get a list of all database mutations, see [Restorable Sql Databases - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorablesqldatabases/list) article.
+To get a list of all database mutations, see [Restorable Sql Databases - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorable-sql-databases/list) article.
### Restorable SQL container
Each resource contains information of a mutation event such as creation and dele
| operationType | The operation type of this container event. Here are the possible values: <br/><ul><li>Create: container creation event</li><li>Delete: container deletion event</li><li>Replace: container modification event</li><li>SystemOperation: container modification event triggered by the system. This event is not initiated by the user</li></ul> | | container | The properties of the SQL container at the time of the event.|
-To get a list of all container mutations under the same database, see [Restorable Sql Containers - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorablesqlcontainers/list) article.
+To get a list of all container mutations under the same database, see [Restorable Sql Containers - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorable-sql-containers/list) article.
### Restorable SQL resources
Each resource represents a single database and all the containers under that dat
| databaseName | The name of the SQL database. | collectionNames | The list of SQL containers under this database.|
-To get a list of SQL database and container combo that exist on the account at the given timestamp and location, see [Restorable Sql Resources - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorablesqlresources/list) article.
+To get a list of the SQL database and container combinations that exist on the account at the given timestamp and location, see [Restorable Sql Resources - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorable-sql-resources/list) article.
### Restorable MongoDB database
Each resource contains information of a mutation event such as creation and dele
| ownerResourceId | The resource ID of the MongoDB database. | | operationType | The operation type of this database event. Here are the possible values:<br/><ul><li> Create: database creation event</li><li> Delete: database deletion event</li><li> Replace: database modification event</li><li> SystemOperation: database modification event triggered by the system. This event is not initiated by the user </li></ul> |
-To get a list of all database mutation, see [Restorable Mongodb Databases - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorablemongodbdatabases/list) article.
+To get a list of all database mutations, see [Restorable Mongodb Databases - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorable-mongodb-databases/list) article.
### Restorable MongoDB collection
Each resource contains information of a mutation event such as creation and dele
| ownerResourceId | The resource ID of the MongoDB collection. | | operationType |The operation type of this collection event. Here are the possible values:<br/><ul><li>Create: collection creation event</li><li>Delete: collection deletion event</li><li>Replace: collection modification event</li><li>SystemOperation: collection modification event triggered by the system. This event is not initiated by the user</li></ul> |
-To get a list of all container mutations under the same database, see [Restorable Mongodb Collections - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorablemongodbcollections/list) article.
+To get a list of all container mutations under the same database, see [Restorable Mongodb Collections - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorable-mongodb-collections/list) article.
### Restorable MongoDB resources
Each resource represents a single database and all the collections under that da
| databaseName |The name of the MongoDB database. | | collectionNames | The list of MongoDB collections under this database. |
-To get a list of all MongoDB database and collection combinations that exist on the account at the given timestamp and location, see [Restorable Mongodb Resources - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorablemongodbresources/list) article.
+To get a list of all MongoDB database and collection combinations that exist on the account at the given timestamp and location, see [Restorable Mongodb Resources - List](/rest/api/cosmos-db-resource-provider/2021-04-01-preview/restorable-mongodb-resources/list) article.
## Next steps
cosmos-db How To Setup Cmk https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-setup-cmk.md
This feature is currently available only for new accounts.
### Is it possible to use customer-managed keys in conjunction with the Azure Cosmos DB [analytical store](analytical-store-introduction.md)?
-Yes, but you must [use your Azure Cosmos DB account's managed identity](#using-managed-identity) in your Azure Key Vault access policy before enabling the analytical store.
+Yes, Azure Synapse Link only supports configuring customer-managed keys using your Azure Cosmos DB account's managed identity. You must [use your Azure Cosmos DB account's managed identity](#using-managed-identity) in your Azure Key Vault access policy before [enabling Azure Synapse Link](configure-synapse-link.md#enable-synapse-link) on your account.
### Is there a plan to support finer granularity than account-level keys?
cosmos-db Managed Identity Based Authentication https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/managed-identity-based-authentication.md
az role assignment create --assignee $principalId --role "DocumentDB Account Con
Now we have a function app that has a system-assigned managed identity with the **DocumentDB Account Contributor** role in the Azure Cosmos DB permissions. The following function app code will get the Azure Cosmos DB keys, create a CosmosClient object, get the temperature of the aquarium, and then save this to Azure Cosmos DB.
-This sample uses the [List Keys API](/rest/api/cosmos-db-resource-provider/2021-04-15/databaseaccounts/listkeys) to access your Azure Cosmos DB account keys.
+This sample uses the [List Keys API](/rest/api/cosmos-db-resource-provider/2021-04-15/database-accounts/list-keys) to access your Azure Cosmos DB account keys.
> [!IMPORTANT]
-> If you want to [assign the Cosmos DB Account Reader](#grant-access-to-your-azure-cosmos-account) role, you'll need to use the [List Read Only Keys API](/rest/api/cosmos-db-resource-provider/2021-04-15/databaseaccounts/listreadonlykeys). This will populate just the read-only keys.
+> If you want to [assign the Cosmos DB Account Reader](#grant-access-to-your-azure-cosmos-account) role, you'll need to use the [List Read Only Keys API](/rest/api/cosmos-db-resource-provider/2021-04-15/database-accounts/list-read-only-keys). This will populate just the read-only keys.
The List Keys API returns the `DatabaseAccountListKeysResult` object. This type isn't defined in the C# libraries. The following code shows the implementation of this class:
cosmos-db Mongodb Custom Commands https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb-custom-commands.md
The create collection extension command creates a new MongoDB collection. The da
collection: "<Collection Name>", shardKey: "<Shard key path>", // Replace the line below with "autoScaleSettings: { maxThroughput: (int) }" to use Autoscale instead of Provisioned Throughput. Fill the required Autoscale max throughput setting.
- offerThroughput: (int) // Provisioned Throughput enabled with required throughput amount set
+ offerThroughput: (int), // Provisioned Throughput enabled with required throughput amount set.
+ indexes: [{key: {_id: 1}}, ... ] // Optional indexes (3.6+ accounts only).
} ```
The following table describes the parameters within the command:
| `offerThroughput` | `int` | Optional | Provisioned throughput to set on the database. If this parameter is not provided, it will default to the minimum, 400 RU/s. * To specify throughput beyond 10,000 RU/s, the `shardKey` parameter is required.| | `shardKey` | `string` | Required for collections with large throughput | The path to the Shard Key for the sharded collection. This parameter is required if you set more than 10,000 RU/s in `offerThroughput`. If it is specified, all documents inserted will require this key and value. | | `autoScaleSettings` | `Object` | Required for [Autoscale mode](provision-throughput-autoscale.md) | This object contains the settings associated with the Autoscale capacity mode. You can set up the `maxThroughput` value, which describes the highest amount of Request Units that the collection will be increased to dynamically. |
+| `indexes` | `Array` | Optional (3.6+ accounts only) | Optionally configure indexes. When present, an index on `_id` is required. Each entry in the array must include a key of one or more fields, and may contain index options. For example, to create a compound unique index on the fields `a` and `b`, use this entry: `{key: {a: 1, b: 1}, unique: true}`. |
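As a sketch of what a concrete command document might look like (the collection and shard key names are hypothetical; with `pymongo` it would be sent via `db.command(...)` against the target database):

```python
# Hypothetical CreateCollection custom action; all names are placeholders.
create_collection_cmd = {
    "customAction": "CreateCollection",
    "collection": "orders",
    "shardKey": "customerId",
    "offerThroughput": 1000,
    # Optional (3.6+ accounts only): when indexes are supplied,
    # an index on _id must be included.
    "indexes": [
        {"key": {"_id": 1}},
        {"key": {"a": 1, "b": 1}, "unique": True},  # compound unique index
    ],
}
```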
### Output
The update collection extension command updates the properties associated with t
customAction: "UpdateCollection", collection: "<Name of the collection that you want to update>", // Replace the line below with "autoScaleSettings: { maxThroughput: (int) }" if using Autoscale instead of Provisioned Throughput. Fill the required Autoscale max throughput setting. Changing between Autoscale and Provisioned throughput is only supported in the Azure Portal.
- offerThroughput: (int) // Provisioned Throughput enabled with required throughput amount set
+ offerThroughput: (int), // Provisioned Throughput enabled with required throughput amount set.
+ indexes: [{key: {_id: 1}}, ... ] // Optional indexes (3.6+ accounts only).
} ```
The following table describes the parameters within the command:
| `collection` | `string` | Name of the collection. | | `offerThroughput` | `int` | Provisioned throughput to set on the collection.| | `autoScaleSettings` | `Object` | Required for [Autoscale mode](provision-throughput-autoscale.md). This object contains the settings associated with the Autoscale capacity mode. The `maxThroughput` value describes the highest amount of Request Units that the collection will be increased to dynamically. |
+| `indexes` | `Array` | Optionally configure indexes. This parameter is supported for 3.6+ accounts only. When present, the existing indexes of the collection are replaced by the set of indexes specified (including dropping indexes). An index on `_id` is required. Each entry in the array must include a key of one or more fields, and may contain index options. For example, to create a compound unique index on the fields `a` and `b`, use this entry: `{key: {a: 1, b: 1}, unique: true}`. |
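A corresponding sketch for `UpdateCollection` (names are hypothetical). Because the `indexes` array replaces the collection's existing indexes, the required `_id` index must be restated:

```python
# Hypothetical UpdateCollection custom action; all names are placeholders.
update_collection_cmd = {
    "customAction": "UpdateCollection",
    "collection": "orders",
    "offerThroughput": 2000,
    # Replaces ALL existing indexes (including dropping any not listed),
    # so the _id index must appear again.
    "indexes": [
        {"key": {"_id": 1}},
        {"key": {"a": 1, "b": 1}, "unique": True},
    ],
}
```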
## Output
cosmos-db Role Based Access Control https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/role-based-access-control.md
description: Learn how Azure Cosmos DB provides database protection with Active
Previously updated : 05/27/2021 Last updated : 06/17/2021
In addition to the built-in roles, users may also create [custom roles](../role-
## <a id="prevent-sdk-changes"></a>Preventing changes from the Azure Cosmos DB SDKs
-The Azure Cosmos DB resource provider can be locked down to prevent any changes to resources from a client connecting using the account keys (that is applications connecting via the Azure Cosmos SDK). This also includes changes made from the Azure portal. This feature may be desirable for users who want higher degrees of control and governance for production environments. Preventing changes from the SDK also enables features such as resource locks and diagnostic logs for control plane operations. The clients connecting from Azure Cosmos DB SDK will be prevented from changing any property for the Azure Cosmos accounts, databases, containers, and throughput. The operations involving reading and writing data to Cosmos containers themselves are not impacted.
+The Azure Cosmos DB resource provider can be locked down to prevent any changes to resources from a client connecting using the account keys (that is applications connecting via the Azure Cosmos SDK). This feature may be desirable for users who want higher degrees of control and governance for production environments. Preventing changes from the SDK also enables features such as resource locks and diagnostic logs for control plane operations. The clients connecting from Azure Cosmos DB SDK will be prevented from changing any property for the Azure Cosmos accounts, databases, containers, and throughput. The operations involving reading and writing data to Cosmos containers themselves are not impacted.
When this feature is enabled, changes to any resource can only be made from a user with the right Azure role and Azure Active Directory credentials including Managed Service Identities.
This setting will prevent any changes to any Cosmos resource from any client con
- Modifying stored procedures, triggers or user-defined functions.
-If your applications (or users via Azure portal) perform any of these actions they will need to be migrated to execute via [ARM Templates](./manage-with-templates.md), [PowerShell](manage-with-powershell.md), [Azure CLI](manage-with-cli.md), REST, or [Azure Management Library](https://github.com/Azure-Samples/cosmos-management-net). Note that Azure Management is available in [multiple languages](/azure/index.yml?product=developer-tools#languages-and-tools).
+If your applications (or users via Azure portal) perform any of these actions they will need to be migrated to execute via [ARM Templates](./manage-with-templates.md), [PowerShell](manage-with-powershell.md), [Azure CLI](manage-with-cli.md), REST, or [Azure Management Library](https://github.com/Azure-Samples/cosmos-management-net). Note that Azure Management is available in [multiple languages](/azure/?product=featured#languages-and-tools).
### Set via ARM Template
cosmos-db Synapse Link https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/synapse-link.md
Synapse Link enables you to run near real-time analytics over your mission-criti
* **Network isolation using private endpoints** - You can control network access to the data in the transactional and analytical stores independently. Network isolation is done using separate managed private endpoints for each store, within managed virtual networks in Azure Synapse workspaces. To learn more, see how to [Configure private endpoints for analytical store](analytical-store-private-endpoints.md) article.
-* **Data encryption with customer-managed keys** - You can seamlessly encrypt the data across transactional and analytical stores using the same customer-managed keys in an automatic and transparent manner. To learn more, see how to [Configure customer-managed keys](how-to-setup-cmk.md) article.
+* **Data encryption with customer-managed keys** - You can seamlessly encrypt the data across transactional and analytical stores using the same customer-managed keys in an automatic and transparent manner. Azure Synapse Link only supports configuring customer-managed keys using your Azure Cosmos DB account's managed identity. You must configure your account's managed identity in your Azure Key Vault access policy before [enabling Azure Synapse Link](configure-synapse-link.md#enable-synapse-link) on your account. To learn more, see how to [Configure customer-managed keys using Azure Cosmos DB accounts' managed identities](how-to-setup-cmk.md#using-managed-identity) article.
* **Secure key management** - Accessing the data in analytical store from Synapse Spark and Synapse serverless SQL pools requires managing Azure Cosmos DB keys within Synapse Analytics workspaces. Instead of using the Azure Cosmos DB account keys inline in Spark jobs or SQL scripts, Azure Synapse Link provides more secure capabilities.
cost-management-billing Tutorial Acm Create Budgets https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/costs/tutorial-acm-create-budgets.md
Title: Tutorial - Create and manage Azure budgets
description: This tutorial helps you plan and account for the costs of Azure services that you consume. Previously updated : 04/26/2021 Last updated : 06/17/2021
Budget integration with action groups only works for action groups that have the
## Create and edit budgets with PowerShell
-If you're an EA customer, you can create and edit budgets programmatically using the Azure PowerShell module. To download the latest version of Azure PowerShell, run the following command:
+If you're an EA customer, you can create and edit budgets programmatically using the Azure PowerShell module.
+
+>[!Note]
+>Customers with a Microsoft Customer Agreement should use the [Budgets REST API](/rest/api/consumption/budgets/create-or-update) to create budgets programmatically because PowerShell and CLI aren't yet supported.
+
+To download the latest version of Azure PowerShell, run the following command:
```azurepowershell-interactive install-module -name Az
$ActionGroupId = (Set-AzActionGroup -ResourceGroupName YourResourceGroup -Name T
#Create a monthly budget that sends an email and triggers an Action Group to send a second email. Make sure the StartDate for your monthly budget is set to the first day of the current month. Note that Action Groups can also be used to trigger automation such as Azure Functions or Webhooks.
+Get-AzContext
New-AzConsumptionBudget -Amount 100 -Name TestPSBudget -Category Cost -StartDate 2020-02-01 -TimeGrain Monthly -EndDate 2022-12-31 -ContactEmail test@test.com -NotificationKey Key1 -NotificationThreshold 0.8 -NotificationEnabled -ContactGroup $ActionGroupId ```
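Customers on a Microsoft Customer Agreement can achieve the same result with the Budgets REST API mentioned in the note above. As an illustrative sketch (not official sample code), the following Python builds a request body mirroring the `New-AzConsumptionBudget` parameters; field names follow the public Budgets API schema, and note that the REST API expresses the notification threshold as a percentage (80) rather than a fraction (0.8). The body would be sent in a `PUT` to the budget's scope URL.

```python
import json

def budget_payload(amount, start, end, contact_email, threshold_percent):
    """Build a Consumption Budgets REST API request body mirroring the
    New-AzConsumptionBudget call above. Field names follow the public
    Budgets API schema; the notification key and values are illustrative."""
    return {
        "properties": {
            "category": "Cost",
            "amount": amount,
            "timeGrain": "Monthly",
            "timePeriod": {"startDate": start, "endDate": end},
            "notifications": {
                "Key1": {
                    "enabled": True,
                    "operator": "GreaterThan",
                    "threshold": threshold_percent,
                    "contactEmails": [contact_email],
                }
            },
        }
    }

body = budget_payload(100, "2020-02-01T00:00:00Z", "2022-12-31T00:00:00Z",
                      "test@test.com", 80)
print(json.dumps(body, indent=2))
```

The resulting JSON is what the `Create Or Update` operation of the Budgets API accepts in its request body.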
cost-management-billing Cancel Azure Subscription https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/cancel-azure-subscription.md
After you cancel, billing is stopped immediately. However, it can take up to 10
After you cancel, your services are disabled. That means your virtual machines are de-allocated, temporary IP addresses are freed, and storage is read-only.
-After your subscription is canceled, and the subscription does not have any active resources, Microsoft waits 30 - 90 days before permanently deleting your data in case you need to access it or you change your mind. We don't charge you for keeping the data. To learn more, see [Microsoft Trust Center - How we manage your data](https://go.microsoft.com/fwLink/p/?LinkID=822930&clcid=0x409).
+After your subscription is canceled, Microsoft waits 30 - 90 days before permanently deleting your data in case you need to access it or you change your mind. We don't charge you for keeping the data. To learn more, see [Microsoft Trust Center - How we manage your data](https://go.microsoft.com/fwLink/p/?LinkID=822930&clcid=0x409).
## Delete free account or pay-as-you-go subscription
-If you have a free account or pay-as-you-go subscription, you don't have to wait 30 days for the subscription to automatically delete. The **Delete subscription** option becomes available three days after you cancel a subscription.
+If you have a free account or a pay-as-you-go subscription, you don't have to wait 30 to 90 days for the subscription to automatically delete. The **Delete** option becomes available 3 days after you cancel a subscription. After 3 days, if you don't have any resources under the canceled subscription, you can delete the subscription.
-1. Wait three days after the date you canceled the subscription.
-1. Select your subscription on the [Subscriptions](https://portal.azure.com/#blade/Microsoft_Azure_Billing/SubscriptionsBlade) page in the Azure portal.
-1. Select the subscription that you want to delete.
-1. Select **Overview**, and then select **Delete subscription**.
+Follow these steps to delete a subscription:
+
+1. Wait 3 days after the date you canceled the subscription.
+2. Ensure that you don't have any resources under the subscription before you try to delete it.
+3. In the Azure portal, go to [Subscriptions](https://portal.azure.com/#blade/Microsoft_Azure_Billing/SubscriptionsBlade).
+4. Select the subscription you want to delete.
+5. Select **Overview**, and then select **Delete**.
+6. Type the subscription name in the confirmation prompt, and then select **Delete subscription**.
## Delete other subscriptions
cost-management-billing Pay By Invoice https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/pay-by-invoice.md
tags: billing
Previously updated : 01/13/2021 Last updated : 06/16/2021
If you switch to pay by invoice, that means you pay your bill within 30 days of
## Request to pay by invoice 1. Sign in to the Azure portal to submit a support request. Search for and select **Help + support**.
- ![Search for Help and support, Microsoft Azure portal](./media/pay-by-invoice/search-for-help-and-support.png)
+ :::image type="content" source="./media/pay-by-invoice/search-for-help-and-support.png" alt-text="Screenshot of searching for Help and support." lightbox="./media/pay-by-invoice/search-for-help-and-support.png" :::
1. Select **New support request**.
- ![New support request link, Help and support screen, Microsoft Azure portal](./media/pay-by-invoice/help-and-support.png)
+ :::image type="content" source="./media/pay-by-invoice/help-and-support.png" alt-text="Screenshot of the New support request link." lightbox="./media/pay-by-invoice/help-and-support.png" :::
1. Select **Billing** as the **Issue type**. The *issue type* is the support request category. Select the subscription for which you want to pay by invoice, select a support plan, and then select **Next**. 1. Select **Payment** as the **Problem Type**. The *problem type* is the support request subcategory. 1. Select **Switch to Pay by Invoice** as the **Problem subtype**.
If you have a Microsoft Online Services Program account, you can switch your Azu
Follow the steps below to switch your Azure subscription to invoice pay (check/wire transfer). *Once you switch to invoice pay (check/wire transfer), you can't switch back to a credit card*. 1. Go to the Azure portal to sign in as the Account Administrator. Search for and select **Cost Management + Billing**.
- ![Screenshot shows search for Cost Management and Billing in the Azure portal.](./media/pay-by-invoice/search.png)
+ :::image type="content" source="./media/pay-by-invoice/search.png" alt-text="Screenshot showing search for Cost Management + Billing in the Azure portal." lightbox="./media/pay-by-invoice/search.png" :::
1. Select the subscription you'd like to switch to invoice payment. 1. Select **Payment methods**. 1. In the command bar, select the **Pay by invoice** button.
- ![Pay by invoice button, Payment methods, Microsoft Azure portal](./media/pay-by-invoice/pay-by-invoice.png)
+ :::image type="content" source="./media/pay-by-invoice/pay-by-invoice.png" alt-text="Screenshot showing Pay by invoice." lightbox="./media/pay-by-invoice/pay-by-invoice.png" :::
### Switch billing profile to check/wire transfer
Follow the steps below to switch a billing profile to check/wire transfer. Only
1. Go to the Azure portal view your billing information. Search for and select **Cost Management + Billing**. 1. In the menu, choose **Billing profiles**.
- ![Billing profiles menu item, Cost Management and Billing, Microsoft Azure portal](./media/pay-by-invoice/billing-profile.png)
+ :::image type="content" source="./media/pay-by-invoice/billing-profile.png" alt-text="Screenshot showing Billing profiles menu item." lightbox="./media/pay-by-invoice/billing-profile.png" :::
1. Select a billing profile. 1. In the **Billing profile** menu, select **Payment methods**.
- ![Payment methods menu item, Billing profiles, Cost Management, Microsoft Azure portal](./media/pay-by-invoice/billing-profile-payment-methods.png)
-1. Select the banner that says you're eligible to pay by check/wire transfer.
- ![Banner to switch to check/wire, Payment methods, Microsoft Azure portal](./media/pay-by-invoice/customer-led-switch-to-invoice.png)
+ :::image type="content" source="./media/pay-by-invoice/billing-profile-payment-methods.png" alt-text="Screenshot showing Payment methods menu item." lightbox="./media/pay-by-invoice/billing-profile-payment-methods.png" :::
+1. Under the *Other payment methods* heading, select the ellipsis (...) symbol, and then select **Make default**.
+ :::image type="content" source="./media/pay-by-invoice/customer-led-switch-to-invoice.png" alt-text="Screenshot showing Check/wire transfer ellipsis and Make default option." lightbox="./media/pay-by-invoice/customer-led-switch-to-invoice.png" :::
## Check access to a Microsoft Customer Agreement [!INCLUDE [billing-check-mca](../../../includes/billing-check-mca.md)]
data-factory Connector Azure Databricks Delta Lake https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-databricks-delta-lake.md
Previously updated : 03/29/2021 Last updated : 06/16/2021 # Copy data to and from Azure Databricks Delta Lake by using Azure Data Factory
To copy data to Azure Databricks Delta Lake, the following properties are suppor
| Property | Description | Required | | : | :-- | :- | | type | The type property of the Copy activity sink, set to **AzureDatabricksDeltaLakeSink**. | Yes |
-| preCopyScript | Specify a SQL query for the Copy activity to run before writing data into Databricks delta table in each run. You can use this property to clean up the preloaded data, or add a truncate table or Vacuum statement. | No |
+| preCopyScript | Specify a SQL query for the Copy activity to run before writing data into the Databricks delta table in each run. Example: `VACUUM eventsTable DRY RUN`. You can use this property to clean up the preloaded data, or add a truncate table or Vacuum statement. | No |
| importSettings | Advanced settings used to write data into delta table. | No | | ***Under `importSettings`:*** | | | | type | The type of import command, set to **AzureDatabricksDeltaLakeImportCommand**. | Yes |
If your source data store and format meet the criteria described in this section
"type": "<source type>" }, "sink": {
- "type": "AzureDatabricksDeltaLakeSink"
+ "type": "AzureDatabricksDeltaLakeSink",
+ "preCopyScript": "VACUUM eventsTable DRY RUN"
} } }
data-factory Connector Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sql-server.md
We suggest that you enable parallel copy with data partitioning, especially when
| Scenario | Suggested settings | | | | | Full load from large table, with physical partitions. | **Partition option**: Physical partitions of table. <br><br/>During execution, Data Factory automatically detects the physical partitions, and copies data by partitions. <br><br/>To check if your table has physical partition or not, you can refer to [this query](#sample-query-to-check-physical-partition). |
-| Full load from large table, without physical partitions, while with an integer or datetime column for data partitioning. | **Partition options**: Dynamic range partition.<br>**Partition column** (optional): Specify the column used to partition data. If not specified, the index or primary key column is used.<br/>**Partition upper bound** and **partition lower bound** (optional): Specify if you want to determine the partition stride. This is not for filtering the rows in table, all rows in the table will be partitioned and copied. If not specified, copy activity auto detect the values.<br><br>For example, if your partition column "ID" has values range from 1 to 100, and you set the lower bound as 20 and the upper bound as 80, with parallel copy as 4, Data Factory retrieves data by 4 partitions - IDs in range <=20, [21, 50], [51, 80], and >=81, respectively. |
+| Full load from large table, without physical partitions, while with an integer or datetime column for data partitioning. | **Partition options**: Dynamic range partition.<br>**Partition column** (optional): Specify the column used to partition data. If not specified, the primary key column is used.<br/>**Partition upper bound** and **partition lower bound** (optional): Specify if you want to determine the partition stride. This is not for filtering the rows in the table; all rows in the table will be partitioned and copied. If not specified, the copy activity auto detects the values, which can take a long time depending on the MIN and MAX values, so providing the upper and lower bounds is recommended. <br><br>For example, if your partition column "ID" has values ranging from 1 to 100, and you set the lower bound as 20 and the upper bound as 80, with parallel copy as 4, Data Factory retrieves data by 4 partitions - IDs in range <=20, [21, 50], [51, 80], and >=81, respectively. |
| Load a large amount of data by using a custom query, without physical partitions, while with an integer or date/datetime column for data partitioning. | **Partition options**: Dynamic range partition.<br>**Query**: `SELECT * FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition AND <your_additional_where_clause>`.<br>**Partition column**: Specify the column used to partition data.<br>**Partition upper bound** and **partition lower bound** (optional): Specify if you want to determine the partition stride. This is not for filtering the rows in table, all rows in the query result will be partitioned and copied. If not specified, copy activity auto detect the value.<br><br>During execution, Data Factory replaces `?AdfRangePartitionColumnName` with the actual column name and value ranges for each partition, and sends to SQL Server. <br>For example, if your partition column "ID" has values range from 1 to 100, and you set the lower bound as 20 and the upper bound as 80, with parallel copy as 4, Data Factory retrieves data by 4 partitions- IDs in range <=20, [21, 50], [51, 80], and >=81, respectively. <br><br>Here are more sample queries for different scenarios:<br> 1. Query the whole table: <br>`SELECT * FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition`<br> 2. Query from a table with column selection and additional where-clause filters: <br>`SELECT <column_list> FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition AND <your_additional_where_clause>`<br> 3. Query with subqueries: <br>`SELECT <column_list> FROM (<your_sub_query>) AS T WHERE ?AdfDynamicRangePartitionCondition AND <your_additional_where_clause>`<br> 4. Query with partition in subquery: <br>`SELECT <column_list> FROM (SELECT <your_sub_query_column_list> FROM <TableName> WHERE ?AdfDynamicRangePartitionCondition) AS T` |
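The bound-based splitting described in the table can be modeled in a few lines. This is an illustrative sketch of how the stride might be derived from the bounds and the parallel-copy degree, not Data Factory's internal algorithm:

```python
def dynamic_range_partitions(lower, upper, parallel):
    """Model how a copy with `parallel` partitions could split an integer
    partition column: one open-ended partition below the lower bound, one
    above the upper bound, and evenly strided ranges in between.
    (Illustrative only -- not Data Factory's actual implementation.)"""
    middle = parallel - 2              # partitions strictly between the bounds
    stride = (upper - lower) // middle
    parts = [("min", lower)]           # IDs <= lower bound
    start = lower + 1
    for i in range(middle):
        end = upper if i == middle - 1 else start + stride - 1
        parts.append((start, end))
        start = end + 1
    parts.append((upper + 1, "max"))   # IDs >= upper bound + 1
    return parts

# Bounds 20/80 with parallel copy 4 reproduce the ranges in the table above:
# <=20, [21, 50], [51, 80], >=81
print(dynamic_range_partitions(20, 80, 4))
```

Under this model, the two open-ended partitions catch rows outside the bounds, which is why the bounds control stride rather than filter rows.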
When you copy data from/to SQL Server with [Always Encrypted](/sql/relational-da
6. **Verify connection**: To connect to SQL Server by using a fully qualified name, use SQL Server Management Studio from a different machine. An example is `"<machine>.<domain>.corp.<company>.com,1433"`. ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
data-factory Control Flow Set Variable Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-set-variable-activity.md
Below is an example of this pattern:
} ```
+Variables are currently scoped at the pipeline level, which means they are not thread safe. They can cause unexpected and undesired behavior if they are accessed from within a parallel iteration activity such as a ForEach loop, especially when the value is also being modified within that ForEach activity.
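The hazard can be illustrated with a plain read-modify-write model. This is a generic concurrency sketch, not Azure Data Factory code: when every parallel iteration snapshots the variable before any write lands, all but one update is lost.

```python
def run_iterations(parallel):
    """Simulate two ForEach iterations incrementing a shared pipeline-style
    variable. In the parallel case, both iterations read the variable before
    either writes it back, so one increment is lost. (Generic concurrency
    illustration, not Azure Data Factory internals.)"""
    var = 0
    if parallel:
        snapshots = [var, var]   # both iterations read var == 0
        for s in snapshots:
            var = s + 1          # each writes back stale_value + 1
    else:
        for _ in range(2):       # sequential: each iteration sees prior write
            var = var + 1
    return var

print(run_iterations(True))   # lost update: ends at 1, not 2
print(run_iterations(False))  # sequential: ends at 2
```

Running the ForEach sequentially (or avoiding Set Variable on a shared variable inside it) avoids the lost update.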
## Next steps Learn about a related control flow activity supported by Data Factory:
data-factory Create Self Hosted Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/create-self-hosted-integration-runtime.md
Installation of the self-hosted integration runtime on a domain controller isn't
- [Visual C++ 2010 Redistributable](https://download.microsoft.com/download/3/2/2/3224B87F-CFA0-4E70-BDA3-3DE650EFEBA5/vcredist_x64.exe) Package (x64) - Java Runtime (JRE) version 8 from a JRE provider such as [Adopt OpenJDK](https://adoptopenjdk.net/). Ensure that the `JAVA_HOME` environment variable is set to the JRE folder (and not just the JDK folder).
+> [!NOTE]
+> If you are running in a government cloud, review [Connect to government cloud](https://docs.microsoft.com/azure/azure-government/documentation-government-get-started-connect-with-ps).
+ ## Setting up a self-hosted integration runtime To create and set up a self-hosted integration runtime, use the following procedures.
Use the following steps to create a self-hosted IR using Azure Data Factory UI.
### Set up a self-hosted IR on an Azure VM via an Azure Resource Manager template
-You can automate self-hosted IR setup on an Azure virtual machine by using the [Create self host IR template](https://github.com/Azure/azure-quickstart-templates/tree/master/101-vms-with-selfhost-integration-runtime). The template provides an easy way to have a fully functional self-hosted IR inside an Azure virtual network. The IR has high-availability and scalability features, as long as you set the node count to 2 or higher.
+You can automate self-hosted IR setup on an Azure virtual machine by using the [Create self host IR template](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.compute/vms-with-selfhost-integration-runtime). The template provides an easy way to have a fully functional self-hosted IR inside an Azure virtual network. The IR has high-availability and scalability features, as long as you set the node count to 2 or higher.
### Set up an existing self-hosted IR via local PowerShell
data-factory Pipeline Trigger Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/pipeline-trigger-troubleshoot-guide.md
Title: Troubleshoot pipeline orchestration and triggers in Azure Data Factory
description: Use different methods to troubleshoot pipeline trigger issues in Azure Data Factory. Previously updated : 04/01/2021 Last updated : 06/17/2021
This can happen if you have not scaled up the SHIR to match your workload.
Long queue related error messages can appear for various reasons. + **Resolution** * If you receive an error message from any source or destination via connectors, which can generate a long queue, go to [Connector Troubleshooting Guide.](./connector-troubleshoot-guide.md) * If you receive an error message about Mapping Data Flow, which can generate a long queue, go to [Data Flows Troubleshooting Guide.](./data-flow-troubleshoot-guide.md) * If you receive an error message about other activities, such as Databricks, custom activities, or HDI, which can generate a long queue, go to [Activity Troubleshooting Guide.](./data-factory-troubleshoot-guide.md)
-* If you receive an error message about running SSIS packages, which can generate a long queue, go to the [Azure-SSIS Package Execution Troubleshooting Guide](./ssis-integration-runtime-ssis-activity-faq.md) and [Integration Runtime Management Troubleshooting Guide.](./ssis-integration-runtime-management-troubleshoot.md)
+* If you receive an error message about running SSIS packages, which can generate a long queue, go to the [Azure-SSIS Package Execution Troubleshooting Guide](./ssis-integration-runtime-ssis-activity-faq.yml) and [Integration Runtime Management Troubleshooting Guide.](./ssis-integration-runtime-management-troubleshoot.md)
### Error message - "code":"BadRequest", "message":"null"
It is a user error because the JSON payload that hits management.azure.com is corru
Perform network tracing of your API call from the ADF portal using the Edge/Chrome browser **Developer tools**. You will see the offending JSON payload, which could be due to special characters (for example, $), spaces, and other types of user input. Once you fix the string expression, you can proceed with the rest of your ADF usage in the browser.
+### ForEach activities do not run in parallel mode
+
+**Cause**
+
+You are running ADF in debug mode.
+
+**Resolution**
+
+Run the pipeline in trigger mode.
+
+### Cannot publish because the account is locked
+
+**Cause**
+
+You made changes in the collaboration branch to remove a storage event trigger. When you try to publish, you encounter a "Trigger deactivation error" message because the storage account used for the event trigger is locked.
+
+**Resolution**
+
+Remove the lock to allow publish to succeed.
## Next steps
data-factory Ssis Integration Runtime Ssis Activity Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/ssis-integration-runtime-ssis-activity-faq.md
- Title: Troubleshoot package execution in the SSIS integration runtime
-description: "This article provides troubleshooting guidance for SSIS package execution in the SSIS integration runtime"
------ Previously updated : 04/15/2019--
-# Troubleshoot package execution in the SSIS integration runtime
--
-This article includes the most common errors that you might find when you're executing SQL Server Integration Services (SSIS) packages in the SSIS integration runtime. It describes the potential causes and actions to solve the errors.
-
-## Where to find logs for troubleshooting
-
-Use the Azure Data Factory portal to check the output of the SSIS package execution activity. The output includes the execution result, error messages, and operation ID. For details, see [Monitor the pipeline](how-to-invoke-ssis-package-ssis-activity.md#monitor-the-pipeline).
-
-Use the SSIS catalog (SSISDB) to check the detail logs for the execution. For details, see [Monitor Running Packages and Other Operations](/sql/integration-services/performance/monitor-running-packages-and-other-operations).
-
-## Common errors, causes, and solutions
-
-### Error message: "Connection Timeout Expired" or "The service has encountered an error processing your request. Please try again."
-
-Here are potential causes and recommended actions:
-* The data source or destination is overloaded. Check the load on your data source or destination and see whether it has enough capacity. For example, if you used Azure SQL Database, consider scaling up if the database is likely to time out.
-* The network between the SSIS integration runtime and the data source or destination is unstable, especially when the connection is cross-region or between on-premises and Azure. Apply the retry pattern in the SSIS package by following these steps:
- * Make sure your SSIS packages can rerun on failure without side effects (for example, data loss or data duplication).
- * Configure **Retry** and **Retry interval** of **Execute SSIS Package** activity on the **General** tab.
- ![Set properties on the General tab](media/how-to-invoke-ssis-package-ssis-activity/ssis-activity-general.png)
- * For an ADO.NET and OLE DB source or destination component, set **ConnectRetryCount** and **ConnectRetryInterval** in Connection Manager in the SSIS package or SSIS activity.
-
-### Error message: "ADO NET Source has failed to acquire the connection '...'" with "A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible."
-
-This issue usually means the data source or destination is inaccessible from the SSIS integration runtime. The reasons can vary. Try these actions:
-* Make sure you're passing the data source or destination name/IP correctly.
-* Make sure the firewall is set properly.
-* Make sure your virtual network is configured properly if your data source or destination is on-premises:
- * You can verify whether the issue is from virtual network configuration by provisioning an Azure VM in the same virtual network. Then check whether the data source or destination can be accessed from the Azure VM.
- * You can find more details about using a virtual network with an SSIS integration runtime in [Join an Azure-SSIS integration runtime to a virtual network](join-azure-ssis-integration-runtime-virtual-network.md).
-
-### Error message: "ADO NET Source has failed to acquire the connection '...'" with "Could not create a managed connection manager."
-
-The potential cause is that the ADO.NET provider used in the package isn't installed in the SSIS integration runtime. You can install the provider by using a custom setup. You can find more details about custom setup in [Customize setup for the Azure-SSIS integration runtime](how-to-configure-azure-ssis-ir-custom-setup.md).
-
-### Error message: "The connection '...' is not found"
-
-A known issue in older versions of SQL Server Management Studio (SSMS) can cause this error. If the package contains a custom component (for example, SSIS Azure Feature Pack or partner components) that isn't installed on the machine where SSMS is used to do the deployment, SSMS will remove the component and cause the error. Upgrade [SSMS](/sql/ssms/download-sql-server-management-studio-ssms) to the latest version that has the issue fixed.
-
-### Error message: "SSIS Executor exit code: -1073741819."
-
-* Potential cause & recommended action:
- This error may be because of the limitation for Excel source and destination when multiple Excel sources or destinations are executing in parallel in multi-thread. You can work around this limitation by changing your Excel components to execute in sequence, or separating them into different packages and triggering them through "Execute Package Task" with the ExecuteOutOfProcess property set to True.
-
-### Error message: "There is not enough space on the disk"
-
-This error means the local disk is used up in the SSIS integration runtime node. Check whether your package or custom setup is consuming a lot of disk space:
-* If the disk is consumed by your package, it will be freed up after the package execution finishes.
-* If the disk is consumed by your custom setup, you'll need to stop the SSIS integration runtime, modify your script, and start the integration runtime again. The whole Azure blob container that you specified for custom setup will be copied to the SSIS integration runtime node, so check whether there's any unnecessary content under that container.
-
-### Error message: "Failed to retrieve resource from master. Microsoft.SqlServer.IntegrationServices.Scale.ScaleoutContract.Common.MasterResponseFailedException: Code:300004. Description:Load file "***" failed."
-
-* Potential cause & recommended action:
- * If the SSIS Activity is executing package from file system (package file or project file), this error will occur if the project, package or configuration file is not accessible with the package access credential you provided in the SSIS Activity
- * If you are using Azure File:
- * The file path should start with \\\\\<storage account name\>.file.core.windows.net\\\<file share path\>
- * The domain should be "Azure"
- * The username should be \<storage account name\>
- * The password should be \<storage access key\>
- If you are using an on-premises file, check whether the VNet, package access credential, and permission are configured properly so that your Azure-SSIS integration runtime can access your on-premises file share
-
-### Error message: "The file name '...' specified in the connection was not valid"
-
-* Potential cause & recommended action:
- * An invalid file name is specified
- Make sure you are using the FQDN (Fully Qualified Domain Name) instead of a short name in your connection manager
-
-### Error message: "Cannot open file '...'"
-
-This error occurs when package execution can't find a file in the local disk in the SSIS integration runtime. Try these actions:
-* Don't use the absolute path in the package that's being executed in the SSIS integration runtime. Use the current execution working directory (.) or the temp folder (%TEMP%) instead.
-* If you need to persist some files on SSIS integration runtime nodes, prepare the files as described in [Customize setup](how-to-configure-azure-ssis-ir-custom-setup.md). All the files in the working directory will be cleaned up after the execution is finished.
-* Use Azure Files instead of storing the file in the SSIS integration runtime node. For details, see [Use Azure file shares](/sql/integration-services/lift-shift/ssis-azure-files-file-shares#use-azure-file-shares).
-
-### Error message: "The database 'SSISDB' has reached its size quota"
-
-A potential cause is that the SSISDB database created in Azure SQL Database or in SQL Managed Instance has reached its quota. Try these actions:
-* Consider increasing the DTU of your database. You can find details in [SQL Database limits for a logical server](../azure-sql/database/resource-limits-logical-server.md).
-* Check whether your package would generate many logs. If so, you can configure an elastic job to clean up these logs. For details, see [Clean up SSISDB logs with Azure Elastic Database jobs](how-to-clean-up-ssisdb-logs-with-elastic-jobs.md).
-
-### Error message: "The request limit for the database is ... and has been reached."
-
-If many packages are running in parallel in the SSIS integration runtime, this error might occur because SSISDB has hit its request limit. Consider increasing the DTU of SSISDB to resolve this issue. You can find details in [SQL Database limits for a logical server](../azure-sql/database/resource-limits-logical-server.md).
-
-### Error message: "SSIS Operation failed with unexpected operation status: ..."
-
-The error is mostly caused by a transient problem, so try to rerun the package execution. Apply the retry pattern in the SSIS package by following these steps:
-
-* Make sure your SSIS packages can rerun on failure without side effects (for example, data loss or data duplication).
-* Configure **Retry** and **Retry interval** of **Execute SSIS Package** activity on the **General** tab.
- ![Set properties on the General tab](media/how-to-invoke-ssis-package-ssis-activity/ssis-activity-general.png)
-* For an ADO.NET and OLE DB source or destination component, set **ConnectRetryCount** and **ConnectRetryInterval** in Connection Manager in the SSIS package or SSIS activity.
-
-### Error message: "There is no active worker."
-
-This error usually means the SSIS integration runtime has an unhealthy status. Check the Azure portal for the status and detailed errors. For more information, see [Azure-SSIS integration runtime](./monitor-integration-runtime.md#azure-ssis-integration-runtime).
-
-### Error message: "Your integration runtime cannot be upgraded and will eventually stop working, since we cannot access the Azure Blob container you provided for custom setup."
-
-This error occurs when the SSIS integration runtime can't access the storage configured for custom setup. Check whether the shared access signature (SAS) URI that you provided is valid and hasn't expired.
-
-### Error message: "Microsoft OLE DB Provider for Analysis Services. 'Hresult: 0x80004005 Description:' COM error: COM error: mscorlib; Exception has been thrown by the target of an invocation"
-
-One potential cause is that the username or password with Azure AD Multi-Factor Authentication enabled is configured for Azure Analysis Services authentication. This authentication isn't supported in the SSIS integration runtime. Try to use a service principal for Azure Analysis Services authentication:
-
-1. Prepare a service principal as described in [Automation with service principals](../analysis-services/analysis-services-service-principal.md).
-2. In the Connection Manager, configure **Use a specific user name and password:** set **app:*&lt;AppID&gt;*@*&lt;TenantID&gt;*** as the username and clientSecret as the password. Here is an example of a correctly formatted user name:
-
- `app:12345678-9012-3456-789a-bcdef012345678@9abcdef0-1234-5678-9abc-def0123456789abc`
-1. In Connection Manager, configure **Use a specific user name and password**: set **AppID** as the username and **clientSecret** as the password.
-
-### Error message: "ADONET Source has failed to acquire the connection {GUID} with the following error message: Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'" when using a managed identity
-
-Make sure you don't configure the authentication method of Connection Manager as **Active Directory Password Authentication** when the parameter *ConnectUsingManagedIdentity* is **True**. You can configure it as **SQL Authentication** instead, which is ignored if *ConnectUsingManagedIdentity* is set.
-
-### Error message: "0xC020801F at ..., OData Source [...]: Cannot acquire a managed connection from the run-time connection manager"
-
-One potential cause is that Transport Layer Security (TLS) is not enabled in the SSIS integration runtime, which is required by your OData source. You can enable TLS in the SSIS integration runtime by using Customize setup. More detail can be found at [Can't connect Project Online Odata from SSIS](/office365/troubleshoot/cant-connect-project-online-odata-from-ssis) and [Customize setup for the Azure-SSIS integration runtime](how-to-configure-azure-ssis-ir-custom-setup.md).
-
-### Error message: "Request staging task with operation guid ... fail since error: Failed to dispatch staging operation with error message: Microsoft.SqlServer.IntegrationServices.AisAgentCore.AisAgentException: Failed to load data proxy."
-
-Make sure your Azure-SSIS integration runtime is configured with a Self-Hosted integration runtime as a proxy. More detail can be found at [Configure Self-Hosted IR as a proxy for Azure-SSIS IR in ADF](self-hosted-integration-runtime-proxy-ssis.md).
-
-### Error message: "Staging task status: Failed. Staging task error: ErrorCode: 2010, ErrorMessage: The Self-hosted Integration Runtime ... is offline"
-
-Make sure your Self-Hosted integration runtime is installed and started. More detail can be found at [Create and configure a self-hosted integration runtime](create-self-hosted-integration-runtime.md)
-
-### Error message: "Staging task error: ErrorCode: 2906, ErrorMessage: Package execution failed., Output: {"OperationErrorMessages": "Error: The requested OLE DB provider ... is not registered. If the 64-bit driver is not installed, run the package in 32-bit mode..."
-
-Make sure the providers used by the OLE DB connectors in your package are installed properly on the Self-Hosted integration runtime machine. More detail can be found at [Configure Self-Hosted IR as a proxy for Azure-SSIS IR in ADF](self-hosted-integration-runtime-proxy-ssis.md#prepare-the-self-hosted-ir).
-
-### Error message: "Staging task error: ErrorCode: 2906, ErrorMessage: Package execution failed., Output: {"OperationErrorMessages": "Error: System.IO.FileLoadException: Could not load file or assembly 'Microsoft.WindowsAzure.Storage, Version=..., Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference.'..."
-
-One potential cause is that your Self-Hosted integration runtime wasn't installed or upgraded properly. We suggest that you download and reinstall the latest Self-Hosted integration runtime. More detail can be found at [Create and configure a self-hosted integration runtime](create-self-hosted-integration-runtime.md#installation-best-practices).
-
-### Error message: "Staging task failed. TaskStatus: Failed, ErrorCode: 2906, ErrorMessage: Package execution failed. For more details, select the output of your activity run on the same row., Output: {"OperationErrorMessages": "4/14/2021 7:10:35 AM +00:00 : = Failed to start Named pipe proxy..."
-
-Check if security policies are correctly assigned to the account running self-hosted IR service. If Windows authentication is used on Execute SSIS Package activity or the execution credential is set in SSIS catalog (SSISDB), the same security policies must be assigned to the Windows account used. More detail can be found at [Configure Self-Hosted IR as a proxy for Azure-SSIS IR in ADF](self-hosted-integration-runtime-proxy-ssis.md#enable-windows-authentication-for-on-premises-tasks)
-
-### Error message: "A connection is required when requesting metadata. If you are working offline, uncheck Work Offline on the SSIS menu to enable the connection"
-
-* Potential cause & recommended action:
- * If there is also a warning message "The component does not support using connection manager with ConnectByProxy value setting true" in the execution log, it means a connection manager is used on a component that doesn't support "ConnectByProxy" yet. The supported components can be found at [Configure Self-Hosted IR as a proxy for Azure-SSIS IR in ADF](self-hosted-integration-runtime-proxy-ssis.md#enable-ssis-packages-to-use-a-proxy).
- * The execution log can be found in the [SSMS report](/sql/integration-services/performance/monitor-running-packages-and-other-operations#reports) or in the log folder you specified in the SSIS package execution activity.
- * A virtual network can also be used to access on-premises data as an alternative. More detail can be found at [Join an Azure-SSIS integration runtime to a virtual network](join-azure-ssis-integration-runtime-virtual-network.md).
-
-### Error message: "Staging task status: Failed. Staging task error: ErrorCode: 2906, ErrorMessage: Package execution failed., Output: {"OperationErrorMessages": "SSIS Executor exit code: -1.\n", "LogLocation": "...\\SSISTelemetry\\ExecutionLog\\...", "effectiveIntegrationRuntime": "...", "executionDuration": ..., "durationInQueue": { "integrationRuntimeQueue": ... }}"
-
-Make sure Visual C++ runtime is installed on Self-Hosted integration runtime machine. More detail can be found at [Configure Self-Hosted IR as a proxy for Azure-SSIS IR in ADF](self-hosted-integration-runtime-proxy-ssis.md#prepare-the-self-hosted-ir)
-
-### Multiple Package executions are triggered unexpectedly
-
-* Potential cause & recommended action:
- * An ADF stored procedure activity or Lookup activity is used to trigger the SSIS package execution. The T-SQL command may hit a transient issue and trigger a rerun, which would cause multiple package executions.
- * Use the Execute SSIS Package activity instead, which ensures the package execution won't rerun unless you set a retry count in the activity. Detail can be found at [https://docs.microsoft.com/azure/data-factory/how-to-invoke-ssis-package-ssis-activity](./how-to-invoke-ssis-package-ssis-activity.md)
- * Refine your T-SQL command so it can be rerun safely, by checking whether an execution has already been triggered.
-
-### Package execution takes too long
-
-Here are potential causes and recommended actions:
-
-* Too many package executions have been scheduled on the SSIS integration runtime. All these executions will be waiting in a queue for their turn.
- * Determine the maximum by using this formula:
-
- Max Parallel Execution Count per IR = Node Count * Max Parallel Execution per Node
- * To learn how to set the node count and maximum parallel execution per node, see [Create an Azure-SSIS integration runtime in Azure Data Factory](create-azure-ssis-integration-runtime.md).
-* The SSIS integration runtime is stopped or has an unhealthy status. To learn how to check the SSIS integration runtime status and errors, see [Azure-SSIS integration runtime](monitor-integration-runtime.md#azure-ssis-integration-runtime).
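The capacity formula above can be sketched in a few lines (a hedged illustration; the function name is ours, not an ADF API):

```python
# Hypothetical helper illustrating the formula:
# Max Parallel Execution Count per IR = Node Count * Max Parallel Execution per Node
def max_parallel_executions(node_count: int, max_parallel_per_node: int) -> int:
    return node_count * max_parallel_per_node

# Example: a 4-node Azure-SSIS IR with 8 parallel executions per node can run
# at most 32 packages concurrently; additional executions wait in the queue.
print(max_parallel_executions(4, 8))  # → 32
```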
-
-We also recommend that you set a timeout on the **General** tab:
- ![Set properties on the General tab](media/how-to-invoke-ssis-package-ssis-activity/ssis-activity-general.png).
-
-### Poor performance in package execution
-
-Try these actions:
-
-* Make sure the SSIS integration runtime is in the same region as the data source and destination.
-
-* Set the logging level of package execution to **Performance** to collect duration information for each component in the execution. For details, see [Integration Services (SSIS) logging](/sql/integration-services/performance/integration-services-ssis-logging).
-
-* Check IR node performance in the Azure portal:
- * For information about how to monitor the SSIS integration runtime, see [Azure-SSIS integration runtime](monitor-integration-runtime.md#azure-ssis-integration-runtime).
- * You can find CPU/memory history for the SSIS integration runtime by viewing the metrics of the data factory in the Azure portal.
- ![Monitor metrics of the SSIS integration runtime](media/ssis-integration-runtime-ssis-activity-faq/monitor-metrics-ssis-integration-runtime.png)
data-factory Data Factory Data Management Gateway High Availability Scalability https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/v1/data-factory-data-management-gateway-high-availability-scalability.md
This section assumes that you have gone through the following two articles or fa
![Data Management Gateway - installation successful](media/data-factory-data-management-gateway-high-availability-scalability/data-factory-gateway-installation-success.png) > [!NOTE]
- > If you are provisioning the gateway on an Azure VM, you can use [this Azure Resource Manager template](https://github.com/Azure/azure-quickstart-templates/tree/master/101-mutiple-vms-with-data-management-gateway). This script creates a logical gateway, sets up VMs with Data Management Gateway software installed, and registers them with the logical gateway.
+ > If you are provisioning the gateway on an Azure VM, you can use [this Azure Resource Manager template](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.datafactory/mutiple-vms-with-data-management-gateway). This script creates a logical gateway, sets up VMs with Data Management Gateway software installed, and registers them with the logical gateway.
6. In Azure portal, launch the **Gateway** page: 1. On the data factory home page in the portal, click **Linked Services**.
data-factory Data Factory How To Use Resource Manager Templates https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/v1/data-factory-how-to-use-resource-manager-templates.md
See the following tutorials for step-by-step instructions to create Data Factory
## Data Factory templates on GitHub Check out the following Azure quickstart templates on GitHub:
-* [Create a Data factory to copy data from Azure Blob Storage to Azure SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/101-data-factory-blob-to-sql-copy)
-* [Create a Data factory with Hive activity on Azure HDInsight cluster](https://github.com/Azure/azure-quickstart-templates/tree/master/101-data-factory-hive-transformation)
-* [Create a Data factory to copy data from Salesforce to Azure Blobs](https://github.com/Azure/azure-quickstart-templates/tree/master/101-data-factory-salesforce-to-blob-copy)
-* [Create a Data factory that chains activities: copies data from an FTP server to Azure Blobs, invokes a hive script on an on-demand HDInsight cluster to transform the data, and copies result into Azure SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/201-data-factory-ftp-hive-blob)
+* [Create a Data factory to copy data from Azure Blob Storage to Azure SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.datafactory/data-factory-blob-to-sql-copy)
+* [Create a Data factory with Hive activity on Azure HDInsight cluster](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.datafactory/data-factory-hive-transformation)
+* [Create a Data factory to copy data from Salesforce to Azure Blobs](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.datafactory/data-factory-salesforce-to-blob-copy)
+* [Create a Data factory that chains activities: copies data from an FTP server to Azure Blobs, invokes a hive script on an on-demand HDInsight cluster to transform the data, and copies result into Azure SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.databricks/data-factory-ftp-hive-blob)
-Feel free to share your Azure Data Factory templates at [Azure quickstart](https://azure.microsoft.com/documentation/templates/). Refer to the [contribution guide](https://github.com/Azure/azure-quickstart-templates/tree/master/1-CONTRIBUTION-GUIDE) while developing templates that can be shared via this repository.
+Feel free to share your Azure Data Factory templates at [Azure quickstart](https://azure.microsoft.com/resources/templates/). Refer to the [contribution guide](https://github.com/Azure/azure-quickstart-templates/tree/master/1-CONTRIBUTION-GUIDE) while developing templates that can be shared via this repository.
The following sections provide details about defining Data Factory resources in a Resource Manager template.
You define a data factory in the Resource Manager template as shown in the follo
"location": "East US" } ```
-The dataFactoryName is defined in “variables” as:
+The dataFactoryName is defined in "variables" as:
```JSON "dataFactoryName": "[concat('<myDataFactoryName>', uniqueString(resourceGroup().id))]",
The dataFactoryName is defined in ΓÇ£variablesΓÇ¥ as:
} ```
-See [Storage Linked Service](data-factory-azure-blob-connector.md#azure-storage-linked-service) or [Compute Linked Services](data-factory-compute-linked-services.md#azure-hdinsight-on-demand-linked-service) for details about the JSON properties for the specific linked service you wish to deploy. The “dependsOn” parameter specifies name of the corresponding data factory. An example of defining a linked service for Azure Storage is shown in the following JSON definition:
+See [Storage Linked Service](data-factory-azure-blob-connector.md#azure-storage-linked-service) or [Compute Linked Services](data-factory-compute-linked-services.md#azure-hdinsight-on-demand-linked-service) for details about the JSON properties for the specific linked service you wish to deploy. The "dependsOn" parameter specifies the name of the corresponding data factory. An example of defining a linked service for Azure Storage is shown in the following JSON definition:
### Define datasets
See [Storage Linked Service](data-factory-azure-blob-connector.md#azure-storage-
... } ```
-Refer to [Supported data stores](data-factory-data-movement-activities.md#supported-data-stores-and-formats) for details about the JSON properties for the specific dataset type you wish to deploy. Note the “dependsOn” parameter specifies name of the corresponding data factory and storage linked service. An example of defining dataset type of Azure blob storage is shown in the following JSON definition:
+Refer to [Supported data stores](data-factory-data-movement-activities.md#supported-data-stores-and-formats) for details about the JSON properties for the specific dataset type you wish to deploy. Note that the "dependsOn" parameter specifies the name of the corresponding data factory and storage linked service. An example of defining a dataset of type Azure Blob storage is shown in the following JSON definition:
```JSON "type": "datasets",
Refer to [Supported data stores](data-factory-data-movement-activities.md#suppor
} ```
-Refer to [defining pipelines](data-factory-create-pipelines.md#pipeline-json) for details about the JSON properties for defining the specific pipeline and activities you wish to deploy. Note the “dependsOn” parameter specifies name of the data factory, and any corresponding linked services or datasets. An example of a pipeline that copies data from Azure Blob Storage to Azure SQL Database is shown in the following JSON snippet:
+Refer to [defining pipelines](data-factory-create-pipelines.md#pipeline-json) for details about the JSON properties for defining the specific pipeline and activities you wish to deploy. Note that the "dependsOn" parameter specifies the name of the data factory and any corresponding linked services or datasets. An example of a pipeline that copies data from Azure Blob Storage to Azure SQL Database is shown in the following JSON snippet:
```JSON "type": "datapipelines",
data-factory Data Factory Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/v1/data-factory-samples.md
You can find the following Azure Resource Manager templates for Data Factory on
| Template | Description | | | |
-| [Copy from Azure Blob Storage to Azure SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/101-data-factory-blob-to-sql-copy) |Deploying this template creates an Azure data factory with a pipeline that copies data from the specified Azure blob storage to Azure SQL Database |
-| [Copy from Salesforce to Azure Blob Storage](https://github.com/Azure/azure-quickstart-templates/tree/master/101-data-factory-salesforce-to-blob-copy) |Deploying this template creates an Azure data factory with a pipeline that copies data from the specified Salesforce account to the Azure blob storage. |
-| [Transform data by running Hive script on an Azure HDInsight cluster](https://github.com/Azure/azure-quickstart-templates/tree/master/101-data-factory-hive-transformation) |Deploying this template creates an Azure data factory with a pipeline that transforms data by running the sample Hive script on an Azure HDInsight Hadoop cluster. |
+| [Copy from Azure Blob Storage to Azure SQL Database](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.datafactory/data-factory-blob-to-sql-copy) |Deploying this template creates an Azure data factory with a pipeline that copies data from the specified Azure blob storage to Azure SQL Database |
+| [Copy from Salesforce to Azure Blob Storage](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.datafactory/data-factory-salesforce-to-blob-copy) |Deploying this template creates an Azure data factory with a pipeline that copies data from the specified Salesforce account to the Azure blob storage. |
+| [Transform data by running Hive script on an Azure HDInsight cluster](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.datafactory/data-factory-hive-transformation) |Deploying this template creates an Azure data factory with a pipeline that transforms data by running the sample Hive script on an Azure HDInsight Hadoop cluster. |
## Samples in Azure portal You can use the **Sample pipelines** tile on the home page of your data factory to deploy sample pipelines and their associated entities (datasets and linked services) in to your data factory.
dms Resource Scenario Status https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dms/resource-scenario-status.md
The following table shows Azure Database Migration Service support for online mi
| **Azure SQL VM** | SQL Server | X | |
| | Oracle | X | |
| **Azure Cosmos DB** | MongoDB | ✓ | GA |
-| **Azure DB for MySQL** | MySQL | ✓ | GA |
-| | RDS MySQL | ✓ | GA |
+| **Azure DB for MySQL** | MySQL | X | |
+| | RDS MySQL | X | |
| **Azure DB for PostgreSQL - Single server** | PostgreSQL | ✓ | GA |
| | Azure DB for PostgreSQL - Single server* | ✓ | GA |
| | RDS PostgreSQL | ✓ | GA |
dms Tutorial Azure Postgresql To Azure Postgresql Online Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dms/tutorial-azure-postgresql-to-azure-postgresql-online-portal.md
To complete this tutorial, you need to:
* Check [status of migration scenarios supported by Azure Database Migration Service](./resource-scenario-status.md) for supported migration and version combinations. * An existing [Azure Database for PostgreSQL](../postgresql/index.yml) version 10 and later instance with the **DVD Rental** database.
- Also note that the target Azure Database for PostgreSQL version must be equal to or later than the on-premises PostgreSQL version. For example, PostgreSQL 10 can migrate to Azure Database for PostgreSQL 10, or 11, but not to Azure Database for PostgreSQL 9.6.
+ Also note that the target Azure Database for PostgreSQL version must be equal to or later than the on-premises PostgreSQL version. For example, PostgreSQL 10 can migrate to Azure Database for PostgreSQL 10, or 11, but not to Azure Database for PostgreSQL 9.6. Migrations to PostgreSQL 13+ are not supported at this time.
* [Create an Azure Database for PostgreSQL server](../postgresql/quickstart-create-server-database-portal.md) or [Create an Azure Database for PostgreSQL - Hyperscale (Citus) server](../postgresql/quickstart-create-hyperscale-portal.md) as the target database server to migrate data into. * Create a Microsoft Azure Virtual Network for Azure Database Migration Service by using the Azure Resource Manager deployment model. For more information about creating a virtual network, see the [Virtual Network Documentation](../virtual-network/index.yml), and especially the quickstart articles with step-by-step details.
After the initial Full load is completed, the databases are marked **Ready to cu
* For information about known issues and limitations when performing online migrations to Azure Database for PostgreSQL, see the article [Known issues and workarounds with Azure Database for PostgreSQL online migrations](known-issues-azure-postgresql-online.md). * For information about the Azure Database Migration Service, see the article [What is the Azure Database Migration Service?](./dms-overview.md).
-* For information about Azure Database for PostgreSQL, see the article [What is Azure Database for PostgreSQL?](../postgresql/overview.md).
+* For information about Azure Database for PostgreSQL, see the article [What is Azure Database for PostgreSQL?](../postgresql/overview.md).
dms Tutorial Postgresql Azure Postgresql Online Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dms/tutorial-postgresql-azure-postgresql-online-portal.md
In this tutorial, you learn how to:
To complete this tutorial, you need to:
-* Download and install [PostgreSQL community edition](https://www.postgresql.org/download/) 9.4, 9.5, 9.6, or 10. The source PostgreSQL Server version must be 9.4, 9.5, 9.6, 10, or 11. For more information, see the article [Supported PostgreSQL Database Versions](../postgresql/concepts-supported-versions.md).
+* Download and install [PostgreSQL community edition](https://www.postgresql.org/download/) 9.4, 9.5, 9.6, or 10. The source PostgreSQL Server version must be 9.4, 9.5, 9.6, 10, 11, or 12. For more information, see the article [Supported PostgreSQL Database Versions](../postgresql/concepts-supported-versions.md).
- Also note that the target Azure Database for PostgreSQL version must be equal to or later than the on-premises PostgreSQL version. For example, PostgreSQL 9.6 can migrate to Azure Database for PostgreSQL 9.6, 10, or 11, but not to Azure Database for PostgreSQL 9.5.
+ Also note that the target Azure Database for PostgreSQL version must be equal to or later than the on-premises PostgreSQL version. For example, PostgreSQL 9.6 can migrate to Azure Database for PostgreSQL 9.6, 10, or 11, but not to Azure Database for PostgreSQL 9.5. Migrations to PostgreSQL 13+ are not supported at this time.
* [Create an Azure Database for PostgreSQL server](../postgresql/quickstart-create-server-database-portal.md) or [Create an Azure Database for PostgreSQL - Hyperscale (Citus) server](../postgresql/quickstart-create-hyperscale-portal.md). * Create a Microsoft Azure Virtual Network for Azure Database Migration Service by using the Azure Resource Manager deployment model, which provides site-to-site connectivity to your on-premises source servers by using either [ExpressRoute](../expressroute/expressroute-introduction.md) or [VPN](../vpn-gateway/vpn-gateway-about-vpngateways.md). For more information about creating a virtual network, see the [Virtual Network Documentation](../virtual-network/index.yml), and especially the quickstart articles with step-by-step details.
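The version-compatibility rule above can be sketched as a quick check (an illustrative sketch, not an official DMS API; versions are compared as `(major, minor)` tuples):

```python
# Hedged sketch of the rule above: the target Azure Database for PostgreSQL
# version must be equal to or later than the source version, and targets
# 13 and later are not supported at this time.
def migration_supported(source: tuple, target: tuple) -> bool:
    return target >= source and target[0] < 13

print(migration_supported((9, 6), (11, 0)))   # 9.6 -> 11 is allowed
print(migration_supported((10, 0), (9, 6)))   # cannot migrate to an older version
print(migration_supported((12, 0), (13, 0)))  # 13+ targets not supported
```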
After the initial Full load is completed, the databases are marked **Ready to cu
* For information about known issues and limitations when performing online migrations to Azure Database for PostgreSQL, see the article [Known issues and workarounds with Azure Database for PostgreSQL online migrations](known-issues-azure-postgresql-online.md). * For information about the Azure Database Migration Service, see the article [What is the Azure Database Migration Service?](./dms-overview.md).
-* For information about Azure Database for PostgreSQL, see the article [What is Azure Database for PostgreSQL?](../postgresql/overview.md).
+* For information about Azure Database for PostgreSQL, see the article [What is Azure Database for PostgreSQL?](../postgresql/overview.md).
event-grid Event Schema Farmbeats https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/event-schema-farmbeats.md
Title: Azure FarmBeats as Event Grid source
+ Title: Azure FarmBeats as Event Grid source (Preview)
description: Describes the properties and schema provided for Azure FarmBeats events with Azure Event Grid Last updated 06/06/2021
-# Azure FarmBeats as Event Grid source
+# Azure FarmBeats as Event Grid source (Preview)
This article provides the properties and schema for Azure FarmBeats events. For an introduction to event schemas, see [Azure Event Grid event schema](event-schema.md). ## Available event types
event-hubs Event Hubs Resource Manager Namespace Event Hub Enable Capture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-hubs/event-hubs-resource-manager-namespace-event-hub-enable-capture.md
You can learn more about Event Hubs by visiting the following links:
* [Event Hubs FAQ](event-hubs-faq.yml) [Authoring Azure Resource Manager templates]: ../azure-resource-manager/templates/syntax.md
-[Azure Quickstart Templates]: https://azure.microsoft.com/documentation/templates/?term=event+hubs
+[Azure Quickstart Templates]: https://azure.microsoft.com/resources/templates/?term=event+hubs
[Azure Resources naming conventions]: /azure/cloud-adoption-framework/ready/azure-best-practices/naming-and-tagging [Event hub and enable Capture to Storage template]: https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.eventhub/eventhubs-create-namespace-and-enable-capture
-[Event hub and enable Capture to Azure Data Lake Store template]: https://github.com/Azure/azure-quickstart-templates/tree/master/201-eventhubs-create-namespace-and-enable-capture-for-adls
+[Event hub and enable Capture to Azure Data Lake Store template]: https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.eventhub/eventhubs-create-namespace-and-enable-capture-for-adls
event-hubs Event Hubs Resource Manager Namespace Event Hub https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-hubs/event-hubs-resource-manager-namespace-event-hub.md
In this article, you created an Event Hubs namespace, and an event hub in the na
[5]: ./media/event-hubs-quickstart-powershell/metrics.png [Authoring Azure Resource Manager templates]: ../azure-resource-manager/templates/syntax.md
-[Azure Quickstart Templates]: https://azure.microsoft.com/documentation/templates/?term=event+hubs
+[Azure Quickstart Templates]: https://azure.microsoft.com/resources/templates/?term=event+hubs
[Using Azure PowerShell with Azure Resource Manager]: ../azure-resource-manager/management/manage-resources-powershell.md [Using the Azure CLI for Mac, Linux, and Windows with Azure Resource Management]: ../azure-resource-manager/management/manage-resources-cli.md
-[Event hub and consumer group template]: https://github.com/Azure/azure-quickstart-templates/blob/master/201-event-hubs-create-event-hub-and-consumer-group/
+[Event hub and consumer group template]: https://github.com/Azure/azure-quickstart-templates/blob/master/quickstarts/microsoft.eventhub/event-hubs-create-event-hub-and-consumer-group/
event-hubs Move Across Regions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-hubs/move-across-regions.md
If you have other resources in the Azure resource group that contains the Event
- Ensure that the services and features that your account uses are supported in the target region. - If you have **capture feature** enabled for event hubs in the namespace, move [Azure Storage or Azure Data Lake Store Gen 2](../storage/common/storage-account-move.md) or [Azure Data Lake Store Gen 1](../data-lake-store/data-lake-store-migration-cross-region.md) accounts before moving the Event Hubs namespace. You can also move the resource group that contains both Storage and Event Hubs namespaces to the other region by following steps similar to the ones described in this article. -- If the Event Hubs namespace is in an **Event Hubs cluster**, [move the dedicated cluster](move-cluster-across-regions.md) to the **target region** before you go through steps in this article. You can also use the [quickstart template on GitHub](https://github.com/Azure/azure-quickstart-templates/tree/master/201-eventhubs-create-cluster-namespace-eventhub/) to create an Event Hubs cluster. In the template, remove the namespace portion of the JSON to create only the cluster.
+- If the Event Hubs namespace is in an **Event Hubs cluster**, [move the dedicated cluster](move-cluster-across-regions.md) to the **target region** before you go through steps in this article. You can also use the [quickstart template on GitHub](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.eventhub/eventhubs-create-cluster-namespace-eventhub/) to create an Event Hubs cluster. In the template, remove the namespace portion of the JSON to create only the cluster.
## Prepare To get started, export a Resource Manager template. This template contains settings that describe your Event Hubs namespace.
expressroute Expressroute Migration Classic Resource Manager https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/expressroute/expressroute-migration-classic-resource-manager.md
This article explains how to migrate ExpressRoute-associated virtual networks fr
* [Platform-supported migration of IaaS resources from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-overview.md) * [Technical deep dive on platform-supported migration from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-deep-dive.md)
- * [FAQs: Platform-supported migration of IaaS resources from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-faq.md)
+ * [FAQs: Platform-supported migration of IaaS resources from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-faq.yml)
* [Review most common migration errors and mitigations](../virtual-machines/migration-classic-resource-manager-errors.md?toc=%2fazure%2fvirtual-machines%2fwindows%2ftoc.json) ## Supported and unsupported scenarios
This section describes the steps to be followed to migrate a virtual network, ga
## Next steps * [Platform-supported migration of IaaS resources from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-overview.md) * [Technical deep dive on platform-supported migration from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-deep-dive.md)
-* [FAQs: Platform-supported migration of IaaS resources from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-faq.md)
+* [FAQs: Platform-supported migration of IaaS resources from classic to Azure Resource Manager](../virtual-machines/migration-classic-resource-manager-faq.yml)
* [Review most common migration errors and mitigations](../virtual-machines/migration-classic-resource-manager-errors.md?toc=%2fazure%2fvirtual-machines%2fwindows%2ftoc.json)
frontdoor Front Door Quickstart Template Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/front-door-quickstart-template-samples.md
The following table includes links to Azure Resource Manager deployment model te
| | | | [Create a basic Front Door](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-create-basic)| Creates a basic Front Door configuration with a single backend. | | [Create a Front Door with multiple backends and backend pools and URL based routing](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-create-multiple-backends)| Creates a Front Door with load balancing configured for multiple backends in a backend pool and also across backend pools based on URL path. |
-| [Onboard a custom domain with Front Door](https://github.com/Azure/azure-quickstart-templates/tree/master/101-front-door-custom-domain)| Add a custom domain to your Front Door. |
+| [Onboard a custom domain with Front Door](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-custom-domain)| Add a custom domain to your Front Door. |
| [Create Front Door with geo filtering](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-geo-filtering)| Create a Front Door that allows/blocks traffic from certain countries/regions. |
-| [Control Health Probes for your backends on Front Door](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-health-probes)| Update your Front Door to change the health probe settings by updating the probe path and also the intervals in which the probes will be sent. |
-| [Create Front Door with Active/Standby backend configuration](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-priority-lb)| Creates a Front Door that demonstrates priority-based routing for Active/Standby application topology, that is, by default send all traffic to the primary (highest-priority) backend until it becomes unavailable. |
-| [Create Front Door with caching enabled for certain routes](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-create-caching)| Creates a Front Door with caching enabled for the defined routing configuration thus caching any static assets for your workload. |
-| [Configure Session Affinity for your Front Door host names](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-session-affinity) | Updates a Front Door to enable session affinity for your frontend host, thereby, sending subsequent traffic from the same user session to the same backend. |
+| [Control Health Probes for your backends on Front Door](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-health-probes)| Update your Front Door to change the health probe settings by updating the probe path and also the intervals in which the probes will be sent. |
+| [Create Front Door with Active/Standby backend configuration](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-priority-lb)| Creates a Front Door that demonstrates priority-based routing for Active/Standby application topology, that is, by default send all traffic to the primary (highest-priority) backend until it becomes unavailable. |
+| [Create Front Door with caching enabled for certain routes](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-create-caching)| Creates a Front Door with caching enabled for the defined routing configuration thus caching any static assets for your workload. |
+| [Configure Session Affinity for your Front Door host names](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-session-affinity) | Updates a Front Door to enable session affinity for your frontend host, thereby, sending subsequent traffic from the same user session to the same backend. |
| [Configure Front Door for client IP allowlisting or blocklisting](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-waf-clientip)| Configures a Front Door to restrict traffic from certain client IPs by using custom access control rules. | | [Configure Front Door to take action with specific HTTP parameters](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-waf-http-params)| Configures a Front Door to allow or block certain traffic based on the HTTP parameters in the incoming request by using custom access control rules. | | [Configure Front Door rate limiting](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-rate-limiting)| Configures a Front Door to rate limit incoming traffic for a given frontend host. |
frontdoor Resource Manager Template Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/frontdoor/standard-premium/resource-manager-template-samples.md
The following table includes links to Azure Resource Manager templates for Azure
| [WAF policy with rate limit](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-rate-limit/) | Creates a Front Door profile and WAF with a custom rule to perform rate limiting. | | [WAF policy with geo-filtering](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-geo-filtering/) | Creates a Front Door profile and WAF with a custom rule to perform geo-filtering. | |**App Service origins**| **Description** |
-| [App Service](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-app-service-public) | Creates an App Service app with a public endpoint, and a Front Door profile. |
-| [App Service with Private Link](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-premium-app-service-private-link) | Creates an App Service app with a private endpoint, and a Front Door profile. |
+| [App Service](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-standard-premium-app-service-public) | Creates an App Service app with a public endpoint, and a Front Door profile. |
+| [App Service with Private Link](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-premium-app-service-private-link) | Creates an App Service app with a private endpoint, and a Front Door profile. |
|**Azure Functions origins**| **Description** |
-| [Azure Functions](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-function-public/) | Creates an Azure Functions app with a public endpoint, and a Front Door profile. |
-| [Azure Functions with Private Link](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-premium-function-private-link) | Creates an Azure Functions app with a private endpoint, and a Front Door profile. |
+| [Azure Functions](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-standard-premium-function-public/) | Creates an Azure Functions app with a public endpoint, and a Front Door profile. |
+| [Azure Functions with Private Link](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-premium-function-private-link) | Creates an Azure Functions app with a private endpoint, and a Front Door profile. |
|**API Management origins**| **Description** |
-| [API Management (external)](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-api-management-external) | Creates an API Management instance with external VNet integration, and a Front Door profile. |
+| [API Management (external)](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-standard-premium-api-management-external) | Creates an API Management instance with external VNet integration, and a Front Door profile. |
|**Storage origins**| **Description** |
-| [Storage static website](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-storage-static-website) | Creates an Azure Storage account and static website with a public endpoint, and a Front Door profile. |
-| [Storage blobs with Private Link](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-premium-storage-blobs-private-link) | Creates an Azure Storage account and blob container with a private endpoint, and a Front Door profile. |
+| [Storage static website](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-standard-premium-storage-static-website) | Creates an Azure Storage account and static website with a public endpoint, and a Front Door profile. |
+| [Storage blobs with Private Link](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-premium-storage-blobs-private-link) | Creates an Azure Storage account and blob container with a private endpoint, and a Front Door profile. |
|**Application Gateway origins**| **Description** |
-| [Application Gateway](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-standard-premium-application-gateway-public) | Creates an Application Gateway, and a Front Door profile. |
+| [Application Gateway](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-standard-premium-application-gateway-public) | Creates an Application Gateway, and a Front Door profile. |
|**Virtual machine origins**| **Description** |
-| [Virtual machine with Private Link service](https://github.com/Azure/azure-quickstart-templates/tree/master/201-front-door-premium-vm-private-link) | Creates a virtual machine and Private Link service, and a Front Door profile. |
+| [Virtual machine with Private Link service](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.network/front-door-premium-vm-private-link) | Creates a virtual machine and Private Link service, and a Front Door profile. |
| | |
governance Guest Configuration Create Linux https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/how-to/guest-configuration-create-linux.md
GuestConfiguration agent expects the certificate public key to be present in the
`/usr/local/share/ca-certificates/extra` on Linux machines. For the node to verify signed content, install the certificate public key on the machine before applying the custom policy. This process can be done using any technique inside the VM, or by using Azure Policy. An example template is
-[provided here](https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-push-certificate-windows).
+[provided here](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.compute/vm-push-certificate-windows).
The Key Vault access policy must allow the Compute resource provider to access certificates during deployments. For detailed steps, see [Set up Key Vault for virtual machines in Azure Resource Manager](../../../virtual-machines/windows/key-vault-setup.md#use-templates-to-set-up-key-vault).
governance Guest Configuration Create https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/how-to/guest-configuration-create.md
Certificate Authorities" on Windows machines and in the path
`/usr/local/share/ca-certificates/extra` on Linux machines. For the node to verify signed content, install the certificate public key on the machine before applying the custom policy. This process can be done using any technique inside the VM or by using Azure Policy. An example template is
-[provided here](https://github.com/Azure/azure-quickstart-templates/tree/master/201-vm-push-certificate-windows).
+[provided here](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts/microsoft.compute/vm-push-certificate-windows).
The Key Vault access policy must allow the Compute resource provider to access certificates during deployments. For detailed steps, see [Set up Key Vault for virtual machines in Azure Resource Manager](../../../virtual-machines/windows/key-vault-setup.md#use-templates-to-set-up-key-vault).
governance Work With Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/concepts/work-with-data.md
az graph query -q "Resources | project name | order by name asc" --first 200 --o
Search-AzGraph -Query "Resources | project name | order by name asc" -First 200 ```
-In the [REST API](/rest/api/azureresourcegraph/resourcegraph(2019-04-01)/resources/resources), the
+In the [REST API](/rest/api/azureresourcegraph/resourcegraph(2021-03-01)/resources/resources), the
control is **$top** and is part of **QueryRequestOptions**. The control that is _most restrictive_ will win. For example, if your query uses the **top** or
az graph query -q "Resources | project name | order by name asc" --skip 10 --out
Search-AzGraph -Query "Resources | project name | order by name asc" -Skip 10 ```
-In the [REST API](/rest/api/azureresourcegraph/resourcegraph(2019-04-01)/resources/resources), the
+In the [REST API](/rest/api/azureresourcegraph/resourcegraph(2021-03-01)/resources/resources), the
control is **$skip** and is part of **QueryRequestOptions**. ## Paging results When it's necessary to break a result set into smaller sets of records for processing or because a result set would exceed the maximum allowed value of _1000_ returned records, use paging. The
-[REST API](/rest/api/azureresourcegraph/resourcegraph(2019-04-01)/resources/resources)
+[REST API](/rest/api/azureresourcegraph/resourcegraph(2021-03-01)/resources/resources)
**QueryResponse** provides values to indicate whether a result set has been broken up: **resultTruncated** and **$skipToken**. **resultTruncated** is a Boolean value that informs the consumer whether there are more records not returned in the response. This condition can also be
Search-AzGraph -Query "Resources | project id, name | order by id asc" -First 10
> the query, the response won't include the **$skipToken**. For an example, see
-[Next page query](/rest/api/azureresourcegraph/resourcegraph(2019-04-01)/resources/resources#next-page-query)
+[Next page query](/rest/api/azureresourcegraph/resourcegraph(2021-03-01)/resources/resources#next-page-query)
in the REST API docs. ## Formatting results
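The paging flow just described can be sketched in Python: request a page, collect its records, and keep following **$skipToken** until the response no longer returns one. Note this is an illustrative sketch only — `query_page` here is a hypothetical stand-in for an authenticated POST to the Resource Graph resources endpoint, and the index-based token simulates the opaque **$skipToken** a real response would carry.

```python
# Sketch of Azure Resource Graph paging: keep requesting pages until no
# $skipToken is returned. query_page simulates the REST call; replace it
# with an authenticated POST to the resources endpoint in practice.

FAKE_RESULTS = [{"name": f"vm-{i}"} for i in range(12)]
PAGE_SIZE = 5  # corresponds to $top in QueryRequestOptions

def query_page(skip_token=None):
    """Simulated QueryResponse: one page of data plus paging metadata."""
    start = int(skip_token) if skip_token else 0
    page = FAKE_RESULTS[start:start + PAGE_SIZE]
    next_start = start + PAGE_SIZE
    truncated = next_start < len(FAKE_RESULTS)
    return {
        "data": page,
        "resultTruncated": "true" if truncated else "false",
        # Real responses return an opaque token; an index works for the sketch.
        "$skipToken": str(next_start) if truncated else None,
    }

def fetch_all():
    """Follow $skipToken across pages and accumulate all records."""
    records, token = [], None
    while True:
        response = query_page(token)
        records.extend(response["data"])
        token = response.get("$skipToken")
        if not token:  # no token means the final page was returned
            break
    return records

print(len(fetch_all()))  # 12
```

Because the real **$skipToken** is opaque, a client should pass it back verbatim rather than inspect or construct it, and stop paging as soon as it is absent from the response.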
governance First Query Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/first-query-rest-api.md
the results returned are consistent and ordered by the **Name** property, but st
top five results. For more examples of REST API calls for Azure Resource Graph, see the
-[Azure Resource Graph REST Examples](/rest/api/azureresourcegraph/resourcegraph(2019-04-01)/resources/resources#examples).
+[Azure Resource Graph REST Examples](/rest/api/azureresourcegraph/resourcegraph(2021-03-01)/resources/resources#examples).
## Clean up resources
hdinsight Apache Ambari Usage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/apache-ambari-usage.md
Property Value: environment|env|dl_data_dt
* [Manage HDInsight clusters by using the Apache Ambari Web UI](hdinsight-hadoop-manage-ambari.md) * [Manage HDInsight clusters by using the Apache Ambari REST API](hdinsight-hadoop-manage-ambari-rest-api.md)
hdinsight Domain Joined Authentication Issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/domain-joined/domain-joined-authentication-issues.md
This error occurs intermittently when users try to access the ADLS Gen2 using AC
## Next steps
hdinsight Troubleshoot Domainnotfound https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/domain-joined/troubleshoot-domainnotfound.md
When the domain joined clusters are deployed, HDI creates an internal user name
## Next steps
hdinsight Apache Ambari Troubleshoot Directory Alerts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-ambari-troubleshoot-directory-alerts.md
Manually create missing directories on the affected worker node(s).
## Next steps
hdinsight Apache Ambari Troubleshoot Down Hosts Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-ambari-troubleshoot-down-hosts-services.md
Usually rebooting the active headnode will mitigate this issue. If not please co
## Next steps
hdinsight Apache Ambari Troubleshoot Fivezerotwo Error https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-ambari-troubleshoot-fivezerotwo-error.md
Error Processing URI: /api/v1/clusters/xxxxxx/host_components - (java.lang.OutOf
## Next steps
hdinsight Apache Ambari Troubleshoot Heartbeat Issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-ambari-troubleshoot-heartbeat-issues.md
OMS logs are causing high CPU utilization.
## Next steps
hdinsight Apache Hadoop Connect Excel Hive Odbc Driver https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-hadoop-connect-excel-hive-odbc-driver.md
Last updated 04/22/2020
# Connect Excel to Apache Hadoop in Azure HDInsight with the Microsoft Hive ODBC driver Microsoft's Big Data solution integrates Microsoft Business Intelligence (BI) components with Apache Hadoop clusters deployed in HDInsight. An example is the ability to connect Excel to the Hive data warehouse of a Hadoop cluster. Connect using the Microsoft Hive Open Database Connectivity (ODBC) Driver.
hdinsight Apache Hadoop Connect Hive Jdbc Driver https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-hadoop-connect-hive-jdbc-driver.md
Last updated 04/20/2020
# Query Apache Hive through the JDBC driver in HDInsight Learn how to use the JDBC driver from a Java application to submit Apache Hive queries to Apache Hadoop in Azure HDInsight. The information in this document demonstrates how to connect programmatically, and from the SQuirreL SQL client.
hdinsight Apache Hadoop Deep Dive Advanced Analytics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-hadoop-deep-dive-advanced-analytics.md
Along with selecting the best-fitting algorithm(s), you need to consider whether
HDInsight has several machine learning options for an advanced analytics workflow: * Machine Learning and Apache Spark
-* R and ML Services
* Azure Machine Learning and Apache Hive * Apache Spark and Deep learning
There are three scalable machine learning libraries that bring algorithmic model
* [**SparkML**](https://spark.apache.org/docs/1.2.2/ml-guide.html) - SparkML is a newer package that provides a higher-level API built on top of Spark DataFrames for constructing ML pipelines. * [**MMLSpark**](https://github.com/Azure/mmlspark) - The Microsoft Machine Learning library for Apache Spark (MMLSpark) is designed to make data scientists more productive on Spark, to increase the rate of experimentation, and to leverage cutting-edge machine learning techniques, including deep learning, on very large datasets. The MMLSpark library simplifies common modeling tasks for building models in PySpark.
-### R and ML Services
-
-As part of HDInsight, you can create an HDInsight cluster with [ML Services](../r-server/r-server-overview.md) ready to be used with massive datasets and models. This new capability provides data scientists and statisticians with a familiar R interface that can scale on-demand through HDInsight, without the overhead of cluster setup and maintenance.
- ### Azure Machine Learning and Apache Hive [Azure Machine Learning Studio (classic)](https://studio.azureml.net/) provides tools to model predictive analytics, and a fully managed service you can use to deploy your predictive models as ready-to-consume web services. Azure Machine Learning provides tools for creating complete predictive analytics solutions in the cloud to quickly create, test, operationalize, and manage predictive models. Select from a large algorithm library, use a web-based studio for building models, and easily deploy your model as a web service.
hdinsight Apache Hadoop On Premises Migration Best Practices Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-hadoop-on-premises-migration-best-practices-architecture.md
Azure HDInsight clusters are designed for a specific type of compute usage. Beca
|IoT / Streaming|Kafka, Storm, Spark| |NoSQL Transactional processing|HBase| |Interactive and Faster queries with in-memory caching|Interactive Query|
-|Data Science|ML Services, Spark|
+|Data Science|Spark|
The following table shows the different methods that can be used to create an HDInsight cluster.
hdinsight Apache Hadoop Run Samples Linux https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/hdinsight/hadoop/apache-hadoop-run-samples-linux.md
Last updated 12/12/2019
# Run the MapReduce examples included in HDInsight Learn how to run the MapReduce examples included with Apache Hadoop on HDInsight.