Updates from: 03/09/2021 04:06:16
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Add Password Reset Policy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/add-password-reset-policy.md
Previously updated : 03/02/2021 Last updated : 03/08/2021 zone_pivot_groups: b2c-policy-type
In your user journey, you can represent the Forgot Password sub journey as a **C
```xml
<ClaimsExchange Id="ForgotPasswordExchange" TechnicalProfileReferenceId="ForgotPassword" />
```
+
+1. Add the following orchestration step between the current step and the next step. The new orchestration step checks whether the `isForgotPassword` claim exists. If the claim exists, it invokes the [password reset sub journey](#add-the-password-reset-sub-journey).
+
+ ```xml
+ <OrchestrationStep Order="3" Type="InvokeSubJourney">
+ <Preconditions>
+ <Precondition Type="ClaimsExist" ExecuteActionsIf="false">
+ <Value>isForgotPassword</Value>
+ <Action>SkipThisOrchestrationStep</Action>
+ </Precondition>
+ </Preconditions>
+ <JourneyList>
+ <Candidate SubJourneyReferenceId="PasswordReset" />
+ </JourneyList>
+ </OrchestrationStep>
+ ```
+
+1. After you add the new orchestration step, renumber the steps sequentially without skipping any integers from 1 to N.
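To illustrate the renumbering, here is a minimal sketch of a journey after the sub journey invocation has been inserted as step 3. The surrounding step types and IDs are illustrative placeholders, not taken from a shipped starter pack; only the `InvokeSubJourney` step mirrors the snippet above:

```xml
<UserJourney Id="SignUpOrSignIn">
  <OrchestrationSteps>
    <OrchestrationStep Order="1" Type="CombinedSignInAndSignUp" ContentDefinitionReferenceId="api.signuporsignin">
      <!-- Sign-in or sign-up page (placeholder) -->
    </OrchestrationStep>
    <OrchestrationStep Order="2" Type="ClaimsExchange">
      <!-- Placeholder claims exchange -->
    </OrchestrationStep>
    <!-- Newly inserted step: invoke the password reset sub journey -->
    <OrchestrationStep Order="3" Type="InvokeSubJourney">
      <JourneyList>
        <Candidate SubJourneyReferenceId="PasswordReset" />
      </JourneyList>
    </OrchestrationStep>
    <!-- Formerly steps 3 and 4, renumbered to 4 and 5 -->
    <OrchestrationStep Order="4" Type="ClaimsExchange">
      <!-- Placeholder claims exchange -->
    </OrchestrationStep>
    <OrchestrationStep Order="5" Type="SendClaims" CpimIssuerTechnicalProfileReferenceId="JwtIssuer" />
  </OrchestrationSteps>
</UserJourney>
```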
### Set the user journey to be executed
In the following diagram:
1. The user selects the **Forgot your password?** link. Azure AD B2C returns the AADB2C90118 error code to the application.
1. The application handles the error code and initiates a new authorization request. The authorization request specifies the password reset policy name, such as **B2C_1_pwd_reset**.
-![Password reset flow](./media/add-password-reset-policy/password-reset-flow-legacy.png)
+![Legacy password reset user flow](./media/add-password-reset-policy/password-reset-flow-legacy.png)
To see an example, take a look at a [simple ASP.NET sample](https://github.com/AzureADQuickStarts/B2C-WebApp-OpenIDConnect-DotNet-SUSI), which demonstrates the linking of user flows.
active-directory-b2c Claimsproviders https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/claimsproviders.md
Previously updated : 01/29/2020 Last updated : 03/08/2021
[!INCLUDE [active-directory-b2c-advanced-audience-warning](../../includes/active-directory-b2c-advanced-audience-warning.md)]
-A claims provider contains a set of [technical profiles](technicalprofiles.md). Every claims provider must have one or more technical profiles that determine the endpoints and the protocols needed to communicate with the claims provider. A claims provider can have multiple technical profiles. For example, multiple technical profiles may be defined because the claims provider supports multiple protocols, various endpoints with different capabilities, or releases different claims at different assurance levels. It may be acceptable to release sensitive claims in one user journey, but not in another.
+A claims provider is an interface to communicate with different types of parties via its [technical profiles](technicalprofiles.md). Every claims provider must have one or more technical profiles that determine the endpoints and the protocols needed to communicate with the claims provider. A claims provider can have multiple technical profiles. For example, multiple technical profiles may be defined because the claims provider supports multiple protocols, various endpoints with different capabilities, or releases different claims at different assurance levels. It may be acceptable to release sensitive claims in one user journey, but not in another.
+
+A user journey combines calling technical profiles via orchestration steps to define your business logic.
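As a sketch of that structure (the provider name, technical profile ID, and metadata URL are illustrative placeholders), a claims provider groups its technical profiles like this:

```xml
<ClaimsProvider>
  <DisplayName>Contoso</DisplayName>
  <TechnicalProfiles>
    <!-- One technical profile per protocol/endpoint combination -->
    <TechnicalProfile Id="Contoso-OpenIdConnect">
      <DisplayName>Contoso employee account</DisplayName>
      <Protocol Name="OpenIdConnect" />
      <Metadata>
        <!-- Placeholder endpoint; use your provider's metadata document -->
        <Item Key="METADATA">https://login.contoso.com/.well-known/openid-configuration</Item>
      </Metadata>
    </TechnicalProfile>
  </TechnicalProfiles>
</ClaimsProvider>
```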
```xml <ClaimsProviders>
active-directory-b2c Identity Provider Adfs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-adfs.md
Previously updated : 02/12/2021 Last updated : 03/08/2021
You can define an AD FS account as a claims provider by adding it to the **Claim
```xml <ClaimsProvider> <Domain>contoso.com</Domain>
- <DisplayName>Contoso AD FS</DisplayName>
+ <DisplayName>Contoso</DisplayName>
<TechnicalProfiles> <TechnicalProfile Id="Contoso-SAML2">
- <DisplayName>Contoso AD FS</DisplayName>
+ <DisplayName>Contoso</DisplayName>
<Description>Login with your AD FS account</Description> <Protocol Name="SAML2"/> <Metadata>
Open a browser and navigate to the URL. Make sure you type the correct URL and t
1. Select your relying party policy, for example `B2C_1A_signup_signin`.
1. For **Application**, select a web application that you [previously registered](tutorial-register-applications.md). The **Reply URL** should show `https://jwt.ms`.
1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Contoso AD FS** to sign in with the Contoso AD FS identity provider.
If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.

## Troubleshooting AD FS service

AD FS is configured to use the Windows application log. If you experience challenges setting up AD FS as a SAML identity provider using custom policies in Azure AD B2C, you may want to check the AD FS event log:
active-directory-b2c Identity Provider Amazon https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-amazon.md
Previously updated : 01/27/2021 Last updated : 03/08/2021 zone_pivot_groups: b2c-policy-type
To enable sign-in for users with an Amazon account in Azure Active Directory B2C
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Amazon** to sign in with your Amazon account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
You can define an Amazon account as a claims provider by adding it to the **Clai
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Amazon** to sign in with your Amazon account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Apple Id https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-apple-id.md
Previously updated : 03/03/2021 Last updated : 03/08/2021
To enable users to sign in using an Apple ID, you need to add the Apple identity
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Select **Run user flow**.
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Apple** to sign in with your Apple ID.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
You can define an Apple ID as a claims provider by adding it to the **ClaimsProv
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Apple** to sign in with your Apple ID.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Azure Ad B2c https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-azure-ad-b2c.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To create an application.
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
-1. From the sign-up or sign-in page, select *Fabrikam* to sign in with the other Azure AD B2C tenant.
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Fabrikam** to sign in with the other Azure AD B2C tenant.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
You can define Azure AD B2C as a claims provider by adding Azure AD B2C to the *
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Fabrikam** to sign in with the other Azure AD B2C tenant.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Azure Ad Multi Tenant https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-azure-ad-multi-tenant.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
Perform these steps for each Azure AD tenant that should be used to sign in:
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Common AAD** to sign in with your Azure AD account.
To test the multi-tenant sign-in capability, perform the last two steps using the credentials for a user that exists in another Azure AD tenant. Copy the **Run now endpoint** and open it in a private browser window, for example, Incognito Mode in Google Chrome or an InPrivate window in Microsoft Edge. Opening in a private browser window allows you to test the full user journey by not using any currently cached Azure AD credentials.
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
## Next steps

When working with custom policies, you might sometimes need additional information when troubleshooting a policy during its development.
active-directory-b2c Identity Provider Azure Ad Single Tenant https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-azure-ad-single-tenant.md
Previously updated : 03/04/2021 Last updated : 03/08/2021
If you want to get the `family_name` and `given_name` claims from Azure AD, you
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Contoso Azure AD** to sign in with your Azure AD Contoso account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
To get a token from the Azure AD endpoint, you need to define the protocols that
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Contoso Employee** to sign in with your Azure AD Contoso account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
## Next steps
active-directory-b2c Identity Provider Facebook https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-facebook.md
Previously updated : 01/19/2021 Last updated : 03/08/2021
To enable sign-in for users with a Facebook account in Azure Active Directory B2
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Facebook** to sign in with your Facebook account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
+ ::: zone-end
Update the relying party (RP) file that initiates the user journey that you crea
1. Upload the *TrustFrameworkExtensions.xml* file to your tenant.
1. Under **Custom policies**, select **B2C_1A_signup_signin**.
1. For **Select Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Select **Run now** and select Facebook to sign in with Facebook and test the custom policy.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Facebook** to sign in with your Facebook account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Generic Openid Connect https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-generic-openid-connect.md
Previously updated : 08/08/2019 Last updated : 03/08/2021
1. Make sure you're using the directory that contains your Azure AD B2C tenant by selecting the **Directory + subscription** filter in the top menu and choosing the directory that contains your tenant.
1. Choose **All services** in the top-left corner of the Azure portal, then search for and select **Azure AD B2C**.
1. Select **Identity providers**, and then select **New OpenID Connect provider**.
+1. Enter a **Name**. For example, enter *Contoso*.
## Configure the identity provider
After the custom identity provider sends an ID token back to Azure AD B2C, Azure
* **Given Name**: Enter the claim that provides the *first name* of the user.
* **Surname**: Enter the claim that provides the *last name* of the user.
* **Email**: Enter the claim that provides the *email address* of the user.
+## Add the identity provider to a user flow
+
+1. In your Azure AD B2C tenant, select **User flows**.
+1. Select the user flow to which you want to add the identity provider.
+1. Under **Social identity providers**, select the identity provider you added. For example, *Contoso*.
+1. Select **Save**.
+1. To test your policy, select **Run user flow**.
+1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select the identity provider you want to sign in with. For example, *Contoso*.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
active-directory-b2c Identity Provider Generic Saml https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-generic-saml.md
Previously updated : 03/03/2021 Last updated : 03/08/2021
Open a browser and navigate to the URL. Make sure you type the correct URL and t
1. Select your relying party policy, for example `B2C_1A_signup_signin`.
1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Contoso** to sign in with your Contoso account.
If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
active-directory-b2c Identity Provider Github https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-github.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To enable sign-in with a GitHub account in Azure Active Directory B2C (Azure AD
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **GitHub** to sign in with your GitHub account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
The GitHub technical profile requires the **CreateIssuerUserId** claim transform
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **GitHub** to sign in with your GitHub account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Google https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-google.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
Enter a **Name** for your application. Enter *b2clogin.com* in the **Authorized
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Google** to sign in with your Google account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
You can define a Google account as a claims provider by adding it to the **Claim
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Google** to sign in with your Google account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Id Me https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-id-me.md
Previously updated : 01/27/2021 Last updated : 03/08/2021 zone_pivot_groups: b2c-policy-type
Next, you need a claims transformation to create the displayName claim. Add the
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **ID.me** to sign in with your ID.me account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Linkedin https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-linkedin.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To enable sign-in for users with a LinkedIn account in Azure Active Directory B2
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **LinkedIn** to sign in with your LinkedIn account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
Add the **BuildingBlocks** element near the top of the *TrustFrameworkExtensions
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **LinkedIn** to sign in with your LinkedIn account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
## Migration from v1.0 to v2.0
active-directory-b2c Identity Provider Microsoft Account https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-microsoft-account.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To enable sign-in for users with a Microsoft account in Azure Active Directory B
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Microsoft** to sign in with your Microsoft account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
You've now configured your policy so that Azure AD B2C knows how to communicate
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Microsoft** to sign in with your Microsoft account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Qq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-qq.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To enable sign-in for users with a QQ account in Azure Active Directory B2C (Azu
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **QQ** to sign in with your QQ account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
+ ::: zone-end
You can define a QQ account as a claims provider by adding it to the **ClaimsPro
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **QQ** to sign in with your QQ account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Salesforce Saml https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-salesforce-saml.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
You can define a Salesforce account as a claims provider by adding it to the **C
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Salesforce** to sign in with your Salesforce account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Salesforce https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-salesforce.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To enable sign-in for users with a Salesforce account in Azure Active Directory
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Salesforce** to sign in with your Salesforce account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
You can define a Salesforce account as a claims provider by adding it to the **C
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Salesforce** to sign in with your Salesforce account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
+ ::: zone-end
active-directory-b2c Identity Provider Twitter https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-twitter.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To enable sign-in for users with a Twitter account in Azure AD B2C, you need to
1. Select **Save**.
1. To test your policy, select **Run user flow**.
1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Twitter** to sign in with your Twitter account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
You can define a Twitter account as a claims provider by adding it to the **Clai
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Twitter** to sign in with your Twitter account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Wechat https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-wechat.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To enable sign-in for users with a WeChat account in Azure Active Directory B2C
1. Select **Save**. 1. To test your policy, select **Run user flow**. 1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **WeChat** to sign in with your WeChat account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
You can define a WeChat account as a claims provider by adding it to the **Claim
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **WeChat** to sign in with your WeChat account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Identity Provider Weibo https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/identity-provider-weibo.md
Previously updated : 01/27/2021 Last updated : 03/08/2021
To enable sign-in for users with a Weibo account in Azure Active Directory B2C (
1. Select **Save**. 1. To test your policy, select **Run user flow**. 1. For **Application**, select the web application named *testapp1* that you previously registered. The **Reply URL** should show `https://jwt.ms`.
-1. Click **Run user flow**
+1. Select the **Run user flow** button.
+1. From the sign-up or sign-in page, select **Weibo** to sign in with your Weibo account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
The GitHub technical profile requires the **CreateIssuerUserId** claim transform
[!INCLUDE [active-directory-b2c-configure-relying-party-policy](../../includes/active-directory-b2c-configure-relying-party-policy-user-journey.md)]
+## Test your custom policy
+
+1. Select your relying party policy, for example `B2C_1A_signup_signin`.
+1. For **Application**, select a web application that you [previously registered](troubleshoot-custom-policies.md#troubleshoot-the-runtime). The **Reply URL** should show `https://jwt.ms`.
+1. Select the **Run now** button.
+1. From the sign-up or sign-in page, select **Weibo** to sign in with your Weibo account.
+
+If the sign-in process is successful, your browser is redirected to `https://jwt.ms`, which displays the contents of the token returned by Azure AD B2C.
::: zone-end
active-directory-b2c Localization String Ids https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/localization-string-ids.md
Previously updated : 11/09/2020 Last updated : 03/08/2021
The following are the IDs for a [Verification display control](display-control-v
| ID | Default value | | -- | - | |intro_msg| Verification is necessary. Please click Send button.|
-|success_send_code_msg | Verification code has been sent to your inbox. Please copy it to the input box below.|
+|success_send_code_msg | Verification code has been sent. Please copy it to the input box below.|
|failure_send_code_msg | We are having trouble verifying your email address. Please enter a valid email address and try again.| |success_verify_code_msg | E-mail address verified. You can now continue.| |failure_verify_code_msg | We are having trouble verifying your email address. Please try again.|
active-directory-b2c Localization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/localization.md
Previously updated : 10/15/2020 Last updated : 03/08/2021
The **LocalizedString** element contains the following attributes:
| Attribute | Required | Description | | | -- | -- |
-| ElementType | Yes | Possible values: [ClaimsProvider](#claimsprovider), [ClaimType](#claimtype), [ErrorMessage](#errormessage), [GetLocalizedStringsTransformationClaimType](#getlocalizedstringstransformationclaimtype), [Predicate](#predicate), [InputValidation](#inputvalidation), or [UxElement](#uxelement). |
+| ElementType | Yes | Possible values: [ClaimsProvider](#claimsprovider), [ClaimType](#claimtype), [ErrorMessage](#errormessage), [GetLocalizedStringsTransformationClaimType](#getlocalizedstringstransformationclaimtype), [FormatLocalizedStringTransformationClaimType](#formatlocalizedstringtransformationclaimtype), [Predicate](#predicate), [InputValidation](#inputvalidation), or [UxElement](#uxelement). |
| ElementId | Yes | If **ElementType** is set to `ClaimType`, `Predicate`, or `InputValidation`, this element contains a reference to a claim type already defined in the ClaimsSchema section. | | StringId | Yes | If **ElementType** is set to `ClaimType`, this element contains a reference to an attribute of a claim type. Possible values: `DisplayName`, `AdminHelpText`, or `PatternHelpText`. The `DisplayName` value is used to set the claim display name. The `AdminHelpText` value is used to set the claim's admin help text. The `PatternHelpText` value is used to set the claim pattern help text. If **ElementType** is set to `UxElement`, this element contains a reference to an attribute of a user interface element. If **ElementType** is set to `ErrorMessage`, this element specifies the identifier of an error message. See [Localization string IDs](localization-string-ids.md) for a complete list of the `UxElement` identifiers.|
The following example shows how to localize the UserMessageIfClaimsPrincipalAlre
<LocalizedString ElementType="ErrorMessage" StringId="UserMessageIfClaimsPrincipalAlreadyExists">The account you are trying to create already exists, please sign-in.</LocalizedString> ```
+### FormatLocalizedStringTransformationClaimType
+
+The FormatLocalizedStringTransformationClaimType value is used to format claims into a localized string. For more information, see [FormatLocalizedString claims transformation](string-transformations.md#formatlocalizedstring).
++
+```xml
+<ClaimsTransformation Id="SetResponseMessageForEmailAlreadyExists" TransformationMethod="FormatLocalizedString">
+ <InputClaims>
+ <InputClaim ClaimTypeReferenceId="email" />
+ </InputClaims>
+ <InputParameters>
+ <InputParameter Id="stringFormatId" DataType="string" Value="ResponseMessge_EmailExists" />
+ </InputParameters>
+ <OutputClaims>
+ <OutputClaim ClaimTypeReferenceId="responseMsg" TransformationClaimType="outputClaim" />
+ </OutputClaims>
+</ClaimsTransformation>
+```
+
+The following example shows how to localize the string format used by the FormatLocalizedStringTransformationClaimType claims transformation.
+
+```xml
+<LocalizedString ElementType="FormatLocalizedStringTransformationClaimType" StringId="ResponseMessge_EmailExists">The email '{0}' is already an account in this organization. Click Next to sign in with that account.</LocalizedString>
+```
+ ### GetLocalizedStringsTransformationClaimType The GetLocalizedStringsTransformationClaimType value is used to copy localized strings into claims. For more information, see [GetLocalizedStringsTransformation claims transformation](string-transformations.md#getlocalizedstringstransformation).
active-directory-b2c String Transformations https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/string-transformations.md
Previously updated : 03/04/2021 Last updated : 03/08/2021
Following example generates an integer random value between 0 and 1000. The valu
- **outputClaim**: OTP_853
+## FormatLocalizedString
+
+Format multiple claims according to a provided localized format string. This transformation uses the C# `String.Format` method.
++
+| Item | TransformationClaimType | Data Type | Notes |
+| - | -- | | -- |
+| InputClaims | |string | The collection of input claims that act as the string format parameters {0}, {1}, {2}. |
+| InputParameter | stringFormatId | string | The `StringId` of a [localized string](localization.md). |
+| OutputClaim | outputClaim | string | The ClaimType that is produced after this claims transformation has been invoked. |
+
+> [!NOTE]
+> The maximum allowed size of the string format is 4000 characters.
+
+To use the FormatLocalizedString claims transformation:
+
+1. Define a [localization string](localization.md), and associate it with a [self-asserted-technical-profile](self-asserted-technical-profile.md).
+1. The `ElementType` of the `LocalizedString` element must be set to `FormatLocalizedStringTransformationClaimType`.
+1. The `StringId` is a unique identifier that you define and later reference as the `stringFormatId` in your claims transformation.
+1. In the claims transformation, specify the list of claims to be set with the localized string. Then set the `stringFormatId` to the `StringId` of the localized string element.
+1. In a [self-asserted technical profile](self-asserted-technical-profile.md), or a [display control](display-controls.md) input or output claims transformation, make a reference to your claims transformation.
++
+The following example generates an error message when an account is already in the directory. The example defines localized strings for English (default) and Spanish.
+
+```xml
+<Localization Enabled="true">
+ <SupportedLanguages DefaultLanguage="en" MergeBehavior="Append">
+ <SupportedLanguage>en</SupportedLanguage>
+ <SupportedLanguage>es</SupportedLanguage>
+ </SupportedLanguages>
+
+ <LocalizedResources Id="api.localaccountsignup.en">
+ <LocalizedStrings>
+ <LocalizedString ElementType="FormatLocalizedStringTransformationClaimType" StringId="ResponseMessge_EmailExists">The email '{0}' is already an account in this organization. Click Next to sign in with that account.</LocalizedString>
+ </LocalizedStrings>
+ </LocalizedResources>
+ <LocalizedResources Id="api.localaccountsignup.es">
+ <LocalizedStrings>
+ <LocalizedString ElementType="FormatLocalizedStringTransformationClaimType" StringId="ResponseMessge_EmailExists">Este correo electr├│nico "{0}" ya es una cuenta de esta organizaci├│n. Haga clic en Siguiente para iniciar sesi├│n con esa cuenta.</LocalizedString>
+ </LocalizedStrings>
+ </LocalizedResources>
+</Localization>
+```
+
+The claims transformation creates a response message based on the localized string. The message contains the user's email address embedded into the localized string *ResponseMessge_EmailExists*.
+
+```xml
+<ClaimsTransformation Id="SetResponseMessageForEmailAlreadyExists" TransformationMethod="FormatLocalizedString">
+ <InputClaims>
+ <InputClaim ClaimTypeReferenceId="email" />
+ </InputClaims>
+ <InputParameters>
+ <InputParameter Id="stringFormatId" DataType="string" Value="ResponseMessge_EmailExists" />
+ </InputParameters>
+ <OutputClaims>
+ <OutputClaim ClaimTypeReferenceId="responseMsg" TransformationClaimType="outputClaim" />
+ </OutputClaims>
+</ClaimsTransformation>
+```
+
+### Example
+
+- Input claims:
+ - **inputClaim**: sarah@contoso.com
+- Input parameters:
+ - **stringFormatId**: ResponseMessge_EmailExists
+- Output claims:
+ - **outputClaim**: The email 'sarah@contoso.com' is already an account in this organization. Click Next to sign in with that account.
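+The lookup-and-format behavior in this example can be sketched outside of B2C. The following Python snippet is an illustration only (the real transformation runs inside the B2C policy engine using C# `String.Format` semantics); the dictionary and function names are hypothetical:

```python
# Rough analogy of the FormatLocalizedString claims transformation:
# look up a localized format string by its StringId, then substitute the
# input claims into the {0}, {1}, ... placeholders.

def format_localized_string(localized_strings, string_format_id, *input_claims):
    """Resolve the format string by StringId and fill in the input claims."""
    fmt = localized_strings[string_format_id]
    return fmt.format(*input_claims)

# Localized strings as they might be defined in an English resource.
strings_en = {
    "ResponseMessge_EmailExists": (
        "The email '{0}' is already an account in this organization. "
        "Click Next to sign in with that account."
    )
}

msg = format_localized_string(strings_en, "ResponseMessge_EmailExists", "sarah@contoso.com")
print(msg)
```

Running the sketch with the English resource reproduces the output claim shown in the example above.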
++ ## FormatStringClaim Format a claim according to the provided format string. This transformation uses the C# `String.Format` method.
Format a claim according to the provided format string. This transformation uses
| InputParameter | stringFormat | string | The string format, including the {0} parameter. This input parameter supports [string claims transformation expressions](string-transformations.md#string-claim-transformations-expressions). | | OutputClaim | outputClaim | string | The ClaimType that is produced after this claims transformation has been invoked. |
+> [!NOTE]
+> The maximum allowed size of the string format is 4000 characters.
+ Use this claims transformation to format any string with one parameter {0}. The following example creates a **userPrincipalName**. All social identity provider technical profiles, such as `Facebook-OAUTH`, call **CreateUserPrincipalName** to generate a **userPrincipalName**. ```xml
Format two claims according to the provided format string. This transformation u
| InputParameter | stringFormat | string | The string format, including the {0} and {1} parameters. This input parameter supports [string claims transformation expressions](string-transformations.md#string-claim-transformations-expressions). | | OutputClaim | outputClaim | string | The ClaimType that is produced after this claims transformation has been invoked. |
+> [!NOTE]
+> The maximum allowed size of the string format is 4000 characters.
+ Use this claims transformation to format any string with two parameters, {0} and {1}. The following example creates a **displayName** with the specified format: ```xml
active-directory-b2c Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/whats-new-docs.md
Title: "What's new in Azure Active Directory business-to-customer (B2C)" description: "New and updated documentation for the Azure Active Directory business-to-customer (B2C)." Previously updated : 02/01/2021 Last updated : 03/08/2021
Welcome to what's new in Azure Active Directory B2C documentation. This article lists new docs that have been added and those that have had significant updates in the last three months. To learn what's new with the B2C service, see [What's new in Azure Active Directory](../active-directory/fundamentals/whats-new.md).
+## February 2021
+
+### New articles
+
+- [Securing phone-based multi-factor authentication (MFA)](phone-based-mfa.md)
+
+### Updated articles
+
+- [Azure Active Directory B2C code samples](code-samples.md)
+- [Track user behavior in Azure AD B2C by using Application Insights](analytics-with-application-insights.md)
+- [Configure session behavior in Azure Active Directory B2C](session-behavior.md)
+ ## January 2021 ### New articles
active-directory-domain-services Powershell Scoped Synchronization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-domain-services/powershell-scoped-synchronization.md
Previously updated : 01/20/2021 Last updated : 03/08/2021
To enable group-based scoped synchronization for a managed domain, complete the
When prompted, specify the credentials for a *global admin* to sign in to your Azure AD tenant using the [Connect-AzureAD][Connect-AzureAD] cmdlet: ```powershell
- // Connect to your Azure AD tenant
+ # Connect to your Azure AD tenant
Connect-AzureAD
- // Retrieve the Azure AD DS resource.
+ # Retrieve the Azure AD DS resource.
$DomainServicesResource = Get-AzResource -ResourceType "Microsoft.AAD/DomainServices"
- // Enable group-based scoped synchronization.
+ # Enable group-based scoped synchronization.
$enableScopedSync = @{"filteredSync" = "Enabled"}
- // Update the Azure AD DS resource
+ # Update the Azure AD DS resource
Set-AzResource -Id $DomainServicesResource.ResourceId -Properties $enableScopedSync ```
To disable group-based scoped synchronization for a managed domain, set *"filter
When prompted, specify the credentials for a *global admin* to sign in to your Azure AD tenant using the [Connect-AzureAD][Connect-AzureAD] cmdlet: ```powershell
-// Connect to your Azure AD tenant
+# Connect to your Azure AD tenant
Connect-AzureAD
-// Retrieve the Azure AD DS resource.
+# Retrieve the Azure AD DS resource.
$DomainServicesResource = Get-AzResource -ResourceType "Microsoft.AAD/DomainServices"
-// Disable group-based scoped synchronization.
+# Disable group-based scoped synchronization.
$disableScopedSync = @{"filteredSync" = "Disabled"}
-// Update the Azure AD DS resource
+# Update the Azure AD DS resource
Set-AzResource -Id $DomainServicesResource.ResourceId -Properties $disableScopedSync ```
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/app-provisioning/whats-new-docs.md
Title: "What's new in Azure Active Directory application provisioning" description: "New and updated documentation for the Azure Active Directory application provisioning." Previously updated : 02/01/2021 Last updated : 03/08/2021
Welcome to what's new in Azure Active Directory application provisioning documentation. This article lists new docs that have been added and those that have had significant updates in the last three months. To learn what's new with the provisioning service, see [What's new in Azure Active Directory](../fundamentals/whats-new.md).
+## February 2021
+
+### Updated articles
+
+- [How Azure Active Directory provisioning integrates with Workday](workday-integration-reference.md)
+- [Tutorial - Customize user provisioning attribute-mappings for SaaS applications in Azure Active Directory](customize-application-attributes.md)
+- [What is automated SaaS app user provisioning in Azure AD?](user-provisioning.md)
+- [Tutorial: Develop a sample SCIM endpoint](use-scim-to-build-users-and-groups-endpoints.md)
+- [Tutorial: Develop and plan provisioning for a SCIM endpoint](use-scim-to-provision-users-and-groups.md)
+- [How provisioning works](how-provisioning-works.md)
+ ## January 2021 ### New articles
active-directory Groups Lifecycle https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/enterprise-users/groups-lifecycle.md
Here are examples of how you can use PowerShell cmdlets to configure the expirat
Remove-AzureADMSGroupLifecyclePolicy -Id "26fcc232-d1c3-4375-b68d-15c296f1f077" ```
-The following cmdlets can be used to configure the policy in more detail. For more information, see [PowerShell documentation](/powershell/module/azuread/?branch=master&view=azureadps-2.0-preview#groups).
+The following cmdlets can be used to configure the policy in more detail. For more information, see [PowerShell documentation](/powershell/module/azuread/?view=azureadps-2.0-preview#groups).
- Get-AzureADMSGroupLifecyclePolicy - New-AzureADMSGroupLifecyclePolicy
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/external-identities/whats-new-docs.md
Title: "What's new in Azure Active Directory external identities" description: "New and updated documentation for the Azure Active Directory external identities." Previously updated : 02/01/2021 Last updated : 03/08/2021
Welcome to what's new in Azure Active Directory external identities documentation. This article lists new docs that have been added and those that have had significant updates in the last three months. To learn what's new with the external identities service, see [What's new in Azure Active Directory](../fundamentals/whats-new.md).
+## February 2021
+
+### New articles
+
+- [Reset redemption status for a guest user](reset-redemption-status.md)
+
+### Updated articles
+
+- [Azure Active Directory B2B best practices](b2b-fundamentals.md)
+- [Enable B2B external collaboration and manage who can invite guests](delegate-invitations.md)
+- [Azure Active Directory B2B collaboration FAQs](faq.md)
+- [Email one-time passcode authentication](one-time-passcode.md)
+- [Azure Active Directory B2B collaboration invitation redemption](redemption-experience.md)
+- [Troubleshooting Azure Active Directory B2B collaboration](troubleshoot.md)
+- [Properties of an Azure Active Directory B2B collaboration user](user-properties.md)
+- [What is guest user access in Azure Active Directory B2B?](what-is-b2b.md)
+- [Azure Active Directory external identities: What's new](whats-new-docs.md)
+- [Allow or block invitations to B2B users from specific organizations](allow-deny-list.md)
+- [Azure Active Directory B2B collaboration API and customization](customize-invitation-api.md)
+- [Invite internal users to B2B collaboration](invite-internal-users.md)
+- [Microsoft 365 external sharing and Azure Active Directory (Azure AD) B2B collaboration](o365-external-user.md)
+- [Direct federation with AD FS and third-party providers for guest users (preview)](direct-federation.md)
+ ## January 2021 ### Updated articles
active-directory Whats New Archive https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/whats-new-archive.md
For listing your application in the Azure AD app gallery, please read the detail
-### New provisioning connectors in the Azure AD Application Gallery - July 2020
-
-**Type:** New feature
-**Service category:** App Provisioning
-**Product capability:** 3rd Party Integration
-
-You can now automate creating, updating, and deleting user accounts for the newly integrated app [LinkedIn Learning](../saas-apps/linkedin-learning-provisioning-tutorial.md).
-
-For more information about how to better secure your organization by using automated user account provisioning, see [Automate user provisioning to SaaS applications with Azure AD](../app-provisioning/user-provisioning.md).
--- ### View role assignments across all scopes and ability to download them to a csv file **Type:** Changed feature
The new [policy details blade](../conditional-access/troubleshoot-conditional-ac
In April 2020, we've added these 31 new apps with Federation support to the app gallery:
-[SincroPool Apps](https://www.sincropool.com/), [SmartDB](https://hibiki.dreamarts.co.jp/smartdb/trial/), [Float](../saas-apps/float-tutorial.md), [LMS365](https://lms.365.systems/), [IWT Procurement Suite](../saas-apps/iwt-procurement-suite-tutorial.md), [Lunni](https://lunni.fi/), [EasySSO for Jira](../saas-apps/easysso-for-jira-tutorial.md), [Virtual Training Academy](https://vta.c3p.c), [Trend Micro Web Security(TMWS)](https://review.docs.microsoft.com/azure/active-directory/saas-apps/trend-micro-tutorial)
+[SincroPool Apps](https://www.sincropool.com/), [SmartDB](https://hibiki.dreamarts.co.jp/smartdb/trial/), [Float](../saas-apps/float-tutorial.md), [LMS365](https://lms.365.systems/), [IWT Procurement Suite](../saas-apps/iwt-procurement-suite-tutorial.md), [Lunni](https://lunni.fi/), [EasySSO for Jira](../saas-apps/easysso-for-jira-tutorial.md), [Virtual Training Academy](https://vta.c3p.c), [Trend Micro Web Security(TMWS)](/azure/active-directory/saas-apps/trend-micro-tutorial)
For more information about the apps, see [SaaS application integration with Azure Active Directory](../saas-apps/tutorial-list.md). For more information about listing your application in the Azure AD app gallery, see [List your application in the Azure Active Directory application gallery](../develop/v2-howto-app-gallery-listing.md).
active-directory Entitlement Management Logs And Reporting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/governance/entitlement-management-logs-and-reporting.md
$subs = Get-AzSubscription
$subs | ft ```
-You can reauthenticate and associate your PowerShell session to that subscription using a command such as `Connect-AzAccount -Subscription $subs[0].id`. To learn more about how to authenticate to Azure from PowerShell, including non-interactively, see [Sign in with Azure PowerShell](/powershell/azure/authenticate-azureps?view=azps-3.3.0&viewFallbackFrom=azps-2.5.0
-).
+You can reauthenticate and associate your PowerShell session to that subscription using a command such as `Connect-AzAccount -Subscription $subs[0].id`. To learn more about how to authenticate to Azure from PowerShell, including non-interactively, see [Sign in with Azure PowerShell](/powershell/azure/authenticate-azureps).
If you have multiple Log Analytics workspaces in that subscription, then the cmdlet [Get-AzOperationalInsightsWorkspace](/powershell/module/Az.OperationalInsights/Get-AzOperationalInsightsWorkspace) returns the list of workspaces. Then you can find the one that has the Azure AD logs. The `CustomerId` field returned by this cmdlet is the same as the value of the "Workspace Id" displayed in the Azure portal in the Log Analytics workspace overview.
active-directory Add Application Portal Setup Sso https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/add-application-portal-setup-sso.md
To set up SSO for an application that you added to your Azure AD tenant, you nee
## Enable single sign-on for an app
-After you finish adding an application to your Azure AD tenant, the overview page appears. If you're configuring an application that was already added, look at the first quickstart. It walks you through viewing the applications added to your tenant.
+After you finish adding an application to your Azure AD tenant, the overview page appears. If you're configuring an application that was already added, look at the first quickstart. It walks you through viewing the applications added to your tenant.
To set up single sign-on for an application:
When you're done with this quickstart series, consider deleting the app to clean
Advance to the next article to learn how to delete an app. > [!div class="nextstepaction"]
-> [Delete an app](delete-application-portal.md)
+> [Delete an app](delete-application-portal.md)
active-directory Application Proxy Back End Kerberos Constrained Delegation How To https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/application-proxy-back-end-kerberos-constrained-delegation-how-to.md
If you still can't make progress, Microsoft support can assist you. Create a sup
## Other scenarios - Azure Application Proxy requests a Kerberos ticket before sending its request to an application. Some third-party applications don't like this method of authenticating. These applications expect the more conventional negotiations to take place. The first request is anonymous, which allows the application to respond with the authentication types that it supports through a 401. This type of Kerberos negotiation can be enabled using the steps outlined in this document: [Kerberos Constrained Delegation for single sign-on](application-proxy-configure-single-sign-on-with-kcd.md).-- Multi-hop authentication is commonly used in scenarios where an application is tiered, with a back end and front end, where both require authentication, such as SQL Server Reporting Services. To configure the multi-hop scenario, see the support article [Kerberos Constrained Delegation May Require Protocol Transition in Multi-hop Scenarios](https://support.microsoft.com/help/2005838/kerberos-constrained-delegation-may-require-protocol-transition-in-mul).
+- Multi-hop authentication is commonly used in scenarios where an application is tiered, with a back end and front end, where both require authentication, such as SQL Server Reporting Services. For more details, see [How to configure Kerberos Constrained Delegation for Web Enrollment proxy pages](/troubleshoot/windows-server/identity/configure-kerberos-constrained-delegation).
## Next steps
-[Configure KCD on a managed domain](../../active-directory-domain-services/deploy-kcd.md).
+[Configure KCD on a managed domain](../../active-directory-domain-services/deploy-kcd.md).
active-directory Application Proxy Faq https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/application-proxy-faq.md
Make sure you have at least an Azure AD Premium P1 or P2 license and an Azure AD
## Connector configuration
+### Why is my connector still using an older version and not auto-upgraded to the latest version?
+
+This may happen either because the updater service isn't working correctly or because there are no new updates available for the service to install.
+
+The updater service is healthy if it's running and there are no errors recorded in the event log (Applications and Services logs -> Microsoft -> AadApplicationProxy -> Updater -> Admin).
+
+> [!IMPORTANT]
+> Only major versions are released for auto-upgrade. We recommend updating your connector manually on a regular schedule. For more information on new releases, the type of each release (download, auto-upgrade), bug fixes, and new features, see [Azure AD Application Proxy: Version release history](application-proxy-release-version-history.md).
+
+To manually upgrade a connector:
+
+- Download the latest version of the connector. You can find it under **Application Proxy** in the Azure portal, or use the link in [Azure AD Application Proxy: Version release history](application-proxy-release-version-history.md).
+- The installer restarts the Azure AD Application Proxy Connector services. In some cases, a server reboot might be required if the installer cannot replace all files. We therefore recommend closing all applications (for example, Event Viewer) before you start the upgrade.
+- Run the installer. The upgrade process is quick, does not require any credentials, and the connector is not re-registered.
+ ### Can Application Proxy Connector services run in a different user context than the default? No, this scenario isn't supported. The default settings are:
active-directory Migrate Application Authentication To Azure Active Directory https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/migrate-application-authentication-to-azure-active-directory.md
In the cloud environment, you need rich visibility, control over data travel, an
- **APIs** - For apps connected to cloud infrastructure, you can use the APIs and tools on those systems to begin to take an inventory of hosted apps. In the Azure environment:
- - Use the [Get-AzureWebsite](/powershell/module/servicemanagement/azure/get-azurewebsite?view=azuresmps-4.0.0&redirectedfrom=MSDN&preserve-view=true)cmdlet to get information about Azure websites.
-
- - Use the [Get-AzureRMWebApp](/powershell/module/azurerm.websites/get-azurermwebapp?view=azurermps-6.13.0&viewFallbackFrom=azurermps-6.2.0&preserve-view=true)cmdlet to get information about your Azure Web Apps.
+ - Use the [Get-AzureWebsite](/powershell/module/servicemanagement/azure.service/get-azurewebsite) cmdlet to get information about Azure websites.
+ - Use the [Get-AzureRMWebApp](/powershell/module/azurerm.websites/get-azurermwebapp) cmdlet to get information about your Azure Web Apps.
- You can find all the apps running on Microsoft IIS from the Windows command line using [AppCmd.exe](/iis/get-started/getting-started-with-iis/getting-started-with-appcmdexe#working-with-sites-applications-virtual-directories-and-application-pools). - Use [Applications](/previous-versions/azure/ad/graph/api/entity-and-complex-type-reference#application-entity) and [Service Principals](/previous-versions/azure/ad/graph/api/entity-and-complex-type-reference#serviceprincipal-entity) to get you information on an app and app instance in a directory in Azure AD.
active-directory Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/manage-apps/whats-new-docs.md
Title: "What's new in Azure Active Directory application management" description: "New and updated documentation for the Azure Active Directory application management." Previously updated : 02/01/2021 Last updated : 03/08/2021
Welcome to what's new in Azure Active Directory application management documentation. This article lists new docs that have been added and those that have had significant updates in the last three months. To learn what's new with the application management service, see [What's new in Azure Active Directory](../fundamentals/whats-new.md).
+## February 2021
+
+### New articles
+
+- [Integrate with SharePoint (SAML)](application-proxy-integrate-with-sharepoint-server-saml.md)
+- [Migrate application authentication to Azure Active Directory](migrate-application-authentication-to-azure-active-directory.md)
+
+### Updated articles
+
+- [Integrate with SharePoint (SAML)](application-proxy-integrate-with-sharepoint-server-saml.md)
+- [Grant tenant-wide admin consent to an application](grant-admin-consent.md)
+- [Moving application authentication from Active Directory Federation Services to Azure Active Directory](migrate-adfs-apps-to-azure.md)
+- [Tutorial: Add an on-premises application for remote access through Application Proxy in Azure Active Directory](application-proxy-add-on-premises-application.md)
+- [Problems signing in to SAML-based single sign-on configured apps](application-sign-in-problem-federated-sso-gallery.md)
+- [Use tenant restrictions to manage access to SaaS cloud applications](tenant-restrictions.md)
+ ## January 2021 ### New articles
active-directory Custom Assign Graph https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/roles/custom-assign-graph.md
HTTP/1.1 201 Created
POST ``` HTTP
-https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments
+POST https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments
``` Body
HTTP/1.1 404 Not Found
POST ``` HTTP
-https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments
+POST https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments
``` Body
HTTP/1.1 201 Created
POST ``` HTTP
-https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments
+POST https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments
``` Body
Only a subset of built-in roles are enabled for Administrative Unit scoping. Ref
GET ``` HTTP
-https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments&$filter=principalId eq ‘<object-id-of-principal>’
+GET https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments?$filter=principalId+eq+'<object-id-of-principal>'
``` Response
HTTP/1.1 200 OK
GET ``` HTTP
-https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments&$filter=roleDefinitionId eq ‘<object-id-or-template-id-of-role-definition>’
+GET https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments?$filter=roleDefinitionId+eq+'<object-id-or-template-id-of-role-definition>'
``` Response
HTTP/1.1 200 OK
GET ``` HTTP
-GET https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments?$filter=directoryScopeId eq '/d23998b1-8853-4c87-b95f-be97d6c6b610'
+GET https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments?$filter=directoryScopeId+eq+'/d23998b1-8853-4c87-b95f-be97d6c6b610'
``` Response
HTTP/1.1 200 OK
DELETE ``` HTTP
-GET https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments/lAPpYvVpN0KRkAEhdxReEJC2sEqbR_9Hr48lds9SGHI-1
+DELETE https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments/lAPpYvVpN0KRkAEhdxReEJC2sEqbR_9Hr48lds9SGHI-1
``` Response
HTTP/1.1 204 No Content
DELETE ``` HTTP
-GET https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments/lAPpYvVpN0KRkAEhdxReEJC2sEqbR_9Hr48lds9SGHI-1
+DELETE https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments/lAPpYvVpN0KRkAEhdxReEJC2sEqbR_9Hr48lds9SGHI-1
``` Response
HTTP/1.1 404 Not Found
DELETE ``` HTTP
-GET https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments/lAPpYvVpN0KRkAEhdxReEJC2sEqbR_9Hr48lds9SGHI-1
+DELETE https://graph.microsoft.com/beta/roleManagement/directory/roleAssignments/lAPpYvVpN0KRkAEhdxReEJC2sEqbR_9Hr48lds9SGHI-1
``` Response
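The corrected requests in this role-assignment section share one base URL and differ only in the `$filter` clause or the appended assignment ID. As an illustrative sketch (not part of the original article — the function names are assumptions; only the endpoint path and filter syntax come from the snippets above), the URLs can be built like this:

```python
# Base collection URL used by every request in this section.
GRAPH_BETA = ("https://graph.microsoft.com/beta/"
              "roleManagement/directory/roleAssignments")

def assignments_by_principal(principal_id: str) -> str:
    # The filter is joined with '?', not '&', and uses straight quotes
    # (spaces may be percent- or '+'-encoded on the wire).
    return f"{GRAPH_BETA}?$filter=principalId eq '{principal_id}'"

def assignments_by_role(role_definition_id: str) -> str:
    return f"{GRAPH_BETA}?$filter=roleDefinitionId eq '{role_definition_id}'"

def assignment_url(assignment_id: str) -> str:
    # A single assignment is addressed by appending its ID; this is the
    # URL the DELETE examples above issue.
    return f"{GRAPH_BETA}/{assignment_id}"
```

The sketch only constructs URLs; issuing the actual GET/POST/DELETE still requires an authenticated Microsoft Graph client.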
active-directory Accenture Academy Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/accenture-academy-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Accenture Academy | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Accenture Academy.
+Last updated : 03/08/2021
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Accenture Academy
+
+In this tutorial, you'll learn how to integrate Accenture Academy with Azure Active Directory (Azure AD). When you integrate Accenture Academy with Azure AD, you can:
+
+* Control in Azure AD who has access to Accenture Academy.
+* Enable your users to be automatically signed-in to Accenture Academy with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Accenture Academy single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Accenture Academy supports **SP and IDP** initiated SSO.
+* Accenture Academy supports **Just In Time** user provisioning.
+
+## Adding Accenture Academy from the gallery
+
+To configure the integration of Accenture Academy into Azure AD, you need to add Accenture Academy from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Accenture Academy** in the search box.
+1. Select **Accenture Academy** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
++
+## Configure and test Azure AD SSO for Accenture Academy
+
+Configure and test Azure AD SSO with Accenture Academy using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Accenture Academy.
+
+To configure and test Azure AD SSO with Accenture Academy, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Accenture Academy SSO](#configure-accenture-academy-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create Accenture Academy test user](#create-accenture-academy-test-user)** - to have a counterpart of B.Simon in Accenture Academy that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Accenture Academy** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+
+ a. In the **Identifier** text box, type a URL using the following pattern:
+ `https://www.accentureacademy.com/a/integration/saml_sso/<Customer ID>/`
+
+ b. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://www.accentureacademy.com/a/integration/saml_sso/<Customer ID>/acs/`
+
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://www.accentureacademy.com/a/integration/saml_sso/<Customer ID>/request_idp_auth/`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [Accenture Academy Client support team](mailto:support@accentureacademy.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/metadataxml.png)
+
+1. On the **Set up Accenture Academy** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
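The three URL patterns in the **Basic SAML Configuration** step above share one base path and differ only in their suffix. As a hypothetical sketch (the helper name and the `customer_id` placeholder are assumptions; the patterns come from the steps above, and the real values must come from the Accenture Academy support team):

```python
def accenture_academy_saml_urls(customer_id: str) -> dict:
    # Base path from the Basic SAML Configuration patterns; <Customer ID>
    # is a placeholder until support supplies the actual value.
    base = f"https://www.accentureacademy.com/a/integration/saml_sso/{customer_id}/"
    return {
        "identifier": base,                         # IDP-initiated mode
        "reply_url": base + "acs/",                 # IDP-initiated mode
        "sign_on_url": base + "request_idp_auth/",  # SP-initiated mode
    }
```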
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Accenture Academy.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Accenture Academy**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Accenture Academy SSO
+
+To configure single sign-on on the **Accenture Academy** side, you need to send the downloaded **Federation Metadata XML** and the appropriate copied URLs from the Azure portal to the [Accenture Academy support team](mailto:support@accentureacademy.com). They use these values to configure the SAML SSO connection properly on both sides.
+
+### Create Accenture Academy test user
+
+In this section, a user called Britta Simon is created in Accenture Academy. Accenture Academy supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Accenture Academy, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect you to the Accenture Academy sign-on URL, where you can initiate the login flow.
+
+* Go to the Accenture Academy sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in the Azure portal, and you should be automatically signed in to the Accenture Academy instance for which you set up SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the Accenture Academy tile in My Apps, you are redirected to the application sign-on page to initiate the login flow if the app is configured in SP mode; if it is configured in IDP mode, you should be automatically signed in to the Accenture Academy instance for which you set up SSO. For more information about My Apps, see [Introduction to My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
+
+## Next steps
+
+Once you configure Accenture Academy you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
++
active-directory Appdynamics Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/appdynamics-tutorial.md
Previously updated : 12/26/2018 Last updated : 02/25/2021 # Tutorial: Azure Active Directory integration with AppDynamics
-In this tutorial, you learn how to integrate AppDynamics with Azure Active Directory (Azure AD).
-Integrating AppDynamics with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate AppDynamics with Azure Active Directory (Azure AD). When you integrate AppDynamics with Azure AD, you can:
-* You can control in Azure AD who has access to AppDynamics.
-* You can enable your users to be automatically signed-in to AppDynamics (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to AppDynamics.
+* Enable your users to be automatically signed-in to AppDynamics with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with AppDynamics, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* AppDynamics single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* AppDynamics single sign-on (SSO) enabled subscription.
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* AppDynamics supports **SP** initiated SSO
+* AppDynamics supports **SP** initiated SSO.
-* AppDynamics supports **Just In Time** user provisioning
+* AppDynamics supports **Just In Time** user provisioning.
-## Adding AppDynamics from the gallery
+## Add AppDynamics from the gallery
To configure the integration of AppDynamics into Azure AD, you need to add AppDynamics from the gallery to your list of managed SaaS apps.
-**To add AppDynamics from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **AppDynamics**, select **AppDynamics** from result panel then click **Add** button to add the application.
-
- ![AppDynamics in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with AppDynamics based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in AppDynamics needs to be established.
-
-To configure and test Azure AD single sign-on with AppDynamics, you need to complete the following building blocks:
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **AppDynamics** in the search box.
+1. Select **AppDynamics** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure AppDynamics Single Sign-On](#configure-appdynamics-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create AppDynamics test user](#create-appdynamics-test-user)** - to have a counterpart of Britta Simon in AppDynamics that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+## Configure and test Azure AD SSO for AppDynamics
-### Configure Azure AD single sign-on
+Configure and test Azure AD SSO with AppDynamics using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in AppDynamics.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+To configure and test Azure AD SSO with AppDynamics, perform the following steps:
-To configure Azure AD single sign-on with AppDynamics, perform the following steps:
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure AppDynamics SSO](#configure-appdynamics-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create AppDynamics test user](#create-appdynamics-test-user)** - to have a counterpart of B.Simon in AppDynamics that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-1. In the [Azure portal](https://portal.azure.com/), on the **AppDynamics** application integration page, select **Single sign-on**.
+## Configure Azure AD SSO
- ![Configure single sign-on link](common/select-sso.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+1. In the Azure portal, on the **AppDynamics** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![AppDynamics Domain and URLs single sign-on information](common/sp-identifier.png)
- a. In the **Sign on URL** text box, type a URL using the following pattern: `https://<companyname>.saas.appdynamics.com?accountName=<companyname>`
To configure Azure AD single sign-on with AppDynamics, perform the following ste
![Copy configuration URLs](common/copy-configuration-urls.png)
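The **Sign on URL** pattern in the Basic SAML Configuration step above repeats the company name twice. A minimal sketch (the function name is an assumption; the pattern itself comes from that step, and `<companyname>` is the tenant's AppDynamics SaaS account name):

```python
def appdynamics_sign_on_url(company_name: str) -> str:
    # <companyname> appears both as the SaaS subdomain and as the
    # accountName query parameter in the documented pattern.
    return f"https://{company_name}.saas.appdynamics.com?accountName={company_name}"
```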
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
- b. Azure Ad Identifier
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to AppDynamics.
- c. Logout URL
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **AppDynamics**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure AppDynamics Single Sign-On
+## Configure AppDynamics SSO
1. In a different web browser window, log in to your AppDynamics company site as an administrator. 1. In the toolbar on the top, click **Settings**, and then click **Administration**.
- ![Administration](./media/appdynamics-tutorial/ic790216.png "Administration")
+ ![Administration](./media/appdynamics-tutorial/settings.png "Administration")
1. Click the **Authentication Provider** tab.
- ![Authentication Provider](./media/appdynamics-tutorial/ic790224.png "Authentication Provider")
+ ![Authentication Provider](./media/appdynamics-tutorial/authentication.png "Authentication Provider")
1. In the **Authentication Provider** section, perform the following steps:
- ![SAML Configuration](./media/appdynamics-tutorial/ic790225.png "SAML Configuration")
+ ![SAML Configuration](./media/appdynamics-tutorial/configuration.png "SAML Configuration")
a. As **Authentication Provider**, select **SAML**.
To configure Azure AD single sign-on with AppDynamics, perform the following ste
e. Click **Save**.
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to AppDynamics.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **AppDynamics**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, type and select **AppDynamics**.
-
- ![The AppDynamics link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
- ### Create AppDynamics test user
-The objective of this section is to create a user called Britta Simon in AppDynamics. AppDynamics supports just-in-time provisioning, which is by default enabled. There is no action item for you in this section. A new user is created during an attempt to access AppDynamics if it doesn't exist yet.
-
->[!Note]
->If you need to create a user manually, contact [AppDynamics Client support team](https://www.appdynamics.com/support/).
+In this section, a user called B.Simon is created in AppDynamics. AppDynamics supports just-in-time user provisioning, which is enabled by default. There's no action item for you in this section. If a user doesn't already exist in AppDynamics, a new one is created after authentication.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the AppDynamics tile in the Access Panel, you should be automatically signed in to the AppDynamics for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in the Azure portal. This will redirect you to the AppDynamics sign-on URL, where you can initiate the login flow.
-## Additional Resources
+* Go to the AppDynamics sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the AppDynamics tile in My Apps, you are redirected to the AppDynamics sign-on URL. For more information about My Apps, see [Introduction to My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure AppDynamics you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
active-directory Boomi Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/boomi-tutorial.md
Previously updated : 02/07/2020 Last updated : 02/25/2021
In this tutorial, you'll learn how to integrate Boomi with Azure Active Director
* Enable your users to be automatically signed-in to Boomi with their Azure AD accounts. * Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
- ## Prerequisites To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Boomi supports **IDP** initiated SSO
-* Once you configure the Boomi you can enforce session controls, which protect exfiltration and infiltration of your organization’s sensitive data in real-time. Session controls extend from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+* Boomi supports **IDP** initiated SSO.
-## Adding Boomi from the gallery
+## Add Boomi from the gallery
To configure the integration of Boomi into Azure AD, you need to add Boomi from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service. 1. Navigate to **Enterprise Applications** and then select **All Applications**. 1. To add a new application, select **New application**.
To configure the integration of Boomi into Azure AD, you need to add Boomi from
1. Select **Boomi** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for Boomi
+## Configure and test Azure AD SSO for Boomi
Configure and test Azure AD SSO with Boomi using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Boomi.
-To configure and test Azure AD SSO with Boomi, complete the following building blocks:
+To configure and test Azure AD SSO with Boomi, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature. * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with Boomi, complete the following building b
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Boomi** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Boomi** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
Follow these steps to enable Azure AD SSO in the Azure portal.
c. After the metadata file is successfully uploaded, the **Identifier** and **Reply URL** values get auto populated in Basic SAML Configuration section.
- ![Screenshot shows the Basic SAML Configuration, where Identifier and Reply U R L values appear.](common/idp-intiated.png)
- d. Enter the **Sign-on URL**, such as `https://platform.boomi.com/AtomSphere.html#build;accountId={your-accountId}`. > [!Note]
- > You will get the **Service Provider metadata file** from the **Configure Boomi SSO** section, which is explained later in the tutorial. If the **Identifier** and **Reply URL** values do not get auto polulated, then fill in the values manually according to your requirement.
+ > You will get the **Service Provider metadata file** from the **Configure Boomi SSO** section, which is explained later in the tutorial. If the **Identifier** and **Reply URL** values do not get auto populated, then fill in the values manually according to your requirement.
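The Sign-on URL above embeds your Boomi account ID. As a minimal sketch (the helper name is ours, not part of the product), the URL can be assembled like this:

```python
# Hypothetical helper: build the Boomi AtomSphere sign-on URL from an
# account ID, following the pattern shown in the step above.
def boomi_sign_on_url(account_id: str) -> str:
    return f"https://platform.boomi.com/AtomSphere.html#build;accountId={account_id}"


print(boomi_sign_on_url("my-account-123"))
```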
1. The Boomi application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
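If you need to verify which attributes actually arrive at the service provider, one option is to inspect a captured SAML response. A minimal sketch (helper name ours; real responses are signed and much larger) that lists the attributes in the assertion's `AttributeStatement`:

```python
import xml.etree.ElementTree as ET

# SAML 2.0 assertion namespace, used for Attribute/AttributeValue elements.
ASSERTION_NS = "urn:oasis:names:tc:SAML:2.0:assertion"


def list_saml_attributes(response_xml: str) -> dict:
    """Map each Attribute Name in the assertion to its AttributeValue texts."""
    root = ET.fromstring(response_xml)
    attrs = {}
    for attr in root.iter(f"{{{ASSERTION_NS}}}Attribute"):
        values = [v.text for v in attr.findall(f"{{{ASSERTION_NS}}}AttributeValue")]
        attrs[attr.get("Name")] = values
    return attrs
```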
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Boomi**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.

## Configure Boomi SSO
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. Click the **SSO Options** tab and perform the following steps.
- ![Configure Single Sign-On On App Side](./media/boomi-tutorial/tutorial_boomi_11.png)
+ ![Configure Single Sign-On On App Side](./media/boomi-tutorial/import.png)
a. Check **Enable SAML Single Sign-On** checkbox.
In order to enable Azure AD users to sign in to Boomi, they must be provisioned
1. After logging in, navigate to **User Management** and go to **Users**.
- ![Screenshot shows the User Management page with Users selected.](./media/boomi-tutorial/tutorial_boomi_001.png "Users")
+ ![Screenshot shows the User Management page with Users selected.](./media/boomi-tutorial/user.png "Users")
1. Click the **+** icon and the **Add/Maintain User Roles** dialog opens.
- ![Screenshot shows the + icon selected.](./media/boomi-tutorial/tutorial_boomi_002.png "Users")
+ ![Screenshot shows the + icon selected.](./media/boomi-tutorial/add.png "Users")
- ![Screenshot shows the Add / Maintain User Roles where you configure a user.](./media/boomi-tutorial/tutorial_boomi_003.png "Users")
+ ![Screenshot shows the Add / Maintain User Roles where you configure a user.](./media/boomi-tutorial/roles.png "Users")
a. In the **User e-mail address** textbox, type the email address of the user, like B.Simon@contoso.com.
In order to enable Azure AD users to sign in to Boomi, they must be provisioned
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the Boomi tile in the Access Panel, you should be automatically signed in to the Boomi for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the Boomi for which you set up the SSO.
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* You can use Microsoft My Apps. When you click the Boomi tile in My Apps, you should be automatically signed in to the Boomi for which you set up the SSO. For more information about My Apps, see [Introduction to My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+## Next steps
-- [Try Boomi with Azure AD](https://aad.portal.azure.com/)
+Once you configure Boomi you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
active-directory Dropboxforbusiness Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/dropboxforbusiness-tutorial.md
Previously updated : 01/28/2021 Last updated : 02/17/2021

# Tutorial: Integrate Dropbox Business with Azure Active Directory
To get started, you need the following items:
## Scenario description
-* In this tutorial, you configure and test Azure AD SSO in a test environment. Dropbox Business supports **SP** initiated SSO
+* In this tutorial, you configure and test Azure AD SSO in a test environment. Dropbox Business supports **SP** initiated SSO.
-* Dropbox Business supports [Automated user provisioning and deprovisioning](dropboxforbusiness-tutorial.md)
+* Dropbox Business supports [Automated user provisioning and deprovisioning](dropboxforbusiness-tutorial.md).
> [!NOTE]
> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
Follow these steps to enable Azure AD SSO in the Azure portal.
a. In the **Sign on URL** text box, type a URL using the following pattern: `https://www.dropbox.com/sso/<id>`-
- b. In the **Identifier (Entity ID)** text box, type the value:
+
+ b. In the **Identifier (Entity ID)** text box, type the value:
`Dropbox`-
+
> [!NOTE]
- > The preceding Sign-on URL value is not real value. You will update the value with the actual Sign-on URL, which is explained later in the tutorial.
+ > The **Dropbox Sign SSO ID** can be found in the Dropbox site at Dropbox > Admin console > Settings > Single sign-on > SSO sign-in URL.
1. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.
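Some service-provider tooling expects the downloaded Base64 certificate in PEM form. A small, hypothetical helper (not part of the tutorial) that re-wraps the raw Base64 body with the standard PEM header and footer:

```python
def to_pem(cert_base64: str) -> str:
    # Strip any whitespace from the downloaded Base64 body, re-wrap it
    # at 64 characters per line as PEM requires, and add header/footer.
    b64 = "".join(cert_base64.split())
    body = "\n".join(b64[i:i + 64] for i in range(0, len(b64), 64))
    return f"-----BEGIN CERTIFICATE-----\n{body}\n-----END CERTIFICATE-----\n"
```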
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
4. Click on the **User Icon** and select **Settings** tab.
- ![Screenshot that shows the "USER ICON" action and "Settings" selected.](./media/dropboxforbusiness-tutorial/configure1.png "Configure single sign-on")
+ ![Screenshot that shows the "USER ICON" action and "Settings" selected.](./media/dropboxforbusiness-tutorial/configure-1.png "Configure single sign-on")
5. In the navigation pane on the left side, click **Admin console**.
- ![Screenshot that shows "Admin console" selected.](./media/dropboxforbusiness-tutorial/configure2.png "Configure single sign-on")
+ ![Screenshot that shows "Admin console" selected.](./media/dropboxforbusiness-tutorial/configure-2.png "Configure single sign-on")
6. On the **Admin console**, click **Settings** in the left navigation pane.
- ![Screenshot that shows "Settings" selected.](./media/dropboxforbusiness-tutorial/configure3.png "Configure single sign-on")
+ ![Screenshot that shows "Settings" selected.](./media/dropboxforbusiness-tutorial/configure-3.png "Configure single sign-on")
7. Select **Single sign-on** option under the **Authentication** section.
- ![Screenshot that shows the "Authentication" section with "Single sign-on" selected.](./media/dropboxforbusiness-tutorial/configure4.png "Configure single sign-on")
+ ![Screenshot that shows the "Authentication" section with "Single sign-on" selected.](./media/dropboxforbusiness-tutorial/configure-4.png "Configure single sign-on")
8. In the **Single sign-on** section, perform the following steps:
- ![Screenshot that shows the "Single sign-on" configuration settings.](./media/dropboxforbusiness-tutorial/configure5.png "Configure single sign-on")
+ ![Screenshot that shows the "Single sign-on" configuration settings.](./media/dropboxforbusiness-tutorial/configure-5.png "Configure single sign-on")
a. Select **Required** as an option from the dropdown for the **Single sign-on**.
In this section, a user called B.Simon is created in Dropbox Business. Dropbox B
>[!Note]
>If you need to create a user manually, contact the [Dropbox Business Client support team](https://www.dropbox.com/business/contact).
-### Test SSO
+## Test SSO
In this section, you test your Azure AD single sign-on configuration with the following options.
In this section, you test your Azure AD single sign-on configuration with follow
## Next steps
-Once you configure Dropbox Business you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+Once you configure Dropbox Business you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
active-directory Expensify Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/expensify-tutorial.md
Previously updated : 08/12/2019 Last updated : 03/02/2021
In this tutorial, you'll learn how to integrate Expensify with Azure Active Dire
* Enable your users to be automatically signed-in to Expensify with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Expensify supports **SP** initiated SSO
+* Expensify supports **SP** initiated SSO.
+
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
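With SP-initiated SSO, the service provider sends a SAML AuthnRequest to Azure AD, typically via the HTTP-Redirect binding. As an illustrative sketch only (the SP builds these for you, real requests are fuller, and the IdP URL below is a placeholder), this is roughly how that redirect URL is formed:

```python
import base64
import urllib.parse
import zlib


def to_saml_redirect(authn_request_xml: str, idp_sso_url: str) -> str:
    # HTTP-Redirect binding: the AuthnRequest is raw-DEFLATE compressed,
    # Base64-encoded, then URL-encoded into the SAMLRequest query parameter.
    raw_deflate = zlib.compress(authn_request_xml.encode("utf-8"))[2:-4]
    encoded = base64.b64encode(raw_deflate).decode("ascii")
    return f"{idp_sso_url}?SAMLRequest={urllib.parse.quote(encoded, safe='')}"
```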
-## Adding Expensify from the gallery
+## Add Expensify from the gallery
To configure the integration of Expensify into Azure AD, you need to add Expensify from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **Expensify** in the search box.
1. Select **Expensify** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for Expensify
+## Configure and test Azure AD SSO for Expensify
Configure and test Azure AD SSO with Expensify using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Expensify.
-To configure and test Azure AD SSO with Expensify, complete the following building blocks:
+To configure and test Azure AD SSO with Expensify, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with Expensify, complete the following buildi
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Expensify** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **Expensify** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)

1. On the **Basic SAML Configuration** section, enter the values for the following fields:
- a. In the **Sign on URL** text box, type a URL:
+ a. In the **Sign on URL** text box, type the URL:
`https://www.expensify.com/authentication/saml/login`
- b. In the **Identifier (Entity ID)** text box, type a URL:
+ b. In the **Identifier (Entity ID)** text box, type the URL:
    `https://www.expensify.com`

    c. In the **Reply URL** text box, type a URL using the following pattern:
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Expensify**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
- 1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.

## Configure Expensify SSO

To enable SSO in Expensify, you first need to enable **Domain Control** in the application. You can enable Domain Control in the application through the steps listed [here](https://help.expensify.com/domain-control). For additional support, work with [Expensify Client support team](mailto:help@expensify.com). Once you have Domain Control enabled, follow these steps:
-![Configure Single Sign-On](./media/expensify-tutorial/tutorial_expensify_51.png)
+![Configure Single Sign-On](./media/expensify-tutorial/domain-control.png)
1. Sign on to your Expensify application.
In this section, you create a user called B.Simon in Expensify. Work with [Expen
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Expensify tile in the Access Panel, you should be automatically signed in to the Expensify for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in the Azure portal. This will redirect to the Expensify Sign-on URL where you can initiate the login flow.
-## Additional resources
+* Go to the Expensify Sign-on URL directly and initiate the login flow from there.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Expensify tile in My Apps, this will redirect to the Expensify Sign-on URL. For more information about My Apps, see [Introduction to My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Expensify you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
active-directory Jitbit Helpdesk Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/jitbit-helpdesk-tutorial.md
Previously updated : 03/14/2019 Last updated : 03/02/2021

# Tutorial: Azure Active Directory integration with Jitbit Helpdesk
-In this tutorial, you learn how to integrate Jitbit Helpdesk with Azure Active Directory (Azure AD).
-Integrating Jitbit Helpdesk with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Jitbit Helpdesk with Azure Active Directory (Azure AD). When you integrate Jitbit Helpdesk with Azure AD, you can:
-* You can control in Azure AD who has access to Jitbit Helpdesk.
-* You can enable your users to be automatically signed-in to Jitbit Helpdesk (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Jitbit Helpdesk.
+* Enable your users to be automatically signed-in to Jitbit Helpdesk with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Jitbit Helpdesk, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Jitbit Helpdesk single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Jitbit Helpdesk single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Jitbit Helpdesk supports **SP** initiated SSO
-
-## Adding Jitbit Helpdesk from the gallery
-
-To configure the integration of Jitbit Helpdesk into Azure AD, you need to add Jitbit Helpdesk from the gallery to your list of managed SaaS apps.
-
-**To add Jitbit Helpdesk from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Jitbit Helpdesk**, select **Jitbit Helpdesk** from result panel then click **Add** button to add the application.
+* Jitbit Helpdesk supports **SP** initiated SSO.
- ![Jitbit Helpdesk in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with Jitbit Helpdesk based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Jitbit Helpdesk needs to be established.
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
-To configure and test Azure AD single sign-on with Jitbit Helpdesk, you need to complete the following building blocks:
+## Add Jitbit Helpdesk from the gallery
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Jitbit Helpdesk Single Sign-On](#configure-jitbit-helpdesk-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Jitbit Helpdesk test user](#create-jitbit-helpdesk-test-user)** - to have a counterpart of Britta Simon in Jitbit Helpdesk that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of Jitbit Helpdesk into Azure AD, you need to add Jitbit Helpdesk from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Jitbit Helpdesk** in the search box.
+1. Select **Jitbit Helpdesk** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for Jitbit Helpdesk
-To configure Azure AD single sign-on with Jitbit Helpdesk, perform the following steps:
+Configure and test Azure AD SSO with Jitbit Helpdesk using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Jitbit Helpdesk.
-1. In the [Azure portal](https://portal.azure.com/), on the **Jitbit Helpdesk** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with Jitbit Helpdesk, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Jitbit Helpdesk SSO](#configure-jitbit-helpdesk-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Jitbit Helpdesk test user](#create-jitbit-helpdesk-test-user)** - to have a counterpart of B.Simon in Jitbit Helpdesk that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **Jitbit Helpdesk** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![Jitbit Helpdesk Domain and URLs single sign-on information](common/sp-identifier.png)
-
- a. In the **Sign on URL** text box, type a URL using the following pattern:
+ a. In the **Sign on URL** text box, type one of the URLs using the following pattern:
    | |
    |-|
    | `https://<hostname>/helpdesk/User/Login` |
To configure Azure AD single sign-on with Jitbit Helpdesk, perform the following
> [!NOTE]
> This value is not real. Update this value with the actual Sign-On URL. Contact [Jitbit Helpdesk Client support team](https://www.jitbit.com/support/) to get this value.
- b. In the **Identifier (Entity ID)** text box, type a URL as following:
+ b. In the **Identifier (Entity ID)** text box, type the URL:
    `https://www.jitbit.com/web-helpdesk/`

5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with Jitbit Helpdesk, perform the following
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
- b. Azure AD Identifier
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Jitbit Helpdesk.
- c. Logout URL
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Jitbit Helpdesk**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
-### Configure Jitbit Helpdesk Single Sign-On
+## Configure Jitbit Helpdesk SSO
1. In a different web browser window, sign in to your Jitbit Helpdesk company site as an administrator.

1. In the toolbar on the top, click **Administration**.
- ![Administration](./media/jitbit-helpdesk-tutorial/ic777681.png "Administration")
+ ![Administration](./media/jitbit-helpdesk-tutorial/settings.png "Administration")
1. Click **General settings**.
- ![Screenshot shows the General Settings link.](./media/jitbit-helpdesk-tutorial/ic777680.png "Users, companies, and permissions")
+ ![Screenshot shows the General Settings link.](./media/jitbit-helpdesk-tutorial/general.png "Users, companies, and permissions")
1. In the **Authentication settings** configuration section, perform the following steps:
- ![Authentication settings](./media/jitbit-helpdesk-tutorial/ic777683.png "Authentication settings")
+ ![Authentication settings](./media/jitbit-helpdesk-tutorial/authentication.png "Authentication settings")
a. Select **Enable SAML 2.0 single sign on**, to sign in using Single Sign-On (SSO), with **OneLogin**.
To configure Azure AD single sign-on with Jitbit Helpdesk, perform the following
d. Click **Save changes**.
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Jitbit Helpdesk.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Jitbit Helpdesk**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Jitbit Helpdesk**.
-
- ![The Jitbit Helpdesk link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
-
### Create Jitbit Helpdesk test user

In order to enable Azure AD users to sign in to Jitbit Helpdesk, they must be provisioned into Jitbit Helpdesk. In the case of Jitbit Helpdesk, provisioning is a manual task.
In order to enable Azure AD users to sign in to Jitbit Helpdesk, they must be pr
1. In the menu on the top, click **Administration**.
- ![Administration](./media/jitbit-helpdesk-tutorial/ic777681.png "Administration")
+ ![Administration](./media/jitbit-helpdesk-tutorial/settings.png "Administration")
1. Click **Users, companies and permissions**.
- ![Users, companies, and permissions](./media/jitbit-helpdesk-tutorial/ic777682.png "Users, companies, and permissions")
+ ![Users, companies, and permissions](./media/jitbit-helpdesk-tutorial/users.png "Users, companies, and permissions")
1. Click **Add user**.
- ![Add user](./media/jitbit-helpdesk-tutorial/ic777685.png "Add user")
+ ![Add user](./media/jitbit-helpdesk-tutorial/add.png "Add user")
1. In the Create section, type the data of the Azure AD account you want to provision as follows:
- ![Create](./media/jitbit-helpdesk-tutorial/ic777686.png "Create")
+ ![Create](./media/jitbit-helpdesk-tutorial/create-section.png "Create")
a. In the **Username** textbox, type the username of the user like **BrittaSimon**.
In order to enable Azure AD users to sign in to Jitbit Helpdesk, they must be pr
> [!NOTE]
> You can use any other Jitbit Helpdesk user account creation tools or APIs provided by Jitbit Helpdesk to provision Azure AD user accounts.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Jitbit Helpdesk tile in the Access Panel, you should be automatically signed in to the Jitbit Helpdesk for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in the Azure portal. This will redirect to the Jitbit Helpdesk Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to the Jitbit Helpdesk Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Jitbit Helpdesk tile in My Apps, you will be redirected to the Jitbit Helpdesk Sign-on URL. For more information about My Apps, see [Introduction to My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Jitbit Helpdesk, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
active-directory Jive Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/jive-tutorial.md
Previously updated : 01/16/2020 Last updated : 03/02/2021
In this tutorial, you'll learn how to integrate Jive with Azure Active Directory
* Enable your users to be automatically signed-in to Jive with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Jive supports **SP** initiated SSO
-* Jive supports [**Automated** user provisioning](jive-provisioning-tutorial.md)
-* Once you configure the Jive you can enforce session controls, which protect exfiltration and infiltration of your organization's sensitive data in real-time. Session controls extend from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad)
+* Jive supports **SP** initiated SSO.
+* Jive supports [**Automated** user provisioning](jive-provisioning-tutorial.md).
-## Adding Jive from the gallery
+## Add Jive from the gallery
To configure the integration of Jive into Azure AD, you need to add Jive from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
To configure the integration of Jive into Azure AD, you need to add Jive from th
1. Select **Jive** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on for Jive
+## Configure and test Azure AD SSO for Jive
Configure and test Azure AD SSO with Jive using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Jive.
-To configure and test Azure AD SSO with Jive, complete the following building blocks:
+To configure and test Azure AD SSO with Jive, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
- * **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
- * **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
1. **[Configure Jive SSO](#configure-jive-sso)** - to configure the single sign-on settings on application side.
- * **[Create Jive test user](#create-jive-test-user)** - to have a counterpart of B.Simon in Jive that is linked to the Azure AD representation of user.
+ 1. **[Create Jive test user](#create-jive-test-user)** - to have a counterpart of B.Simon in Jive that is linked to the Azure AD representation of user.
1. **[Test SSO](#test-sso)** - to verify whether the configuration works.

## Configure Azure AD SSO

Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Jive** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Jive** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure AD Identifier
-
- c. Logout URL
-
### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Jive**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.

## Configure Jive SSO
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the menu on the top, Click **SAML**.
- ![Screenshot shows the SAML tab with Enabled selected.](./media/jive-tutorial/tutorial_jive_002.png)
+ ![Screenshot shows the SAML tab with Enabled selected.](./media/jive-tutorial/jive-2.png)
a. Select **Enabled** under the **General** tab.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. Navigate to the **IDP METADATA** tab.
- ![Screenshot shows the SAML tab I D P METADATA selected.](./media/jive-tutorial/tutorial_jive_003.png)
+ ![Screenshot shows the SAML tab I D P METADATA selected.](./media/jive-tutorial/jive-3.png)
a. Copy the content of the downloaded metadata XML file, and then paste it into the **Identity Provider (IDP) Metadata** textbox.
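 For orientation, Azure AD federation metadata follows the SAML 2.0 metadata schema. The following is an abridged, illustrative sketch only; a real file is tenant-specific, and the tenant ID and certificate shown here are placeholders:

 ```xml
 <!-- Abridged, illustrative sketch; a real Azure AD federation metadata
      file includes the full X.509 signing certificate for your tenant. -->
 <EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata"
                   entityID="https://sts.windows.net/{tenant-id}/">
   <IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
     <KeyDescriptor use="signing">
       <!-- X.509 signing certificate data appears here -->
     </KeyDescriptor>
     <SingleSignOnService
         Binding="urn:oasis:names:tc:SAML:2.0:bindings:HTTP-Redirect"
         Location="https://login.microsoftonline.com/{tenant-id}/saml2" />
   </IDPSSODescriptor>
 </EntityDescriptor>
 ```

 Paste the entire downloaded file, including the certificate, into the textbox; Jive uses it to validate SAML responses signed by Azure AD.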
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. Select **USER ATTRIBUTE MAPPING** tab.
- ![Screenshot shows the SAML tab with USER ATTRIBUTE MAPPING selected.](./media/jive-tutorial/tutorial_jive_004.png)
+ ![Screenshot shows the SAML tab with USER ATTRIBUTE MAPPING selected.](./media/jive-tutorial/jive-4.png)
a. In the **Email** textbox, copy and paste the attribute name of **mail** value.
If you need to create user manually, work with [Jive Client support team](https:
## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
-
-When you click the Jive tile in the Access Panel, you should be automatically signed in to the Jive for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
-
-## Additional resources
-
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
-
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+In this section, you test your Azure AD single sign-on configuration with the following options.
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+* Click on **Test this application** in the Azure portal. This will redirect to the Jive Sign-on URL where you can initiate the login flow.
-- [Try Jive with Azure AD](https://aad.portal.azure.com/)
+* Go to the Jive Sign-on URL directly and initiate the login flow from there.
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+* You can use Microsoft My Apps. When you click the Jive tile in My Apps, you will be redirected to the Jive Sign-on URL. For more information about My Apps, see [Introduction to My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
-- [Configure User Provisioning](jive-provisioning-tutorial.md)
+## Next steps
-- [How to protect Jive with advanced visibility and controls](/cloud-app-security/proxy-intro-aad)
+Once you configure Jive, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
active-directory Learning At Work Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/learning-at-work-tutorial.md
Previously updated : 08/01/2019 Last updated : 02/25/2021
In this tutorial, you'll learn how to integrate Learning at Work with Azure Acti
* Enable your users to be automatically signed-in to Learning at Work with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Learning at Work supports **SP** initiated SSO
+* Learning at Work supports **SP** initiated SSO.
-## Adding Learning at Work from the gallery
+## Add Learning at Work from the gallery
To configure the integration of Learning at Work into Azure AD, you need to add Learning at Work from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
1. In the **Add from the gallery** section, type **Learning at Work** in the search box.
1. Select **Learning at Work** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-
-## Configure and test Azure AD single sign-on
+## Configure and test Azure AD SSO for Learning at Work
Configure and test Azure AD SSO with Learning at Work using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Learning at Work.
-To configure and test Azure AD SSO with Learning at Work, complete the following building blocks:
+To configure and test Azure AD SSO with Learning at Work, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
-2. **[Configure Learning at Work SSO](#configure-learning-at-work-sso)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
-5. **[Create Learning at Work test user](#create-learning-at-work-test-user)** - to have a counterpart of B.Simon in Learning at Work that is linked to the Azure AD representation of user.
-6. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Learning at Work SSO](#configure-learning-at-work-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Learning at Work test user](#create-learning-at-work-test-user)** - to have a counterpart of B.Simon in Learning at Work that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Learning at Work** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **Learning at Work** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
Follow these steps to enable Azure AD SSO in the Azure portal.
5. Learning at Work application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes, whereas **nameidentifier** is mapped with **user.userprincipalname**.
- You can update the **nameidentifier** value in Azure AD based on your Organization setup and this value needs to match with the **User ID** in the SABA cloud, for that you need to edit the attribute mapping by clicking on **Edit** icon and change the attribute mapping.
+ You can update the **nameidentifier** value in Azure AD based on your organization's setup. This value needs to match the **User ID** in the SABA cloud. To change it, click the **pencil** icon and edit the attribute mapping.
![image](common/edit-attribute.png)
-4. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
+6. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer.
![The Certificate download link](common/metadataxml.png)
Follow these steps to enable Azure AD SSO in the Azure portal.
![Copy configuration URLs](common/copy-configuration-urls.png)
-### Configure Learning at Work SSO
-
-To configure single sign-on on **Learning at Work** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Learning at Work support team](https://www.learninga-z.com/site/contact/support). They set this setting to have the SAML SSO connection set properly on both side
-
### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Learning at Work**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
- 1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
-1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
1. In the **Add Assignment** dialog, click the **Assign** button.
+## Configure Learning at Work SSO
+
+To configure single sign-on on **Learning at Work** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Learning at Work support team](https://www.learninga-z.com/site/contact/support). They set this setting to have the SAML SSO connection set properly on both sides.
+
### Create Learning at Work test user

In this section, you create a user called B.Simon in Learning at Work. Work with [Learning at Work support team](https://www.learninga-z.com/site/contact/support) to add the users in the Learning at Work platform. Users must be created and activated before you use single sign-on.
-### Test SSO
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Learning at Work tile in the Access Panel, you should be automatically signed in to the Learning at Work for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in the Azure portal. This will redirect to the Learning at Work Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to the Learning at Work Sign-on URL directly and initiate the login flow from there.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Learning at Work tile in My Apps, you will be redirected to the Learning at Work Sign-on URL. For more information about My Apps, see [Introduction to My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Learning at Work, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
active-directory Printerlogic Saas Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/printerlogic-saas-tutorial.md
Previously updated : 12/18/2020 Last updated : 02/25/2021
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* PrinterLogic SaaS supports **SP and IDP** initiated SSO
-* PrinterLogic SaaS supports **Just In Time** user provisioning
+* PrinterLogic SaaS supports **SP and IDP** initiated SSO.
+* PrinterLogic SaaS supports **Just In Time** user provisioning.
-## Adding PrinterLogic SaaS from the gallery
+## Add PrinterLogic SaaS from the gallery
To configure the integration of PrinterLogic SaaS into Azure AD, you need to add PrinterLogic SaaS from the gallery to your list of managed SaaS apps.
In this section, you test your Azure AD single sign-on configuration with follow
#### SP initiated:
-* Click on **Test this application** in Azure portal. This will redirect to AskYourTeam Sign on URL where you can initiate the login flow.
+* Click on **Test this application** in the Azure portal. This will redirect to the PrinterLogic SaaS Sign-on URL where you can initiate the login flow.
-* Go to AskYourTeam Sign-on URL directly and initiate the login flow from there.
+* Go to the PrinterLogic SaaS Sign-on URL directly and initiate the login flow from there.
#### IDP initiated:
-* Click on **Test this application** in Azure portal and you should be automatically signed in to the AskYourTeam for which you set up the SSO
-
-You can also use Microsoft My Apps to test the application in any mode. When you click the AskYourTeam tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the AskYourTeam for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the PrinterLogic SaaS for which you set up the SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the PrinterLogic SaaS tile in My Apps, if configured in SP mode you would be redirected to the application sign-on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the PrinterLogic SaaS for which you set up the SSO. For more information about My Apps, see [Introduction to My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
## Next steps
-Once you configure PrinterLogic SaaS you can enforce session control, which protects exfiltration and infiltration of your organizationΓÇÖs sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+Once you configure PrinterLogic SaaS, you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
active-directory Smartlook Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/smartlook-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Smartlook | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Smartlook.
+ Last updated : 03/04/2021
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Smartlook
+
+In this tutorial, you'll learn how to integrate Smartlook with Azure Active Directory (Azure AD). When you integrate Smartlook with Azure AD, you can:
+
+* Control in Azure AD who has access to Smartlook.
+* Enable your users to be automatically signed-in to Smartlook with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Smartlook single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Smartlook supports **SP and IDP** initiated SSO.
+* Smartlook supports **Just In Time** user provisioning.
+
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
+
+## Adding Smartlook from the gallery
+
+To configure the integration of Smartlook into Azure AD, you need to add Smartlook from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Smartlook** in the search box.
+1. Select **Smartlook** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
++
+## Configure and test Azure AD SSO for Smartlook
+
+Configure and test Azure AD SSO with Smartlook using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Smartlook.
+
+To configure and test Azure AD SSO with Smartlook, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Smartlook SSO](#configure-smartlook-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Smartlook test user](#create-smartlook-test-user)** - to have a counterpart of B.Simon in Smartlook that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Smartlook** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, the user does not have to perform any step as the app is already pre-integrated with Azure.
+
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type the URL:
+ `https://app.smartlook.com/sign/sso`
+
+1. Click **Save**.
+
+1. Smartlook application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![image](common/default-attributes.png)
+
+1. In addition to the above, the Smartlook application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also pre-populated, but you can review them as per your requirements.
+
+ | Name | Source Attribute|
+ | - | |
+ | urn:oasis:names:tc:SAML:attribute:subject-id | user.userprincipalname |
+ | urn:oid:0.9.2342.19200300.100.1.3 | user.mail |
+ |
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url** and save it on your computer.
+
+ ![The Certificate download link](common/copy-metadataurl.png)
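+
+ As an illustration of the attribute mappings configured above, the two additional attributes would appear in the issued SAML assertion roughly as follows. This is a sketch only; the values are placeholders, not output from a real tenant:
+
+ ```xml
+ <!-- Illustrative sketch; attribute values below are placeholders. -->
+ <AttributeStatement xmlns="urn:oasis:names:tc:SAML:2.0:assertion">
+   <!-- Mapped from user.userprincipalname -->
+   <Attribute Name="urn:oasis:names:tc:SAML:attribute:subject-id">
+     <AttributeValue>b.simon@contoso.com</AttributeValue>
+   </Attribute>
+   <!-- Mapped from user.mail -->
+   <Attribute Name="urn:oid:0.9.2342.19200300.100.1.3">
+     <AttributeValue>b.simon@contoso.com</AttributeValue>
+   </Attribute>
+ </AttributeStatement>
+ ```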
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Smartlook.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Smartlook**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Smartlook SSO
+
+To configure single sign-on on **Smartlook** side, you need to send the **App Federation Metadata Url** to [Smartlook support team](mailto:info@smartlook.com). They set this setting to have the SAML SSO connection set properly on both sides.
+
+### Create Smartlook test user
+
+In this section, a user called Britta Simon is created in Smartlook. Smartlook supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Smartlook, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click on **Test this application** in the Azure portal. This will redirect to the Smartlook Sign-on URL where you can initiate the login flow.
+
+* Go to the Smartlook Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to Smartlook, for which you set up SSO.
+
+You can also use Microsoft My Apps to test the application in any mode. When you click the Smartlook tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow; if configured in IDP mode, you should be automatically signed in to Smartlook, for which you set up SSO. For more information about My Apps, see [Introduction to the My Apps](https://docs.microsoft.com/azure/active-directory/active-directory-saas-access-panel-introduction).
+
+## Next steps
+
+Once you configure Smartlook you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](https://docs.microsoft.com/cloud-app-security/proxy-deployment-any-app).
++
active-directory Tutorial List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/tutorial-list.md
To find more tutorials, use the table of contents on the left.
| ![logo-InVision](./medi)| | ![logo-Jamf Pro](./medi)| | ![logo-Kanbanize](./medi)|
+| ![logo-Kendis - Azure AD Integration](./medi)|
| ![logo-Knowledge Anywhere LMS](./medi)| | ![logo-Litmus](./medi)| | ![logo-Marketo](./medi)|
active-directory Zscaler Internet Access Administrator Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/zscaler-internet-access-administrator-tutorial.md
Previously updated : 12/18/2020 Last updated : 02/25/2021 # Tutorial: Azure Active Directory integration with Zscaler Internet Access Administrator
In this tutorial, you'll learn how to integrate Zscaler Internet Access Administ
## Prerequisites
-To configure Azure AD integration with Zscaler Internet Access Administrator, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Zscaler Internet Access Administrator subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Zscaler Internet Access Administrator single sign-on (SSO) enabled subscription.
> [!NOTE] > This integration is also available to use from Azure AD US Government Cloud environment. You can find this application in the Azure AD US Government Cloud Application Gallery and configure it in the same way as you do from public cloud.
To configure Azure AD integration with Zscaler Internet Access Administrator, yo
In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Zscaler Internet Access Administrator supports **IDP** initiated SSO
+* Zscaler Internet Access Administrator supports **IDP** initiated SSO.
-## Adding Zscaler Internet Access Administrator from the gallery
+## Add Zscaler Internet Access Administrator from the gallery
To configure the integration of Zscaler Internet Access Administrator into Azure AD, you need to add Zscaler Internet Access Administrator from the gallery to your list of managed SaaS apps.
To configure and test Azure AD SSO with Zscaler Internet Access Administrator, p
1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on. 2. **[Configure Zscaler Internet Access Administrator SSO](#configure-zscaler-internet-access-administrator-sso)** - to configure the Single Sign-On settings on application side. 1. **[Create Zscaler Internet Access Administrator test user](#create-zscaler-internet-access-administrator-test-user)** - to have a counterpart of Britta Simon in Zscaler Internet Access Administrator that is linked to the Azure AD representation of user.
-6. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+3. **[Test SSO](#test-sso)** - to verify whether the configuration works.
## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Basic SAML Configuration** section, enter the values for the following fields:
- a. In the **Identifier** text box, type a URL as per your requirement:
+ a. In the **Identifier** text box, type one of the following URLs as per your requirement:
| Identifier | ||
Follow these steps to enable Azure AD SSO in the Azure portal.
| `https://admin.zscloud.net` | | `https://admin.zscalerbeta.net` |
- b. In the **Reply URL** text box, type a URL as per your requirement:
+ b. In the **Reply URL** text box, type one of the following URLs as per your requirement:
| Reply URL | |--|
Follow these steps to enable Azure AD SSO in the Azure portal.
5. The Zscaler Internet Access Administrator application expects the SAML assertions in a specific format. Configure the following claims for this application. You can manage the values of these attributes from the **User Attributes & Claims** section on the application integration page. On the **Set up Single Sign-On with SAML** page, click the **Edit** button to open the **User Attributes & Claims** dialog.
- ![The Attribute link](./media/zscaler-internet-access-administrator-tutorial/tutorial_zscaler-internet_attribute.png)
+ ![The Attribute link](./media/zscaler-internet-access-administrator-tutorial/attributes.png)
6. In the **User Claims** section on the **User Attributes** dialog, configure SAML token attribute as shown in the image above and perform the following steps:
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. If you have set up the roles as explained above, you can select one from the **Select a role** dropdown. 1. In the **Add Assignment** dialog, click the **Assign** button. - ## Configure Zscaler Internet Access Administrator SSO 1. In a different web browser window, log in to your Zscaler Internet Access Admin UI. 2. Go to **Administration > Administrator Management**, perform the following steps, and click Save:
- ![Screenshot shows Administrator Management with options to Enable SAML Authentication, upload S S L Certificate and specify an Issuer.](./media/zscaler-internet-access-administrator-tutorial/AdminSSO.png "Administration")
+ ![Screenshot shows Administrator Management with options to Enable SAML Authentication, upload S S L Certificate and specify an Issuer.](./media/zscaler-internet-access-administrator-tutorial/management.png "Administration")
a. Check **Enable SAML Authentication**.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
3. On the Admin UI, perform the following steps:
- ![Screenshot shows the Admin U I where you can perform the steps.](./media/zscaler-internet-access-administrator-tutorial/ic800207.png)
+ ![Screenshot shows the Admin U I where you can perform the steps.](./media/zscaler-internet-access-administrator-tutorial/activation.png)
a. Hover over the **Activation** menu near the bottom left.
For steps on how to create an Administrator account, refer to Zscaler documentat
https://help.zscaler.com/zia/adding-admins
-### Test SSO
+## Test SSO
In this section, you test your Azure AD single sign-on configuration with the following options.
api-management Howto Protect Backend Frontend Azure Ad B2c https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/api-management/howto-protect-backend-frontend-azure-ad-b2c.md
Title: Protect SPA backend with OAuth 2.0 by using Azure Active Directory B2C and Azure API Management.
-description: Protect an API with OAuth 2.0 by using Azure Active Directory B2C, Azure API Management and Easy Auth to be called from a JavaScript SPA.
+ Title: Protect SPA backend in Azure API Management with Active Directory B2C
+description: Protect an API with OAuth 2.0 by using Azure Active Directory B2C, Azure API Management and Easy Auth to be called from a JavaScript SPA using the PKCE enabled SPA Auth Flow.
documentationcenter: ''
na ms.devlang: na Previously updated : 02/20/2020 Last updated : 02/18/2021 -+ # Protect SPA backend with OAuth 2.0, Azure Active Directory B2C and Azure API Management This scenario shows you how to configure your Azure API Management instance to protect an API.
-We'll use the OAuth 2.0 protocol with Azure AD B2C, alongside API Management to secure an Azure Functions backend using EasyAuth.
+We'll use the Azure AD B2C SPA (Auth Code + PKCE) flow to acquire a token, alongside API Management to secure an Azure Functions backend using EasyAuth.
## Aims
-We're going to see how API Management can be used in a simplified scenario with Azure Functions and Azure AD B2C. You will create a JavaScript (JS) app calling an API, that signs in users with Azure AD B2C. Then you'll use API Management's validate-jwt policy features to protect the Backend API.
-For defense in depth, we then use EasyAuth to validate the token again inside the back-end API.
+We're going to see how API Management can be used in a simplified scenario with Azure Functions and Azure AD B2C. You'll create a JavaScript (JS) app calling an API, that signs in users with Azure AD B2C. Then you'll use API Management's validate-jwt, CORS, and Rate Limit By Key policy features to protect the Backend API.
+
+For defense in depth, we then use EasyAuth to validate the token again inside the back-end API and ensure that API management is the only service that can call the Azure Functions backend.
+
+## What will you learn
+
+> [!div class="checklist"]
+> * Setup of a Single Page App and backend API in Azure Active Directory B2C
+> * Creation of an Azure Functions Backend API
+> * Import of an Azure Functions API into Azure API Management
+> * Securing the API in Azure API Management
+> * Calling the Azure Active Directory B2C Authorization Endpoints via the Microsoft Identity Platform Libraries (MSAL.js)
+> * Storing an HTML / Vanilla JS Single Page Application and serving it from an Azure Blob Storage Endpoint
## Prerequisites+ To follow the steps in this article, you must have:
-* An Azure (StorageV2) General Purpose V2 Storage Account to host the frontend JS Single Page App
-* An Azure API Management instance
-* An empty Azure Function app (running the V2 .NET Core runtime, on a Windows Consumption Plan) to host the called API
-* An Azure AD B2C tenant, linked to a subscription
+
+* An Azure (StorageV2) General Purpose V2 Storage Account to host the frontend JS Single Page App.
+* An Azure API Management instance (any tier will work, including 'Consumption'; however, certain features applicable to the full scenario, namely rate-limit-by-key and a dedicated Virtual IP, are not available in that tier, and these restrictions are called out below in the article where appropriate).
+* An empty Azure Function app (running the V3.1 .NET Core runtime, on a Consumption Plan) to host the called API.
+* An Azure AD B2C tenant, linked to a subscription.
Although in practice you would use resources in the same region in production workloads, for this how-to article the region of deployment isn't important. ## Overview
-Here is an illustration of the components in use and the flow between them once this process is complete.
+
+Here's an illustration of the components in use and the flow between them once this process is complete.
![Components in use and flow](../api-management/media/howto-protect-backend-frontend-azure-ad-b2c/image-arch.png "Components in use and flow")
-Here is a quick overview of the steps:
+Here's a quick overview of the steps:
1. Create the Azure AD B2C Calling (Frontend, API Management) and API Applications with scopes and grant API Access
-1. Create the sign up or sign in policies to allow users to sign in with Azure AD B2C
+1. Create the sign up and sign in policies to allow users to sign in with Azure AD B2C
1. Configure API Management with the new Azure AD B2C Client IDs and keys to Enable OAuth2 user authorization in the Developer Console 1. Build the Function API
-1. Configure the Function API to enable EasyAuth with the new Azure AD B2C Client ID's and Keys and lock down to APIM VIP
+1. Configure the Function API to enable EasyAuth with the new Azure AD B2C Client IDs and Keys and lock down to the APIM VIP
1. Build the API Definition in API Management 1. Set up Oauth2 for the API Management API configuration 1. Set up the **CORS** policy and add the **validate-jwt** policy to validate the OAuth token for every incoming request
Here is a quick overview of the steps:
1. Configure the Sample JS Client App with the new Azure AD B2C Client IDs and keys 1. Test the Client Application
-## Configure Azure AD B2C
+ > [!TIP]
+ > We're going to capture quite a few pieces of information (IDs, keys, URLs) as we walk through this document. You might find it handy to have a text editor open to store the following items of configuration temporarily.
+ >
+ > B2C BACKEND CLIENT ID:
+ > B2C BACKEND CLIENT SECRET KEY:
+ > B2C BACKEND API SCOPE URI:
+ > B2C FRONTEND CLIENT ID:
+ > B2C USER FLOW ENDPOINT URI:
+ > B2C WELL-KNOWN OPENID ENDPOINT:
+ > B2C POLICY NAME: Frontendapp_signupandsignin
+ > FUNCTION URL:
+ > APIM API BASE URL:
+ > STORAGE PRIMARY ENDPOINT URL:
+
+## Configure the backend application
+ Open the Azure AD B2C blade in the portal and do the following steps.
-1. Select the **Applications** tab
-1. Click the 'Add' button and create three applications for the following purposes
- * The Frontend Client.
- * The Backend Function API.
- * [Optional] The API Management developer portal (unless you're running Azure API Management in the consumption tier, more on this scenario later).
-1. Set WebApp / Web API for all 3 applications and set 'Allow Implicit flow' to yes for only the Frontend Client.
-1. Now set the App ID URI, choose something unique and relevant to the service being created.
-1. Use placeholders for the reply urls for now such as https://localhost, we'll update those urls later.
-1. Click 'Create', then repeat steps 2-5 for each of the three apps above, recording the AppID URI, name, and Application ID for later use for all three apps.
-1. Open the API Management Developer Portal Application from the list of applications and select the *Keys* tab (under General) then click 'Generate Key' to generate an auth key
-1. Upon clicking save, record the key somewhere safe for later use - note that this place is the ONLY chance will you get to view and copy this key.
-1. Now select the *Published Scopes* Tab (Under API Access)
-1. Create and name a scope for your Function API and record the Scope and populated Full Scope Value, then click 'Save'.
+
+1. Select the **App Registrations** tab
+1. Click the 'New Registration' button.
+1. Choose 'Web' from the Redirect URI selection box.
+1. Now set the Display Name, choose something unique and relevant to the service being created. In this example, we will use the name "Backend Application".
+1. Use placeholders for the reply URLs, like 'https://jwt.ms' (a Microsoft-owned token decoding site); we'll update those URLs later.
+1. Ensure you have selected the "Accounts in any identity provider or organizational directory (for authenticating users with user flows)" option
+1. For this sample, uncheck the "Grant admin consent" box, as we won't require offline_access permissions today.
+1. Click 'Register'.
+1. Record the Backend Application Client ID for later use (shown under 'Application (client) ID').
+1. Select the *Certificates and Secrets* tab (under Manage) then click 'New Client Secret' to generate an auth key (Accept the default settings and click 'Add').
+1. Upon clicking 'Add', copy the key (under 'value') somewhere safe for later use as the 'Backend client secret' - note that this dialog is the ONLY chance you'll have to copy this key.
+1. Now select the *Expose an API* Tab (Under Manage).
+1. You will be prompted to set the App ID URI; select and record the default value.
+1. Create and name the scope "Hello" for your Function API. You can use the phrase 'Hello' for all of the enterable options. Record the populated Full Scope Value URI, then click 'Add Scope'.
+1. Return to the root of the Azure AD B2C blade by selecting the 'Azure AD B2C' breadcrumb at the top left of the portal.
+ > [!NOTE] > Azure AD B2C scopes are effectively permissions within your API that other applications can request access to via the API access blade from their applications, effectively you just created application permissions for your called API.
-1. Open the other two applications and then look under the *API Access* tab.
-1. Grant them access to the backend API scope and the default one that was already there ("Access the user's profile").
-1. Generate them a key each by selecting the *Keys* tab under 'General' to generate an auth key and record those keys somewhere safe for later.
-
-## Create a "Sign up or Sign in" user flow
-1. Return to the root (Or 'Overview') of the Azure AD B2C Blade
-1. Then select ΓÇ£User Flows (Policies)ΓÇ¥ and click "New user flow"
-1. Choose the 'Sign up and sign in' user flow type
-1. Give the policy a name and record it for later.
-1. Then Under 'Identity providers', then check 'User ID sign up' (this may say 'Email sign up') and click OK.
-1. Under 'User Attributes and claims', click 'Show More...' then choose the claim options that you want your users to enter and have returned in the token. Check at least 'Display Name' and 'Email Address' to collect and return, and click 'OK', then click 'Create'.
-1. Select the policy that you created in the list, then click the 'Run user flow' button.
-1. This action will open the run user flow blade, select the frontend application, then record the address of the b2clogin.com domain that's shown under the dropdown for 'Select domain'.
-1. Click on the link at the top to open the 'well-known openid configuration endpoint', and record the authorization_endpoint and token_endpoint values as well of the value of the link itself as the well-known openid configuration endpoint.
+
+## Configure the frontend application
+
+1. Select the **App Registrations** tab
+1. Click the 'New Registration' button.
+1. Choose 'Single Page Application (SPA)' from the Redirect URI selection box.
+1. Now set the Display Name and App ID URI, choosing something unique and relevant to the Frontend application that will use this AAD B2C app registration. In this example, you can use "Frontend Application".
+1. As per the first app registration, leave the supported account types selection at its default (authenticating users with user flows).
+1. Use placeholders for the reply URLs, like 'https://jwt.ms' (a Microsoft-owned token decoding site); we'll update those URLs later.
+1. Leave the grant admin consent box ticked.
+1. Click 'Register'.
+1. Record the Frontend Application Client ID for later use (shown under 'Application (client) ID').
+1. Switch to the *API Permissions* tab.
+1. Grant access to the backend application by clicking 'Add a permission', then 'My APIs', select the 'Backend Application', select 'Permissions', select the scope you created in the previous section, and click 'Add permissions'.
+1. Click 'Grant admin consent for {tenant}' and click 'Yes' from the popup dialog. This popup consents the "Frontend Application" to use the permission "Hello" defined in the "Backend Application" created earlier.
+1. All permissions should now show for the app with a green tick under the status column.
+
+## Create a "Sign up and Sign in" user flow
+
+1. Return to the root of the B2C blade by selecting the Azure AD B2C breadcrumb.
+1. Switch to the 'User Flows' (Under Policies) tab.
+1. Click "New user flow"
+1. Choose the 'Sign up and sign in' user flow type, and select 'Recommended' and then 'Create'
+1. Give the policy a name and record it for later. For this example, you can use "Frontendapp_signupandsignin"; note that this will be prefixed with "B2C_1_" to make "B2C_1_Frontendapp_signupandsignin".
+1. Under 'Identity providers' and "Local accounts", check 'Email sign up' (or 'User ID sign up' depending on the config of your B2C tenant) and click OK. This configuration is because we'll be registering local B2C accounts, not deferring to another identity provider (like a social identity provider) to use a user's existing social media account.
+1. Leave the MFA and conditional access settings at their defaults.
+1. Under 'User Attributes and claims', click 'Show More...' then choose the claim options that you want your users to enter and have returned in the token. Check at least 'Display Name' and 'Email Address' to collect, with 'Display Name' and 'Email Addresses' to return (pay careful attention to the fact that you are collecting 'Email Address', singular, and asking to return 'Email Addresses', plural), and click 'OK', then click 'Create'.
+1. Click on the user flow that you created in the list, then click the 'Run user flow' button.
+1. This action will open the run user flow blade, select the frontend application, copy the user flow endpoint and save it for later.
+1. Copy and store the link at the top, recording as the 'well-known openid configuration endpoint' for later use.
> [!NOTE]
- > B2C Policies allow you to expose the Azure AD B2C login endpoints to be able to capture different data components and sign in users in different ways.
- > In this case we configured a sign up or sign in endpoint, which exposed a well-known configuration endpoint, specifically our created policy was identified in the URL by the p= parameter.
+ > B2C Policies allow you to expose the Azure AD B2C login endpoints to be able to capture different data components and sign in users in different ways.
>
- > Once this is done, you now have a functional Business to Consumer identity platform that will sign users into multiple applications.
- > If you want to you can click the 'Run user flow' button here (to go through the sign up or sign in process) and get a feel for what it will do in practice, but the redirection step at the end will fail as the app has not yet been deployed.
+ > In this case we configured a sign up or sign in flow (policy). This also exposed a well-known configuration endpoint, in both cases our created policy was identified in the URL by the "p=" query string parameter.
+ >
+ > Once this is done, you now have a functional Business to Consumer identity platform that will sign users into multiple applications.
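To make the `p=` query string parameter concrete, the sketch below assembles a hypothetical B2C authorize request URL in Node.js. The tenant name, client ID, and scope URI are placeholders, not values from your tenant, and in the real app MSAL.js builds this request for you:

```javascript
// Illustrative only: how a B2C authorization request carries the user flow
// (policy) name in the "p=" query string parameter. All values below are
// placeholders for the ones you recorded earlier.
function buildAuthorizeUrl({ tenant, policy, clientId, redirectUri, scope }) {
  const url = new URL(
    `https://${tenant}.b2clogin.com/${tenant}.onmicrosoft.com/oauth2/v2.0/authorize`
  );
  url.searchParams.set("p", policy); // identifies the user flow to run
  url.searchParams.set("client_id", clientId);
  url.searchParams.set("response_type", "code"); // SPA (Auth Code + PKCE) flow
  url.searchParams.set("redirect_uri", redirectUri);
  url.searchParams.set("scope", scope);
  return url.toString();
}

const authorizeUrl = buildAuthorizeUrl({
  tenant: "contoso", // placeholder tenant name
  policy: "B2C_1_Frontendapp_signupandsignin",
  clientId: "00000000-0000-0000-0000-000000000000", // placeholder client ID
  redirectUri: "https://jwt.ms",
  scope: "openid https://contoso.onmicrosoft.com/hello/Hello", // placeholder scope URI
});
console.log(authorizeUrl);
```

Pasting a URL like this into a browser starts the sign up and sign in user flow you just created.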
## Build the function API
-1. Switch back to your standard Azure AD tenant in the Azure portal so we can configure items in your subscription again
-1. Go to the Function Apps blade of the Azure portal, open your empty function app, then create a new In-Portal 'Webhook + API' function via the quickstart.
-1. Paste the sample code from below into Run.csx over the existing code that appears.
+
+1. Switch back to your standard Azure AD tenant in the Azure portal so we can configure items in your subscription again.
+1. Go to the Function Apps blade of the Azure portal, open your empty function app, then click 'Functions', click 'Add'.
+1. In the flyout that appears, choose 'Develop in portal', then under 'Select a template' choose 'HTTP trigger'. Under 'Template details', name it 'hello' with authorization level 'Function', then select Add.
+1. Switch to the Code + Test blade and copy-paste the sample code from below *over the existing code* that appears.
+1. Select Save.
```csharp
Open the Azure AD B2C blade in the portal and do the following steps.
```
- > [!NOTE]
+ > [!TIP]
> The c# script function code you just pasted simply logs a line to the functions logs, and returns the text "Hello World" with some dynamic data (the date and time).
-3. Select "Integrate" from the left-hand blade, then select 'Advanced Editor' in the top-right-hand corner of the pane.
-4. Paste the sample code below over the existing json.
-
- ```json
- {
- "bindings": [
- {
- "authLevel": "function",
- "name": "req",
- "type": "httpTrigger",
- "direction": "in",
- "methods": [
- "get"
- ],
- "route": "hello"
- },
- {
- "name": "$return",
- "type": "http",
- "direction": "out"
- }
- ]
- }
- ```
-
-5. Switch back to the HttpTrigger1 tab, click 'Get Function URL', then copy the URL that appears.
+1. Select "Integration" from the left-hand blade, then click the http (req) link inside the 'Trigger' box.
+1. From the 'Selected HTTP methods' dropdown, uncheck the http POST method, leaving only GET selected, then click Save.
+1. Switch back to the Code + Test tab, click 'Get Function URL', then copy the URL that appears and save it for later.
> [!NOTE]
- > The bindings you just created simply tell Functions to respond on anonymous http GET requests to the URL you just copied. (`https://yourfunctionappname.azurewebsites.net/api/hello?code=secretkey`)
- > Now we have a scalable serverless https API, that is capable of returning a very simple payload.
- > You can now test calling this API from a web browser using the URL above, you can also strip the ?code=secret portion of the URL and prove that Azure Functions will return a 401 error.
+ > The bindings you just created simply tell Functions to respond on anonymous http GET requests to the URL you just copied (`https://yourfunctionappname.azurewebsites.net/api/hello?code=secretkey`). Now we have a scalable serverless https API, that is capable of returning a very simple payload.
+ >
+ > You can now test calling this API from a web browser using your version of the URL above that you just copied and saved. You can also remove the query string parameter "?code=secretkey" portion of the URL, and test again, to prove that Azure Functions will return a 401 error.
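If you want to script the quick check described in the note, the following Node.js snippet (illustrative only, with the placeholder URL from the note) strips the `code` key from a copied function URL so you can confirm the 401 behavior:

```javascript
// Reproduce the manual test above: take the "Get Function URL" value and
// strip the "code" function key, so calling the result should return 401.
function withoutFunctionKey(functionUrl) {
  const url = new URL(functionUrl);
  url.searchParams.delete("code"); // remove the shared-secret function key
  return url.toString();
}

// Placeholder URL matching the note above, not a real deployment.
const sample = "https://yourfunctionappname.azurewebsites.net/api/hello?code=secretkey";
console.log(withoutFunctionKey(sample));
// -> https://yourfunctionappname.azurewebsites.net/api/hello
```

Requesting the original URL should return the "Hello World" payload; requesting the stripped URL should return 401.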
## Configure and secure the function API
-1. Two extra areas in the function app need to be configured (Auth and Network Restrictions).
-1. Firstly Let's configure Authentication / Authorization, so click on the name of the function app (next to the &lt;Z&gt; functions icon) to show the overview page.
-1. Next Select the 'Platform features' tab and select 'Authentication / Authorization'.
-1. Turn on the App Service Authentication feature.
-1. Under 'Authentication Providers' choose 'Azure Active Directory', and choose 'Advanced' from the Management Mode switch.
-1. Paste the Backend Function API's application ID (from Azure AD B2C) into the 'Client ID' box
-1. Paste the Well-known open-id configuration endpoint from the sign up or sign in policy into the Issuer URL box (we recorded this configuration earlier).
-1. Select OK.
-1. Set the Action to take when request is not authenticated dropdown to "Log in with Azure Active Directory", then click Save.
- > [!NOTE]
- > Now your Function API is deployed and should throw 401 responses if the correct key is not supplied, and should return data when a valid request is presented.
- > You added additional defense-in-depth security in EasyAuth by configuring the 'Login With Azure AD' option to handle unauthenticated requests. Be aware that this will change the unauthorized request behavior between the Backend Function App and Frontend SPA as EasyAuth will issue a 302 redirect to AAD instead of a 401 Not Authorized response, we will correct this by using API Management later.
- > We still have no IP security applied, if you have a valid key and OAuth2 token, anyone can call this from anywhere - ideally we want to force all requests to come via API Management.
- > If you are using the API Management consumption tier, you will not be able to perform this lockdown by VIP as there is no dedicated static IP for that tier, you will need to rely on the method of locking down your API calls via the shared secret function key, so steps 11-13 will not be possible.
+1. Two extra areas in the function app need to be configured (Authorization and Network Restrictions).
+1. First, let's configure Authentication / Authorization: navigate back to the root blade of the function app via the breadcrumb.
+1. Next select 'Authentication / Authorization' (under 'Settings').
+1. Turn on the App Service Authentication feature.
+1. Set the Action to take when request is not authenticated dropdown to "Log in with Azure Active Directory".
+1. Under 'Authentication Providers', choose 'Azure Active Directory'.
+1. Choose 'Advanced' from the Management Mode switch.
+1. Paste the Backend application's [Application] Client ID (from Azure AD B2C) into the 'Client ID' box.
+1. Paste the Well-known open-id configuration endpoint from the sign up and sign in policy into the Issuer URL box (we recorded this configuration earlier).
+1. Click 'Show Secret' and paste the Backend application's client secret into the appropriate box.
+1. Select OK, which takes you back to the identity provider selection blade/screen.
+1. Leave [Token Store](https://docs.microsoft.com/azure/app-service/overview-authentication-authorization#token-store) enabled under advanced settings (default).
+1. Click 'Save' (at the top left of the blade).
+
+ > [!IMPORTANT]
+ > Now your Function API is deployed and should throw 401 responses if the correct JWT is not supplied as an Authorization: Bearer header, and should return data when a valid request is presented.
+ > You added additional defense-in-depth security in EasyAuth by configuring the 'Login With Azure AD' option to handle unauthenticated requests. Be aware that this will change the unauthorized request behavior between the Backend Function App and Frontend SPA as EasyAuth will issue a 302 redirect to AAD instead of a 401 Not Authorized response, we will correct this by using API Management later.
+ >
+ > We still have no IP security applied, if you have a valid key and OAuth2 token, anyone can call this from anywhere - ideally we want to force all requests to come via API Management.
+ >
+ > If you're using APIM Consumption tier then [there isn't a dedicated Azure API Management Virtual IP](./api-management-howto-ip-addresses.md#ip-addresses-of-consumption-tier-api-management-service) to allow-list with the functions access-restrictions. In the Azure API Management Standard SKU and above [the VIP is single tenant and for the lifetime of the resource](./api-management-howto-ip-addresses.md#changes-to-the-ip-addresses). For the Azure API Management Consumption tier, you can lock down your API calls via the shared secret function key in the portion of the URI you copied above. Also, for the Consumption tier - steps 12-17 below do not apply.
1. Close the 'Authentication / Authorization' blade
-1. Select 'Networking' and then select 'Access Restrictions'
-1. Next, lock down the allowed function app IPs to the API Management instance VIP. This VIP is shown in the API management - overview section of the portal.
+1. Open the *API Management blade of the portal*, then open *your instance*.
+1. Record the Private VIP shown on the overview tab.
+1. Return to the *Azure Functions blade of the portal* then open *your instance* again.
+1. Select 'Networking' and then select 'Configure access restrictions'
+1. Click 'Add Rule', and enter the VIP copied in step 3 above in the format xx.xx.xx.xx/32.
1. If you want to continue to interact with the functions portal, and to carry out the optional steps below, you should add your own public IP address or CIDR range here too.
-1. Once thereΓÇÖs an allow entry in the list, Azure adds an implicit deny rule to block all other addresses.
+1. Once there's an allow entry in the list, Azure adds an implicit deny rule to block all other addresses.
-You'll need to add CIDR formatted blocks of addresses to the IP restrictions panel. When you need to add a single address such as the API Management VIP, you need to add it in the format xx.xx.xx.xx.
+You'll need to add CIDR formatted blocks of addresses to the IP restrictions panel. When you need to add a single address such as the API Management VIP, you need to add it in the format xx.xx.xx.xx/32.
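To make the /32 suffix concrete, here is a minimal JavaScript sketch (purely illustrative, not part of the sample app) of how CIDR matching works; a /32 mask matches exactly one address, which is why a single VIP is entered as xx.xx.xx.xx/32:

```javascript
// Illustrative only: test whether an IPv4 address falls inside a CIDR block.
function ipToInt(ip) {
  // Pack the four octets into one unsigned 32-bit integer.
  return ip.split('.').reduce((acc, octet) => (acc << 8) + Number(octet), 0) >>> 0;
}

function inCidr(ip, cidr) {
  const [base, bits] = cidr.split('/');
  const maskBits = Number(bits);
  // Keep only the top `maskBits` bits; /32 keeps all 32, so only an exact match passes.
  const mask = maskBits === 0 ? 0 : (~0 << (32 - maskBits)) >>> 0;
  return ((ipToInt(ip) & mask) >>> 0) === ((ipToInt(base) & mask) >>> 0);
}

console.log(inCidr('203.0.113.10', '203.0.113.10/32')); // true  - the single allowed VIP
console.log(inCidr('203.0.113.11', '203.0.113.10/32')); // false - any other address is denied
console.log(inCidr('203.0.113.11', '203.0.113.0/24'));  // true  - a /24 covers 256 addresses
```

The addresses here are documentation placeholders (203.0.113.0/24); substitute your own API Management VIP.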
> [!NOTE]
> Now your Function API should not be callable from anywhere other than via API Management or your own address.
-
-## Import the function app definition
+ 1. Open the *API Management blade*, then open *your instance*.
-1. Select the APIs Blade from the API Management section of your instance.
+1. Select the APIs Blade (under APIs).
1. From the 'Add a New API' pane, choose 'Function App', then select 'Full' from the top of the popup.
-1. Click Browse, choose the function app you're hosting the API inside, and click select.
+1. Click Browse, choose the function app you're hosting the API inside, and click select. Next, click select again.
1. Give the API a name and description for API Management's internal use and add it to the 'unlimited' Product.
-1. Make sure you record the base URL for later use and then click create.
-
-## Configure Oauth2 for API Management
-
-1. Next, Select the Oauth 2.0 blade from the Security Tab, and click 'Add'
-1. Give values for *Display Name* and *Description* for the added Oauth Endpoint (these values will show up in the next step as an Oauth2 endpoint).
-1. You can enter any value in the Client registration page URL, as this value won't be used.
-1. Check the *Implicit Auth* Grant type and leave the the Authorization code grant type checked.
-1. Move to the *Authorization* and *Token* endpoint fields, and enter the values you captured from the well-known configuration xml document earlier.
-1. Scroll down and populate an *Additional body parameter* called 'resource' with the Backend Function API client ID from the Azure AD B2C App registration
-1. Select 'Client credentials', set the Client ID to the Developer console app's app ID - skip this step if using the consumption API Management model.
-1. Set the Client Secret to the key you recorded earlier - skip this step if using the consumption API Management model.
-1. Lastly, now record the redirect_uri of the auth code grant from API Management for later use.
-
-## Set up Oauth2 for your API
-1. Your API will appear on the left-hand side of the portal under the 'All APIs' section, open your API by clicking on it.
-1. Select the 'Settings' Tab.
-1. Update your settings by selecting ΓÇ£Oauth 2.0ΓÇ¥ from the user authorization radio button.
-1. Select the Oauth server that you defined earlier.
-1. Check the ΓÇÿOverride scopeΓÇÖ checkbox and enter the scope you recorded for the backend API call earlier on.
+1. Copy and record the API's 'base URL' and click 'create'.
+1. Click the 'Settings' tab, then under subscription, switch off the 'Subscription Required' checkbox, as in this case we will use the OAuth2 JWT token to rate limit. Note that if you're using the Consumption tier, a subscription key would still be required in a production environment.
- > [!NOTE]
- > Now we have an API Management instance that knows how to get access tokens from Azure AD B2C to authorize requests and understands our Oauth2 Azure Active Directory B2C configuration.
+ > [!TIP]
+ > If you're using the Consumption tier of APIM, the 'Unlimited' product won't be available out of the box. Instead, navigate to "Products" under "APIs" and hit "Add".
+ > Type "Unlimited" as the product name and description and select the API you just added from the "+" APIs callout at the bottom left of the screen. Select the "published" checkbox. Leave the rest as default. Finally, hit the "create" button. This creates the "Unlimited" product and assigns it to your API. You can customize your new product later.
-## Set up the **CORS** and **validate-jwt** policies
+## Configure and capture the correct storage endpoint settings
-> The following sections should be followed regardless of the APIM tier being used.
+1. Open the storage accounts blade in the Azure portal
+1. Select the account you created and select the 'Static Website' blade from the Settings section (if you don't see a 'Static Website' option, check you created a V2 account).
+1. Set the static web hosting feature to 'enabled', and set the index document name to 'index.html', then click 'save'.
+1. Note down the contents of the 'Primary Endpoint' for later, as this location is where the frontend site will be hosted.
+
+ > [!TIP]
+ > You could use either Azure Blob Storage + CDN rewrite, or Azure App Service to host the SPA - but Blob Storage's Static Website hosting feature gives us a default container to serve static web content / html / js / css from Azure Storage and will infer a default page for us with zero work.
+
+## Set up the **CORS** and **validate-jwt** policies
-1. Switch back to the design tab and choose ΓÇ£All APIsΓÇ¥, then click the code view button to show the policy editor.
+> The following sections should be followed regardless of the APIM tier being used. The storage account URL is from the storage account you made available in the prerequisites at the top of this article.
+1. Switch to the API management blade of the portal and open your instance.
+1. Select APIs, then select "All APIs".
+1. Under "Inbound processing", click the code view button "</>" to show the policy editor.
1. Edit the inbound section and paste the below xml so it reads like the following.
+1. Replace the following parameters in the policy with the correct values saved earlier: {PrimaryStorageEndpoint} (the 'Primary Storage Endpoint' you copied in the previous section), {b2cpolicy-well-known-openid} (the 'well-known openid configuration endpoint' you copied earlier), and {backend-api-application-client-id} (the B2C Application / Client ID for the **backend API**).
+1. If you're using the Consumption tier of API Management, remove both rate-limit-by-key policies, as this policy is not available in that tier.
```xml
<inbound>
- <validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized. Access token is missing or invalid.">
- <openid-config url="https://tenant.b2clogin.com/tenant.onmicrosoft.com/v2.0/.well-known/openid-configuration?p=B2C_1_MyDefaultPolicy" />
+ <cors allow-credentials="true">
+ <allowed-origins>
+ <origin>{PrimaryStorageEndpoint}</origin>
+ </allowed-origins>
+ <allowed-methods preflight-result-max-age="120">
+ <method>GET</method>
+ </allowed-methods>
+ <allowed-headers>
+ <header>*</header>
+ </allowed-headers>
+ <expose-headers>
+ <header>*</header>
+ </expose-headers>
+ </cors>
+ <validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized. Access token is missing or invalid." require-expiration-time="true" require-signed-tokens="true" clock-skew="300">
+ <openid-config url="{b2cpolicy-well-known-openid}" />
<required-claims> <claim name="aud">
- <value>your-backend-api-application-client-id</value>
+ <value>{backend-api-application-client-id}</value>
</claim> </required-claims> </validate-jwt>
- <cors>
- <allowed-origins>
- <origin>*</origin>
- </allowed-origins>
- <allowed-methods>
- <method>GET</method>
- </allowed-methods>
- <allowed-headers>
- <header>*</header>
- </allowed-headers>
- <expose-headers>
- <header>*</header>
- </expose-headers>
- </cors>
+ <rate-limit-by-key calls="300" renewal-period="120" counter-key="@(context.Request.IpAddress)" />
+ <rate-limit-by-key calls="15" renewal-period="60" counter-key="@(context.Request.Headers.GetValueOrDefault("Authorization","").AsJwt()?.Subject)" />
+ </inbound>
```
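To illustrate what the `validate-jwt` policy's `aud` check boils down to, here is a hedged JavaScript (Node.js) sketch. The GUID is a made-up placeholder client ID; real validation, as API Management performs it, also verifies the token signature and expiry against the keys published at the openid-config URL:

```javascript
// Illustrative only: decode a JWT payload and compare its `aud` claim.
function base64UrlDecode(str) {
  // Restore standard base64 padding/characters, then decode.
  const pad = str.length % 4 === 0 ? '' : '='.repeat(4 - (str.length % 4));
  return Buffer.from(str.replace(/-/g, '+').replace(/_/g, '/') + pad, 'base64').toString('utf8');
}

function audienceMatches(jwt, expectedAud) {
  const payload = JSON.parse(base64UrlDecode(jwt.split('.')[1]));
  const aud = Array.isArray(payload.aud) ? payload.aud : [payload.aud];
  return aud.includes(expectedAud);
}

// Build an unsigned demo token; the audience is a hypothetical backend API client ID.
const header = Buffer.from(JSON.stringify({ alg: 'none', typ: 'JWT' })).toString('base64');
const body = Buffer.from(JSON.stringify({ aud: '11111111-2222-3333-4444-555555555555' })).toString('base64');
const demoJwt = `${header}.${body}.`;

console.log(audienceMatches(demoJwt, '11111111-2222-3333-4444-555555555555')); // true
console.log(audienceMatches(demoJwt, 'some-other-client-id'));                 // false
```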
-1. Edit the openid-config url to match your well-known Azure AD B2C endpoint for the sign up or sign in policy.
-1. Edit the claim value to match the valid application ID, also known as a client ID for the backend API application and save.
> [!NOTE]
- > Now API management is able respond to cross origin requests to JS SPA apps, and it will perform throttling, rate-limiting and pre-validation of the JWT auth token being passed BEFORE forwarding the request on to the Function API.
-
- > [!NOTE]
- > The following section is optional and does not apply to the **Consumption** tier, which does not support the developer portal.
- > If you do not intend to use the developer portal, or cannot use it since you are using the Consumption tier, please skip this step and jump straight to ["Build the JavaScript SPA to consume the API"](#build-the-javascript-spa-to-consume-the-api).
-
-## [Optional] Configure the developer portal
-
-1. Open the Azure AD B2C blade and navigate to the application registration for the Developer Portal
-1. Set the 'Reply URL' entry to the one you noted down when you configured the redirect_uri of the auth code grant in API Management earlier.
-
- Now that the OAuth 2.0 user authorization is enabled on the `Echo API`, the Developer Console obtains an access token for the user, before calling the API.
-
-1. Browse to any operation under the `Echo API` in the developer portal, and select **Try it** to bring you to the Developer Console.
-1. Note a new item in the **Authorization** section, corresponding to the authorization server you just added.
-1. Select **Authorization code** from the authorization drop-down list, and you're prompted to sign in to the Azure AD tenant. If you're already signed in with the account, you might not be prompted.
-1. After successful sign-in, an `Authorization: Bearer` header is added to the request, with an access token from Azure AD B2C encoded in Base64.
-1. Select **Send** and you can call the API successfully.
-
- > [!NOTE]
- > Now API management is able to acquire tokens for the developer portal to test your API and is able to understand it's definition and render the appropriate test page in the dev portal.
-
-1. From the overview blade of the API Management portal, click 'Developer Portal' to sign in as an administrator of the API.
-1. Here, you and other selected consumers of your API can test and call them from a console.
-1. Select ΓÇÿProductsΓÇÖ, then choose ΓÇÿUnlimitedΓÇÖ, then choose the API we created earlier and click ΓÇÿTRY ITΓÇÖ
-1. Unhide the API subscription key, and copy it somewhere safe along with the request url that you'll need later.
-1. Also select Implicit, from the oauth auth dropdown and you may have to authenticate here with a popup.
-1. Click ΓÇÿSendΓÇÖ and if all is well, your Function App should respond back with a hello message via API management with a 200 OK message and some JSON.
-
- > [!NOTE]
- > Congratulations, you now have Azure AD B2C, API Management and Azure Functions working together to publish, secure AND consume an API.
- > You might have noticed that the API is in fact secured twice using this method, once with the API Management Ocp-Subscription-Key Header, and once with the Authorization: Bearer JWT.
- > You would be correct, as this example is a JavaScript Single Page Application, we use the API Management Key only for rate-limiting and billing calls.
- > The actual Authorization and Authentication is handled by Azure AD B2C, and is encapsulated in the JWT, which gets validated twice, once by API Management, and then by Azure Functions.
+ > Now Azure API Management is able to respond to cross-origin requests from your JavaScript SPA apps, and it will perform throttling, rate-limiting, and pre-validation of the JWT auth token being passed BEFORE forwarding the request on to the Function API.
+ >
+ > Congratulations, you now have Azure AD B2C, API Management and Azure Functions working together to publish, secure AND consume an API!
-## Build the JavaScript SPA to consume the API
-1. Open the storage accounts blade in the Azure portal
-1. Select the account you created and select the 'Static Website' blade from the Settings section (if you don't see a 'Static Website' option, check you created a V2 account).
-1. Set the static web hosting feature to 'enabled', and set the index document name to 'https://docsupdatetracker.net/index.html', then click 'save'.
-1. Note down the contents of the Primary Endpoint, as this location is where the frontend site will be hosted.
+ > [!TIP]
+ > If you're using the API Management Consumption tier, then instead of rate limiting by the JWT subject or incoming IP address (the 'Limit call rate by key' policy is not supported today for the Consumption tier), you can [limit by call rate quota](./api-management-access-restriction-policies.md#LimitCallRate).
+ > As this example is a JavaScript Single Page Application, we use the API Management key only for rate-limiting and billing calls. The actual authentication and authorization are handled by Azure AD B2C and encapsulated in the JWT, which gets validated twice, once by API Management and then by the backend Azure Function.
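As a rough sketch of the `rate-limit-by-key` semantics (calls="15", renewal-period="60", keyed by the JWT subject), here is a fixed-window counter in JavaScript. This only illustrates the idea; APIM's real implementation is distributed and its counting behavior may differ:

```javascript
// Illustrative only: allow `calls` requests per key within each renewal period.
function createRateLimiter(calls, renewalPeriodSeconds) {
  const windows = new Map(); // key -> { windowStart, count }
  return function allow(key, nowMs = Date.now()) {
    const entry = windows.get(key);
    if (!entry || nowMs - entry.windowStart >= renewalPeriodSeconds * 1000) {
      windows.set(key, { windowStart: nowMs, count: 1 }); // start a fresh window
      return true;
    }
    if (entry.count < calls) {
      entry.count += 1;
      return true;
    }
    return false; // APIM would answer 429 Too Many Requests here
  };
}

const allow = createRateLimiter(15, 60);
let allowed = 0;
for (let i = 0; i < 20; i++) {
  if (allow('jwt-subject-claim', 0)) allowed++; // fixed "now" keeps all calls in one window
}
console.log(allowed); // 15
```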
- > [!NOTE]
- > You could use either Azure Blob Storage + CDN rewrite, or Azure App Service - but Blob Storage's Static Website hosting feature gives us a default container to serve static web content / html / js / css from Azure Storage and will infer a default page for us for zero work.
+## Upload the JavaScript SPA sample to static storage
-## Upload the JS SPA sample
-1. Still in the storage account blade, select the 'Blobs' blade from the Blob Service section and click on the $web container that appears in the right-hand pane.
+1. Still in the storage account blade, select the 'Containers' blade from the Blob Service section and click on the $web container that appears in the right-hand pane.
1. Save the code below to a file locally on your machine as index.html and then upload the file index.html to the $web container.

   ```html
- <!doctype html>
- <html lang="en">
- <head>
- <meta charset="utf-8">
- <meta http-equiv="X-UA-Compatible" content="IE=edge">
- <meta name="viewport" content="width=device-width, initial-scale=1">
- <title>Sample JS SPA</title>
- <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0/css/bootstrap.min.css" integrity="sha384-Gn5384xqQ1aoWXA+058RXPxPg6fy4IWvTNh0E263XmFcJlSAwiGgFAW/dAiS6JXm" crossorigin="anonymous">
- </head>
- <body>
- <div class="container-fluid">
- <div class="row">
- <div class="col-md-12">
- <nav class="navbar navbar-expand-lg navbar-light bg-light navbar-dark bg-dark">
- <a class="navbar-brand" href="#">Sample Code</a>
- <ul class="navbar-nav ml-md-auto">
- <li class="nav-item dropdown">
- <a class="btn btn-large btn-success" onClick="login()">Sign In</a>
- </li>
- </ul>
- </nav>
- </div>
- </div>
- <div class="row">
- <div class="col-md-12">
- <div class="jumbotron">
- <h2>
- <div id="message">Hello, world!</div>
- </h2>
- <p>
- <a class="btn btn-primary btn-large" onClick="GetAPIData()">Call API</a>
- </p>
- </div>
- </div>
- </div>
- </div>
- <script src="https://code.jquery.com/jquery-3.2.1.min.js"></script>
- <script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.12.9/umd/popper.min.js" integrity="sha384-ApNbgh9B+Y1QKtv3Rn7W3mgPxhU9K/ScQsAP7hUibX39j7fakFPskvXusvfa0b4Q" crossorigin="anonymous"></script>
- <script src="https://maxcdn.bootstrapcdn.com/bootstrap/4.0.0/js/bootstrap.min.js" integrity="sha384-JZR6Spejh4U02d8jOt6vLEHfe/JQGiRRSQQxSfFWpi1MquVdAyjUar5+76PVCmYl" crossorigin="anonymous"></script>
- <script src="https://secure.aadcdn.microsoftonline-p.com/lib/1.0.0/js/msal.js"></script>
- <script lang="javascript">
- var applicationConfig = {
- clientID: "clientidgoeshere",
- authority: "https://tenant.b2clogin.com/tfp/tenant/policy",
- b2cScopes: ["https://tenant/app/scope"],
- webApi: 'http://functionurl',
- subKey: 'apimkeygoeshere'
- };
- var msalConfig = {
- auth: {
- clientId: applicationConfig.clientID,
- authority: applicationConfig.authority,
- validateAuthority: false
- },
- cache: {
- cacheLocation: "localStorage",
- storeAuthStateInCookie: true
+ <!doctype html>
+ <html lang="en">
+ <head>
+ <meta charset="utf-8">
+ <meta http-equiv="X-UA-Compatible" content="IE=edge">
+ <meta name="viewport" content="width=device-width, initial-scale=1">
+ <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.0.0-beta2/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-BmbxuPwQa2lc/FVzBcNJ7UAyJxM6wuqIj61tLrc4wSX0szH/Ev+nYRRuWlolflfl" crossorigin="anonymous">
+ <script type="text/javascript" src="https://alcdn.msauth.net/browser/2.11.1/js/msal-browser.min.js"></script>
+ </head>
+ <body>
+ <div class="container-fluid">
+ <div class="row">
+ <div class="col-md-12">
+ <nav class="navbar navbar-expand-lg navbar-dark bg-dark">
+ <div class="container-fluid">
+ <a class="navbar-brand" href="#">Azure Active Directory B2C with Azure API Management</a>
+ <div class="navbar-nav">
+                <button class="btn btn-success" id="signinbtn" onClick="login()">Sign In</button>
+ </div>
+ </div>
+ </nav>
+ </div>
+ </div>
+ <div class="row">
+ <div class="col-md-12">
+ <div class="card" >
+ <div id="cardheader" class="card-header">
+              <div class="card-text" id="message">Please sign in to continue</div>
+ </div>
+ <div class="card-body">
+                <button class="btn btn-warning" id="callapibtn" onClick="getAPIData()">Call API</button>
+ <div id="progress" class="spinner-border" role="status">
+ <span class="visually-hidden">Loading...</span>
+ </div>
+ </div>
+ </div>
+ </div>
+ </div>
+ </div>
+ <script lang="javascript">
+ // Just change the values in this config object ONLY.
+ var config = {
+ msal: {
+ auth: {
+ clientId: "{CLIENTID}", // This is the client ID of your FRONTEND application that you registered with the SPA type in AAD B2C
+ authority: "{YOURAUTHORITYB2C}", // Formatted as https://{b2ctenantname}.b2clogin.com/tfp/{b2ctenantguid or full tenant name including onmicrosoft.com}/{signuporinpolicyname}
+ redirectUri: "{StoragePrimaryEndpoint}", // The storage hosting address of the SPA, a web-enabled v2 storage account - recorded earlier as the Primary Endpoint.
+ knownAuthorities: ["{B2CTENANTDOMAIN}"] // {b2ctenantname}.b2clogin.com
+ },
+ cache: {
+ cacheLocation: "sessionStorage",
+ storeAuthStateInCookie: false
+ }
+ },
+ api: {
+ scopes: ["{BACKENDAPISCOPE}"], // The scope that we request for the API from B2C, this should be the backend API scope, with the full URI.
+ backend: "{APIBASEURL}/hello" // The location that we will call for the backend api, this should be hosted in API Management, suffixed with the name of the API operation (in the sample this is '/hello').
+ }
+ }
+ document.getElementById("callapibtn").hidden = true;
+ document.getElementById("progress").hidden = true;
+ const myMSALObj = new msal.PublicClientApplication(config.msal);
+ myMSALObj.handleRedirectPromise().then((tokenResponse) => {
+ if(tokenResponse !== null){
+ console.log(tokenResponse.account);
+ document.getElementById("message").innerHTML = "Welcome, " + tokenResponse.account.name;
+ document.getElementById("signinbtn").hidden = true;
+ document.getElementById("callapibtn").hidden = false;
+ }}).catch((error) => {console.log("Error Signing in:" + error);
+ });
+ function login() {
+ try {
+ myMSALObj.loginRedirect({scopes: config.api.scopes});
+ } catch (err) {console.log(err);}
+ }
+ function getAPIData() {
+ document.getElementById("progress").hidden = false;
+ document.getElementById("message").innerHTML = "Calling backend ... "
+ document.getElementById("cardheader").classList.remove('bg-success','bg-warning','bg-danger');
+ myMSALObj.acquireTokenSilent({scopes: config.api.scopes, account: getAccount()}).then(tokenResponse => {
+ const headers = new Headers();
+ headers.append("Authorization", `Bearer ${tokenResponse.accessToken}`);
+ fetch(config.api.backend, {method: "GET", headers: headers})
+ .then(async (response) => {
+ if (!response.ok)
+ {
+ document.getElementById("message").innerHTML = "Error: " + response.status + " " + JSON.parse(await response.text()).message;
+ document.getElementById("cardheader").classList.add('bg-warning');
+ }
+ else
+ {
+ document.getElementById("cardheader").classList.add('bg-success');
+ document.getElementById("message").innerHTML = await response.text();
+ }
+ }).catch(async (error) => {
+ document.getElementById("cardheader").classList.add('bg-danger');
+ document.getElementById("message").innerHTML = "Error: " + error;
+ });
+ }).catch(error => {console.log("Error Acquiring Token Silently: " + error);
+ return myMSALObj.acquireTokenRedirect({scopes: config.api.scopes, forceRefresh: false})
+ });
+ document.getElementById("progress").hidden = true;
+ }
+ function getAccount() {
+ var accounts = myMSALObj.getAllAccounts();
+ if (!accounts || accounts.length === 0) {
+ return null;
+ } else {
+ return accounts[0];
}
- };
- var clientApplication = new Msal.UserAgentApplication(msalConfig);
- function login() {
- var loginRequest = {
- scopes: applicationConfig.b2cScopes
- };
- clientApplication.loginPopup(loginRequest).then(function (loginResponse) {
- var tokenRequest = {
- scopes: applicationConfig.b2cScopes
- };
- clientApplication.acquireTokenSilent(tokenRequest).then(function (tokenResponse) {
- document.getElementById("signinbtn").innerHTML = "Logged in as: " + clientApplication.account.name;
- document.getElementById("callapibtn").hidden = false
- }).catch(function (error) {
- clientApplication.acquireTokenPopup(tokenRequest).then(function (tokenResponse) {
- }).catch (function (error) {
- console.log("Error acquiring the popup:\n" + error);
- });
- })
- }).catch (function (error) {
- console.log("Error during login:\n" + error);
- });
- }
- function GetAPIData() {
- var tokenRequest = {
- scopes: applicationConfig.b2cScopes
- }
- clientApplication.acquireTokenSilent(tokenRequest).then(function (tokenResponse) {
- callApiWithAccessToken(tokenResponse.accessToken);
- }).catch(function (error) {
- clientApplication.acquireTokenPopup(tokenRequest).then(function (tokenResponse) {
- callApiWithAccessToken(tokenResponse.accessToken);
-
- }).catch(function (error) {
- console.log("Error acquiring the access token to call the Web api:\n" + error);
- });
- })
- }
- function callApiWithAccessToken(token)
- {
- console.log("calling " + applicationConfig.webApi + " with " + token);
- // Make the api call here
- $.ajax({
- type: "get",
- headers: {'Authorization': 'Bearer ' + token, 'Ocp-Apim-Subscription-Key': applicationConfig.subKey},
- url: applicationConfig.webApi
- }
- ).done(function (body) {
- document.getElementById("message").innerHTML = "The API Said " + body;
- });
} </script>
- </body>
- </html>
-
+ </body>
+ </html>
   ```

1. Browse to the Static Website Primary Endpoint you stored earlier in the last section.

 > [!NOTE]
- > Congratulations, you just deployed a JavaScript Single Page App to Azure Storage
- > Since we havenΓÇÖt configured the JS app with your keys for the api or configured the JS app with your Azure AD B2C details yet ΓÇô the page will not work yet if you open it.
-
-## Configure the JS SPA for Azure AD B2C
-1. Now we know where everything is: we can configure the SPA with the appropriate API Management API address and the correct Azure AD B2C application / client IDs
-1. Go back to the Azure portal storage blade and click on https://docsupdatetracker.net/index.html, then choose ΓÇÿEdit BlobΓÇÖ
-1. Update the auth details to match your front-end application you registered in B2C earlier, noting that the 'b2cScopes' values are for the API backend.
-1. The webApi key and api url can be found in the API Management test pane for the API operation.
-1. Create An APIM subscription key by heading to the API Management back to the API Management blade, selecting 'Subscriptions', and clicking 'Add Subscription' then saving the record. Clicking the Ellipsis (...) next to the created row will allow you to show the keys so you can copy the primary key.
-1. It should look something like the below code:-
-
- ```javascript
- var applicationConfig =
- clientID: "{aadb2c-clientid-goeshere}",
- authority: "https://{tenant}.b2clogin.com/{tenant}/{policy}",
- b2cScopes: ["https://{tenant}/{app}/{scope}"],
- webApi: 'http://{apim-url-for-your-function}',
- subKey: '{apim-subscription-key-goes-here}'
- };
- ```
-
+ > Congratulations, you just deployed a JavaScript Single Page App to Azure Storage Static content hosting.
+ > Since we haven't configured the JS app with your Azure AD B2C details yet, the page won't work if you open it.
+
+## Configure the JavaScript SPA for Azure AD B2C
+
+1. Now we know where everything is: we can configure the SPA with the appropriate API Management API address and the correct Azure AD B2C application / client IDs.
+1. Go back to the Azure portal storage blade
+1. Select 'Containers' (under 'Settings')
+1. Select the '$web' container from the list
+1. Select the index.html blob from the list
+1. Click 'Edit'
+1. Update the auth values in the msal config section to match your *front-end* application you registered in B2C earlier. Use the code comments for hints on how the config values should look.
+The *authority* value needs to be in the format https://{b2ctenantname}.b2clogin.com/tfp/{b2ctenantname}.onmicrosoft.com/{signupandsigninpolicyname}. If you have used our sample names and your B2C tenant is called 'contoso', then you would expect the authority to be 'https://contoso.b2clogin.com/tfp/contoso.onmicrosoft.com/Frontendapp_signupandsignin'.
+1. Set the api values to match your backend address (the API base URL you recorded earlier) and the scope you recorded earlier for the *backend application*.
1. Click Save

## Set the redirect URIs for the Azure AD B2C frontend app
-1. Open the Azure AD B2C blade and navigate to the application registration for the JavaScript Frontend Application
-1. Set the redirect URL to the one you noted down when you previously set up the static website primary endpoint above
- > [!NOTE]
- > This configuration will result in a client of the frontend application receiving an access token with appropriate claims from Azure AD B2C.
- > The SPA will be able to add this as a bearer token in the https header in the call to the backend API.
- > API Management will pre-validate the token, rate-limit calls to the endpoint by the subscriber key, before passing through the request to the receiving Azure Function API.
- > The SPA will render the response in the browser.
+1. Open the Azure AD B2C blade and navigate to the application registration for the JavaScript Frontend Application.
+1. Click 'Redirect URIs' and delete the placeholder 'https://jwt.ms' we entered earlier.
+1. Add a new URI for the primary (storage) endpoint (minus the trailing forward slash).
+ > [!NOTE]
+ > This configuration will result in a client of the frontend application receiving an access token with appropriate claims from Azure AD B2C.
+ > The SPA will be able to add this as a bearer token in the https header in the call to the backend API.
+ >
+ > API Management will pre-validate the token, rate-limit calls to the endpoint by both the subject of the JWT issued by Azure ID (the user) and by IP address of the caller (depending on the service tier of API Management, see the note above), before passing through the request to the receiving Azure Function API, adding the functions security key.
+ > The SPA will render the response in the browser.
+ >
> *Congratulations, youΓÇÖve configured Azure AD B2C, Azure API Management, Azure Functions, Azure App Service Authorization to work in perfect harmony!*
- > [!NOTE]
- > Now we have a simple app with a simple secured API, let's test it.
+Now that we have a simple app with a simple secured API, let's test it.
## Test the client application
-1. Open the sample app URL that you noted down from the storage account you created earlier
+
+1. Open the sample app URL that you noted down from the storage account you created earlier.
1. Click "Sign In" in the top-right corner; this will pop up your Azure AD B2C sign-up or sign-in profile.
-1. Post Sign in the "Logged in as" section of the screen will be populated from your JWT.
-1. Now Click "Call Web Api", and you the page should update with the values sent back from your secured API.
+1. The app should welcome you by your B2C profile name.
+1. Now click "Call API" and the page should update with the values sent back from your secured API.
+1. If you *repeatedly* click the Call API button and you're running in the Developer tier or above of API Management, you should see your solution begin to rate limit the API, and this should be reported in the app with an appropriate message.
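If you want the app to call out the rate-limit case explicitly, one hypothetical approach (not part of the sample above) is to map the response status to a user-facing message; 429 is what the rate-limit policy returns once the quota for the renewal period is exhausted:

```javascript
// Illustrative only: choose a message for the SPA based on the API response status.
function statusMessage(status, body) {
  if (status === 429) return 'Rate limit hit - slow down and retry shortly.';
  if (status === 401) return 'Unauthorized - sign in again to refresh your token.';
  if (status >= 200 && status < 300) return body; // success: show the API's own response
  return `Error: ${status}`;
}

console.log(statusMessage(429, ''));      // Rate limit hit - slow down and retry shortly.
console.log(statusMessage(200, 'Hello')); // Hello
```

This could be dropped into the `fetch` response handler of the sample index.html in place of the inline branching.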
## And we're done

The steps above can be adapted and edited to allow many different uses of Azure AD B2C with API Management.

## Next steps

* Learn more about [Azure Active Directory and OAuth2.0](../active-directory/develop/authentication-vs-authorization.md).
* Check out more [videos](https://azure.microsoft.com/documentation/videos/index/?services=api-management) about API Management.
* For other ways to secure your back-end service, see [Mutual Certificate authentication](api-management-howto-mutual-certificates.md).
app-service Troubleshoot Diagnostic Logs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/troubleshoot-diagnostic-logs.md
To stream logs in the [Azure portal](https://portal.azure.com), navigate to your
To stream logs live in [Cloud Shell](../cloud-shell/overview.md), use the following command:
+> [!IMPORTANT]
+> This command may not work with web apps hosted in a Linux app service plan.
+ ```azurecli-interactive az webapp log tail --name appname --resource-group myResourceGroup ```
-To filter specific events, such as errors, use the **--Filter** parameter. For example:
-
-```azurecli-interactive
-az webapp log tail --name appname --resource-group myResourceGroup --filter Error
-```
-To filter specific log types, such as HTTP, use the **--Path** parameter. For example:
+To filter specific log types, such as HTTP, use the **--provider** parameter. For example:
```azurecli-interactive
-az webapp log tail --name appname --resource-group myResourceGroup --path http
+az webapp log tail --name appname --resource-group myResourceGroup --provider http
``` ### In local terminal
app-service Webjobs Create Ieux Conceptual https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/webjobs-create-ieux-conceptual.md
The following file types are supported:
## Next steps

* Learn how to [create a WebJob](./webjobs-create-ieux.md)
-* View log history of WebJobs](./webjobs-create-ieux-view-log.md)
+* View log history of [WebJobs](./webjobs-create-ieux-view-log.md)
app-service Webjobs Create Ieux https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/webjobs-create-ieux.md
# Run background tasks with WebJobs in Azure App Service
-The concept of runn [background tasks](./webjobs-create-ieux-conceptual.md) on Azure is provided with Azure App service web jobs. Learn how to deploy <abbr title="A program or script in the same instance as a web app, API app, or mobile app.">WebJobs</abbr> by using the [Azure portal](https://portal.azure.com) to upload an executable or script.
+The concept of running [background tasks](./webjobs-create-ieux-conceptual.md) on Azure is provided with Azure App service web jobs. Learn how to deploy <abbr title="A program or script in the same instance as a web app, API app, or mobile app.">WebJobs</abbr> by using the [Azure portal](https://portal.azure.com) to upload an executable or script.
Three supported WebJobs include:
Three supported WebJobs include:
* Use the [WebJobs SDK](https://github.com/Azure/azure-webjobs-sdk/wiki) to simplify many programming tasks
-* Learn to [develop and deploy WebJobs with Visual Studio](webjobs-dotnet-deploy-vs.md)
+* Learn to [develop and deploy WebJobs with Visual Studio](webjobs-dotnet-deploy-vs.md)
application-gateway Tutorial Autoscale Ps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/application-gateway/tutorial-autoscale-ps.md
Previously updated : 11/13/2019 Last updated : 03/08/2021 #Customer intent: As an IT administrator new to Application Gateway, I want to configure the service in a way that automatically scales based on customer demand and is highly available across availability zones to ensure my customers can access their web applications when they need them.
If you don't have an Azure subscription, create a [free account](https://azure.m
[!INCLUDE [updated-for-az](../../includes/updated-for-az.md)]
-This tutorial requires that you run Azure PowerShell locally. You must have Azure PowerShell module version 1.0.0 or later installed. Run `Get-Module -ListAvailable Az` to find the version. If you need to upgrade, see [Install Azure PowerShell module](/powershell/azure/install-az-ps). After you verify the PowerShell version, run `Connect-AzAccount` to create a connection with Azure.
+This tutorial requires that you run an administrative Azure PowerShell session locally. You must have Azure PowerShell module version 1.0.0 or later installed. Run `Get-Module -ListAvailable Az` to find the version. If you need to upgrade, see [Install Azure PowerShell module](/powershell/azure/install-az-ps). After you verify the PowerShell version, run `Connect-AzAccount` to create a connection with Azure.
## Sign in to Azure
Thumbprint Subject
E1E81C23B3AD33F9B4D1717B20AB65DBB91AC630  CN=www.contoso.com
```
-Use the thumbprint to create the pfx file:
+Use the thumbprint to create the pfx file. Replace *\<password>* with a password of your choice:
```powershell
-$pwd = ConvertTo-SecureString -String "Azure123456!" -Force -AsPlainText
+$pwd = ConvertTo-SecureString -String "<password>" -Force -AsPlainText
Export-PfxCertificate `
    -cert cert:\localMachine\my\E1E81C23B3AD33F9B4D1717B20AB65DBB91AC630 `
Export-PfxCertificate `
    -Password $pwd
```

+
## Create a virtual network

Create a virtual network with one dedicated subnet for an autoscaling application gateway. Currently only one autoscaling application gateway can be deployed in each dedicated subnet.
Specify the allocation method of PublicIPAddress as **Static**. An autoscaling a
```azurepowershell
#Create static public IP
$pip = New-AzPublicIpAddress -ResourceGroupName $rg -name "AppGwVIP" `
- -location $location -AllocationMethod Static -Sku Standard
+ -location $location -AllocationMethod Static -Sku Standard -Zone 1,2,3
``` ## Retrieve details
$pip = New-AzPublicIpAddress -ResourceGroupName $rg -name "AppGwVIP" `
Retrieve details of the resource group, subnet, and IP in a local object to create the IP configuration details for the application gateway.

```azurepowershell
-$resourceGroup = Get-AzResourceGroup -Name $rg
$publicip = Get-AzPublicIpAddress -ResourceGroupName $rg -name "AppGwVIP"
$vnet = Get-AzVirtualNetwork -Name "AutoscaleVNet" -ResourceGroupName $rg
$gwSubnet = Get-AzVirtualNetworkSubnetConfig -Name "AppGwSubnet" -VirtualNetwork $vnet
```
+## Create web apps
+
+Configure two web apps for the backend pool. Replace *\<site1-name>* and *\<site2-name>* with unique names in the `azurewebsites.net` domain.
+
+```azurepowershell
+New-AzAppServicePlan -ResourceGroupName $rg -Name "ASP-01" -Location $location -Tier Basic `
+ -NumberofWorkers 2 -WorkerSize Small
+New-AzWebApp -ResourceGroupName $rg -Name <site1-name> -Location $location -AppServicePlan ASP-01
+New-AzWebApp -ResourceGroupName $rg -Name <site2-name> -Location $location -AppServicePlan ASP-01
+```
+
## Configure the infrastructure

Configure the IP config, front-end IP config, back-end pool, HTTP settings, certificate, port, listener, and rule in an identical format to the existing Standard application gateway. The new SKU follows the same object model as the Standard SKU.
+Replace your two web app FQDNs (for example: `mywebapp.azurewebsites.net`) in the `$pool` variable definition.
+
```azurepowershell
$ipconfig = New-AzApplicationGatewayIPConfiguration -Name "IPConfig" -Subnet $gwSubnet
$fip = New-AzApplicationGatewayFrontendIPConfig -Name "FrontendIPConfig" -PublicIPAddress $publicip
$pool = New-AzApplicationGatewayBackendAddressPool -Name "Pool1" `
- -BackendIPAddresses testbackend1.westus.cloudapp.azure.com, testbackend2.westus.cloudapp.azure.com
+ -BackendIPAddresses <your first web app FQDN>, <your second web app FQDN>
$fp01 = New-AzApplicationGatewayFrontendPort -Name "SSLPort" -Port 443
$fp02 = New-AzApplicationGatewayFrontendPort -Name "HTTPPort" -Port 80
$listener02 = New-AzApplicationGatewayHttpListener -Name "HTTPListener" `
    -Protocol Http -FrontendIPConfiguration $fip -FrontendPort $fp02
$setting = New-AzApplicationGatewayBackendHttpSettings -Name "BackendHttpSetting1" `
- -Port 80 -Protocol Http -CookieBasedAffinity Disabled
+ -Port 80 -Protocol Http -CookieBasedAffinity Disabled -PickHostNameFromBackendAddress
$rule01 = New-AzApplicationGatewayRequestRoutingRule -Name "Rule1" -RuleType basic `
    -BackendHttpSettings $setting -HttpListener $listener01 -BackendAddressPool $pool
$rule02 = New-AzApplicationGatewayRequestRoutingRule -Name "Rule2" -RuleType basic `
$rule02 = New-AzApplicationGatewayRequestRoutingRule -Name "Rule2" -RuleType bas
## Specify autoscale
-Now you can specify the autoscale configuration for the application gateway. Two autoscaling configuration types are supported:
-
-* **Fixed capacity mode**. In this mode, the application gateway does not autoscale and operates at a fixed Scale Unit capacity.
-
- ```azurepowershell
- $sku = New-AzApplicationGatewaySku -Name Standard_v2 -Tier Standard_v2 -Capacity 2
- ```
-
-* **Autoscaling mode**. In this mode, the application gateway autoscales based on the application traffic pattern.
+Now you can specify the autoscale configuration for the application gateway.
```azurepowershell
$autoscaleConfig = New-AzApplicationGatewayAutoscaleConfiguration -MinCapacity 2
$sku = New-AzApplicationGatewaySku -Name Standard_v2 -Tier Standard_v2
```
+In this mode, the application gateway autoscales based on the application traffic pattern.
## Create the application gateway
$appgw = New-AzApplicationGateway -Name "AutoscalingAppGw" -Zone 1,2,3 `
Use Get-AzPublicIPAddress to get the public IP address of the application gateway. Copy the public IP address or DNS name, and then paste it into the address bar of your browser.
-`Get-AzPublicIPAddress -ResourceGroupName $rg -Name AppGwVIP`
+```azurepowershell
+$pip = Get-AzPublicIPAddress -ResourceGroupName $rg -Name AppGwVIP
+$pip.IpAddress
+```
+
## Clean up resources
attestation Quickstart Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/attestation/quickstart-powershell.md
# Quickstart: Set up Azure Attestation with Azure PowerShell
-Follow the below steps to create and configure an attestation provider using Azure PowerShell. See [Overview of Azure PowerShell](/powershell/azure/?view=azps-2.8.0&viewFallbackFrom=azps-2.4.0) for information on how to install and run Azure PowerShell.
+Follow the below steps to create and configure an attestation provider using Azure PowerShell. See [Overview of Azure PowerShell](/powershell/azure/) for information on how to install and run Azure PowerShell.
Note that the PowerShell Gallery has deprecated Transport Layer Security (TLS) versions 1.0 and 1.1; TLS 1.2 or a later version is recommended. As a result, you may receive the following errors:
automation Automation Runbook Gallery https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/automation-runbook-gallery.md
The list below contains a few runbooks that support common scenarios. For a full
1. Select **Source: PowerShell Gallery**. This shows a list of available runbooks that you can browse.
1. You can use the search box above the list to narrow the list, or you can use the filters to narrow the display by publisher, type, and sort. Locate the gallery item you want and select it to view its details.
- :::image type="content" source="media/automation-runbook-gallery/browse-gallery-sm.png" alt-text="Browsing the runbook gallery" lightbox="media/automation-runbook-gallery/browse-gallery-lg.png":::
+ :::image type="content" source="media/automation-runbook-gallery/browse-gallery-sm.png" alt-text="Browsing the runbook gallery." lightbox="media/automation-runbook-gallery/browse-gallery-lg.png":::
1. To import an item, click **Import** on the details blade.
- :::image type="content" source="media/automation-runbook-gallery/gallery-item-detail-sm.png" alt-text="Show a runbook gallery item detail" lightbox="media/automation-runbook-gallery/gallery-item-detail-lg.png":::
+ :::image type="content" source="media/automation-runbook-gallery/gallery-item-detail-sm.png" alt-text="Show a runbook gallery item detail." lightbox="media/automation-runbook-gallery/gallery-item-detail-lg.png":::
1. Optionally, change the name of the runbook and then click **OK** to import the runbook.
1. The runbook appears on the **Runbooks** tab for the Automation account.
The list below contains a few runbooks that support common scenarios. For a full
1. Select **Source: GitHub**.
1. You can use the filters above the list to narrow the display by publisher, type, and sort. Locate the gallery item you want and select it to view its details.
- :::image type="content" source="media/automation-runbook-gallery/browse-gallery-github-sm.png" alt-text="Browsing the GitHub gallery" lightbox="media/automation-runbook-gallery/browse-gallery-github-lg.png":::
+ :::image type="content" source="media/automation-runbook-gallery/browse-gallery-github-sm.png" alt-text="Browsing the GitHub gallery." lightbox="media/automation-runbook-gallery/browse-gallery-github-lg.png":::
1. To import an item, click **Import** on the details blade.
- :::image type="content" source="media/automation-runbook-gallery/gallery-item-details-blade-github-sm.png" alt-text="Detailed view of a runbook from the GitHub gallery" lightbox="media/automation-runbook-gallery/gallery-item-details-blade-github-lg.png":::
+ :::image type="content" source="media/automation-runbook-gallery/gallery-item-details-blade-github-sm.png" alt-text="Detailed view of a runbook from the GitHub gallery." lightbox="media/automation-runbook-gallery/gallery-item-details-blade-github-lg.png":::
1. Optionally, change the name of the runbook and then click **OK** to import the runbook.
1. The runbook appears on the **Runbooks** tab for the Automation account.
Microsoft encourages you to add runbooks to the PowerShell Gallery that you thin
1. Select **Modules** under **Shared Resources** to open the list of modules.
1. Click **Browse gallery** from the top of the page.
- :::image type="content" source="media/automation-runbook-gallery/modules-blade-sm.png" alt-text="View of the module gallery" lightbox="media/automation-runbook-gallery/modules-blade-lg.png":::
+ :::image type="content" source="media/automation-runbook-gallery/modules-blade-sm.png" alt-text="View of the module gallery." lightbox="media/automation-runbook-gallery/modules-blade-lg.png":::
1. On the Browse gallery page, you can use the search box to find matches in any of the following fields:
Microsoft encourages you to add runbooks to the PowerShell Gallery that you thin
When you drill into a specific module, you can view more information. This information includes a link back to the PowerShell Gallery, any required dependencies, and all of the cmdlets or DSC resources that the module contains.
- :::image type="content" source="media/automation-runbook-gallery/gallery-item-details-blade-sm.png" alt-text="Detailed view of a module from the gallery" lightbox="media/automation-runbook-gallery/gallery-item-details-blade-lg.png":::
+ :::image type="content" source="media/automation-runbook-gallery/gallery-item-details-blade-sm.png" alt-text="Detailed view of a module from the gallery." lightbox="media/automation-runbook-gallery/gallery-item-details-blade-lg.png":::
1. To install the module directly into Azure Automation, click **Import**.
1. On the Import pane, you can see the name of the module to import. If all the dependencies are installed, the **OK** button is activated. If you're missing dependencies, you need to import those dependencies before you can import this module.
automation Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/update-management/overview.md
Title: Azure Automation Update Management overview
description: This article provides an overview of the Update Management feature that implements updates for your Windows and Linux machines. Previously updated : 01/22/2021 Last updated : 03/08/2021 # Update Management overview
The following table lists the supported operating systems for update assessments
|Windows Server 2008 R2 (RTM and SP1 Standard)| Update Management supports assessments and patching for this operating system. The [Hybrid Runbook Worker](../automation-windows-hrw-install.md) is supported for Windows Server 2008 R2. | |CentOS 6 and 7 (x64) | Linux agents require access to an update repository. Classification-based patching requires `yum` to return security data that CentOS doesn't have in its RTM releases. For more information on classification-based patching on CentOS, see [Update classifications on Linux](view-update-assessments.md#linux). | |Red Hat Enterprise 6 and 7 (x64) | Linux agents require access to an update repository. |
-|SUSE Linux Enterprise Server 12 (x64) | Linux agents require access to an update repository. |
+|SUSE Linux Enterprise Server 12, 15, and 15.1 (x64) | Linux agents require access to an update repository. For SUSE 15.x, Python 3 is required on the machine. |
|Ubuntu 14.04 LTS, 16.04 LTS, and 18.04 LTS (x64) |Linux agents require access to an update repository. |

> [!NOTE]
automation Pre Post Scripts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/update-management/pre-post-scripts.md
Title: Manage pre-scripts and post-scripts in your Update Management deployment
description: This article tells how to configure and manage pre-scripts and post-scripts for update deployments. Previously updated : 12/17/2020 Last updated : 03/08/2021
Pre-scripts and post-scripts are runbooks to run in your Azure Automation accoun
For a runbook to be used as a pre-script or post-script, you must import it into your Automation account and [publish the runbook](../manage-runbooks.md#publish-a-runbook).
+Currently, only PowerShell and Python 2 runbooks are supported as pre-scripts and post-scripts. Other runbook types, such as Python 3, Graphical, PowerShell Workflow, and Graphical PowerShell Workflow, are not supported.
+
## Pre-script and post-script parameters

When you configure pre-scripts and post-scripts, you can pass in parameters just like scheduling a runbook. Parameters are defined at the time of update deployment creation. Pre-scripts and post-scripts support the following types:
A full example with all properties can be found at: [Get software update configu
> [!NOTE] > The `SoftwareUpdateConfigurationRunContext` object can contain duplicate entries for machines. This can cause pre-scripts and post-scripts to run multiple times on the same machine. To work around this behavior, use `Sort-Object -Unique` to select only unique VM names.
-> [!NOTE]
-> Currently only PowerShell runbooks are supported as Pre/Post scripts. Other runbook types like Python, Graphical, PowerShell Workflow, Graphical PowerShell Workflow are currently not supported as Pre/Post scripts.
-
## Use a pre-script or post-script in a deployment

To use a pre-script or post-script in an update deployment, start by creating an update deployment. Select **Pre-scripts + Post-Scripts**. This action opens the **Select Pre-scripts + Post-scripts** page.
By selecting the update deployment run, you're shown additional details of pre-s
## Stop a deployment
-If you want to stop a deployment based on a pre-script, you must [throw](../automation-runbook-execution.md#throw) an exception. If you don't, the deployment and post-script will still run. The following code snippet shows how to throw an exception.
+If you want to stop a deployment based on a pre-script, you must [throw](../automation-runbook-execution.md#throw) an exception. If you don't, the deployment and post-script will still run. The following code snippet shows how to throw an exception using PowerShell.
```powershell
#In this case, we want to terminate the patch job if any run fails.
foreach($summary in $finalStatus)
}
```
+In Python 2, exception handling is managed in a [try](https://www.python-course.eu/exception_handling.php) block.
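For illustration only, here is a minimal Python sketch (compatible with Python 2 and 3, and not taken from the original article) of how a pre-script might stop the deployment by raising an exception when any run failed; the names `stop_if_any_failed` and `final_status` are hypothetical:

```python
# Hypothetical sketch: a Python pre-script can stop the update deployment
# by raising an unhandled exception, so the deployment and post-script
# do not run. `final_status` stands in for job results gathered earlier.
def stop_if_any_failed(final_status):
    failed = [s["job"] for s in final_status if s["status"] != "Completed"]
    if failed:
        raise Exception("Jobs failed, stopping deployment: " + ", ".join(failed))

# Catching the exception here is only to demonstrate the behavior;
# in a real pre-script you would let it propagate.
try:
    stop_if_any_failed([
        {"job": "run-1", "status": "Completed"},
        {"job": "run-2", "status": "Failed"},
    ])
except Exception as exc:
    print(exc)  # Jobs failed, stopping deployment: run-2
```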
+
## Interact with machines

Pre-scripts and post-scripts run as runbooks in your Automation account and not directly on the machines in your deployment. Pre-tasks and post-tasks also run in the Azure context and don't have access to non-Azure machines. The following sections show how you can interact with the machines directly, whether they're Azure VMs or non-Azure machines.
if (<My custom error logic>)
}
```
+In Python 2, if you want to raise an error when a certain condition occurs, use a [raise](https://docs.python.org/2.7/reference/simple_stmts.html#the-raise-statement) statement.
+
+```python
+if <my custom error logic>:
+    raise Exception('Something happened.')
+```
+
## Samples

Samples for pre-scripts and post-scripts can be found in the [Azure Automation GitHub organization](https://github.com/azureautomation) and the [PowerShell Gallery](https://www.powershellgallery.com/packages?q=Tags%3A%22UpdateManagement%22+Tags%3A%22Automation%22), or you can import them through the Azure portal. To do that, in your Automation account, under **Process Automation**, select **Runbooks Gallery**. Use **Update Management** for the filter.
azure-app-configuration Howto Backup Config Store https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-app-configuration/howto-backup-config-store.md
To make it easier for you to start backing up your data, we've [tested and publi
If the sample code provided earlier doesn't meet your requirements, you can also create your own function. Your function must be able to perform the following tasks in order to complete the backup:

- Periodically read contents of your queue to see if it contains any notifications from Event Grid. Refer to the [Storage Queue SDK](../storage/queues/storage-quickstart-queues-dotnet.md) for implementation details.
-- If your queue contains [event notifications from Event Grid](./concept-app-configuration-event.md?branch=pr-en-us-112982#event-schema), extract all the unique `<key, label>` information from event messages. The combination of key and label is the unique identifier for key-value changes in the primary store.
+- If your queue contains [event notifications from Event Grid](./concept-app-configuration-event.md#event-schema), extract all the unique `<key, label>` information from event messages. The combination of key and label is the unique identifier for key-value changes in the primary store.
- Read all settings from the primary store. Update only those settings in the secondary store that have a corresponding event in the queue. Delete all settings from the secondary store that were present in the queue but not in the primary store. You can use the [App Configuration SDK](https://github.com/Azure/AppConfiguration#sdks) to access your configuration stores programmatically.
- Delete messages from the queue if there were no exceptions during processing.
- Implement error handling according to your needs. Refer to the preceding code sample to see some common exceptions that you might want to handle.
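The sync logic in the steps above can be sketched in Python with in-memory stand-ins for the queue and the two stores; this is an illustrative assumption, not the article's code, and a real function would use the Storage Queue and App Configuration SDKs instead:

```python
# Minimal sketch of the backup sync algorithm. `queue` holds (key, label)
# pairs extracted from Event Grid notifications; the stores are dicts
# mapping (key, label) -> value. All names here are hypothetical.
def sync_changes(queue, primary, secondary):
    changed = set(queue)  # unique <key, label> identifiers from the queue
    for ident in changed:
        if ident in primary:
            # Setting changed or added in the primary store: copy it over.
            secondary[ident] = primary[ident]
        else:
            # In the queue but no longer in primary: delete the backup copy.
            secondary.pop(ident, None)
    # Delete processed messages only after successful processing.
    del queue[:]

primary = {("color", "prod"): "blue", ("size", "prod"): "large"}
secondary = {("color", "prod"): "red", ("retired", "prod"): "x"}
queue = [("color", "prod"), ("retired", "prod"), ("retired", "prod")]

sync_changes(queue, primary, secondary)
print(secondary)  # {('color', 'prod'): 'blue'}
```

Settings that have no corresponding event in the queue (such as `("size", "prod")` above) are left untouched in the secondary store, matching the incremental behavior the steps describe.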
azure-cache-for-redis Cache Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-best-practices.md
If you would like to test how your code works under error conditions, consider u
* **We recommend using Dv2 VM Series** for your client as they have better hardware and will give the best results.
* Make sure the client VM you use has **at least as much compute and bandwidth** as the cache being tested.
* **Test under failover conditions** on your cache. It's important to ensure that you don't performance test your cache only under steady state conditions. Also test under failover conditions and measure the CPU / Server Load on your cache during that time. You can initiate a failover by [rebooting the primary node](cache-administration.md#reboot). This will allow you to see how your application behaves in terms of throughput and latency during failover conditions (happens during updates and can happen during an unplanned event). Ideally you don't want to see CPU / Server Load peak to more than 80% even during a failover as that can affect performance.
- * **Premium P2 and above** are hosted on VMs with 4 or more cores. This is useful to distribute the TLS encryption / decryption workload across multiple cores to bring down overall CPU usage. [See here for details around VM sizes and cores](cache-planning-faq.md#azure-cache-for-redis-performance)
+ * **Some cache sizes** are hosted on VMs with 4 or more cores. This is useful to distribute the TLS encryption / decryption as well as TLS connection / disconnection workloads across multiple cores to bring down overall CPU usage on the cache VMs. [See here for details around VM sizes and cores](cache-planning-faq.md#azure-cache-for-redis-performance)
* **Enable VRSS** on the client machine if you are on Windows. [See here for details](/previous-versions/windows/it-pro/windows-server-2012-R2-and-2012/dn383582(v=ws.11)). Example PowerShell script:

  >PowerShell -ExecutionPolicy Unrestricted Enable-NetAdapterRSS -Name ( Get-NetAdapter).Name
Test GET requests using a 1k payload.
**To test throughput:** Pipelined GET requests with 1k payload.
-> redis-benchmark -h yourcache.redis.cache.windows.net -a yourAccesskey -t GET -n 1000000 -d 1024 -P 50 -c 50
+> redis-benchmark -h yourcache.redis.cache.windows.net -a yourAccesskey -t GET -n 1000000 -d 1024 -P 50 -c 50
azure-cache-for-redis Cache Migration Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-migration-guide.md
General steps to implement this option are:
2. Save a snapshot of the existing Redis cache. You can [configure Redis to save snapshots](https://redis.io/topics/persistence) periodically, or run the process manually using the [SAVE](https://redis.io/commands/save) or [BGSAVE](https://redis.io/commands/bgsave) commands. The RDB file is named "dump.rdb" by default and will be located at the path specified in the *redis.conf* configuration file.

   > [!NOTE]
- > If youΓÇÖre migrating data within Azure Cache for Redis, see [these instructions on how to export an RDB file](cache-how-to-import-export-data.md) or use the [PowerShell Export cmdlet](/powershell/module/azurerm.rediscache/export-azurermrediscache?view=azurermps-6.13.0&viewFallbackFrom=azurermps-6.4.0) instead.
+ > If you're migrating data within Azure Cache for Redis, see [these instructions on how to export an RDB file](cache-how-to-import-export-data.md) or use the [PowerShell Export cmdlet](/powershell/module/azurerm.rediscache/export-azurermrediscache) instead.
> 3. Copy the RDB file to an Azure storage account in the region where your new cache is located. You can use AzCopy for this task.
-4. Import the RDB file into the new cache using these [import instructions](cache-how-to-import-export-data.md) or the [PowerShell Import cmdlet](/powershell/module/azurerm.rediscache/import-azurermrediscache?view=azurermps-6.13.0&viewFallbackFrom=azurermps-6.4.0).
+4. Import the RDB file into the new cache using these [import instructions](cache-how-to-import-export-data.md) or the [PowerShell Import cmdlet](/powershell/module/azurerm.rediscache/import-azurermrediscache).
5. Update your application to use the new cache instance.
azure-functions Create First Function Cli Node https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/create-first-function-cli-node.md
If desired, you can skip to [Run the function locally](#run-the-function-locally
:::code language="javascript" source="~/functions-quickstart-templates/Functions.Templates/Templates/HttpTrigger-JavaScript/index.js":::
-For an HTTP trigger, the function receives request data in the variable `req` as defined in *function.json*. The return object, defined as `$return` in *function.json*, is the response. To learn more, see [Azure Functions HTTP triggers and bindings](./functions-bindings-http-webhook.md?tabs=javascript).
+For an HTTP trigger, the function receives request data in the variable `req` as defined in *function.json*. The response is defined as `res` in *function.json* and can be accessed using `context.res`. To learn more, see [Azure Functions HTTP triggers and bindings](./functions-bindings-http-webhook.md?tabs=javascript).
#### function.json
azure-functions Dotnet Isolated Process Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/dotnet-isolated-process-guide.md
Last updated 03/01/2021
+#Customer intent: As a developer, I need to know how to create functions that run in an isolated process so that I can run my function code on current (not LTS) releases of .NET.
# Guide for running functions on .NET 5.0 in Azure
The following packages are required to run your .NET functions in an isolated pr
Because functions that run in a .NET isolated process use different binding types, they require a unique set of binding extension packages. You'll find these extension packages under [Microsoft.Azure.Functions.Worker.Extensions](https://www.nuget.org/packages?q=Microsoft.Azure.Functions.Worker.Extensions).
-
+
## Start-up and configuration

When using .NET isolated functions, you have access to the start-up of your function app, which is usually in Program.cs. You're responsible for creating and starting your own host instance. As such, you also have direct access to the configuration pipeline for your app. You can much more easily inject dependencies and run middleware when running out-of-process. The following code shows an example of a `HostBuilder` pipeline:

A `HostBuilder` is used to build and return a fully initialized `IHost` instance, which you run asynchronously to start your function app.

### Configuration
Having access to the host builder pipeline means that you can set any app-specif
The following example shows how to add configuration `args`, which are read as command-line arguments:
-The `ConfigureAppConfiguration` method is used to configure the rest of the build process and application. This example also uses an [IConfigurationBuilder](/dotnet/api/microsoft.extensions.configuration.iconfigurationbuilder?view=dotnet-plat-ext-5.0&preserve-view=true), which makes it easier to add multiple configuration items. Because `ConfigureAppConfiguration` returns the same instance of [`IConfiguration `](/dotnet/api/microsoft.extensions.configuration.iconfiguration?view=dotnet-plat-ext-5.0&preserve-view=true), you can also just call it multiple times to add multiple configuration items. You can access the full set of configurations from both [`HostBuilderContext.Configuration`](/dotnet/api/microsoft.extensions.hosting.hostbuildercontext.configuration?view=dotnet-plat-ext-5.0&preserve-view=true) and [`IHost.Services`](/dotnet/api/microsoft.extensions.hosting.ihost.services?view=dotnet-plat-ext-5.0&preserve-view=true).
+The `ConfigureAppConfiguration` method is used to configure the rest of the build process and application. This example also uses an [IConfigurationBuilder](/dotnet/api/microsoft.extensions.configuration.iconfigurationbuilder?view=dotnet-plat-ext-5.0&preserve-view=true), which makes it easier to add multiple configuration items. Because `ConfigureAppConfiguration` returns the same instance of [`IConfiguration`](/dotnet/api/microsoft.extensions.configuration.iconfiguration?view=dotnet-plat-ext-5.0&preserve-view=true), you can also just call it multiple times to add multiple configuration items. You can access the full set of configurations from both [`HostBuilderContext.Configuration`](/dotnet/api/microsoft.extensions.hosting.hostbuildercontext.configuration?view=dotnet-plat-ext-5.0&preserve-view=true) and [`IHost.Services`](/dotnet/api/microsoft.extensions.hosting.ihost.services?view=dotnet-plat-ext-5.0&preserve-view=true).
To learn more about configuration, see [Configuration in ASP.NET Core](/aspnet/core/fundamentals/configuration/?view=aspnetcore-5.0&preserve-view=true).
Dependency injection is simplified, compared to .NET class libraries. Rather tha
The following example injects a singleton service dependency:

To learn more, see [Dependency injection in ASP.NET Core](/aspnet/core/fundamentals/dependency-injection?view=aspnetcore-5.0&preserve-view=true).
To learn more, see [Dependency injection in ASP.NET Core](/aspnet/core/fundament
While the full middleware registration set of APIs is not yet exposed, middleware registration is supported and we've added an example to the sample application under the Middleware folder.

## Execution context
While the full middleware registration set of APIs is not yet exposed, middlewar
Bindings are defined by using attributes on methods, parameters, and return types. A function method is a method with a `Function` and a trigger attribute applied to an input parameter, as shown in the following example:

The trigger attribute specifies the trigger type and binds input data to a method parameter. The previous example function is triggered by a queue message, and the queue message is passed to the method in the `myQueueItem` parameter.
A function can have zero or more input bindings that can pass data to a function
To write to an output binding, you must apply an output binding attribute to the function method, which defines how to write to the bound service. The value returned by the method is written to the output binding. For example, the following example writes a string value to a message queue named `functiontesting2` by using an output binding:

### Multiple output bindings

The data written to an output binding is always the return value of the function. If you need to write to more than one output binding, you must create a custom return type. This return type must have the output binding attribute applied to one or more properties of the class. The following example writes to both an HTTP response and a queue output binding:

### HTTP trigger
Likewise, the function returns an `HttpResponseData` object, which provides data
The following code is an HTTP trigger.

## Logging
In .NET isolated, you can write to logs by using an [`ILogger`](/dotnet/api/micr
The following example shows how to get an `ILogger` and write logs inside a function:

Use various methods of `ILogger` to write various log levels, such as `LogWarning` or `LogError`. To learn more about log levels, see the [monitoring article](functions-monitoring.md#log-levels-and-categories).
This section describes the current state of the functional and behavioral differ
| Logging | [`ILogger`](/dotnet/api/microsoft.extensions.logging.ilogger?view=dotnet-plat-ext-5.0&preserve-view=true) passed to the function | [`ILogger`](/dotnet/api/microsoft.extensions.logging.ilogger?view=dotnet-plat-ext-5.0&preserve-view=true) obtained from `FunctionContext` | | Cancellation tokens | [Supported](functions-dotnet-class-library.md#cancellation-tokens) | Not supported | | Output bindings | Out parameters | Return values |
-| Output binding types | `IAsyncCollector`, [DocumentClient](/dotnet/api/microsoft.azure.documents.client.documentclient), [BrokeredMessage](/dotnet/api/microsoft.servicebus.messaging.brokeredmessage), and other client-specific types | Simple types, JSON serializable types, and arrays. |
+| Output binding types | `IAsyncCollector`, [DocumentClient](/dotnet/api/microsoft.azure.documents.client.documentclient?view=azure-dotnet&preserve-view=true), [BrokeredMessage](/dotnet/api/microsoft.servicebus.messaging.brokeredmessage?view=azure-dotnet&preserve-view=true), and other client-specific types | Simple types, JSON serializable types, and arrays. |
| Multiple output bindings | Supported | [Supported](#multiple-output-bindings) | | HTTP trigger | [`HttpRequest`](/dotnet/api/microsoft.aspnetcore.http.httprequest?view=aspnetcore-5.0&preserve-view=true)/[`ObjectResult`](/dotnet/api/microsoft.aspnetcore.mvc.objectresult?view=aspnetcore-5.0&preserve-view=true) | `HttpRequestData`/`HttpResponseData` | | Durable Functions | [Supported](durable/durable-functions-overview.md) | Not supported |
azure-functions Durable Functions Serialization And Persistence https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/durable/durable-functions-serialization-and-persistence.md
namespace MyApplication
{ public override void Configure(IFunctionsHostBuilder builder) {
- builder.Services.AddSingleton<IMessageSerializerSettingsFactory, CustomMessageSerializerSettingFactory>();
+ builder.Services.AddSingleton<IMessageSerializerSettingsFactory, CustomMessageSerializerSettingsFactory>();
builder.Services.AddSingleton<IErrorSerializerSettingsFactory, CustomErrorSerializerSettingsFactory>(); }
azure-functions Functions Bindings Storage Blob Input https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-storage-blob-input.md
Access the blob data via a parameter that matches the name designated by binding
# [Python](#tab/python)
-Access blob data via the parameter typed as [InputStream](/python/api/azure-functions/azure.functions.inputstream?view=azure-python&preserve-view=true). Refer to the [input example](#example) for details.
+Access blob data via the parameter typed as [InputStream](/python/api/azure-functions/azure.functions.inputstream). Refer to the [input example](#example) for details.
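As a rough sketch of that access pattern, the snippet below treats the bound blob as a stream-like object. Note the assumptions: `io.BytesIO` is only a stand-in for `azure.functions.InputStream` (both expose `read()`), and the `main` signature and names are illustrative, not the actual runtime contract:

```python
import io
import typing

# Sketch only: io.BytesIO stands in for azure.functions.InputStream,
# which the Functions runtime passes for a bound blob.
def main(inputblob: typing.IO[bytes]) -> str:
    data = inputblob.read()  # read the entire blob payload as bytes
    return f"Blob size: {len(data)} bytes"

# Simulated invocation with an in-memory "blob"
print(main(io.BytesIO(b"hello world")))  # → Blob size: 11 bytes
```

In a real function, the runtime performs the call to `main`; the simulated invocation is shown only to make the pattern concrete.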
azure-functions Functions Bindings Storage Blob Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-storage-blob-trigger.md
Access the blob data via a parameter that matches the name designated by binding
# [Python](#tab/python)
-Access blob data via the parameter typed as [InputStream](/python/api/azure-functions/azure.functions.inputstream?view=azure-python&preserve-view=true). Refer to the [trigger example](#example) for details.
+Access blob data via the parameter typed as [InputStream](/python/api/azure-functions/azure.functions.inputstream). Refer to the [trigger example](#example) for details.
azure-functions Functions Bindings Storage Queue Output https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-storage-queue-output.md
There are two options for outputting a Queue message from a function:
- **Return value**: Set the `name` property in *function.json* to `$return`. With this configuration, the function's return value is persisted as a Queue storage message. -- **Imperative**: Pass a value to the [set](/python/api/azure-functions/azure.functions.out?view=azure-python&preserve-view=true#set-val--t--none) method of the parameter declared as an [Out](/python/api/azure-functions/azure.functions.out?view=azure-python&preserve-view=true) type. The value passed to `set` is persisted as a Queue storage message.
+- **Imperative**: Pass a value to the [set](/python/api/azure-functions/azure.functions.out#set-val--t--none) method of the parameter declared as an [Out](/python/api/azure-functions/azure.functions.out) type. The value passed to `set` is persisted as a Queue storage message.
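The two options can be sketched as follows. The `Out` class here is a hypothetical stand-in for `azure.functions.Out`, shown only to illustrate how `set` hands a value back to the Functions host alongside a return-value output:

```python
from typing import Generic, Optional, TypeVar

T = TypeVar("T")

class Out(Generic[T]):
    """Hypothetical stand-in for azure.functions.Out[T] (sketch only)."""
    def __init__(self) -> None:
        self._value: Optional[T] = None
    def set(self, val: T) -> None:
        # The Functions host reads this value after main() returns
        # and persists it as the queue message.
        self._value = val
    def get(self) -> Optional[T]:
        return self._value

def main(req_body: str, msg: Out[str]) -> str:
    msg.set(req_body)           # imperative output: queue message
    return "HTTP 202 Accepted"  # return-value output (binding name $return)

# Simulated invocation: the host would construct Out and inspect it
queue_out: Out[str] = Out()
print(main("order-42", queue_out), "| queued:", queue_out.get())
```

In a real function app, the binding names and directions come from *function.json*; only the `set`/return split is the point of this sketch.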
azure-functions Functions Bindings Storage Queue Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-storage-queue-trigger.md
Access the queue message via a string parameter that matches the name designated b
# [Python](#tab/python)
-Access the queue message via the parameter typed as [QueueMessage](/python/api/azure-functions/azure.functions.queuemessage?view=azure-python&preserve-view=true).
+Access the queue message via the parameter typed as [QueueMessage](/python/api/azure-functions/azure.functions.queuemessage).
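A minimal sketch of that access pattern follows; the class below is an illustrative stand-in, not the real `azure.functions.QueueMessage`, and only the `get_body` accessor is modeled:

```python
class QueueMessage:
    """Minimal stand-in for azure.functions.QueueMessage (sketch only)."""
    def __init__(self, body: bytes) -> None:
        self._body = body
    def get_body(self) -> bytes:
        # The real type also exposes metadata such as id and dequeue_count.
        return self._body

def main(msg: QueueMessage) -> str:
    # Decode the queue payload for processing
    return msg.get_body().decode("utf-8")

# Simulated invocation with a fake queue item
print(main(QueueMessage(b"work-item-1")))  # → work-item-1
```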
azure-functions Functions Bindings Storage Table Output https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-bindings-storage-table-output.md
There are two options for outputting a Table storage row message from a function
- **Return value**: Set the `name` property in *function.json* to `$return`. With this configuration, the function's return value is persisted as a Table storage row. -- **Imperative**: Pass a value to the [set](/python/api/azure-functions/azure.functions.out?view=azure-python&preserve-view=true#set-val--t--none) method of the parameter declared as an [Out](/python/api/azure-functions/azure.functions.out?view=azure-python&preserve-view=true) type. The value passed to `set` is persisted as an Event Hub message.
+- **Imperative**: Pass a value to the [set](/python/api/azure-functions/azure.functions.out#set-val--t--none) method of the parameter declared as an [Out](/python/api/azure-functions/azure.functions.out) type. The value passed to `set` is persisted as a Table storage row.
azure-functions Functions Reference Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-functions/functions-reference-python.md
def main(req: azure.functions.HttpRequest) -> str:
    user = req.params.get('user')
    return f'Hello, {user}!' ```
-Use the Python annotations included in the [azure.functions.*](/python/api/azure-functions/azure.functions?view=azure-python&preserve-view=true) package to bind input and outputs to your methods.
+Use the Python annotations included in the [azure.functions.*](/python/api/azure-functions/azure.functions) package to bind inputs and outputs to your methods.
## Alternate entry point
Output can be expressed both in return value and output parameters. If there's o
To use the return value of a function as the value of an output binding, the `name` property of the binding should be set to `$return` in `function.json`.
-To produce multiple outputs, use the `set()` method provided by the [`azure.functions.Out`](/python/api/azure-functions/azure.functions.out?view=azure-python&preserve-view=true) interface to assign a value to the binding. For example, the following function can push a message to a queue and also return an HTTP response.
+To produce multiple outputs, use the `set()` method provided by the [`azure.functions.Out`](/python/api/azure-functions/azure.functions.out) interface to assign a value to the binding. For example, the following function can push a message to a queue and also return an HTTP response.
```json {
For scaling and performance best practices for Python function apps, please refe
## Context
-To get the invocation context of a function during execution, include the [`context`](/python/api/azure-functions/azure.functions.context?view=azure-python&preserve-view=true) argument in its signature.
+To get the invocation context of a function during execution, include the [`context`](/python/api/azure-functions/azure.functions.context) argument in its signature.
For example:
def main(req: azure.functions.HttpRequest,
         context: azure.functions.Context) -> str:
    return f'{context.invocation_id}' ```
-The [**Context**](/python/api/azure-functions/azure.functions.context?view=azure-python&preserve-view=true) class has the following string attributes:
+The [**Context**](/python/api/azure-functions/azure.functions.context) class has the following string attributes:
`function_directory` The directory in which the function is running.
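The following is a minimal sketch of how those attributes might be consumed. The dataclass is an illustrative stand-in, not the actual `azure.functions.Context` implementation; `function_name` and `invocation_id` are attributes the real class exposes, and the values are made up:

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Illustrative stand-in for azure.functions.Context (sketch only)."""
    function_directory: str
    function_name: str
    invocation_id: str

def main(req_body: str, context: Context) -> str:
    # Correlate the response with this specific invocation
    return f"{context.function_name}:{context.invocation_id}"

# Simulated invocation; in a real app the runtime supplies the context
ctx = Context("/home/site/wwwroot/HttpTrigger", "HttpTrigger", "abc-123")
print(main("ping", ctx))  # → HttpTrigger:abc-123
```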
All known issues and feature requests are tracked using [GitHub issues](https://
For more information, see the following resources:
-* [Azure Functions package API documentation](/python/api/azure-functions/azure.functions?view=azure-python&preserve-view=true)
+* [Azure Functions package API documentation](/python/api/azure-functions/azure.functions)
* [Best practices for Azure Functions](functions-best-practices.md) * [Azure Functions triggers and bindings](functions-triggers-bindings.md) * [Blob storage bindings](functions-bindings-storage-blob.md)
For more information, see the following resources:
[Having issues? Let us know.](https://aka.ms/python-functions-ref-survey)
-[HttpRequest]: /python/api/azure-functions/azure.functions.httprequest?view=azure-python&preserve-view=true
-[HttpResponse]: /python/api/azure-functions/azure.functions.httpresponse?view=azure-python&preserve-view=true
+[HttpRequest]: /python/api/azure-functions/azure.functions.httprequest
+[HttpResponse]: /python/api/azure-functions/azure.functions.httpresponse
azure-government Compare Azure Government Global Azure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/compare-azure-government-global-azure.md
ms.devlang: na
na Previously updated : 02/17/2021 Last updated : 03/07/2021 # Compare Azure Government and global Azure
Last updated 02/17/2021
Microsoft Azure Government uses the same underlying technologies as global Azure, which includes the core components of [Infrastructure-as-a-Service (IaaS)](https://azure.microsoft.com/overview/what-is-iaas/), [Platform-as-a-Service (PaaS)](https://azure.microsoft.com/overview/what-is-paas/), and [Software-as-a-Service (SaaS)](https://azure.microsoft.com/overview/what-is-saas/). Both Azure and Azure Government have the same comprehensive security controls in place, as well as the same Microsoft commitment to the safeguarding of customer data. While both cloud environments are assessed and authorized at the FedRAMP High impact level, Azure Government provides an additional layer of protection to customers through contractual commitments regarding storage of customer data in the United States and limiting potential access to systems processing customer data to screened US persons. These commitments may be of interest to customers using the cloud to store or process data subject to US export control regulations such as the EAR, ITAR, and DoE 10 CFR Part 810.

### Export control implications
-Customers are responsible for designing and deploying their applications to meet [export control requirements](./documentation-government-overview-itar.md) such as those prescribed in the EAR and ITAR. In doing so, customers should not include sensitive or restricted information in Azure resource names, as explained in [Considerations for naming Azure resources](./documentation-government-concept-naming-resources.md). Data stored or processed in customer VMs, storage accounts, databases, Azure Import/Export, Azure Cache for Redis, ExpressRoute, Azure Cognitive Search, App Service, API Management, and other Azure services suitable for holding, processing, or transmitting customer data can contain export-controlled data. However, metadata for these Azure services is not permitted to contain export-controlled data. This metadata includes all configuration data entered when creating and maintaining an Azure service, including subscription names, service names, server names, database names, tenant role names, resource groups, deployment names, resource names, resource tags, circuit name, etc. It also includes all shipping information that is used to transport media for Azure Import/Export, such as carrier name, tracking number, description, return information, drive list, package list, storage account name, container name, etc. Sensitive data should not be included in HTTP headers sent to the REST API in search/query strings as part of the API.
+
+Customers are responsible for designing and deploying their applications to meet [export control requirements](./documentation-government-overview-itar.md) such as those prescribed in the EAR and ITAR. In doing so, customers should not include sensitive or restricted information in Azure resource names, as explained in [Considerations for naming Azure resources](./documentation-government-concept-naming-resources.md). Data stored or processed in customer VMs, storage accounts, databases, Azure Import/Export, Azure Cache for Redis, ExpressRoute, Azure Cognitive Search, App Service, API Management, and other Azure services suitable for holding, processing, or transmitting customer data can contain export-controlled data. However, metadata for these Azure services is not permitted to contain export-controlled data. This metadata includes all configuration data entered when creating and maintaining an Azure service, including subscription names, service names, server names, database names, tenant role names, resource groups, deployment names, resource names, resource tags, circuit name, etc. It also includes all shipping information that is used to transport media for Azure Import/Export, such as carrier name, tracking number, description, return information, drive list, package list, storage account name, container name, etc. Sensitive data should not be included in HTTP headers sent to the REST API in search/query strings as part of the API.
### Guidance for developers+ Azure Government services operate the same way as the corresponding services in global Azure, which is why most of the existing online Azure documentation applies equally well to Azure Government. However, there are some key differences that developers working on applications hosted in Azure Government must be aware of. For detailed information, see [Guidance for developers](./documentation-government-developer-guide.md). As a developer, you must know how to connect to Azure Government, and once you connect you will mostly have the same experience as in global Azure. The table below lists API endpoints in Azure vs. Azure Government for accessing and managing various services. |Service category|Service name|Azure Public|Azure Government|Notes|
Azure Government services operate the same way as the corresponding services in
||Azure Cognitive Search|\*.search.windows.net|\*.search.windows.us|| ### Service availability+ Microsoft's goal is to enable 100% parity in service availability between Azure and Azure Government. For service availability in Azure Government, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). Services available in Azure Government are listed by category and whether they are Generally Available or available through Preview. If a service is available in Azure Government, that fact is not reiterated in the rest of this article. Instead, customers are encouraged to review [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia) for the latest, up-to-date information on service availability. In general, service availability in Azure Government implies that all corresponding service features are available to customers. Variations to this approach and other applicable limitations are tracked and explained in this article based on the main service categories outlined in the [online directory of Azure services](https://azure.microsoft.com/services/). Additional considerations for service deployment and usage in Azure Government are also provided. ## AI + Machine Learning+ This section outlines variations and considerations when using **Azure Bot Service**, **Azure Machine Learning**, and **Cognitive Services** in the Azure Government environment. 
For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=machine-learning-service,bot-service,cognitive-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Azure Bot Service](/azure/bot-service/)+ The following Azure Bot Service **features are not currently available** in Azure Government:+ - BotBuilder V3 Bot Templates - Channels - Cortana channel
The following Azure Bot Service **features are not currently available** in Azur
- Payment Card Feature Commonly used services in bot applications that are not currently available in Azure Government:+ - Application Insights - Speech Service For more information, see [How do I create a bot that uses US Government data center](/azure/bot-service/bot-service-resources-faq-ecosystem#how-do-i-create-a-bot-that-uses-the-us-government-data-center). ### [Azure Machine Learning](../machine-learning/overview-what-is-azure-ml.md)+ For feature variations and limitations, see [Azure Machine Learning sovereign cloud parity](../machine-learning/reference-machine-learning-cloud-parity.md). ### [Content Moderator](../cognitive-services/content-moderator/overview.md)+ The following Content Moderator **features are not currently available** in Azure Government:+ - Review UI and Review APIs. ### [Language Understanding](../cognitive-services/luis/what-is-luis.md)+ The following Language Understanding **features are not currently available** in Azure Government:+ - Speech Requests - Prebuilt Domains ### [Speech service](../cognitive-services/speech-service/overview.md)+ The following Speech service **features are not currently available** in Azure Government:+ - Custom Voice See details of supported locales by features in [Speech service supported regions](../cognitive-services/speech-service/regions.md). For additional information including API endpoints, see [Speech service in sovereign clouds](../cognitive-services/Speech-Service/sovereign-clouds.md). ### [Translator](../cognitive-services/translator/translator-info-overview.md)+ The following Translator **features are not currently available** in Azure Government:+ - Custom Translator - Translator Hub ## Analytics+ This section outlines variations and considerations when using Analytics services in the Azure Government environment. 
For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=data-share,power-bi-embedded,analysis-services,event-hubs,data-lake-analytics,storage,data-catalog,data-factory,synapse-analytics,stream-analytics,databricks,hdinsight&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Azure Databricks](/azure/databricks/scenarios/what-is-azure-databricks)+ For access to Azure Databricks in an Azure Government environment, contact your Microsoft or Databricks account representative. ### [HDInsight](../hdinsight/hadoop/apache-hadoop-introduction.md)+ The following HDInsight **features are not currently available** in Azure Government: - HDInsight on Windows
For secured virtual networks, you will want to allow network security groups (NS
You can see a demo on how to build data-centric solutions on Azure Government using [HDInsight](https://channel9.msdn.com/Blogs/Azure/Cognitive-Services-HDInsight-and-Power-BI-on-Azure-Government). ### [Power BI](/power-bi/service-govus-overview)+ The following Power BI **features are not currently available** in Azure Government: - Portal support
You can see a demo on [how to build data-centric solutions on Azure Government u
> The content pack that typically makes activity logs and such available is not intended for use on Government tenants. The intention is to use Log Analytics for the purpose of the logs that aren't available through the content pack. ### [Power BI Embedded](/azure/power-bi-embedded/)+ The following Power BI Embedded **features are not yet available** in Azure Government: - Portal support ## Compute+ This section outlines variations and considerations when using Compute services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=spring-cloud,azure-vmware-cloudsimple,cloud-services,batch,container-instances,app-service,service-fabric,functions,kubernetes-service,virtual-machine-scale-sets,virtual-machines&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Virtual Machines](../virtual-machines/sizes.md)+ The following Virtual Machines **features are not currently available** in Azure Government: - Settings
The following Virtual Machines **features are not currently available** in Azure
- Ubuntu Advantage support plan ### [Azure Functions](../azure-functions/index.yml)+ When connecting your function app to Application Insights in Azure Government, make sure you use [`APPLICATIONINSIGHTS_CONNECTION_STRING`](../azure-functions/functions-app-settings.md#applicationinsights_connection_string), which lets you customize the Application Insights endpoint. ## Databases+ This section outlines variations and considerations when using Databases services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-api-for-fhir,data-factory,sql-server-stretch-database,redis-cache,database-migration,synapse-analytics,postgresql,mariadb,mysql,sql-database,cosmos-db&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Azure Database for MySQL](../mysql/index.yml)+ The following Azure Database for MySQL **features are not currently available** in Azure Government: - Advanced Threat Protection - Private endpoint connections ### [Azure Database for PostgreSQL](../postgresql/index.yml)+ The following Azure Database for PostgreSQL **features are not currently available** in Azure Government: -- Advanced Threat Protection-- Private endpoint connections-- Hyperscale (Citus) and Flexible Server deployment options
+- Hyperscale (Citus) and Flexible server deployment options
+- The following features of the Single server deployment option
+ - Advanced Threat Protection
+ - Private endpoint connections
+ ## Developer Tools+ This section outlines variations and considerations when using Developer Tools services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=app-configuration,devtest-lab,lab-services,azure-devops&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Azure DevTest Labs](../devtest-labs/devtest-lab-overview.md)+ The following Azure DevTest Labs **features are not currently available** in Azure Government:+ - Auto shutdown feature for Azure Compute VMs; however, setting auto shutdown for [Labs](https://azure.microsoft.com/updates/azure-devtest-labs-auto-shutdown-notification/) and [Lab Virtual Machines](https://azure.microsoft.com/updates/azure-devtest-labs-set-auto-shutdown-for-a-single-lab-vm/) is available. ## Internet of Things+ This section outlines variations and considerations when using Internet of Things services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=api-management,cosmos-db,notification-hubs,logic-apps,stream-analytics,machine-learning-studio,machine-learning-service,event-grid,functions,azure-rtos,azure-maps,iot-central,iot-hub&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Azure IoT Hub](../iot-hub/index.yml)+ If you are using the IoT Hub connection string (instead of the Event Hub-compatible settings) with the Microsoft Azure Service Bus .NET client library to receive telemetry or operations monitoring events, then be sure to use `WindowsAzure.ServiceBus` NuGet package version 4.1.2 or higher. 
## Management and Governance+ This section outlines variations and considerations when using Management and Governance services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=managed-applications,azure-policy,network-watcher,monitor,traffic-manager,automation,scheduler,site-recovery,cost-management,backup,blueprints,advisor&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). > [!NOTE] >This article has been updated to use the new Azure PowerShell Az module. You can still use the AzureRM module, which will continue to receive bug fixes until at least December 2020. To learn more about the new Az module and AzureRM compatibility, see [**Introducing the new Azure PowerShell Az module**](/powershell/azure/new-azureps-module-az?preserve-view=true&view=azps-3.3.0). For Az module installation instructions, see [**Install Azure PowerShell**](/powershell/azure/install-az-ps?preserve-view=true&view=azps-3.3.0). ### [Application Insights](../azure-monitor/overview.md)+ This section describes the supplemental configuration that is required to use Application Insights (part of Azure Monitor) in Azure Government. **Enable Application Insights for [ASP.NET](#web) & [ASP.NET Core](#web) with Visual Studio**
The following Azure Monitor **features behave differently** in Azure Government:
- No. The portals for Azure and Azure Government are separate and do not share information. ### [Azure Advisor](../advisor/advisor-overview.md)+ The following Azure Advisor recommendation **features are not currently available** in Azure Government: - High Availability
The calculation for recommending that you should right-size or shut down underut
If you want to be more aggressive at identifying underutilized virtual machines, you can adjust the CPU utilization rule on a per subscription basis. ## Media+ This section outlines variations and considerations when using Media services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=cdn,media-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). For Azure Media Services v3 availability, see [Azure clouds and regions in which Media Services v3 exists](../media-services/latest/azure-clouds-regions.md). ### [Media Services](../media-services/previous/index.yml)+ For information on how to connect to Media Services v2, see [Access the Azure Media Services API with Azure AD authentication](../media-services/previous/media-services-use-aad-auth-to-access-ams-api.md). The following Media Services **features are not currently available** in Azure Government: - Analyzing - the Azure Media Indexer 2 Preview Azure Media Analytics media processor is not available in Azure Government.
For more information, see [Create a Video Indexer account](../media-services/vid
## Migration+ This section outlines variations and considerations when using Migration services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=database-migration,cost-management,azure-migrate,site-recovery&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Azure Migrate](../migrate/migrate-services-overview.md)+ The following Azure Migrate **features are not currently available** in Azure Government: - Dependency visualization functionality as Azure Migrate depends on Service Map for dependency visualization which is currently unavailable in Azure Government.
The following Azure Migrate **features are not currently available** in Azure Go
## Networking+ This section outlines variations and considerations when using Networking services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-bastion,frontdoor,virtual-wan,dns,ddos-protection,cdn,azure-firewall,network-watcher,load-balancer,vpn-gateway,expressroute,application-gateway,virtual-network&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Azure ExpressRoute](../expressroute/index.yml)+ Azure ExpressRoute is used to create private connections between Azure Government datacenters and a customer's on-premises infrastructure or a colocation facility. ExpressRoute connections do not go over the public Internet; instead, they offer optimized pathways (shortest hops, lowest latency, highest performance, etc.) and Azure Government geo-redundant regions. - By default, all Azure Government ExpressRoute connectivity is configured active-active redundant with support for bursting, and it delivers up to 10 Gbps circuit capacity (smallest is 50 Mbps).
Aside from ExpressRoute, customers can also use an [IPSec protected VPN](../vpn-
All customers who utilize a private connectivity architecture should validate that an appropriate implementation is established and maintained for the customer connection to the Gateway Network/Internet (GN/I) edge router demarcation point for Azure Government. Similarly, your organization must establish network connectivity between your on-premises environment and Gateway Network/Customer (GN/C) edge router demarcation point for Azure Government. ### BGP communities+ This section provides an overview of how BGP communities are used with ExpressRoute in Azure Government. Microsoft advertises routes in the public peering and Microsoft peering paths, with routes tagged with appropriate community values. The rationale for doing so and the details on community values are described below. If you are connecting to Microsoft through ExpressRoute at any one peering location within the Azure Government region, you will have access to all Microsoft cloud services across all regions within the government boundary. For example, if you connected to Microsoft in Washington D.C. through ExpressRoute, you would have access to all Microsoft cloud services hosted in Azure Government. [ExpressRoute overview](../expressroute/expressroute-introduction.md) provides details on locations and partners, as well as a list of peering locations for Azure Government.
In addition to the above, Microsoft also tags prefixes based on the service they
>Microsoft does not honor any BGP community values that you set on the routes advertised to Microsoft. ### [Traffic Manager](../traffic-manager/traffic-manager-overview.md)
-Traffic Manager health checks can originate from certain IP addresses for Azure Government. Review the [IP addresses in the JSON file](https://azuretrafficmanagerdata.blob.core.windows.net/probes/azure-gov/probe-ip-ranges.json) to ensure that incoming connections from these IP addresses are allowed at the endpoints to check its health status.
+
+Traffic Manager health checks can originate from certain IP addresses for Azure Government. Review the [IP addresses in the JSON file](https://azuretrafficmanagerdata.blob.core.windows.net/probes/azure-gov/probe-ip-ranges.json) to ensure that incoming connections from these IP addresses are allowed at the endpoints to check their health status.
## Security+ This section outlines variations and considerations when using Security services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-sentinel,azure-dedicated-hsm,information-protection,application-gateway,vpn-gateway,security-center,key-vault,active-directory-ds,ddos-protection,active-directory&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). ### [Azure Active Directory Premium P1 and P2](../active-directory/index.yml)+ The following features have known limitations in Azure Government: - Limitations with B2B collaboration in supported Azure Government tenants:
- Azure AD SSPR from Windows 10 login screen is not available

### [Azure Information Protection](/azure/information-protection/what-is-information-protection)

Azure Information Protection Premium is part of the [Enterprise Mobility + Security](/enterprise-mobility-security) suite. For details on this service and how to use it, see the [Azure Information Protection Premium Government Service Description](/enterprise-mobility-security/solutions/ems-aip-premium-govt-service-description).

### [Azure Security Center](../security-center/security-center-introduction.md)

The following Azure Security Center **features are not currently available** in Azure Government:

- **1st and 3rd party integrations**
Azure Security Center's integrated cloud workload protection platform (CWPP), Az
Azure Security Center is deployed in Azure Government regions but not in Azure Government for DoD regions. Azure resources created in DoD regions can still utilize Security Center capabilities. However, using them will result in Security Center collected data being moved out of DoD regions and stored in Azure Government regions. By default, all Security Center features that collect and store data are disabled for resources hosted in DoD regions. The type of data collected and stored varies depending on the selected feature. Customers who want to enable Azure Security Center features for DoD resources are advised to consider data separation and protection requirements before doing so.

### [Azure Sentinel](../sentinel/overview.md)

The following **features have known limitations** in Azure Government:

- Office 365 data connector - The Office 365 data connector can be used only for [Office 365 GCC High and Office 365 DoD](/office365/servicedescriptions/office-365-platform-service-description/office-365-us-government/gcc-high-and-dod). Office 365 GCC can be accessed only from global (commercial) Azure.
- The AWS CloudTrail data connector can be used only for [AWS in the Public Sector](https://aws.amazon.com/government-education/).

### [Enterprise Mobility + Security (EMS)](/enterprise-mobility-security)

For information about EMS suite capabilities in Azure Government, see the [Enterprise Mobility + Security for US Government Service Description](/enterprise-mobility-security/solutions/ems-govt-service-description).

## Storage

This section outlines variations and considerations when using Storage services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=hpc-cache,managed-disks,storsimple,backup,storage&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).

### [Azure Storage](../storage/index.yml)

For a Quickstart that will help you get started with Storage in Azure Government, see [Develop with Storage API on Azure Government](./documentation-government-get-started-connect-to-storage.md).
-**Storage pairing in Azure Government**
+**Storage pairing in Azure Government**<br/>
Azure relies on [paired regions](../best-practices-availability-paired-regions.md) to deliver [geo-redundant storage](../storage/common/storage-redundancy.md). The following table shows the primary and secondary region pairings in Azure Government.

|Geography|Regional Pair A|Regional Pair B|
The endpoint suffix to use in these overloads is *core.usgovcloudapi.net*.
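When you assemble a storage connection string by hand rather than through the SDK overloads, that suffix goes in the `EndpointSuffix` field. The helper below is a hypothetical sketch (the function name and placeholder credentials are not from the source article); it only shows where the Government suffix fits in the standard connection-string format.

```python
def gov_connection_string(account_name, account_key):
    """Build an Azure Government storage connection string using the
    core.usgovcloudapi.net endpoint suffix instead of the default
    core.windows.net used in the global Azure cloud."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.usgovcloudapi.net"
    )

# The resulting blob endpoint then resolves under the Government suffix,
# for example: https://<account>.blob.core.usgovcloudapi.net
print(gov_connection_string("mygovaccount", "<account-key>"))
```

Omitting `EndpointSuffix` makes client libraries default to the global `core.windows.net` endpoints, which is the most common cause of connection failures against Azure Government storage accounts.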
When you're deploying the **StorSimple** Manager service, use the [https://portal.azure.us/](https://portal.azure.us/) URL for the Azure Government portal. For deployment instructions for [StorSimple Virtual Array](../storsimple/storsimple-ova-system-requirements.md), see StorSimple Virtual Array system requirements. For the StorSimple 8000 series, see [StorSimple software, high availability, and networking requirements](../storsimple/storsimple-8000-system-requirements.md) and go to the **Deploy** section from the left menu. For more information on StorSimple, see the [StorSimple documentation](../storsimple/index.yml).

### [Azure Import/Export](../import-export/storage-import-export-service.md)

With Import/Export jobs for US Gov Arizona or US Gov Texas, the mailing address is for US Gov Virginia. The data is loaded into selected storage accounts from the US Gov Virginia region. For DoD IL5 data, use a DoD region storage account to ensure that data is loaded directly into the DoD regions. For more information, see [Azure Import/Export IL5 isolation guidance](./documentation-government-impact-level-5.md#azure-importexport-service).
For all jobs, we recommend that you rotate your storage account keys after the j
## Web

This section outlines variations and considerations when using Web services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=spring-cloud,signalr-service,api-management,notification-hubs,search,cdn,app-service-linux,app-service&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).

### [API Management](../api-management/index.yml)

The following API Management **features are not currently available** in Azure Government:

- Azure AD B2C integration

### [App Service](../app-service/overview.md)

The following App Service **features are not currently available** in Azure Government:

- Resource
The following App Service **features are not currently available** in Azure Gove
## Next steps

Learn more about Azure Government:

- [Acquiring and accessing Azure Government](https://azure.microsoft.com/offers/azure-government/)
azure-government Documentation Government Overview Wwps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/documentation-government-overview-wwps.md
This article addresses common data residency, security, and isolation concerns p
Established privacy regulations are silent on **data residency and data location**, and permit data transfers in accordance with approved mechanisms such as the EU Standard Contractual Clauses (also known as EU Model Clauses). Microsoft commits contractually in the Online Services Terms [Data Protection Addendum](https://aka.ms/DPA) (DPA) that all potential transfers of customer data out of the EU, European Economic Area (EEA), and Switzerland shall be governed by the EU Model Clauses. Microsoft will abide by the requirements of the EEA and Swiss data protection laws regarding the collection, use, transfer, retention, and other processing of personal data from the EEA and Switzerland. All transfers of personal data are subject to appropriate safeguards and documentation requirements. However, many customers considering cloud adoption are seeking assurances about customer and personal data being kept within the geographic boundaries corresponding to customer operations or location of customer's end users.
-**Data sovereignty** implies data residency; however, it also introduces rules and requirements that define who has control over and access to customer data stored in the cloud. In many cases, data sovereignty mandates that customer data be subject to the laws and legal jurisdiction of the country in which data resides. These laws can have direct implications on data access even for platform maintenance or customer-initiated support requests. Customers can use Azure public multi-tenant cloud in combination with Azure Stack products for on-premises and edge solutions to meet their data sovereignty requirements, as described later in this article. These other products can be deployed to put customers solely in control of their data, including storage, processing, transmission, and remote access.
+**Data sovereignty** implies data residency; however, it also introduces rules and requirements that define who has control over and access to customer data stored in the cloud. In many cases, data sovereignty mandates that customer data be subject to the laws and legal jurisdiction of the country or region in which data resides. These laws can have direct implications on data access even for platform maintenance or customer-initiated support requests. Customers can use Azure public multi-tenant cloud in combination with Azure Stack products for on-premises and edge solutions to meet their data sovereignty requirements, as described later in this article. These other products can be deployed to put customers solely in control of their data, including storage, processing, transmission, and remote access.
Among several [data categories and definitions](https://www.microsoft.com/trust-center/privacy/customer-data-definitions) that Microsoft established for cloud services, the following four categories are discussed in this article:
Customer data in an Azure Storage account is [always replicated](../storage/comm
Data in an Azure Storage account is always replicated three times in the primary region. Azure Storage provides LRS and ZRS redundancy options for replicating data in the primary region. For applications requiring high availability, customers can choose geo-replication to a secondary region that is hundreds of kilometers away from the primary region. Azure Storage offers GRS and GZRS options for copying data to a secondary region. More options are available to customers for configuring read access (RA) to the secondary region (RA-GRS and RA-GZRS), as explained in [Read access to data in the secondary region](../storage/common/storage-redundancy.md#read-access-to-data-in-the-secondary-region).
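The redundancy options above differ mainly in how many copies exist, whether any copy leaves the primary region, and whether the secondary is readable. The lookup table below is an informal summary of those properties as described in this section (see the linked redundancy article for the authoritative definitions):

```python
# Informal summary of Azure Storage redundancy options: total copies,
# whether data is replicated outside the primary region, and whether
# the secondary region offers read access.
REDUNDANCY = {
    "LRS":     {"copies": 3, "cross_region": False, "read_access_secondary": False},
    "ZRS":     {"copies": 3, "cross_region": False, "read_access_secondary": False},
    "GRS":     {"copies": 6, "cross_region": True,  "read_access_secondary": False},
    "GZRS":    {"copies": 6, "cross_region": True,  "read_access_secondary": False},
    "RA-GRS":  {"copies": 6, "cross_region": True,  "read_access_secondary": True},
    "RA-GZRS": {"copies": 6, "cross_region": True,  "read_access_secondary": True},
}

def residency_safe_options():
    """Options that keep every copy of the data inside the primary region."""
    return sorted(k for k, v in REDUNDANCY.items() if not v["cross_region"])

print(residency_safe_options())  # ['LRS', 'ZRS']
```

This distinction matters for the data-residency discussion that follows: only the options without cross-region replication guarantee that data at rest never leaves the primary region's geography.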
-Azure Storage redundancy options can have implications on data residency as Azure relies on [paired regions](../best-practices-availability-paired-regions.md) to deliver [geo-redundant storage](../storage/common/storage-redundancy.md#geo-redundant-storage) (GRS). For example, customers concerned about geo-replication across regions that span country boundaries, may want to choose LRS or ZRS to keep Azure Storage data at rest within the geographic boundaries of the country in which the primary region is located. Similarly, [geo replication for Azure SQL Database](../azure-sql/database/active-geo-replication-overview.md) can be obtained by configuring asynchronous replication of transactions to any region in the world, although it is recommended that paired regions be used for this purpose as well. If customers need to keep relational data inside the geographic boundaries of their country, they should not configure Azure SQL Database asynchronous replication to a region outside that country.
+Azure Storage redundancy options can have implications on data residency as Azure relies on [paired regions](../best-practices-availability-paired-regions.md) to deliver [geo-redundant storage](../storage/common/storage-redundancy.md#geo-redundant-storage) (GRS). For example, customers concerned about geo-replication across regions that span country boundaries, may want to choose LRS or ZRS to keep Azure Storage data at rest within the geographic boundaries of the country in which the primary region is located. Similarly, [geo replication for Azure SQL Database](../azure-sql/database/active-geo-replication-overview.md) can be obtained by configuring asynchronous replication of transactions to any region in the world, although it is recommended that paired regions be used for this purpose as well. If customers need to keep relational data inside the geographic boundaries of their country/region, they should not configure Azure SQL Database asynchronous replication to a region outside that country.
As described on the [data location page](https://azure.microsoft.com/global-infrastructure/data-residency/), most Azure **regional** services honor the data at rest commitment to ensure that customer data remains within the geographic boundary where the corresponding service is deployed. A handful of exceptions to this rule are noted on the data location page. Customers should review these exceptions to determine if the type of data stored outside their chosen deployment Geo meets their needs.
Our [Law Enforcement Request Report](https://www.microsoft.com/about/corporate-r
The [CLOUD Act](https://www.congress.gov/bill/115th-congress/house-bill/4943) is a United States law that was enacted in March 2018. For more information, see Microsoft's [blog post](https://blogs.microsoft.com/on-the-issues/2018/04/03/the-cloud-act-is-an-important-step-forward-but-now-more-steps-need-to-follow/) and the [follow-up blog post](https://blogs.microsoft.com/on-the-issues/2018/09/11/a-call-for-principle-based-international-agreements-to-govern-law-enforcement-access-to-data/) that describes Microsoft's call for principle-based international agreements governing law enforcement access to data. Key points of interest to government customers procuring Azure services are captured below.

- The CLOUD Act enables governments to negotiate new government-to-government agreements that will result in greater transparency and certainty for how information is disclosed to law enforcement agencies across international borders.
-- The CLOUD Act is not a mechanism for greater government surveillance; it is a mechanism toward ensuring that customer data is ultimately protected by the laws of each customer's home country while continuing to facilitate lawful access to evidence for legitimate criminal investigations. Law enforcement in the US still needs to obtain a warrant demonstrating probable cause of a crime from an independent court before seeking the contents of communications. The CLOUD Act requires similar protections for other countries seeking bilateral agreements.
+- The CLOUD Act is not a mechanism for greater government surveillance; it is a mechanism toward ensuring that customer data is ultimately protected by the laws of each customerΓÇÖs home country/region while continuing to facilitate lawful access to evidence for legitimate criminal investigations. Law enforcement in the US still needs to obtain a warrant demonstrating probable cause of a crime from an independent court before seeking the contents of communications. The CLOUD Act requires similar protections for other countries seeking bilateral agreements.
- While the CLOUD Act creates new rights under new international agreements, it also preserves the common law right of cloud service providers to go to court to challenge search warrants when there is a conflict of laws – even without these new treaties in place.
-- Microsoft retains the legal right to object to a law enforcement order in the United States where the order clearly conflicts with the laws of the country where customer data is hosted. Microsoft will continue to carefully evaluate every law enforcement request and exercise its rights to protect customers where appropriate.
+- Microsoft retains the legal right to object to a law enforcement order in the United States where the order clearly conflicts with the laws of the country/region where customer data is hosted. Microsoft will continue to carefully evaluate every law enforcement request and exercise its rights to protect customers where appropriate.
- For legitimate enterprise customers, US law enforcement will, in most instances, now go directly to the customer rather than Microsoft for information requests. **Microsoft does not disclose extra data as a result of the CLOUD Act**. This law does not practically change any of the legal and privacy protections that previously applied to law enforcement requests for data – and those protections continue to apply. Microsoft adheres to the same principles and customer commitments related to government demands for user data.
azure-monitor Itsmc Definition https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/alerts/itsmc-definition.md
-# Connect Azure to ITSM tools by using IT Service Management Connector
+# Connect Azure to ITSM tools by using IT Service Management Solution
:::image type="icon" source="media/itsmc-overview/itsmc-symbol.png":::
azure-monitor Custom Endpoints https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/custom-endpoints.md
The current Snippet (listed below) is version "5", the version is encoded in the
```html <script type="text/javascript">
-!function(T,l,y){var S=T.location,k="script",D="instrumentationKey",C="ingestionendpoint",I="disableExceptionTracking",E="ai.device.",b="toLowerCase",w="crossOrigin",N="POST",e="appInsightsSDK",t=y.name||"appInsights";(y.name||T[e])&&(T[e]=t);var n=T[t]||function(d){var g=!1,f=!1,m={initialize:!0,queue:[],sv:"5",version:2,config:d};function v(e,t){var n={},a="Browser";return n[E+"id"]=a[b](),n[E+"type"]=a,n["ai.operation.name"]=S&&S.pathname||"_unknown_",n["ai.internal.sdkVersion"]="javascript:snippet_"+(m.sv||m.version),{time:function(){var e=new Date;function t(e){var t=""+e;return 1===t.length&&(t="0"+t),t}return e.getUTCFullYear()+"-"+t(1+e.getUTCMonth())+"-"+t(e.getUTCDate())+"T"+t(e.getUTCHours())+":"+t(e.getUTCMinutes())+":"+t(e.getUTCSeconds())+"."+((e.getUTCMilliseconds()/1e3).toFixed(3)+"").slice(2,5)+"Z"}(),iKey:e,name:"Microsoft.ApplicationInsights."+e.replace(/-/g,"")+"."+t,sampleRate:100,tags:n,data:{baseData:{ver:2}}}}var h=d.url||y.src;if(h){function a(e){var t,n,a,i,r,o,s,c,u,p,l;g=!0,m.queue=[],f||(f=!0,t=h,s=function(){var e={},t=d.connectionString;if(t)for(var n=t.split(";"),a=0;a<n.length;a++){var i=n[a].split("=");2===i.length&&(e[i[0][b]()]=i[1])}if(!e[C]){var r=e.endpointsuffix,o=r?e.location:null;e[C]="https://"+(o?o+".":"")+"dc."+(r||"services.visualstudio.com")}return e}(),c=s[D]||d[D]||"",u=s[C],p=u?u+"/v2/track":d.endpointUrl,(l=[]).push((n="SDK LOAD Failure: Failed to load Application Insights SDK script (See stack for details)",a=t,i=p,(o=(r=v(c,"Exception")).data).baseType="ExceptionData",o.baseData.exceptions=[{typeName:"SDKLoadFailed",message:n.replace(/\./g,"-"),hasFullStack:!1,stack:n+"\nSnippet failed to load ["+a+"] -- Telemetry is disabled\nHelp Link: https://go.microsoft.com/fwlink/?linkid=2128109\nHost: "+(S&&S.pathname||"_unknown_")+"\nEndpoint: "+i,parsedStack:[]}],r)),l.push(function(e,t,n,a){var i=v(c,"Message"),r=i.data;r.baseType="MessageData";var o=r.baseData;return o.message='AI (Internal): 99 message:"'+("SDK LOAD 
Failure: Failed to load Application Insights SDK script (See stack for details) ("+n+")").replace(/\"/g,"")+'"',o.properties={endpoint:a},i}(0,0,t,p)),function(e,t){if(JSON){var n=T.fetch;if(n&&!y.useXhr)n(t,{method:N,body:JSON.stringify(e),mode:"cors"});else if(XMLHttpRequest){var a=new XMLHttpRequest;a.open(N,t),a.setRequestHeader("Content-type","application/json"),a.send(JSON.stringify(e))}}}(l,p))}function i(e,t){f||setTimeout(function(){!t&&m.core||a()},500)}var e=function(){var n=l.createElement(k);n.src=h;var e=y[w];return!e&&""!==e||"undefined"==n[w]||(n[w]=e),n.onload=i,n.onerror=a,n.onreadystatechange=function(e,t){"loaded"!==n.readyState&&"complete"!==n.readyState||i(0,t)},n}();y.ld<0?l.getElementsByTagName("head")[0].appendChild(e):setTimeout(function(){l.getElementsByTagName(k)[0].parentNode.appendChild(e)},y.ld||0)}try{m.cookie=l.cookie}catch(p){}function t(e){for(;e.length;)!function(t){m[t]=function(){var e=arguments;g||m.queue.push(function(){m[t].apply(m,e)})}}(e.pop())}var n="track",r="TrackPage",o="TrackEvent";t([n+"Event",n+"PageView",n+"Exception",n+"Trace",n+"DependencyData",n+"Metric",n+"PageViewPerformance","start"+r,"stop"+r,"start"+o,"stop"+o,"addTelemetryInitializer","setAuthenticatedUserContext","clearAuthenticatedUserContext","flush"]),m.SeverityLevel={Verbose:0,Information:1,Warning:2,Error:3,Critical:4};var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[I]&&!0!==s[I]){var c="onerror";t(["_"+c]);var u=T[c];T[c]=function(e,t,n,a,i){var r=u&&u(e,t,n,a,i);return!0!==r&&m["_"+c]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);function a(){y.onInit&&y.onInit(n)}(T[t]=n).queue&&0===n.queue.length?(n.queue.push(a),n.trackPageView({})):a()}(window,document,{nConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[C]&&!0!==s[C]){method="onerror",t(["_"+method]);var c=T[method];T[method]=function(e,t,n,a,i){var 
r=c&&c(e,t,n,a,i);return!0!==r&&m["_"+method]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);(T[t]=n).queue&&0===n.queue.length&&n.trackPageView({})}(window,document,{
+!function(T,l,y){var S=T.location,k="script",D="instrumentationKey",C="ingestionendpoint",I="disableExceptionTracking",E="ai.device.",b="toLowerCase",w="crossOrigin",N="POST",e="appInsightsSDK",t=y.name||"appInsights";(y.name||T[e])&&(T[e]=t);var n=T[t]||function(d){var g=!1,f=!1,m={initialize:!0,queue:[],sv:"5",version:2,config:d};function v(e,t){var n={},a="Browser";return n[E+"id"]=a[b](),n[E+"type"]=a,n["ai.operation.name"]=S&&S.pathname||"_unknown_",n["ai.internal.sdkVersion"]="javascript:snippet_"+(m.sv||m.version),{time:function(){var e=new Date;function t(e){var t=""+e;return 1===t.length&&(t="0"+t),t}return e.getUTCFullYear()+"-"+t(1+e.getUTCMonth())+"-"+t(e.getUTCDate())+"T"+t(e.getUTCHours())+":"+t(e.getUTCMinutes())+":"+t(e.getUTCSeconds())+"."+((e.getUTCMilliseconds()/1e3).toFixed(3)+"").slice(2,5)+"Z"}(),iKey:e,name:"Microsoft.ApplicationInsights."+e.replace(/-/g,"")+"."+t,sampleRate:100,tags:n,data:{baseData:{ver:2}}}}var h=d.url||y.src;if(h){function a(e){var t,n,a,i,r,o,s,c,u,p,l;g=!0,m.queue=[],f||(f=!0,t=h,s=function(){var e={},t=d.connectionString;if(t)for(var n=t.split(";"),a=0;a<n.length;a++){var i=n[a].split("=");2===i.length&&(e[i[0][b]()]=i[1])}if(!e[C]){var r=e.endpointsuffix,o=r?e.location:null;e[C]="https://"+(o?o+".":"")+"dc."+(r||"services.visualstudio.com")}return e}(),c=s[D]||d[D]||"",u=s[C],p=u?u+"/v2/track":d.endpointUrl,(l=[]).push((n="SDK LOAD Failure: Failed to load Application Insights SDK script (See stack for details)",a=t,i=p,(o=(r=v(c,"Exception")).data).baseType="ExceptionData",o.baseData.exceptions=[{typeName:"SDKLoadFailed",message:n.replace(/\./g,"-"),hasFullStack:!1,stack:n+"\nSnippet failed to load ["+a+"] -- Telemetry is disabled\nHelp Link: https://go.microsoft.com/fwlink/?linkid=2128109\nHost: "+(S&&S.pathname||"_unknown_")+"\nEndpoint: "+i,parsedStack:[]}],r)),l.push(function(e,t,n,a){var i=v(c,"Message"),r=i.data;r.baseType="MessageData";var o=r.baseData;return o.message='AI (Internal): 99 message:"'+("SDK LOAD 
Failure: Failed to load Application Insights SDK script (See stack for details) ("+n+")").replace(/\"/g,"")+'"',o.properties={endpoint:a},i}(0,0,t,p)),function(e,t){if(JSON){var n=T.fetch;if(n&&!y.useXhr)n(t,{method:N,body:JSON.stringify(e),mode:"cors"});else if(XMLHttpRequest){var a=new XMLHttpRequest;a.open(N,t),a.setRequestHeader("Content-type","application/json"),a.send(JSON.stringify(e))}}}(l,p))}function i(e,t){f||setTimeout(function(){!t&&m.core||a()},500)}var e=function(){var n=l.createElement(k);n.src=h;var e=y[w];return!e&&""!==e||"undefined"==n[w]||(n[w]=e),n.onload=i,n.onerror=a,n.onreadystatechange=function(e,t){"loaded"!==n.readyState&&"complete"!==n.readyState||i(0,t)},n}();y.ld<0?l.getElementsByTagName("head")[0].appendChild(e):setTimeout(function(){l.getElementsByTagName(k)[0].parentNode.appendChild(e)},y.ld||0)}try{m.cookie=l.cookie}catch(p){}function t(e){for(;e.length;)!function(t){m[t]=function(){var e=arguments;g||m.queue.push(function(){m[t].apply(m,e)})}}(e.pop())}var n="track",r="TrackPage",o="TrackEvent";t([n+"Event",n+"PageView",n+"Exception",n+"Trace",n+"DependencyData",n+"Metric",n+"PageViewPerformance","start"+r,"stop"+r,"start"+o,"stop"+o,"addTelemetryInitializer","setAuthenticatedUserContext","clearAuthenticatedUserContext","flush"]),m.SeverityLevel={Verbose:0,Information:1,Warning:2,Error:3,Critical:4};var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[I]&&!0!==s[I]){var c="onerror";t(["_"+c]);var u=T[c];T[c]=function(e,t,n,a,i){var r=u&&u(e,t,n,a,i);return!0!==r&&m["_"+c]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);function a(){y.onInit&&y.onInit(n)}(T[t]=n).queue&&0===n.queue.length?(n.queue.push(a),n.trackPageView({})):a()}(window,document,{
src: "https://js.monitor.azure.com/scripts/b/ai.2.min.js", // The SDK URL Source // name: "appInsights", // Global SDK Instance name defaults to "appInsights" when not supplied // ld: 0, // Defines the load delay (in ms) before attempting to load the sdk. -1 = block page load and add to head. (default) = 0ms load after timeout,
azure-monitor Javascript https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/javascript.md
The current Snippet (listed below) is version "5", the version is encoded in the
```html <script type="text/javascript">
-!function(T,l,y){var S=T.location,k="script",D="instrumentationKey",C="ingestionendpoint",I="disableExceptionTracking",E="ai.device.",b="toLowerCase",w="crossOrigin",N="POST",e="appInsightsSDK",t=y.name||"appInsights";(y.name||T[e])&&(T[e]=t);var n=T[t]||function(d){var g=!1,f=!1,m={initialize:!0,queue:[],sv:"5",version:2,config:d};function v(e,t){var n={},a="Browser";return n[E+"id"]=a[b](),n[E+"type"]=a,n["ai.operation.name"]=S&&S.pathname||"_unknown_",n["ai.internal.sdkVersion"]="javascript:snippet_"+(m.sv||m.version),{time:function(){var e=new Date;function t(e){var t=""+e;return 1===t.length&&(t="0"+t),t}return e.getUTCFullYear()+"-"+t(1+e.getUTCMonth())+"-"+t(e.getUTCDate())+"T"+t(e.getUTCHours())+":"+t(e.getUTCMinutes())+":"+t(e.getUTCSeconds())+"."+((e.getUTCMilliseconds()/1e3).toFixed(3)+"").slice(2,5)+"Z"}(),iKey:e,name:"Microsoft.ApplicationInsights."+e.replace(/-/g,"")+"."+t,sampleRate:100,tags:n,data:{baseData:{ver:2}}}}var h=d.url||y.src;if(h){function a(e){var t,n,a,i,r,o,s,c,u,p,l;g=!0,m.queue=[],f||(f=!0,t=h,s=function(){var e={},t=d.connectionString;if(t)for(var n=t.split(";"),a=0;a<n.length;a++){var i=n[a].split("=");2===i.length&&(e[i[0][b]()]=i[1])}if(!e[C]){var r=e.endpointsuffix,o=r?e.location:null;e[C]="https://"+(o?o+".":"")+"dc."+(r||"services.visualstudio.com")}return e}(),c=s[D]||d[D]||"",u=s[C],p=u?u+"/v2/track":d.endpointUrl,(l=[]).push((n="SDK LOAD Failure: Failed to load Application Insights SDK script (See stack for details)",a=t,i=p,(o=(r=v(c,"Exception")).data).baseType="ExceptionData",o.baseData.exceptions=[{typeName:"SDKLoadFailed",message:n.replace(/\./g,"-"),hasFullStack:!1,stack:n+"\nSnippet failed to load ["+a+"] -- Telemetry is disabled\nHelp Link: https://go.microsoft.com/fwlink/?linkid=2128109\nHost: "+(S&&S.pathname||"_unknown_")+"\nEndpoint: "+i,parsedStack:[]}],r)),l.push(function(e,t,n,a){var i=v(c,"Message"),r=i.data;r.baseType="MessageData";var o=r.baseData;return o.message='AI (Internal): 99 message:"'+("SDK LOAD 
Failure: Failed to load Application Insights SDK script (See stack for details) ("+n+")").replace(/\"/g,"")+'"',o.properties={endpoint:a},i}(0,0,t,p)),function(e,t){if(JSON){var n=T.fetch;if(n&&!y.useXhr)n(t,{method:N,body:JSON.stringify(e),mode:"cors"});else if(XMLHttpRequest){var a=new XMLHttpRequest;a.open(N,t),a.setRequestHeader("Content-type","application/json"),a.send(JSON.stringify(e))}}}(l,p))}function i(e,t){f||setTimeout(function(){!t&&m.core||a()},500)}var e=function(){var n=l.createElement(k);n.src=h;var e=y[w];return!e&&""!==e||"undefined"==n[w]||(n[w]=e),n.onload=i,n.onerror=a,n.onreadystatechange=function(e,t){"loaded"!==n.readyState&&"complete"!==n.readyState||i(0,t)},n}();y.ld<0?l.getElementsByTagName("head")[0].appendChild(e):setTimeout(function(){l.getElementsByTagName(k)[0].parentNode.appendChild(e)},y.ld||0)}try{m.cookie=l.cookie}catch(p){}function t(e){for(;e.length;)!function(t){m[t]=function(){var e=arguments;g||m.queue.push(function(){m[t].apply(m,e)})}}(e.pop())}var n="track",r="TrackPage",o="TrackEvent";t([n+"Event",n+"PageView",n+"Exception",n+"Trace",n+"DependencyData",n+"Metric",n+"PageViewPerformance","start"+r,"stop"+r,"start"+o,"stop"+o,"addTelemetryInitializer","setAuthenticatedUserContext","clearAuthenticatedUserContext","flush"]),m.SeverityLevel={Verbose:0,Information:1,Warning:2,Error:3,Critical:4};var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[I]&&!0!==s[I]){var c="onerror";t(["_"+c]);var u=T[c];T[c]=function(e,t,n,a,i){var r=u&&u(e,t,n,a,i);return!0!==r&&m["_"+c]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);function a(){y.onInit&&y.onInit(n)}(T[t]=n).queue&&0===n.queue.length?(n.queue.push(a),n.trackPageView({})):a()}(window,document,{nConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[C]&&!0!==s[C]){method="onerror",t(["_"+method]);var c=T[method];T[method]=function(e,t,n,a,i){var 
r=c&&c(e,t,n,a,i);return!0!==r&&m["_"+method]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);(T[t]=n).queue&&0===n.queue.length&&n.trackPageView({})}(window,document,{
+!function(T,l,y){var S=T.location,k="script",D="instrumentationKey",C="ingestionendpoint",I="disableExceptionTracking",E="ai.device.",b="toLowerCase",w="crossOrigin",N="POST",e="appInsightsSDK",t=y.name||"appInsights";(y.name||T[e])&&(T[e]=t);var n=T[t]||function(d){var g=!1,f=!1,m={initialize:!0,queue:[],sv:"5",version:2,config:d};function v(e,t){var n={},a="Browser";return n[E+"id"]=a[b](),n[E+"type"]=a,n["ai.operation.name"]=S&&S.pathname||"_unknown_",n["ai.internal.sdkVersion"]="javascript:snippet_"+(m.sv||m.version),{time:function(){var e=new Date;function t(e){var t=""+e;return 1===t.length&&(t="0"+t),t}return e.getUTCFullYear()+"-"+t(1+e.getUTCMonth())+"-"+t(e.getUTCDate())+"T"+t(e.getUTCHours())+":"+t(e.getUTCMinutes())+":"+t(e.getUTCSeconds())+"."+((e.getUTCMilliseconds()/1e3).toFixed(3)+"").slice(2,5)+"Z"}(),iKey:e,name:"Microsoft.ApplicationInsights."+e.replace(/-/g,"")+"."+t,sampleRate:100,tags:n,data:{baseData:{ver:2}}}}var h=d.url||y.src;if(h){function a(e){var t,n,a,i,r,o,s,c,u,p,l;g=!0,m.queue=[],f||(f=!0,t=h,s=function(){var e={},t=d.connectionString;if(t)for(var n=t.split(";"),a=0;a<n.length;a++){var i=n[a].split("=");2===i.length&&(e[i[0][b]()]=i[1])}if(!e[C]){var r=e.endpointsuffix,o=r?e.location:null;e[C]="https://"+(o?o+".":"")+"dc."+(r||"services.visualstudio.com")}return e}(),c=s[D]||d[D]||"",u=s[C],p=u?u+"/v2/track":d.endpointUrl,(l=[]).push((n="SDK LOAD Failure: Failed to load Application Insights SDK script (See stack for details)",a=t,i=p,(o=(r=v(c,"Exception")).data).baseType="ExceptionData",o.baseData.exceptions=[{typeName:"SDKLoadFailed",message:n.replace(/\./g,"-"),hasFullStack:!1,stack:n+"\nSnippet failed to load ["+a+"] -- Telemetry is disabled\nHelp Link: https://go.microsoft.com/fwlink/?linkid=2128109\nHost: "+(S&&S.pathname||"_unknown_")+"\nEndpoint: "+i,parsedStack:[]}],r)),l.push(function(e,t,n,a){var i=v(c,"Message"),r=i.data;r.baseType="MessageData";var o=r.baseData;return o.message='AI (Internal): 99 message:"'+("SDK LOAD 
Failure: Failed to load Application Insights SDK script (See stack for details) ("+n+")").replace(/\"/g,"")+'"',o.properties={endpoint:a},i}(0,0,t,p)),function(e,t){if(JSON){var n=T.fetch;if(n&&!y.useXhr)n(t,{method:N,body:JSON.stringify(e),mode:"cors"});else if(XMLHttpRequest){var a=new XMLHttpRequest;a.open(N,t),a.setRequestHeader("Content-type","application/json"),a.send(JSON.stringify(e))}}}(l,p))}function i(e,t){f||setTimeout(function(){!t&&m.core||a()},500)}var e=function(){var n=l.createElement(k);n.src=h;var e=y[w];return!e&&""!==e||"undefined"==n[w]||(n[w]=e),n.onload=i,n.onerror=a,n.onreadystatechange=function(e,t){"loaded"!==n.readyState&&"complete"!==n.readyState||i(0,t)},n}();y.ld<0?l.getElementsByTagName("head")[0].appendChild(e):setTimeout(function(){l.getElementsByTagName(k)[0].parentNode.appendChild(e)},y.ld||0)}try{m.cookie=l.cookie}catch(p){}function t(e){for(;e.length;)!function(t){m[t]=function(){var e=arguments;g||m.queue.push(function(){m[t].apply(m,e)})}}(e.pop())}var n="track",r="TrackPage",o="TrackEvent";t([n+"Event",n+"PageView",n+"Exception",n+"Trace",n+"DependencyData",n+"Metric",n+"PageViewPerformance","start"+r,"stop"+r,"start"+o,"stop"+o,"addTelemetryInitializer","setAuthenticatedUserContext","clearAuthenticatedUserContext","flush"]),m.SeverityLevel={Verbose:0,Information:1,Warning:2,Error:3,Critical:4};var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[I]&&!0!==s[I]){var c="onerror";t(["_"+c]);var u=T[c];T[c]=function(e,t,n,a,i){var r=u&&u(e,t,n,a,i);return!0!==r&&m["_"+c]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);function a(){y.onInit&&y.onInit(n)}(T[t]=n).queue&&0===n.queue.length?(n.queue.push(a),n.trackPageView({})):a()}(window,document,{
src: "https://js.monitor.azure.com/scripts/b/ai.2.min.js", // The SDK URL Source // name: "appInsights", // Global SDK Instance name defaults to "appInsights" when not supplied // ld: 0, // Defines the load delay (in ms) before attempting to load the sdk. -1 = block page load and add to head. (default) = 0ms load after timeout,
azure-monitor Sdk Connection String https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/sdk-connection-string.md
The current Snippet (listed below) is version "5"; the version is encoded in the snippet's sv field (sv:"5")
```html <script type="text/javascript">
-!function(T,l,y){ ... var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[C]&&!0!==s[C]){method="onerror",t(["_"+method]);var c=T[method];T[method]=function(e,t,n,a,i){var 
r=c&&c(e,t,n,a,i);return!0!==r&&m["_"+method]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);(T[t]=n).queue&&0===n.queue.length&&n.trackPageView({})}(window,document,{
+!function(T,l,y){var S=T.location,k="script",D="instrumentationKey",C="ingestionendpoint",I="disableExceptionTracking",E="ai.device.",b="toLowerCase",w="crossOrigin",N="POST",e="appInsightsSDK",t=y.name||"appInsights";(y.name||T[e])&&(T[e]=t);var n=T[t]||function(d){var g=!1,f=!1,m={initialize:!0,queue:[],sv:"5",version:2,config:d};function v(e,t){var n={},a="Browser";return n[E+"id"]=a[b](),n[E+"type"]=a,n["ai.operation.name"]=S&&S.pathname||"_unknown_",n["ai.internal.sdkVersion"]="javascript:snippet_"+(m.sv||m.version),{time:function(){var e=new Date;function t(e){var t=""+e;return 1===t.length&&(t="0"+t),t}return e.getUTCFullYear()+"-"+t(1+e.getUTCMonth())+"-"+t(e.getUTCDate())+"T"+t(e.getUTCHours())+":"+t(e.getUTCMinutes())+":"+t(e.getUTCSeconds())+"."+((e.getUTCMilliseconds()/1e3).toFixed(3)+"").slice(2,5)+"Z"}(),iKey:e,name:"Microsoft.ApplicationInsights."+e.replace(/-/g,"")+"."+t,sampleRate:100,tags:n,data:{baseData:{ver:2}}}}var h=d.url||y.src;if(h){function a(e){var t,n,a,i,r,o,s,c,u,p,l;g=!0,m.queue=[],f||(f=!0,t=h,s=function(){var e={},t=d.connectionString;if(t)for(var n=t.split(";"),a=0;a<n.length;a++){var i=n[a].split("=");2===i.length&&(e[i[0][b]()]=i[1])}if(!e[C]){var r=e.endpointsuffix,o=r?e.location:null;e[C]="https://"+(o?o+".":"")+"dc."+(r||"services.visualstudio.com")}return e}(),c=s[D]||d[D]||"",u=s[C],p=u?u+"/v2/track":d.endpointUrl,(l=[]).push((n="SDK LOAD Failure: Failed to load Application Insights SDK script (See stack for details)",a=t,i=p,(o=(r=v(c,"Exception")).data).baseType="ExceptionData",o.baseData.exceptions=[{typeName:"SDKLoadFailed",message:n.replace(/\./g,"-"),hasFullStack:!1,stack:n+"\nSnippet failed to load ["+a+"] -- Telemetry is disabled\nHelp Link: https://go.microsoft.com/fwlink/?linkid=2128109\nHost: "+(S&&S.pathname||"_unknown_")+"\nEndpoint: "+i,parsedStack:[]}],r)),l.push(function(e,t,n,a){var i=v(c,"Message"),r=i.data;r.baseType="MessageData";var o=r.baseData;return o.message='AI (Internal): 99 message:"'+("SDK LOAD 
Failure: Failed to load Application Insights SDK script (See stack for details) ("+n+")").replace(/\"/g,"")+'"',o.properties={endpoint:a},i}(0,0,t,p)),function(e,t){if(JSON){var n=T.fetch;if(n&&!y.useXhr)n(t,{method:N,body:JSON.stringify(e),mode:"cors"});else if(XMLHttpRequest){var a=new XMLHttpRequest;a.open(N,t),a.setRequestHeader("Content-type","application/json"),a.send(JSON.stringify(e))}}}(l,p))}function i(e,t){f||setTimeout(function(){!t&&m.core||a()},500)}var e=function(){var n=l.createElement(k);n.src=h;var e=y[w];return!e&&""!==e||"undefined"==n[w]||(n[w]=e),n.onload=i,n.onerror=a,n.onreadystatechange=function(e,t){"loaded"!==n.readyState&&"complete"!==n.readyState||i(0,t)},n}();y.ld<0?l.getElementsByTagName("head")[0].appendChild(e):setTimeout(function(){l.getElementsByTagName(k)[0].parentNode.appendChild(e)},y.ld||0)}try{m.cookie=l.cookie}catch(p){}function t(e){for(;e.length;)!function(t){m[t]=function(){var e=arguments;g||m.queue.push(function(){m[t].apply(m,e)})}}(e.pop())}var n="track",r="TrackPage",o="TrackEvent";t([n+"Event",n+"PageView",n+"Exception",n+"Trace",n+"DependencyData",n+"Metric",n+"PageViewPerformance","start"+r,"stop"+r,"start"+o,"stop"+o,"addTelemetryInitializer","setAuthenticatedUserContext","clearAuthenticatedUserContext","flush"]),m.SeverityLevel={Verbose:0,Information:1,Warning:2,Error:3,Critical:4};var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[I]&&!0!==s[I]){var c="onerror";t(["_"+c]);var u=T[c];T[c]=function(e,t,n,a,i){var r=u&&u(e,t,n,a,i);return!0!==r&&m["_"+c]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);function a(){y.onInit&&y.onInit(n)}(T[t]=n).queue&&0===n.queue.length?(n.queue.push(a),n.trackPageView({})):a()}(window,document,{
src: "https://js.monitor.azure.com/scripts/b/ai.2.min.js", // The SDK URL Source // name: "appInsights", // Global SDK Instance name defaults to "appInsights" when not supplied // ld: 0, // Defines the load delay (in ms) before attempting to load the sdk. -1 = block page load and add to head. (default) = 0ms load after timeout,
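<!-- Illustrative sketch, not part of this diff: the loader above is configured by the
     object literal it ends with. The snippet code reads y.cfg and d.connectionString,
     so a connection-string setup would close the configuration object roughly like
     this; the connection-string value below is a placeholder, not a real key: -->
<!--
    cfg: { // Application Insights SDK configuration passed to the loaded SDK
      connectionString: "InstrumentationKey=00000000-0000-0000-0000-000000000000;IngestionEndpoint=https://dc.services.visualstudio.com/"
    }});
-->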
azure-monitor Sharepoint https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/sharepoint.md
and before any other scripts. Your first data will appear
automatically in just a few seconds. --> <script type="text/javascript">
-!function(T,l,y){ ... var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[C]&&!0!==s[C]){method="onerror",t(["_"+method]);var c=T[method];T[method]=function(e,t,n,a,i){var 
r=c&&c(e,t,n,a,i);return!0!==r&&m["_"+method]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);(T[t]=n).queue&&0===n.queue.length&&n.trackPageView({})}(window,document,{
+!function(T,l,y){var S=T.location,k="script",D="instrumentationKey",C="ingestionendpoint",I="disableExceptionTracking",E="ai.device.",b="toLowerCase",w="crossOrigin",N="POST",e="appInsightsSDK",t=y.name||"appInsights";(y.name||T[e])&&(T[e]=t);var n=T[t]||function(d){var g=!1,f=!1,m={initialize:!0,queue:[],sv:"5",version:2,config:d};function v(e,t){var n={},a="Browser";return n[E+"id"]=a[b](),n[E+"type"]=a,n["ai.operation.name"]=S&&S.pathname||"_unknown_",n["ai.internal.sdkVersion"]="javascript:snippet_"+(m.sv||m.version),{time:function(){var e=new Date;function t(e){var t=""+e;return 1===t.length&&(t="0"+t),t}return e.getUTCFullYear()+"-"+t(1+e.getUTCMonth())+"-"+t(e.getUTCDate())+"T"+t(e.getUTCHours())+":"+t(e.getUTCMinutes())+":"+t(e.getUTCSeconds())+"."+((e.getUTCMilliseconds()/1e3).toFixed(3)+"").slice(2,5)+"Z"}(),iKey:e,name:"Microsoft.ApplicationInsights."+e.replace(/-/g,"")+"."+t,sampleRate:100,tags:n,data:{baseData:{ver:2}}}}var h=d.url||y.src;if(h){function a(e){var t,n,a,i,r,o,s,c,u,p,l;g=!0,m.queue=[],f||(f=!0,t=h,s=function(){var e={},t=d.connectionString;if(t)for(var n=t.split(";"),a=0;a<n.length;a++){var i=n[a].split("=");2===i.length&&(e[i[0][b]()]=i[1])}if(!e[C]){var r=e.endpointsuffix,o=r?e.location:null;e[C]="https://"+(o?o+".":"")+"dc."+(r||"services.visualstudio.com")}return e}(),c=s[D]||d[D]||"",u=s[C],p=u?u+"/v2/track":d.endpointUrl,(l=[]).push((n="SDK LOAD Failure: Failed to load Application Insights SDK script (See stack for details)",a=t,i=p,(o=(r=v(c,"Exception")).data).baseType="ExceptionData",o.baseData.exceptions=[{typeName:"SDKLoadFailed",message:n.replace(/\./g,"-"),hasFullStack:!1,stack:n+"\nSnippet failed to load ["+a+"] -- Telemetry is disabled\nHelp Link: https://go.microsoft.com/fwlink/?linkid=2128109\nHost: "+(S&&S.pathname||"_unknown_")+"\nEndpoint: "+i,parsedStack:[]}],r)),l.push(function(e,t,n,a){var i=v(c,"Message"),r=i.data;r.baseType="MessageData";var o=r.baseData;return o.message='AI (Internal): 99 message:"'+("SDK LOAD 
Failure: Failed to load Application Insights SDK script (See stack for details) ("+n+")").replace(/\"/g,"")+'"',o.properties={endpoint:a},i}(0,0,t,p)),function(e,t){if(JSON){var n=T.fetch;if(n&&!y.useXhr)n(t,{method:N,body:JSON.stringify(e),mode:"cors"});else if(XMLHttpRequest){var a=new XMLHttpRequest;a.open(N,t),a.setRequestHeader("Content-type","application/json"),a.send(JSON.stringify(e))}}}(l,p))}function i(e,t){f||setTimeout(function(){!t&&m.core||a()},500)}var e=function(){var n=l.createElement(k);n.src=h;var e=y[w];return!e&&""!==e||"undefined"==n[w]||(n[w]=e),n.onload=i,n.onerror=a,n.onreadystatechange=function(e,t){"loaded"!==n.readyState&&"complete"!==n.readyState||i(0,t)},n}();y.ld<0?l.getElementsByTagName("head")[0].appendChild(e):setTimeout(function(){l.getElementsByTagName(k)[0].parentNode.appendChild(e)},y.ld||0)}try{m.cookie=l.cookie}catch(p){}function t(e){for(;e.length;)!function(t){m[t]=function(){var e=arguments;g||m.queue.push(function(){m[t].apply(m,e)})}}(e.pop())}var n="track",r="TrackPage",o="TrackEvent";t([n+"Event",n+"PageView",n+"Exception",n+"Trace",n+"DependencyData",n+"Metric",n+"PageViewPerformance","start"+r,"stop"+r,"start"+o,"stop"+o,"addTelemetryInitializer","setAuthenticatedUserContext","clearAuthenticatedUserContext","flush"]),m.SeverityLevel={Verbose:0,Information:1,Warning:2,Error:3,Critical:4};var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[I]&&!0!==s[I]){var c="onerror";t(["_"+c]);var u=T[c];T[c]=function(e,t,n,a,i){var r=u&&u(e,t,n,a,i);return!0!==r&&m["_"+c]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);function a(){y.onInit&&y.onInit(n)}(T[t]=n).queue&&0===n.queue.length?(n.queue.push(a),n.trackPageView({})):a()}(window,document,{
src: "https://js.monitor.azure.com/scripts/b/ai.2.gbl.min.js", // The SDK URL Source // name: "appInsights", // Global SDK Instance name defaults to "appInsights" when not supplied // ld: 0, // Defines the load delay (in ms) before attempting to load the sdk. -1 = block page load and add to head. (default) = 0ms load after timeout,
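<!-- Illustrative sketch, not part of this diff: the SharePoint article points src at
     the "gbl" bundle (ai.2.gbl.min.js, the global/non-module build). The snippet code
     reads d[D] where D="instrumentationKey", so an instrumentation-key setup would
     close the configuration object roughly like this; the key is a placeholder: -->
<!--
    cfg: { // Application Insights SDK configuration passed to the loaded SDK
      instrumentationKey: "00000000-0000-0000-0000-000000000000"
    }});
-->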
azure-monitor Usage Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/usage-overview.md
The best experience is obtained by installing Application Insights both in your
```html <script type="text/javascript">
- !function(T,l,y){ ... var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[C]&&!0!==s[C]){method="onerror",t(["_"+method]);var c=T[method];T[method]=function(e,t,n,a,i){var 
r=c&&c(e,t,n,a,i);return!0!==r&&m["_"+method]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);(T[t]=n).queue&&0===n.queue.length&&n.trackPageView({})}(window,document,{
+ !function(T,l,y){var S=T.location,k="script",D="instrumentationKey",C="ingestionendpoint",I="disableExceptionTracking",E="ai.device.",b="toLowerCase",w="crossOrigin",N="POST",e="appInsightsSDK",t=y.name||"appInsights";(y.name||T[e])&&(T[e]=t);var n=T[t]||function(d){var g=!1,f=!1,m={initialize:!0,queue:[],sv:"5",version:2,config:d};function v(e,t){var n={},a="Browser";return n[E+"id"]=a[b](),n[E+"type"]=a,n["ai.operation.name"]=S&&S.pathname||"_unknown_",n["ai.internal.sdkVersion"]="javascript:snippet_"+(m.sv||m.version),{time:function(){var e=new Date;function t(e){var t=""+e;return 1===t.length&&(t="0"+t),t}return e.getUTCFullYear()+"-"+t(1+e.getUTCMonth())+"-"+t(e.getUTCDate())+"T"+t(e.getUTCHours())+":"+t(e.getUTCMinutes())+":"+t(e.getUTCSeconds())+"."+((e.getUTCMilliseconds()/1e3).toFixed(3)+"").slice(2,5)+"Z"}(),iKey:e,name:"Microsoft.ApplicationInsights."+e.replace(/-/g,"")+"."+t,sampleRate:100,tags:n,data:{baseData:{ver:2}}}}var h=d.url||y.src;if(h){function a(e){var t,n,a,i,r,o,s,c,u,p,l;g=!0,m.queue=[],f||(f=!0,t=h,s=function(){var e={},t=d.connectionString;if(t)for(var n=t.split(";"),a=0;a<n.length;a++){var i=n[a].split("=");2===i.length&&(e[i[0][b]()]=i[1])}if(!e[C]){var r=e.endpointsuffix,o=r?e.location:null;e[C]="https://"+(o?o+".":"")+"dc."+(r||"services.visualstudio.com")}return e}(),c=s[D]||d[D]||"",u=s[C],p=u?u+"/v2/track":d.endpointUrl,(l=[]).push((n="SDK LOAD Failure: Failed to load Application Insights SDK script (See stack for details)",a=t,i=p,(o=(r=v(c,"Exception")).data).baseType="ExceptionData",o.baseData.exceptions=[{typeName:"SDKLoadFailed",message:n.replace(/\./g,"-"),hasFullStack:!1,stack:n+"\nSnippet failed to load ["+a+"] -- Telemetry is disabled\nHelp Link: https://go.microsoft.com/fwlink/?linkid=2128109\nHost: "+(S&&S.pathname||"_unknown_")+"\nEndpoint: "+i,parsedStack:[]}],r)),l.push(function(e,t,n,a){var i=v(c,"Message"),r=i.data;r.baseType="MessageData";var o=r.baseData;return o.message='AI (Internal): 99 message:"'+("SDK LOAD 
Failure: Failed to load Application Insights SDK script (See stack for details) ("+n+")").replace(/\"/g,"")+'"',o.properties={endpoint:a},i}(0,0,t,p)),function(e,t){if(JSON){var n=T.fetch;if(n&&!y.useXhr)n(t,{method:N,body:JSON.stringify(e),mode:"cors"});else if(XMLHttpRequest){var a=new XMLHttpRequest;a.open(N,t),a.setRequestHeader("Content-type","application/json"),a.send(JSON.stringify(e))}}}(l,p))}function i(e,t){f||setTimeout(function(){!t&&m.core||a()},500)}var e=function(){var n=l.createElement(k);n.src=h;var e=y[w];return!e&&""!==e||"undefined"==n[w]||(n[w]=e),n.onload=i,n.onerror=a,n.onreadystatechange=function(e,t){"loaded"!==n.readyState&&"complete"!==n.readyState||i(0,t)},n}();y.ld<0?l.getElementsByTagName("head")[0].appendChild(e):setTimeout(function(){l.getElementsByTagName(k)[0].parentNode.appendChild(e)},y.ld||0)}try{m.cookie=l.cookie}catch(p){}function t(e){for(;e.length;)!function(t){m[t]=function(){var e=arguments;g||m.queue.push(function(){m[t].apply(m,e)})}}(e.pop())}var n="track",r="TrackPage",o="TrackEvent";t([n+"Event",n+"PageView",n+"Exception",n+"Trace",n+"DependencyData",n+"Metric",n+"PageViewPerformance","start"+r,"stop"+r,"start"+o,"stop"+o,"addTelemetryInitializer","setAuthenticatedUserContext","clearAuthenticatedUserContext","flush"]),m.SeverityLevel={Verbose:0,Information:1,Warning:2,Error:3,Critical:4};var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[I]&&!0!==s[I]){var c="onerror";t(["_"+c]);var u=T[c];T[c]=function(e,t,n,a,i){var r=u&&u(e,t,n,a,i);return!0!==r&&m["_"+c]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);function a(){y.onInit&&y.onInit(n)}(T[t]=n).queue&&0===n.queue.length?(n.queue.push(a),n.trackPageView({})):a()}(window,document,{
src: "https://js.monitor.azure.com/scripts/b/ai.2.min.js", // The SDK URL Source // name: "appInsights", // Global SDK Instance name defaults to "appInsights" when not supplied // ld: 0, // Defines the load delay (in ms) before attempting to load the sdk. -1 = block page load and add to head. (default) = 0ms load after timeout,
azure-monitor Website Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/website-monitoring.md
Application Insights can gather telemetry data from any internet-connected application
```html <script type="text/javascript">
- !function(T,l,y){ ... var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[C]&&!0!==s[C]){method="onerror",t(["_"+method]);var c=T[method];T[method]=function(e,t,n,a,i){var 
r=c&&c(e,t,n,a,i);return!0!==r&&m["_"+method]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);(T[t]=n).queue&&0===n.queue.length&&n.trackPageView({})}(window,document,{
+ !function(T,l,y){var S=T.location,k="script",D="instrumentationKey",C="ingestionendpoint",I="disableExceptionTracking",E="ai.device.",b="toLowerCase",w="crossOrigin",N="POST",e="appInsightsSDK",t=y.name||"appInsights";(y.name||T[e])&&(T[e]=t);var n=T[t]||function(d){var g=!1,f=!1,m={initialize:!0,queue:[],sv:"5",version:2,config:d};function v(e,t){var n={},a="Browser";return n[E+"id"]=a[b](),n[E+"type"]=a,n["ai.operation.name"]=S&&S.pathname||"_unknown_",n["ai.internal.sdkVersion"]="javascript:snippet_"+(m.sv||m.version),{time:function(){var e=new Date;function t(e){var t=""+e;return 1===t.length&&(t="0"+t),t}return e.getUTCFullYear()+"-"+t(1+e.getUTCMonth())+"-"+t(e.getUTCDate())+"T"+t(e.getUTCHours())+":"+t(e.getUTCMinutes())+":"+t(e.getUTCSeconds())+"."+((e.getUTCMilliseconds()/1e3).toFixed(3)+"").slice(2,5)+"Z"}(),iKey:e,name:"Microsoft.ApplicationInsights."+e.replace(/-/g,"")+"."+t,sampleRate:100,tags:n,data:{baseData:{ver:2}}}}var h=d.url||y.src;if(h){function a(e){var t,n,a,i,r,o,s,c,u,p,l;g=!0,m.queue=[],f||(f=!0,t=h,s=function(){var e={},t=d.connectionString;if(t)for(var n=t.split(";"),a=0;a<n.length;a++){var i=n[a].split("=");2===i.length&&(e[i[0][b]()]=i[1])}if(!e[C]){var r=e.endpointsuffix,o=r?e.location:null;e[C]="https://"+(o?o+".":"")+"dc."+(r||"services.visualstudio.com")}return e}(),c=s[D]||d[D]||"",u=s[C],p=u?u+"/v2/track":d.endpointUrl,(l=[]).push((n="SDK LOAD Failure: Failed to load Application Insights SDK script (See stack for details)",a=t,i=p,(o=(r=v(c,"Exception")).data).baseType="ExceptionData",o.baseData.exceptions=[{typeName:"SDKLoadFailed",message:n.replace(/\./g,"-"),hasFullStack:!1,stack:n+"\nSnippet failed to load ["+a+"] -- Telemetry is disabled\nHelp Link: https://go.microsoft.com/fwlink/?linkid=2128109\nHost: "+(S&&S.pathname||"_unknown_")+"\nEndpoint: "+i,parsedStack:[]}],r)),l.push(function(e,t,n,a){var i=v(c,"Message"),r=i.data;r.baseType="MessageData";var o=r.baseData;return o.message='AI (Internal): 99 message:"'+("SDK LOAD 
Failure: Failed to load Application Insights SDK script (See stack for details) ("+n+")").replace(/\"/g,"")+'"',o.properties={endpoint:a},i}(0,0,t,p)),function(e,t){if(JSON){var n=T.fetch;if(n&&!y.useXhr)n(t,{method:N,body:JSON.stringify(e),mode:"cors"});else if(XMLHttpRequest){var a=new XMLHttpRequest;a.open(N,t),a.setRequestHeader("Content-type","application/json"),a.send(JSON.stringify(e))}}}(l,p))}function i(e,t){f||setTimeout(function(){!t&&m.core||a()},500)}var e=function(){var n=l.createElement(k);n.src=h;var e=y[w];return!e&&""!==e||"undefined"==n[w]||(n[w]=e),n.onload=i,n.onerror=a,n.onreadystatechange=function(e,t){"loaded"!==n.readyState&&"complete"!==n.readyState||i(0,t)},n}();y.ld<0?l.getElementsByTagName("head")[0].appendChild(e):setTimeout(function(){l.getElementsByTagName(k)[0].parentNode.appendChild(e)},y.ld||0)}try{m.cookie=l.cookie}catch(p){}function t(e){for(;e.length;)!function(t){m[t]=function(){var e=arguments;g||m.queue.push(function(){m[t].apply(m,e)})}}(e.pop())}var n="track",r="TrackPage",o="TrackEvent";t([n+"Event",n+"PageView",n+"Exception",n+"Trace",n+"DependencyData",n+"Metric",n+"PageViewPerformance","start"+r,"stop"+r,"start"+o,"stop"+o,"addTelemetryInitializer","setAuthenticatedUserContext","clearAuthenticatedUserContext","flush"]),m.SeverityLevel={Verbose:0,Information:1,Warning:2,Error:3,Critical:4};var s=(d.extensionConfig||{}).ApplicationInsightsAnalytics||{};if(!0!==d[I]&&!0!==s[I]){var c="onerror";t(["_"+c]);var u=T[c];T[c]=function(e,t,n,a,i){var r=u&&u(e,t,n,a,i);return!0!==r&&m["_"+c]({message:e,url:t,lineNumber:n,columnNumber:a,error:i}),r},d.autoExceptionInstrumented=!0}return m}(y.cfg);function a(){y.onInit&&y.onInit(n)}(T[t]=n).queue&&0===n.queue.length?(n.queue.push(a),n.trackPageView({})):a()}(window,document,{
src: "https://js.monitor.azure.com/scripts/b/ai.2.min.js", // The SDK URL Source // name: "appInsights", // Global SDK Instance name defaults to "appInsights" when not supplied // ld: 0, // Defines the load delay (in ms) before attempting to load the sdk. -1 = block page load and add to head. (default) = 0ms load after timeout,
azure-monitor Data Collector Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/data-collector-api.md
While the Data Collector API should cover most of your needs to collect free-for
|||| | [Custom events](../app/api-custom-events-metrics.md?toc=%2Fazure%2Fazure-monitor%2Ftoc.json#properties): Native SDK-based ingestion in Application Insights | Application Insights, typically instrumented through an SDK within your application, offers the ability for you to send custom data through Custom Events. | <ul><li> Data that is generated within your application, but not picked up by SDK through one of the default data types (requests, dependencies, exceptions, and so on).</li><li> Data that is most often correlated to other application data in Application Insights </li></ul> | | Data Collector API in Azure Monitor Logs | The Data Collector API in Azure Monitor Logs is a completely open-ended way to ingest data. Any data formatted in a JSON object can be sent here. Once sent, it is processed and made available in Logs to be correlated with other data in Logs or against other Application Insights data. <br/><br/> It is fairly easy to upload the data as files to an Azure Blob storage blob, from where these files will be processed and uploaded to Log Analytics. See [this](./create-pipeline-datacollector-api.md) article for a sample implementation of such a pipeline. | <ul><li> Data that is not necessarily generated within an application instrumented within Application Insights.</li><li> Examples include lookup and fact tables, reference data, pre-aggregated statistics, and so on. </li><li> Intended for data that will be cross-referenced against other Azure Monitor data (Application Insights, other Logs data types, Security Center, Container insights/VMs, and so on). </li></ul> |
-| [Azure Data Explorer](/azure/data-explorer/ingest-data-overview) | Azure Data Explorer (ADX) is the data platform that powers Application Insights Analytics and Azure Monitor Logs. Now Generally Available ("GA"), using the data platform in its raw form provides you complete flexibility (but requiring the overhead of management) over the cluster (Kubernetes RBAC, retention rate, schema, and so on). ADX provides many [ingestion options](/azure/data-explorer/ingest-data-overview#ingestion-methods) including [CSV, TSV, and JSON](/azure/kusto/management/mappings?branch=master) files. | <ul><li> Data that will not be correlated to any other data under Application Insights or Logs. </li><li> Data requiring advanced ingestion or processing capabilities not today available in Azure Monitor Logs. </li></ul> |
+| [Azure Data Explorer](/azure/data-explorer/ingest-data-overview) | Azure Data Explorer (ADX) is the data platform that powers Application Insights Analytics and Azure Monitor Logs. Now Generally Available ("GA"), using the data platform in its raw form gives you complete flexibility over the cluster (Kubernetes RBAC, retention rate, schema, and so on), but with the overhead of managing it. ADX provides many [ingestion options](/azure/data-explorer/ingest-data-overview#ingestion-methods) including [CSV, TSV, and JSON](/azure/kusto/management/mappings) files. | <ul><li> Data that will not be correlated to any other data under Application Insights or Logs. </li><li> Data requiring advanced ingestion or processing capabilities not available today in Azure Monitor Logs. </li></ul> |
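Each request to the Data Collector API is authorized with an HMAC-SHA256 signature over a canonical string built from the request. A minimal sketch of building that `Authorization` header (the helper name and sample parameter values are illustrative):

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id, shared_key, date, content_length,
                    method="POST", content_type="application/json",
                    resource="/api/logs"):
    # Canonical string per the Data Collector API authorization scheme;
    # `date` is an RFC 1123 timestamp matching the x-ms-date header.
    string_to_hash = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{date}\n{resource}")
    decoded_key = base64.b64decode(shared_key)  # workspace key is base64
    digest = hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                      hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode("utf-8")
    return f"SharedKey {workspace_id}:{signature}"
```

The resulting value is sent in the `Authorization` header alongside the matching `x-ms-date` header.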
## Next steps
azure-monitor Logs Data Export https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/logs-data-export.md
Title: Log Analytics workspace data export in Azure Monitor (preview) description: Log Analytics data export allows you to continuously export data of selected tables from your Log Analytics workspace to an Azure storage account or Azure Event Hubs as it's collected. +
Log Analytics data export can write append blobs to immutable storage accounts w
Data is sent to your event hub in near-real-time as it reaches Azure Monitor. An event hub is created for each data type that you export with the name *am-* followed by the name of the table. For example, the table *SecurityEvent* would be sent to an event hub named *am-SecurityEvent*. If you want the exported data to reach a specific event hub, or if you have a table with a name that exceeds the 47-character limit, you can provide your own event hub name and export all data for defined tables to it. > [!IMPORTANT]
-> The [number of supported event hubs per namespace is 10](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). If you export more than 10 tables,provide your own event hub name to export all your tables to that event hub.
+> The [number of supported event hubs per namespace is 10](../../event-hubs/event-hubs-quotas.md#common-limits-for-all-tiers). If you export more than 10 tables, provide your own event hub name to export all your tables to that event hub.
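The default naming rule above, including the fallback to a custom event hub name, can be sketched as (the helper name is illustrative; the 47-character limit is the one quoted in this section):

```python
TABLE_NAME_LIMIT = 47  # length limit quoted in this article

def export_event_hub_name(table_name, custom_name=None):
    # A custom event hub name overrides the default "am-<table>" rule;
    # it is also required when the table name exceeds the limit.
    if custom_name:
        return custom_name
    if len(table_name) > TABLE_NAME_LIMIT:
        raise ValueError("table name exceeds the limit; "
                         "provide a custom event hub name")
    return f"am-{table_name}"
```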
Considerations: 1. The 'Basic' event hub SKU supports a lower event size [limit](../../event-hubs/event-hubs-quotas.md#basic-vs-standard-tiers), and some logs in your workspace can exceed it and be dropped. We recommend using a 'Standard' or 'Dedicated' event hub as the export destination.
If you have configured your Storage Account to allow access from selected networ
[![Storage account firewalls and virtual networks](media/logs-data-export/storage-account-vnet.png)](media/logs-data-export/storage-account-vnet.png#lightbox) - ### Create or update data export rule
-A data export rule defines data to be exported for a set of tables to a single destination. You can create a single rule for each destination.
+A data export rule defines the tables for which data is exported and the destination. Currently, you can create a single rule for each destination.
+
+If you need a list of tables in your workspace when configuring export rules, run this query in your workspace.
+```kusto
+find where TimeGenerated > ago(24h) | distinct Type
+```
# [Azure portal](#tab/portal)
N/A
# [Azure CLI](#tab/azure-cli)
-Use the following CLI command to view tables in your workspace. It can help copy the tables you want and include in data export rule.
-
-```azurecli
-az monitor log-analytics workspace table list --resource-group resourceGroupName --workspace-name workspaceName --query [].name --output table
-```
- Use the following command to create a data export rule to a storage account using CLI. ```azurecli
azure-netapp-files Azure Netapp Files Solution Architectures https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/azure-netapp-files-solution-architectures.md
na ms.devlang: na Previously updated : 03/03/2021 Last updated : 03/08/2021 # Solution architectures using Azure NetApp Files
This section provides references for solutions for Linux OSS applications and da
### Oracle
-* [Oracle database performance on Azure NetApp Files single volumes](performance-oracle-single-volumes.md)
* [Oracle on Azure deployment best practice guide using Azure NetApp Files](https://www.netapp.com/us/media/tr-4780.pdf) * [Oracle VM images and their deployment on Microsoft Azure: Shared storage configuration options](../virtual-machines/workloads/oracle/oracle-vm-solutions.md#shared-storage-configuration-options)
+* [Oracle database performance on Azure NetApp Files single volumes](performance-oracle-single-volumes.md)
* [Benefits of using Azure NetApp Files with Oracle Database](solutions-benefits-azure-netapp-files-oracle-database.md) ### Machine Learning
This section provides references to SAP on Azure solutions.
* [High availability of SAP HANA Scale-up with Azure NetApp Files on Red Hat Enterprise Linux](../virtual-machines/workloads/sap/sap-hana-high-availability-netapp-files-red-hat.md) * [SAP HANA scale-out with standby node on Azure VMs with Azure NetApp Files on SUSE Linux Enterprise Server](../virtual-machines/workloads/sap/sap-hana-scale-out-standby-netapp-files-suse.md) * [SAP HANA scale-out with standby node on Azure VMs with Azure NetApp Files on Red Hat Enterprise Linux](../virtual-machines/workloads/sap/sap-hana-scale-out-standby-netapp-files-rhel.md)
+* [SAP HANA scale-out with HSR and Pacemaker on RHEL - Azure Virtual Machines](../virtual-machines/workloads/sap/sap-hana-high-availability-scale-out-hsr-rhel.md)
* [Azure Application Consistent Snapshot tool (AzAcSnap)](azacsnap-introduction.md) ### SAP AnyDB
+* [Oracle Azure Virtual Machines DBMS deployment for SAP workload - Azure Virtual Machines](../virtual-machines/workloads/sap/dbms_guide_oracle.md#oracle-configuration-guidelines-for-sap-installations-in-azure-vms-on-linux)
* [Deploy SAP AnyDB (Oracle 19c) with Azure NetApp Files](https://techcommunity.microsoft.com/t5/running-sap-applications-on-the/deploy-sap-anydb-oracle-19c-with-azure-netapp-files/ba-p/2064043) ### SAP IQ-NLS
azure-percept Troubleshoot Dev Kit https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/troubleshoot-dev-kit.md
There are three small LEDs on top of the carrier board housing. A cloud icon is
|LED |State |Description | |--|--|--| |LED 1 (IoT Hub) |On (solid) |Device is connected to an IoT Hub. |
-|LED 2 (Wi-Fi) |Slow blink |Device authentication in progress. |
+|LED 2 (Wi-Fi) |Slow blink |Device is ready to be configured by Wi-Fi Easy Connect and is announcing its presence to a configurator. |
|LED 2 (Wi-Fi) |Fast blink |Authentication was successful, device association in progress. | |LED 2 (Wi-Fi) |On (solid) |Authentication and association were successful; device is connected to a Wi-Fi network. | |LED 3 |NA |LED not in use. |
azure-resource-manager Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/templates/overview.md
Title: Templates overview description: Describes the benefits using Azure Resource Manager templates (ARM templates) for deployment of resources. Previously updated : 03/03/2021 Last updated : 03/08/2021 # What are ARM templates?
REQUEST BODY
Notice that the **apiVersion** you set in the template for the resource is used as the API version for the REST operation. You can repeatedly deploy the template and have confidence it will continue to work. By using the same API version, you don't have to worry about breaking changes that might be introduced in later versions.
+To deploy a template, use any of the following options:
+
+* [Azure portal](deploy-portal.md)
+* [Azure CLI](deploy-cli.md)
+* [PowerShell](deploy-powershell.md)
+* [REST API](deploy-rest.md)
+* [Button in GitHub repository](deploy-to-azure-button.md)
+* [Azure Cloud Shell](deploy-cloud-shell.md)
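+For instance, the REST API option boils down to a PUT of a deployment resource under `Microsoft.Resources/deployments`. A sketch that only constructs the request (the `api-version` and the Incremental-mode default are assumptions to verify against the REST reference):

```python
import json

def build_deployment_request(subscription_id, resource_group, name,
                             template, parameters=None,
                             api_version="2020-10-01"):
    # A template deployment is a PUT of a deployment resource; the body
    # carries the template itself and its parameter values.
    url = (f"https://management.azure.com/subscriptions/{subscription_id}"
           f"/resourcegroups/{resource_group}/providers"
           f"/Microsoft.Resources/deployments/{name}"
           f"?api-version={api_version}")
    body = {"properties": {"mode": "Incremental",
                           "template": template,
                           "parameters": parameters or {}}}
    return url, json.dumps(body)
```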
+ ## Template design How you define templates and resource groups is entirely up to you and how you want to manage your solution. For example, you can deploy your three tier application through a single template to a single resource group.
azure-sql Azure Defender For Sql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/azure-defender-for-sql.md
Title: Azure Defender for SQL description: Learn about functionality for managing your database vulnerabilities and detecting anomalous activities that could indicate a threat to your database in Azure SQL Database, Azure SQL Managed Instance, or Azure Synapse.- ms.devlang:
Previously updated : 02/02/2021 Last updated : 03/08/2021 # Azure Defender for SQL+ [!INCLUDE[appliesto-sqldb-sqlmi-asa](../includes/appliesto-sqldb-sqlmi-asa.md)] Azure Defender for SQL is a unified package for advanced SQL security capabilities. Azure Defender is available for Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics. It includes functionality for discovering and classifying sensitive data, surfacing and mitigating potential database vulnerabilities, and detecting anomalous activities that could indicate a threat to your database. It provides a single go-to location for enabling and managing these capabilities.
Enable Azure Defender for SQL once to enable all these included features. With o
For more information about Azure Defender for SQL pricing, see the [Azure Security Center pricing page](https://azure.microsoft.com/pricing/details/security-center/).
-## Enable Azure Defender
+## Enable Azure Defender
+There are multiple ways to enable Azure Defender plans. You can enable them at the subscription level (**recommended**) from:
+
+- [Azure Security Center](#enable-azure-defender-for-azure-sql-database-at-the-subscription-level-from-azure-security-center)
+- [Programmatically with the REST API, Azure CLI, PowerShell, or Azure Policy](#enable-azure-defender-plans-programatically)
+
+Alternatively, you can enable it at the resource level as described in [Enable Azure Defender for Azure SQL Database at the resource level](#enable-azure-defender-for-azure-sql-database-at-the-resource-level).
+
+### Enable Azure Defender for Azure SQL Database at the subscription level from Azure Security Center
+To enable Azure Defender for Azure SQL Database at the subscription level from within Azure Security Center:
+
+1. From the [Azure portal](https://portal.azure.com), open **Security Center**.
+1. From Security Center's menu, select **Pricing and settings**.
+1. Select the relevant subscription.
+1. Change the plan setting to **On**.
+
+ :::image type="content" source="media/azure-defender-for-sql/enable-azure-defender-sql-subscription-level.png" alt-text="Enabling Azure Defender for Azure SQL Database at the subscription level.":::
+
+1. Select **Save**.
++
+### Enable Azure Defender plans programatically
-Azure Defender can be accessed through the [Azure portal](https://portal.azure.com). Enable Azure Defender by navigating to **Security Center** under the **Security** heading for your server or managed instance.
+The flexibility of Azure allows for a number of programmatic methods for enabling Azure Defender plans.
+
+Use any of the following tools to enable Azure Defender for your subscription:
+
+| Method | Instructions |
+|--|-|
+| REST API | [Pricings API](/rest/api/securitycenter/pricings) |
+| Azure CLI | [az security pricing](/cli/azure/security/pricing) |
+| PowerShell | [Set-AzSecurityPricing](/powershell/module/az.security/set-azsecuritypricing) |
+| Azure Policy | [Bundle Pricings](https://github.com/Azure/Azure-Security-Center/blob/master/Pricing%20%26%20Settings/ARM%20Templates/Set-ASC-Bundle-Pricing.json) |
+| | |
+
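+As an illustration of the REST API row in the table above, enabling a plan is a PUT of a `Microsoft.Security/pricings` resource. A sketch that only constructs the request (the plan name, tier values, and `api-version` should be verified against the linked Pricings API reference):

```python
import json

def build_pricing_request(subscription_id, plan_name="SqlServers",
                          tier="Standard", api_version="2018-06-01"):
    # PUT-ing this URL and body turns the plan on ("Standard") or
    # off ("Free") for the whole subscription.
    url = (f"https://management.azure.com/subscriptions/{subscription_id}"
           f"/providers/Microsoft.Security/pricings/{plan_name}"
           f"?api-version={api_version}")
    body = json.dumps({"properties": {"pricingTier": tier}})
    return url, body
```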
+### Enable Azure Defender for Azure SQL Database at the resource level
+
+We recommend enabling Azure Defender plans at the subscription level, which can help avoid the creation of unprotected resources. However, if you have an organizational reason to enable Azure Defender at the server level, use the following steps:
+
+1. From the [Azure portal](https://portal.azure.com), open your server or managed instance.
+1. Under the **Security** heading, select **Security Center**.
+1. Select **Enable Azure Defender for SQL**.
+
+ :::image type="content" source="media/azure-defender-for-sql/enable-azure-defender.png" alt-text="Enable Azure Defender for SQL from within Azure SQL databases.":::
> [!NOTE] > A storage account is automatically created and configured to store your **Vulnerability Assessment** scan results. If you've already enabled Azure Defender for another server in the same resource group and region, then the existing storage account is used. > > The cost of Azure Defender is aligned with Azure Security Center standard tier pricing per node, where a node is the entire server or managed instance. You are thus paying only once for protecting all databases on the server or managed instance with Azure Defender. You can try Azure Defender out initially with a free trial. -
-## Track vulnerabilities and investigate threat alerts
-
-Click the **Vulnerability Assessment** card to view and manage vulnerability scans and reports, and to track your security stature. If security alerts have been received, click the **Advanced Threat Protection** card to view details of the alerts and to see a consolidated report on all alerts in your Azure subscription via the Azure Security Center security alerts page.
## Manage Azure Defender settings
To view and manage Azure Defender settings:
On this page, you'll see the status of Azure Defender for SQL:
- :::image type="content" source="media/azure-defender-for-sql/status-of-defender-for-sql.png" alt-text="Checking the status of Azure Defender for SQL inside Azure SQL databases":::
+ :::image type="content" source="media/azure-defender-for-sql/status-of-defender-for-sql.png" alt-text="Checking the status of Azure Defender for SQL inside Azure SQL databases.":::
1. If Azure Defender for SQL is enabled, you'll see a **Configure** link as shown in the previous graphic. To edit the settings for Azure Defender for SQL, select **Configure**.
- :::image type="content" source="media/azure-defender-for-sql/security-server-settings.png" alt-text="security server settings":::
+ :::image type="content" source="media/azure-defender-for-sql/security-server-settings.png" alt-text="Settings for Azure Defender for SQL.":::
1. Make the necessary changes and select **Save**.
azure-sql Features Comparison https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/features-comparison.md
Previously updated : 02/21/2021 Last updated : 03/08/2021 # Features comparison: Azure SQL Database and Azure SQL Managed Instance
Azure SQL Database and SQL Managed Instance share a common code base with the la
- Security features - [Application roles](/sql/relational-databases/security/authentication-access/application-roles), [Dynamic data masking](/sql/relational-databases/security/dynamic-data-masking) ([see getting started guide](dynamic-data-masking-overview.md)), [Row Level Security](/sql/relational-databases/security/row-level-security), and Threat detection - see getting started guides for [SQL Database](threat-detection-configure.md) and [SQL Managed Instance](../managed-instance/threat-detection-configure.md). - Multi-model capabilities - [Graph processing](/sql/relational-databases/graphs/sql-graph-overview), [JSON data](/sql/relational-databases/json/json-data-sql-server) ([see getting started guide](json-features.md)), [OPENXML](/sql/t-sql/functions/openxml-transact-sql), [Spatial](/sql/relational-databases/spatial/spatial-data-sql-server), [OPENJSON](/sql/t-sql/functions/openjson-transact-sql), and [XML indexes](/sql/t-sql/statements/create-xml-index-transact-sql).
-Azure manages your databases and guarantees their high-availability. Some features that might affect high-availability or can't be used in PaaS world have limited functionalities in SQL Database and SQL Managed Instance. These features are described in the tables below. If you need more details about the differences, you can find them in the separate pages for [Azure SQL Database](../managed-instance/transact-sql-tsql-differences-sql-server.md) or [Azure SQL Managed Instance](../managed-instance/transact-sql-tsql-differences-sql-server.md).
+Azure manages your databases and guarantees their high availability. Some features that might affect high availability, or that can't be used in the PaaS world, have limited functionality in SQL Database and SQL Managed Instance. These features are described in the tables below.
+
+If you need more details about the differences, you can find them in the separate pages:
+- [Azure SQL Database vs. SQL Server differences](transact-sql-tsql-differences-sql-server.md)
+- [Azure SQL Managed Instance vs. SQL Server differences](../managed-instance/transact-sql-tsql-differences-sql-server.md)
## Features of SQL Database and SQL Managed Instance
For more information about Azure SQL Database and Azure SQL Managed Instance, se
- [What is Azure SQL Database?](sql-database-paas-overview.md) - [What is Azure SQL Managed Instance?](../managed-instance/sql-managed-instance-paas-overview.md)-- [What is an Azure SQL Managed Instance pool?](../managed-instance/instance-pools-overview.md)
+- [What is an Azure SQL Managed Instance pool?](../managed-instance/instance-pools-overview.md)
azure-sql Firewall Create Server Level Portal Quickstart https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/firewall-create-server-level-portal-quickstart.md
Sign in to the [Azure portal](https://portal.azure.com/).
## Create a server-level IP firewall rule
- SQL Database creates a firewall at the server level for single and pooled databases. This firewall prevents client applications from connecting to the server or any of its databases unless you create an IP firewall rule to open the firewall. For a connection from an IP address outside Azure, create a firewall rule for a specific IP address or range of addresses that you want to be able to connect. For more information about server-level and database-level IP firewall rules, see [Server-level and database-level IP firewall rules](firewall-configure.md).
+ SQL Database creates a firewall at the server level for single and pooled databases. This firewall prevents client applications from connecting to the server or any of its databases unless you create an IP firewall rule to open the firewall. For a connection from an IP address outside Azure, create a firewall rule for a specific IP address or range of addresses that you want to be able to connect from. For more information about server-level and database-level IP firewall rules, see [Server-level and database-level IP firewall rules](firewall-configure.md).
> [!NOTE] > Azure SQL Database communicates over port 1433. If you're trying to connect from within a corporate network, outbound traffic over port 1433 might not be allowed by your network's firewall. If so, you can't connect to your server unless your IT department opens port 1433.
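Conceptually, a server-level rule is just an inclusive IPv4 range, and the connection check amounts to a range test. A simplified sketch for illustration only (the actual gateway logic is more involved):

```python
from ipaddress import IPv4Address

def ip_allowed(client_ip, rules):
    # rules: list of (start_ip, end_ip) inclusive server-level ranges.
    ip = IPv4Address(client_ip)
    return any(IPv4Address(start) <= ip <= IPv4Address(end)
               for start, end in rules)
```

A rule for a single address simply uses the same start and end IP.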
azure-sql High Availability Sla https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/high-availability-sla.md
The zone redundant version of the high availability architecture for the general
> [!NOTE] > General Purpose databases with a size of 80 vcore may experience performance degradation with zone redundant configuration. Additionally, operations such as backup, restore, database copy, and setting up Geo-DR relationships may experience slower performance for any single databases larger than 1 TB.
+>
+> [!NOTE]
+> The preview is not covered under Reserved Instance.
## Premium and Business Critical service tier locally redundant availability
Azure SQL Database and Azure SQL Managed Instance feature a built-in high availa
- Learn about [Service Fabric](../../service-fabric/service-fabric-overview.md) - Learn about [Azure Traffic Manager](../../traffic-manager/traffic-manager-overview.md) - Learn [How to initiate a manual failover on SQL Managed Instance](../managed-instance/user-initiated-failover.md)-- For more options for high availability and disaster recovery, see [Business Continuity](business-continuity-high-availability-disaster-recover-hadr-overview.md)
+- For more options for high availability and disaster recovery, see [Business Continuity](business-continuity-high-availability-disaster-recover-hadr-overview.md)
azure-sql Sql Server To Sql Database Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/migration-guides/database/sql-server-to-sql-database-guide.md
For more information about tools available to use for the Discover phase, see [S
### Assess + After data sources have been discovered, assess any on-premises SQL Server database(s) that can be migrated to Azure SQL Database to identify migration blockers or compatibility issues. You can use the Data Migration Assistant (version 4.1 and later) to assess databases to get:
To speed up migration to Azure SQL Database, you should consider the following r
| | Resource contention | Recommendation | |--|--|--| | **Source (typically on premises)** |Primary bottleneck during migration in source is DATA I/O and latency on DATA file which needs to be monitored carefully. |Based on DATA I/O and DATA file latency and depending on whether it's a virtual machine or physical server, you will have to engage your storage admin and explore options to mitigate the bottleneck. |
-|**Target (Azure SQL Database)**|Biggest limiting factor is the log generation rate and latency on log file. With Azure SQL Database, you can get a maximum of 96 MB/s log generation rate. | To speed up migration, scale up the target SQL DB to Business Critical Gen5 8 vcore to get the maximum log generation rate of 96 MB/s and also achieve low latency for log file. The [Hyperscale](../../database/service-tier-hyperscale.md) service tier provides 100 MB/s log rate regardless of chosen service level |
+|**Target (Azure SQL Database)**|Biggest limiting factor is the log generation rate and latency on the log file. With Azure SQL Database, you can get a maximum of 96-MB/s log generation rate. | To speed up migration, scale up the target SQL DB to Business Critical Gen5 8 vCore to get the maximum log generation rate of 96 MB/s and also achieve low latency for the log file. The [Hyperscale](../../database/service-tier-hyperscale.md) service tier provides a 100-MB/s log rate regardless of the chosen service level. |
|**Network** |Network bandwidth needed is equal to max log ingestion rate 96 MB/s (768 Mb/s) |Depending on network connectivity from your on-premises data center to Azure, check your network bandwidth (typically [Azure ExpressRoute](../../../expressroute/expressroute-introduction.md#bandwidth-options)) to accommodate the maximum log ingestion rate. | |**Virtual machine used for Data Migration Assistant (DMA)** |CPU is the primary bottleneck for the virtual machine running DMA |To speed up data migration: </br>- Use Azure compute-intensive VMs </br>- Use at least an F8s_v2 (8 vCore) VM for running DMA </br>- Ensure the VM runs in the same Azure region as the target |
-|**Azure Database Migration Service (DMS)** |Compute resource contention and database objects consideration for DMS |Use Premium 4 vCore. DMS automatically takes care of database objects like foreign keys, triggers, constraints and non-clustered indexes and doesn't need any manual intervention. |
+|**Azure Database Migration Service (DMS)** |Compute resource contention and database objects consideration for DMS |Use Premium 4 vCore. DMS automatically takes care of database objects like foreign keys, triggers, constraints, and non-clustered indexes and doesn't need manual intervention. |
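The throughput figures in the table above translate directly into a bandwidth requirement and a rough time estimate; a quick sanity check (assuming decimal units, so 1 TB is taken as 1,000,000 MB):

```python
def required_bandwidth_mbps(log_rate_mb_s):
    # 1 byte = 8 bits, so MB/s * 8 = Mb/s.
    return log_rate_mb_s * 8

def migration_seconds(db_size_mb, log_rate_mb_s=96):
    # Rough lower bound: per the table, the log generation rate
    # is the limiting factor on the target.
    return db_size_mb / log_rate_mb_s
```

At 96 MB/s, a 1-TB database needs roughly three hours of sustained log ingestion at best.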
## Post-migration
azure-sql Sql Server To Sql Database Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/migration-guides/database/sql-server-to-sql-database-overview.md
The following table lists the recommended migration tools:
|Technology | Description| |||
+| [Azure Migrate](/azure/migrate/how-to-create-azure-sql-assessment) | Azure Migrate for Azure SQL allows you to discover and assess your SQL data estate at scale on VMware, providing Azure SQL deployment recommendations, target sizing, and monthly estimates. |
|[Data Migration Assistant (DMA)](/sql/dma/dma-migrateonpremsqltosqldb)|The Data Migration Assistant is a desktop tool that provides seamless assessments of SQL Server and migrations to Azure SQL Database (both schema and data). The tool can be installed on a server on-premises or on your local machine that has connectivity to your source databases. The migration process is a logical data movement between objects in the source and target database. </br> - Migrate single databases (both schema and data)| |[Azure Database Migration Service (DMS)](../../../dms/tutorial-sql-server-to-azure-sql.md)|A first party Azure service that can migrate your SQL Server databases to Azure SQL Database using the Azure portal or automated with PowerShell. Azure DMS requires you to select a preferred Azure Virtual Network (VNet) during provisioning to ensure there is connectivity to your source SQL Server databases. </br> - Migrate single databases or at scale. | | | |
azure-sql Sql Server To Managed Instance Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/migration-guides/managed-instance/sql-server-to-managed-instance-guide.md
For more information about tools available to use for the Discover phase, see [S
### Assess + After data sources have been discovered, assess any on-premises SQL Server instance(s) that can be migrated to Azure SQL Managed Instance to identify migration blockers or compatibility issues. You can use the Data Migration Assistant (version 4.1 and later) to assess databases to get:
azure-sql Sql Server To Managed Instance Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/migration-guides/managed-instance/sql-server-to-managed-instance-overview.md
Some general guidelines to help you choose the right service tier and characteri
You can choose compute and storage resources during deployment and then change them after using the [Azure portal](../../database/scale-resources.md) without incurring downtime for your application. > [!IMPORTANT]
-> Any discrepancy in the [managed instance virtual network requirements](../../managed-instance/connectivity-architecture-overview.md#network-requirements) can prevent you from creating new instances or using existing ones. Learn more about [creating new](../../managed-instance/virtual-network-subnet-create-arm-template.md) and [configuring existing](../../managed-instance/vnet-existing-add-subnet.md?branch=release-ignite-arc-data) networks.
+> Any discrepancy in the [managed instance virtual network requirements](../../managed-instance/connectivity-architecture-overview.md#network-requirements) can prevent you from creating new instances or using existing ones. Learn more about [creating new](../../managed-instance/virtual-network-subnet-create-arm-template.md) and [configuring existing](../../managed-instance/vnet-existing-add-subnet.md) networks.
### SQL Server VM alternative
The following table lists the recommended migration tools:
|Technology | Description| |||
+| [Azure Migrate](/azure/migrate/how-to-create-azure-sql-assessment) | Azure Migrate for Azure SQL allows you to discover and assess your SQL data estate at scale when on VMware, providing Azure SQL deployment recommendations, target sizing, and monthly estimates. |
|[Azure Database Migration Service (DMS)](../../../dms/tutorial-sql-server-to-managed-instance.md) | First party Azure service that supports migration in the offline mode for applications that can afford downtime during the migration process. Unlike the continuous migration in online mode, offline mode migration runs a one-time restore of a full database backup from the source to the target. | |[Native backup and restore](../../managed-instance/restore-sample-database-quickstart.md) | SQL Managed Instance supports RESTORE of native SQL Server database backups (.bak files), making it the easiest migration option for customers who can provide full database backups to Azure storage. Full and differential backups are also supported and documented in the [migration assets section](#migration-assets) later in this article.|
-|[Log Replay Service (LRS)](../../managed-instance/log-replay-service-migrate.md) | This is a cloud service enabled for Managed Instance based on the SQL Server log shipping technology, making it a migration option for customers who can provide full, differential and log database backups to Azure storage. LRS is used to restore backup files from Azure Blob Storage to SQL Managed Instance.|
+|[Log Replay Service (LRS)](../../managed-instance/log-replay-service-migrate.md) | This is a cloud service enabled for Managed Instance based on the SQL Server log shipping technology, making it a migration option for customers who can provide full, differential, and log database backups to Azure storage. LRS is used to restore backup files from Azure Blob Storage to SQL Managed Instance.|
| | | ### Alternative tools
azure-sql Sql Server To Sql On Azure Vm Individual Databases Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/migration-guides/virtual-machines/sql-server-to-sql-on-azure-vm-individual-databases-guide.md
For additional discovery tools, see [Services and tools](../../../dms/dms-tools-
### Assess + After you've discovered all of the data sources, use the [Data Migration Assistant (DMA)](/sql/dma/dma-overview) to assess on-premises SQL Server instance(s) migrating to an instance of SQL Server on Azure VM to understand the gaps between the source and target instances.
To perform a standard migration using backup and restore, follow these steps:
1. Pause/stop any applications that are using databases intended for migration. 1. Ensure user database(s) are inactive using [single user mode](/sql/relational-databases/databases/set-a-database-to-single-user-mode). 1. Perform a full database backup to an on-premises location.
-1. Copy your on-premises backup file(s) to your VM using remote desktop, [Azure Data Explorer](/azure/data-explorer/data-explorer-overview), or the [AZCopy command line utility](../../../storage/common/storage-use-azcopy-v10.md) (> 2 TB backups recommended).
+1. Copy your on-premises backup file(s) to your VM using remote desktop, [Azure Data Explorer](/azure/data-explorer/data-explorer-overview), or the [AZCopy command line utility](../../../storage/common/storage-use-azcopy-v10.md) (> 2-TB backups recommended).
1. Restore full database backup(s) to the SQL Server on Azure VM. ### Log shipping (minimize downtime)
To perform a minimal downtime migration using backup, restore, and log shipping,
1. Set up connectivity to the target SQL Server on Azure VM, based on your requirements. See [Connect to a SQL Server Virtual Machine on Azure (Resource Manager)](../../virtual-machines/windows/ways-to-connect-to-sql.md). 1. Ensure the on-premises user database(s) to be migrated are in the full or bulk-logged recovery model. 1. Perform a full database backup to an on-premises location, and modify any existing full database backup jobs to use the [COPY_ONLY](/sql/relational-databases/backup-restore/copy-only-backups-sql-server) keyword to preserve the log chain.
-1. Copy your on-premises backup file(s) to your VM using remote desktop, [Azure Data Explorer](/azure/data-explorer/data-explorer-overview), or the [AZCopy command line utility](../../../storage/common/storage-use-azcopy-v10.md) (>1 TB backups recommended).
+1. Copy your on-premises backup file(s) to your VM using remote desktop, [Azure Data Explorer](/azure/data-explorer/data-explorer-overview), or the [AZCopy command line utility](../../../storage/common/storage-use-azcopy-v10.md) (>1-TB backups recommended).
1. Restore the full database backup(s) on the SQL Server on Azure VM. 1. Set up [log shipping](/sql/database-engine/log-shipping/configure-log-shipping-sql-server) between the on-premises database and the target SQL Server on Azure VM. Be sure not to reinitialize the database(s), as this has already been completed in the previous steps. 1. **Cut over** to the target server.
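The backup, copy, and restore steps above can be sketched as command builders. This is an illustrative helper only; the database name, file paths, storage account, and SAS token are placeholders, not values from this article.

```python
# Hypothetical helpers that build the command strings for the log-shipping
# migration steps above. All names and paths are placeholder values.
def copy_only_backup_sql(db: str, backup_path: str) -> str:
    # COPY_ONLY preserves the existing log chain, as required in step 3.
    return (f"BACKUP DATABASE [{db}] TO DISK = N'{backup_path}' "
            "WITH COPY_ONLY, CHECKSUM;")

def azcopy_upload(local_path: str, container_url: str, sas: str) -> str:
    # AzCopy upload of the backup file toward the Azure VM (step 4).
    return f'azcopy copy "{local_path}" "{container_url}?{sas}"'

print(copy_only_backup_sql("SalesDb", r"D:\backups\SalesDb.bak"))
print(azcopy_upload(r"D:\backups\SalesDb.bak",
                    "https://mystorageacct.blob.core.windows.net/backups/SalesDb.bak",
                    "sv=<sas-token>"))
```

Run the generated T-SQL with a tool such as `sqlcmd` or SSMS; the AzCopy string corresponds to the copy step described earlier.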
azure-vmware Tutorial Deploy Vmware Hcx https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/tutorial-deploy-vmware-hcx.md
Last updated 11/25/2020
This article shows you how to deploy and configure the on-premises VMware HCX Connector for your Azure VMware Solution private cloud. With VMware HCX, you can migrate your VMware workloads to Azure VMware Solution and other connected sites through various migration types. Because Azure VMware Solution deploys and configures the HCX Cloud Manager, you must download, activate, and configure the HCX Connector in your on-premises VMware datacenter.
-VMware HCX Advanced Connector is pre-deployed in Azure VMware Solution. It supports up to three site connections (on-premises to cloud, or cloud to cloud). If you need more than three site connections, submit a [support request](https://portal.azure.com/#create/Microsoft.Support) to enable the [VMware HCX Enterprise](https://cloud.vmware.com/community/2019/08/08/introducing-hcx-enterprise/) add-on. The add-on is currently in preview.
+VMware HCX Advanced Connector is pre-deployed in Azure VMware Solution. It supports up to three site connections (on-premises to cloud, or cloud to cloud). If you need more than three site connections, submit a [support request](https://portal.azure.com/#create/Microsoft.Support) to enable the [VMware HCX Enterprise](https://cloud.vmware.com/community/2019/08/08/introducing-hcx-enterprise/) add-on.
>[!TIP]
->Although the VMware Configuration Maximum tool describes site pairs maximum to be 25 between the on-premises Connector and Cloud Manager, the licensing limits this to three for Advanced and 10 for Enterprise Edition.
+>Although the VMware Configuration Maximum tool describes a maximum of 25 site pairs between the on-premises Connector and Cloud Manager, licensing limits this to three for HCX Advanced and 10 for HCX Enterprise Edition.
>[!NOTE] >VMware HCX Enterprise is available with Azure VMware Solution as a preview service. It's free and is subject to terms and conditions for a preview service. After the VMware HCX Enterprise service is generally available, you'll get a 30-day notice that billing will switch over. You'll also have the option to turn off or opt-out of the service. There is no simple downgrade path from VMware HCX Enterprise to VMware HCX Advanced. If you decide to downgrade, you'll have to redeploy, incurring downtime.
Make sure that your on-premises vSphere environment (source environment) meets t
### Network and ports
-* [Azure ExpressRoute Global Reach](tutorial-expressroute-global-reach-private-cloud.md) is configured between on-premises and Azure VMware Solution SDDC ExpressRoute circuits.
+* [Azure ExpressRoute Global Reach](tutorial-expressroute-global-reach-private-cloud.md) is configured between on-premises and Azure VMware Solution private cloud ExpressRoute circuits.
-* [All required ports](https://ports.vmware.com/home/VMware-HCX) are open for communication between on-premises components and Azure VMware Solution SDDC.
+* [All required ports](https://ports.vmware.com/home/VMware-HCX) are open for communication between on-premises components and Azure VMware Solution private cloud.
### IP addresses
After the services restart, you'll see vCenter showing as green on the screen th
For an end-to-end overview of this procedure, view the [Azure VMware Solution: Activate HCX](https://www.youtube.com/embed/PnVg6SZkQsY?rel=0&amp;vq=hd720) video. > [!IMPORTANT]
- > Whether you're using VMware HCX Advanced or VMware HCX Enterprise, you may need to install the patch from VMware's [KB article 81558](https://kb.vmware.com/s/article/81558).
+ > Whether you're using HCX Advanced or HCX Enterprise, you may need to install the patch from VMware's [KB article 81558](https://kb.vmware.com/s/article/81558).
## Configure the VMware HCX Connector
For an end-to-end overview of this procedure, view the [Azure VMware Solution: C
### Create a service mesh
-Now it's time to configure a service mesh between on-premises and Azure VMware Solution SDDC.
+Now it's time to configure a service mesh between on-premises and Azure VMware Solution private cloud.
For an end-to-end overview of this procedure, view the [Azure VMware Solution: N
## Next steps
-If the appliance interconnect tunnel status is **UP** and green, you can migrate and protect Azure VMware Solution VMs by using VMware HCX. Azure VMware Solution supports workload migrations (with or without a network extension). You can still migrate workloads in your vSphere environment, along with on-premises creation of networks and deployment of VMs onto those networks.
+If the HCX interconnect tunnel status is **UP** and green, you can migrate and protect Azure VMware Solution VMs by using VMware HCX. Azure VMware Solution supports workload migrations (with or without a network extension). You can still migrate workloads in your vSphere environment, along with on-premises creation of networks and deployment of VMs onto those networks.
For more information on using HCX, go to the VMware technical documentation:
azure-vmware Windows Server Failover Cluster https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/windows-server-failover-cluster.md
+
+ Title: Windows Server Failover Cluster on Azure VMware Solution vSAN with native shared disks
+description: Set up Windows Server Failover Cluster (WSFC) on Azure VMware Solution and take advantage of solutions requiring WSFC capability.
+ Last updated : 03/08/2021++
+# Windows Server Failover Cluster on Azure VMware Solution vSAN with native shared disks
+
+In this article, we'll walk through setting up Windows Server Failover Cluster on Azure VMware Solution. The implementation in this article is for proof of concept and pilot purposes.
+
+Windows Server Failover Cluster (WSFC), previously known as Microsoft Cluster Service (MSCS), is a feature of the Windows Server operating system (OS). WSFC is a business-critical feature that many applications require. For example, WSFC is required for the following configurations:
+
+- SQL Server configured as:
+ - Always On Failover Cluster Instance (FCI), for instance-level high availability.
+ - Always On Availability Group (AG), for database-level high availability.
+- Windows File Services
+ - Generic File share running on active cluster node.
+ - Scale-Out File Server (SOFS), which stores files in cluster shared volumes (CSV).
+ - Storage Spaces Direct (S2D); local disks used to create storage pools across different cluster nodes.
+
+You can host the WSFC cluster on different Azure VMware Solution instances, known as Cluster-Across-Box (CAB). You can also place the WSFC cluster on a single Azure VMware Solution node. This configuration is known as Cluster-in-a-Box (CIB). We don't recommend using a CIB solution for a production implementation. Were the single Azure VMware Solution node to fail, all WSFC cluster nodes would be powered off, and the application would experience downtime. Azure VMware Solution requires a minimum of three nodes in a private cloud cluster.
+
+It's important to deploy a supported WSFC configuration. You'll want your solution to be supported on vSphere and with Azure VMware Solution. VMware provides a detailed document about WSFC on vSphere 6.7, [Setup for Failover
+Clustering and Microsoft
+Cluster Service](https://docs.vmware.com/en/VMware-vSphere/6.7/vsphere-esxi-vcenter-server-67-setup-mscs.pdf).
+
+This article focuses on WSFC on Windows Server 2016 and Windows Server 2019. Older Windows Server versions are out of [mainstream support](https://support.microsoft.com/lifecycle/search?alpha=windows%20server) and so we don't consider them here.
+
+You'll need to first [create a WSFC](https://docs.microsoft.com/windows-server/failover-clustering/create-failover-cluster). For more information on WSFC, see [Failover Clustering in Windows Server](https://docs.microsoft.com/windows-server/failover-clustering/failover-clustering-overview). Use the information we provide in this article for the specifics of a WSFC deployment on Azure VMware Solution.
+
+## Prerequisites
+
+- Azure VMware Solution environment
+- Microsoft Windows Server OS installation media
+
+## Reference architecture
+
+Azure VMware Solution provides native support for virtualized WSFC. It supports SCSI-3 Persistent Reservations (SCSI3PR) on a virtual disk level. This support is required by WSFC to arbitrate access to a shared disk between nodes. Support of SCSI3PRs enables configuration of WSFC with a disk resource shared between VMs natively on vSAN datastores.
+
+The following diagram illustrates the architecture of WSFC virtual nodes on an Azure VMware Solution private cloud. It shows where Azure VMware Solution resides, including the WSFC virtual servers (red box), in relation to the broader Azure platform. This diagram illustrates a typical hub-spoke architecture, but a similar setup is possible with the use of Azure Virtual WAN. Both offer all the value other Azure services can bring you.
+
+[![Diagram showing the architecture of WSFC virtual nodes on an Azure VMware Solution private cloud.](media/windows-server-failover-cluster/windows-server-failover-architecture.png)](media/windows-server-failover-cluster/windows-server-failover-architecture.png#lightbox)
+
+## Supported configurations
+
+Currently, the following configurations are supported:
+
+- Microsoft Windows Server 2012 or later.
+- Up to five failover clustering nodes per cluster.
+- Up to four PVSCSI adapters per VM.
+- Up to 64 disks per PVSCSI adapter.
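The supported limits above can be captured in a small validation helper. This is a minimal sketch; the `WsfcPlan` type and its field names are hypothetical and not part of any Azure VMware Solution API.

```python
# Hypothetical sketch: check a planned WSFC deployment against the supported
# limits listed above (<= 5 cluster nodes, <= 4 PVSCSI adapters per VM,
# <= 64 disks per PVSCSI adapter).
from dataclasses import dataclass

@dataclass
class WsfcPlan:
    nodes: int
    pvscsi_adapters_per_vm: int
    disks_per_adapter: int

def is_supported(plan: WsfcPlan) -> bool:
    return (plan.nodes <= 5
            and plan.pvscsi_adapters_per_vm <= 4
            and plan.disks_per_adapter <= 64)

print(is_supported(WsfcPlan(nodes=3, pvscsi_adapters_per_vm=2, disks_per_adapter=10)))  # True
```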
+
+## Virtual Machine configuration requirements
+
+### WSFC node configuration parameters
+
+- Install the latest VMware Tools on each WSFC node.
+- Mixing non-shared and shared disks on a single virtual SCSI adapter isn't supported. For example, if the system disk (drive C:) is attached to SCSI0:0, the first shared disk would be attached to SCSI1:0. A VM node of a WSFC has the same virtual SCSI controller maximum as an ordinary VM - up to four (4) virtual SCSI Controllers.
+- Virtual disk SCSI IDs should be consistent across all VMs hosting nodes of the same WSFC.
+
+| **Component** | **Requirements** |
+| | |
+| VM hardware version | 11 or above to support Live vMotion. |
+| Virtual NIC | VMXNET3 paravirtualized network interface card (NIC); enable the in-guest Windows Receive Side Scaling (RSS) on the virtual NIC. |
+| Memory | Use full VM reservation memory for nodes in the WSFC cluster. |
+| Increase the I/O timeout of each WSFC node. | Set HKEY\_LOCAL\_MACHINE\System\CurrentControlSet\Services\Disk\TimeOutValue to 60 seconds or more. (If you recreate the cluster, this value might be reset to its default, so you must change it again.) |
+| Windows cluster health monitoring | The value of the SameSubnetThreshold Parameter of Windows cluster health monitoring must be modified to allow 10 missed heartbeats at minimum. This is [the default in Windows Server 2016](https://techcommunity.microsoft.com/t5/failover-clustering/tuning-failover-cluster-network-thresholds/ba-p/371834). This recommendation applies to all applications using WSFC, including shared and non-shared disks. |
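The disk I/O timeout change in the table above can be applied with `reg.exe` on each node. The following sketch builds that command string; run the result in an elevated prompt on each WSFC node, and re-apply it after recreating the cluster, since the value can reset to its default.

```python
# Hypothetical helper: build the reg.exe command that raises the disk I/O
# timeout, per the WSFC node configuration table above (60 seconds or more).
def disk_timeout_command(seconds: int = 60) -> str:
    if seconds < 60:
        raise ValueError("WSFC nodes need a disk I/O timeout of 60 seconds or more")
    key = r"HKLM\SYSTEM\CurrentControlSet\Services\Disk"
    return f'reg add "{key}" /v TimeOutValue /t REG_DWORD /d {seconds} /f'

print(disk_timeout_command())
```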
+
+### WSFC node - Boot disks configuration parameters
++
+| **Component** | **Requirements** |
+| | |
+| SCSI Controller Type | LSI Logic SAS |
+| Disk mode | Virtual |
+| SCSI bus sharing | None |
+| Modify advanced settings for a virtual SCSI controller hosting the boot device. | Add the following advanced settings to each WSFC node:<br /> scsiX.returnNoConnectDuringAPD = "TRUE"<br />scsiX.returnBusyOnNoConnectStatus = "FALSE"<br />Where X is the boot device SCSI bus controller ID number. By default, X is set to 0. |
+
+### WSFC node - Shared disks configuration parameters
++
+| **Component** | **Requirements** |
+| | |
+| SCSI Controller Type | VMware Paravirtual (PVSCSI) |
+| Disk mode | Independent - Persistent (step 2 in illustration below). By using this setting, you ensure that all disks are excluded from snapshots. Snapshots aren't supported for WSFC-based VMs. |
+| SCSI bus sharing | Physical (step 1 in illustration below) |
+| Multi-writer flag | Not used |
+| Disk format | Thick provisioned. (Eager Zeroed Thick (EZT) isn't required with vSAN.) |
++
+## Non-supported scenarios
+
+The following functionalities aren't supported for WSFC on Azure VMware Solution:
+
+- NFS datastores
+- Storage Spaces
+- vSAN using iSCSI Service
+- vSAN Stretched Cluster
+- Enhanced vMotion Compatibility (EVC)
+- vSphere Fault Tolerance (FT)
+- Snapshots
+- Live (online) storage vMotion
+- N-Port ID Virtualization (NPIV)
+
+Hot changes to virtual machine hardware might disrupt the heartbeat between the WSFC nodes.
+
+The following activities aren't supported and might cause WSFC node failover:
+
+- Hot adding memory
+- Hot adding CPU
+- Using snapshots
+- Increasing the size of a shared disk
+- Pausing and resuming the virtual machine state
+- Memory over-commitment leading to ESXi swapping or VM memory ballooning
+- Hot extending a local VMDK file, even if it isn't associated with a SCSI bus sharing controller
+
+## Configure WSFC with shared disks on Azure VMware Solution vSAN
+
+1. Ensure that an Active Directory environment is available.
+2. Create virtual machines (VMs) on the vSAN datastore.
+3. Power on all VMs, configure the hostnames and IP addresses, join all VMs to an Active Directory domain, and install the latest available OS updates.
+4. Install the latest VMware Tools.
+5. Enable and configure the Windows Server Failover Cluster feature on each VM.
+6. Configure a Cluster Witness for quorum (a file share witness works fine).
+7. Power off all nodes of the WSFC cluster.
+8. Add one or more Paravirtual SCSI controllers (up to four) to each VM that's part of the WSFC. Use the settings described in the previous sections.
+9. On the first cluster node, add all needed shared disks using **Add New Device** > **Hard Disk**. Leave Disk sharing as **Unspecified** (default) and set Disk mode to **Independent - Persistent**. Attach the disks to the controller(s) created in the previous steps.
+10. Continue with the remaining WSFC nodes. Add the disks created in the previous step by selecting **Add New Device** > **Existing Hard Disk**. Be sure to maintain the same disk SCSI IDs on all WSFC nodes.
+11. Power on the first WSFC node; sign in and open the disk management console (mmc). Make sure the added shared disks can be managed by the OS and are initialized. Format the disks and assign a drive letter.
+12. Power on the other WSFC nodes.
+13. Add the disks to the WSFC cluster using the **Add Disk wizard**, and add them to a Cluster Shared Volume.
+14. Test a failover using the **Move disk wizard** and make sure the WSFC cluster with shared disks works properly.
+15. Run the **Validation Cluster wizard** to confirm whether the cluster and its nodes are working properly.
+
+ It's important to keep the following specific items from the Cluster Validation test in mind:
+
+ - **Validate Storage Spaces Persistent Reservation**. If you aren't using Storage Spaces with your cluster (such as on Azure VMware Solution vSAN), this test isn't applicable. You can ignore any results of the Validate Storage Spaces Persistent Reservation test including this warning. To avoid warnings, you can exclude this test.
+
+ - **Validate Network Communication**. The Cluster Validation test will throw a warning that only one network interface per cluster node is available. You may ignore this warning. Azure VMware Solution provides the required availability and performance needed, since the nodes are connected to one of the NSX-T segments. However, keep this item as part of the Cluster Validation test, as it will validate other aspects of network communication.
+
+16. Create a DRS rule to separate the WSFC VMs across Azure VMware Solution nodes. Use the following rules: one host-to-VM affinity rule and one VM-to-VM anti-affinity rule. This way, cluster nodes won't run on the same Azure VMware Solution host.
+
+ >[!NOTE]
+ > To create this DRS rule, you need to open a support request ticket. The Azure support organization can help you with this.
+
+## Related information
+
+- [Failover Clustering in Windows Server](https://docs.microsoft.com/windows-server/failover-clustering/failover-clustering-overview)
+- [Guidelines for Microsoft Clustering on vSphere (1037959) (vmware.com)](https://kb.vmware.com/s/article/1037959)
+- [About Setup for Failover Clustering and Microsoft Cluster Service (vmware.com)](https://docs.vmware.com/en/VMware-vSphere/6.7/com.vmware.vsphere.mscs.doc/GUID-1A2476C0-CA66-4B80-B6F9-8421B6983808.html)
+- [vSAN 6.7 U3 - WSFC with Shared Disks &amp; SCSI-3 Persistent Reservations (vmware.com)](https://blogs.vmware.com/virtualblocks/2019/08/23/vsan67-u3-wsfc-shared-disksupport/)
+- [Azure VMware Solution limits](../azure-resource-manager/management/azure-subscription-service-limits.md#azure-vmware-solution-limits)
+
+## Next steps
+
+Now that you've covered setting up a WSFC in Azure VMware Solution, you may want to learn about:
+
+- Setting up your new WSFC by adding more applications that require the WSFC capability. For instance, SQL Server and SAP ASCS.
+- Setting up a backup solution.
+ - [Setting up Azure Backup Server for Azure VMware Solution](https://docs.microsoft.com/azure/azure-vmware/set-up-backup-server-for-azure-vmware-solution)
+ - [Backup solutions for Azure VMware Solution virtual machines](https://docs.microsoft.com/azure/azure-vmware/ecosystem-back-up-vms)
backup Backup Azure Arm Userestapi Restoreazurevms https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/backup/backup-azure-arm-userestapi-restoreazurevms.md
As explained [above](#restore-operations), the following request body defines pr
```json {
- "parameters": {
- "subscriptionId": "00000000-0000-0000-0000-000000000000",
- "resourceGroupName": "testVaultRG",
- "vaultName": "testVault",
- "fabricName": "Azure",
- "containerName": "IaasVMContainer;iaasvmcontainerv2;testRG;testVM",
- "protectedItemName": "VM;iaasvmcontainerv2;testRG;testVM",
- "recoveryPointId": "348916168024334",
- "api-version": "2019-05-13",
- "parameters": {
- "properties": {
+ "properties": {
"objectType": "IaasVMRestoreRequest", "recoveryPointId": "348916168024334", "recoveryType": "AlternateLocation",
As explained [above](#restore-operations), the following request body defines pr
"originalStorageAccountOption": false, "encryptionDetails": { "encryptionEnabled": false
- }
- }
- }
- }
-}
+ }
+ }
``` The response should be handled in the same way as [explained above for restoring disks](#responses).
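The request body above is sent to the trigger-restore endpoint identified by the path parameters shown earlier (subscription, resource group, vault, container, protected item, and recovery point). The following sketch assembles that request; it is illustrative, not official SDK code, and the sample values come from this article.

```python
# Sketch: build the trigger-restore URL and payload for the
# IaasVMRestoreRequest body shown above. Sample values from this article.
import json

def restore_request(sub, rg, vault, container, item, recovery_point_id, body):
    # Management-plane URL for triggering a restore on a recovery point.
    url = (f"https://management.azure.com/subscriptions/{sub}"
           f"/resourceGroups/{rg}/providers/Microsoft.RecoveryServices"
           f"/vaults/{vault}/backupFabrics/Azure/protectionContainers/{container}"
           f"/protectedItems/{item}/recoveryPoints/{recovery_point_id}"
           "/restore?api-version=2019-05-13")
    return url, json.dumps(body)

body = {"properties": {"objectType": "IaasVMRestoreRequest",
                       "recoveryPointId": "348916168024334",
                       "recoveryType": "AlternateLocation"}}
url, payload = restore_request(
    "00000000-0000-0000-0000-000000000000", "testVaultRG", "testVault",
    "IaasVMContainer;iaasvmcontainerv2;testRG;testVM",
    "VM;iaasvmcontainerv2;testRG;testVM", "348916168024334", body)
```

POST the payload to the URL with a bearer token; the response is handled as described for restoring disks.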
The response should be handled in the same way as [explained above for restoring
For more information on the Azure Backup REST APIs, see the following documents: - [Azure Recovery Services provider REST API](/rest/api/recoveryservices/)-- [Get started with Azure REST API](/rest/api/azure/)
+- [Get started with Azure REST API](/rest/api/azure/)
batch Batch Pool Vm Sizes https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/batch/batch-pool-vm-sizes.md
Title: Choose VM sizes and images for pools description: How to choose from the available VM sizes and OS versions for compute nodes in Azure Batch pools Previously updated : 11/24/2020 Last updated : 03/08/2021
Batch pools in the Virtual Machine configuration support almost all [VM sizes](.
| DC | Not supported | | Dv2, DSv2 | All sizes | | Dv3, Dsv3 | All sizes |
-| Dav4 | All sizes |
-| Dasv4 | All sizes |
+| Dav4, Dasv4 | All sizes |
| Ddv4, Ddsv4 | All sizes | | Dv4, Dsv4 | Not supported | | Ev3, Esv3 | All sizes, except for E64is_v3 |
-| Eav4 | All sizes |
-| Easv4 | All sizes |
+| Eav4, Easv4 | All sizes |
| Edv4, Edsv4 | All sizes | | Ev4, Esv4 | Not supported | | F, Fs | All sizes |
Batch pools in the Virtual Machine configuration support almost all [VM sizes](.
| NC | All sizes | | NCv2 | All sizes | | NCv3 | All sizes |
-| NCasT4_v3 | None - not yet available |
+| NCasT4_v3 | All sizes |
| ND | All sizes | | NDv2 | None - not yet available | | NV | All sizes |
cognitive-services Anomaly Detection Streaming Databricks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Anomaly-Detector/tutorials/anomaly-detection-streaming-databricks.md
- Title: "Tutorial: Anomaly detection on streaming data using Azure Databricks"-
-description: Learn how to use the Anomaly Detector API and Azure Databricks to monitor anomalies in your data.
------- Previously updated : 03/05/2020---
-# Tutorial: Anomaly detection on streaming data using Azure Databricks
-
-[Azure Databricks](https://azure.microsoft.com/services/databricks/) is a fast, easy, and collaborative Apache SparkΓÇôbased analytics service. The Anomaly Detector API, part of Azure Cognitive Services, provides a way of monitoring your time series data. Use this tutorial to run anomaly detection on a stream of data in near real-time using Azure Databricks. You'll ingest twitter data using Azure Event Hubs, and import them into Azure Databricks using the Spark Event Hubs connector. Afterwards, you'll use the API to detect anomalies on the streamed data.
-
-The following illustration shows the application flow:
-
-![Azure Databricks with Event Hubs and Cognitive Services](../media/tutorials/databricks-cognitive-services-tutorial.png "Azure Databricks with Event Hubs and Cognitive Services")
-
-This tutorial covers the following tasks:
-
-> [!div class="checklist"]
-> * Create an Azure Databricks workspace
-> * Create a Spark cluster in Azure Databricks
-> * Create a Twitter app to access streaming data
-> * Create notebooks in Azure Databricks
-> * Attach libraries for Event Hubs and Twitter API
-> * Create an Anomaly Detector resource and retrieve the access key
-> * Send tweets to Event Hubs
-> * Read tweets from Event Hubs
-> * Run anomaly detection on tweets
-
-> [!Note]
-> * This tutorial introduces an approach to implementing the recommended [solution architecture](https://azure.microsoft.com/solutions/architecture/anomaly-detector-process/) for the Anomaly Detector API.
-> * This tutorial cannot be completed with a free tier (`F0`) subscription for the Anomaly Detector API or Azure Databricks.
-
-Create an [Azure subscription](https://azure.microsoft.com/free/cognitive-services) if you don't have one.
-
-## Prerequisites
--- An [Azure Event Hubs namespace](../../../event-hubs/event-hubs-create.md) and event hub.--- The [connection string](../../../event-hubs/event-hubs-get-connection-string.md) to access the Event Hubs namespace. The connection string should have a similar format to:-
- `Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key name>;SharedAccessKey=<key value>`.
--- The shared access policy name and policy key for Event Hubs.-
-See the Azure Event Hubs [quickstart](../../../event-hubs/event-hubs-create.md) for information about creating a namespace and event hub.
-
-## Create an Azure Databricks workspace
-
-In this section, you create an Azure Databricks workspace using the [Azure portal](https://portal.azure.com/).
-
-1. In the Azure portal, select **Create a resource** > **Analytics** > **Azure Databricks**.
-
- ![Azure Databricks on portal](../media/tutorials/azure-databricks-on-portal.png "Databricks on Azure portal")
-
-3. Under **Azure Databricks Service**, provide the following values to create a Databricks workspace:
--
- |Property |Description |
- |||
- |**Workspace name** | Provide a name for your Databricks workspace |
- |**Subscription** | From the drop-down, select your Azure subscription. |
- |**Resource group** | Specify whether you want to create a new resource group or use an existing one. A resource group is a container that holds related resources for an Azure solution. For more information, see [Azure Resource Group overview](../../../azure-resource-manager/management/overview.md). |
- |**Location** | Select **East US 2** or one of any other available regions. See [Azure services available by region](https://azure.microsoft.com/regions/services/) for region availability. |
- |**Pricing Tier** | Choose between **Standard** or **Premium**. Do NOT choose **Trial**. For more information on these tiers, see [Databricks pricing page](https://azure.microsoft.com/pricing/details/databricks/). |
-
- Select **Create**.
-
-4. The workspace creation takes a few minutes.
-
-## Create a Spark cluster in Databricks
-
-1. In the Azure portal, go to the Databricks workspace that you created, and then select **Launch Workspace**.
-
-2. You're redirected to the Azure Databricks portal. From the portal, select **New Cluster**.
-
- ![Databricks on Azure](../media/tutorials/databricks-on-azure.png "Databricks on Azure")
-
-3. In the **New Cluster** page, provide the values to create a cluster.
-
- ![Create Databricks Spark cluster on Azure](../media/tutorials/create-databricks-spark-cluster.png "Create Databricks Spark cluster on Azure")
-
- Accept all other default values other than the following:
-
- * Enter a name for the cluster.
- * For this article, create a cluster with **5.2** runtime. Do NOT select **5.3** runtime.
- * Make sure the **Terminate after \_\_ minutes of inactivity** checkbox is selected. Provide a duration (in minutes) to terminate the cluster, if the cluster isn't being used.
-
- Select **Create cluster**.
-4. The cluster creation takes several minutes. Once the cluster is running, you can attach notebooks to the cluster and run Spark jobs.
-
-## Create a Twitter application
-
-To receive a stream of tweets, you must create an application in Twitter. Follow the steps to create a Twitter application and record the values that you need to complete this tutorial.
-
-1. From a web browser, go to [Twitter Application Management](https://apps.twitter.com/), and select **Create New App**.
-
- ![Create Twitter application](../media/tutorials/databricks-create-twitter-app.png "Create Twitter application")
-
-2. In the **Create an application** page, provide the details for the new app, and then select **Create your Twitter application**.
-
- ![Twitter application details](../media/tutorials/databricks-provide-twitter-app-details.png "Twitter application details")
-
-3. In the application page, select the **Keys and Access Tokens** tab and copy the values for **Consumer Key** and **Consumer Secret**. Also, select **Create my access token** to generate the access tokens. Copy the values for **Access Token** and **Access Token Secret**.
-
- ![Twitter application details 2](../media/tutorials/twitter-app-key-secret.png "Twitter application details")
-
-Save the values that you retrieved for the Twitter application. You need the values later in the tutorial.
-
-## Attach libraries to Spark cluster
-
-In this tutorial, you use the Twitter APIs to send tweets to Event Hubs. You also use the [Apache Spark Event Hubs connector](https://github.com/Azure/azure-event-hubs-spark) to read and write data into Azure Event Hubs. To use these APIs as part of your cluster, add them as libraries to Azure Databricks and then associate them with your Spark cluster. The following instructions show how to add the libraries to the **Shared** folder in your workspace.
-
-1. In the Azure Databricks workspace, select **Workspace**, and then right-click **Shared**. From the context menu, select **Create** > **Library**.
-
- ![Add library dialog box](../media/tutorials/databricks-add-library-option.png "Add library dialog box")
-
-2. In the New Library page, for **Source** select **Maven**. For **Coordinates**, enter the coordinate for the package you want to add. Here are the Maven coordinates for the libraries used in this tutorial:
-
- * Spark Event Hubs connector - `com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.10`
- * Twitter API - `org.twitter4j:twitter4j-core:4.0.7`
-
- ![Provide Maven coordinates](../media/tutorials/databricks-eventhub-specify-maven-coordinate.png "Provide Maven coordinates")
-
-3. Select **Create**.
-
-4. Select the folder where you added the library, and then select the library name.
-
- ![Select library to add](../media/tutorials/select-library.png "Select library to add")
-
-5. If no cluster is listed on the library page, select **Clusters** and start the cluster you created earlier. Wait until its state shows **Running**, and then go back to the library page.
-On the library page, select the cluster where you want to use the library, and then select **Install**. Once the library is successfully associated with the cluster, the status changes to **Installed**.
-
- ![Install library to cluster](../media/tutorials/databricks-library-attached.png "Install library to cluster")
-
-6. Repeat these steps for the Twitter package, `twitter4j-core:4.0.7`.
-
-## Get a Cognitive Services access key
-
-In this tutorial, you use the [Azure Cognitive Services Anomaly Detector APIs](../overview.md) to run anomaly detection on a stream of tweets in near real time. Before you use the APIs, you must create an Anomaly Detector resource on Azure and retrieve an access key to use the Anomaly Detector APIs.
-
-1. Sign in to the [Azure portal](https://portal.azure.com/).
-
-2. Select **+ Create a resource**.
-
-3. Under Azure Marketplace, select **AI + Machine Learning** > **See all** > **Cognitive Services - More** > **Anomaly Detector**. Or you could use [this link](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesAnomalyDetector) to go to the **Create** dialog box directly.
-
- ![Create Anomaly Detector resource](../media/tutorials/databricks-cognitive-services-anomaly-detector.png "Create Anomaly Detector resource")
-
-4. In the **Create** dialog box, provide the following values:
-
- |Value |Description |
- |---|---|
- |Name | A name for the Anomaly Detector resource. |
- |Subscription | The Azure subscription the resource will be associated with. |
- |Location | An Azure location. |
- |Pricing tier | A pricing tier for the service. For more information about Anomaly Detector pricing, see [pricing page](https://azure.microsoft.com/pricing/details/cognitive-services/anomaly-detector/). |
- |Resource group | Specify whether you want to create a new resource group or select an existing one. |
-
- Select **Create**.
-
-5. After the resource is created, from the **Overview** tab, copy and save the **Endpoint** URL, as shown in the screenshot. Then select **Show access keys**.
-
- ![Show access keys](../media/tutorials/cognitive-services-get-access-keys.png "Show access keys")
-
-6. Under **Keys**, select the copy icon against the key you want to use. Save the access key.
-
- ![Copy access keys](../media/tutorials/cognitive-services-copy-access-keys.png "Copy access keys")
-
-## Create notebooks in Databricks
-
-In this section, you create two notebooks in the Databricks workspace with the following names:
-
-- **SendTweetsToEventHub** - A producer notebook you use to get tweets from Twitter and stream them to Event Hubs.
-- **AnalyzeTweetsFromEventHub** - A consumer notebook you use to read the tweets from Event Hubs and run anomaly detection.
-
-1. In the Azure Databricks workspace, select **Workspace** from the left pane. From the **Workspace** drop-down, select **Create**, and then select **Notebook**.
-
- ![Create notebook in Databricks](../media/tutorials/databricks-create-notebook.png "Create notebook in Databricks")
-
-2. In the **Create Notebook** dialog box, enter **SendTweetsToEventHub** as name, select **Scala** as the language, and select the Spark cluster that you created earlier.
-
- ![Notebook details](../media/tutorials/databricks-notebook-details.png "Create notebook in Databricks")
-
- Select **Create**.
-
-3. Repeat the steps to create the **AnalyzeTweetsFromEventHub** notebook.
-
-## Send tweets to Event Hubs
-
-In the **SendTweetsToEventHub** notebook, paste the following code, and replace the placeholders with values for the Event Hubs namespace and Twitter application that you created earlier. This notebook extracts the creation time and the number of "Like"s from tweets with the keyword "Azure", and streams them as events into Event Hubs in real time.
-
-```scala
-//
-// Send Data to Eventhub
-//
-
-import scala.collection.JavaConverters._
-import com.microsoft.azure.eventhubs._
-import java.util.concurrent._
-import com.google.gson.{Gson, GsonBuilder, JsonParser}
-import java.util.Date
-import scala.util.control._
-import twitter4j._
-import twitter4j.TwitterFactory
-import twitter4j.Twitter
-import twitter4j.conf.ConfigurationBuilder
-
-// Event Hub Config
-val namespaceName = "[Placeholder: EventHub namespace]"
-val eventHubName = "[Placeholder: EventHub name]"
-val sasKeyName = "[Placeholder: EventHub access key name]"
-val sasKey = "[Placeholder: EventHub access key key]"
-val connStr = new ConnectionStringBuilder()
- .setNamespaceName(namespaceName)
- .setEventHubName(eventHubName)
- .setSasKeyName(sasKeyName)
- .setSasKey(sasKey)
-
-// Connect to the Event Hub
-val pool = Executors.newScheduledThreadPool(1)
-val eventHubClient = EventHubClient.create(connStr.toString(), pool)
-
-def sendEvent(message: String) = {
- val messageData = EventData.create(message.getBytes("UTF-8"))
- eventHubClient.get().send(messageData)
- System.out.println("Sent event: " + message + "\n")
-}
-
-case class MessageBody(var timestamp: Date, var favorite: Int)
-val gson: Gson = new GsonBuilder().setDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").create()
-
-val twitterConsumerKey = "[Placeholder: Twitter consumer key]"
-val twitterConsumerSecret = "[Placeholder: Twitter consumer secret]"
-val twitterOauthAccessToken = "[Placeholder: Twitter oauth access token]"
-val twitterOauthTokenSecret = "[Placeholder: Twitter oauth token secret]"
-
-val cb = new ConfigurationBuilder()
-cb.setDebugEnabled(true)
- .setOAuthConsumerKey(twitterConsumerKey)
- .setOAuthConsumerSecret(twitterConsumerSecret)
- .setOAuthAccessToken(twitterOauthAccessToken)
- .setOAuthAccessTokenSecret(twitterOauthTokenSecret)
-
-val twitterFactory = new TwitterFactory(cb.build())
-val twitter = twitterFactory.getInstance()
-
-// Getting tweets with keyword "Azure" and sending them to the Event Hub in real time!
-
-val query = new Query(" #Azure ")
-query.setCount(100)
-query.lang("en")
-
-var finished = false
-var maxStatusId = Long.MinValue
-var preMaxStatusId = Long.MinValue
-val innerLoop = new Breaks
-while (!finished) {
- val result = twitter.search(query)
- val statuses = result.getTweets()
- var lowestStatusId = Long.MaxValue
- innerLoop.breakable {
- for (status <- statuses.asScala) {
- if (status.getId() <= preMaxStatusId) {
- preMaxStatusId = maxStatusId
- innerLoop.break
- }
- if(!status.isRetweet()) {
- sendEvent(gson.toJson(new MessageBody(status.getCreatedAt(), status.getFavoriteCount())))
- }
- lowestStatusId = Math.min(status.getId(), lowestStatusId)
- maxStatusId = Math.max(status.getId(), maxStatusId)
- }
- }
-
- if (lowestStatusId == Long.MaxValue) {
- preMaxStatusId = maxStatusId
- }
- Thread.sleep(10000)
- query.setMaxId(lowestStatusId - 1)
-}
-
-// Close connection to the Event Hub
-eventHubClient.get().close()
-pool.shutdown()
-```
-
-To run the notebook, press **SHIFT + ENTER**. You see output like the following snippet. Each event in the output is a combination of a timestamp and the number of "Like"s, ingested into Event Hubs.
-
-```output
- Sent event: {"timestamp":"2019-04-24T09:39:40.000Z","favorite":0}
-
- Sent event: {"timestamp":"2019-04-24T09:38:48.000Z","favorite":1}
-
- Sent event: {"timestamp":"2019-04-24T09:38:36.000Z","favorite":0}
-
- Sent event: {"timestamp":"2019-04-24T09:37:27.000Z","favorite":0}
-
- Sent event: {"timestamp":"2019-04-24T09:37:00.000Z","favorite":2}
-
- Sent event: {"timestamp":"2019-04-24T09:31:11.000Z","favorite":0}
-
- Sent event: {"timestamp":"2019-04-24T09:30:15.000Z","favorite":0}
-
- Sent event: {"timestamp":"2019-04-24T09:30:02.000Z","favorite":1}
-
- ...
- ...
-```
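Each event body is the Gson serialization of the `MessageBody` case class shown above. As a minimal illustration of that payload format (using only the standard library instead of the notebook's Gson dependency, so it is a sketch rather than the tutorial's exact code path):

```scala
import java.text.SimpleDateFormat
import java.util.{Date, TimeZone}

// Reproduce the date format the notebook configures on its GsonBuilder.
val fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'")
fmt.setTimeZone(TimeZone.getTimeZone("UTC"))

// Build the same JSON shape as gson.toJson(new MessageBody(timestamp, favorite)).
def toEventJson(timestamp: Date, favorite: Int): String =
  s"""{"timestamp":"${fmt.format(timestamp)}","favorite":$favorite}"""

println("Sent event: " + toEventJson(new Date(0L), 2))
// → Sent event: {"timestamp":"1970-01-01T00:00:00.000Z","favorite":2}
```

The consumer notebook later parses exactly this shape with the `bodySchema` it declares (`timestamp` plus `favorite`).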
-
-## Read tweets from Event Hubs
-
-In the **AnalyzeTweetsFromEventHub** notebook, paste the following code, and replace the placeholders with values for the Anomaly Detector resource that you created earlier. This notebook reads the tweets that you earlier streamed into Event Hubs using the **SendTweetsToEventHub** notebook.
-
-First, write a client to call the Anomaly Detector API.
-```scala
-
-//
-// Anomaly Detection Client
-//
-
-import java.io.{BufferedReader, DataOutputStream, InputStreamReader}
-import java.net.URL
-import java.sql.Timestamp
-
-import com.google.gson.{Gson, GsonBuilder, JsonParser}
-import javax.net.ssl.HttpsURLConnection
-
-case class Point(var timestamp: Timestamp, var value: Double)
-case class Series(var series: Array[Point], var maxAnomalyRatio: Double, var sensitivity: Int, var granularity: String)
-case class AnomalySingleResponse(var isAnomaly: Boolean, var isPositiveAnomaly: Boolean, var isNegativeAnomaly: Boolean, var period: Int, var expectedValue: Double, var upperMargin: Double, var lowerMargin: Double, var suggestedWindow: Int)
-case class AnomalyBatchResponse(var expectedValues: Array[Double], var upperMargins: Array[Double], var lowerMargins: Array[Double], var isAnomaly: Array[Boolean], var isPositiveAnomaly: Array[Boolean], var isNegativeAnomaly: Array[Boolean], var period: Int)
-
-object AnomalyDetector extends Serializable {
-
- // Cognitive Services API connection settings
- val subscriptionKey = "[Placeholder: Your Anomaly Detector resource access key]"
- val endpoint = "[Placeholder: Your Anomaly Detector resource endpoint]"
- val latestPointDetectionPath = "/anomalydetector/v1.0/timeseries/last/detect"
-  val batchDetectionPath = "/anomalydetector/v1.0/timeseries/entire/detect"
- val latestPointDetectionUrl = new URL(endpoint + latestPointDetectionPath)
- val batchDetectionUrl = new URL(endpoint + batchDetectionPath)
- val gson: Gson = new GsonBuilder().setDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").setPrettyPrinting().create()
-
- def getConnection(path: URL): HttpsURLConnection = {
- val connection = path.openConnection().asInstanceOf[HttpsURLConnection]
- connection.setRequestMethod("POST")
- connection.setRequestProperty("Content-Type", "text/json")
- connection.setRequestProperty("Ocp-Apim-Subscription-Key", subscriptionKey)
- connection.setDoOutput(true)
- return connection
- }
-
- // Handles the call to Cognitive Services API.
- def processUsingApi(request: String, path: URL): String = {
- println(request)
- val encoded_text = request.getBytes("UTF-8")
- val connection = getConnection(path)
- val wr = new DataOutputStream(connection.getOutputStream())
- wr.write(encoded_text, 0, encoded_text.length)
- wr.flush()
- wr.close()
-
- val response = new StringBuilder()
- val in = new BufferedReader(new InputStreamReader(connection.getInputStream()))
- var line = in.readLine()
- while (line != null) {
- response.append(line)
- line = in.readLine()
- }
- in.close()
- return response.toString()
- }
-
- // Calls the Latest Point Detection API.
- def detectLatestPoint(series: Series): Option[AnomalySingleResponse] = {
- try {
- println("Process Timestamp: " + series.series.apply(series.series.length-1).timestamp.toString + ", size: " + series.series.length)
- val response = processUsingApi(gson.toJson(series), latestPointDetectionUrl)
- println(response)
- // Deserializing the JSON response from the API into Scala types
- val anomaly = gson.fromJson(response, classOf[AnomalySingleResponse])
- Thread.sleep(5000)
- return Some(anomaly)
- } catch {
- case e: Exception => {
- println(e)
- e.printStackTrace()
- return None
- }
- }
- }
-
- // Calls the Batch Detection API.
- def detectBatch(series: Series): Option[AnomalyBatchResponse] = {
- try {
- val response = processUsingApi(gson.toJson(series), batchDetectionUrl)
- println(response)
- // Deserializing the JSON response from the API into Scala types
- val anomaly = gson.fromJson(response, classOf[AnomalyBatchResponse])
- Thread.sleep(5000)
- return Some(anomaly)
- } catch {
- case e: Exception => {
- println(e)
- return None
- }
- }
- }
-}
-```
-
-To run the notebook, press **SHIFT + ENTER**. You see output like the following snippet.
-
-```scala
-import java.io.{BufferedReader, DataOutputStream, InputStreamReader}
-import java.net.URL
-import java.sql.Timestamp
-import com.google.gson.{Gson, GsonBuilder, JsonParser}
-import javax.net.ssl.HttpsURLConnection
-defined class Point
-defined class Series
-defined class AnomalySingleResponse
-defined class AnomalyBatchResponse
-defined object AnomalyDetector
-```
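For reference, the `Series` case class above is serialized by Gson into the request body that the detection endpoints receive. This standard-library sketch (with a hypothetical `Pt` helper class, introduced only for illustration) builds an equivalent payload by hand:

```scala
// Hypothetical helper mirroring the Point case class above, with the
// timestamp already formatted as an ISO-8601 string.
case class Pt(timestamp: String, value: Double)

// Build the same JSON shape that gson.toJson(series) produces for a Series.
def seriesJson(points: Seq[Pt], maxAnomalyRatio: Double,
               sensitivity: Int, granularity: String): String = {
  val series = points
    .map(p => s"""{"timestamp":"${p.timestamp}","value":${p.value}}""")
    .mkString("[", ",", "]")
  s"""{"series":$series,"maxAnomalyRatio":$maxAnomalyRatio,"sensitivity":$sensitivity,"granularity":"$granularity"}"""
}

println(seriesJson(Seq(Pt("2019-04-16T00:00:00Z", 24.0)), 0.25, 95, "hourly"))
```

Seeing the payload spelled out this way makes it easier to debug the `println(request)` output from `processUsingApi`.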
-
-Then prepare an aggregation function for later use.
-```scala
-//
-// User Defined Aggregation Function for Anomaly Detection
-//
-
-import org.apache.spark.sql.Row
-import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
-import org.apache.spark.sql.types.{StructType, TimestampType, FloatType, MapType, BooleanType, DataType}
-import scala.collection.immutable.ListMap
-
-class AnomalyDetectorAggregationFunction extends UserDefinedAggregateFunction {
- override def inputSchema: StructType = new StructType().add("timestamp", TimestampType).add("value", FloatType)
-
- override def bufferSchema: StructType = new StructType().add("point", MapType(TimestampType, FloatType))
-
- override def dataType: DataType = BooleanType
-
- override def deterministic: Boolean = false
-
- override def initialize(buffer: MutableAggregationBuffer): Unit = {
- buffer(0) = Map()
- }
-
- override def update(buffer: MutableAggregationBuffer, input: Row): Unit = {
- buffer(0) = buffer.getAs[Map[java.sql.Timestamp, Float]](0) + (input.getTimestamp(0) -> input.getFloat(1))
- }
-
- override def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit = {
- buffer1(0) = buffer1.getAs[Map[java.sql.Timestamp, Float]](0) ++ buffer2.getAs[Map[java.sql.Timestamp, Float]](0)
- }
-
- override def evaluate(buffer: Row): Any = {
- val points = buffer.getAs[Map[java.sql.Timestamp, Float]](0)
- if (points.size > 12) {
- val sorted_points = ListMap(points.toSeq.sortBy(_._1.getTime):_*)
- var detect_points: List[Point] = List()
- sorted_points.keys.foreach {
- key => detect_points = detect_points :+ new Point(key, sorted_points(key))
- }
-
- // 0.25 is maxAnomalyRatio. It represents 25%, max anomaly ratio in a time series.
- // 95 is the sensitivity of the algorithms.
- // Check Anomaly detector API reference (https://aka.ms/anomaly-detector-rest-api-ref)
-
- val series: Series = new Series(detect_points.toArray, 0.25, 95, "hourly")
- val response: Option[AnomalySingleResponse] = AnomalyDetector.detectLatestPoint(series)
- if (!response.isEmpty) {
- return response.get.isAnomaly
- }
- }
-
-    return null // Not enough points yet to call the API; no anomaly result
- }
-}
-
-```
-
-To run the notebook, press **SHIFT + ENTER**. You see output like the following snippet.
-
-```scala
-import org.apache.spark.sql.Row
-import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
-import org.apache.spark.sql.types.{StructType, TimestampType, FloatType, MapType, BooleanType, DataType}
-import scala.collection.immutable.ListMap
-defined class AnomalyDetectorAggregationFunction
-```
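The aggregation function above buffers points in a map, merges partial buffers, sorts the points by timestamp, and only calls the API once more than 12 points are available. That buffer logic can be sketched in isolation, outside Spark (a simplified illustration, not the UDAF itself):

```scala
import java.sql.Timestamp
import scala.collection.immutable.ListMap

// merge() in the UDAF combines two partial buffers with ++.
def mergeBuffers(a: Map[Timestamp, Float], b: Map[Timestamp, Float]): Map[Timestamp, Float] =
  a ++ b

// evaluate() sorts the buffered points chronologically before building the Series.
def sortedPoints(points: Map[Timestamp, Float]): Seq[(Timestamp, Float)] =
  ListMap(points.toSeq.sortBy(_._1.getTime): _*).toSeq

val buf = mergeBuffers(
  Map(new Timestamp(2000L) -> 2f),
  Map(new Timestamp(1000L) -> 1f))

val ready = buf.size > 12 // false: the API is only called with enough history
println(sortedPoints(buf).map(_._2))
```

The `points.size > 12` guard matters because the Anomaly Detector API requires a minimum number of points per series before it returns a result.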
-
-Then load data from the event hub for anomaly detection. Replace the placeholders with values for the Azure event hub that you created earlier.
-
-```scala
-//
-// Load Data from Eventhub
-//
-
-import org.apache.spark.eventhubs._
-import org.apache.spark.sql.types._
-import org.apache.spark.sql.functions._
-
-val connectionString = ConnectionStringBuilder("[Placeholder: EventHub namespace connection string]")
- .setEventHubName("[Placeholder: EventHub name]")
- .build
-
-val customEventhubParameters =
- EventHubsConf(connectionString)
- .setConsumerGroup("$Default")
- .setMaxEventsPerTrigger(100)
-
-val incomingStream = spark.readStream.format("eventhubs").options(customEventhubParameters.toMap).load()
-
-val messages =
- incomingStream
- .withColumn("enqueuedTime", $"enqueuedTime".cast(TimestampType))
- .withColumn("body", $"body".cast(StringType))
- .select("enqueuedTime", "body")
-
-val bodySchema = new StructType().add("timestamp", TimestampType).add("favorite", IntegerType)
-
-val msgStream = messages.select(from_json('body, bodySchema) as 'fields).select("fields.*")
-
-msgStream.printSchema
-
-display(msgStream)
-
-```
-
-The output now resembles the following image. The dates in your table might differ from the dates in this tutorial because the data is real time.
-![Load Data From Event hub](../media/tutorials/load-data-from-eventhub.png "Load Data From Event Hub")
-
-You have now streamed data from Azure Event Hubs into Azure Databricks at near real time using the Event Hubs connector for Apache Spark. For more information on how to use the Event Hubs connector for Spark, see the [connector documentation](https://github.com/Azure/azure-event-hubs-spark/tree/master/docs).
-
-## Run anomaly detection on tweets
-
-In this section, you run anomaly detection on the tweets received using the Anomaly detector API. For this section, you add the code snippets to the same **AnalyzeTweetsFromEventHub** notebook.
-
-To do anomaly detection, you first need to aggregate your metric count by hour.
-```scala
-//
-// Aggregate Metric Count by Hour
-//
-
-// If you want to change granularity, change the groupBy window.
-val groupStream = msgStream.groupBy(window($"timestamp", "1 hour"))
- .agg(avg("favorite").alias("average"))
- .withColumn("groupTime", $"window.start")
- .select("groupTime", "average")
-
-groupStream.printSchema
-
-display(groupStream)
-```
-The output now resembles the following snippet.
-```
-groupTime average
-2019-04-23T04:00:00.000+0000 24
-2019-04-26T19:00:00.000+0000 47.888888888888886
-2019-04-25T12:00:00.000+0000 32.25
-2019-04-26T09:00:00.000+0000 63.4
-...
-...
-
-```
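Conceptually, the `groupBy(window($"timestamp", "1 hour"))` aggregation above buckets each tweet by the start of its hour and averages the `favorite` values within each bucket. A plain-Scala sketch of that aggregation, independent of Spark and for illustration only:

```scala
import java.time.Instant
import java.time.temporal.ChronoUnit

// The window start is the timestamp truncated to the hour.
def hourStart(t: Instant): Instant = t.truncatedTo(ChronoUnit.HOURS)

// Bucket (timestamp, favorite) events by hour and average each bucket,
// mirroring groupBy(window(...)).agg(avg("favorite")).
def hourlyAverage(events: Seq[(Instant, Int)]): Map[Instant, Double] =
  events.groupBy { case (t, _) => hourStart(t) }
        .map { case (h, evts) => h -> evts.map(_._2).sum.toDouble / evts.size }

val avg = hourlyAverage(Seq(
  (Instant.parse("2019-04-23T04:10:00Z"), 20),
  (Instant.parse("2019-04-23T04:50:00Z"), 28)))
println(avg) // one bucket at 04:00 with average 24.0
```

Changing the window duration in the Spark code corresponds to truncating to a different unit here, which is why the comment above notes that granularity is controlled by the `groupBy` window.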
-
-Then write the aggregated output to Delta. Because anomaly detection requires a longer history window, we're using Delta to keep the history data for the point you want to detect.
-Replace the "[Placeholder: table name]" with a qualified Delta table name to be created (for example, "tweets"). Replace "[Placeholder: folder name for checkpoints]" with a string value that's unique each time you run this code (for example, "etl-from-eventhub-20190605").
-To learn more about Delta Lake on Azure Databricks, see the [Delta Lake Guide](/databricks/delta/).
-
-```scala
-//
-// Output Aggregation Result to Delta
-//
-
-groupStream.writeStream
- .format("delta")
- .outputMode("complete")
- .option("checkpointLocation", "/delta/[Placeholder: table name]/_checkpoints/[Placeholder: folder name for checkpoints]")
- .table("[Placeholder: table name]")
-
-```
-
-Replace the "[Placeholder: table name]" with the same Delta table name you've selected above.
-```scala
-//
-// Show Aggregate Result
-//
-
-val twitterCount = spark.sql("SELECT COUNT(*) FROM [Placeholder: table name]")
-twitterCount.show()
-
-val twitterData = spark.sql("SELECT * FROM [Placeholder: table name] ORDER BY groupTime")
-twitterData.show(200, false)
-
-display(twitterData)
-```
-The output resembles the following:
-```
-groupTime average
-2019-04-08T01:00:00.000+0000 25.6
-2019-04-08T02:00:00.000+0000 6857
-2019-04-08T03:00:00.000+0000 71
-2019-04-08T04:00:00.000+0000 55.111111111111114
-2019-04-08T05:00:00.000+0000 2203.8
-...
-...
-
-```
-
-Now the aggregated time series data is continuously ingested into the Delta table. You can then schedule an hourly job to detect an anomaly in the latest point.
-Replace the "[Placeholder: table name]" with the same Delta table name you've selected above.
-
-```scala
-//
-// Anomaly Detection
-//
-
-import java.time.Instant
-import java.time.format.DateTimeFormatter
-import java.time.ZoneOffset
-import java.time.temporal.ChronoUnit
-
-val detectData = spark.read.format("delta").table("[Placeholder: table name]")
-
-// You could use Databricks to schedule an hourly job and always monitor the latest data point
-// Or you could specify a const value here for testing purpose
-// For example, val endTime = Instant.parse("2019-04-16T00:00:00Z")
-val endTime = Instant.now()
-
-// This is when your input of anomaly detection starts. It is hourly time series in this tutorial, so 72 means 72 hours ago from endTime.
-val batchSize = 72
-val startTime = endTime.minus(batchSize, ChronoUnit.HOURS)
-
-val DATE_TIME_FORMATTER = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss").withZone(ZoneOffset.UTC)
-
-val series = detectData.filter($"groupTime" <= DATE_TIME_FORMATTER.format(endTime))
- .filter($"groupTime" > DATE_TIME_FORMATTER.format(startTime))
- .sort($"groupTime")
-
-series.createOrReplaceTempView("series")
-
-//series.show()
-
-// Register the function to access it
-spark.udf.register("anomalydetect", new AnomalyDetectorAggregationFunction)
-
-val adResult = spark.sql("SELECT '" + endTime.toString + "' as datetime, anomalydetect(groupTime, average) as anomaly FROM series")
-adResult.show()
-```
-The result resembles the following:
-
-```
-+--------------------+-------+
-|            datetime|anomaly|
-+--------------------+-------+
-|2019-04-16T00:00:00Z|  false|
-+--------------------+-------+
-```
-
-That's it! Using Azure Databricks, you have successfully streamed data into Azure Event Hubs, consumed the stream data using the Event Hubs connector, and then run anomaly detection on the streaming data in near real time.
-Although the granularity in this tutorial is hourly, you can change the granularity to meet your needs.
-
-## Clean up resources
-
-After you have finished running the tutorial, you can terminate the cluster. To do so, in the Azure Databricks workspace, select **Clusters** from the left pane. For the cluster you want to terminate, move the cursor over the ellipsis under the **Actions** column, select the **Terminate** icon, and then select **Confirm**.
-
-![Stop a Databricks cluster](../media/tutorials/terminate-databricks-cluster.png "Stop a Databricks cluster")
-
-If you don't manually terminate the cluster, it stops automatically, provided you selected the **Terminate after \_\_ minutes of inactivity** checkbox when you created the cluster. In that case, the cluster stops after it has been inactive for the specified time.
-
-## Next steps
-
-In this tutorial, you learned how to use Azure Databricks to stream data into Azure Event Hubs and then read the streaming data from Event Hubs in real time. Advance to the next tutorial to learn how to call the Anomaly Detector API and visualize anomalies using Power BI desktop.
-
-> [!div class="nextstepaction"]
->[Batch anomaly detection with Power BI desktop](batch-anomaly-detection-powerbi.md)
cognitive-services Howtocallvisionapi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Computer-vision/Vision-API-How-to-Topics/HowToCallVisionAPI.md
These errors are identical to those in vision.analyze, with the additional NotSu
## Next steps
-To use the REST API, go to [Computer Vision API Reference](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-1-ga/operations/56f91f2e778daf14a499f21b).
+To use the REST API, go to [Computer Vision API Reference](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-3/operations/56f91f2e778daf14a499f21b).
cognitive-services Concept Recognizing Text https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Computer-vision/concept-recognizing-text.md
See the following example of a successful JSON response:
```

## Natural reading order output (Latin only)
-With the [Read 3.2 preview API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-2/operations/5d986960601faab4bf452005), specify the order in which the text lines are output with the `readingOrder` query parameter. Use `natural` for a more human-friendly reading order output as shown in the following example. This feature is only supported for Latin languages.
+With the [Read 3.2 preview API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-3/operations/5d986960601faab4bf452005), specify the order in which the text lines are output with the `readingOrder` query parameter. Use `natural` for a more human-friendly reading order output as shown in the following example. This feature is only supported for Latin languages.
:::image border type="content" source="./Images/ocr-reading-order-example.png" alt-text="OCR Reading order example":::

## Handwritten classification for text lines (Latin only)
-The [Read 3.2 preview API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-2/operations/5d986960601faab4bf452005) response includes classifying whether each text line is of handwriting style or not, along with a confidence score. This feature is only supported for Latin languages. The following example shows the handwritten classification for the text in the image.
+The [Read 3.2 preview API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-3/operations/5d986960601faab4bf452005) response includes classifying whether each text line is of handwriting style or not, along with a confidence score. This feature is only supported for Latin languages. The following example shows the handwritten classification for the text in the image.
:::image border type="content" source="./Images/ocr-handwriting-classification.png" alt-text="OCR handwriting classification example":::

## Select page(s) or page ranges for text extraction
-With the [Read 3.2 preview API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-2/operations/5d986960601faab4bf452005), for large multi-page documents, use the `pages` query parameter to specify page numbers or page ranges to extract text from only those pages. The following example shows a document with 10 pages, with text extracted for both cases - all pages (1-10) and selected pages (3-6).
+With the [Read 3.2 preview API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-3/operations/5d986960601faab4bf452005), for large multi-page documents, use the `pages` query parameter to specify page numbers or page ranges to extract text from only those pages. The following example shows a document with 10 pages, with text extracted for both cases - all pages (1-10) and selected pages (3-6).
:::image border type="content" source="./Images/ocr-select-pages.png" alt-text="Selected pages output":::
The [OCR API](https://westcentralus.dev.cognitive.microsoft.com/docs/services/co
- Get started with the [Computer Vision REST API or client library quickstarts](./quickstarts-sdk/client-library.md).
- Learn about the [Read 3.1 REST API](https://westcentralus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-1-ga/operations/5d986960601faab4bf452005).
-- Learn about the [Read 3.2 public preview REST API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-2/operations/5d986960601faab4bf452005) with support for a total of 73 languages.
+- Learn about the [Read 3.2 public preview REST API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-3/operations/5d986960601faab4bf452005) with support for a total of 73 languages.
cognitive-services Spatial Analysis Camera Placement https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Computer-vision/spatial-analysis-camera-placement.md
Use the table below to determine the camera's distance from the focal point base
| Camera height | Camera-to-focal-point distance (min/max) |
| - | - |
-| 8' | 10'-13' |
-| 10' | 7'-13' |
-| 12' | 10'-17' |
-| 14' | 11'-18' |
-| 16' | 12'-22' |
-| 20' | 15'-30' |
+| 8' | 4.6'-8' |
+| 10' | 5.8'-10' |
+| 12' | 7'-12' |
+| 14' | 8'-14' |
+| 16' | 9.2'-16' |
+| 20' | 11.5'-20' |
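The updated distances appear consistent with mounting the camera at roughly 30° to 45° from vertical relative to the focal point, so that min ≈ height × tan(30°) and max ≈ height × tan(45°) = height. This relationship is an inference from the table values, not a rule stated in the article; a quick sketch to check it:

```scala
import scala.math.{tan, toRadians}

// Hypothesized relationship (inferred, not documented): the camera sits
// between 30° and 45° off vertical with respect to the focal point.
def minDistance(heightFt: Double): Double = heightFt * tan(toRadians(30)) // ≈ 0.577 × height
def maxDistance(heightFt: Double): Double = heightFt * tan(toRadians(45)) // ≈ height

println(f"8' height: ${minDistance(8)}%.1f'-${maxDistance(8)}%.1f'")   // matches the 4.6'-8' row
println(f"20' height: ${minDistance(20)}%.1f'-${maxDistance(20)}%.1f'") // matches the 11.5'-20' row
```

Under that reading, each row's maximum distance simply equals the camera height.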
The following illustration simulates camera views from the closest and farthest camera-to-focal-point distances.
This section describes acceptable camera angle mounting ranges. These mounting r
### Line configuration
-The following table shows recommendations for cameras configured for the **cognitiveservices.vision.spatialanalysis-personcrossingline** operation. For Face mask detection, +/-30 degrees is the optimal camera mounting angle for camera heights between 8' and 12'.
+For the **cognitiveservices.vision.spatialanalysis-personcrossingline** operation, +/-5° is the optimal camera mounting angle to maximize accuracy.
-| Camera height | Camera-to-focal-point distance | Optimal camera mounting angle (min/max) |
-| - | | |
-| 8' | 9' | +/-40° |
-| 10' | 10' | +/-30° |
-| 12' | 13' | +/-20° |
-| 16' | 18' | +/-10° |
-| 20' | 22' | +/-10° |
+For Face mask detection, +/-30 degrees is the optimal camera mounting angle for camera heights between 8' and 12'.
The following illustration simulates camera views using the leftmost (-) and rightmost (+) mounting angle recommendations for using **cognitiveservices.vision.spatialanalysis-personcrossingline** to do entrance counting in a door way.
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Computer-vision/whats-new.md
Learn what's new in the service. These items may be release notes, videos, blog posts, and other types of information. Bookmark this page to stay up to date with the service.
+## March 2021
+
+### Computer Vision 3.2 Public Preview update
+
+The Computer Vision API v3.2 public preview has been updated. The preview release has all Computer Vision features along with updated Read and Analyze APIs.
+
+> [!div class="nextstepaction"]
+> [See Computer Vision v3.2 public preview 3](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-3/operations/5d986960601faab4bf452005)
+
## February 2021

### Read API v3.2 Public Preview with OCR support for 73 languages
Computer Vision's Read API v3.2 public preview, available as cloud service and D
[Learn more](concept-recognizing-text.md) about the Read API. > [!div class="nextstepaction"]
-> [Use the Read API v3.2 Public Preview](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-2/operations/5d986960601faab4bf452005)
+> [Use the Read API v3.2 Public Preview](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2-preview-3/operations/5d986960601faab4bf452005)
## January 2021
cognitive-services Luis Concept Enterprise https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/LUIS/luis-concept-enterprise.md
If your app is meant to predict a wide variety of user utterances, consider impl
Schedule a periodic [review of endpoint utterances](luis-how-to-review-endpoint-utterances.md) for active learning, such as every two weeks, then retrain and republish.

## When you need to have more than 500 intents
-Assume you're developing an office assistant that has over 500 intents. If 200 intents relate to scheduling meetings, 200 are about reminders, 200 are about getting information about colleagues, and 200 are for sending email, group intents so that each group is in a single app, then create a top-level app containing each intent. Use the [dispatch model](#dispatch-tool-and-model) to build the top-level app. Then change your bot to use the cascading call as shown in the [dispatch model's tutorial](/azure/bot-service/bot-builder-tutorial-dispatch?branch=master&tabs=cs&view=azure-bot-service-4.0).
+Assume you're developing an office assistant that has over 500 intents. If 200 intents relate to scheduling meetings, 200 are about reminders, 200 are about getting information about colleagues, and 200 are for sending email, group intents so that each group is in a single app, then create a top-level app containing each intent. Use the [dispatch model](#dispatch-tool-and-model) to build the top-level app. Then change your bot to use the cascading call as shown in the [dispatch model's tutorial](/azure/bot-service/bot-builder-tutorial-dispatch?tabs=cs&view=azure-bot-service-4.0).
## When you need to combine several LUIS and QnA maker apps
-If you have several LUIS and QnA maker apps that need to respond to a bot, use the [dispatch model](#dispatch-tool-and-model) to build the top-level app. Then change your bot to use the cascading call as shown in the [dispatch model's tutorial](/azure/bot-service/bot-builder-tutorial-dispatch?branch=master&tabs=cs&view=azure-bot-service-4.0).
+If you have several LUIS and QnA maker apps that need to respond to a bot, use the [dispatch model](#dispatch-tool-and-model) to build the top-level app. Then change your bot to use the cascading call as shown in the [dispatch model's tutorial](/azure/bot-service/bot-builder-tutorial-dispatch?tabs=cs&view=azure-bot-service-4.0).
## Dispatch tool and model Use the [Dispatch][dispatch-tool] command-line tool, found in [BotBuilder-tools](https://github.com/Microsoft/botbuilder-tools) to combine multiple LUIS and/or QnA Maker apps into a parent LUIS app. This approach allows you to have a parent domain including all subjects and different child subject domains in separate apps.
The parent domain is noted in LUIS with a version named `Dispatch` in the apps l
The chat bot receives the utterance, then sends it to the parent LUIS app for prediction. The top predicted intent from the parent app determines which LUIS child app is called next. The chat bot sends the utterance to the child app for a more specific prediction.
-Understand how this hierarchy of calls is made from the Bot Builder v4 [dispatcher-application-tutorial](/azure/bot-service/bot-builder-tutorial-dispatch?branch=master&tabs=cs&view=azure-bot-service-4.0).
+Understand how this hierarchy of calls is made from the Bot Builder v4 [dispatcher-application-tutorial](/azure/bot-service/bot-builder-tutorial-dispatch?tabs=cs&view=azure-bot-service-4.0).
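The cascading call described above can be sketched as follows. This is a hedged illustration of the dispatch pattern only: `predict`, the app IDs, and the intent names are hypothetical stand-ins for actual LUIS prediction calls, not part of the Dispatch tool itself.

```python
def route_utterance(utterance, predict, child_apps):
    """Dispatch pattern: ask the parent app for its top intent, then forward
    the same utterance to the matching child app for a finer prediction.

    `predict(app_id, utterance)` stands in for a LUIS prediction call and
    must return a dict containing at least a "topIntent" key.
    `child_apps` maps parent-app intents to child app IDs.
    """
    parent_result = predict("parent-dispatch-app", utterance)
    child_app_id = child_apps.get(parent_result["topIntent"])
    if child_app_id is None:
        # No child app for this intent (e.g. the "None" intent):
        # stop at the parent prediction.
        return parent_result
    # Cascade: send the same utterance on to the child app.
    return predict(child_app_id, utterance)
```

In a bot, the two `predict` calls would be HTTP requests to the parent and child LUIS prediction endpoints, as shown in the dispatch model's tutorial.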
### Intent limits in dispatch model A dispatch application has 500 dispatch sources, equivalent to 500 intents, as the maximum.
A dispatch application has 500 dispatch sources, equivalent to 500 intents, as t
## More information * [Bot framework SDK](https://github.com/Microsoft/botframework)
-* [Dispatch model tutorial](/azure/bot-service/bot-builder-tutorial-dispatch?branch=master&tabs=cs&view=azure-bot-service-4.0)
+* [Dispatch model tutorial](/azure/bot-service/bot-builder-tutorial-dispatch?tabs=cs&view=azure-bot-service-4.0)
* [Dispatch CLI](https://github.com/Microsoft/botbuilder-tools) * Dispatch model bot sample - [.NET](https://github.com/microsoft/BotBuilder-Samples/tree/master/samples/csharp_dotnetcore/14.nlp-with-dispatch), [Node.js](https://github.com/microsoft/BotBuilder-Samples/tree/master/samples/javascript_nodejs/14.nlp-with-dispatch)
A dispatch application has 500 dispatch sources, equivalent to 500 intents, as t
* Learn how to [test a batch](luis-how-to-batch-test.md)
-[dispatcher-application-tutorial]: /azure/bot-service/bot-builder-tutorial-dispatch?branch=master
+[dispatcher-application-tutorial]: /azure/bot-service/bot-builder-tutorial-dispatch
[dispatch-tool]: https://aka.ms/dispatch-tool
cognitive-services Rest Text To Speech https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/rest-text-to-speech.md
audio-16khz-64kbitrate-mono-mp3 audio-16khz-32kbitrate-mono-mp3
raw-24khz-16bit-mono-pcm riff-24khz-16bit-mono-pcm audio-24khz-160kbitrate-mono-mp3 audio-24khz-96kbitrate-mono-mp3 audio-24khz-48kbitrate-mono-mp3 ogg-24khz-16bit-mono-opus
+raw-48khz-16bit-mono-pcm riff-48khz-16bit-mono-pcm
+audio-48khz-96kbitrate-mono-mp3 audio-48khz-192kbitrate-mono-mp3
``` > [!NOTE]
cognitive-services Speech Services Quotas And Limits https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/speech-services-quotas-and-limits.md
Previously updated : 12/07/2020 Last updated : 02/24/2021
In the tables below Parameters without "Adjustable" row are **not** adjustable f
| Quota | Free (F0)<sup>1</sup> | Standard (S0) |
|--|--|--|
-| **Concurrent Request limit (Base and Custom models)** | 1 | 20 (default value) |
+| **Concurrent Request limit - Base model** | 1 | 100 (default value) |
+| Adjustable | No<sup>2</sup> | Yes<sup>2</sup> |
+| **Concurrent Request limit - Custom model** | 1 | 20 (default value) |
| Adjustable | No<sup>2</sup> | Yes<sup>2</sup> |

#### Batch Transcription
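Not part of the quota table itself, but a common client-side companion to it: when requests exceed the concurrent-request limit, the service throttles them (HTTP 429), and clients typically retry with exponential backoff. A hedged sketch, where `send_request` is a hypothetical stand-in for an actual Speech API call:

```python
import time

def call_with_backoff(send_request, max_retries=5, base_delay=1.0):
    """Retry a callable that returns (status_code, body), backing off
    exponentially on HTTP 429 (request throttled by a concurrency quota)."""
    for attempt in range(max_retries):
        status, body = send_request()
        if status != 429:
            return status, body
        # Wait base_delay, 2x, 4x, ... before retrying a throttled request.
        time.sleep(base_delay * (2 ** attempt))
    return status, body
```

Requesting a quota increase (where the table marks the limit as adjustable) is still the right fix for sustained load; backoff only smooths short bursts.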
cognitive-services Create Sas Tokens https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/document-translation/create-sas-tokens.md
Previously updated : 02/11/2021 Last updated : 03/05/2021
-# Create SAS tokens for Document Translation
+# Create SAS tokens for Document Translation processing
In this article, you'll learn how to create shared access signature (SAS) tokens using the Azure Storage Explorer or the Azure portal. An SAS token provides secure, delegated access to resources in your Azure storage account.
-## Create SAS tokens with Azure Storage Explorer
+## Create your SAS tokens with Azure Storage Explorer
### Prerequisites
In this article, you'll learn how to create shared access signature (SAS) tokens
## Create SAS tokens for blobs in the Azure portal > [!NOTE]
-> Creating SAS tokens for containers directly in the Azure portal is currently not supported. However, you can create an SAS token with [**Azure Storage Explorer**](#create-sas-tokens-with-azure-storage-explorer) or complete the task [programmatically](../../../storage/blobs/sas-service-create.md).
+> Creating SAS tokens for containers directly in the Azure portal is currently not supported. However, you can create an SAS token with [**Azure Storage Explorer**](#create-your-sas-tokens-with-azure-storage-explorer) or complete the task [programmatically](../../../storage/blobs/sas-service-create.md).
<!-- markdownlint-disable MD024 --> ### Prerequisites
To get started, you'll need:
* An active [**Azure account**](https://azure.microsoft.com/free/cognitive-services/). If you don't have one, you can [**create a free account**](https://azure.microsoft.com/free/). * A [**Translator**](https://ms.portal.azure.com/#create/Microsoft) service resource (**not** a Cognitive Services multi-service resource). *See* [Create a new Azure resource](../../cognitive-services-apis-create-account.md#create-a-new-azure-cognitive-services-resource).
-* An [**Azure blob storage account**](https://ms.portal.azure.com/#create/Microsoft.StorageAccount-ARM). All access to Azure Storage takes place through a storage account.
+* An [**Azure blob storage account**](https://ms.portal.azure.com/#create/Microsoft.StorageAccount-ARM). You will create containers to store and organize your blob data within your storage account.
### Create your tokens
cognitive-services Get Started With Document Translation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Translator/document-translation/get-started-with-document-translation.md
Previously updated : 02/11/2021 Last updated : 03/05/2021 # Get started with Document Translation (Preview)
Last updated 02/11/2021
## Prerequisites
+> [!NOTE]
+> Generally, when you create a Cognitive Service resource in the Azure portal, you have the option to create a multi-service subscription key or a single-service subscription key. However, Document Translation is currently supported in the Translator (single-service) resource only, and is **not** included in the Cognitive Services (multi-service) resource.
+ To get started, you'll need: * An active [**Azure account**](https://azure.microsoft.com/free/cognitive-services/). If you don't have one, you can [**create a free account**](https://azure.microsoft.com/free/).
-* A [**Translator**](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesTextTranslation) service resource (**not** a Cognitive Services resource).
+* A [**Translator**](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesTextTranslation) service resource (**not** a Cognitive Services resource).
-* An [**Azure blob storage account**](https://ms.portal.azure.com/#create/Microsoft.StorageAccount-ARM). All access to Azure Storage takes place through a storage account.
+* An [**Azure blob storage account**](https://ms.portal.azure.com/#create/Microsoft.StorageAccount-ARM). You will create containers to store and organize your blob data within your storage account.
* A completed [**Document Translation (Preview) form**](https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR-riVR3Xj0tOnIRdZOALbM9UOEE4UVdFQVBRQVBWWDBRQUM3WjYxUEpUTC4u) to enable your Azure subscription to use the new Document Translation feature.
-> [!NOTE]
-> Document Translation is currently only supported in the Translator (single-service) resource, **not** the Cognitive Services (multi-service) resource.
- ## Get your custom domain name and subscription key > [!IMPORTANT] >
-> * You can't use the endpoint found on your Azure portal resource _Keys and Endpoint_ page nor the global translator endpoint (`api.cognitive.microsofttranslator.com`) to make HTTP requests to Document Translation.
+> * You won't use the endpoint found on your Azure portal resource _Keys and Endpoint_ page nor the global translator endpoint (`api.cognitive.microsofttranslator.com`) to make HTTP requests to Document Translation.
> * **All API requests to the Document Translation service require a custom domain endpoint**.
-### What is the custom domain endpoint?
+### What is the custom domain endpoint?
The custom domain endpoint is a URL formatted with your resource name, hostname, and Translator subdirectories:
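As a sketch derived from the `Operation-Location` example later in this article (the resource name here is hypothetical), the custom domain endpoint can be assembled like this:

```python
def document_translation_endpoint(resource_name, api_version="v1.0-preview.1"):
    """Build the Document Translation custom domain endpoint: resource name,
    cognitiveservices hostname, and the Translator batch subdirectories."""
    return (f"https://{resource_name}.cognitiveservices.azure.com"
            f"/translator/text/batch/{api_version}")
```

All Document Translation API requests are sent to this endpoint, not to the global translator endpoint.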
You'll need to [**create containers**](../../../storage/blobs/storage-quickstar
* **Target container**. This container is where your translated files will be stored (required). * **Glossary container**. This container is where you upload your glossary files (optional).
-*See* **Create SAS access tokens for Document Translation**
+### **Create SAS access tokens for Document Translation**
-The `sourceUrl` , `targetUrl` , and optional `glossaryUrl` must include a Shared Access Signature (SAS) token, appended as a query string. The token can be assigned to your container or specific blobs.
+The `sourceUrl`, `targetUrl`, and optional `glossaryUrl` must include a Shared Access Signature (SAS) token, appended as a query string. The token can be assigned to your container or specific blobs. *See* [**Create SAS tokens for Document Translation processing**](create-sas-tokens.md).
* Your **source** container or blob must have designated **read** and **list** access. * Your **target** container or blob must have designated **write** and **list** access.
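Appending the SAS token to a container or blob URL is plain string handling. A hedged sketch (the account and container names below are hypothetical):

```python
def with_sas(container_url, sas_token):
    """Append an SAS token to a container or blob URL as a query string.
    Tokens copied from Storage Explorer or the portal may or may not
    include a leading '?', so normalize it first."""
    token = sas_token.lstrip("?")
    separator = "&" if "?" in container_url else "?"
    return f"{container_url}{separator}{token}"
```

The resulting URL is what you place in the `sourceUrl`, `targetUrl`, or `glossaryUrl` fields of the request body.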
The following headers are included with each Document Translator API request:
> [!IMPORTANT] >
-> For the code samples, below, you may need to update the following fields, depending upon the operation:
+> For the code samples below, you'll hard-code your key and endpoint where indicated; remember to remove the key from your code when you're done, and never post it publicly. See [Azure Cognitive Services security](/azure/cognitive-services/cognitive-services-security?tabs=command-line%2Ccsharp) for ways to securely store and access your credentials.
+>
+> You may need to update the following fields, depending upon the operation:
>>> >> * `endpoint` >> * `subscriptionKey`
The following headers are included with each Document Translator API request:
>> * `glossaryURL` >> * `id` (job ID) >>
-> Where to finding the `id` value:
-> * You can find the job `id` in the The POST method's response Header `Operation-Location` URL value. The last parameter of the URL is the operation's job **`id`**.
-> * You can also use a GET Jobs request to retrieve the job `id` for a Document Translation operation.
->
-> For the code samples below, you'll hard-code your key and endpoint where indicated; remember to remove the key from your code when you're done, and never post it publicly.
+
+#### Locating the `id` value
+
+* You'll find the job `id` in the POST method response Header `Operation-Location` URL value. The last parameter of the URL is the operation's job **`id`**:
+
+|**Response header**|**Result URL**|
+|--|-|
+| Operation-Location | `https://<NAME-OF-YOUR-RESOURCE>.cognitiveservices.azure.com/translator/text/batch/v1.0-preview.1/batches/9dce0aa9-78dc-41ba-8cae-2e2f3c2ff8ec` |
+
+* You can also use a **GET Jobs** request to retrieve a Document Translation job `id`.
+ >
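Since the job `id` is the last path segment of the `Operation-Location` URL, extracting it is a one-liner. A minimal sketch, using the example value from the table above:

```python
def job_id_from_operation_location(operation_location):
    # The job id is the final path segment of the Operation-Location URL.
    return operation_location.rstrip("/").rsplit("/", 1)[-1]
```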
-> See [Azure Cognitive Services security](/azure/cognitive-services/cognitive-services-security?tabs=command-line%2Ccsharp) for ways to securely store and access your credentials.
## _POST Document Translation_ request
The table below lists the limits for data that you send to Document Translation.
> [!div class="nextstepaction"] > [Create a customized language system using Custom Translator](../custom-translator/overview.md) >
->
+>
communication-services Authentication https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/authentication.md
Last updated 07/24/2020
-# Authenticate to Azure Communication Services
-
-This article provides information on authenticating clients to Azure Communication Services using *access keys* and *user access tokens*. Every client interaction with Azure Communication Services needs to be authenticated.
+# Authenticate to Azure Communication Services
-The following table describes which authentication options are supported by the Azure Communication Services client libraries:
+Every client interaction with Azure Communication Services needs to be authenticated. In a typical architecture (see [client and server architecture](./client-and-server-architecture.md)), *access keys* or a *managed identity* is used in the trusted user access service to create users and issue tokens. The *user access tokens* issued by the trusted user access service are then used by client applications to access other communication services, for example, the chat or calling service.
-| Client library | Access key | User access tokens |
-| -- | - | |
-| Administration | Supported | Not Supported |
-| SMS | Supported | Not Supported |
-| Chat | Not Supported | Supported |
-| Calling | Not Supported | Supported |
+The Azure Communication Services SMS service also accepts *access keys* or a *managed identity* for authentication. This typically happens in a service application running in a trusted service environment.
Each authorization option is briefly described below:
-
-- **Access Key** authentication for SMS and Administration operations. Access Key authentication is suitable for applications running in a trusted service environment. To authenticate with an access key, a client generates a [hash-based message authentication code (HMAC)](https://en.wikipedia.org/wiki/HMAC) and includes it within the `Authorization` header of each HTTP request. For more information, see [Authenticate with an Access Key](#authenticate-with-an-access-key).
-- **User Access Token** authentication for Chat and Calling. User access tokens let your client applications authenticate directly against Azure Communication Services. These tokens are generated on a server-side token provisioning service that you create. They're then provided to client devices that use the token to initialize the Chat and Calling client libraries. For more information, see [Authenticate with a User Access Token](#authenticate-with-a-user-access-token).
-
-## Authenticate with an access key
-
-Access key authentication uses a shared secret key to generate an HMAC for each HTTP request computed by using the SHA256 algorithm, and sends it in the `Authorization` header using the `HMAC-SHA256` scheme.
-
-```
-Authorization: "HMAC-SHA256 SignedHeaders=date;host;x-ms-content-sha256&Signature=<hmac-sha256-signature>"
-```
-
-The Azure Communication Services client libraries that use access key authentication should be initialized with your resource's connection string. If you're not using a client library, you can programmatically generate HMACs using your resource's access key. To learn more about connection strings, visit the [resource provisioning quickstart](../quickstarts/create-communication-resource.md).
-
-### Sign an HTTP request
-
-If you're not using a client library to make HTTP requests to the Azure Communication Services REST APIs, you'll need to programmatically create HMACs for each HTTP request. The following steps describe how to construct the Authorization header:
-
-1. Specify the Coordinated Universal Time (UTC) timestamp for the request in either the `x-ms-date` header, or in the standard HTTP `Date` header. The service validates this to guard against certain security attacks, including replay attacks.
-1. Hash the HTTP request body using the SHA256 algorithm then pass it, with the request, via the `x-ms-content-sha256` header.
-1. Construct the string to be signed by concatenating the HTTP Verb (e.g. `GET` or `PUT`), HTTP request path, and values of the `Date`, `Host` and `x-ms-content-sha256` HTTP headers in the following format:
- ```
- VERB + "\n"
- URLPathAndQuery + "\n"
- DateHeaderValue + ";" + HostHeaderValue + ";" + ContentHashHeaderValue
- ```
-1. Generate an HMAC-256 signature of the UTF-8 encoded string that you created in the previous step. Next, encode your results as Base64. Note that you also need to Base64-decode your access key. Use the following format (shown as pseudo code):
- ```
- Signature=Base64(HMAC-SHA256(UTF8(StringToSign), Base64.decode(<your_access_key>)))
- ```
-1. Specify the Authorization header as follows:
- ```
- Authorization="HMAC-SHA256 SignedHeaders=date;host;x-ms-content-sha256&Signature=<hmac-sha256-signature>"
- ```
- Where `<hmac-sha256-signature>` is the HMAC computed in the previous step.
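The five signing steps above can be sketched in Python. This is a hedged illustration of the algorithm as described, not an official client library:

```python
import base64
import hashlib
import hmac
from email.utils import formatdate

def sign_request(verb, url_path_and_query, host, body, access_key):
    """Build the x-ms-date, x-ms-content-sha256, and Authorization header
    values for an HMAC-SHA256 signed request, following the steps above."""
    # Step 1: UTC timestamp for the request (RFC 1123 format, "GMT").
    date = formatdate(usegmt=True)
    # Step 2: SHA256 hash of the request body, Base64-encoded.
    content_hash = base64.b64encode(
        hashlib.sha256(body.encode("utf-8")).digest()).decode()
    # Step 3: string to sign — verb, path, then date;host;content-hash.
    string_to_sign = f"{verb}\n{url_path_and_query}\n{date};{host};{content_hash}"
    # Step 4: HMAC-SHA256 over the UTF-8 string, keyed with the
    # Base64-decoded access key; encode the result as Base64.
    signature = base64.b64encode(
        hmac.new(base64.b64decode(access_key),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()).decode()
    # Step 5: assemble the Authorization header value.
    authorization = ("HMAC-SHA256 SignedHeaders=date;host;x-ms-content-sha256"
                     f"&Signature={signature}")
    return date, content_hash, authorization
```

The returned values go into the `x-ms-date` (or `Date`), `x-ms-content-sha256`, and `Authorization` headers of the HTTP request.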
-
-## Authenticate with a user access token
-
-User access tokens let your client applications authenticate directly against Azure Communication Services. To achieve this you should set up a trusted service that authenticates your application users and issues user access tokens with the Administration client library. Visit the [client and server architecture](./client-and-server-architecture.md) conceptual documentation to learn more about our architectural considerations.
-
-The `CommunicationTokenCredential` class contains the logic for providing user access token credentials to the client libraries and managing their lifecycle.
-
-### Initialize the client libraries
-
-To initialize Azure Communication Services client libraries that require user access token authentication, you first create an instance of the `CommunicationTokenCredential` class, and then use it to initialize an API client.
-
-The following snippets show you how to initialize the chat client library with a user access token:
-
-#### [C#](#tab/csharp)
-
-```csharp
-// user access tokens should be created by a trusted service using the Administration client library
-var token = "<valid-user-access-token>";
-
-// create a CommunicationTokenCredential instance
-var userCredential = new CommunicationTokenCredential(token);
-
-// initialize the chat client library with the credential
-var chatClient = new ChatClient(ENDPOINT_URL, userCredential);
-```
-
-#### [JavaScript](#tab/javascript)
-
-```javascript
-// user access tokens should be created by a trusted service using the Administration client library
-const token = "<valid-user-access-token>";
-
-// create a CommunicationTokenCredential instance with the AzureCommunicationTokenCredential class
-const userCredential = new AzureCommunicationTokenCredential(token);
-
-// initialize the chat client library with the credential
-let chatClient = new ChatClient(ENDPOINT_URL, userCredential);
-```
-
-#### [Swift](#tab/swift)
-
-```swift
-// user access tokens should be created by a trusted service using the Administration client library
-let token = "<valid-user-access-token>";
-
-// create a CommunicationTokenCredential instance
-let userCredential = try CommunicationTokenCredential(token: token)
-
-// initialize the chat client library with the credential
-let chatClient = try CommunicationChatClient(credential: userCredential, endpoint: ENDPOINT_URL)
-```
-
-#### [Java](#tab/java)
-
-```java
-// user access tokens should be created by a trusted service using the Administration client library
-String token = "<valid-user-access-token>";
-
-// create a CommunicationTokenCredential instance
-CommunicationTokenCredential userCredential = new CommunicationTokenCredential(token);
-
-// Initialize the chat client
-final ChatClientBuilder builder = new ChatClientBuilder();
-builder.endpoint(ENDPOINT_URL)
- .credential(userCredential)
- .httpClient(HTTP_CLIENT);
-ChatClient chatClient = builder.buildClient();
-```
---
-### Refreshing user access tokens
-
-User access tokens are short-lived credentials that need to be reissued to prevent your users from experiencing service disruptions. The `CommunicationTokenCredential` constructor accepts a refresh callback function that enables you to update user access tokens before they expire. You should use this callback to fetch a new user access token from your trusted service.
-
-#### [C#](#tab/csharp)
-
-```csharp
-var userCredential = new CommunicationTokenCredential(
- initialToken: token,
- refreshProactively: true,
- tokenRefresher: cancellationToken => fetchNewTokenForCurrentUser(cancellationToken)
-);
-```
-
-#### [JavaScript](#tab/javascript)
-
-```javascript
-const userCredential = new AzureCommunicationTokenCredential({
- tokenRefresher: async () => fetchNewTokenForCurrentUser(),
- refreshProactively: true,
- initialToken: token
-});
-```
-
-#### [Swift](#tab/swift)
-
-```swift
- let userCredential = try CommunicationTokenCredential(initialToken: token, refreshProactively: true) { |completionHandler|
- let updatedToken = fetchTokenForCurrentUser()
- completionHandler(updatedToken, nil)
- }
-```
-
-#### [Java](#tab/java)
-
-```java
-TokenRefresher tokenRefresher = new TokenRefresher() {
- @Override
- Future<String> getFetchTokenFuture() {
- return fetchNewTokenForCurrentUser();
- }
-}
-
-CommunicationTokenCredential credential = new CommunicationTokenCredential(tokenRefresher, token, true);
-```
--
-The `refreshProactively` option lets you decide how you'll manage the token lifecycle. By default, when a token is stale, the callback will block API requests and attempt to refresh it. When `refreshProactively` is set to `true` the callback is scheduled and executed asynchronously before the token expires.
+- **Access Key** authentication for SMS and Identity operations. Access Key authentication is suitable for service applications running in a trusted service environment. The access key can be found in the Azure portal page for your Communication Services resource. To authenticate with an access key, a service application uses the access key as the credential to initialize the corresponding SMS or Identity client libraries; see [Create and manage access tokens](../quickstarts/access-tokens.md). Because the access key is part of your resource's connection string (see [Create and manage Communication Services resources](../quickstarts/create-communication-resource.md)), authentication with a connection string is equivalent to authentication with an access key.
+- **Managed Identity** authentication for SMS and Identity operations. A [managed identity](../quickstarts/managed-identity.md) is suitable for service applications running in a trusted service environment. To authenticate with a managed identity, a service application creates a credential with the ID and a secret of the managed identity, then initializes the corresponding SMS or Identity client libraries; see [Create and manage access tokens](../quickstarts/access-tokens.md).
+- **User Access Token** authentication for Chat and Calling. User access tokens let your client applications authenticate against the Azure Communication Services Chat and Calling services. These tokens are generated in a "trusted user access service" that you create. They're then provided to client devices, which use the token to initialize the Chat and Calling client libraries. For an example, see [Add Chat to your App](../quickstarts/chat/get-started.md).
## Next steps > [!div class="nextstepaction"]
+> [Create and manage Communication Services resources](../quickstarts/create-communication-resource.md)
+> [Create an Azure Active Directory managed identity application from the Azure CLI](../quickstarts/managed-identity-from-cli.md)
> [Creating user access tokens](../quickstarts/access-tokens.md) For more information, see the following articles:
communication-services Call Flows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/call-flows.md
The section below gives an overview of the call flows in Azure Communication Ser
## About signaling and media protocols
-When you establish a peer-to-peer or group call, two protocols are used behind the scenes - HTTP (REST) for signaling and SRTP for media.
+When you establish a peer-to-peer or group call, two protocols are used behind the scenes - HTTP (REST) for signaling and SRTP for media.
-Signaling between the client libraries or between client libraries and Communication Services Signaling Controllers is handled with HTTP REST (TLS). For Real-Time Media Traffic (RTP), the User Datagram Protocol (UDP) is preferred. If the use of UDP is prevented by your firewall, the client library will use the Transmission Control Protocol (TCP) for media.
+Signaling between the client libraries or between client libraries and Communication Services Signaling Controllers is handled with HTTP REST (TLS). For Real-Time Media Traffic (RTP), the User Datagram Protocol (UDP) is preferred. If the use of UDP is prevented by your firewall, the client library will use the Transmission Control Protocol (TCP) for media.
-Let's review the signaling and media protocols in various scenarios.
+Let's review the signaling and media protocols in various scenarios.
## Call flow cases
In one-to-one VoIP or video calls, traffic prefers the most direct path. "Direct
### Case 2: VoIP where a direct connection between devices is not possible, but where connection between NAT devices is possible
-If two devices are located in subnets that can't reach each other (for example, Alice works from a coffee shop and Bob works from his home office) but the connection between the NAT devices is possible, the client side client libraries will establish connectivity via NAT devices.
+If two devices are located in subnets that can't reach each other (for example, Alice works from a coffee shop and Bob works from his home office) but the connection between the NAT devices is possible, the client-side client libraries will establish connectivity via the NAT devices.
For Alice it will be the NAT of the coffee shop and for Bob it will be the NAT of the home office. Alice's device will send the external address of her NAT and Bob's will do the same. The client libraries learn the external addresses from a STUN (Session Traversal Utilities for NAT) service that Azure Communication Services provides free of charge. The logic that handles the handshake between Alice and Bob is embedded within the client libraries provided by Azure Communication Services (you don't need any additional configuration).
For Alice it will be the NAT of the coffee shop and for Bob it will be the NAT o
If one or both client devices are behind a symmetric NAT, a separate cloud service to relay the media between the two client libraries is required. This service is called TURN (Traversal Using Relays around NAT) and is also provided by the Communication Services. The Communication Services calling client library automatically uses TURN services based on detected network conditions. Use of Microsoft's TURN service is charged separately. :::image type="content" source="./media/call-flows/about-voice-case-3.png" alt-text="Diagram showing a VOIP call which utilizes a TURN connection.":::
-
+ ### Case 4: Group calls with PSTN Both signaling and media for PSTN Calls use the Azure Communication Services telephony resource. This resource is interconnected with other carriers.
If the client library can't use UDP for media due to firewall restrictions, an a
### Case 5: Communication Services client library and Microsoft Teams in a scheduled Teams meeting
-Signaling flows through the signaling controller. Media flows through the Media Processor. The signaling controller and Media Processor are shared between Communication Services and Microsoft Teams.
+Signaling flows through the signaling controller. Media flows through the Media Processor. The signaling controller and Media Processor are shared between Communication Services and Microsoft Teams.
:::image type="content" source="./media/call-flows/teams-communication-services-meeting.png" alt-text="Diagram showing Communication Services client library and Teams Client in a scheduled Teams meeting.":::
communication-services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/chat/concepts.md
[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)] + Azure Communication Services Chat client libraries can be used to add real-time text chat to your applications. This page summarizes key Chat concepts and capabilities. See the [Communication Services Chat client library Overview](./sdk-features.md) to learn more about specific client library languages and capabilities.
-## Chat overview
+## Chat overview
-Chat conversations happen within chat threads. A chat thread can contain many messages and many users. Every message belongs to a single thread, and a user can be a part of one or many threads.
+Chat conversations happen within chat threads. A chat thread can contain many messages and many users. Every message belongs to a single thread, and a user can be a part of one or many threads.
-Each user in the chat thread is called a member. You can have up to 250 members in a chat thread. Only thread members can send and receive messages or add/remove members in a chat thread. The maximum message size allowed is approximately 28KB. You can retrieve all messages in a chat thread using the `List/Get Messages` operation. Communication Services stores chat history until you execute a delete operation on the chat thread or message, or until no members are remaining in the chat thread at which point it is orphaned and processed for deletion.
+Each user in the chat thread is called a member. You can have up to 250 members in a chat thread. Only thread members can send and receive messages or add/remove members in a chat thread. The maximum message size allowed is approximately 28KB. You can retrieve all messages in a chat thread using the `List/Get Messages` operation. Communication Services stores chat history until you execute a delete operation on the chat thread or message, or until no members are remaining in the chat thread at which point it is orphaned and processed for deletion.
-For chat threads with more than 20 members, read receipts and typing indicator features are disabled.
+For chat threads with more than 20 members, read receipts and typing indicator features are disabled.
## Chat architecture
There are two core parts to chat architecture: 1) Trusted Service and 2) Client
- **Client app:** The client application connects to your trusted service and receives the access tokens that are used to connect directly to Communication Services. After this connection is made, your client app can send and receive messages. We recommend generating access tokens using the trusted service tier. In this scenario the server side would be responsible for creating and managing users and issuing their tokens.
-
+ ## Message types
-Communication Services Chat shares user-generated messages as well as system-generated messages called **Thread activities**. Thread activities are generated when a chat thread is updated. When you call `List Messages` or `Get Messages` on a chat thread, the result will contain the user-generated text messages as well as the system messages in chronological order. This helps you identify when a member was added or removed or when the chat thread topic was updated. Supported message types are:
+Communication Services Chat shares user-generated messages as well as system-generated messages called **Thread activities**. Thread activities are generated when a chat thread is updated. When you call `List Messages` or `Get Messages` on a chat thread, the result will contain the user-generated text messages as well as the system messages in chronological order. This helps you identify when a member was added or removed or when the chat thread topic was updated. Supported message types are:
- `Text`: A plain text message composed and sent by a user as part of a chat conversation.
- `RichText/HTML`: A formatted text message. Note that Communication Services users currently can't send RichText messages. This message type is supported by messages sent from Teams users to Communication Services users in Teams Interop scenarios.
Communication Services Chat shares user-generated messages as well as system-gen
}
```
-## Real-time signaling
+## Real-time signaling
The Chat JavaScript client library includes real-time signaling. This allows clients to listen for real-time updates and incoming messages to a chat thread without having to poll the APIs. Available events include:
+ - `ChatMessageReceived` - when a new message is sent to a chat thread that the user is a member of. This event is not sent for auto-generated system messages, which were discussed in the previous section.
+ - `ChatMessageEdited` - when a message is edited in a chat thread that the user is a member of.
+ - `ChatMessageDeleted` - when a message is deleted in a chat thread that the user is a member of.
+ - `TypingIndicatorReceived` - when another member is typing a message in a chat thread that the user is a member of.
+ - `ReadReceiptReceived` - when another member has read a message that the user sent in a chat thread.
-## Chat events
+## Chat events
Real-time signaling allows your users to chat in real time. Your services can use Azure Event Grid to subscribe to chat-related events. For more details, see the [Event Handling conceptual documentation](../event-handling.md).
Real-time signaling allows your users to chat in real-time. Your services can us
You can use [Azure Cognitive APIs](../../../cognitive-services/index.yml) with the Chat client library to add intelligent features to your applications. For example, you can:
-- Enable users to chat with each other in different languages.
+- Enable users to chat with each other in different languages.
- Help a support agent prioritize tickets by detecting negative sentiment in an incoming issue from a customer.
- Analyze the incoming messages for key phrase detection and entity recognition, and prompt relevant info to the user in your app based on the message content.
-One way to achieve this is by having your trusted service act as a member of a chat thread. Let's say you want to enable language translation. This service will be responsible for listening to the messages being exchanged by other members [1], calling cognitive APIs to translate the content to desired language[2,3] and sending the translated result as a message in the chat thread[4].
+One way to achieve this is by having your trusted service act as a member of a chat thread. Let's say you want to enable language translation. This service will be responsible for listening to the messages being exchanged by other members [1], calling cognitive APIs to translate the content to desired language[2,3] and sending the translated result as a message in the chat thread[4].
-This way, the message history will contain both original and translated messages. In the client application, you can add logic to show the original or translated message. See [this quickstart](../../../cognitive-services/translator/quickstart-translator.md) to understand how to use Cognitive APIs to translate text to different languages.
+This way, the message history will contain both original and translated messages. In the client application, you can add logic to show the original or translated message. See [this quickstart](../../../cognitive-services/translator/quickstart-translator.md) to understand how to use Cognitive APIs to translate text to different languages.
:::image type="content" source="../media/chat/cognitive-services.png" alt-text="Diagram showing Cognitive Services interacting with Communication Services.":::
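The translation step in this flow can be kept as a small pure function so the cognitive API call is swappable and testable. A minimal sketch, assuming a `translate` callable (for example, a wrapper around the Translator API) and the hypothetical helper name `build_translated_reply`:

```python
# Hypothetical helper: given an incoming message and a translate() callable
# (e.g. a wrapper around the Translator cognitive API), build the message the
# trusted-service member posts back to the thread (steps 2-4 in the diagram).

def build_translated_reply(message_body: str, sender_name: str, translate) -> str:
    translated = translate(message_body)  # call out to the cognitive API
    # Keep attribution so the thread history shows who wrote the original.
    return f"{sender_name} (translated): {translated}"

# Usage with a stub translator standing in for the real service:
stub_translate = lambda text: "Hola a todos"  # pretend EN -> ES
print(build_translated_reply("Hello everyone", "Bob", stub_translate))
# prints: Bob (translated): Hola a todos
```

Because both the original and the translated message land in the thread, the client can choose which one to render.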
communication-services Sdk Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/chat/sdk-features.md
[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]
+
Azure Communication Services Chat client libraries can be used to add rich, real-time chat to your applications.
## Chat client library capabilities
The following list presents the set of features which are currently available in
| | Send and receive typing notifications when a member is actively typing a message in a chat thread <br/> *Not available when there are more than 20 members in a chat thread* | ✔️ | ✔️ | ✔️ | ✔️ |
| | Get all messages in a chat thread <br/> *Unicode emojis supported* | ✔️ | ✔️ | ✔️ | ✔️ |
| | Send emojis as part of message content | ✔️ | ✔️ | ✔️ | ✔️ |
-|Real-time signaling (enabled by proprietary signalling package**)| Get notified when a user receives a new message in a chat thread they're a member of | ✔️ | ❌ | ❌ | ❌ |
+|Real-time signaling (enabled by proprietary signaling package**)| Get notified when a user receives a new message in a chat thread they're a member of | ✔️ | ❌ | ❌ | ❌ |
| | Get notified when a message has been edited by another member in a chat thread they're a member of | ✔️ | ❌ | ❌ | ❌ |
| | Get notified when a message has been deleted by another member in a chat thread they're a member of | ✔️ | ❌ | ❌ | ❌ |
| | Get notified when another chat thread member is typing | ✔️ | ❌ | ❌ | ❌ |
The following list presents the set of features which are currently available in
| | Monitor the quality and status of API requests made by your app and configure alerts via the portal | ✔️ | ✔️ | ✔️ | ✔️ |
|Additional features | Use [Cognitive Services APIs](../../../cognitive-services/index.yml) along with the chat client library to enable intelligent features - *language translation & sentiment analysis of the incoming message on a client, speech-to-text conversion to compose a message while the member speaks, etc.* | ✔️ | ✔️ | ✔️ | ✔️ |
-**The proprietary signalling package is implemented using web sockets. It will fallback to long polling if web sockets are unsupported.
+**The proprietary signaling package is implemented using web sockets. It falls back to long polling if web sockets are unsupported.
## JavaScript chat client library support by OS and browser
communication-services Client And Server Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/client-and-server-architecture.md
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)]
+
<!--
> [!WARNING]
-> This document is under construction and needs the following items to be addressed:
+> This document is under construction and needs the following items to be addressed:
> - Need to add security best practices for token management here
> - Reference docs:
> - https://docs.microsoft.com/windows/security/threat-protection/security-policy-settings/create-a-token-object
Many scenarios are best served with native applications. Azure Communication Ser
Communicating over the phone system can dramatically increase the reach of your application. To support PSTN voice and SMS scenarios, Azure Communication Services helps you [acquire phone numbers](../quickstarts/telephony-sms/get-phone-number.md) directly from the Azure portal or using REST APIs and client libraries. Once phone numbers are acquired, they can be used to reach customers using both PSTN calling and SMS in both inbound and outbound scenarios. A sample architecture flow can be found below:
> [!Note]
-> During public preview, the provisioning of US phone numbers is available to customers with billing addresses located within the US and Canada.
+> During public preview, the provisioning of US phone numbers is available to customers with billing addresses located within the US and Canada.
:::image type="content" source="../media/scenarios/archdiagram-pstn.png" alt-text="Diagram showing Communication Services PSTN architecture.":::
For more information, see the following articles:
- Learn about [Phone number types](../concepts/telephony-sms/plan-solution.md)
- [Add chat to your app](../quickstarts/chat/get-started.md)
-- [Add voice calling to your app](../quickstarts/voice-video-calling/getting-started-with-calling.md)
+- [Add voice calling to your app](../quickstarts/voice-video-calling/getting-started-with-calling.md)
communication-services Event Handling https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/event-handling.md
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)]
+
Azure Communication Services integrates with [Azure Event Grid](https://azure.microsoft.com/services/event-grid/) to deliver real-time event notifications in a reliable, scalable, and secure manner. The purpose of this article is to help you configure your applications to listen to Communication Services events. For example, you may want to update a database, create a work item, and deliver a push notification whenever an SMS message is received by a phone number associated with your Communication Services resource.

Azure Event Grid is a fully managed event routing service, which uses a publish-subscribe model. Event Grid has built-in support for Azure services like [Azure Functions](../../azure-functions/functions-overview.md) and [Azure Logic Apps](../../logic-apps/logic-apps-overview.md). It can deliver event alerts to non-Azure services using webhooks. For a complete list of the event handlers that Event Grid supports, see [An introduction to Azure Event Grid](../../event-grid/overview.md).
Azure Event Grid is a fully managed event routing service, which uses a publish-
## Event types
-Event grid uses [event subscriptions](../../event-grid/concepts.md#event-subscriptions) to route event messages to subscribers.
+Event Grid uses [event subscriptions](../../event-grid/concepts.md#event-subscriptions) to route event messages to subscribers.
Azure Communication Services emits the following event types:
Azure Communication Services emits the following event types:
| -- | - | | Microsoft.Communication.SMSReceived | Published when an SMS is received by a phone number associated with the Communication Service. | | Microsoft.Communication.SMSDeliveryReportReceived | Published when a delivery report is received for an SMS sent by the Communication Service. |
-| Microsoft.Communication.ChatMessageReceived | Published when a message is received for a user in a chat thread that she is member of. |
+| Microsoft.Communication.ChatMessageReceived | Published when a message is received by a user in a chat thread that they are a member of. |
| Microsoft.Communication.ChatMessageEdited | Published when a message is edited in a chat thread that the user is a member of. |
| Microsoft.Communication.ChatMessageDeleted | Published when a message is deleted in a chat thread that the user is a member of. |
| Microsoft.Communication.ChatThreadCreatedWithUser | Published when the user is added as a member at the time of creation of a chat thread. |
Azure Communication Services emits the following event types:
| Microsoft.Communication.ChatThreadPropertiesUpdatedPerUser | Published when the properties of a chat thread that the user is a member of are updated. |
| Microsoft.Communication.ChatMemberAddedToThreadWithUser | Published when the user is added as a member to a chat thread. |
| Microsoft.Communication.ChatMemberRemovedFromThreadWithUser | Published when the user is removed from a chat thread. |
+| Microsoft.Communication.ChatParticipantAddedToThreadWithUser | Published for a user when a new participant is added to a chat thread that the user is part of. |
+| Microsoft.Communication.ChatParticipantRemovedFromThreadWithUser | Published for a user when a participant is removed from a chat thread that the user is part of. |
+| Microsoft.Communication.ChatThreadCreated | Published when a chat thread is created. |
+| Microsoft.Communication.ChatThreadDeleted | Published when a chat thread is deleted. |
+| Microsoft.Communication.ChatThreadParticipantAdded | Published when a new participant is added to a chat thread. |
+| Microsoft.Communication.ChatThreadParticipantRemoved | Published when a participant is removed from a chat thread. |
+| Microsoft.Communication.ChatMessageReceivedInThread | Published when a message is received in a chat thread. |
+| Microsoft.Communication.ChatThreadPropertiesUpdated | Published when a chat thread's properties, such as the topic, are updated. |
+| Microsoft.Communication.ChatMessageEditedInThread | Published when a message is edited in a chat thread. |
+| Microsoft.Communication.ChatMessageDeletedInThread | Published when a message is deleted in a chat thread. |
You can use the Azure portal or Azure CLI to subscribe to events emitted by your Communication Services resource. To get started with handling events, see [How to handle SMS Events in Communication Services](../quickstarts/telephony-sms/handle-sms-events.md).
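A webhook subscriber must answer Event Grid's validation handshake before it receives any of the events above. The sketch below shows that logic framework-free; `handle_event_grid_request` is a hypothetical helper name, while the `SubscriptionValidationEvent`/`validationResponse` handshake is Event Grid's documented mechanism:

```python
import json

# Event Grid proves endpoint ownership by first POSTing a validation event;
# the endpoint must echo data.validationCode back as validationResponse.
VALIDATION_EVENT = "Microsoft.EventGrid.SubscriptionValidationEvent"


def handle_event_grid_request(body: str):
    """Return the JSON handshake response for validation requests, or a list
    of (eventType, data) pairs for regular event deliveries."""
    events = json.loads(body)  # Event Grid posts a JSON array of events
    first = events[0]
    if first.get("eventType") == VALIDATION_EVENT:
        return json.dumps({"validationResponse": first["data"]["validationCode"]})
    return [(e["eventType"], e["data"]) for e in events]
```

The returned `(eventType, data)` pairs can then be dispatched to per-event handlers, for example one per `Microsoft.Communication.*` type in the table above.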
This section contains an example of what that data would look like for each even
```json
[{
- "id": "c13afb5f-d975-4296-a8ef-348c8fc496ee",
- "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
- "subject": "thread/{thread-id}/sender/{id-of-message-sender}/recipient/{id-of-message-recipient}",
- "data": {
- "messageBody": "Welcome to Azure Communication Services",
- "messageId": "1600389507167",
- "senderId": "8:acs:fac4607d-d2d0-40e5-84df-6f32ebd1251a_00000005-3e0d-e5aa-0e04-343a0d00037c",
- "senderDisplayName": "John",
- "composeTime": "2020-09-18T00:38:27.167Z",
- "type": "Text",
- "version": 1600389507167,
- "recipientId": "8:acs:fac4607d-d2d0-40e5-84df-6f32ebd1251a_00000005-3e1a-3090-6a0b-343a0d000409",
- "transactionId": "WGW1YmwRzkupk0UI0QA9ZA.1.1.1.1.1797783722.1.9",
- "threadId": "19:46df844a4c064bfaa2b3b30e385d1018@thread.v2"
- },
- "eventType": "Microsoft.Communication.ChatMessageReceived",
- "dataVersion": "1.0",
- "metadataVersion": "1",
- "eventTime": "2020-09-18T00:38:28.0946757Z"
-}
+ "id": "02272459-badb-4e2e-b538-4cb8a2f71da6",
+ "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/sender/{rawId}/recipient/{rawId}",
+ "data": {
+ "messageBody": "Welcome to Azure Communication Services",
+ "messageId": "1613694358927",
+ "senderId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724",
+ "senderCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724"
+ }
+ },
+      "senderDisplayName": "John",
+ "composeTime": "2021-02-19T00:25:58.927Z",
+ "type": "Text",
+ "version": 1613694358927,
+ "recipientId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d05-83fe-084822000f6d",
+ "recipientCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d05-83fe-084822000f6d",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d05-83fe-084822000f6d"
+ }
+ },
+ "transactionId": "oh+LGB2dUUadMcTAdRWQxQ.1.1.1.1.1827536918.1.7",
+ "threadId": "19:6e5d6ca1d75044a49a36a7965ec4a906@thread.v2"
+ },
+ "eventType": "Microsoft.Communication.ChatMessageReceived",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-19T00:25:59.9436666Z"
+ }
]
```
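A subscriber can pull the routing fields out of this payload with ordinary JSON handling; note that in this sample the `version` value is a Unix timestamp in milliseconds that lines up with `composeTime`. A sketch using shortened sample identifiers (not real ACS IDs):

```python
import json
from datetime import datetime, timezone

# Trimmed-down ChatMessageReceived event (identifiers shortened for brevity).
event = json.loads("""{
  "eventType": "Microsoft.Communication.ChatMessageReceived",
  "data": {
    "messageBody": "Welcome to Azure Communication Services",
    "senderCommunicationIdentifier": {"rawId": "8:acs:...sender"},
    "recipientCommunicationIdentifier": {"rawId": "8:acs:...recipient"},
    "version": 1613694358927,
    "threadId": "19:6e5d6ca1d75044a49a36a7965ec4a906@thread.v2"
  }
}""")

data = event["data"]
sender = data["senderCommunicationIdentifier"]["rawId"]
# version is epoch milliseconds; convert it to an aware UTC datetime.
compose_time = datetime.fromtimestamp(data["version"] / 1000, tz=timezone.utc)
print(sender, data["threadId"], compose_time.isoformat())
```

The `rawId` values are the stable keys to correlate sender and recipient with the identities your trusted service issued.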
This section contains an example of what that data would look like for each even
```json
[{
- "id": "18247662-e94a-40cc-8d2f-f7357365309e",
- "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
- "subject": "thread/19:6d20c2f921cd402ead7d1b31b0d030cd@thread.v2/sender/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003fe/recipient/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f0",
- "data": {
- "editTime": "2020-09-18T00:48:47.361Z",
- "messageBody": "Let's Chat about new communication services.",
- "messageId": "1600390097873",
- "senderId": "8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003fe",
- "senderDisplayName": "Bob(Admin)",
- "composeTime": "2020-09-18T00:48:17.873Z",
- "type": "Text",
- "version": 1600390127361,
- "recipientId": "8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f0",
- "transactionId": "bbopOa1JZEW5NDDFLgH1ZQ.2.1.2.1.1822032097.1.5",
- "threadId": "19:6d20c2f921cd402ead7d1b31b0d030cd@thread.v2"
- },
- "eventType": "Microsoft.Communication.ChatMessageEdited",
- "dataVersion": "1.0",
- "metadataVersion": "1",
- "eventTime": "2020-09-18T00:48:48.037823Z"
-}]
+ "id": "93fc1037-b645-4eb0-a0f2-d7bb3ba6e060",
+ "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/sender/{rawId}/recipient/{rawId}",
+ "data": {
+ "editTime": "2021-02-19T00:28:20.784Z",
+ "messageBody": "Let's Chat about new communication services.",
+ "messageId": "1613694357917",
+ "senderId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724",
+ "senderCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724"
+ }
+ },
+ "senderDisplayName": "Bob(Admin)",
+ "composeTime": "2021-02-19T00:25:57.917Z",
+ "type": "Text",
+ "version": 1613694500784,
+ "recipientId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d60-83fe-084822000f6f",
+ "recipientCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d60-83fe-084822000f6f",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d60-83fe-084822000f6f"
+ }
+ },
+ "transactionId": "1mL4XZH2gEecu/alk9tOtw.2.1.2.1.1833042153.1.7",
+ "threadId": "19:6e5d6ca1d75044a49a36a7965ec4a906@thread.v2"
+ },
+ "eventType": "Microsoft.Communication.ChatMessageEdited",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-19T00:28:21.7456718Z"
+ }]
```
### Microsoft.Communication.ChatMessageDeleted event
```json
[{
- "id": "08034616-cf11-4fc2-b402-88963b93d083",
- "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
- "subject": "thread/19:6d20c2f921cd402ead7d1b31b0d030cd@thread.v2/sender/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003fe/recipient/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f0",
- "data": {
- "deleteTime": "2020-09-18T00:48:47.361Z",
- "messageId": "1600390099195",
- "senderId": "8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003fe",
- "senderDisplayName": "Bob(Admin)",
- "composeTime": "2020-09-18T00:48:19.195Z",
- "type": "Text",
- "version": 1600390152154,
- "recipientId": "8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f0",
- "transactionId": "mAxUjeTsG06NpObXkFcjVQ.1.1.2.1.1823015063.1.5",
- "threadId": "19:6d20c2f921cd402ead7d1b31b0d030cd@thread.v2"
- },
- "eventType": "Microsoft.Communication.ChatMessageDeleted",
- "dataVersion": "1.0",
- "metadataVersion": "1",
- "eventTime": "2020-09-18T00:49:12.6698791Z"
-}]
+ "id": "23cfcc13-33f2-4ae1-8d23-b5015b05302b",
+ "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/sender/{rawId}/recipient/{rawId}",
+ "data": {
+ "deleteTime": "2021-02-19T00:43:10.14Z",
+ "messageId": "1613695388152",
+ "senderId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d07-83fe-084822000f6e",
+ "senderCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d07-83fe-084822000f6e",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d07-83fe-084822000f6e"
+ }
+ },
+ "senderDisplayName": "Bob(Admin)",
+ "composeTime": "2021-02-19T00:43:08.152Z",
+ "type": "Text",
+ "version": 1613695390361,
+ "recipientId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d60-83fe-084822000f6f",
+ "recipientCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d60-83fe-084822000f6f",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d60-83fe-084822000f6f"
+ }
+ },
+ "transactionId": "fFs4InlBn0O/0WyhfQZVSQ.1.1.2.1.1867776045.1.4",
+ "threadId": "19:48899258eec941e7a281e03edc8f4964@thread.v2"
+ },
+ "eventType": "Microsoft.Communication.ChatMessageDeleted",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-19T00:43:10.9982947Z"
+ }]
```
-### Microsoft.Communication.ChatThreadCreatedWithUser event
+### Microsoft.Communication.ChatThreadCreatedWithUser event
```json
[{
- "id": "06c7c381-bb0a-4fff-aedd-919df1d52137",
- "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
- "subject": "thread/19:7bdf5504a23f41a79d1bd472dd40044a@thread.v2/createdBy/8:acs:73551687-f8c8-48a7-bf06-d8263f15b02a_00000005-3e5f-1bc6-f40f-343a0d0003fe/recipient/8:acs:73551687-f8c8-48a7-bf06-d8263f15b02a_00000005-3e5f-1bc6-f40f-343a0d0003f0",
- "data": {
- "createdBy": "8:acs:73551687-f8c8-48a7-bf06-d8263f15b02a_06014f-6001fc107f",
- "properties": {
- "topic": "Chat about new commuication services",
- },
- "members": [
- {
- "displayName": "Bob",
- "memberId": "8:acs:73551687-f8c8-48a7-bf06-d8263f15b02a_00000005-3e5f-1bc6-f40f-343a0d0003f0"
+ "id": "eba02b2d-37bf-420e-8656-3a42ef74c435",
+ "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
+    "subject": "thread/{thread-id}/createdBy/{rawId}/recipient/{rawId}",
+ "data": {
+ "createdBy": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286d-e1fe-0848220013b9",
+ "createdByCommunicationIdentifier": {
+ "rawId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286d-e1fe-0848220013b9",
+ "communicationUser": {
+ "id": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286d-e1fe-0848220013b9"
+ }
},
- {
- "displayName": "John",
- "memberId": "8:acs:73551687-f8c8-48a7-bf06-d8263f15b02a_00000005-3e5f-1bc6-f40f-343a0d0003f1"
- }
- ],
- "createTime": "2020-09-17T22:06:09.988Z",
- "version": 1600380369988,
- "recipientId": "8:acs:73551687-f8c8-48a7-bf06-d8263f15b02a_00000005-3e5f-1bc6-f40f-343a0d0003f0",
- "transactionId": "9ZxrGXVXCkOTygd5iwsvAQ.1.1.1.1.1440874720.1.1",
- "threadId": "19:7bdf5504a23f41a79d1bd472dd40044a@thread.v2"
- },
- "eventType": "Microsoft.Communication.ChatThreadCreatedWithUser",
- "dataVersion": "1.0",
- "metadataVersion": "1",
- "eventTime": "2020-09-17T22:06:10.3235137Z"
-}]
+ "properties": {
+        "topic": "Chat about new communication services"
+ },
+ "members": [
+ {
+ "displayName": "Bob",
+ "memberId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286d-e1fe-0848220013b9"
+ },
+ {
+ "displayName": "John",
+ "memberId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-289b-07fd-0848220015ea"
+ }
+ ],
+ "participants": [
+ {
+ "displayName": "Bob",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286d-e1fe-0848220013b9",
+ "communicationUser": {
+ "id": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286d-e1fe-0848220013b9"
+ }
+ }
+ },
+ {
+ "displayName": "John",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-289b-07fd-0848220015ea",
+ "communicationUser": {
+ "id": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-289b-07fd-0848220015ea"
+ }
+ }
+ }
+ ],
+ "createTime": "2021-02-18T23:47:26.91Z",
+ "version": 1613692046910,
+ "recipientId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286e-84f5-08482200181c",
+ "recipientCommunicationIdentifier": {
+ "rawId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286e-84f5-08482200181c",
+ "communicationUser": {
+ "id": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-576c-286e-84f5-08482200181c"
+ }
+ },
+ "transactionId": "zbZt+9h/N0em+XCW2QvyIA.1.1.1.1.1737228330.0.1737490483.1.6",
+ "threadId": "19:1d594fb1eeb14566903cbc5decb5bf5b@thread.v2"
+ },
+ "eventType": "Microsoft.Communication.ChatThreadCreatedWithUser",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-18T23:47:34.7437103Z"
+ }]
```
### Microsoft.Communication.ChatThreadWithUserDeleted event
```json
[{
- "id": "7f4fa31b-e95e-428b-a6e8-53e2553620ad",
- "topic":"/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
- "subject": "thread/19:6d20c2f921cd402ead7d1b31b0d030cd@thread.v2/deletedBy/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003fe/recipient/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f0",
- "data": {
- "deletedBy": "8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003fe",
- "deleteTime": "2020-09-18T00:49:26.3694459Z",
- "createTime": "2020-09-18T00:46:41.559Z",
- "version": 1600390071625,
- "recipientId": "8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f0",
- "transactionId": "MoZlSM2j7kSD2b5X8bjH7Q.1.1.2.1.1823539230.1.1",
- "threadId": "19:6d20c2f921cd402ead7d1b31b0d030cd@thread.v2"
- },
- "eventType": "Microsoft.Communication.ChatThreadWithUserDeleted",
- "dataVersion": "1.0",
- "metadataVersion": "1",
- "eventTime": "2020-09-18T00:49:26.4269056Z"
-}]
+ "id": "f5d6750c-c6d7-4da8-bb05-6f3fca6c7295",
+ "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/deletedBy/{rawId}/recipient/{rawId}",
+ "data": {
+ "deletedBy": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-5772-6473-83fe-084822000e21",
+ "deletedByCommunicationIdentifier": {
+ "rawId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-5772-6473-83fe-084822000e21",
+ "communicationUser": {
+ "id": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-5772-6473-83fe-084822000e21"
+ }
+ },
+ "deleteTime": "2021-02-18T23:57:51.5987591Z",
+ "createTime": "2021-02-18T23:54:15.683Z",
+ "version": 1613692578672,
+ "recipientId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-5772-647b-e1fe-084822001416",
+ "recipientCommunicationIdentifier": {
+ "rawId": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-5772-647b-e1fe-084822001416",
+ "communicationUser": {
+ "id": "8:acs:3d703c91-9657-4b3f-b19c-ef9d53f99710_00000008-5772-647b-e1fe-084822001416"
+ }
+ },
+ "transactionId": "mrliWVUndEmLwkZbeS5KoA.1.1.2.1.1761607918.1.6",
+ "threadId": "19:5870b8f021d74fd786bf5aeb095da291@thread.v2"
+ },
+ "eventType": "Microsoft.Communication.ChatThreadWithUserDeleted",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-18T23:57:52.1597234Z"
+ }]
```
-### Microsoft.Communication.ChatThreadPropertiesUpdatedPerUser event
+### Microsoft.Communication.ChatParticipantAddedToThreadWithUser event
+```json
+[{
+ "id": "049a5a7f-6cd7-43c1-b352-df9e9e6146d1",
+ "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/participantAdded/{rawId}/recipient/{rawId}",
+ "data": {
+ "time": "2021-02-25T06:37:29.9232485Z",
+ "addedByCommunicationIdentifier": {
+ "rawId": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8767-1655-373a0d00885d",
+ "communicationUser": {
+ "id": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8767-1655-373a0d00885d"
+ }
+ },
+ "participantAdded": {
+ "displayName": "John Smith",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8785-1655-373a0d00885f",
+ "communicationUser": {
+ "id": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8785-1655-373a0d00885f"
+ }
+ }
+ },
+ "recipientCommunicationIdentifier": {
+ "rawId": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8781-1655-373a0d00885e",
+ "communicationUser": {
+ "id": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8781-1655-373a0d00885e"
+ }
+ },
+ "createTime": "2021-02-25T06:37:17.371Z",
+ "version": 1614235049907,
+ "transactionId": "q7rr9by6m0CiGiQxKdSO1w.1.1.1.1.1473446055.1.6",
+ "threadId": "19:f1400e1c542f4086a606b52ad20cd0bd@thread.v2"
+ },
+ "eventType": "Microsoft.Communication.ChatParticipantAddedToThreadWithUser",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-25T06:37:31.4880091Z"
+ }]
+```
+### Microsoft.Communication.ChatParticipantRemovedFromThreadWithUser event
```json
[{
- "id": "47a66834-57d7-4f77-9c7d-676d45524982",
- "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
- "subject": "thread/19:a33a128babf04431b7fe8cbca82f4238@thread.v2/editedBy/8:acs:fac4607d-d2d0-40e5-84df-6f32ebd1251a_00000005-3e88-2b7f-ac00-343a0d0005a8/recipient/8:acs:fac4607d-d2d0-40e5-84df-6f32ebd1251a_00000005-3e88-15fa-ac00-343a0d0005a7",
- "data": {
- "editedBy": "8:acs:fac4607d-d2d0-40e5-84df-6f32ebd1251a_00000005-3e88-2b7f-ac00-343a0d0005a8",
- "editTime": "2020-09-18T00:40:38.4914428Z",
- "properties": {
- "topic": "Communication in Azure"
+ "id": "e8a4df24-799d-4c53-94fd-1e05703a4549",
+ "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/participantRemoved/{rawId}/recipient/{rawId}",
+ "data": {
+ "time": "2021-02-25T06:40:20.3564556Z",
+ "removedByCommunicationIdentifier": {
+ "rawId": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8767-1655-373a0d00885d",
+ "communicationUser": {
+ "id": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8767-1655-373a0d00885d"
+ }
+ },
+ "participantRemoved": {
+ "displayName": "Bob",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8785-1655-373a0d00885f",
+ "communicationUser": {
+ "id": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8785-1655-373a0d00885f"
+ }
+ }
+ },
+ "recipientCommunicationIdentifier": {
+ "rawId": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8781-1655-373a0d00885e",
+ "communicationUser": {
+ "id": "8:acs:0a420b29-555c-4f6b-841e-de8059893bb9_00000008-77c9-8781-1655-373a0d00885e"
+ }
+ },
+ "createTime": "2021-02-25T06:37:17.371Z",
+ "version": 1614235220325,
+ "transactionId": "usv74GQ5zU+JmWv/bQ+qfg.1.1.1.1.1480065078.1.5",
+ "threadId": "19:f1400e1c542f4086a606b52ad20cd0bd@thread.v2"
},
- "createTime": "2020-09-18T00:39:02.541Z",
- "version": 1600389638481,
- "recipientId": "8:acs:fac4607d-d2d0-40e5-84df-6f32ebd1251a_00000005-3e88-15fa-ac00-343a0d0005a7",
- "transactionId": "+ah9tVwqNkCT6nUGCKIvAg.1.1.1.1.1802895561.1.1",
- "threadId": "19:a33a128babf04431b7fe8cbca82f4238@thread.v2"
- },
- "eventType": "Microsoft.Communication.ChatThreadPropertiesUpdatedPerUser",
- "dataVersion": "1.0",
- "metadataVersion": "1",
- "eventTime": "2020-09-18T00:40:38.5804349Z"
-}]
+ "eventType": "Microsoft.Communication.ChatParticipantRemovedFromThreadWithUser",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-25T06:40:24.2244945Z"
+ }]
+```
+
+### Microsoft.Communication.ChatThreadPropertiesUpdatedPerUser event
+
+```json
+[{
+ "id": "d57342ff-264e-4a5e-9c54-ef05b7d50082",
+ "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/editedBy/{rawId}/recipient/{rawId}",
+ "data": {
+ "editedBy": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d07-83fe-084822000f6e",
+ "editedByCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d07-83fe-084822000f6e",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7d07-83fe-084822000f6e"
+ }
+ },
+ "editTime": "2021-02-19T00:28:28.7390282Z",
+ "properties": {
+ "topic": "Communication in Azure"
+ },
+ "createTime": "2021-02-19T00:28:25.864Z",
+ "version": 1613694508719,
+ "recipientId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724",
+ "recipientCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-578d-7caf-07fd-084822001724"
+ }
+ },
+ "transactionId": "WLXPrnJ/I0+LTj2cwMrNMQ.1.1.1.1.1833369763.1.4",
+ "threadId": "19:2cc3504c41244d7483208a4f58a1f188@thread.v2"
+ },
+ "eventType": "Microsoft.Communication.ChatThreadPropertiesUpdatedPerUser",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-19T00:28:29.559726Z"
+ }]
```

### Microsoft.Communication.ChatMemberAddedToThreadWithUser event
This section contains an example of what that data would look like for each even
[{
  "id": "4abd2b49-d1a9-4fcc-9cd7-170fa5d96443",
  "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
- "subject": "thread/19:6d20c2f921cd402ead7d1b31b0d030cd@thread.v2/memberAdded/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003fe/recipient/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f0",
+ "subject": "thread/{thread-id}/memberAdded/{rawId}/recipient/{rawId}",
"data": {
  "time": "2020-09-18T00:47:13.1867087Z",
  "addedBy": "8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f1",
This section contains an example of what that data would look like for each even
[{
  "id": "b3701976-1ea2-4d66-be68-4ec4fc1b4b96",
  "topic": "/subscriptions/{subscription-id}/resourceGroups/{group-name}/providers/Microsoft.Communication/communicationServices/{communication-services-resource-name}",
- "subject": "thread/19:6d20c2f921cd402ead7d1b31b0d030cd@thread.v2/memberRemoved/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003fe/recipient/8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f0",
+ "subject": "thread/{thread-id}/memberRemoved/{rawId}/recipient/{rawId}",
"data": {
  "time": "2020-09-18T00:47:51.1461742Z",
  "removedBy": "8:acs:5354158b-17b7-489c-9380-95d8821ff76b_00000005-3e5f-1bc6-f40f-343a0d0003f1",
This section contains an example of what that data would look like for each even
}]
```
+### Microsoft.Communication.ChatThreadCreated event
+
+```json
+[ {
+ "id": "a607ac52-0974-4d3c-bfd8-6f708a26f509",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/createdBy/{rawId}",
+ "data": {
+ "createdByCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453"
+ }
+ },
+ "properties": {
+      "topic": "Talk about new Thread Events in communication services"
+ },
+ "participants": [
+ {
+ "displayName": "Bob",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453"
+ }
+ }
+ },
+ {
+ "displayName": "Scott",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38e6-07fd-084822002467",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38e6-07fd-084822002467"
+ }
+ }
+ },
+ {
+ "displayName": "Shawn",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38f6-83fe-084822002337",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38f6-83fe-084822002337"
+ }
+ }
+ },
+ {
+ "displayName": "Anthony",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38e3-e1fe-084822002c35",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38e3-e1fe-084822002c35"
+ }
+ }
+ }
+ ],
+ "createTime": "2021-02-20T00:31:54.365+00:00",
+ "version": 1613781114365,
+ "threadId": "19:e07c8ddc5bab4c059ea9f11d29b544b6@thread.v2",
+ "transactionId": "gK6+kgANy0O1wchlVKVTJg.1.1.1.1.921436178.1"
+ },
+ "eventType": "Microsoft.Communication.ChatThreadCreated",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-20T00:31:54.5369967Z"
+ }]
+```
+### Microsoft.Communication.ChatThreadPropertiesUpdated event
+
+```json
+[{
+ "id": "cf867580-9caf-45be-b49f-ab1cbfcaa59f",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/editedBy/{rawId}",
+ "data": {
+ "editedByCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5c9e-9e35-07fd-084822002264",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5c9e-9e35-07fd-084822002264"
+ }
+ },
+ "editTime": "2021-02-20T00:04:07.7152073+00:00",
+ "properties": {
+      "topic": "Talk about new Thread Events in communication services"
+ },
+ "createTime": "2021-02-20T00:00:40.126+00:00",
+ "version": 1613779447695,
+ "threadId": "19:9e8eefe67b3c470a8187b4c2b00240bc@thread.v2",
+ "transactionId": "GBE9MB2a40KEWzexIg0D3A.1.1.1.1.856359041.1"
+ },
+ "eventType": "Microsoft.Communication.ChatThreadPropertiesUpdated",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-20T00:04:07.8410277Z"
+ }]
+```
+### Microsoft.Communication.ChatThreadDeleted event
+
+```json
+[
+{
+ "id": "1dbd5237-4823-4fed-980c-8d27c17cf5b0",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/deletedBy/{rawId}",
+ "data": {
+ "deletedByCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5c9e-a300-07fd-084822002266",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5c9e-a300-07fd-084822002266"
+ }
+ },
+ "deleteTime": "2021-02-20T00:00:42.109802+00:00",
+ "createTime": "2021-02-20T00:00:39.947+00:00",
+ "version": 1613779241389,
+ "threadId": "19:c9e9f3060b884e448671391882066ac3@thread.v2",
+ "transactionId": "KibptDpcLEeEFnlR7cI3QA.1.1.2.1.848298005.1"
+ },
+ "eventType": "Microsoft.Communication.ChatThreadDeleted",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-20T00:00:42.5428002Z"
+ }
+ ]
+```
+### Microsoft.Communication.ChatThreadParticipantAdded event
+
+```json
+[
+{
+ "id": "3024eb5d-1d71-49d1-878c-7dc3165433d9",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/participantadded/{rawId}",
+ "data": {
+ "time": "2021-02-20T00:54:42.8622646+00:00",
+ "addedByCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453"
+ }
+ },
+ "participantAdded": {
+ "displayName": "Bob",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38f3-88f7-084822002454",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38f3-88f7-084822002454"
+ }
+ }
+ },
+ "createTime": "2021-02-20T00:31:54.365+00:00",
+ "version": 1613782482822,
+ "threadId": "19:e07c8ddc5bab4c059ea9f11d29b544b6@thread.v2",
+ "transactionId": "9q6cO7i4FkaZ+5RRVzshVw.1.1.1.1.974913783.1"
+ },
+ "eventType": "Microsoft.Communication.ChatThreadParticipantAdded",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-20T00:54:43.9866454Z"
+ }
+]
+```
+### Microsoft.Communication.ChatThreadParticipantRemoved event
+
+```json
+[
+{
+ "id": "6ed810fd-8776-4b13-81c2-1a0c4f791a07",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/participantremoved/{rawId}",
+ "data": {
+ "time": "2021-02-20T00:56:18.1118825+00:00",
+ "removedByCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453"
+ }
+ },
+ "participantRemoved": {
+ "displayName": "Shawn",
+ "participantCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38e6-07fd-084822002467",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38e6-07fd-084822002467"
+ }
+ }
+ },
+ "createTime": "2021-02-20T00:31:54.365+00:00",
+ "version": 1613782578096,
+ "threadId": "19:e07c8ddc5bab4c059ea9f11d29b544b6@thread.v2",
+ "transactionId": "zGCq8IGRr0aEF6COuy7wSA.1.1.1.1.978649284.1"
+ },
+ "eventType": "Microsoft.Communication.ChatThreadParticipantRemoved",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-20T00:56:18.856721Z"
+ }
+]
+```
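The `subject` strings in the thread-level events above follow a small path grammar, for example `thread/{thread-id}/participantadded/{rawId}` or `thread/{thread-id}/editedBy/{rawId}/recipient/{rawId}`. The following is an illustrative sketch of splitting those segments apart; the helper name and output shape are invented here, not part of any Azure SDK:

```python
# Illustrative helper (not an Azure SDK API): split an event "subject" such as
# "thread/{thread-id}/participantadded/{rawId}" into its named segments.
def parse_subject(subject: str) -> dict:
    parts = subject.split("/")
    if len(parts) < 2 or parts[0] != "thread":
        raise ValueError(f"unexpected subject: {subject!r}")
    result = {"threadId": parts[1]}
    # After the thread id, segments arrive in action/value pairs,
    # e.g. participantadded/<rawId> or editedBy/<rawId>.
    for action, value in zip(parts[2::2], parts[3::2]):
        result[action] = value
    return result

print(parse_subject(
    "thread/19:e07c8ddc5bab4c059ea9f11d29b544b6@thread.v2/participantadded/8:acs:example-raw-id"
))
```

This relies on ACS raw IDs (`8:acs:...`) using colons rather than slashes, so a plain `/` split is safe for the subjects shown on this page.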
+### Microsoft.Communication.ChatMessageReceivedInThread event
+
+```json
+[
+{
+ "id": "4f614f97-c451-4b82-a8c9-1e30c3bfcda1",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/sender/8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cdb-4916-07fd-084822002624",
+ "data": {
+    "messageBody": "Talk about new Thread Events in communication services",
+ "messageId": "1613783230064",
+ "type": "Text",
+ "version": "1613783230064",
+ "senderDisplayName": "Bob",
+ "senderCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cdb-4916-07fd-084822002624",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cdb-4916-07fd-084822002624"
+ }
+ },
+ "composeTime": "2021-02-20T01:07:10.064+00:00",
+ "threadId": "19:5b3809e80e4a439d92c3316e273f4a2b@thread.v2",
+ "transactionId": "foMkntkKS0O/MhMlIE5Aag.1.1.1.1.1004077250.1"
+ },
+ "eventType": "Microsoft.Communication.ChatMessageReceivedInThread",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-20T01:07:10.5704596Z"
+ }
+]
+```
+### Microsoft.Communication.ChatMessageEditedInThread event
+
+```json
+[
+ {
+ "id": "7b8dc01e-2659-41fa-bc8c-88a967714510",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/sender/{rawId}",
+ "data": {
+ "editTime": "2021-02-20T00:59:10.464+00:00",
+ "messageBody": "8effb181-1eb2-4a58-9d03-ed48a461b19b",
+ "messageId": "1613782685964",
+ "type": "Text",
+ "version": "1613782750464",
+ "senderDisplayName": "Scott",
+ "senderCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453"
+ }
+ },
+ "composeTime": "2021-02-20T00:58:05.964+00:00",
+ "threadId": "19:e07c8ddc5bab4c059ea9f11d29b544b6@thread.v2",
+ "transactionId": "H8Gpj3NkIU6bXlWw8WPvhQ.2.1.2.1.985333801.1"
+ },
+ "eventType": "Microsoft.Communication.ChatMessageEditedInThread",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-20T00:59:10.7600061Z"
+ }
+]
+```
+
+### Microsoft.Communication.ChatMessageDeletedInThread event
+
+```json
+[
+ {
+ "id": "17d9c39d-0c58-4ed8-947d-c55959f57f75",
+ "topic": "/subscriptions/{subscription-id}/resourcegroups/{group-name}/providers/microsoft.communication/communicationservices/{communication-services-resource-name}",
+ "subject": "thread/{thread-id}/sender/{rawId}",
+ "data": {
+ "deleteTime": "2021-02-20T00:59:10.464+00:00",
+ "messageId": "1613782685440",
+ "type": "Text",
+ "version": "1613782814333",
+ "senderDisplayName": "Scott",
+ "senderCommunicationIdentifier": {
+ "rawId": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453",
+ "communicationUser": {
+ "id": "8:acs:109f0644-b956-4cd9-87b1-71024f6e2f44_00000008-5cbb-38a0-88f7-084822002453"
+ }
+ },
+ "composeTime": "2021-02-20T00:58:05.44+00:00",
+ "threadId": "19:e07c8ddc5bab4c059ea9f11d29b544b6@thread.v2",
+ "transactionId": "HqU6PeK5AkCRSpW8eAbL0A.1.1.2.1.987824181.1"
+ },
+ "eventType": "Microsoft.Communication.ChatMessageDeletedInThread",
+ "dataVersion": "1.0",
+ "metadataVersion": "1",
+ "eventTime": "2021-02-20T01:00:14.8518034Z"
+ }
+]
+```
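A handler receiving these payloads gets a JSON array of event envelopes and typically branches on the `eventType` field. The following is a minimal sketch assuming a raw webhook body; it does not use the Azure Event Grid SDK, and the summary strings are invented for illustration:

```python
# Minimal webhook-style dispatcher for the chat events shown above.
# This is NOT the Azure Event Grid SDK; summary strings are illustrative.
import json

def handle_chat_events(body):
    """Parse an Event Grid payload (a JSON array) and summarize each event."""
    summaries = []
    for event in json.loads(body):
        data = event.get("data", {})
        if event["eventType"] == "Microsoft.Communication.ChatMessageReceivedInThread":
            summaries.append(
                f"{data.get('senderDisplayName', 'unknown')} posted in {data['threadId']}"
            )
        elif event["eventType"] == "Microsoft.Communication.ChatThreadDeleted":
            summaries.append(f"thread {data['threadId']} deleted")
        else:
            summaries.append("unhandled: " + event["eventType"])
    return summaries

# Trimmed-down version of the ChatMessageReceivedInThread sample above.
sample = json.dumps([{
    "eventType": "Microsoft.Communication.ChatMessageReceivedInThread",
    "data": {"senderDisplayName": "Bob",
             "threadId": "19:5b3809e80e4a439d92c3316e273f4a2b@thread.v2"},
}])
print(handle_chat_events(sample))
```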
+
## Quickstarts and how-tos

| Title | Description |
communication-services Identity Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/identity-model.md
Instead of duplicating information in your system, you'll maintain the mapping r
## Identity
-You can create identities by using the Azure Communication Services administration library. An identity serves as an identifier in conversations. It's used to create access tokens. The same identity might participate in multiple simultaneous sessions across multiple devices. An identity might have multiple active access tokens at the same time.
+You can create identities by using the Azure Communication Services Identity library. An identity serves as an identifier in conversations. It's used to create access tokens. The same identity might participate in multiple simultaneous sessions across multiple devices. An identity might have multiple active access tokens at the same time.
-The deletion of an identity, resource, or subscription invalidates all of its access tokens. This action also deletes all data that's stored for the identity. A deleted identity can't create new access tokens or access previously stored data (for example, chat messages).
+The deletion of an identity, resource, or subscription invalidates all of its access tokens. This action also deletes all data that's stored for the identity. A deleted identity can't create new access tokens or access previously stored data (for example, chat messages).
-You aren't charged for the number of identities you have. Instead, you're charged for the use of primitives. The number of your identities doesn't have to restrict how you map your application's identities to the Azure Communication Services identities.
+You aren't charged for the number of identities you have. Instead, you're charged for the use of primitives. The number of your identities doesn't have to restrict how you map your application's identities to the Azure Communication Services identities.
With the freedom of mapping comes privacy responsibility. If a user wants to be deleted from your system, then you need to delete all identities that are associated with that user.
-Azure Communication Services doesn't provide special identities for anonymous users. It doesn't keep the mapping between the users and identities, and it can't determine whether an identity is anonymous. You can design the identity concept to fit your needs. Our recommendation is to create a new identity for each anonymous user on each application.
+Azure Communication Services doesn't provide special identities for anonymous users. It doesn't keep the mapping between the users and identities, and it can't determine whether an identity is anonymous. You can design the identity concept to fit your needs. Our recommendation is to create a new identity for each anonymous user on each application.
Anyone who has a valid access token can access current identity content. For example, users can access chat messages that they sent. The access is restricted only to scopes that are part of the access token. For more information, see the [Access tokens](#access-tokens) section in this article.
Anyone who has a valid access token can access current identity content. For exa
Azure Communication Services doesn't replicate the functionality of the Azure identity management system. It doesn't provide a way for customers to use customer-specific identities. For example, customers can't use a phone number or email address. Instead, Azure Communication Services provides unique identifiers. You can assign these unique identifiers to your application's identities. Azure Communication Services doesn't store any kind of information that might reveal the real identity of your users.
-To avoid duplicating information in your system, plan how to map users from your identity domain to Azure Communication Services identities. You can follow any kind of pattern. For example, you can use 1:1, 1:N, N:1, or M:N. Decide whether a single user is mapped to a single identity or to multiple identities.
+To avoid duplicating information in your system, plan how to map users from your identity domain to Azure Communication Services identities. You can follow any kind of pattern. For example, you can use 1:1, 1:N, N:1, or M:N. Decide whether a single user is mapped to a single identity or to multiple identities.
When a new identity is created, store its mapping to your application's user or users. Because identities require access tokens to use primitives, the identity needs to be known to your application's user or users.
If you use a relational database to store user information, then you can adjust
## Access tokens
-An access token is a JSON Web Token (JWT) that can be used to get access to Azure Communication Service primitives. An access token that's issued has integrity protection. That is, its claims can't be changed after it's issued. So a manual change of properties such as identity, expiration, or scopes will invalidate the access token. If primitives are used with invalidated tokens, then access will be denied to the primitives.
+An access token is a JSON Web Token (JWT) that can be used to get access to Azure Communication Service primitives. An access token that's issued has integrity protection. That is, its claims can't be changed after it's issued. So a manual change of properties such as identity, expiration, or scopes will invalidate the access token. If primitives are used with invalidated tokens, then access will be denied to the primitives.
The properties of an access token are:

* Identity.
* Expiration.
* Scopes.
-An access token is always valid for 24 hours. After it expires, the access token is invalidated and can't be used to access any primitive.
+An access token is always valid for 24 hours. After it expires, the access token is invalidated and can't be used to access any primitive.
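Because an access token is a standard JWT, a client can read its `exp` claim locally to know when to request a fresh token. Below is a stdlib-only sketch; it decodes the payload without verifying the signature (only the service can verify it), and the fake token built at the end is purely illustrative:

```python
# Sketch: reading the "exp" (expiration) claim from a JWT access token.
# This decodes the payload for local inspection only; it does NOT verify
# the token's signature.
import base64
import json
import time

def token_expiry(jwt_token):
    """Return the 'exp' claim (Unix seconds) from the JWT payload segment."""
    payload_b64 = jwt_token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return payload["exp"]

def is_expired(jwt_token, now=None):
    return (time.time() if now is None else now) >= token_expiry(jwt_token)

# Build a FAKE token whose payload expires 24 hours after an issue time,
# mirroring the fixed 24-hour validity described above.
issued = 1_700_000_000
payload_seg = base64.urlsafe_b64encode(
    json.dumps({"exp": issued + 24 * 3600}).encode()
).decode().rstrip("=")
fake_token = f"header.{payload_seg}.signature"
print(is_expired(fake_token, now=issued))               # False: still valid
print(is_expired(fake_token, now=issued + 25 * 3600))   # True: past 24 hours
```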
An identity needs a way to request a new access token from a server-side service. The *scope* parameter defines a nonempty set of primitives that can be used. Azure Communication Services supports the following scopes for access tokens.
An identity needs a way to request a new access token from a server-side service
|VoIP| Grants the ability to call identities and phone numbers|
-To revoke an access token before its expiration time, use the Azure Communication Services administration library. Token revocation isn't immediate. It takes up to 15 minutes to propagate. The removal of an identity, resource, or subscription revokes all access tokens.
+To revoke an access token before its expiration time, use the Azure Communication Services Identity library. Token revocation isn't immediate. It takes up to 15 minutes to propagate. The removal of an identity, resource, or subscription revokes all access tokens.
If you want to remove a user's ability to access specific functionality, revoke all access tokens. Then issue a new access token that has a more limited set of scopes.
-In Azure Communication Services, a rotation of access keys revokes all active access tokens that were created by using a former access key. All identities lose access to Azure Communication Services, and they must issue new access tokens.
+In Azure Communication Services, a rotation of access keys revokes all active access tokens that were created by using a former access key. All identities lose access to Azure Communication Services, and they must issue new access tokens.
-We recommend issuing access tokens in your server-side service and not in the client's application. The reasoning is that issuing requires an access key or a managed identity. For security reasons, sharing access keys with the client's application isn't recommended.
+We recommend issuing access tokens in your server-side service and not in the client's application. The reasoning is that issuing requires an access key or a managed identity. For security reasons, sharing access keys with the client's application isn't recommended.
The client application should use a trusted service endpoint that can authenticate your clients. The endpoint should issue access tokens on their behalf. For more information, see [Client and server architecture](./client-and-server-architecture.md).
communication-services Notifications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/notifications.md
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)]
+
The Azure Communication Services chat and calling client libraries create a real-time messaging channel that allows signaling messages to be pushed to connected clients in an efficient, reliable manner. This enables you to build rich, real-time communication functionality into your applications without the need to implement complicated HTTP polling logic. However, on mobile applications, this signaling channel only remains connected when your application is active in the foreground. If you want your users to receive incoming calls or chat messages while your application is in the background, you should use push notifications. Push notifications allow you to send information from your application to users' mobile devices. You can use push notifications to show a dialog, play a sound, or display incoming call UI. Azure Communication Services provides integrations with [Azure Event Grid](../../event-grid/overview.md) and [Azure Notification Hubs](../../notification-hubs/notification-hubs-push-notification-overview.md) that enable you to add push notifications to your apps.
Learn more about [event handling in Azure Communication Services](./event-handli
## Deliver push notifications via Azure Notification Hubs
-You can connect an Azure Notification Hub to your Communication Services resource in order to automatically send push notifications to a user's mobile device when they receive an incoming call. You should use these push notifications to wake up your application from the background and display UI that lets the user accept or decline the call.
+You can connect an Azure Notification Hub to your Communication Services resource in order to automatically send push notifications to a user's mobile device when they receive an incoming call. You should use these push notifications to wake up your application from the background and display UI that lets the user accept or decline the call.
:::image type="content" source="./media/notifications/acs-anh-int.png" alt-text="Diagram showing how communication services integrates with Azure Notification Hubs.":::
Communication Services uses Azure Notification Hub as a pass-through service to
> [!NOTE]
> Currently only calling push notifications are supported.
-### Notification Hub provisioning
+### Notification Hub provisioning
To deliver push notifications to client devices using Notification Hubs, [create a Notification Hub](../../notification-hubs/create-notification-hub-portal.md) within the same subscription as your Communication Services resource. You must configure the Azure Notification Hub for the Platform Notification System you want to use. To learn how to get push notifications in your client app from Notification Hubs, see [Getting started with Notification Hubs](../../notification-hubs/notification-hubs-android-push-notification-google-fcm-get-started.md) and select your target client platform from the drop-down list near the top of the page. > [!NOTE]
-> Currently the APNs and FCM platforms are supported.
-The APNs platform needs to be configured with token authentication mode. Certificate authentication mode isn't supported as of now.
+> Currently the APNs and FCM platforms are supported.
+The APNs platform needs to be configured with token authentication mode. Certificate authentication mode isn't supported as of now.
Once your Notification hub is configured, you can associate it to your Communication Services resource by supplying a connection string for the hub using the Azure Resource Manager Client or through the Azure portal. The connection string should contain `Send` permissions. We recommend creating another access policy with `Send` only permissions specifically for your hub. Learn more about [Notification Hubs security and access policies](../../notification-hubs/notification-hubs-push-notification-security.md)
In the portal, navigate to your Azure Communication Services resource. Inside th
:::image type="content" source="./media/notifications/acs-anh-portal-int.png" alt-text="Screenshot showing the Push Notifications settings within the Azure portal.":::

> [!NOTE]
-> If the Azure Notification Hub connection string is updated the Communication Services resource has to be updated as well.
+> If the Azure Notification Hub connection string is updated the Communication Services resource has to be updated as well.
Any change to how the hub is linked is reflected in the data plane (that is, when sending a notification) within a maximum of ``10`` minutes. This also applies when the hub is linked for the first time, **if** notifications were sent before.
-### Device registration
+### Device registration
Refer to the [voice calling quickstart](../quickstarts/voice-video-calling/getting-started-with-calling.md) to learn how to register your device handle with Communication Services.
In case that you regenerated the connection string of your linked Azure Notifica
## Next steps

* For an introduction to Azure Event Grid, see [What is Event Grid?](../../event-grid/overview.md)
-* To learn more on the Azure Notification Hub concepts, see [Azure Notification Hubs documentation](../../notification-hubs/index.yml)
+* To learn more on the Azure Notification Hub concepts, see [Azure Notification Hubs documentation](../../notification-hubs/index.yml)
communication-services Pricing https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/pricing.md
Alice made a group call with her colleagues, Bob and Charlie. Alice and Bob used
### Pricing example: A user of the Communication Services JS client library joins a scheduled Microsoft Teams meeting
-Alice is a doctor meeting with her patient, Bob. Alice will be joining the visit from the Teams Desktop application. Bob will receive a link to join using the healthcare provider website, which connects to the meeting using the Communication Services JS client library. Bob will use his mobile phone to enter the meeting using a web browser (iPhone with Safari). Chat will be available during the virtual visit.
+Alice is a doctor meeting with her patient, Bob. Alice will be joining the visit from the Teams Desktop application. Bob will receive a link to join using the healthcare provider website, which connects to the meeting using the Communication Services JS client library. Bob will use his mobile phone to enter the meeting using a web browser (iPhone with Safari). Chat will be available during the virtual visit.
- The call lasts a total of 30 minutes.
- Alice and Bob participate for the entire call. Alice turns on her video five minutes after the call starts and shares her screen for 13 minutes. Bob has his video on for the whole call.
Alice is a doctor meeting with her patient, Bob. Alice will be joining the visit
**Cost calculations**

- 1 participant (Bob) x 30 minutes x $0.004 per participant per minute = $0.12 [both video and audio are charged at the same rate]
-- 1 participant (Alice) x 30 minutes x $0.000 per participant per minute = $0.0*.
+- 1 participant (Alice) x 30 minutes x $0.000 per participant per minute = $0.0*.
- 1 participant (Bob) x 3 chat messages x $0.0008 = $0.0024.
-- 1 participant (Alice) x 5 chat messages x $0.000 = $0.0*.
+- 1 participant (Alice) x 5 chat messages x $0.000 = $0.0*.
*Alice's participation is covered by her Teams license. For your convenience, your Azure invoice will show the minutes and chat messages that Teams users had with Communication Services users, but minutes and messages originating from the Teams client incur no charge.
-**Total cost for the visit**:
+**Total cost for the visit**:
- User joining using the Communication Services JS client library: $0.12 + $0.0024 = $0.1224
-- User joining on Teams Desktop Application: $0 (covered by Teams license)
+- User joining on Teams Desktop Application: $0 (covered by Teams license)
## Chat
With Communication Services you can enhance your application with the ability to
You're charged $0.0008 for every chat message sent.
-### Pricing example: Chat between two users
+### Pricing example: Chat between two users
Geeta starts a chat thread with Emily to share an update and sends 5 messages. The chat lasts 10 minutes. Geeta and Emily send another 15 messages each.
-**Cost calculations**
+**Cost calculations**
- Number of messages sent (5 + 15 + 15) x $0.0008 = $0.028
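The arithmetic above can be sketched in plain Python, using the $0.0008 per-message rate quoted earlier in this section:

```python
CHAT_RATE_PER_MSG = 0.0008   # rate quoted above, per message sent

# Geeta's 5 opening messages, then 15 more each from Geeta and Emily.
# Only sent messages are billed; the thread's duration does not matter.
messages_sent = 5 + 15 + 15
cost = messages_sent * CHAT_RATE_PER_MSG
print(round(cost, 4))  # 0.028
```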
-### Pricing example: Group chat with multiple users
+### Pricing example: Group chat with multiple users
-Charlie starts a chat thread with his friends Casey & Jasmine to plan a vacation. They chat for a while wherein Charlie, Casey & Jasmine send 20, 30 and 18 messages respectively. They realize that their friend Rose might be interested in joining the trip as well, so they add her to the chat thread and share all the message history with her.
+Charlie starts a chat thread with his friends Casey & Jasmine to plan a vacation. They chat for a while wherein Charlie, Casey & Jasmine send 20, 30 and 18 messages respectively. They realize that their friend Rose might be interested in joining the trip as well, so they add her to the chat thread and share all the message history with her.
Rose sees the messages and starts chatting. Meanwhile, Casey gets a call and decides to catch up on the conversation later. Charlie, Jasmine & Rose decide on the travel dates and send another 30, 25, and 35 messages respectively.
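The message tally for this scenario can be sketched in plain Python. Only sent messages are billed, so adding Rose to the thread and sharing the history with her costs nothing by itself:

```python
CHAT_RATE_PER_MSG = 0.0008  # per message sent

# Messages sent before and after Rose joins, per participant.
messages_sent = {
    "Charlie": 20 + 30,
    "Casey": 30,        # Casey steps away before the second round
    "Jasmine": 18 + 25,
    "Rose": 35,
}
total_messages = sum(messages_sent.values())  # 158 messages
print(round(total_messages * CHAT_RATE_PER_MSG, 4))  # 0.1264
```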
-**Cost calculations**
+**Cost calculations**
- Number of messages sent (20 + 30 + 18 + 30 + 25 + 35) x $0.0008 = $0.1264

## Telephony and SMS
-## Price
+## Price
Telephony services are priced on a per-minute basis, while SMS is priced on a per-message basis. Pricing is determined by the type and location of the number you're using as well as the destination of your calls and SMS messages.
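As an illustration of the per-minute versus per-message metering described above, here is a minimal sketch. Both rates are hypothetical placeholders, since actual prices vary by number type, country, and destination:

```python
# Hypothetical rates for illustration only; consult the pricing page for real values.
TELEPHONY_RATE_PER_MIN = 0.013  # assumed outbound calling rate, per minute
SMS_RATE_PER_MSG = 0.0075       # assumed outbound SMS rate, per message

def usage_cost(call_minutes: float, sms_count: int) -> float:
    """Telephony is metered per minute; SMS is metered per message."""
    return call_minutes * TELEPHONY_RATE_PER_MIN + sms_count * SMS_RATE_PER_MSG

print(round(usage_cost(120, 200), 2))  # 3.06
```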
communication-services Privacy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/privacy.md
Chat threads and messages are retained until explicitly deleted. A fully idle th
### SMS
-Sent and received SMS messages are ephemerally processed by the service and not retained.
+Sent and received SMS messages are ephemerally processed by the service and not retained.
### PSTN voice calling
communication-services Reference https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/reference.md
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)]

The following table details the available Communication Services packages along with corresponding reference documentation:

<!--note that this table also exists here and should be synced: https://github.com/Azure/Communication/blob/master/README.md -->
The following table details the available Communication Services packages along
| Area | JavaScript | .NET | Python | Java SE | iOS | Android | Other |
| -- | - | - | - | - | -- | -- | - |
| Azure Resource Manager | - | [NuGet](https://www.nuget.org/packages/Azure.ResourceManager.Communication) | [PyPi](https://pypi.org/project/azure-mgmt-communication/) | - | - | - | [Go via GitHub](https://github.com/Azure/azure-sdk-for-go/releases/tag/v46.3.0) |
-| Common | [npm](https://www.npmjs.com/package/@azure/communication-common) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Common/) | N/A | [Maven](https://search.maven.org/search?q=a:azure-communication-common) | [GitHub](https://github.com/Azure/azure-sdk-for-ios/releases/tag/1.0.0-beta.1) | [Maven](https://search.maven.org/artifact/com.azure.android/azure-communication-common) | - |
-| Administration | [npm](https://www.npmjs.com/package/@azure/communication-administration) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Administration) | [PyPi](https://pypi.org/project/azure-communication-administration/) | [Maven](https://search.maven.org/search?q=a:azure-communication-administration) | - | - | - |
+| Common | [npm](https://www.npmjs.com/package/@azure/communication-common) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Common/) | N/A | [Maven](https://search.maven.org/search?q=a:azure-communication-common) | [GitHub](https://github.com/Azure/azure-sdk-for-ios/releases) | [Maven](https://search.maven.org/artifact/com.azure.android/azure-communication-common) | - |
+| Identity | [npm](https://www.npmjs.com/package/@azure/communication-identity) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Identity) | [PyPi](https://pypi.org/project/azure-communication-identity/) | [Maven](https://search.maven.org/search?q=a:azure-communication-identity) | - | - | - |
| Chat | [npm](https://www.npmjs.com/package/@azure/communication-chat) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Chat) | [PyPi](https://pypi.org/project/azure-communication-chat/) | [Maven](https://search.maven.org/search?q=a:azure-communication-chat) | [GitHub](https://github.com/Azure/azure-sdk-for-ios/releases) | [Maven](https://search.maven.org/search?q=a:azure-communication-chat) | - |
| SMS | [npm](https://www.npmjs.com/package/@azure/communication-sms) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Sms) | [PyPi](https://pypi.org/project/azure-communication-sms/) | [Maven](https://search.maven.org/artifact/com.azure/azure-communication-sms) | - | - | - |
-| Calling | [npm](https://www.npmjs.com/package/@azure/communication-calling) | - | - | - | [GitHub](https://github.com/Azure/Communication/releases/tag/v1.0.0-beta.2) ([docs](/objectivec/communication-services/calling/)) | [Maven](https://search.maven.org/artifact/com.azure.android/azure-communication-calling/) | - |
-| Reference Documentation | [docs](https://azure.github.io/azure-sdk-for-js/communication.html) | [docs](https://azure.github.io/azure-sdk-for-net/communication.html) | - | [docs](http://azure.github.io/azure-sdk-for-java/communication.html) | - | - | - |
+| Calling | [npm](https://www.npmjs.com/package/@azure/communication-calling) | - | - | - | [GitHub](https://github.com/Azure/Communication/releases) ([docs](/objectivec/communication-services/calling/)) | [Maven](https://search.maven.org/artifact/com.azure.android/azure-communication-calling/) | - |
+| Reference Documentation | [docs](https://azure.github.io/azure-sdk-for-js/communication.html) | [docs](https://azure.github.io/azure-sdk-for-net/communication.html) | - | [docs](http://azure.github.io/azure-sdk-for-java/communication.html) | - | - | - |
communication-services Sdk Options https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/sdk-options.md
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)]

Azure Communication Services capabilities are conceptually organized into six areas. Some areas have fully open-sourced client libraries. The Calling client library uses proprietary network interfaces and is currently closed-source, and the Chat library includes a closed-source dependency. Samples and additional technical details for client libraries are published in the [Azure Communication Services GitHub repo](https://github.com/Azure/communication).

## Client libraries
Azure Communication Services capabilities are conceptually organized into six ar
| - | - | - | -- | - |
| Azure Resource Manager | REST | Open | Azure.ResourceManager.Communication | Provision and manage Communication Services resources |
| Common | REST | Open | Azure.Communication.Common | Provides base types for other client libraries |
-| Identity | REST | Open | Azure.Communication.Identity | Manage users and access tokens |
+| Identity | REST | Open | Azure.Communication.Identity | Manage users, access tokens |
| Chat | REST with proprietary signaling | Open with closed source signaling package | Azure.Communication.Chat | Add real-time text based chat to your applications |
| SMS | REST | Open | Azure.Communication.SMS | Send and receive SMS messages |
| Calling | Proprietary transport | Closed | Azure.Communication.Calling | Leverage voice, video, screen-sharing, and other real-time data communication capabilities |
-Note that the Azure Resource Manager, Administration, and SMS client libraries are focused on service integration, and in many cases security issues arise if you integrate these functions into end-user applications. The Common and Chat client libraries are suitable for service and client applications. The Calling client library is designed for client applications. A client library focused on service scenarios is in development.
+Note that the Azure Resource Manager, Identity, and SMS client libraries are focused on service integration, and in many cases security issues arise if you integrate these functions into end-user applications. The Common and Chat client libraries are suitable for service and client applications. The Calling client library is designed for client applications. A client library focused on service scenarios is in development.
### Languages and publishing locations
-Publishing locations for individual client library packages are detailed below.
+Publishing locations for individual client library packages are detailed below.
| Area | JavaScript | .NET | Python | Java SE | iOS | Android | Other |
| -- | - | - | - | - | -- | -- | - |
| Azure Resource Manager | - | [NuGet](https://www.nuget.org/packages/Azure.ResourceManager.Communication) | [PyPi](https://pypi.org/project/azure-mgmt-communication/) | - | - | - | [Go via GitHub](https://github.com/Azure/azure-sdk-for-go/releases/tag/v46.3.0) |
-| Common | [npm](https://www.npmjs.com/package/@azure/communication-common) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Common/) | N/A | [Maven](https://search.maven.org/search?q=a:azure-communication-common) | [GitHub](https://github.com/Azure/azure-sdk-for-ios/releases/tag/1.0.0-beta.1) | [Maven](https://search.maven.org/artifact/com.azure.android/azure-communication-common) | - |
-| Administration | [npm](https://www.npmjs.com/package/@azure/communication-administration) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Administration) | [PyPi](https://pypi.org/project/azure-communication-administration/) | [Maven](https://search.maven.org/search?q=a:azure-communication-administration) | - | - | - |
-| Identity | [npm](https://www.npmjs.com/package/@azure/communication-identity) | [NuGet](https://www.nuget.org/packages/Azure.Communication.identity) | [PyPi](https://pypi.org/project/azure-communication-identity/) | [Maven](https://search.maven.org/search?q=a:azure-communication-identity) | - | - | - |
+| Common | [npm](https://www.npmjs.com/package/@azure/communication-common) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Common/) | N/A | [Maven](https://search.maven.org/search?q=a:azure-communication-common) | [GitHub](https://github.com/Azure/azure-sdk-for-ios/releases) | [Maven](https://search.maven.org/artifact/com.azure.android/azure-communication-common) | - |
+| Identity | [npm](https://www.npmjs.com/package/@azure/communication-identity) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Identity) | [PyPi](https://pypi.org/project/azure-communication-identity/) | [Maven](https://search.maven.org/search?q=a:azure-communication-identity) | - | - | - |
| Chat | [npm](https://www.npmjs.com/package/@azure/communication-chat) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Chat) | [PyPi](https://pypi.org/project/azure-communication-chat/) | [Maven](https://search.maven.org/search?q=a:azure-communication-chat) | [GitHub](https://github.com/Azure/azure-sdk-for-ios/releases) | [Maven](https://search.maven.org/search?q=a:azure-communication-chat) | - |
| SMS | [npm](https://www.npmjs.com/package/@azure/communication-sms) | [NuGet](https://www.nuget.org/packages/Azure.Communication.Sms) | [PyPi](https://pypi.org/project/azure-communication-sms/) | [Maven](https://search.maven.org/artifact/com.azure/azure-communication-sms) | - | - | - |
-| Calling | [npm](https://www.npmjs.com/package/@azure/communication-calling) | - | - | - | [GitHub](https://github.com/Azure/Communication/releases/tag/v1.0.0-beta.2) | [Maven](https://search.maven.org/artifact/com.azure.android/azure-communication-calling/) | - |
+| Calling | [npm](https://www.npmjs.com/package/@azure/communication-calling) | - | - | - | [GitHub](https://github.com/Azure/Communication/releases) | [Maven](https://search.maven.org/artifact/com.azure.android/azure-communication-calling/) | - |
| Reference Documentation | [docs](https://azure.github.io/azure-sdk-for-js/communication.html) | [docs](https://azure.github.io/azure-sdk-for-net/communication.html) | - | [docs](http://azure.github.io/azure-sdk-for-java/communication.html) | [docs](/objectivec/communication-services/calling/) | [docs](/java/api/com.azure.communication.calling?view=communication-services-java-android) | - |

## REST APIs
The following timeouts apply to the Communication Services calling client librar
| Action | Timeout in seconds |
| -- | - |
| Reconnect/removal participant | 120 |
-| Add or remove new modality from a call (Start/stop video or screensharing) | 40 |
+| Add or remove new modality from a call (Start/stop video or screen sharing) | 40 |
| Call Transfer operation timeout | 60 |
| 1:1 call establishment timeout | 85 |
| Group call establishment timeout | 85 |
The following timeouts apply to the Communication Services calling client librar
| Promote 1:1 call to a group call timeout | 115 |
-## API stability expectations
+## API stability expectations
> [!IMPORTANT]
-> This section provides guidance on REST APIs and client libraries marked **stable**. APIs marked pre-release, preview, or beta may be changed or deprecated **without notice**. Currently Azure Communication Services is in a **public preview**, and APIs are marked as such.
+> This section provides guidance on REST APIs and client libraries marked **stable**. APIs marked pre-release, preview, or beta may be changed or deprecated **without notice**.
In the future we may retire versions of the Communication Services client libraries, and we may introduce breaking changes to our REST APIs and released client libraries. Azure Communication Services will *generally* follow two supportability policies for retiring service versions:
For more information, see the following client library overviews:
To get started with Azure Communication Services:

- [Create Azure Communication Resources](../quickstarts/create-communication-resource.md)
- Generate [User Access Tokens](../quickstarts/access-tokens.md)
communication-services Concepts https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony-sms/concepts.md
# SMS concepts

[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]

[!INCLUDE [Regional Availability Notice](../../includes/regional-availability-include.md)]

Azure Communication Services enables you to send and receive SMS text messages using the Communication Services SMS client libraries. These client libraries can be used to support customer service scenarios, appointment reminders, two-factor authentication, and other real-time communication needs. Communication Services SMS allows you to reliably send messages while exposing deliverability and response rate insights surrounding your campaigns.
communication-services Plan Solution https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony-sms/plan-solution.md
Phone number availability is currently restricted to Azure subscriptions that ha
## Number types and features
-Communication Services offers two types of phone numbers: **local** and **toll-free**.
+Communication Services offers two types of phone numbers: **local** and **toll-free**.
### Local numbers
-Local (Geographic) numbers are 10-digit telephone numbers consisting of the local area codes in the United States. For example, `+1 (206) XXX-XXXX` is a local number with an area code of `206`. This area code is assigned to the city of Seattle. These phone numbers are generally used by individuals and local businesses. Azure Communication Services offers local numbers in the United States. These numbers can be used to place phone calls, but not to send SMS messages.
+Local (Geographic) numbers are 10-digit telephone numbers consisting of the local area codes in the United States. For example, `+1 (206) XXX-XXXX` is a local number with an area code of `206`. This area code is assigned to the city of Seattle. These phone numbers are generally used by individuals and local businesses. Azure Communication Services offers local numbers in the United States. These numbers can be used to place phone calls, but not to send SMS messages.
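For illustration of the 10-digit format described above, here is a hypothetical helper (not part of any Communication Services SDK) that pulls the area code out of a US number:

```python
# Hypothetical helper for illustration; not part of any Azure SDK.
def us_area_code(number: str) -> str:
    """Return the 3-digit area code of a US phone number such as +1 (206) XXX-XXXX."""
    digits = "".join(ch for ch in number if ch.isdigit())
    if len(digits) != 11 or not digits.startswith("1"):
        raise ValueError("expected a US number in E.164-like form, e.g. +12065551234")
    return digits[1:4]

print(us_area_code("+1 (206) 555-0100"))  # 206
```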
### Toll-free numbers

Toll-free numbers are 10-digit telephone numbers with distinct area codes that can be called from any phone number free of charge. For example, `+1 (800) XXX-XXXX` is a toll-free number in the North America region. These phone numbers are generally used for customer service purposes. Azure Communication Services offers toll-free numbers in the United States. These numbers can be used to place phone calls and to send SMS messages. Toll-free numbers cannot be used by people and can only be assigned to applications.
Toll-free numbers are 10-digit telephone numbers with distinct area codes that c
If your phone number will be used by an application (for example, to make calls or send messages on behalf of your service), you can select a toll-free or local (geographic) number. You can select a toll-free number if your application is sending SMS messages and/or making calls.
-If your phone number is being used by a person (for example, a user of your calling application), the local (geographic) phone number must be used.
+If your phone number is being used by a person (for example, a user of your calling application), the local (geographic) phone number must be used.
-The table below summarizes these phone number types:
+The table below summarizes these phone number types:
| Phone number type | Example | Country availability | Phone Number Capability | Common use case |
| -- | - | -- | - | - |
The table below summarizes these phone number types:
| Toll-Free | +1 (toll-free area *code*) XXX XX XX | US | Calling (Outbound), SMS (Inbound/Outbound)| Assigning phone numbers to Interactive Voice Response (IVR) systems/Bots, SMS applications |
-### Phone number features in Azure Communication Services
+### Phone number features in Azure Communication Services
[!INCLUDE [Emergency Calling Notice](../../includes/emergency-calling-notice-include.md)]
For most phone numbers, we allow you to configure an "a la carte" set of feature
The features that are available to you depend on the country that you're operating within, your use case, and the phone number type that you've selected. These features vary by country due to regulatory requirements. Azure Communication Services offers the following phone number features:

- **One-way outbound SMS** This option allows you to send SMS messages to your users. This can be useful in notification and two-factor authentication scenarios.
- **Two-way inbound and outbound SMS** This option allows you to send and receive messages from your users using phone numbers. This can be useful in customer service scenarios.
- **One-way outbound telephone calling** This option allows you to make calls to your users and configure Caller ID for outbound calls placed by your service. This can be useful in customer service and voice notification scenarios.
communication-services Sdk Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony-sms/sdk-features.md
# SMS client library overview [!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]++ [!INCLUDE [Regional Availability Notice](../../includes/regional-availability-include.md)] Azure Communication Services SMS client libraries can be used to add SMS messaging to your applications.
communication-services Sip Interface Infrastructure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/telephony-sms/sip-interface-infrastructure.md
The port range of the Media Processors is shown in the following table:
## Media traffic: Media processors geography
-The media traffic flows via components called media processors. Media processors are placed in the same datacenters as SIP proxies. Also, there are additional media processors to optimize media flow. For example, we do not have a SIP proxy component now in Australia (SIP flows via Singapore or Hong Kong) but we do have the media processor locally in Australia. The need for the media processors locally is dictated by the latency which we experience by sending traffic long-distance, for example from Australia to Singapore or Hong Kong. While latency in the example of traffic flowing from Australia to Hong Kong or Singapore is acceptable to preserve good call quality for SIP traffic, for real-time media traffic it is not.
+The media traffic flows via components called media processors. Media processors are placed in the same datacenters as SIP proxies. There are also additional media processors to optimize media flow. For example, we do not have a SIP proxy component now in Australia (SIP flows via Singapore or Hong Kong SAR), but we do have a media processor locally in Australia. The need for local media processors is dictated by the latency we experience when sending traffic long-distance, for example from Australia to Singapore or Hong Kong SAR. While the latency of traffic flowing from Australia to Hong Kong SAR or Singapore is acceptable for preserving good call quality for SIP traffic, for real-time media traffic it is not.
Locations where both SIP proxy and media processor components are deployed:

- US (two in US West and US East datacenters)
- Europe (Amsterdam and Dublin datacenters)
- Asia (Singapore and Hong Kong SAR datacenters)
- Australia (AU East and Southeast datacenters)

Locations where only media processors are deployed (SIP flows via the closest datacenter listed above):
communication-services Troubleshooting Info https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/troubleshooting-info.md
We encourage developers to submit questions, suggest features, and report proble
To help you troubleshoot certain types of issues, you may be asked for any of the following pieces of information:
-* **MS-CV ID**: This ID is used to troubleshoot calls and messages.
+* **MS-CV ID**: This ID is used to troubleshoot calls and messages.
* **Call ID**: This ID is used to identify Communication Services calls. * **SMS message ID**: This ID is used to identify SMS messages. * **Call logs**: These logs contain detailed information that can be used to troubleshoot calling and network issues.
To help you troubleshoot certain types of issues, you may be asked for any of th
## Access your MS-CV ID
-The MS-CV ID can be accessed by configuring diagnostics in the `clientOptions` object instance when initializing your client libraries. Diagnostics can be configured for any of the Azure client libraries including Chat, Administration, and VoIP calling.
+The MS-CV ID can be accessed by configuring diagnostics in the `clientOptions` object instance when initializing your client libraries. Diagnostics can be configured for any of the Azure client libraries including Chat, Identity, and VoIP calling.
### Client options example

The following code snippets demonstrate diagnostics configuration. When the client libraries are used with diagnostics enabled, diagnostics details will be emitted to the configured event listener:

# [C#](#tab/csharp)
-```
+```
// 1. Import Azure.Core.Diagnostics
using Azure.Core.Diagnostics;
var clientOptions = new ChatClientOptions()
} };
-// 4. Initialize the ChatClient instance with the clientOptions
+// 4. Initialize the ChatClient instance with the clientOptions
ChatClient chatClient = new ChatClient(endpoint, communicationUserCredential, clientOptions);
ChatThreadClient chatThreadClient = await chatClient.CreateChatThreadAsync("Thread Topic", new[] { new ChatThreadMember(communicationUser) });
```

# [Python](#tab/python)
-```
+```
from azure.communication.chat import ChatClient, CommunicationUserCredential

endpoint = "https://communication-services-sdk-live-tests-for-python.communication.azure.com"
chat_client = ChatClient(
When filing a support request through the Azure portal related to calling issues
# [JavaScript](#tab/javascript)

```javascript
-// `call` is an instance of a call created by `callAgent.call` or `callAgent.join` methods
+// `call` is an instance of a call created by `callAgent.call` or `callAgent.join` methods
console.log(call.id)
```

# [iOS](#tab/ios)

```objc
-// The `call id` property can be retrieved by calling the `call.getCallId()` method on a call object after a call ends
+// The `callId` property can be read on a call object after a call ends
-print(call.callId)
+print(call.callId)
```

# [Android](#tab/android)

```java
// The call ID can be retrieved by calling the `call.getCallId()` method on a call object after a call ends
-// `call` is an instance of a call created by `callAgent.call(…)` or `callAgent.join(…)` methods
-Log.d(call.getCallId())
+// `call` is an instance of a call created by `callAgent.call(…)` or `callAgent.join(…)` methods
+Log.d("CallSample", call.getCallId());
```
console.log(result); // your message ID will be in the result
The following code can be used to configure `AzureLogger` to output logs to the console using the JavaScript client library:

```javascript
-import { AzureLogger } from '@azure/logger';
+import { AzureLogger } from '@azure/logger';
-AzureLogger.verbose = (...args) => { console.info(...args); }
-AzureLogger.info = (...args) => { console.info(...args); }
-AzureLogger.warning = (...args) => { console.info(...args); }
-AzureLogger.error = (...args) => { console.info(...args); }
+AzureLogger.verbose = (...args) => { console.info(...args); }
+AzureLogger.info = (...args) => { console.info(...args); }
+AzureLogger.warning = (...args) => { console.info(...args); }
+AzureLogger.error = (...args) => { console.info(...args); }
-callClient = new CallClient({logger: AzureLogger});
+callClient = new CallClient({logger: AzureLogger});
```

# [iOS](#tab/ios)

When developing for iOS, your logs are stored in `.blog` files. Note that you can't view the logs directly because they're encrypted.
-These can be accessed by opening Xcode. Go to Windows > Devices and Simulators > Devices. Select your device. Under Installed Apps, select your application and click on "Download container".
+These can be accessed by opening Xcode. Go to Windows > Devices and Simulators > Devices. Select your device. Under Installed Apps, select your application and click on "Download container".
This will give you an `xcappdata` file. Right-click on this file and select "Show package contents". You'll then see the `.blog` files that you can attach to your Azure support request.
This will give you a `xcappdata` file. Right-click on this file and select ΓÇ£Sh
When developing for Android, your logs are stored in `.blog` files. Note that you can't view the logs directly because they're encrypted.
-On Android Studio, navigate to the Device File Explorer by selecting View > Tool Windows > Device File Explorer from both the simulator and the device. The `.blog` file will be located within your application's directory, which should look something like `/data/data/[app_name_space:com.contoso.com.acsquickstartapp]/files/acs_sdk.blog`. You can attach this file to your support request.
-
+On Android Studio, navigate to the Device File Explorer by selecting View > Tool Windows > Device File Explorer from both the simulator and the device. The `.blog` file will be located within your application's directory, which should look something like `/data/data/[app_name_space:com.contoso.com.acsquickstartapp]/files/acs_sdk.blog`. You can attach this file to your support request.
+
communication-services About Call Types https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/voice-video-calling/about-call-types.md
[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]

You can use Azure Communication Services to make and receive one-to-one or group voice and video calls. Your calls can be made to other Internet-connected devices and to plain-old telephones. You can use the Communication Services JavaScript, Android, or iOS client libraries to build applications that allow your users to speak to one another in private conversations or in group discussions. Azure Communication Services supports calls to and from services or bots.

## Call types in Azure Communication Services

There are multiple types of calls you can make in Azure Communication Services. The type of calls that you make determines your signaling schema, media traffic flows, and pricing model.
-### Voice Over IP (VoIP)
+### Voice Over IP (VoIP)
When a user of your application calls another user of your application over an internet or data connection, the call is made over Voice Over IP (VoIP). In this case, both signaling and media flow over the internet.
A one-to-one call on Azure Communication Services happens when one of your users
A group call on Azure Communication Services happens when three or more participants connect to one another. Any combination of VoIP and PSTN-connected users can be present on a group call. A one-to-one call can be converted into a group call by adding more participants to the call. One of those participants can be a bot. ### Supported video standards
-We support H.264 (MPEG-4)
+We support H.264 (MPEG-4).
### Video quality
-We support up to Full HD 1080p on the native (iOS, Android) SDKs. For Web (JS) SDK we support Standard HD 720p. The quality depends on the available bandwidth.
+We support up to Full HD 1080p on the native (iOS, Android) SDKs. For the Web (JS) SDK, we support Standard HD 720p. The quality depends on the available bandwidth.
### Rooms concept Rooms are a set of APIs and SDKs that allow you to easily add audio, video, screen sharing, PSTN and SMS interactions to your website or native application.
-During the preview you can use the group ID to join the same conversation. You can create as many group IDs as you need and separate the users by the "rooms". Moving forward will introduce more controls around "rooms"
+During the preview, you can use the group ID to join the same conversation. You can create as many group IDs as you need and separate users into "rooms". Moving forward, we will introduce more controls around "rooms".
## Next steps
communication-services Calling Sdk Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/concepts/voice-video-calling/calling-sdk-features.md
[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]

There are two separate families of Calling client libraries, for *clients* and *services*. Currently available client libraries are intended for end-user experiences: websites and native apps. The Service client libraries, which are not yet available, will provide access to the raw voice and video data planes, suitable for integration with bots and other services.
The Service client libraries are not yet available, and provide access to the ra
The following list presents the set of features which are currently available in the Azure Communication Services Calling client libraries.
-| Group of features | Capability | JS | Java (Android) | Objective-C (iOS)
+| Group of features | Capability | JS | Java (Android) | Objective-C (iOS)
| -- | - | - | -- | -
-| Core Capabilities | Place a one-to-one call between two users | ✔️ | ✔️ | ✔️
-| | Place a group call with more than two users (up to 350 users) | ✔️ | ✔️ | ✔️
-| | Promote a one-to-one call with two users into a group call with more than two users | ✔️ | ✔️ | ✔️
-| | Join a group call after it has started | ✔️ | ✔️ | ✔️
+| Core Capabilities | Place a one-to-one call between two users | ✔️ | ✔️ | ✔️
+| | Place a group call with more than two users (up to 350 users) | ✔️ | ✔️ | ✔️
+| | Promote a one-to-one call with two users into a group call with more than two users | ✔️ | ✔️ | ✔️
+| | Join a group call after it has started | ✔️ | ✔️ | ✔️
| | Invite another VoIP participant to join an ongoing group call | ✔️ | ✔️ | ✔️
| Mid call control | Turn your video on/off | ✔️ | ✔️ | ✔️
| | Mute/Unmute mic | ✔️ | ✔️ | ✔️
communication-services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/overview.md
> [!IMPORTANT]
> Applications that you build using Azure Communication Services can talk to Microsoft Teams. To learn more, visit our [Teams Interop](./quickstarts/voice-video-calling/get-started-teams-interop.md) documentation.

+ Azure Communication Services allows you to easily add real-time multimedia voice, video, and telephony-over-IP communications features to your applications. The Communication Services client libraries also allow you to add chat and SMS functionality to your communications solutions.

<br>
There are two other Microsoft communication products you may consider leveraging
## Next Steps
+ - [Create a Communication Services resource](./quickstarts/create-communication-resource.md)
communication-services Access Tokens https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/access-tokens.md
Title: Quickstart - Create and manage access tokens
-description: Learn how to manage identities and access tokens using the Azure Communication Services Administration client library.
+description: Learn how to manage identities and access tokens using the Azure Communication Services Identity client library.
zone_pivot_groups: acs-js-csharp-java-python
# Quickstart: Create and manage access tokens
-Get started with Azure Communication Services by using the Communication Services Administration client library. It allows you to create identities and manage your access tokens. Identity is representing entity of your application in the Azure Communication Service (for example, user or device). Access tokens let your Chat and Calling client libraries authenticate directly against Azure Communication Services. We recommend generating access tokens on a server-side service. Access tokens are then used to initialize the Communication Services client libraries on client devices.
+Get started with Azure Communication Services by using the Communication Services Identity client library. It allows you to create identities and manage your access tokens. An identity represents an entity of your application in Azure Communication Services (for example, a user or a device). Access tokens let your Chat and Calling client libraries authenticate directly against Azure Communication Services. We recommend generating access tokens on a server-side service. Access tokens are then used to initialize the Communication Services client libraries on client devices.
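As a sketch of that recommended pattern, the server-side helper below creates an identity and issues it a scoped token through an injected identity client. The `create_user_and_token` method name mirrors the Python Identity client library, but the client object is passed in as an assumption here so the helper can be exercised without a live Azure resource:

```python
def issue_access_token(identity_client, scopes=("chat", "voip")):
    """Create an identity and issue it a scoped access token.

    `identity_client` is assumed to expose create_user_and_token(scopes),
    as the Communication Services Identity client library does; any object
    with that shape works here.
    """
    user, token = identity_client.create_user_and_token(list(scopes))
    # Hand only the token material back to the client device; the user ID
    # can be stored server-side alongside your application's user record.
    return {"user_id": user.properties["id"], "token": token.token}
```

A client device would call an endpoint wrapping this helper, then initialize the Chat or Calling client library with the returned token.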
Any prices seen in images throughout this tutorial are for demonstration purposes only.
In this quickstart, you learned how to:
> [!div class="checklist"]
> * Manage identities
> * Issue access tokens
-> * Use the Communication Services Administration client library
+> * Use the Communication Services Identity client library
> [!div class="nextstepaction"]
You may also want to:
- [Learn about authentication](../concepts/authentication.md)
- [Add chat to your app](./chat/get-started.md)
- [Learn about client and server architecture](../concepts/client-and-server-architecture.md)
-
communication-services Get Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/chat/get-started.md
zone_pivot_groups: acs-js-csharp-java-python-swift-android
[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]

+ Get started with Azure Communication Services by using the Communication Services Chat client library to add real-time chat to your application. In this quickstart, we'll use the Chat client library to create chat threads that allow users to have conversations with one another. To learn more about Chat concepts, visit the [chat conceptual documentation](../../concepts/chat/concepts.md).

::: zone pivot="programming-language-javascript"
communication-services Create Communication Resource https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/create-communication-resource.md
zone_pivot_groups: acs-plat-azp-net

# Quickstart: Create and manage Communication Services resources
-
+ [!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)]
+Get started with Azure Communication Services by provisioning your first Communication Services resource. Communication services resources can be provisioned through the [Azure portal](https://portal.azure.com) or with the .NET management client library. The management client library and the Azure portal allow you to create, configure, update and delete your resources and interface with [Azure Resource Manager](../../azure-resource-manager/management/overview.md), Azure's deployment and management service. All functionality available in the client libraries is available in the Azure portal.
+ > [!WARNING]
+ > Note that while Communication Services is available in multiple geographies, in order to get a phone number the resource must have a data location set to 'US'. Also note that communication resources cannot be transferred to a different subscription during public preview.
Get started with Azure Communication Services by provisioning your first Communi
## Access your connection strings and service endpoints
-Connection strings allow the Communication Services client libraries to connect and authenticate to Azure. You can access your Communication Services connection strings and service endpoints from the Azure portal or programmatically with Azure Resource Manager APIs.
+Connection strings allow the Communication Services client libraries to connect and authenticate to Azure. You can access your Communication Services connection strings and service endpoints from the Azure portal or programmatically with Azure Resource Manager APIs.
After navigating to your Communication Services resource, select **Keys** from the navigation menu and copy the **Connection string** or **Endpoint** values for usage by the Communication Services client libraries. Note that you have access to primary and secondary keys. This can be useful in scenarios where you would like to provide temporary access to your Communication Services resources to a third party or staging environment.
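A connection string bundles the endpoint and an access key in the form `endpoint=...;accesskey=...`. If you ever need the pieces separately (for example, to call the endpoint with an Azure AD credential instead of a key), a small parser like this hypothetical helper does the job:

```python
def parse_connection_string(conn_str):
    """Split an ACS connection string into (endpoint, access_key)."""
    parts = dict(
        segment.split("=", 1)  # split on the first "=" only; keys contain "="
        for segment in conn_str.strip().rstrip(";").split(";")
    )
    return parts["endpoint"], parts["accesskey"]
```

The client libraries accept the full connection string directly, so this is only needed when the components must be handled individually.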
After navigating to your Communication Services resource, select **Keys** from t
You can also access key information using Azure CLI:

```azurecli
-az communication list --resource-group "<resourceGroup>"
+az communication list --resource-group "<resourceGroup>"
az communication list-key --name "<communicationName>" --resource-group "<resourceGroup>"
```
After you add the environment variable, run `source ~/.bash_profile` from your c
If you want to clean up and remove a Communication Services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it.
-If you have any phone numbers assigned to your resource upon resource deletion, the phone numbers will be released from your resource automatically at the same time.
+If you have any phone numbers assigned to your resource upon resource deletion, the phone numbers will be released from your resource automatically at the same time.
## Next steps
communication-services Managed Identity From Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/managed-identity-from-cli.md
+
+ Title: Create an Azure Active Directory managed identity application from the Azure CLI
+
+description: Managed identities let you authorize Azure Communication Services access from applications running in Azure VMs, function apps, and other resources. This quickstart is focused on managing identity using the Azure CLI.
+ Last updated : 02/25/2021
+# Authorize access with managed identity to your communication resource in your development environment
+
+The Azure Identity client library provides Azure Active Directory (Azure AD) token authentication support for the Azure SDK. The latest versions of the Azure Communication Services client libraries for .NET, Java, Python, and JavaScript integrate with the Azure Identity library to provide a simple and secure means to acquire an OAuth 2.0 token for authorization of Azure Communication Services requests.
+
+An advantage of the Azure Identity client library is that it enables you to use the same code to authenticate across multiple services whether your application is running in the development environment or in Azure. The Azure Identity client library authenticates a security principal. When your code is running in Azure, the security principal is a managed identity for Azure resources. In the development environment, the managed identity does not exist, so the client library authenticates either the user or a registered application for testing purposes.
+
+## Prerequisites
+
+ - Azure CLI. [Installation guide](https://docs.microsoft.com/cli/azure/install-azure-cli)
+ - An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free)
+
+## Setting Up
+
+Managed identities should be enabled on the Azure resources that you're authorizing. To learn how to enable managed identities for Azure Resources, see one of these articles:
+
+- [Azure portal](../../active-directory/managed-identities-azure-resources/qs-configure-portal-windows-vm.md)
+- [Azure PowerShell](../../active-directory/managed-identities-azure-resources/qs-configure-powershell-windows-vm.md)
+- [Azure CLI](../../active-directory/managed-identities-azure-resources/qs-configure-cli-windows-vm.md)
+- [Azure Resource Manager template](../../active-directory/managed-identities-azure-resources/qs-configure-template-windows-vm.md)
+- [Azure Resource Manager client libraries](../../active-directory/managed-identities-azure-resources/qs-configure-sdk-windows-vm.md)
+- [App services](../../app-service/overview-managed-identity.md)
+
+## Authenticate a registered application in the development environment
+
+If your development environment does not support single sign-on or login via a web browser, then you can use a registered application to authenticate from the development environment.
+
+### Creating an Azure Active Directory Registered Application
+
+To create a registered application from the Azure CLI, you need to be logged in to the Azure account where you want the operations to take place. To do this, you can use the `az login` command and enter your credentials in the browser. Once you are logged in to your Azure account from the CLI, you can call the `az ad sp create-for-rbac` command to create the registered application.
+
+The following example uses the Azure CLI to create a new registered application:
+
+```azurecli
+az ad sp create-for-rbac --name <application-name>
+```
+
+The `az ad sp create-for-rbac` command will return a list of service principal properties in JSON format. Copy these values so that you can use them to create the necessary environment variables in the next step.
+
+```json
+{
+ "appId": "generated-app-ID",
+ "displayName": "service-principal-name",
+ "name": "http://service-principal-uri",
+ "password": "generated-password",
+ "tenant": "tenant-ID"
+}
+```
+> [!IMPORTANT]
+> Azure role assignments may take a few minutes to propagate.
+
+#### Set environment variables
+
+The Azure Identity client library reads values from three environment variables at runtime to authenticate the application. The following table describes the value to set for each environment variable.
+
+|Environment variable|Value
+|-|-
+|`AZURE_CLIENT_ID`|`appId` value from the generated JSON
+|`AZURE_TENANT_ID`|`tenant` value from the generated JSON
+|`AZURE_CLIENT_SECRET`|`password` value from the generated JSON
+
+> [!IMPORTANT]
+> After you set the environment variables, close and re-open your console window. If you are using Visual Studio or another development environment, you may need to restart it in order for it to register the new environment variables.
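The `EnvironmentCredential` inside `DefaultAzureCredential` reads exactly these three variables. A hypothetical pre-flight check like the one below can fail fast with a clear message instead of an opaque authentication error at the first service call:

```python
import os

REQUIRED_VARS = ("AZURE_CLIENT_ID", "AZURE_TENANT_ID", "AZURE_CLIENT_SECRET")


def check_identity_env(env=None):
    """Return the service principal settings, or raise if any are unset."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise EnvironmentError(
            "Missing environment variables: " + ", ".join(missing)
        )
    return {name: env[name] for name in REQUIRED_VARS}
```

Run a check like this once at application startup, before constructing any credential objects.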
++
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Learn about authentication](../concepts/authentication.md)
+
+You may also want to:
+
+- [Learn more about Azure Identity library](/dotnet/api/overview/azure/identity-readme)
communication-services Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/managed-identity.md
Title: Use managed identities in Communication Services (.NET)
+ Title: Use managed identities in Communication Services
description: Managed identities let you authorize Azure Communication Services access from applications running in Azure VMs, function apps, and other resources.
-Previously updated : 12/04/2020
+Last updated : 2/24/2021
+zone_pivot_groups: acs-js-csharp-java-python
-# Use managed identities (.NET)
+# Use managed identities
+Get started with Azure Communication Services by using managed identities. The Communication Services Identity and SMS client libraries support Azure Active Directory (Azure AD) authentication with [managed identities for Azure resources](../../active-directory/managed-identities-azure-resources/overview.md).
-Get started with Azure Communication Services by using managed identities in .NET. The Communication Services Administration and SMS client libraries support Azure Active Directory (Azure AD) authentication with [managed identities for Azure resources](../../active-directory/managed-identities-azure-resources/overview.md).
+This quickstart shows you how to authorize access to the Identity and SMS client libraries from an Azure environment that supports managed identities. It also describes how to test your code in a development environment.
-This quickstart shows you how to authorize access to the Administration and SMS client libraries from an Azure environment that supports managed identities. It also describes how to test your code in a development environment.
-## Prerequisites
-## Setting Up
-
-### Enable managed identities on a virtual machine or App service
-
-Managed identities should be enabled on the Azure resources that you're authorizing. To learn how to enable managed identities for Azure Resources, see one of these articles:
-- [Azure portal](../../active-directory/managed-identities-azure-resources/qs-configure-portal-windows-vm.md)
-- [Azure PowerShell](../../active-directory/managed-identities-azure-resources/qs-configure-powershell-windows-vm.md)
-- [Azure CLI](../../active-directory/managed-identities-azure-resources/qs-configure-cli-windows-vm.md)
-- [Azure Resource Manager template](../../active-directory/managed-identities-azure-resources/qs-configure-template-windows-vm.md)
-- [Azure Resource Manager client libraries](../../active-directory/managed-identities-azure-resources/qs-configure-sdk-windows-vm.md)
-- [App services](../../app-service/overview-managed-identity.md)
-
-#### Assign Azure roles with the Azure portal
-
-1. Navigate to the Azure portal.
-1. Navigate to the Azure Communication Service resource.
-1. Navigate to Access Control (IAM) menu -> + Add -> Add role assignment.
-1. Select the role "Contributor" (this is the only supported role).
-1. Select "User assigned managed identity" (or a "System assigned managed identity") then select the desired identity. Save your selection.
-
-![Managed identity role](media/managed-identity-assign-role.png)
-
-#### Assign Azure roles with PowerShell
-
-To assign roles and permissions using PowerShell, see [Add or remove Azure role assignments using Azure PowerShell](../../../articles/role-based-access-control/role-assignments-powershell.md)
-
-## Add managed identity to your Communication Services solution
-
-### Install the client library packages
-
-```console
-dotnet add package Azure.Identity
-dotnet add package Azure.Communication.Identity
-dotnet add package Azure.Communication.Sms
-```
-
-### Use the client library packages
-
-Add the following `using` directives to your code to use the Azure Identity and Azure Storage client libraries.
-
-```csharp
-using Azure;
-using Azure.Core;
-using Azure.Identity;
-using Azure.Communication;
-using Azure.Communication.Identity;
-using Azure.Communication.Sms;
-```
-
-The examples below are using the [DefaultAzureCredential](/dotnet/api/azure.identity.defaultazurecredential). This credential is suitable for production and development environments.
-
-### Create an identity and issue a token
-
-The following code example shows how to create a service client object with Azure Active Directory tokens, then use the client to issue a token for a new user:
-
-```csharp
- public async Task<Response<CommunicationUserToken>> CreateIdentityAndIssueTokenAsync(Uri resourceEndpoint)
- {
- TokenCredential credential = new DefaultAzureCredential();
-
- var client = new CommunicationIdentityClient(resourceEndpoint, credential);
- var identityResponse = await client.CreateUserAsync();
- var identity = identityResponse.Value;
-
- var tokenResponse = await client.IssueTokenAsync(identity, scopes: new [] { CommunicationTokenScope.VoIP });
-
- return tokenResponse;
- }
-```
-
-### Send an SMS with Azure Active Directory tokens
-
-The following code example shows how to create a service client object with Azure Active Directory tokens, then use the client to send an SMS message:
-
-```csharp
- public async Task SendSmsAsync(Uri resourceEndpoint, PhoneNumber from, PhoneNumber to, string message)
- {
- TokenCredential credential = new DefaultAzureCredential();
-
- SmsClient smsClient = new SmsClient(resourceEndpoint, credential);
- smsClient.Send(
- from: from,
- to: to,
- message: message,
- new SendSmsOptions { EnableDeliveryReport = true } // optional
- );
- }
-```
## Next steps
-> [!div class="nextstepaction"]
-> [Learn about authentication](../concepts/authentication.md)
-
-You may also want to:
- [Learn more about Azure role-based access control](../../../articles/role-based-access-control/index.yml)
- [Learn more about Azure identity library for .NET](/dotnet/api/overview/azure/identity-readme)
- [Creating user access tokens](../quickstarts/access-tokens.md)
communication-services Get Phone Number https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/telephony-sms/get-phone-number.md
# Quickstart: Get a phone number using the Azure portal

[!INCLUDE [Regional Availability Notice](../../includes/regional-availability-include.md)]

Get started with Azure Communication Services by using the Azure portal to purchase a telephone number.
Navigate to the **Phone Numbers** blade in the resource menu.
:::image type="content" source="../media/manage-phone-azure-portal-phone-page.png" alt-text="Screenshot showing a Communication Services resource's phone page.":::
-Press the **Get** button to launch the wizard. The wizard on the **Phone numbers** blade will walk you through a series of questions that helps you choose the phone number that best fits your scenario.
+Press the **Get** button to launch the wizard. The wizard on the **Phone numbers** blade will walk you through a series of questions that helps you choose the phone number that best fits your scenario.
-You will first need to choose the **Country/region** where you would like to provision the phone number. After selecting the Country/region, you will then need to select the **Use case** which best suites your needs.
+You will first need to choose the **Country/region** where you would like to provision the phone number. After selecting the country/region, you will then need to select the **Use case** that best suits your needs.
:::image type="content" source="../media/manage-phone-azure-portal-get-numbers.png" alt-text="Screenshot showing the Get phone numbers view.":::

### Select your phone number features
-Configuring your phone number is broken down into two steps:
+Configuring your phone number is broken down into two steps:
1. The selection of the [number type](../../concepts/telephony-sms/plan-solution.md#phone-number-types-in-azure-communication-services) 2. The selection of the [number features](../../concepts/telephony-sms/plan-solution.md#phone-number-features-in-azure-communication-services)
communication-services Handle Sms Events https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/telephony-sms/handle-sms-events.md
# Quickstart: Handle SMS events

[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]

[!INCLUDE [Regional Availability Notice](../../includes/regional-availability-include.md)]
-Get started with Azure Communication Services by using Azure Event Grid to handle Communication Services SMS events.
+Get started with Azure Communication Services by using Azure Event Grid to handle Communication Services SMS events.
## About Azure Event Grid

[Azure Event Grid](../../../event-grid/overview.md) is a cloud-based eventing service. In this article, you'll learn how to subscribe to events for [communication service events](../../concepts/event-handling.md), and trigger an event to view the result. Typically, you send events to an endpoint that processes the event data and takes actions. In this article, we'll send the events to a web app that collects and displays the messages.

## Prerequisites

-- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
- An Azure Communication Service resource. Further details can be found in the [Create an Azure Communication Resource](../create-communication-resource.md) quickstart.
- An SMS enabled telephone number. [Get a phone number](./get-phone-number.md).
In the Azure portal:
2. Select the subscription you're using for Event Grid.
3. On the left menu, under **Settings**, select **Resource providers**.
4. Find **Microsoft.EventGrid**.
-5. If not registered, select **Register**.
+5. If not registered, select **Register**.
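The portal steps above also have an Azure CLI equivalent, shown here as an untested fragment (it requires an authenticated `az` session on the target subscription):

```azurecli
# Register the Event Grid resource provider on the current subscription.
az provider register --namespace Microsoft.EventGrid

# Check the registration state; wait until it reports "Registered".
az provider show --namespace Microsoft.EventGrid --query registrationState
```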
It may take a moment for the registration to finish. Select **Refresh** to update the status. When **Status** is **Registered**, you're ready to continue.

### Event Grid Viewer deployment
-For this quickstart, we will use the [Azure Event Grid Viewer Sample](/samples/azure-samples/azure-event-grid-viewer/azure-event-grid-viewer/) to view events in near-real time. This will provide the user with the experience of a real-time feed. In addition, the payload of each event should be available for inspection as well.
+For this quickstart, we will use the [Azure Event Grid Viewer Sample](/samples/azure-samples/azure-event-grid-viewer/azure-event-grid-viewer/) to view events in near-real time. This will provide the user with the experience of a real-time feed. In addition, the payload of each event should be available for inspection as well.
## Subscribe to the SMS events using web hooks
Press **Add Event Subscription** to enter the creation wizard.
On the **Create Event Subscription** page, Enter a **name** for the event subscription.
-You can subscribe to specific events to tell Event Grid which of the SMS events you want to track, and where to send the events. Select the events you'd like to subscribe to from the dropdown menu. For SMS you'll have the option to choose `SMS Received` and `SMS Delivery Report Received`.
+You can subscribe to specific events to tell Event Grid which of the SMS events you want to track, and where to send the events. Select the events you'd like to subscribe to from the dropdown menu. For SMS you'll have the option to choose `SMS Received` and `SMS Delivery Report Received`.
If you're prompted to provide a **System Topic Name**, feel free to provide a unique string. This field has no impact on your experience and is used for internal telemetry purposes.
Check out the full list of [events supported by Azure Communication Services](..
:::image type="content" source="./media/handle-sms-events/select-events-create-eventsub.png" alt-text="Screenshot showing the SMS Received and SMS Delivery Report Received event types being selected.":::
-Select **Web Hook** for **Endpoint type**.
+Select **Web Hook** for **Endpoint type**.
:::image type="content" source="./media/handle-sms-events/select-events-create-linkwebhook.png" alt-text="Screenshot showing the Endpoint Type field being set to Web Hook.":::
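Before Event Grid delivers SMS events to a web hook, it sends a one-time `SubscriptionValidationEvent` that the endpoint must echo back. The sketch below is a hypothetical handler, independent of any web framework, showing the handshake and the filtering of SMS event types:

```python
VALIDATION_EVENT = "Microsoft.EventGrid.SubscriptionValidationEvent"


def handle_event_grid_post(events):
    """Handle one webhook delivery: a JSON array of Event Grid events.

    Returns the validation response for the subscription handshake, or
    the subset of events that are Communication Services SMS events.
    """
    for event in events:
        if event.get("eventType") == VALIDATION_EVENT:
            # Echo the validation code to prove we own this endpoint.
            return {"validationResponse": event["data"]["validationCode"]}
    return [
        e for e in events
        if e.get("eventType", "").startswith("Microsoft.Communication.SMS")
    ]
```

The Event Grid Viewer sample used in this quickstart performs this handshake for you; a custom endpoint must implement it itself.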
If you want to clean up and remove a Communication Services subscription, you ca
In this quickstart, you learned how to consume SMS events. You can receive SMS messages by creating an Event Grid subscription.
-> [!div class="nextstepaction"]
+> [!div class="nextstepaction"]
> [Send SMS](../telephony-sms/send.md)

You may also want to:

- [Learn about event handling concepts](../../concepts/event-handling.md)
+ - [Learn about Event Grid](../../../event-grid/overview.md)
communication-services Send https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/telephony-sms/send.md
zone_pivot_groups: acs-js-csharp-java-python
# Quickstart: Send an SMS message

[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]

> [!IMPORTANT]
> SMS messages can be sent to and received from United States phone numbers. Phone numbers located in other geographies are not yet supported by Communication Services SMS.
> For more information, see **[Phone number types](../../concepts/telephony-sms/plan-solution.md)**.
In this quickstart, you learned how to send SMS messages using Azure Communicati
> [Phone number types](../../concepts/telephony-sms/plan-solution.md)

> [!div class="nextstepaction"]
-> [Learn more about SMS](../../concepts/telephony-sms/concepts.md)
+> [Learn more about SMS](../../concepts/telephony-sms/concepts.md)
communication-services Calling Client Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/calling-client-samples.md
zone_pivot_groups: acs-plat-web-ios-android
[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]

+ Get started with Azure Communication Services by using the Communication Services calling client library to add voice and video calling to your app.

::: zone pivot="platform-web"
If you want to clean up and remove a Communication Services subscription, you ca
For more information, see the following articles: - Check out our [calling hero sample](../../samples/calling-hero-sample.md)-- Learn more about [how calling works](../../concepts/voice-video-calling/about-call-types.md)
+- Learn more about [how calling works](../../concepts/voice-video-calling/about-call-types.md)
communication-services Getting Started With Calling https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/getting-started-with-calling.md
Title: Quickstart - Add voice calling to your app
+ Title: Quickstart - Add voice calling to your app
description: In this quickstart, you'll learn how to add calling capabilities to your app using Azure Communication Services.
zone_pivot_groups: acs-plat-web-ios-android
[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]
-Get started with Azure Communication Services by using the Communication Services calling client library to add voice and video calling to your app.
+
+Get started with Azure Communication Services by using the Communication Services calling client library to add voice and video calling to your app.
[!INCLUDE [Emergency Calling Notice](../../includes/emergency-calling-notice-include.md)]
communication-services Pstn Call https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/quickstarts/voice-video-calling/pstn-call.md
zone_pivot_groups: acs-plat-web-ios-android
# Quickstart: Call To Phone

[!INCLUDE [Public Preview Notice](../../includes/public-preview-include.md)]
-Get started with Azure Communication Services by using the Communication Services calling client library to add PSTN calling to your app.
+
+Get started with Azure Communication Services by using the Communication Services calling client library to add PSTN calling to your app.
::: zone pivot="platform-web"
[!INCLUDE [Calling with JavaScript](./includes/pstn-call-js.md)]
communication-services Chat Hero Sample https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/samples/chat-hero-sample.md
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)]

<!--
> [!WARNING]
> links to our Hero Sample repo need to be updated when the sample is publicly available.
In this Sample quickstart, we'll learn how the sample works before we run the sa
## Overview
-The sample has both a client-side application and a server-side application. The **client-side application** is a React/Redux web application that uses Microsoft's Fluent UI framework. This application sends requests to an ASP.NET Core **server-side application** that helps the client-side application connect to Azure.
+The sample has both a client-side application and a server-side application. The **client-side application** is a React/Redux web application that uses Microsoft's Fluent UI framework. This application sends requests to an ASP.NET Core **server-side application** that helps the client-side application connect to Azure.
Here's what the sample looks like:

:::image type="content" source="./media/chat/landing-page.png" alt-text="Screenshot showing the sample application's landing page.":::
-When you press the "Start a Chat" button, the web application fetches a user access token from the server-side application. This token is then used to connect the client app to Azure Communication Services. Once the token is retrieved, you'll be prompted to specify your name and emoji that will represent you in chat.
+When you press the "Start a Chat" button, the web application fetches a user access token from the server-side application. This token is then used to connect the client app to Azure Communication Services. Once the token is retrieved, you'll be prompted to specify your name and emoji that will represent you in chat.
:::image type="content" source="./media/chat/pre-chat.png" alt-text="Screenshot showing the application's pre-chat screen.":::
Components of the main chat screen:
- **Main Chat Area**: This is the core chat experience where users can send and receive messages. To send messages, you can use the input area and press enter (or use the send button). Chat messages received are categorized by the sender with the correct name and emoji. You will see two types of notifications in the chat area: 1) typing notifications when a user is typing and 2) sent and read notifications for messages.
- **Header**: This is where the user will see the title of the chat thread and the controls for toggling participant and settings side bars, and a leave button to exit the chat session.
-- **Side Bar**: This is where participants and setting information are shown when toggled using the controls in the header. The participants side bar contains a list of participants in the chat and a link to invite participants to the chat session. The settings side bar allows you to configure the chat thread title.
+- **Side Bar**: This is where participants and setting information are shown when toggled using the controls in the header. The participants side bar contains a list of participants in the chat and a link to invite participants to the chat session. The settings side bar allows you to configure the chat thread title.
Below you'll find more information on prerequisites and steps to set up the sample.
If you want to clean up and remove a Communication Services subscription, you ca
## Next steps
->[!div class="nextstepaction"]
+>[!div class="nextstepaction"]
>[Download the sample from GitHub](https://github.com/Azure-Samples/communication-services-web-chat-hero) For more information, see the following articles:
For more information, see the following articles:
- [Redux](https://redux.js.org/) - Client-side state management - [FluentUI](https://aka.ms/fluent-ui) - Microsoft powered UI library - [React](https://reactjs.org/) - Library for building user interfaces-- [ASP.NET Core](/aspnet/core/introduction-to-aspnet-core?preserve-view=true&view=aspnetcore-3.1) - Framework for building web applications
+- [ASP.NET Core](/aspnet/core/introduction-to-aspnet-core?preserve-view=true&view=aspnetcore-3.1) - Framework for building web applications
communication-services Web Calling Sample https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/samples/web-calling-sample.md
# Get started with the web calling sample
-The web calling sample is a web application that serves as a step-by-step walkthrough of the various capabilities provided by the Communication Services web calling client library.
+The web calling sample is a web application that serves as a step-by-step walkthrough of the various capabilities provided by the Communication Services web calling client library.
This sample was built for developers and makes it very easy for you to get started with Communication Services. Its user interface is divided into multiple sections, each featuring a "Show code" button that allows you to copy code directly from your browser into your own Communication Services application.
This sample was built for developers and makes it very easy for you to get start
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)] + > [!IMPORTANT] > [This sample is available on Github.](https://github.com/Azure-Samples/communication-services-web-calling-tutorial/).
Once the [web calling sample](https://github.com/Azure-Samples/communication-ser
:::image type="content" source="./media/web-calling-tutorial-page-2.png" alt-text="Web calling tutorial 2" lightbox="./media/web-calling-tutorial-page-2.png":::
-## User provisioning and SDK initialization
+## User provisioning and SDK initialization
Click on the "Provisioning user and initialize SDK" to initialize your SDK using a token provisioned by the backend token provisioning service. This backend service is in `/project/webpack.config.js`.
You're now ready to begin placing calls using your Communication Services resour
The Communication Services web calling SDK allows for **1:1**, **1:N**, and **group** calling.
-For 1:1 or 1:N outgoing calls, you can specify multiple Communication Services User Identities to call using comma-separated values. You can can also specify traditional (PSTN) phone numbers to call using comma-separated values.
+For 1:1 or 1:N outgoing calls, you can specify multiple Communication Services User Identities to call using comma-separated values. You can also specify traditional (PSTN) phone numbers to call using comma-separated values.
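If you're wiring up a similar input in your own app, splitting that comma-separated value into identifier objects can be sketched as follows. The `{ communicationUserId }` and `{ phoneNumber }` shapes follow the `@azure/communication-calling` SDK's identifier types; the parsing helper itself is illustrative, not part of the sample:

```javascript
// Illustrative helper: turn a comma-separated input string into the
// identifier objects that the calling SDK's startCall() accepts.
// Entries starting with "+" are treated as PSTN phone numbers;
// everything else is treated as a Communication Services user ID.
function parseCallees(input) {
  return input
    .split(",")
    .map((s) => s.trim())
    .filter(Boolean)
    .map((id) =>
      id.startsWith("+") ? { phoneNumber: id } : { communicationUserId: id }
    );
}

// Example (IDs are placeholders):
const callees = parseCallees("8:acs:resource_user1, +14255550123");
```

When calling a PSTN number, you would also pass your alternate caller ID in the call options, as described below.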
When calling PSTN phone numbers, specify your alternate caller ID. Click on the "Place call" button to place an outgoing call:
This sample also provides code snippets for the following capabilities:
## Next steps
->[!div class="nextstepaction"]
+>[!div class="nextstepaction"]
>[Download the sample from GitHub](https://github.com/Azure-Samples/communication-services-web-calling-tutorial/) For more information, see the following articles:
communication-services Building App Start https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/tutorials/building-app-start.md
module.exports ={
output: { filename:'app.js', path: path.resolve(__dirname, 'dist'),
- }
+ }
} ```
Your file now should look like this:
} ```
-You added the command that can be used from npm.
+You added the command that can be used from npm.
:::image type="content" source="./media/step-one-pic-12.png" alt-text="Screenshot that shows the modification of package.json.":::
This configuration will be merged with `webpack.common.js` (where you specified
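A minimal `webpack.prod.js` along these lines could look like the following. This sketch assumes the `webpack-merge` package (v5+, which exports a named `merge` function); your `webpack.common.js` contents may differ:

```javascript
// webpack.prod.js - production build configuration.
// Merges the shared settings from webpack.common.js and switches
// webpack into production mode (minification, no dev server).
const { merge } = require("webpack-merge");
const common = require("./webpack.common.js");

module.exports = merge(common, {
  mode: "production",
});
```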
In `package.json`, add the following code: ```JavaScript
-"build:prod": "webpack --config webpack.prod.js"
+"build:prod": "webpack --config webpack.prod.js"
``` Your file should look like this:
Your file should look like this:
"scripts": { "test": "echo \"Error: no test specified\" && exit 1", "build:dev": "webpack-dev-server --config webpack.dev.js",
- "build:prod": "webpack --config webpack.prod.js"
+ "build:prod": "webpack --config webpack.prod.js"
}, "keywords": [], "author": "", "license": "ISC", "dependencies": {
- "@azure/communication-calling": "^1.0.0-beta.3",
- "@azure/communication-common": "^1.0.0-beta.3"
+ "@azure/communication-calling": "^1.0.0-beta.6",
+ "@azure/communication-common": "^1.0.0"
}, "devDependencies": { "webpack": "^4.42.0",
The command creates a `dist` folder and a production-ready `app.js` static file
### Deploy your app to Azure Storage
-
+ Copy `https://docsupdatetracker.net/index.html` and `app.css` to the `dist` folder. In the `dist` folder, create a file and name it `404.html`. Copy the following markup into that file:
You might also want to:
- [Add chat to your app](../quickstarts/chat/get-started.md) - [Create user access tokens](../quickstarts/access-tokens.md) - [Learn about client and server architecture](../concepts/client-and-server-architecture.md)-- [Learn about authentication](../concepts/authentication.md)
+- [Learn about authentication](../concepts/authentication.md)
communication-services Hmac Header Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/tutorials/hmac-header-tutorial.md
In this tutorial, you'll learn how to sign an HTTP request with an HMAC signatur
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)] + [!INCLUDE [Sign an HTTP request C#](./includes/hmac-header-csharp.md)] ## Clean up resources
communication-services Trusted Service Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/communication-services/tutorials/trusted-service-tutorial.md
[!INCLUDE [Public Preview Notice](../includes/public-preview-include.md)] + [!INCLUDE [Trusted Service JavaScript](./includes/trusted-service-js.md)] ## Clean up resources
You may also want to:
- [Add chat to your app](../quickstarts/chat/get-started.md) - [Creating user access tokens](../quickstarts/access-tokens.md) - [Learn about client and server architecture](../concepts/client-and-server-architecture.md)-- [Learn about authentication](../concepts/authentication.md)
+- [Learn about authentication](../concepts/authentication.md)
connectors Connectors Sftp Ssh https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/connectors/connectors-sftp-ssh.md
ms.suite: integration
Previously updated : 01/07/2021 Last updated : 03/08/2021 tags: connectors
If your private key is in PuTTY format, which uses the .ppk (PuTTY Private Key)
## Considerations
-This section describes considerations to review for this connector's triggers and actions.
+This section describes considerations to review when you use this connector's triggers and actions.
+
+<a name="different-folders-trigger-processing-file-storage"></a>
+
+### Use different SFTP folders for file upload and processing
+
+On your SFTP server, make sure that you use separate folders for storing uploaded files and for the files that the trigger monitors for processing, which means that you need a way to move files between those folders. Otherwise, the trigger won't fire reliably and might behave unpredictably, for example, by skipping a random number of files.
+
+If this problem happens, remove the files from the folder that the trigger monitors, and use a different folder to store the uploaded files.
<a name="create-file"></a>
container-registry Container Registry Customer Managed Keys https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-customer-managed-keys.md
Title: Encrypt registry with a customer-managed key description: Learn about encryption-at-rest of your Azure container registry, and how to encrypt your Premium registry with a customer-managed key stored in Azure Key Vault Previously updated : 12/03/2020 Last updated : 03/03/2021 # Encrypt registry using a customer-managed key
-When you store images and other artifacts in an Azure container registry, Azure automatically encrypts the registry content at rest with [service-managed keys](../security/fundamentals/encryption-models.md). You can supplement default encryption with an additional encryption layer using a key that you create and manage in Azure Key Vault (a customer-managed key). This article walks you through the steps using the Azure CLI and the Azure portal.
+When you store images and other artifacts in an Azure container registry, Azure automatically encrypts the registry content at rest with [service-managed keys](../security/fundamentals/encryption-models.md). You can supplement default encryption with an additional encryption layer using a key that you create and manage in Azure Key Vault (a customer-managed key). This article walks you through the steps using the Azure CLI, the Azure portal, or a Resource Manager template.
Server-side encryption with customer-managed keys is supported through integration with [Azure Key Vault](../key-vault/general/overview.md):
This feature is available in the **Premium** container registry service tier. Fo
* In a registry encrypted with a customer-managed key, run logs for [ACR Tasks](container-registry-tasks-overview.md) are currently retained for only 24 hours. If you need to retain logs for a longer period, see guidance to [export and store task run logs](container-registry-tasks-logs.md#alternative-log-storage).
-> [!NOTE]
-> If access to your Azure key vault is restricted using a virtual network with a [Key Vault firewall](../key-vault/general/network-security.md), extra configuration steps are needed. After creating the registry and enabling the customer-managed key, set up access to the key using the registry's *system-assigned* managed identity, and configure the registry to bypass the Key Vault firewall. Follow the steps in this article first to enable encryption with a customer-managed key, and then see the guidance for [Advanced scenario: Key Vault firewall](#advanced-scenario-key-vault-firewall) later in this article.
+> [!IMPORTANT]
+> If you plan to store the registry encryption key in an existing Azure key vault that denies public access and allows only private endpoint or selected virtual networks, extra configuration steps are needed. See [Advanced scenario: Key Vault firewall](#advanced-scenario-key-vault-firewall) in this article.
## Automatic or manual update of key versions
identityPrincipalID=$(az identity show --resource-group <resource-group-name> --
### Create a key vault
-Create a key vault with [az keyvault create][az-keyvault-create] to store a customer-managed key for registry encryption.
+Create a key vault with [az keyvault create][az-keyvault-create] to store a customer-managed key for registry encryption.
-By default, the **soft delete** setting is automatically enabled in a new key vault. To prevent data loss caused by accidental key or key vault deletions, also enable the **purge protection** setting:
+By default, the **soft delete** setting is automatically enabled in a new key vault. To prevent data loss caused by accidental key or key vault deletions, also enable the **purge protection** setting.
```azurecli az keyvault create --name <key-vault-name> \
Depending on the key used to encrypt the registry, output is similar to:
"keyVaultProperties": { "identity": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "keyIdentifier": "https://myvault.vault.azure.net/keys/myresourcegroup/abcdefg123456789...",
- "versionedKeyIdentifier": "https://myvault.vault.azure.net/keys/myresourcegroup/abcdefg123456789...",
"keyRotationEnabled": true, "lastKeyRotationTimestamp": xxxxxxxx
+ "versionedKeyIdentifier": "https://myvault.vault.azure.net/keys/myresourcegroup/abcdefg123456789...",
}, "status": "enabled" }
Revoking the key effectively blocks access to all registry data, since the regis
## Advanced scenario: Key Vault firewall
-If your Azure key vault is deployed in a virtual network with a Key Vault firewall, perform the following additional steps after enabling customer-managed key encryption in your registry.
+You might want to store the encryption key using an existing Azure key vault configured with a [Key Vault firewall](../key-vault/general/network-security.md), which denies public access and allows only private endpoint or selected virtual networks.
+
+For this scenario, first create a new user-assigned identity, key vault, and container registry encrypted with a customer-managed key, using the [Azure CLI](#enable-customer-managed-keycli), [portal](#enable-customer-managed-keyportal), or [template](#enable-customer-managed-keytemplate). Detailed steps are in preceding sections in this article.
+ > [!NOTE]
+ > The new key vault is deployed outside the firewall. It's only used temporarily to store the customer-managed key.
-1. Configure registry encryption to use the registry's system-assigned identity
-1. Enable the registry to bypass the Key Vault firewall
-1. Rotate the customer-managed key
+After registry creation, continue with the following steps. Details are in the following sections.
-### Configure system-assigned identity
+1. Enable the registry's system-assigned identity.
+1. Grant the system-assigned identity permissions to access keys in the key vault that's restricted with the Key Vault firewall.
+1. Ensure that the Key Vault firewall allows bypass by trusted services. Currently, an Azure container registry can only bypass the firewall when using its system-assigned managed identity.
+1. Rotate the customer-managed key by selecting one in the key vault that's restricted with the Key Vault firewall.
+1. When no longer needed, you may delete the key vault that was created outside the firewall.
-You can configure a registry's system-assigned managed identity to access the key vault for encryption keys. If you're unfamiliar with the different managed identities for Azure resources, see the [overview](../active-directory/managed-identities-azure-resources/overview.md).
-To enable the registry's system-assigned identity in the portal:
+### Step 1 - Enable registry's system-assigned identity
1. In the portal, navigate to your registry. 1. Select **Settings** > **Identity**. 1. Under **System assigned**, set **Status** to **On**. Select **Save**. 1. Copy the **Object ID** of the identity.
-To grant the identity access to your key vault:
+### Step 2 - Grant system-assigned identity access to your key vault
-1. Navigate to your key vault.
+1. In the portal, navigate to your key vault.
1. Select **Settings** > **Access policies > +Add Access Policy**. 1. Select **Key permissions**, and select **Get**, **Unwrap Key**, and **Wrap Key**. 1. Choose **Select principal** and search for the object ID of your system-assigned managed identity, or the name of your registry. 1. Select **Add**, then select **Save**.
-To update the registry's encryption settings to use the identity:
-
-1. In the portal, navigate to your registry.
-1. Under **Settings**, select **Encryption** > **Change key**.
-1. In **Identity**, select **System assigned**, and select **Save**.
-
-### Enable key vault bypass
+### Step 3 - Enable key vault bypass
To access a key vault configured with a Key Vault firewall, the registry must bypass the firewall. Ensure that the key vault is configured to allow access by any [trusted service](../key-vault/general/overview-vnet-service-endpoints.md#trusted-services). Azure Container Registry is one of the trusted services.
To access a key vault configured with a Key Vault firewall, the registry must by
1. Confirm, update, or add virtual network settings. For detailed steps, see [Configure Azure Key Vault firewalls and virtual networks](../key-vault/general/network-security.md). 1. In **Allow Microsoft Trusted Services to bypass this firewall**, select **Yes**.
-### Rotate the customer-managed key
+### Step 4 - Rotate the customer-managed key
-After completing the preceding steps, rotate the key to a new key in the key vault behind a firewall. For steps, see [Rotate key](#rotate-key) in this article.
+After completing the preceding steps, rotate to a key that's stored in the key vault behind a firewall.
+
+1. In the portal, navigate to your registry.
+1. Under **Settings**, select **Encryption** > **Change key**.
+1. In **Identity**, select **System Assigned**.
+1. Select **Select from Key Vault**, and select the name of the key vault that's behind a firewall.
+1. Select an existing key, or **Create new**. Selecting a key without a version enables automatic key rotation.
+1. Complete the key selection and select **Save**.
## Troubleshoot
cosmos-db Cosmos Db Advanced Threat Protection https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/cosmos-db-advanced-threat-protection.md
Use Rest API commands to create, update, or get the Advanced Threat Protection s
Use the following PowerShell cmdlets:
-* [Enable Advanced Threat Protection](/powershell/module/az.security/enable-azsecurityadvancedthreatprotection?viewFallbackFrom=azps-2.4.0)
-* [Get Advanced Threat Protection](/powershell/module/az.security/get-azsecurityadvancedthreatprotection?viewFallbackFrom=azps-2.4.0)
-* [Disable Advanced Threat Protection](/powershell/module/az.security/disable-azsecurityadvancedthreatprotection?viewFallbackFrom=azps-2.4.0)
+* [Enable Advanced Threat Protection](/powershell/module/az.security/enable-azsecurityadvancedthreatprotection)
+* [Get Advanced Threat Protection](/powershell/module/az.security/get-azsecurityadvancedthreatprotection)
+* [Disable Advanced Threat Protection](/powershell/module/az.security/disable-azsecurityadvancedthreatprotection)
### [ARM template](#tab/arm-template)
cosmos-db Distribute Data Globally https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/distribute-data-globally.md
Previously updated : 07/23/2019 Last updated : 01/06/2021 # Distribute your data globally with Azure Cosmos DB
Today's applications are required to be highly responsive and always online. To
Azure Cosmos DB is a globally distributed database system that allows you to read and write data from the local replicas of your database. Azure Cosmos DB transparently replicates the data to all the regions associated with your Cosmos account. Azure Cosmos DB is a globally distributed database service that's designed to provide low latency, elastic scalability of throughput, well-defined semantics for data consistency, and high availability. In short, if your application needs fast response time anywhere in the world, if it's required to be always online, and needs unlimited and elastic scalability of throughput and storage, you should build your application on Azure Cosmos DB.
-You can configure your databases to be globally distributed and available in any of the Azure regions. To lower the latency, place the data close to where your users are. Choosing the required regions depends on the global reach of your application and where your users are located. Cosmos DB transparently replicates the data to all the regions associated with your Cosmos account. It provides a single system image of your globally distributed Azure Cosmos database and containers that your application can read and write to locally.
+You can configure your databases to be globally distributed and available in any of the Azure regions. To lower the latency, place the data close to where your users are. Choosing the required regions depends on the global reach of your application and where your users are located. Cosmos DB transparently replicates the data to all the regions associated with your Cosmos account. It provides a single system image of your globally distributed Azure Cosmos database and containers that your application can read and write to locally.
-With Azure Cosmos DB, you can add or remove the regions associated with your account at any time. Your application doesn't need to be paused or redeployed to add or remove a region. It continues to be highly available all the time because of the multi-homing capabilities that the service natively provides.
+With Azure Cosmos DB, you can add or remove the regions associated with your account at any time. Your application doesn't need to be paused or redeployed to add or remove a region.
:::image type="content" source="./media/distribute-data-globally/deployment-topology.png" alt-text="Highly available deployment topology" border="false":::
With Azure Cosmos DB, you can add or remove the regions associated with your acc
- 99.999% read and write availability all around the world. - Guaranteed reads and writes served in less than 10 milliseconds at the 99th percentile.
-By using the Azure Cosmos DB multi-homing APIs, your application is aware of the nearest region and can send requests to that region. The nearest region is identified without any configuration changes. As you add and remove regions to and from your Azure Cosmos account, your application does not need to be redeployed or paused, it continues to be highly available at all times.
+As you add and remove regions to and from your Azure Cosmos account, your application does not need to be redeployed or paused; it continues to be highly available at all times.
**Build highly responsive apps.** Your application can perform near real-time reads and writes against all the regions you chose for your database. Azure Cosmos DB internally handles the data replication between regions with consistency level guarantees of the level you've selected.
cosmos-db Global Dist Under The Hood https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/global-dist-under-the-hood.md
The semantics of the five consistency models in Cosmos DB are described [here](c
Next learn how to configure global distribution by using the following articles: * [Add/remove regions from your database account](how-to-manage-database-account.md#addremove-regions-from-your-database-account)
-* [How to configure clients for multi-homing](how-to-manage-database-account.md#configure-multiple-write-regions)
* [How to create a custom conflict resolution policy](how-to-manage-conflicts.md#create-a-custom-conflict-resolution-policy)
cosmos-db How To Manage Database Account https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-manage-database-account.md
Previously updated : 09/18/2020 Last updated : 01/06/2021
Please see [Set failover priority with PowerShell](manage-with-powershell.md#mod
The process for performing a manual failover involves changing the account's write region (failover priority = 0) to another region configured for the account. > [!NOTE]
-> Accounts with multiple write regions cannot be manually failed over. For applications using the Azure Cosmos SDK, the SDK will detect when a region becomes unavailable, then redirect automatically to the next closest region if using multi-homing API in the SDK.
+> Accounts with multiple write regions cannot be manually failed over. For applications using the Azure Cosmos SDK, the SDK will detect when a region becomes unavailable, then redirect automatically to the next closest region.
### <a id="enable-manual-failover-via-portal"></a>Azure portal
cosmos-db How To Multi Master https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/how-to-multi-master.md
Previously updated : 09/10/2020 Last updated : 01/06/2021
# Configure multi-region writes in your applications that use Azure Cosmos DB [!INCLUDE[appliesto-sql-api](includes/appliesto-sql-api.md)]
-Once an account has been created with multiple write regions enabled, you must make two changes in your application to the ConnectionPolicy for the DocumentClient to enable the multi-region writes and multi-homing capabilities in Azure Cosmos DB. Within the ConnectionPolicy, set UseMultipleWriteLocations to true and pass the name of the region where the application is deployed to SetCurrentLocation. This will populate the PreferredLocations property based on the geo-proximity from location passed in. If a new region is later added to the account, the application does not have to be updated or redeployed, it will automatically detect the closer region and will auto-home on to it should a regional event occur.
+Once an account has been created with multiple write regions enabled, you must make two changes in your application to the ConnectionPolicy for the Cosmos client to enable multi-region writes in Azure Cosmos DB. Within the ConnectionPolicy, set UseMultipleWriteLocations to true and pass the name of the region where the application is deployed to ApplicationRegion. This will populate the PreferredLocations property based on the geo-proximity of the location passed in. If a new region is later added to the account, the application does not have to be updated or redeployed; it will automatically detect the closer region and will auto-home on to it should a regional event occur.
> [!Note] > Cosmos accounts initially configured with single write region can be configured to multiple write regions with zero down time. To learn more see, [Configure multiple-write regions](how-to-manage-database-account.md#configure-multiple-write-regions)
cosmos-db Mongodb Feature Support 40 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/mongodb-feature-support-40.md
By using the Azure Cosmos DB's API for MongoDB, you can enjoy the benefits of th
The supported operators and any limitations or exceptions are listed below. Any client driver that understands these protocols should be able to connect to Azure Cosmos DB's API for MongoDB. When using Azure Cosmos DB's API for MongoDB accounts, the 3.6+ versions of accounts have the endpoint in the format `*.mongo.cosmos.azure.com` whereas the 3.2 version of accounts has the endpoint in the format `*.documents.azure.com`.
+> [!NOTE]
+> This article only lists the supported server commands and excludes client-side wrapper functions. Client-side wrapper functions such as `deleteMany()` and `updateMany()` internally utilize the `delete()` and `update()` server commands. Functions utilizing supported server commands are compatible with Azure Cosmos DB's API for MongoDB.
+ ## Query language support Azure Cosmos DB's API for MongoDB provides comprehensive support for MongoDB query language constructs. Below you can find the detailed list of currently supported operations, operators, stages, commands, and options.
cosmos-db Sql Api Query Metrics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/sql-api-query-metrics.md
Previously updated : 05/23/2019 Last updated : 01/06/2021
The following are the most common factors that impact Azure Cosmos DB query perf
| Provisioned throughput | Measure RU per query, and ensure that you have the required provisioned throughput for your queries. | | Partitioning and partition keys | Favor queries with the partition key value in the filter clause for low latency. | | SDK and query options | Follow SDK best practices like direct connectivity, and tune client-side query execution options. |
-| Network latency | Account for network overhead in measurement, and use multi-homing APIs to read from the nearest region. |
| Indexing Policy | Ensure that you have the required indexing paths/policy for the query. | | Query execution metrics | Analyze the query execution metrics to identify potential rewrites of query and data shapes. |
cost-management-billing Assign Roles Azure Service Principals https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/assign-roles-azure-service-principals.md
+
+ Title: Assign roles to Azure Enterprise Agreement service principal names
+description: This article helps you assign roles to service principal names using PowerShell and REST APIs.
++
+tags: billing
+++ Last updated : 03/07/2021+++
+# Assign roles to Azure Enterprise Agreement service principal names
+
+You can manage your Enterprise Agreement (EA) enrollment in the [Azure Enterprise portal](https://ea.azure.com/). You can create different roles to manage your organization, view costs, and create subscriptions. This article helps you automate some of those tasks using Azure PowerShell and REST APIs with Azure service principal names (SPNs).
+
+Before you begin, ensure that you're familiar with the following articles:
+
+- [Enterprise agreement roles](understand-ea-roles.md)
+- [Sign in with Azure PowerShell](/powershell/azure/authenticate-azureps?view=azps-5.5.0&preserve-view=true)
+- [How to call REST APIs with Postman](/rest/api/azure/#how-to-call-azure-rest-apis-with-postman)
+
+## Create and authenticate your service principal
+
+To automate EA actions using an SPN, you need to create an Azure Active Directory (Azure AD) application that can authenticate in an automated manner. Read the following articles and follow the steps in them to create and authenticate your service principal.
+
+1. [Create a service principal](../../active-directory/develop/howto-create-service-principal-portal.md#register-an-application-with-azure-ad-and-create-a-service-principal)
+2. [Get tenant and app ID values for signing in](../../active-directory/develop/howto-create-service-principal-portal.md#get-tenant-and-app-id-values-for-signing-in)
+
+Here's an example screenshot showing application registration.
++
+### Find your SPN and Tenant ID
+
+You also need the Object ID of the SPN and the Tenant ID of the app. You need the information for permission assignment operations in later sections.
+
+You can find the Object ID of the SPN on the overview page for the application. To find it in the Azure portal, navigate to Azure Active Directory and select **Enterprise applications**. Search for the app.
++
+Select the app. Here's an example showing the Application ID and Object ID.
++
+You can find the Tenant ID on the Microsoft Azure AD Overview page.
++
+The Object ID of the SPN is also referred to as the Principal ID in various locations. The value of your Azure AD tenant ID looks like a GUID with the following format: `11111111-1111-1111-1111-111111111111`.
+
+## Permissions that can be assigned to the SPN
+
+For the next steps, you give permission to the Azure AD app to do actions using an EA role. You can assign only the following roles to the SPN. The role definition ID, exactly as shown, is used later in assignment steps.
+
+| Role | Actions allowed | Role definition ID |
+| | | |
+| EnrollmentReader | Can view usage and charges across all accounts and subscriptions. Can view the Azure Prepayment (previously called monetary commitment) balance associated with the enrollment. | 24f8edb6-1668-4659-b5e2-40bb5f3a7d7e |
+| DepartmentReader | Download the usage details for the department they administer. Can view the usage and charges associated with their department. | db609904-a47f-4794-9be8-9bd86fbffd8a |
+| SubscriptionCreator | Create new subscriptions in the given scope of Account. | a0bcee42-bf30-4d1b-926a-48d21664ef71 |
+
+- The enrollment reader role can be assigned to an SPN only by a user who has the enrollment writer role.
+- The department reader role can be assigned to an SPN only by a user who has the enrollment writer role or the department writer role.
+- The subscription creator role can be assigned to an SPN only by a user who is the Account Owner of the enrollment account.
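If you automate the assignment steps that follow, it helps to keep the role definition IDs from the table in one place. A minimal Python sketch; the names are chosen here for illustration and aren't part of any Azure SDK:

```python
# Role definition IDs exactly as listed in the table above.
EA_ROLE_DEFINITION_IDS = {
    "EnrollmentReader": "24f8edb6-1668-4659-b5e2-40bb5f3a7d7e",
    "DepartmentReader": "db609904-a47f-4794-9be8-9bd86fbffd8a",
    "SubscriptionCreator": "a0bcee42-bf30-4d1b-926a-48d21664ef71",
}

def role_definition_path(billing_account_name: str, role: str) -> str:
    """Build the roleDefinitionId value used in later request bodies."""
    role_id = EA_ROLE_DEFINITION_IDS[role]
    return (f"/providers/Microsoft.Billing/billingAccounts/"
            f"{billing_account_name}/billingRoleDefinitions/{role_id}")

print(role_definition_path("1234567", "EnrollmentReader"))
```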
+
+## Assign enrollment account role permission to the SPN
+
+Read the [Role Assignments - Put](/rest/api/billing/2019-10-01-preview/roleassignments/put) REST API article.
+
+While reading the article, select **Try it** to get started using the SPN.
+Sign in with your account into the tenant that has access to the enrollment where you want to assign access.
+
+Provide the following parameters as part of the API request.
+
+**billingAccountName**
+
+The parameter is the Billing account ID. You can find it in the Azure portal on the Cost Management + Billing overview page.
+**billingRoleAssignmentName**
+
+The parameter is a unique GUID that you need to provide. You can generate a GUID using the [New-Guid](/powershell/module/microsoft.powershell.utility/new-guid?view=powershell-7.1&preserve-view=true) PowerShell command.
+
+Or, you can use the [Online GUID / UUID Generator](https://guidgenerator.com/) website to generate a unique GUID.
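If you're scripting instead, Python's standard `uuid` module generates the same kind of unique GUID:

```python
import uuid

# One way to generate the unique GUID for billingRoleAssignmentName,
# as an alternative to the New-Guid PowerShell command.
billing_role_assignment_name = str(uuid.uuid4())
print(billing_role_assignment_name)
```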
+
+**api-version**
+
+Use the **2019-10-01-preview** version.
+
+The request body has JSON code that you need to use.
+
+Use the sample request body at [Role Assignments - Put - Examples](/rest/api/billing/2019-10-01-preview/roleassignments/put#examples).
+
+There are three parameters that you need to use as part of the JSON.
+
+| Parameter | Where to find it |
+| --- | --- |
+| properties.principalId | See [Find your SPN and Tenant ID](#find-your-spn-and-tenant-id). |
+| properties.principalTenantId | See [Find your SPN and Tenant ID](#find-your-spn-and-tenant-id). |
+| properties.roleDefinitionId | "/providers/Microsoft.Billing/billingAccounts/{BillingAccountName}/billingRoleDefinitions/24f8edb6-1668-4659-b5e2-40bb5f3a7d7e" |
+
+The Billing Account name is the same parameter that you used in the API parameters. It's the enrollment ID that you see in the EA portal and Azure portal.
+
+Notice that `24f8edb6-1668-4659-b5e2-40bb5f3a7d7e` is the billing role definition ID for the EnrollmentReader role.
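Putting the pieces together, here's a hedged Python sketch that builds the request URL and body for this call. The `management.azure.com` endpoint shape is an assumption based on the Role Assignments - Put reference, and the account, principal, and tenant values are placeholders you must replace:

```python
import json
import uuid

# Placeholder values; replace with your own enrollment and principal details.
billing_account_name = "1234567"                              # enrollment (billing account) ID
assignment_name = str(uuid.uuid4())                           # the unique GUID you provide
principal_id = "00000000-0000-0000-0000-000000000001"         # SPN object ID
principal_tenant_id = "11111111-1111-1111-1111-111111111111"  # tenant ID

# Assumed endpoint shape, following the Role Assignments - Put reference.
url = (
    "https://management.azure.com/providers/Microsoft.Billing/"
    f"billingAccounts/{billing_account_name}/billingRoleAssignments/"
    f"{assignment_name}?api-version=2019-10-01-preview"
)

# Request body with the three parameters from the table above.
body = {
    "properties": {
        "principalId": principal_id,
        "principalTenantId": principal_tenant_id,
        "roleDefinitionId": (
            f"/providers/Microsoft.Billing/billingAccounts/"
            f"{billing_account_name}/billingRoleDefinitions/"
            "24f8edb6-1668-4659-b5e2-40bb5f3a7d7e"  # EnrollmentReader
        ),
    }
}
print(url)
print(json.dumps(body, indent=2))
```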
+
+Select **Run** to start the command.
+A `200 OK` response shows that the SPN was successfully added.
+
+Now you can use the SPN (Azure AD App with the object ID) to access EA APIs in an automated manner. The SPN has the EnrollmentReader role.
+
+## Assign the department reader role to the SPN
+
+Before you begin, read the [Enrollment Department Role Assignments - Put](/rest/api/billing/2019-10-01-preview/enrollmentdepartmentroleassignments/put) REST API article.
+
+While reading the article, select **Try it**.
+Sign in with your account into the tenant that has access to the enrollment where you want to assign access.
+
+Provide the following parameters as part of the API request.
+
+**billingAccountName**
+
+It's the Billing account ID. You can find it in the Azure portal on the Cost Management + Billing overview page.
+**billingRoleAssignmentName**
+
+The parameter is a unique GUID that you need to provide. You can generate a GUID using the [New-Guid](/powershell/module/microsoft.powershell.utility/new-guid?view=powershell-7.1&preserve-view=true) PowerShell command.
+
+Or, you can use the [Online GUID / UUID Generator](https://guidgenerator.com/) website to generate a unique GUID.
+
+**departmentName**
+
+It's the Department ID. You can see department IDs in the Azure portal. Navigate to Cost Management + Billing > **Departments**.
+
+For this example, we used the ACE department. The ID for the example is `84819`.
+**api-version**
+
+Use the **2019-10-01-preview** version.
+
+The request body has JSON code that you need to use.
+
+Use the sample at [Enrollment Department Role Assignments - Put](/rest/api/billing/2019-10-01-preview/enrollmentdepartmentroleassignments/put). There are three parameters that you need to use as part of the JSON.
+
+| Parameter | Where to find it |
+| --- | --- |
+| properties.principalId | See [Find your SPN and Tenant ID](#find-your-spn-and-tenant-id). |
+| properties.principalTenantId | See [Find your SPN and Tenant ID](#find-your-spn-and-tenant-id). |
+| properties.roleDefinitionId | "/providers/Microsoft.Billing/billingAccounts/{BillingAccountName}/billingRoleDefinitions/db609904-a47f-4794-9be8-9bd86fbffd8a" |
+
+The Billing Account name is the same parameter that you used in the API parameters. It's the enrollment ID that you see in the EA portal and Azure portal.
+
+The billing role definition ID of `db609904-a47f-4794-9be8-9bd86fbffd8a` is for the DepartmentReader role.
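As a sketch, the department-scoped request path and the roleDefinitionId can be assembled like this in Python. The URI path shape is an assumption based on the Enrollment Department Role Assignments - Put reference; all IDs are placeholders (department ID `84819` matches the example above):

```python
# All IDs are placeholders; department ID 84819 matches the example above.
billing_account_name = "1234567"
department_name = "84819"
assignment_name = "22222222-2222-2222-2222-222222222222"

# Assumed URI path shape for Enrollment Department Role Assignments - Put.
request_path = (
    f"/providers/Microsoft.Billing/billingAccounts/{billing_account_name}"
    f"/departments/{department_name}/billingRoleAssignments/{assignment_name}"
)

# The roleDefinitionId stays at billing-account scope, per the table above.
role_definition_id = (
    f"/providers/Microsoft.Billing/billingAccounts/{billing_account_name}"
    "/billingRoleDefinitions/db609904-a47f-4794-9be8-9bd86fbffd8a"  # DepartmentReader
)
print(request_path)
```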
+
+Select **Run** to start the command.
+A `200 OK` response shows that the SPN was successfully added.
+
+Now you can use the SPN (Azure AD App with the object ID) to access EA APIs in an automated manner. The SPN has the DepartmentReader role.
+
+## Assign the subscription creator role to the SPN
+
+Read the [Enrollment Account Role Assignments - Put](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put) article.
+
+While reading it, select **Try It** to assign the subscription creator role to the SPN.
+Sign in with your account into the tenant that has access to the enrollment where you want to assign access.
+
+Provide the following parameters as part of the API request. Read the article at [Enrollment Account Role Assignments - Put - URI Parameters](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put#uri-parameters).
+
+**billingAccountName**
+
+The parameter is the Billing account ID. You can find it in the Azure portal on the Cost Management + Billing overview page.
+**billingRoleAssignmentName**
+
+The parameter is a unique GUID that you need to provide. You can generate a GUID using the [New-Guid](/powershell/module/microsoft.powershell.utility/new-guid?view=powershell-7.1&preserve-view=true) PowerShell command.
+
+Or, you can use the [Online GUID / UUID Generator](https://guidgenerator.com/) website to generate a unique GUID.
+
+**enrollmentAccountName**
+
+The parameter is the account ID. You can find the account ID for the account name in the Azure portal, in **Cost Management + Billing**, under the enrollment and department scope.
+
+For this example, we used the GTM Test account. The ID is `196987`.
+**api-version**
+
+Use the **2019-10-01-preview** version.
+
+The request body has JSON code that you need to use.
+
+Use the sample at [Enrollment Account Role Assignments - Put - Examples](/rest/api/billing/2019-10-01-preview/enrollmentaccountroleassignments/put#examples).
+
+There are three parameters that you need to use as part of the JSON.
+
+| Parameter | Where to find it |
+| --- | --- |
+| properties.principalId | See [Find your SPN and Tenant ID](#find-your-spn-and-tenant-id). |
+| properties.principalTenantId | See [Find your SPN and Tenant ID](#find-your-spn-and-tenant-id). |
+| properties.roleDefinitionId | "/providers/Microsoft.Billing/billingAccounts/{BillingAccountID}/enrollmentAccounts/196987/billingRoleDefinitions/a0bcee42-bf30-4d1b-926a-48d21664ef71" |
+
+The Billing Account name is the same parameter that you used in the API parameters. It's the enrollment ID that you see in the EA portal and Azure portal.
+
+The billing role definition ID of `a0bcee42-bf30-4d1b-926a-48d21664ef71` is for the SubscriptionCreator role.
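The roleDefinitionId for this call differs from the earlier ones because it includes the enrollment account segment. A small Python sketch with placeholder IDs (enrollment account ID `196987` comes from the example above):

```python
billing_account_name = "1234567"    # placeholder enrollment (billing account) ID
enrollment_account_name = "196987"  # account ID from the example above

# Note the extra enrollmentAccounts segment compared to the earlier roles.
role_definition_id = (
    f"/providers/Microsoft.Billing/billingAccounts/{billing_account_name}"
    f"/enrollmentAccounts/{enrollment_account_name}/billingRoleDefinitions/"
    "a0bcee42-bf30-4d1b-926a-48d21664ef71"  # SubscriptionCreator
)
print(role_definition_id)
```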
+
+Select **Run** to start the command.
+A `200 OK` response shows that the SPN was successfully added.
+
+Now you can use the SPN (Azure AD App with the object ID) to access EA APIs in an automated manner. The SPN has the SubscriptionCreator role.
+
+## Next steps
+
+- Learn more about [Azure EA portal administration](ea-portal-administration.md).
data-factory Connector Dynamics Crm Office 365 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-dynamics-crm-office-365.md
Title: Copy data in Dynamics (Common Data Service)
-description: Learn how to copy data from Microsoft Dynamics CRM or Microsoft Dynamics 365 (Common Data Service) to supported sink data stores or from supported source data stores to Dynamics CRM or Dynamics 365 by using a copy activity in a data factory pipeline.
+description: Learn how to copy data from Microsoft Dynamics CRM or Microsoft Dynamics 365 (Common Data Service/Microsoft Dataverse) to supported sink data stores or from supported source data stores to Dynamics CRM or Dynamics 365 by using a copy activity in a data factory pipeline.
Previously updated : 02/02/2021 Last updated : 03/08/2021
-# Copy data from and to Dynamics 365 (Common Data Service) or Dynamics CRM by using Azure Data Factory
+# Copy data from and to Dynamics 365 (Common Data Service/Microsoft Dataverse) or Dynamics CRM by using Azure Data Factory
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
This connector is supported for the following activities:
- [Copy activity](copy-activity-overview.md) with [supported source and sink matrix](copy-activity-overview.md) - [Lookup activity](control-flow-lookup-activity.md)
-You can copy data from Dynamics 365 (Common Data Service) or Dynamics CRM to any supported sink data store. You also can copy data from any supported source data store to Dynamics 365 (Common Data Service) or Dynamics CRM. For a list of data stores that a copy activity supports as sources and sinks, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
+You can copy data from Dynamics 365 (Common Data Service/Microsoft Dataverse) or Dynamics CRM to any supported sink data store. You also can copy data from any supported source data store to Dynamics 365 (Common Data Service) or Dynamics CRM. For a list of data stores that a copy activity supports as sources and sinks, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
This Dynamics connector supports Dynamics versions 7 through 9 for both online and on-premises. More specifically:
The optimal combination of **writeBatchSize** and **parallelCopies** depends on
] ```
+## Retrieving data from views
+
+To retrieve data from Dynamics views, you need to get the saved query of the view, and use the query to get the data.
+
+There are two entities that store the different types of views: the "saved query" entity stores system views, and the "user query" entity stores user views. To get information about the views, refer to the following FetchXML query, and replace `<TARGETENTITY>` with `savedquery` or `userquery`. Each entity type has more available attributes that you can add to the query based on your needs. Learn more about the [savedquery entity](https://docs.microsoft.com/dynamics365/customer-engagement/web-api/savedquery) and the [userquery entity](https://docs.microsoft.com/dynamics365/customer-engagement/web-api/userquery).
+
+```xml
+<fetch top="5000" >
+ <entity name="<TARGETENTITY>">
+ <attribute name="name" />
+ <attribute name="fetchxml" />
+ <attribute name="returnedtypecode" />
+ <attribute name="querytype" />
+ </entity>
+</fetch>
+```
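If you generate these queries programmatically, simple template substitution works. This is an illustrative Python sketch; `FETCH_TEMPLATE` and `target_entity` are names chosen here, not part of any SDK:

```python
# Illustrative template; the <TARGETENTITY> placeholder from the query
# above becomes a format placeholder.
FETCH_TEMPLATE = """<fetch top="5000" >
  <entity name="{target_entity}">
    <attribute name="name" />
    <attribute name="fetchxml" />
    <attribute name="returnedtypecode" />
    <attribute name="querytype" />
  </entity>
</fetch>"""

system_views_query = FETCH_TEMPLATE.format(target_entity="savedquery")  # system views
user_views_query = FETCH_TEMPLATE.format(target_entity="userquery")     # user views
print(system_views_query)
```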
+
+You can also add filters to narrow down the views. For example, add the following filter to get a view named "My Active Accounts" in the account entity.
+
+```xml
+<filter type="and" >
+ <condition attribute="returnedtypecode" operator="eq" value="1" />
+ <condition attribute="name" operator="eq" value="My Active Accounts" />
+</filter>
+```
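To add such a filter programmatically, you can splice the `<filter>` element into the FetchXML with Python's standard `xml.etree` module; a sketch:

```python
import xml.etree.ElementTree as ET

# Parse a base query, then append the filter under the entity element.
fetch = ET.fromstring(
    '<fetch top="5000">'
    '<entity name="savedquery">'
    '<attribute name="name" /><attribute name="fetchxml" />'
    '</entity>'
    '</fetch>'
)
view_filter = ET.fromstring(
    '<filter type="and">'
    '<condition attribute="returnedtypecode" operator="eq" value="1" />'
    '<condition attribute="name" operator="eq" value="My Active Accounts" />'
    '</filter>'
)
fetch.find("entity").append(view_filter)
print(ET.tostring(fetch, encoding="unicode"))
```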
+ ## Data type mapping for Dynamics When you copy data from Dynamics, the following table shows mappings from Dynamics data types to Data Factory interim data types. To learn how a copy activity maps to a source schema and a data type maps to a sink, see [Schema and data type mappings](copy-activity-schema-and-type-mapping.md).
data-factory Data Flow Expression Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-expression-functions.md
___
Conversion functions are used to convert data and test for data types
-<code>isBoolean</code>
+### <code>isBoolean</code>
<code><b>isBoolean(<value1> : string) => boolean</b></code><br/><br/> Checks if the string value is a boolean value according to the rules of ``toBoolean()`` * ``isBoolean('true') -> true`` * ``isBoolean('no') -> true`` * ``isBoolean('microsoft') -> false``-
-<code>isByte</code>
+___
+### <code>isByte</code>
<code><b>isByte(<value1> : string) => boolean</b></code><br/><br/> Checks if the string value is a byte value given an optional format according to the rules of ``toByte()`` * ``isByte('123') -> true`` * ``isByte('chocolate') -> false``-
-<code>isDate</code>
+___
+### <code>isDate</code>
<code><b>isDate (<value1> : string, [<format>: string]) => boolean</b></code><br/><br/> Checks if the input date string is a date using an optional input date format. Refer to Java's SimpleDateFormat for available formats. If the input date format is omitted, the default format is ``yyyy-[M]M-[d]d``. Accepted formats are ``[ yyyy, yyyy-[M]M, yyyy-[M]M-[d]d, yyyy-[M]M-[d]dT* ]`` * ``isDate('2012-8-18') -> true`` * ``isDate('12/18--234234' -> 'MM/dd/yyyy') -> false``-
-<code>isShort</code>
+___
+### <code>isShort</code>
<code><b>isShort (<value1> : string, [<format>: string]) => boolean</b></code><br/><br/> Checks if the string value is a short value given an optional format according to the rules of ``toShort()`` * ``isShort('123') -> true`` * ``isShort('$123' -> '$###') -> true`` * ``isShort('microsoft') -> false``-
-<code>isInteger</code>
+___
+### <code>isInteger</code>
<code><b>isInteger (<value1> : string, [<format>: string]) => boolean</b></code><br/><br/> Checks if the string value is an integer value given an optional format according to the rules of ``toInteger()`` * ``isInteger('123') -> true`` * ``isInteger('$123' -> '$###') -> true`` * ``isInteger('microsoft') -> false``-
-<code>isLong</code>
+___
+### <code>isLong</code>
<code><b>isLong (<value1> : string, [<format>: string]) => boolean</b></code><br/><br/> Checks if the string value is a long value given an optional format according to the rules of ``toLong()`` * ``isLong('123') -> true`` * ``isLong('$123' -> '$###') -> true`` * ``isLong('gunchus') -> false``-
-<code>isFloat</code>
+___
+### <code>isFloat</code>
<code><b>isFloat (<value1> : string, [<format>: string]) => boolean</b></code><br/><br/> Checks if the string value is a float value given an optional format according to the rules of ``toFloat()`` * ``isFloat('123') -> true`` * ``isFloat('$123.45' -> '$###.00') -> true`` * ``isFloat('icecream') -> false``-
-<code>isDouble</code>
+___
+### <code>isDouble</code>
<code><b>isDouble (<value1> : string, [<format>: string]) => boolean</b></code><br/><br/> Checks if the string value is a double value given an optional format according to the rules of ``toDouble()`` * ``isDouble('123') -> true`` * ``isDouble('$123.45' -> '$###.00') -> true`` * ``isDouble('icecream') -> false``-
-<code>isDecimal</code>
+___
+### <code>isDecimal</code>
<code><b>isDecimal (<value1> : string) => boolean</b></code><br/><br/> Checks if the string value is a decimal value according to the rules of ``toDecimal()`` * ``isDecimal('123.45') -> true`` * ``isDecimal('12/12/2000') -> false``-
-<code>isTimestamp</code>
+___
+### <code>isTimestamp</code>
<code><b>isTimestamp (<value1> : string, [<format>: string]) => boolean</b></code><br/><br/> Checks if the input date string is a timestamp using an optional input timestamp format. Refer to Java's SimpleDateFormat for available formats. If the timestamp format is omitted, the default pattern ``yyyy-[M]M-[d]d hh:mm:ss[.f...]`` is used. You can pass an optional timezone in the form of 'GMT', 'PST', 'UTC', 'America/Cayman'. Timestamp supports up to millisecond accuracy with a value of 999. * ``isTimestamp('2016-12-31 00:12:00') -> true`` * ``isTimestamp('2016-12-31T00:12:00' -> 'yyyy-MM-dd\\'T\\'HH:mm:ss' -> 'PST') -> true`` * ``isTimestamp('2012-8222.18') -> false``-
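To see what these checks do conceptually, here's a rough Python analogue of `isTimestamp()`. Note that the real data flow functions use Java SimpleDateFormat patterns, while this sketch uses Python `strptime` directives, so the format strings differ:

```python
from datetime import datetime

def is_timestamp(value: str, fmt: str = "%Y-%m-%d %H:%M:%S") -> bool:
    """Rough analogue of isTimestamp(): try to parse, report success."""
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

print(is_timestamp("2016-12-31 00:12:00"))  # True
print(is_timestamp("2012-8222.18"))         # False
```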
+___
### <code>toBase64</code> <code><b>toBase64(<i>&lt;value1&gt;</i> : string) => string</b></code><br/><br/> Encodes the given string in base64.
Selects a column value by its relative position(1 based) in the stream. If the p
* ``toBoolean(byName(4))`` * ``toString(byName($colName))`` * ``toString(byPosition(1234))``
+___
+### <code>hex</code>
+<code><b>hex(<i>&lt;value1&gt;</i> : binary) => string</b></code><br/><br/>
+Returns a hex string representation of a binary value.
+* ``hex(toBinary([toByte(0x1f), toByte(0xad), toByte(0xbe)])) -> '1fadbe'``
+___
+### <code>unhex</code>
+<code><b>unhex(<i>&lt;value1&gt;</i> : string) => binary</b></code><br/><br/>
+Unhexes a binary value from its string representation. You can use it in conjunction with sha2 or md5 to convert from a string to a binary representation.
+* ``unhex('1fadbe') -> toBinary([toByte(0x1f), toByte(0xad), toByte(0xbe)])``
+* ``unhex(md5(5, 'gunchus', 8.2, 'bojjus', true, toDate('2010-4-4'))) -> toBinary([toByte(0x4c),toByte(0xe8),toByte(0xa8),toByte(0x80),toByte(0xbd),toByte(0x62),toByte(0x1a),toByte(0x1f),toByte(0xfa),toByte(0xd0),toByte(0xbc),toByte(0xa9),toByte(0x05),toByte(0xe1),toByte(0xbc),toByte(0x5a)])``
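The same round trip can be reproduced in Python with the built-in `bytes.hex()` and `bytes.fromhex()`, which mirror the `hex()`/`unhex()` semantics above:

```python
# Round trip mirroring hex()/unhex() above.
data = bytes([0x1F, 0xAD, 0xBE])

encoded = data.hex()              # like hex(): binary -> hex string
decoded = bytes.fromhex(encoded)  # like unhex(): hex string -> binary

print(encoded)  # 1fadbe
```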
## Window functions The following functions are only available in window transformations.
data-factory Data Flow Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-troubleshoot-guide.md
Title: Troubleshoot mapping data flows
-description: Learn how to troubleshoot data flow issues in Azure Data Factory.
+description: Learn how to troubleshoot data flow problems in Azure Data Factory.
This article explores common troubleshooting methods for mapping data flows in A
### Error code: DF-Executor-SourceInvalidPayload - **Message**: Data preview, debug, and pipeline data flow execution failed because container does not exist-- **Causes**: When dataset contains a container that does not exist in the storage-- **Recommendation**: Make sure that the container referenced in your dataset exists or accessible.
+- **Cause**: A dataset contains a container that doesn't exist in storage.
+- **Recommendation**: Make sure that the container referenced in your dataset exists and can be accessed.
### Error code: DF-Executor-SystemImplicitCartesian - **Message**: Implicit cartesian product for INNER join is not supported, use CROSS JOIN instead. Columns used in join should create a unique key for rows.-- **Causes**: Implicit cartesian product for INNER join between logical plans is not supported. If the columns are used in the join, then the unique key with at least one column from both sides of the relationship is required.-- **Recommendation**: For non-equality based joins you have to opt for CUSTOM CROSS JOIN.
+- **Cause**: Implicit cartesian products for INNER joins between logical plans aren't supported. If you're using columns in the join, create a unique key with at least one column from both sides of the relationship.
+- **Recommendation**: For non-equality based joins, use CUSTOM CROSS join.
### Error code: DF-Executor-SystemInvalidJson - **Message**: JSON parsing error, unsupported encoding or multiline-- **Causes**: Possible issues with the JSON file: unsupported encoding, corrupt bytes, or using JSON source as single document on many nested lines-- **Recommendation**: Verify the JSON file's encoding is supported. On the Source transformation that is using a JSON dataset, expand 'JSON Settings' and turn on 'Single Document'.
+- **Cause**: Possible problems with the JSON file: unsupported encoding, corrupt bytes, or using JSON source as a single document on many nested lines.
+- **Recommendation**: Verify that the JSON file's encoding is supported. On the source transformation that's using a JSON dataset, expand **JSON Settings** and turn on **Single Document**.
### Error code: DF-Executor-BroadcastTimeout - **Message**: Broadcast join timeout error, make sure broadcast stream produces data within 60 secs in debug runs and 300 secs in job runs-- **Causes**: Broadcast has a default timeout of 60 secs in debug runs and 300 seconds in job runs. Stream chosen for broadcast seems too large to produce data within this limit.-- **Recommendation**: Check the Optimize tab on your data flow transformations for Join, Exists, and Lookup. The default option for Broadcast is "Auto". If "Auto" is set, or if you are manually setting the left or right side to broadcast under "Fixed", then you can either set a larger Azure Integration Runtime configuration, or switch off broadcast. The recommended approach for best performance in data flows is to allow Spark to broadcast using "Auto" and use a Memory Optimized Azure IR. If you are executing the data flow in a debug test execution from a debug pipeline run, you may run into this condition more frequently. This is because ADF throttles the broadcast timeout to 60 secs in order to maintain a faster debug experience. If you would like to extend that to the 300-seconds timeout from a triggered run, you can use the Debug > Use Activity Runtime option to utilize the Azure IR defined in your Execute Data Flow pipeline activity.
+- **Cause**: Broadcast has a default timeout of 60 seconds on debug runs and 300 seconds on job runs. The stream chosen for broadcast is too large to produce data within this limit.
+- **Recommendation**: Check the **Optimize** tab on your data flow transformations for join, exists, and lookup. The default option for broadcast is **Auto**. If **Auto** is set, or if you're manually setting the left or right side to broadcast under **Fixed**, you can either set a larger Azure integration runtime (IR) configuration or turn off broadcast. For the best performance in data flows, we recommend that you allow Spark to broadcast by using **Auto** and use a memory-optimized Azure IR.
+
+ If you're running the data flow in a debug test execution from a debug pipeline run, you might run into this condition more frequently. That's because Azure Data Factory throttles the broadcast timeout to 60 seconds to maintain a faster debugging experience. You can extend the timeout to the 300-second timeout of a triggered run. To do so, you can use the **Debug** > **Use Activity Runtime** option to use the Azure IR defined in your Execute Data Flow pipeline activity.
- **Message**: Broadcast join timeout error, you can choose 'Off' of broadcast option in join/exists/lookup transformation to avoid this issue. If you intend to broadcast join option to improve performance then make sure broadcast stream can produce data within 60 secs in debug runs and 300 secs in job runs.-- **Causes**: Broadcast has a default timeout of 60 secs in debug runs and 300 secs in job runs. On broadcast join, the stream chosen for broadcast seems too large to produce data within this limit. If a broadcast join is not used, the default broadcast done by dataflow can reach the same limit-- **Recommendation**: Turn off the broadcast option or avoid broadcasting large data streams where the processing can take more than 60 secs. Choose a smaller stream to broadcast instead. Large SQL/DW tables and source files are typically bad candidates. In the absence of a broadcast join, use a larger cluster if the error occurs.
+- **Cause**: Broadcast has a default timeout of 60 seconds in debug runs and 300 seconds in job runs. On the broadcast join, the stream chosen for broadcast is too large to produce data within this limit. If a broadcast join isn't used, the default broadcast by dataflow can reach the same limit.
+- **Recommendation**: Turn off the broadcast option or avoid broadcasting large data streams for which the processing can take more than 60 seconds. Choose a smaller stream to broadcast. Large Azure SQL Data Warehouse tables and source files aren't typically good choices. In the absence of a broadcast join, use a larger cluster if this error occurs.
### Error code: DF-Executor-Conversion - **Message**: Converting to a date or time failed due to an invalid character-- **Causes**: Data is not in the expected format-- **Recommendation**: Use the correct data type
+- **Cause**: Data isn't in the expected format.
+- **Recommendation**: Use the correct data type.
### Error code: DF-Executor-InvalidColumn - **Message**: Column name needs to be specified in the query, set an alias if using a SQL function-- **Causes**: No column name was specified-- **Recommendation**: Set an alias if using a SQL function such as min()/max(), etc.
+- **Cause**: No column name is specified.
+- **Recommendation**: Set an alias if you're using a SQL function like min() or max().
### Error code: DF-Executor-DriverError - **Message**: INT96 is legacy timestamp type which is not supported by ADF Dataflow. Please consider upgrading the column type to the latest types.-- **Causes**: Driver error-- **Recommendation**: INT96 is legacy timestamp type, which is not supported by ADF Dataflow. Consider upgrading the column type to the latest types.
+- **Cause**: Driver error.
+- **Recommendation**: INT96 is a legacy timestamp type that's not supported by Azure Data Factory data flow. Consider upgrading the column type to the latest type.
### Error code: DF-Executor-BlockCountExceedsLimitError - **Message**: The uncommitted block count cannot exceed the maximum limit of 100,000 blocks. Check blob configuration.-- **Causes**: There can be a maximum of 100,000 uncommitted blocks in a blob.-- **Recommendation**: Contact Microsoft product team regarding this issue for more details
+- **Cause**: The maximum number of uncommitted blocks in a blob is 100,000.
+- **Recommendation**: Contact the Microsoft product team for more details about this problem.
### Error code: DF-Executor-PartitionDirectoryError - **Message**: The specified source path has either multiple partitioned directories (for e.g. <Source Path>/<Partition Root Directory 1>/a=10/b=20, <Source Path>/<Partition Root Directory 2>/c=10/d=30) or partitioned directory with other file or non-partitioned directory (for example <Source Path>/<Partition Root Directory 1>/a=10/b=20, <Source Path>/Directory 2/file1), remove partition root directory from source path and read it through separate source transformation.-- **Causes**: Source path has either multiple partitioned directories or partitioned directory with other file or non-partitioned directory.-- **Recommendation**: Remove partitioned root directory from source path and read it through separate source transformation.-
-### Error code: DF-Executor-OutOfMemoryError
-- **Message**: Cluster ran into out of memory issue during execution, please retry using an integration runtime with bigger core count and/or memory optimized compute type-- **Causes**: Cluster is running out of memory-- **Recommendation**: Debug clusters are meant for development purposes. Leverage data sampling, appropriate compute type, and size to run the payload. Refer to the [mapping data flow performance guide](concepts-data-flow-performance.md) for tuning to achieve best performance.
+- **Cause**: The source path has either multiple partitioned directories or a partitioned directory that has another file or non-partitioned directory.
+- **Recommendation**: Remove the partitioned root directory from the source path and read it through separate source transformation.
### Error code: DF-Executor-InvalidType - **Message**: Please make sure that the type of parameter matches with type of value passed in. Passing float parameters from pipelines isn't currently supported.-- **Causes**: Incompatible data types between declared type and actual parameter value-- **Recommendation**: Please check that your parameter values passed into a data flow match the declared type.
+- **Cause**: The data type for the declared type isn't compatible with the actual parameter value.
+- **Recommendation**: Check that the parameter values passed into the data flow match the declared type.
### Error code: DF-Executor-ColumnUnavailable - **Message**: Column name used in expression is unavailable or invalid-- **Causes**: Invalid or unavailable column name used in expressions-- **Recommendation**: Please check column name(s) used in expressions
+- **Cause**: An invalid or unavailable column name used in an expression.
+- **Recommendation**: Check column names in expressions.
### Error code: DF-Executor-ParseError - **Message**: Expression cannot be parsed-- **Causes**: Expression has parsing errors due to formatting-- **Recommendation**: Please check formatting in expression.
+- **Cause**: An expression generated parsing errors because of incorrect formatting.
+- **Recommendation**: Check the formatting in the expression.
### Error code: DF-Executor-SystemImplicitCartesian - **Message**: Implicit cartesian product for INNER join is not supported, use CROSS JOIN instead. Columns used in join should create a unique key for rows.-- **Causes**: Implicit cartesian product for INNER join between logical plans is not supported. If the columns used in the join creates the unique key-- **Recommendation**: For non-equality based joins you have to opt for CROSS JOIN.
+- **Cause**: Implicit cartesian products for INNER joins between logical plans aren't supported. If you're using columns in the join, create a unique key.
+- **Recommendation**: For non-equality based joins, use CROSS JOIN.
### Error code: DF-Executor-SystemInvalidJson - **Message**: JSON parsing error, unsupported encoding or multiline-- **Causes**: Possible issues with the JSON file: unsupported encoding, corrupt bytes, or using JSON source as single document on many nested lines-- **Recommendation**: Verify the JSON file's encoding is supported. On the Source transformation that is using a JSON dataset, expand 'JSON Settings' and turn on 'Single Document'.
+- **Cause**: Possible problems with the JSON file: unsupported encoding, corrupt bytes, or using JSON source as a single document on many nested lines.
+- **Recommendation**: Verify that the JSON file's encoding is supported. On the source transformation that's using a JSON dataset, expand **JSON Settings** and turn on **Single Document**.
### Error code: DF-Executor-Conversion - **Message**: Converting to a date or time failed due to an invalid character-- **Causes**: Data is not in the expected format-- **Recommendation**: Please use the correct data type.
+- **Cause**: Data isn't in the expected format.
+- **Recommendation**: Use the correct data type.
### Error code: DF-Executor-BlockCountExceedsLimitError - **Message**: The uncommitted block count cannot exceed the maximum limit of 100,000 blocks. Check blob configuration.-- **Causes**: There can be a maximum of 100,000 uncommitted blocks in a blob.-- **Recommendation**: Please contact Microsoft product team regarding this issue for more details
+- **Cause**: The maximum number of uncommitted blocks in a blob is 100,000.
+- **Recommendation**: Contact the Microsoft product team for more details about this problem.
### Error code: DF-Executor-PartitionDirectoryError - **Message**: The specified source path has either multiple partitioned directories (for e.g. *<Source Path>/<Partition Root Directory 1>/a=10/b=20, <Source Path>/<Partition Root Directory 2>/c=10/d=30*) or partitioned directory with other file or non-partitioned directory (for e.g. *<Source Path>/<Partition Root Directory 1>/a=10/b=20, <Source Path>/Directory 2/file1*), remove partition root directory from source path and read it through separate source transformation.-- **Causes**: Source path has either multiple partitioned directories or partitioned directory with other file or non-partitioned directory.-- **Recommendation**: Remove partitioned root directory from source path and read it through separate source transformation.
+- **Cause**: The source path has either multiple partitioned directories or a partitioned directory that has another file or non-partitioned directory.
+- **Recommendation**: Remove the partitioned root directory from the source path and read it through separate source transformation.
### Error code: GetCommand OutputAsync failed - **Message**: During Data Flow debug and data preview: GetCommand OutputAsync failed with ...-- **Causes**: This is a back-end service error. You can retry the operation and also restart your debug session.-- **Recommendation**: If retry and restart do not resolve the issue, contact customer support.
+- **Cause**: This error is a back-end service error.
+- **Recommendation**: Retry the operation and restart your debugging session. If retrying and restarting doesn't resolve the problem, contact customer support.
### Error code: DF-Executor-OutOfMemoryError - **Message**: Cluster ran into out of memory issue during execution, please retry using an integration runtime with bigger core count and/or memory optimized compute type-- **Causes**: Cluster is running out of memory.-- **Recommendation**: Debug clusters are meant for development purposes. Leverage data sampling appropriate compute type and size to run the payload. Refer to [Dataflow Performance Guide](./concepts-data-flow-performance.md) for tuning the dataflows for best performance.
+- **Cause**: The cluster is running out of memory.
+- **Recommendation**: Debug clusters are meant for development. Use data sampling and an appropriate compute type and size to run the payload. For performance tips, see [Mapping data flow performance guide](concepts-data-flow-performance.md).
### Error code: DF-Executor-illegalArgument-- **Message**: Please make sure that the access key in your Linked Service is correct.-- **Causes**: Account Name or Access Key is incorrect.-- **Recommendation**: Please supply right account name or access key. - **Message**: Please make sure that the access key in your Linked Service is correct-- **Causes**: Account Name or Access Key incorrect
+- **Cause**: The account name or access key is incorrect.
- **Recommendation**: Ensure the account name or access key specified in your linked service is correct. ### Error code: DF-Executor-InvalidType - **Message**: Please make sure that the type of parameter matches with type of value passed in. Passing float parameters from pipelines isn't currently supported.-- **Causes**: Incompatible data types between declared type and actual parameter value-- **Recommendation**: Please supply right data types.
+- **Cause**: The data type for the declared type isn't compatible with the actual parameter value.
+- **Recommendation**: Supply the correct data types.
### Error code: DF-Executor-ColumnUnavailable - **Message**: Column name used in expression is unavailable or invalid.-- **Causes**: Invalid or unavailable column name is used in expressions.-- **Recommendation**: Please check column name(s) used in expressions.
+- **Cause**: An invalid or unavailable column name is used in an expression.
+- **Recommendation**: Check the column names used in expressions.
### Error code: DF-Executor-ParseError - **Message**: Expression cannot be parsed.-- **Causes**: Expression has parsing errors due to formatting.-- **Recommendation**: Please check formatting in expression.
+- **Cause**: An expression generated parsing errors because of incorrect formatting.
+- **Recommendation**: Check the formatting in the expression.
### Error code: DF-Executor-OutOfDiskSpaceError - **Message**: Internal server error-- **Causes**: Cluster is running out of disk space.-- **Recommendation**: Please retry the pipeline. If problem persists, contact customer support.
+- **Cause**: The cluster is running out of disk space.
+- **Recommendation**: Retry the pipeline. If doing so doesn't resolve the problem, contact customer support.
### Error code: DF-Executor-StoreIsNotDefined - **Message**: The store configuration is not defined. This error is potentially caused by invalid parameter assignment in the pipeline.-- **Causes**: Undetermined-- **Recommendation**: Please check parameter value assignment in the pipeline. Parameter expression may contain invalid characters.
+- **Cause**: Undetermined.
+- **Recommendation**: Check parameter value assignment in the pipeline. A parameter expression might contain invalid characters.
### Error code: DF-Excel-InvalidConfiguration - **Message**: Excel sheet name or index is required.-- **Causes**: Undetermined-- **Recommendation**: Please check parameter value and specify sheet name or index to read Excel data.
+- **Cause**: Undetermined.
+- **Recommendation**: Check the parameter value. Specify the worksheet name or index for reading Excel data.
- **Message**: Excel sheet name and index cannot exist at the same time.-- **Causes**: Undetermined-- **Recommendation**: Please check parameter value and specify sheet name or index to read Excel data.
+- **Cause**: Undetermined.
+- **Recommendation**: Check the parameter value. Specify the worksheet name or index for reading Excel data.
- **Message**: Invalid range is provided.-- **Causes**: Undetermined-- **Recommendation**: Please check parameter value and specify valid range by reference: [Excel properties](./format-excel.md#dataset-properties).
+- **Cause**: Undetermined.
+- **Recommendation**: Check the parameter value. Specify a valid range. For more information, see [Excel properties](./format-excel.md#dataset-properties).
- **Message**: Invalid excel file is provided while only .xlsx and .xls are supported-- **Causes**: Undetermined-- **Recommendation**: Make sure Excel file extension is either .xlsx or .xls.
+- **Cause**: Undetermined.
+- **Recommendation**: Make sure the Excel file extension is either .xlsx or .xls.
### Error code: DF-Excel-InvalidData - **Message**: Excel worksheet does not exist.-- **Causes**: Undetermined-- **Recommendation**: Please check parameter value and specify valid sheet name or index to read Excel data.
+- **Cause**: Undetermined.
+- **Recommendation**: Check the parameter value. Specify a valid worksheet name or index for reading Excel data.
- **Message**: Reading excel files with different schema is not supported now.-- **Causes**: Undetermined-- **Recommendation**: Use correct Excel file.
+- **Cause**: Undetermined.
+- **Recommendation**: Use Excel files that share the same schema.
- **Message**: Data type is not supported.-- **Causes**: Undetermined-- **Recommendation**: Please use Excel file right data types.
+- **Cause**: Undetermined.
+- **Recommendation**: Use supported Excel file data types.
### Error code: 4502-- **Message**: There are substantial concurrent MappingDataflow executions which are causing failures due to throttling under Integration Runtime.-- **Causes**: A lot of Dataflow Activity runs are going on concurrently on the Integration Runtime. Please learn more about the [Azure Data Factory limits](../azure-resource-manager/management/azure-subscription-service-limits.md#data-factory-limits).-- **Recommendation**: In case you are looking to run more Data flow activities in parallel, please distribute those on multiple integration runtimes.
+- **Message**: There are substantial concurrent MappingDataflow executions that are causing failures due to throttling under Integration Runtime.
+- **Cause**: A large number of Data Flow activity runs are occurring concurrently on the integration runtime. For more information, see [Azure Data Factory limits](../azure-resource-manager/management/azure-subscription-service-limits.md#data-factory-limits).
+- **Recommendation**: If you want to run more Data Flow activities in parallel, distribute them across multiple integration runtimes.
### Error code: InvalidTemplate - **Message**: The pipeline expression cannot be evaluated.-- **Causes**: Pipeline expression passed in the dataflow activity is not being processed correctly because of syntax error.-- **Recommendation**: Please check your activity in activity monitoring to verify the expression.
+- **Cause**: The pipeline expression passed in the Data Flow activity isn't being processed correctly because of a syntax error.
+- **Recommendation**: Check your activity in activity monitoring to verify the expression.
### Error code: 2011 - **Message**: The activity was running on Azure Integration Runtime and failed to decrypt the credential of data store or compute connected via a Self-hosted Integration Runtime. Please check the configuration of linked services associated with this activity, and make sure to use the proper integration runtime type.-- **Causes**: Data flow does not support the linked services with self-hosted integration runtime.-- **Recommendation**: Please configure Data flow to run on integration runtime with 'Managed Virtual Network'.
+- **Cause**: Data flow doesn't support linked services on self-hosted integration runtimes.
+- **Recommendation**: Configure data flow to run on a Managed Virtual Network integration runtime.
## Miscellaneous troubleshooting tips-- **Issue**: Hit unexpected exception and execution failed
+- **Issue**: Unexpected exception occurred and execution failed.
- **Message**: During Data Flow activity execution: Hit unexpected exception and execution failed.
- - **Causes**: This is a back-end service error. You can retry the operation and also restart your debug session.
- - **Recommendation**: If retry and restart do not resolve the issue, contact customer support.
+ - **Cause**: This error is a back-end service error. Retry the operation and restart your debugging session.
+ - **Recommendation**: If retrying and restarting doesn't resolve the problem, contact customer support.
-- **Issue**: Debug data preview No Output Data on Join
+- **Issue**: No output data on join during debug data preview.
- **Message**: There are a high number of null values or missing values which may be caused by having too few rows sampled. Try updating the debug row limit and refreshing the data.
- - **Causes**: Join condition did not match any rows or resulted in high number of NULLs during data preview.
- - **Recommendation**: Go to Debug Settings and increase the number of rows in the source row limit. Make sure that you have select an Azure IR with a large enough data flow cluster to handle more data.
+ - **Cause**: The join condition either didn't match any rows or resulted in a large number of null values during the data preview.
+ - **Recommendation**: In **Debug Settings**, increase the number of rows in the source row limit. Be sure to select an Azure IR that has a data flow cluster that's large enough to handle more data.
-- **Issue**: Validation Error at Source with multiline CSV files
- - **Message**: You might see one of the following error messages:
+- **Issue**: Validation error at source with multiline CSV files.
+ - **Message**: You might see one of these error messages:
- The last column is null or missing. - Schema validation at source fails. - Schema import fails to show correctly in the UX and the last column has a new line character in the name.
- - **Causes**: In the Mapping data flow, currently, the multiline CSV source does not work with the \r\n as row delimiter. Sometimes extra lines at carriage returns break source values.
- - **Recommendation**: Either generate the file at the source with \n as row delimiter rather than \r\n. Or, use Copy Activity to convert CSV file with \r\n to \n as a row delimiter.
+ - **Cause**: In the Mapping data flow, multiline CSV source files don't currently work when \r\n is used as the row delimiter. Sometimes extra lines at carriage returns can cause errors.
+ - **Recommendation**: Generate the file at the source by using \n as the row delimiter rather than \r\n. Or use the Copy activity to convert the CSV file to use \n as a row delimiter.
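The recommendation above can also be applied outside Data Factory. The following is a minimal Python sketch (a hypothetical helper, not the Copy activity itself) that rewrites a CSV file's \r\n row delimiters as \n before the file is handed to a mapping data flow; the file paths are placeholders:

```python
def normalize_row_delimiter(src_path: str, dst_path: str) -> None:
    # Read raw bytes so no implicit newline translation occurs,
    # then replace every \r\n row delimiter with \n.
    with open(src_path, "rb") as src:
        data = src.read()
    with open(dst_path, "wb") as dst:
        dst.write(data.replace(b"\r\n", b"\n"))
```

Note that this also normalizes \r\n sequences inside quoted multiline fields, which is the desired behavior for this scenario.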
## General troubleshooting guidance
-1. Check the status of your dataset connections. In each Source and Sink transformation, visit the Linked Service for each dataset that you are using and test connections.
-2. Check the status of your file and table connections from the data flow designer. Switch on Debug and click on Data Preview on your Source transformations to ensure that you are able to access your data.
-3. If everything looks good from data preview, go into the Pipeline designer and put your data flow in a pipeline activity. Debug the pipeline for an end-to-end test.
+1. Check the status of your dataset connections. In each source and sink transformation, go to the linked service for each dataset that you're using and test the connections.
+2. Check the status of your file and table connections in the data flow designer. In debug mode, select **Data Preview** on your source transformations to ensure that you can access your data.
+3. If everything looks correct in data preview, go to the pipeline designer and put your data flow in a Data Flow activity in a pipeline. Debug the pipeline for an end-to-end test.
## Next steps
-For more help with troubleshooting, try the following resources:
+For more help with troubleshooting, see these resources:
* [Data Factory blog](https://azure.microsoft.com/blog/tag/azure-data-factory/) * [Data Factory feature requests](https://feedback.azure.com/forums/270578-data-factory) * [Azure videos](https://azure.microsoft.com/resources/videos/index/?sort=newest&services=data-factory)
-* [Stack overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)
-* [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
+* [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)
+* [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
data-factory How To Use Sql Managed Instance With Ir https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-use-sql-managed-instance-with-ir.md
You can now move your SQL Server Integration Services (SSIS) projects, packages,
- Inside the same virtual network as the managed instance, with a **different subnet**. - Inside a different virtual network than the managed instance, via virtual network peering (which is limited to the same region due to Global VNet peering constraints) or a connection from virtual network to virtual network.
- For more info on SQL Managed Instance connectivity, see [Connect your application to Azure SQL Managed Instance](https://review.docs.microsoft.com/azure/sql-database/sql-database-managed-instance-connect-app).
+ For more info on SQL Managed Instance connectivity, see [Connect your application to Azure SQL Managed Instance](/azure/sql-database/sql-database-managed-instance-connect-app).
1. [Configure virtual network](#configure-virtual-network).
data-lake-analytics Data Lake Analytics Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-lake-analytics/data-lake-analytics-overview.md
Data Lake Analytics is a cost-effective solution for running big data workloads.
Data Lake Analytics works with Azure Data Lake Storage for the highest performance, throughput, and parallelization and works with Azure Storage blobs, Azure SQL Database, Azure Synapse Analytics.
+## In-region data residency
+
+Data Lake Analytics does not move or store customer data out of the region in which it is deployed.
+ ## Next steps
databox-online Azure Stack Edge Gpu Connect Powershell Interface https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-gpu-connect-powershell-interface.md
Previously updated : 02/22/2021 Last updated : 03/08/2021 # Manage an Azure Stack Edge Pro GPU device via Windows PowerShell
This article focuses on how you can connect to the PowerShell interface of the d
## Connect to the PowerShell interface ## Create a support package [!INCLUDE [Create a support package](../../includes/data-box-edge-gateway-create-support-package.md)]
-<!--## Upload certificate
--
-You can also upload IoT Edge certificates to enable a secure connection between your IoT Edge device and the downstream devices that may connect to it. There are three IoT Edge certificates (*.pem* format) that you need to install:
--- Root CA certificate or the owner CA-- Device CA certificate-- Device key certificate-
-The following example shows the usage of this cmdlet to install IoT Edge certificates:
-
-```
-Set-HcsCertificate -Scope IotEdge -RootCACertificateFilePath "\\hcfs\root-ca-cert.pem" -DeviceCertificateFilePath "\\hcfs\device-ca-cert.pem\" -DeviceKeyFilePath "\\hcfs\device-key-cert.pem" -Credential "username"
-```
-When you run this cmdlet, you will be prompted to provide the password for the network share.
-
-For more information on certificates, go to [Azure IoT Edge certificates](../iot-edge/iot-edge-certs.md) or [Install certificates on a gateway](../iot-edge/how-to-create-transparent-gateway.md).-->
## View device information
If the compute role is configured on your device, you can also get the GPU drive
A Multi-Process Service (MPS) on Nvidia GPUs provides a mechanism where GPUs can be shared by multiple jobs, where each job is allocated some percentage of the GPU's resources. MPS is a preview feature on your Azure Stack Edge Pro GPU device. To enable MPS on your device, follow these steps:
-1. Before you begin, make sure: that
-
- 1. You've configured and [Activated your Azure Stack Edge Pro device](azure-stack-edge-gpu-deploy-activate.md) with an Azure Stack Edge Pro/Data Box Gateway resource in Azure.
- 1. You've [Configured compute on this device in the Azure portal](azure-stack-edge-deploy-configure-compute.md#configure-compute).
-
-1. [Connect to the PowerShell interface](#connect-to-the-powershell-interface).
-1. Use the following command to enable MPS on your device.
- ```powershell
- Start-HcsGpuMPS
- ```
## Reset your device
Id PodSubnet ServiceSubnet
[10.100.10.10]: PS> ``` - ## Debug Kubernetes issues related to IoT Edge
-<!--When the Kubernetes cluster is created, there are two system namespaces created: `iotedge` and `azure-arc`. -->
-
-<!--### Create config file for system namespace
-
-To troubleshoot, first create the `config` file corresponding to the `iotedge` namespace with `aseuser`.
-
-Run the `Get-HcsKubernetesUserConfig -AseUser` command and save the output as `config` file (no file extension). Save the file in the `.kube` folder of your user profile on the local machine.
-
-Following is the sample output of the `Get-HcsKubernetesUserConfig` command.
+Before you begin, you must have:
-```PowerShell
-[10.100.10.10]: PS>Get-HcsKubernetesUserConfig -AseUser
-apiVersion: v1
-clusters:
-- cluster:
- certificate-authority-data: LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUN5RENDQWJDZ0F3SUJBZ0lCQURBTkJna3Foa2lHOXcwQkFRc0ZBREFWTVJNd0VRWURWUVFERXdwcmRXSmwKY201bGRHVnpNQjRYRFRJd01EVXhNekl4TkRRME5sb1hEVE13TURVeE1USXhORFEwTmxvd0ZURVRNQkVHQTFVRQpBeE1LYTNWaVpYSnVaWFJsY3pDQ0FTSXdEUVlKS29aSWh2Y05BUUVCQlFBRGdnRVBBRENDQVFvQ2dnRUJBS0M1CjlJbzRSU2hudG90QUdxdjNTYmRjOVd4UmJDYlRzWXU5S0RQeU9xanVoZE1UUE9PcmROOGNoa0x4NEFyZkZaU1AKZithUmhpdWZqSE56bWhucnkvZlprRGdqQzQzRmV5UHZzcTZXeVVDV0FEK2JBdi9wSkJDbkg2MldoWGNLZ1BVMApqU1k0ZkpXenNFbzBaREhoeUszSGN3MkxkbmdmaEpEanBQRFJBNkRWb2pIaktPb29OT1J1dURvUHpiOTg2dGhUCkZaQXJMZjRvZXRzTEk1ZzFYRTNzZzM1YVhyU0g3N2JPYVVsTGpYTzFYSnpFZlZWZ3BMWE5xR1ZqTXhBMVU2b1MKMXVJL0d1K1ArY
-===========CUT=========================================CUT===================
- server: https://compute.myasegpu1.wdshcsso.com:6443
- name: kubernetes
-contexts:
-- context:
- cluster: kubernetes
- user: aseuser
- name: aseuser@kubernetes
-current-context: aseuser@kubernetes
-kind: Config
-preferences: {}
-users:
-- name: aseuser
- user:
- client-certificate-data: LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMwRENDQWJpZ0F3SUJBZ0lJY1hOTXRPU2VwbG93RFFZSktvWklodmNOQVFFTEJRQXdGVEVUTUJFR0ExVUUKQXhNS2EzVmlaWEp1WlhSbGN6QWVGdzB5TURBMU1UTXlNVFEwTkRaYUZ3MHlNVEExTVRNeU1UVXhNVEphTUJJeApFREFPQmdOVkJBTVRCMkZ6WlhWelpYSXdnZ0VpTUEwR0NTcUdTSWIzRFFFQkFRVUFBNElCRHdBd2dnRUtBb0lCCkFRRHVjQ1pKdm9qNFIrc0U3a1EyYmVjNEJkTXdpUEhmU2R2WnNDVVY0aTRRZGY1Yzd0dkE3OVRSZkRLQTY1d08Kd0h0QWdlK3lLK0hIQ1Qyd09RbWtNek1RNjZwVFEzUlE0eVdtRDZHR1cWZWMExBR1hFUUxWWHRuTUdGCi0tLS0tRU5EIFJTQSBQUklWQVRFIEtFWS0tLS0tCg==
-
-[10.100.10.10]: PS>
-```
>-
-On an Azure Stack Edge Pro device that has the compute role configured, you can troubleshoot or monitor the device using two different set of commands.
+- Compute network configured. See [Tutorial: Configure network for Azure Stack Edge Pro with GPU](azure-stack-edge-gpu-deploy-configure-network-compute-web-proxy.md).
+- Compute role configured on your device.
+
+On an Azure Stack Edge Pro device that has the compute role configured, you can troubleshoot or monitor the device using two different sets of commands.
- Using `iotedge` commands. These commands are available for basic operations for your device. - Using `kubectl` commands. These commands are available for an extensive set of operations for your device.
For a comprehensive list of the `kubectl` commands, go to [`kubectl` cheatsheet]
#### To get IP of service or module exposed outside of Kubernetes cluster
-To get the IP of a load balancing service or modules exposed outside of the Kubernetes, run the following command:
+To get the IP of a load-balancing service or modules exposed outside of the Kubernetes, run the following command:
`kubectl get svc -n iotedge`
To get the logs for a module, run the following command from the PowerShell inte
`kubectl logs <pod_name> -n <namespace> --all-containers`
-Because `all-containers` flag will dumps all the logs for all the containers, a good way to see the recent errors is to use the option `--tail 10`.
+Because the `--all-containers` flag dumps all the logs for all the containers, a good way to see the recent errors is to use the `--tail 10` option.
Following is a sample output.
While changing the memory and processor usage, follow these guidelines.
- Default memory is 25% of device specification. - Default processor count is 30% of device specification.-- When changing the values for memory and processor counts, we recommend that you vary the values between 15% to 65% of the device memory and the processor count. -- We recommend an upper limit of 65% is so that there are enough resources for system components.
+- When changing the values for memory and processor counts, we recommend that you vary the values between 15% and 60% of the device memory and the processor count.
+- We recommend an upper limit of 60% so that there are enough resources for system components.
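The guideline above can be sketched in a few lines. This is an illustrative helper (not a device API) that computes the recommended range for a given device specification, using the 15%–60% bounds stated above:

```python
def recommended_range(device_total: int) -> tuple[int, int]:
    # Recommended values fall between 15% and 60% of the device
    # specification; the 60% ceiling leaves headroom for system components.
    # Integer arithmetic avoids floating-point rounding surprises.
    return device_total * 15 // 100, device_total * 60 // 100
```

For example, a device with 128 GB of memory would get a recommended range of 19 GB to 76 GB.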
## Connect to BMC
databox-online Azure Stack Edge J Series Manage Bandwidth Schedules https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/azure-stack-edge-j-series-manage-bandwidth-schedules.md
Do the following steps in the Azure portal to add a schedule.
1. Provide the **Start day**, **End day**, **Start time**, and **End time** of the schedule. 2. Check the **All day** option if this schedule should run all day.
- 3. **Bandwidth rate** is the bandwidth in Megabits per second (Mbps) used by your device in operations involving the cloud (both uploads and downloads). Supply a number between 20 and 2,147,483,647 for this field.
+ 3. **Bandwidth rate** is the bandwidth in Megabits per second (Mbps) used by your device in operations involving the cloud (both uploads and downloads). Supply a number between 64 and 2,147,483,647 for this field.
4. Select **Unlimited bandwidth** if you do not want to throttle the data upload and download. 5. Select **Add**.
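The bandwidth-rate bounds in step 3 can be expressed as a small validation sketch. This is a hypothetical helper (not an Azure API), assuming the inclusive range given above:

```python
MIN_MBPS = 64
MAX_MBPS = 2_147_483_647  # 2^31 - 1, the upper bound accepted by the portal

def is_valid_bandwidth_rate(mbps: int) -> bool:
    # The schedule accepts a whole number of Mbps within the inclusive range.
    return MIN_MBPS <= mbps <= MAX_MBPS
```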
databox-online Security Baseline https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/databox-online/security-baseline.md
Note that additional permissions might be required to get visibility into worklo
**Guidance**: You can bring in your own applications to run on any locally created virtual machines. Use PowerShell scripts to create local compute virtual machines on your Stack Edge device. We strongly recommend that you bring in only trusted applications to run on the local virtual machines. -- [How to control PowerShell script execution in Windows environment](/powershell/module/microsoft.powershell.security/set-executionpolicy?preserve-view=true&amp;viewFallbackFrom=powershell-6&view=powershell-7.1)
+- [How to control PowerShell script execution in Windows environment](/powershell/module/microsoft.powershell.security/set-executionpolicy)
**Azure Security Center monitoring**: Not applicable
defender-for-iot Architecture Agent Based https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/architecture-agent-based.md
editor: '' ms.devlang: na-+ na Last updated 1/25/2021
defender-for-iot Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/architecture.md
editor: '' ms.devlang: na-+ na Last updated 1/25/2021
defender-for-iot Concept Security Agent Authentication Methods https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/concept-security-agent-authentication-methods.md
This article explains the different authentication methods you can use with the
For each device onboarded to Defender for IoT in the IoT Hub, a security module is required. To authenticate the device, Defender for IoT can use one of two methods. Choose the method that works best for your existing IoT solution.
-> [!div class="checklist"]
-> * SecurityModule option
-> * Device option
+- SecurityModule option
+- Device option
## Authentication methods
defender-for-iot Getting Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/getting-started.md
Title: Getting started
-description: Get started with understanding the basic workflow for Defender for IoT deployment.
+ Title: "Quickstart: Getting started"
+description: In this quickstart, you'll learn the basic workflow for deploying Defender for IoT.
documentationcenter: na
editor: '' ms.devlang: na-+ na Last updated 2/18/2021
-# Get started with Defender for IoT
+# Quickstart: Get started with Defender for IoT
This article provides an overview of the steps you'll take to set up Azure Defender for IoT. The process requires that you:
This article provides an overview of the steps you'll take to set up Azure Defen
- Install the sensor and on-premises management console software. - Perform initial activation of the sensor and management console.
+## Prerequisites
+
+None
+ ## Permission requirements Some of the setup steps require specific user permissions.
The following table describes user access permissions to Azure Defender for IoT
| Update pricing | | Γ£ô | Γ£ô | Γ£ô | | Recover password | Γ£ô | Γ£ô | Γ£ô | Γ£ô |
-## 1. Identify the solution infrastructure
+## Identify the solution infrastructure
**Clarify your network setup needs**
Azure Defender for IoT supports both physical and virtual deployments. For the p
We recommend that you calculate the approximate number of devices that will be monitored. Later, when you register your Azure subscription with the portal, you'll be asked to enter this number. Numbers can be added in intervals of 1,000. The numbers of monitored devices are called *committed devices*.
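The rounding implied above can be sketched as follows. This is an illustrative helper (not part of the Defender for IoT portal), assuming committed devices are entered in blocks of 1,000:

```python
import math

def committed_devices(estimated: int, interval: int = 1000) -> int:
    # Round the estimated monitored-device count up to the next
    # 1,000-device interval accepted by the portal.
    return math.ceil(estimated / interval) * interval
```

For example, an estimate of 1,500 monitored devices would be entered as 2,000 committed devices.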
-## 2. Register with Azure Defender for IoT
+## Register with Azure Defender for IoT
Registration includes:
To register:
For information on how to offboard a subscription, see [Offboard a subscription](how-to-manage-sensors-on-the-cloud.md#offboard-a-subscription).
-## 3. Install and set up the on-premises management console
+## Install and set up the on-premises management console
After you acquire your on-premises management console appliance:
To install and set up:
1. Install the on-premises management console software. For more information, see [Defender for IoT installation](how-to-install-software.md). 1. Activate and set up the management console. For more information, see [Activate and set up your on-premises management console](how-to-activate-and-set-up-your-on-premises-management-console.md).
-## 4. Onboard a sensor
+## Onboard a sensor
Onboard a sensor by registering it with Azure Defender for IoT and downloading a sensor activation file:
Onboard a sensor by registering it with Azure Defender for IoT and downloading a
For more information, see [Onboard and manage sensors in the Defender for IoT portal](how-to-manage-sensors-on-the-cloud.md).
-## 5. Install and set up the sensor
+## Install and set up the sensor
Download the ISO package from the Azure Defender for IoT portal, install the software, and set up the sensor.
Download the ISO package from the Azure Defender for IoT portal, install the sof
1. Install the sensor software. For more information, see [Defender for IoT installation](how-to-install-software.md). 1. Activate and set up your sensor. For more information, see [Sign in and activate a sensor](how-to-activate-and-set-up-your-sensor.md).
-## 6. Connect sensors to an on-premises management console
+## Connect sensors to an on-premises management console
Connect sensors to the management console to ensure that:
We recommend that you group multiple sensors monitoring the same networks in one
For more information, see [Connect sensors to the on-premises management console](how-to-activate-and-set-up-your-on-premises-management-console.md#connect-sensors-to-the-on-premises-management-console).
-## 7. Populate Azure Sentinel with alert information (optional)
+## Populate Azure Sentinel with alert information (optional)
Send alert information to Azure Sentinel by configuring Azure Sentinel. See [Connect your data from Defender for IoT to Azure Sentinel](how-to-configure-with-sentinel.md).
-## See also
--- [Welcome to Azure Defender for IoT](overview.md)
+## Next steps
-- [Azure Defender for IoT architecture](architecture.md)
+> [!div class="nextstepaction"]
+> [Welcome to Azure Defender for IoT](overview.md)
+> [Azure Defender for IoT architecture](architecture.md)
defender-for-iot How To Activate And Set Up Your Sensor https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/how-to-activate-and-set-up-your-sensor.md
You access console tools from the side menu.
|-||| | Support | :::image type="icon" source="media/concept-sensor-console-overview/support-icon-azure.png" border="false"::: | Contact [Microsoft Support](https://support.microsoft.com/) for help. |
-### See also
+## See also
-[Onboard a sensor](getting-started.md#4-onboard-a-sensor)
+[Onboard a sensor](getting-started.md#onboard-a-sensor)
[Manage sensor activation files](how-to-manage-individual-sensors.md#manage-sensor-activation-files)
defender-for-iot How To Azure Rtos Security Module https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/how-to-azure-rtos-security-module.md
description: Learn about how to configure and customize your Security Module for
documentationcenter: na-+ editor: ''
ms.devlang: na
na Previously updated : 09/09/2020- Last updated : 03/07/2021+
-# Configure and customize Security Module for Azure RTOS (preview)
+# Configure and customize Defender-IoT-micro-agent for Azure RTOS GA
-Use this following file to configure your device behavior.
+This article describes how to configure the Defender-IoT-micro-agent for your Azure RTOS device, to meet your network, bandwidth, and memory requirements.
-## azure_iot_security_module/inc/asc_port.h
You must select a target distribution file with a `*.dist` extension from the `netxduo/addons/azure_iot/azure_iot_security_module/configs` directory.
- The default behavior of each configuration is provided in the following tables:
+When using a CMake compilation environment, you must set the command-line parameter `IOT_SECURITY_MODULE_DIST_TARGET` to the chosen value. For example, `-DIOT_SECURITY_MODULE_DIST_TARGET=RTOS_BASE`.
-### General
+In IAR, or another non-CMake compilation environment, you must add the `netxduo/addons/azure_iot/azure_iot_security_module/inc/configs/<target distribution>/` path to any known included paths. For example, `netxduo/addons/azure_iot/azure_iot_security_module/inc/configs/RTOS_BASE`.
+
+Use the following file to configure your device behavior.
+
+**netxduo/addons/azure_iot/azure_iot_security_module/inc/configs/\<target distribution>/asc_config.h**
+
+In a CMake compilation environment, change the default configuration by editing the `netxduo/addons/azure_iot/azure_iot_security_module/configs/<target distribution>.dist` file, using the CMake format `set(ASC_XXX ON)`. In all other environments, edit the `netxduo/addons/azure_iot/azure_iot_security_module/inc/configs/<target distribution>/asc_config.h` file instead, using the format `#define ASC_XXX`.
+
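As an illustration, the `#define` form could look like the following sketch of an `asc_config.h` override. The flag names come from the tables below; the ID string and pending-time value here are made-up examples, not the shipped defaults.

```c
/* Sketch only: example overrides in
 * netxduo/addons/azure_iot/azure_iot_security_module/inc/configs/<target distribution>/asc_config.h
 * The values below are illustrative, not the defaults. */
#define ASC_SECURITY_MODULE_ID "my-rtos-device"  /* unique identifier for this device */
#define ASC_SECURITY_MODULE_PENDING_TIME 600     /* pending time, in seconds */
```

In a CMake compilation environment, the equivalent change would go in the target `.dist` file using the `set(...)` form instead.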
+The default behavior of each configuration is provided in the following tables:
+
+## General
| Name | Type | Default | Details |
| - | - | - | - |
-| ASC_SECURITY_MODULE_ID | String | | Unique identifier of the device |
-| ASC_SECURITY_MODULE_PENDING_TIME | Number | 300 | Security Module pending time in seconds. If the time exceeds state change to suspend. |
+| ASC_SECURITY_MODULE_ID | String | defender-iot-micro-agent | The unique identifier of the device. |
+| SECURITY_MODULE_VERSION_(MAJOR)(MINOR)(PATCH) | Number | 3.2.1 | The version. |
+| ASC_SECURITY_MODULE_SEND_MESSAGE_RETRY_TIME | Number | 3 | The time (in seconds) the Defender-IoT-micro-agent waits before resending a security message after a failure. |
+| ASC_SECURITY_MODULE_PENDING_TIME | Number | 300 | The Defender-IoT-micro-agent pending time (in seconds). The state changes to suspend if this time is exceeded. |
-#### Collection
+## Collection
| Name | Type | Default | Details |
| - | - | - | - |
-| ASC_HIGH_PRIORITY_INTERVAL | Number | 10 | Collectors high priority group interval in seconds. |
-| ASC_MEDIUM_PRIORITY_INTERVAL | Number | 30 | Collectors medium priority group interval in seconds. |
-| ASC_LOW_PRIORITY_INTERVAL | Number | 145,440 | Collectors low priority group interval in seconds. |
+| ASC_FIRST_COLLECTION_INTERVAL | Number | 30 | The collector's startup collection interval offset. During startup, this value is added to the system's collection interval so that multiple devices avoid sending messages simultaneously. |
+| ASC_HIGH_PRIORITY_INTERVAL | Number | 10 | The collector's high priority group interval (in seconds). |
+| ASC_MEDIUM_PRIORITY_INTERVAL | Number | 30 | The collector's medium priority group interval (in seconds). |
+| ASC_LOW_PRIORITY_INTERVAL | Number | 145,440 | The collector's low priority group interval (in seconds). |
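For instance, the priority-group intervals above could be tuned with overrides like the following sketch, assuming the `#define` configuration style described earlier. The numbers are arbitrary examples.

```c
/* Illustrative interval overrides in asc_config.h -- example values only. */
#define ASC_HIGH_PRIORITY_INTERVAL 5     /* collect the high priority group every 5 seconds */
#define ASC_LOW_PRIORITY_INTERVAL 86400  /* collect the low priority group once a day */
```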
#### Collector network activity
To customize your collector network activity configuration, use the following:
| Name | Type | Default | Details |
| - | - | - | - |
-| ASC_COLLECTOR_NETWORK_ACTIVITY_TCP_DISABLED | Boolean | false | Filter `TCP` network activity |
-| ASC_COLLECTOR_NETWORK_ACTIVITY_UDP_DISABLED | Boolean | false | Filter `UDP` network activity events |
-| ASC_COLLECTOR_NETWORK_ACTIVITY_ICMP_DISABLED | Boolean | false | Filter `ICMP` network activity events |
-| ASC_COLLECTOR_NETWORK_ACTIVITY_CAPTURE_UNICAST_ONLY | Boolean | true | Capture unicast incoming packets only, when set to false capture also Broadcast and Multicast |
-| ASC_COLLECTOR_NETWORK_ACTIVITY_MAX_IPV4_OBJECTS_IN_CACHE | Number | 64 | Maximum number of IPv4 network events to store in memory |
-| ASC_COLLECTOR_NETWORK_ACTIVITY_MAX_IPV6_OBJECTS_IN_CACHE | Number | 64 | Maximum number of IPv6 network events to store in memory |
--
-## Compile flags
-Compile flags allows you to override the predefined configurations.
+| ASC_COLLECTOR_NETWORK_ACTIVITY_TCP_DISABLED | Boolean | false | Filters the `TCP` network activity. |
+| ASC_COLLECTOR_NETWORK_ACTIVITY_UDP_DISABLED | Boolean | false | Filters the `UDP` network activity events. |
+| ASC_COLLECTOR_NETWORK_ACTIVITY_ICMP_DISABLED | Boolean | false | Filters the `ICMP` network activity events. |
+| ASC_COLLECTOR_NETWORK_ACTIVITY_CAPTURE_UNICAST_ONLY | Boolean | true | Captures incoming unicast packets only. When set to false, it also captures broadcast and multicast packets. |
+| ASC_COLLECTOR_NETWORK_ACTIVITY_SEND_EMPTY_EVENTS | Boolean | false | Sends empty collector events. |
+| ASC_COLLECTOR_NETWORK_ACTIVITY_MAX_IPV4_OBJECTS_IN_CACHE | Number | 64 | The maximum number of IPv4 network events to store in memory. |
+| ASC_COLLECTOR_NETWORK_ACTIVITY_MAX_IPV6_OBJECTS_IN_CACHE | Number | 64 | The maximum number of IPv6 network events to store in memory. |
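Putting a few of these flags together, a hedged sketch of a network activity configuration might look like the following. The values are illustrative only, and whether a Boolean flag takes an explicit value or is simply defined may vary by target distribution.

```c
/* Illustrative network activity collector tuning in asc_config.h -- example values only. */
#define ASC_COLLECTOR_NETWORK_ACTIVITY_ICMP_DISABLED true             /* filter out ICMP events */
#define ASC_COLLECTOR_NETWORK_ACTIVITY_CAPTURE_UNICAST_ONLY false     /* also capture broadcast and multicast */
#define ASC_COLLECTOR_NETWORK_ACTIVITY_MAX_IPV4_OBJECTS_IN_CACHE 128  /* cache more IPv4 events in memory */
```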
### Collectors

| Name | Type | Default | Details |
| - | - | - | - |
-| collector_heartbeat_enabled | Boolean | ON | Enable the heartbeat collector |
-| collector_network_activity_enabled | Boolean | ON | Enable the network activity collector |
-| collector_system_information_enabled | Boolean | ON | Enable the system information collector |
+| ASC_COLLECTOR_HEARTBEAT_ENABLED | Boolean | ON | Enables the heartbeat collector. |
+| ASC_COLLECTOR_NETWORK_ACTIVITY_ENABLED | Boolean | ON | Enables the network activity collector. |
+| ASC_COLLECTOR_SYSTEM_INFORMATION_ENABLED | Boolean | ON | Enables the system information collector. |
+Other configuration flags are advanced and unsupported. Contact support to change them, or for more information.
+
## Supported security alerts and recommendations
-The Security Module for Azure RTOS supports specific security alerts and recommendations. Make sure to [review and customize the relevant alert and recommendation values](concept-rtos-security-alerts-recommendations.md) for your service.
+The Defender-IoT-micro-agent for Azure RTOS supports specific security alerts and recommendations. Make sure to [review and customize the relevant alert and recommendation values](concept-rtos-security-alerts-recommendations.md) for your service.
## Log Analytics (optional)
-While optional and not required, enabling and configuring Log Analytics can be helpful when you wish to further investigate device events and activities. Read about how to setup and use [Log Analytics with the Defender for IoT service](how-to-security-data-access.md#log-analytics) to learn more.
+You can enable and configure Log Analytics to further investigate device events and activities. To learn more, read about how to set up and use [Log Analytics with the Defender for IoT service](how-to-security-data-access.md#log-analytics).
## Next steps

- Review and customize Security Module for Azure RTOS [security alerts and recommendations](concept-rtos-security-alerts-recommendations.md)
- Refer to the [Security Module for Azure RTOS API](azure-rtos-security-module-api.md) as needed.
defender-for-iot How To Deploy Agent https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/how-to-deploy-agent.md
To learn more, see [Security agent reference architecture](security-agent-archit
Agents are developed as open-source projects, and are available in two flavors: <br> [C](https://aka.ms/iot-security-github-c), and [C#](https://aka.ms/iot-security-github-cs). In this article, you learn how to:-
-> [!div class="checklist"]
-> * Compare security agent flavors
-> * Discover supported agent platforms
-> * Choose the right agent flavor for your solution
+- Compare security agent flavors
+- Discover supported agent platforms
+- Choose the right agent flavor for your solution
## Understand security agent options
defender-for-iot How To Deploy Linux C https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/how-to-deploy-linux-c.md
This guide explains how to install and deploy the Defender for IoT C-based security agent on Linux.
-In this guide, you learn how to:
-
-> [!div class="checklist"]
-> * Install
-> * Verify deployment
-> * Uninstall the agent
-> * Troubleshoot
+- Install
+- Verify deployment
+- Uninstall the agent
+- Troubleshoot
## Prerequisites
defender-for-iot How To Deploy Linux Cs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/how-to-deploy-linux-cs.md
This guide explains how to install and deploy the Defender for IoT C#-based secu
In this guide, you learn how to:
-> [!div class="checklist"]
-> * Install
-> * Verify deployment
-> * Uninstall the agent
-> * Troubleshoot
+- Install
+- Verify deployment
+- Uninstall the agent
+- Troubleshoot
## Prerequisites
defender-for-iot How To Deploy Windows Cs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/how-to-deploy-windows-cs.md
This guide explains how to install the Defender for IoT C#-based security agent
In this guide, you learn how to:
-> [!div class="checklist"]
-> * Install
-> * Verify deployment
-> * Uninstall the agent
-> * Troubleshoot
+- Install
+- Verify deployment
+- Uninstall the agent
+- Troubleshoot
## Prerequisites
defender-for-iot Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/overview.md
ms.devlang: na-+ na Last updated 12/09/2020
defender-for-iot Quickstart Azure Rtos Security Module https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/quickstart-azure-rtos-security-module.md
Title: "Quickstart: Configure and enable the Security Module for Azure RTOS"
-description: Learn how to onboard and enable the Security Module for Azure RTOS service in your Azure IoT Hub.
+description: In this quickstart, you will learn how to onboard and enable the Security Module for Azure RTOS service in your Azure IoT Hub.
documentationcenter: na
The Security Module for Azure RTOS uses Azure IoT Middleware connections based o
Advance to the next article to finish configuring and customizing your solution.

> [!div class="nextstepaction"]
-> [Configure Security Module for Azure RTOS](how-to-azure-rtos-security-module.md)
+> [Configure Security Module for Azure RTOS](how-to-azure-rtos-security-module.md)
defender-for-iot Quickstart Configure Your Solution https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/quickstart-configure-your-solution.md
Title: Add Azure resources to your IoT solution
+ Title: "Quickstart: Add Azure resources to your IoT solution"
description: In this quickstart, learn how to configure your end-to-end IoT solution using Azure Defender for IoT.
This article provides an explanation of how to perform initial configuration of your IoT security solution using Defender for IoT.
+## Prerequisites
+
+None
+
## What is Defender for IoT?

Defender for IoT provides comprehensive end-to-end security for Azure-based IoT solutions.
defender-for-iot Quickstart Create Security Twin https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/quickstart-create-security-twin.md
Title: Create a security module twin
+ Title: "Quickstart: Create a security module twin"
description: In this quickstart, learn how to create a Defender for IoT module twin for use with Azure Defender for IoT.
Last updated 1/21/2021
-# Create an azureiotsecurity module twin
+# Quickstart: Create an azureiotsecurity module twin
This quickstart explains how to create individual _azureiotsecurity_ module twins for new devices, or batch create module twins for all devices in an IoT Hub.
+## Prerequisites
+
+None
+
## Understanding azureiotsecurity module twins

For IoT solutions built in Azure, device twins play a key role in both device management and process automation.
defender-for-iot Quickstart Onboard Iot Hub https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/quickstart-onboard-iot-hub.md
Title: Onboard Defender for IoT to an agent-based solution
-description: Learn how to onboard and enable the Defender for IoT security service in your Azure IoT Hub.
+ Title: "Quickstart: Onboard Defender for IoT to an agent-based solution"
+description: In this quickstart, you will learn how to onboard and enable the Defender for IoT security service in your Azure IoT Hub.
documentationcenter: na
Last updated 1/20/2021
-# Onboard Defender for IoT to an agent-based solution
+# Quickstart: Onboard Defender for IoT to an agent-based solution
This article explains how to enable the Defender for IoT service on your existing IoT Hub. If you don't currently have an IoT Hub, see [Create an IoT Hub using the Azure portal](../iot-hub/iot-hub-create-through-portal.md) to get started.
You can manage your IoT security through the IoT Hub in Defender for IoT. The ma
> [!NOTE]
> Defender for IoT currently only supports standard tier IoT Hubs.
+## Prerequisites
+
+None
+
## Onboard Defender for IoT to an IoT Hub

For all new IoT hubs, Defender for IoT is set to **On** by default. You can verify that Defender for IoT is toggled to **On** during the IoT Hub creation process.
defender-for-iot Security Agent Architecture https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/security-agent-architecture.md
Title: Security agents overview
-description: Understand security agent architecture for the agents used in the Azure Defender for IoT service.
+ Title: "Quickstart: Security agents overview"
+description: In this quickstart, you will learn about the security agent architecture for the agents used in the Azure Defender for IoT service.
documentationcenter: na
editor: ''
ms.devlang: na-+ na Last updated 01/24/2021
-# Security agent reference architecture
+# Quickstart: Security agent reference architecture
Azure Defender for IoT provides r