Updates from: 08/06/2021 03:09:32
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory-b2c Configure Authentication Sample Spa App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/configure-authentication-sample-spa-app.md
You're now ready to test the single-page application's scoped access to the API.
![Screenshot of the SPA sample app displayed in the browser window.](./media/configure-authentication-sample-spa-app/sample-app-sign-in.png)
-1. Sign in by using the email address and password you used in the [previous tutorial](tutorial-single-page-app.md). After you've logged in successfully, you should see the "User \<your username> logged in" message.
+1. Complete the sign-up or sign-in process. After you've logged in successfully, you should see the "User \<your username> logged in" message.
1. Select the **Call API** button. The SPA sends the access token in a request to the protected web API, which returns the display name of the logged-in user: ![Screenshot of the SPA in a browser window, showing the username JSON result that's returned by the API.](./media/configure-authentication-sample-spa-app/sample-app-result.png)
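The call itself follows the usual MSAL.js 2.x pattern: acquire an access token for the API scopes, then send it as a bearer token. The following is a minimal sketch rather than the sample's exact code; `msalInstance`, `account`, and `apiConfig` are assumed to be set up as in the related SPA tutorials.

```javascript
// Minimal sketch (assumptions noted above): acquire an access token silently,
// then call the protected web API with it as a Bearer token.
async function callProtectedApi() {
  const tokenResponse = await msalInstance.acquireTokenSilent({
    scopes: apiConfig.b2cScopes, // for example ["https://<your-tenant>.onmicrosoft.com/helloapi/demo.read"]
    account: account
  });

  const apiResponse = await fetch(apiConfig.webApi, {
    headers: { Authorization: `Bearer ${tokenResponse.accessToken}` }
  });

  console.log(await apiResponse.json()); // the API returns the signed-in user's display name
}
```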
active-directory-b2c Partner Gallery https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/partner-gallery.md
# Azure Active Directory B2C ISV partners
-Our ISV partner network extends our solution capabilities to help you build seamless end-user experiences. With Azure AD B2C, you can integrate with ISV partners to enable multifactor authentication (MFA) methods, do role-based access control, enable identity verification and proofing, improve security with bot detection and fraud protection, and meet Payment Services Directive 2 (PSD2) Secure Customer Authentication (SCA) requirements. Use our detailed sample walkthroughs to learn how to integrate apps with the ISV partners.
+Our ISV partner network extends our solution capabilities to help you build seamless end-user experiences. With Azure AD B2C, you can integrate with ISV partners to enable multifactor authentication (MFA) methods, do role-based access control, enable identity verification and proofing, improve security with bot detection and fraud protection, and meet Payment Services Directive 2 (PSD2) Secure Customer Authentication (SCA) requirements. Use our detailed sample walkthroughs to learn how to integrate apps with the ISV partners.
+
+To be considered for inclusion in this sample documentation, submit your application request in the [Microsoft Application Network portal](https://microsoft.sharepoint.com/teams/apponboarding/Apps/SitePages/Default.aspx). For any additional questions, send an email to [SaaSApplicationIntegrations@service.microsoft.com](mailto:SaaSApplicationIntegrations@service.microsoft.com).
>[!NOTE]
>The [Azure Active Directory B2C community site on GitHub](https://azure-ad-b2c.github.io/azureadb2ccommunity.io/) also provides sample custom policies from the community.
Microsoft partners with the following ISVs for MFA and Passwordless authentication.
| ![Screenshot of an itsme logo](./medi) is an Electronic Identification, Authentication and Trust Services (eIDAS) compliant digital ID solution to allow users to sign in securely without card readers, passwords, two-factor authentication, and multiple PIN codes. | |![Screenshot of a Keyless logo.](./medi) is a passwordless authentication provider that provides authentication in the form of a facial biometric scan and eliminates fraud, phishing, and credential reuse. | ![Screenshot of a nevis logo](./medi) enables passwordless authentication and provides a mobile-first, fully branded end-user experience with Nevis Access app for strong customer authentication and to comply with PSD2 transaction requirements. |
+| ![Screenshot of a nok nok logo](./medi) provides passwordless authentication and enables FIDO certified multifactor authentication such as FIDO UAF, FIDO U2F, WebAuthn, and FIDO2 for mobile and web applications. Using Nok Nok, customers can improve their security posture while balancing user experience.
| ![Screenshot of a trusona logo](./medi) integration helps you sign in securely and enables passwordless authentication, MFA, and digital license scanning. | | ![Screenshot of a twilio logo.](./medi) provides multiple solutions to enable MFA through SMS one-time password (OTP), time-based one-time password (TOTP), and push notifications, and to comply with SCA requirements for PSD2. | | ![Screenshot of a typingDNA logo](./medi) enables strong customer authentication by analyzing a user's typing pattern. It helps companies enable a silent MFA and comply with SCA requirements for PSD2. |
active-directory-b2c Partner Nok Nok https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/partner-nok-nok.md
+
+ Title: Tutorial to configure Azure Active Directory B2C with Nok Nok
+
+description: Tutorial to configure Nok Nok with Azure Active Directory B2C to enable passwordless FIDO2 authentication
+ Last updated: 08/04/2021
+# Tutorial: Configure Nok Nok with Azure Active Directory B2C to enable passwordless FIDO2 authentication
+
+In this sample tutorial, learn how to integrate the Nok Nok S3 authentication suite into your Azure Active Directory (AD) B2C tenant. [Nok Nok](https://noknok.com/) enables FIDO certified multifactor authentication such as FIDO UAF, FIDO U2F, WebAuthn, and FIDO2 for mobile and web applications. Using Nok Nok, customers can improve their security posture while balancing user experience.
+
+## Prerequisites
+
+To get started, you'll need:
+
+- An Azure subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+
+- [An Azure AD B2C tenant](tutorial-create-tenant.md) that is linked to your Azure subscription.
+
+- Get a free Nok Nok [trial tenant](https://noknok.com/products/strong-authentication-service/).
+
+## Scenario description
+
+To enable passwordless FIDO authentication to your users, enable Nok Nok as an Identity provider to your Azure AD B2C tenant. The Nok Nok integration includes the following components:
+
+- **Azure AD B2C** – The authorization server, responsible for verifying the user's credentials.
+
+- **Web and mobile applications** – Your mobile or web applications that you choose to protect with Nok Nok and Azure AD B2C.
+
+- **The Nok Nok app SDK or Nok Nok Passport app** – Applications used to authenticate Azure AD B2C enabled applications. These applications are available on the [Apple App Store](https://apps.apple.com/us/app/nok-nok-passport/id1050437340) and [Google Play Store](https://play.google.com/store/apps/details?id=com.noknok.android.passport2&hl=en&gl=US).
+
+The following architecture diagram shows the implementation. Nok Nok acts as an identity provider for Azure AD B2C, using OpenID Connect (OIDC) to enable passwordless authentication.
+
+![Diagram of the Nok Nok and Azure AD B2C integration architecture.](./media/partner-nok-nok/nok-nok-architecture-diagram.png)
+
+| Step | Description |
+|:--|:--|
+| 1. | The user arrives at the sign-in page, selects sign-in/sign-up, and enters the username. |
+| 2. | Azure AD B2C redirects the user to the Nok Nok OIDC authentication provider. |
+| 3a. | For mobile-based authentications, Nok Nok either displays a QR code or sends a push notification request to the end user's mobile device. |
+| 3b. | For desktop/PC-based sign-in, Nok Nok redirects the end user to the web application sign-in page to initiate a passwordless authentication prompt. |
+| 4a. | The user scans the displayed QR code with their smartphone by using the Nok Nok app SDK or Nok Nok Passport app. |
+| 4b. | The user provides the username as input on the sign-in page of the web application and selects next. |
+| 5a. | The user is prompted for authentication on the smartphone. <BR> The user does passwordless authentication by using their preferred method, such as biometrics, device PIN, or any roaming authenticator. |
+| 5b. | The user is prompted for authentication on the web application. <BR> The user does passwordless authentication by using their preferred method, such as biometrics, device PIN, or any roaming authenticator. |
+| 6. | The Nok Nok server validates the FIDO assertion and, upon validation, sends an OIDC authentication response to Azure AD B2C. |
+| 7. | Based on the response, the user is granted or denied access. |
+
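To make step 2 more concrete, the redirect to Nok Nok is a standard OIDC authorization request. The sketch below only illustrates the shape of that request, using the response type, response mode, and scopes configured later in this article; the endpoint, client ID, and redirect URI are placeholders, not values from your tenant.

```javascript
// Illustrative only: the OIDC authorization request Azure AD B2C sends to the
// external provider in step 2. All values here are placeholders.
const authorizeEndpoint = "https://demo.noknok.com/mytenant/oidc/authorize"; // hypothetical, taken from the provider's metadata document
const params = new URLSearchParams({
  client_id: "<client-id-issued-by-nok-nok>",
  response_type: "code",
  response_mode: "form_post",
  scope: "openid profile email",
  redirect_uri: "https://<your-tenant>.b2clogin.com/<your-tenant>.onmicrosoft.com/oauth2/authresp",
  state: "<opaque-state>",
  nonce: "<random-nonce>"
});

console.log(`${authorizeEndpoint}?${params.toString()}`);
```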
+## Onboard with Nok Nok
+
+Fill out the [Nok Nok cloud form](https://noknok.com/contact/) to create your own Nok Nok tenant. Once you submit the form, you'll receive an email explaining how to access your tenant. The email will also include access to Nok Nok guides. Follow the instructions provided in the Nok Nok integration guide to complete the OIDC configuration of your Nok Nok cloud tenant.
+
+## Integrate with Azure AD B2C
+
+### Add a new Identity provider
+
+To add a new Identity provider, follow these steps:
+
+1. Sign in to the **[Azure portal](https://portal.azure.com/#home)** as the global administrator of your Azure AD B2C tenant.
+
+2. Make sure you're using the directory that contains your Azure AD B2C tenant by selecting the **Directory + subscription** filter on the top menu and choosing the directory that contains your tenant.
+
+3. Choose **All services** in the top-left corner of the Azure portal, search for and select **Azure AD B2C**.
+
+4. Navigate to **Dashboard** > **Azure Active Directory B2C** > **Identity providers**
+
+5. Select **Identity providers**.
+
+6. Select **Add**.
+
+### Configure an Identity provider
+
+To configure an Identity provider, follow these steps:
+
+1. Select **Identity provider type** > **OpenID Connect (Preview)**
+2. Fill out the form to set up the Identity provider:
+
+ |Property | Value |
+ |:--| :--|
+ | Name | Nok Nok Authentication Provider |
+    | Metadata URL | Insert the URI of the hosted Nok Nok Authentication app, followed by the specific path, such as `https://demo.noknok.com/mytenant/oidc/.well-known/openid-configuration`. |
+ | Client Secret | Use the client Secret provided by the Nok Nok platform.|
+ | Client ID | Use the client ID provided by the Nok Nok platform.|
+    | Scope | openid profile email |
+ | Response type | code |
+ | Response mode | form_post|
+
+3. Select **OK**.
+
+4. Select **Map this identity provider's claims**.
+
+5. Fill out the form to map the Identity provider:
+
+ |Property | Value |
+ |:--| :--|
+ | UserID | From subscription |
+ | Display name | From subscription |
+ | Response mode | From subscription |
+
+6. Select **Save** to complete the setup for your new OIDC Identity provider.
+
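As an optional sanity check, you can confirm that the metadata URL you entered resolves to a valid OIDC discovery document before testing the user flow. The sketch below assumes Node.js 18 or later (for the built-in `fetch`) and uses the illustrative tenant path from the table above.

```javascript
// Optional check (assumes Node.js 18+): the metadata URL should return an OIDC
// discovery document containing the endpoints Azure AD B2C needs.
const metadataUrl = "https://demo.noknok.com/mytenant/oidc/.well-known/openid-configuration"; // illustrative path

fetch(metadataUrl)
  .then(res => res.json())
  .then(metadata => {
    console.log(metadata.issuer);
    console.log(metadata.authorization_endpoint);
    console.log(metadata.token_endpoint);
    console.log(metadata.jwks_uri);
  })
  .catch(err => console.error("Metadata URL did not return a discovery document:", err));
```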
+### Create a user flow policy
+
+You should now see Nok Nok as a new OIDC Identity provider listed within your B2C identity providers.
+
+1. In your Azure AD B2C tenant, under **Policies**, select **User flows**.
+
+2. Select **New user flow**.
+
+3. Select **Sign up and sign in**, select a **version**, and then select **Create**.
+
+4. Enter a **Name** for your policy.
+
+5. In the Identity providers section, select your newly created Nok Nok Identity provider.
+
+6. Set up the parameters of your user flow. Insert a name and select the identity provider you've created. You can also add an email address. In this case, Azure won't redirect the sign-in directly to Nok Nok; instead, it shows a screen where the user can choose the option they would like to use.
+
+7. Leave the **Multi-factor Authentication** field as is.
+
+8. Select **Enforce conditional access policies**.
+
+9. Under **User attributes and token claims**, select **Email Address** in the Collect attribute option. You can add all the attributes that Azure AD can collect about the user alongside the claims that Azure AD B2C can return to the client application.
+
+10. Select **Create**.
+
+11. After a successful creation, select your new **User flow**.
+
+12. On the left panel, select **Application Claims**. Under options, tick the **email** checkbox and select **Save**.
+
+## Test the user flow
+
+1. Open the Azure AD B2C tenant and under Policies select Identity Experience Framework.
+
+2. Select your previously created SignUpSignIn.
+
+3. Select Run user flow and select the settings:
+
+ a. Application: select the registered app (sample is JWT)
+
+ b. Reply URL: select the redirect URL
+
+ c. Select Run user flow.
+
+4. Go through the sign-up flow and create an account.
+
+5. Nok Nok will be called during the flow, after the user attribute is created. If the flow is incomplete, check that the user isn't saved in the directory.
+
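If you use a test app such as jwt.ms as the reply URL, the returned token and its claims are displayed for you. If you capture the token elsewhere and just want to confirm that the claims you selected (for example, **email**) are present, a quick decode is enough. The sketch below assumes Node.js and a raw token string, and it decodes without validating the signature.

```javascript
// Decode (not validate) a JWT payload to inspect claims such as email.
// For real validation, verify the signature against the issuer's JWKS instead.
function decodeJwtPayload(token) {
  const payload = token.split(".")[1];
  return JSON.parse(Buffer.from(payload, "base64url").toString("utf8"));
}

const claims = decodeJwtPayload("<token-returned-by-the-user-flow>");
console.log(claims.email, claims.name);
```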
+## Next steps
+
+For additional information, review the following articles:
+
+- [Custom policies in Azure AD B2C](./custom-policy-overview.md)
+
+- [Get started with custom policies in Azure AD B2C](tutorial-create-user-flows.md?pivots=b2c-custom-policy)
active-directory-b2c Tutorial Create User Flows https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tutorial-create-user-flows.md
Previously updated: 06/07/2021. Last updated: 08/05/2021. zone_pivot_groups: b2c-policy-type
In this article, you learned how to:
> * Create a profile editing user flow
> * Create a password reset user flow
-Next, learn how to use Azure AD B2C to sign in and sign up users in an application. Follow the ASP.NET web application linked below, or navigate to another application in the table of contents under **Authenticate users**.
-
-> [!div class="nextstepaction"]
-> [Tutorial: Enable authentication in a web application using Azure AD B2C >](tutorial-web-app-dotnet.md)
+Next, learn how to use Azure AD B2C to sign in and sign up users in an application. Follow the sample apps linked below:
+
+- [Configure a sample ASP.NET Core web app](configure-authentication-sample-web-app.md)
+- [Configure a sample ASP.NET Core web app that calls a web API](configure-authentication-sample-web-app-with-api.md)
+- [Configure authentication in a sample Python web application](configure-authentication-sample-python-web-app.md)
+- [Configure a sample Single-page application (SPA)](configure-authentication-sample-spa-app.md)
+- [Configure a sample Angular single-page app](configure-authentication-sample-angular-spa-app.md)
+- [Configure a sample Android mobile app](configure-authentication-sample-android-app.md)
+- [Configure a sample iOS mobile app](configure-authentication-sample-ios-app.md)
+- [Configure authentication in a sample WPF desktop application](configure-authentication-sample-wpf-desktop-app.md)
+- [Enable authentication in your web API](enable-authentication-web-api.md)
+- [Configure a SAML application](saml-service-provider.md)
You can also learn more in the [Azure AD B2C Architecture Deep Dive Series](https://www.youtube.com/playlist?list=PLOPotgzC07IKXXCTZcrpuLWbVe3y51kfm).
active-directory-b2c Tutorial Desktop App Webapi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tutorial-desktop-app-webapi.md
- Title: "Tutorial: Grant access to a Node.js web API from a desktop application"
-description: Tutorial on how to use Active Directory B2C to protect a Node.js web API and call it from a .NET desktop app.
- Previously updated: 10/12/2019
-# Tutorial: Grant access to a Node.js web API from a desktop app using Azure Active Directory B2C
-
-This tutorial shows you how to call a Node.js web API protected by Azure Active Directory B2C (Azure AD B2C) from a Windows Presentation Foundation (WPF) desktop app, also protected by Azure AD B2C.
-
-In this tutorial, you learn how to:
-
-> [!div class="checklist"]
-> * Add a web API application
-> * Configure scopes for a web API
-> * Grant permissions to the web API
-> * Update the sample to use the application
-
-## Prerequisites
-
-Complete the steps and prerequisites in [Tutorial: Authenticate users in a native desktop client](tutorial-desktop-app.md).
-
-## Add a web API application
--
-## Configure scopes
-
-Scopes provide a way to govern access to protected resources. Scopes are used by the web API to implement scope-based access control. For example, some users could have both read and write access, whereas other users might have read-only permissions. In this tutorial, you define read and write permissions for the web API.
--
-Record the value under **SCOPES** for the `demo.read` scope to use in a later step when you configure the desktop application. The full scope value is similar to `https://contosob2c.onmicrosoft.com/api/demo.read`.
-
-## Grant permissions
-
-To call a protected web API from a native client application, you need to grant the registered native client application permissions to the web API you registered in Azure AD B2C.
-
-In the prerequisite tutorial, you registered a native client application named *nativeapp1*. The following steps configure that native application registration with the API scopes you exposed for *webapi1* in the previous section. This allows the desktop application to obtain an access token from Azure AD B2C that the web API can use to verify and provide scoped access to its resources. You configure and run both the desktop application and web API code samples later in the tutorial.
-
-To register an application in your Azure AD B2C tenant, you can use our new unified **App registrations** experience or our legacy **Applications (Legacy)** experience. [Learn more about the new experience](./app-registrations-training-guide.md).
-
-#### [App registrations](#tab/app-reg-ga/)
-
-1. Select **App registrations**, and then select the native client application that should have access to the API. For example, *nativeapp1*.
-1. Under **Manage**, select **API permissions**.
-1. Under **Configured permissions**, select **Add a permission**.
-1. Select the **My APIs** tab.
-1. Select the API to which the native client application should be granted access. For example, *webapi1*.
-1. Under **Permission**, expand **demo**, and then select the scopes that you defined earlier. For example, *demo.read* and *demo.write*.
-1. Select **Add permissions**. As directed, wait a few minutes before proceeding to the next step.
-1. Select **Grant admin consent for (your tenant name)**.
-1. Select your currently signed-in administrator account, or sign in with an account in your Azure AD B2C tenant that's been assigned at least the *Cloud application administrator* role.
-1. Select **Accept**.
-1. Select **Refresh**, and then verify that "Granted for ..." appears under **Status** for both scopes. It might take a few minutes for the permissions to propagate.
-
-#### [Applications (Legacy)](#tab/applications-legacy/)
-
-1. Select **Applications (Legacy)**, and then select *nativeapp1*.
-1. Select **API access**, and then select **Add**.
-1. In the **Select API** dropdown, select *webapi1*.
-1. In the **Select Scopes** dropdown, select the scopes that you defined earlier. For example, *demo.read* and *demo.write*.
-1. Select **OK**.
-
-* * *
-
-A user authenticates with Azure AD B2C to use the WPF desktop application. The desktop application obtains an authorization grant from Azure AD B2C to access the protected web API.
-
-## Configure the samples
-
-Now that the web API is registered and you have scopes and permissions configured, you configure the desktop application and web API samples to use your Azure AD B2C tenant.
-
-### Update the desktop application
-
-In a prerequisite for this article, you modified a [WPF desktop application](https://github.com/Azure-Samples/active-directory-b2c-dotnet-desktop) to enable sign-in with a user flow in your Azure AD B2C tenant. In this section, you update that same application to reference the web API you registered earlier, *webapi1*.
-
-1. Open the **active-directory-b2c-wpf** solution (`active-directory-b2c-wpf.sln`) in Visual Studio.
-1. In the **active-directory-b2c-wpf** project, open the *App.xaml.cs* file and find the following variable definitions.
- 1. Replace the value of the `ApiScopes` variable with the value you recorded earlier when you defined the **demo.read** scope.
- 1. Replace the value of the `ApiEndpoint` variable with the **Redirect URI** you recorded earlier when you registered the web API (for example, *webapi1*) in your tenant.
-
- Here's an example:
-
- ```csharp
- public static string[] ApiScopes = { "https://contosob2c.onmicrosoft.com/api/demo.read" };
- public static string ApiEndpoint = "http://localhost:5000";
- ```
-
-### Get and update the Node.js API sample
-
-Next, get the Node.js web API code sample from GitHub and configure it to use the web API you registered in your Azure AD B2C tenant.
-
-[Download a zip file](https://github.com/Azure-Samples/active-directory-b2c-javascript-nodejs-webapi/archive/master.zip) or clone the sample web app from GitHub.
-
-```console
-git clone https://github.com/Azure-Samples/active-directory-b2c-javascript-nodejs-webapi.git
-```
-
-The Node.js web API sample uses the Passport.js library to enable Azure AD B2C to protect calls to the API.
-
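For orientation, the protection is typically wired up with the `BearerStrategy` from the `passport-azure-ad` package. The following is a rough sketch with placeholder values, not the sample's exact code; the sample's own `index.js` remains the authoritative configuration.

```javascript
// Rough sketch of protecting an Express API with passport-azure-ad's BearerStrategy.
// All values are placeholders; see the sample's index.js for the real configuration.
const passport = require("passport");
const BearerStrategy = require("passport-azure-ad").BearerStrategy;

const options = {
  identityMetadata: "https://<your-b2c-tenant>.b2clogin.com/<your-b2c-tenant>.onmicrosoft.com/<policy-name>/v2.0/.well-known/openid-configuration",
  clientID: "<web-API-application-ID>",
  policyName: "B2C_1_signupsignin1",
  isB2C: true,
  validateIssuer: true
};

passport.use(new BearerStrategy(options, (token, done) => done(null, {}, token)));

// A route can then require a valid access token:
// app.get("/hello", passport.authenticate("oauth-bearer", { session: false }), handler);
```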
-1. Open the `index.js` file.
-1. Update these variable definitions with the following values. Change `<web-API-application-ID>` to the **Application (client) ID** of the web API you registered earlier (*webapi1*). Change `<your-b2c-tenant>` to the name of your Azure AD B2C tenant.
-
- ```nodejs
- var clientID = "<web-API-application-ID>";
- var b2cDomainHost = "<your-b2c-tenant>.b2clogin.com";
- var tenantIdGuid = "<your-b2c-tenant>.onmicrosoft.com";
- var policyName = "B2C_1_signupsignin1";
- ```
-1. Since you're running the API locally, update the path in the route for the GET method to `/` instead of the demo app's location of `/hello`:
-
- ```nodejs
- app.get("/",
- ```
-
-## Run the samples
-
-### Run the Node.js web API
-
-1. Launch a Node.js command prompt.
-2. Change to the directory containing the Node.js sample. For example `cd c:\active-directory-b2c-javascript-nodejs-webapi`
-3. Run the following commands:
- ```console
- npm install && npm update
- ```
- ```console
- node index.js
- ```
-
-### Run the desktop application
-
-1. Open the **active-directory-b2c-wpf** solution in Visual Studio.
-2. Press **F5** to run the desktop app.
-3. Sign in using the email address and password used in [Authenticate users with Azure Active Directory B2C in a desktop app tutorial](tutorial-desktop-app.md).
-4. Select the **Call API** button.
-
-The desktop application makes a request to the locally running web API, and upon verification of a valid access token, shows the signed-in user's display name.
-
-![Display name shown in the top pane of the WPF desktop application](./media/tutorial-desktop-app-webapi/desktop-app-01-post-api-call.png)
-
-Your desktop application, protected by Azure AD B2C, is calling the locally running web API that is also protected by Azure AD B2C.
-
-## Next steps
-
-In this tutorial, you learned how to:
-
-> [!div class="checklist"]
-> * Add a web API application
-> * Configure scopes for a web API
-> * Grant permissions to the web API
-> * Update the sample to use the application
-
-> [!div class="nextstepaction"]
-> [Add identity providers to your applications in Azure Active Directory B2C](add-identity-provider.md)
active-directory-b2c Tutorial Desktop App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tutorial-desktop-app.md
- Title: "Tutorial: Authenticate users in a native client application"-
-description: Tutorial on how to use Azure Active Directory B2C to provide user login for a .NET desktop application.
- Previously updated: 10/12/2019
-# Tutorial: Authenticate users in a native desktop client using Azure Active Directory B2C
-
-This tutorial shows you how to use Azure Active Directory B2C (Azure AD B2C) to sign in and sign up users in a Windows Presentation Foundation (WPF) desktop application. Azure AD B2C enables your applications to authenticate to social accounts, enterprise accounts, and Azure Active Directory accounts using open standard protocols.
-
-In this tutorial, you learn how to:
-
-> [!div class="checklist"]
-> * Add the native client application
-> * Configure the sample to use the application
-> * Sign up using the user flow
--
-## Prerequisites
-- [Create user flows](tutorial-create-user-flows.md) to enable user experiences in your application.
-- Install [Visual Studio 2019](https://www.visualstudio.com/downloads/) with **.NET desktop development** and **ASP.NET and web development** workloads.
-## Add the native client application
--
-Record the **Application (client) ID** for use in a later step.
-
-## Configure the sample
-
-In this tutorial, you configure a sample that you can download from GitHub. The sample WPF desktop application demonstrates sign-up, sign-in, and can call a protected web API in Azure AD B2C. [Download a zip file](https://github.com/Azure-Samples/active-directory-b2c-dotnet-desktop/archive/msalv3.zip), [browse the repo](https://github.com/Azure-Samples/active-directory-b2c-dotnet-desktop), or clone the sample from GitHub.
-
-```
-git clone https://github.com/Azure-Samples/active-directory-b2c-dotnet-desktop.git
-```
-
-To update the application to work with your Azure AD B2C tenant and invoke its user flows instead of those in the default demo tenant:
-
-1. Open the **active-directory-b2c-wpf** solution (`active-directory-b2c-wpf.sln`) in Visual Studio.
-2. In the **active-directory-b2c-wpf** project, open the *App.xaml.cs* file and find the following variable definitions. Replace `{your-tenant-name}` with your Azure AD B2C tenant name and `{application-ID}` with the application ID that you recorded earlier.
-
- ```csharp
- private static readonly string Tenant = "{your-tenant-name}.onmicrosoft.com";
- private static readonly string AzureAdB2CHostname = "{your-tenant-name}.b2clogin.com";
- private static readonly string ClientId = "{application-ID}";
- ```
-
-3. Update the policy name variables with the names of the user flows that you created as part of the prerequisites. For example:
-
- ```csharp
- public static string PolicySignUpSignIn = "B2C_1_signupsignin1";
- public static string PolicyEditProfile = "B2C_1_profileediting1";
- public static string PolicyResetPassword = "B2C_1_passwordreset1";
- ```
-
-## Run the sample
-
-Press **F5** to build and run the sample.
-
-### Sign up using an email address
-
-1. Select **Sign In** to sign up as a user. This uses the **B2C_1_signupsignin1** user flow.
-2. Azure AD B2C presents a sign-in page with a **Sign up now** link. Since you don't yet have an account, select the **Sign up now** link.
-3. The sign-up workflow presents a page to collect and verify the user's identity using an email address. The sign-up workflow also collects the user's password and the requested attributes defined in the user flow.
-
- Use a valid email address and validate using the verification code. Set a password. Enter values for the requested attributes.
-
- ![Sign-up page shown as part of sign-in/sign-up workflow](./media/tutorial-desktop-app/azure-ad-b2c-sign-up-workflow.png)
-
-4. Select **Create** to create a local account in the Azure AD B2C tenant.
-
-The user can now use their email address to sign in and use the desktop application. After a successful sign-up or sign-in, the token details are displayed in the lower pane of the WPF app.
-
-![Token details shown in bottom pane of WPF desktop application](./media/tutorial-desktop-app/desktop-app-01-post-signin.png)
-
-If you select the **Call API** button, an **error message** is displayed. You encounter the error because, in its current state, the application is attempting to access an API protected by the demo tenant, `fabrikamb2c.onmicrosoft.com`. Because your access token is valid only for your Azure AD B2C tenant, the API call is unauthorized.
-
-Continue to the next tutorial to register a protected web API in your own tenant and enable the **Call API** functionality.
-
-## Next steps
-
-In this tutorial, you learned how to:
-
-> [!div class="checklist"]
-> * Add the native client application
-> * Configure the sample to use the application
-> * Sign up using the user flow
-
-Next, to enable the **Call API** button functionality, grant the WPF desktop application access to a web API registered in your own Azure AD B2C tenant:
-
-> [!div class="nextstepaction"]
-> [Tutorial: Grant access to a Node.js web API from a desktop app >](tutorial-desktop-app-webapi.md)
active-directory-b2c Tutorial Single Page App Webapi https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tutorial-single-page-app-webapi.md
- Title: "Tutorial: Protect a Node.js web API using Azure AD B2C and grant access to a single-page application (SPA)"-
-description: In this tutorial, learn how to use Active Directory B2C to protect a Node.js web API and call it from a single-page application.
- Previously updated: 04/04/2020
-# Tutorial: Protect and grant access to a Node.js web API from a single-page application with Azure AD B2C
-
-This tutorial shows you how to call an Azure Active Directory B2C (Azure AD B2C)-protected Node.js web API from a single-page application.
-
-In this tutorial, the second in a two-part series:
-
-> [!div class="checklist"]
-> * Create a web API application registration in your Azure AD B2C tenant
-> * Configure scopes for the web API
-> * Grant permissions to the web API
-> * Modify a web API code sample to work with your tenant
-
-In the [first tutorial](tutorial-single-page-app.md) in this series, you downloaded the code sample and modified it to sign in users with a user flow in your Azure AD B2C tenant.
--
-## Prerequisites
-
-* Complete the steps and prerequisites in [Tutorial: Enable authentication in a single-page application with Azure AD B2C](tutorial-single-page-app.md)
-* [Visual Studio Code](https://code.visualstudio.com/) or another code editor
-* [Node.js](https://nodejs.org/en/download/)
-
-## Add a web API application
--
-## Configure scopes
-
-Scopes provide a way to govern access to protected resources. Scopes are used by the web API to implement scope-based access control. For example, some users could have both read and write access, whereas other users might have read-only permissions. In this tutorial, you define both read and write permissions for the web API.
--
-Record the value under **Scopes** for the `demo.read` scope to use in a later step when you configure the single-page application. The full scope value is similar to `https://contosob2c.onmicrosoft.com/api/demo.read`.
-
-## Grant permissions
-
-To call a protected web API from another application, you need to grant that application permissions to the web API.
-
-In the prerequisite tutorial, you created a single-page application named *spaapp1*. In this tutorial, you grant that application permission to call the web API you registered in the previous section.
--
-Your single-page web application has now been granted permissions to the protected web API for the scopes specified. A user authenticates with Azure AD B2C to use the single-page application. The single-page app obtains an access token from Azure AD B2C to access the protected web API.
-
-## Configure the sample
-
-Now that the web API is registered and you've defined scopes, configure the web API code to work with your Azure AD B2C tenant. In this tutorial, you configure a sample Node.js web API you download from GitHub.
-
-[Download a \*.zip archive](https://github.com/Azure-Samples/active-directory-b2c-javascript-nodejs-webapi/archive/master.zip) or clone the sample web API project from GitHub. You can also browse directly to the [Azure-Samples/active-directory-b2c-javascript-nodejs-webapi](https://github.com/Azure-Samples/active-directory-b2c-javascript-nodejs-webapi) project on GitHub.
-
-```console
-git clone https://github.com/Azure-Samples/active-directory-b2c-javascript-nodejs-webapi.git
-```
-
-### Configure the web API
-
-1. Open the *config.json* file in your code editor.
-1. Modify the variable values to reflect those of the application registration you created earlier. Also update the `policyName` with the user flow you created as part of the prerequisites. For example, *B2C_1_signupsignin1*.
-
- ```json
- "credentials": {
- "tenantName": "<your-tenant-name>",
- "clientID": "<your-webapi-application-ID>"
- },
- "policies": {
- "policyName": "B2C_1_signupsignin1"
- },
- "resource": {
- "scope": ["demo.read"]
- },
- ```
-
-#### Enable CORS
-
-To allow your single-page application to call the Node.js web API, you need to enable [CORS](https://expressjs.com/en/resources/middleware/cors.html) in the web API. In a production application you should be careful about which domain is making the request, but for this tutorial, allow requests from any domain.
-
-To enable CORS, use the following middleware. In the Node.js web API code sample in this tutorial, it's already been added to the *index.js* file.
-
-```javascript
-app.use((req, res, next) => {
- res.header("Access-Control-Allow-Origin", "*");
- res.header("Access-Control-Allow-Headers", "Authorization, Origin, X-Requested-With, Content-Type, Accept");
- next();
-});
-```
-
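Equivalently, the same permissive behavior can be expressed with the `cors` middleware package. This is only an illustrative alternative and assumes you install the package yourself (`npm install cors`); it isn't necessarily part of the sample.

```javascript
// Illustrative alternative (requires `npm install cors`): with no options,
// the cors middleware allows requests from any origin, like the headers above.
const cors = require("cors");
app.use(cors());
```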
-### Configure the single-page application
-
-The single-page application (SPA) from the [previous tutorial](tutorial-single-page-app.md) in the series uses Azure AD B2C for user sign-up and sign-in, and by default, calls the Node.js web API protected by the *fabrikamb2c* demo tenant.
-
-In this section, you update the single-page web application to call the Node.js web API protected by *your* Azure AD B2C tenant (and which you run on your local machine).
-
-To change the settings in the SPA:
-
-1. In the [ms-identity-b2c-javascript-spa][github-js-spa] project you downloaded or cloned in the previous tutorial, open the *apiConfig.js* file inside the *App* folder.
-1. Configure the sample with the URI for the *demo.read* scope you created earlier and the URL of the web API.
- 1. In the `apiConfig` definition, replace the `b2cScopes` value with the full URI for the *demo.read* scope (the **Scope** value you recorded earlier).
- 1. Change the domain in the `webApi` value to the redirect URI you added when you registered the web API application in an earlier step.
-
- Because the API is accessible at the `/hello` endpoint, leave */hello* in the URI.
-
- The `apiConfig` definition should look similar to the following code block, but with your B2C tenant's name in the place of `<your-tenant-name>`:
-
- ```javascript
- // The current application coordinates were pre-registered in a B2C tenant.
- const apiConfig = {
- b2cScopes: ["https://<your-tenant-name>.onmicrosoft.com/api/demo.read"],
- webApi: "http://localhost:5000/hello" // '/hello' should remain in the URI
- };
- ```
-
-## Run the SPA and web API
-
-You're now ready to test the single-page application's scoped access to the API. Run both the Node.js web API and the sample JavaScript single-page application on your local machine. Then, sign in to the single-page application and select the **Call API** button to initiate a request to the protected API.
-
-Although both applications are running locally when you follow this tutorial, you've configured them to use Azure AD B2C for secure sign-up/sign-in and to grant access to the protected web API.
-
-### Run the Node.js web API
-
-1. Open a console window and change to the directory containing the Node.js web API sample. For example:
-
- ```console
- cd active-directory-b2c-javascript-nodejs-webapi
- ```
-
-1. Run the following commands:
-
- ```console
- npm install && npm update
- node index.js
- ```
-
- The console window displays the port number where the application is hosted.
-
- ```console
- Listening on port 5000...
- ```
-
-### Run the single-page app
-
-1. Open another console window and change to the directory containing the JavaScript SPA sample. For example:
-
- ```console
- cd ms-identity-b2c-javascript-spa
- ```
-
-1. Run the following commands:
-
- ```console
- npm install && npm update
- npm start
- ```
-
-    The console window displays the port number where the application is hosted.
-
- ```console
- Listening on port 6420...
- ```
-
-1. Navigate to `http://localhost:6420` in your browser to view the application.
-
- ![Single-page application sample app shown in browser](./media/tutorial-single-page-app-webapi/tutorial-01-sample-app-browser.png)
-
-1. Sign in using the email address and password you used in the [previous tutorial](tutorial-single-page-app.md). Upon successful login, you should see the `User 'Your Username' logged-in` message.
-1. Select the **Call API** button. The SPA obtains an authorization grant from Azure AD B2C, then accesses the protected web API to display the name of the logged-in user:
-
- ![Single-page application in browser showing username JSON result returned by API](./media/tutorial-single-page-app-webapi/tutorial-02-call-api.png)
-
-## Next steps
-
-In this tutorial, you:
-
-> [!div class="checklist"]
-> * Created a web API application registration in your Azure AD B2C tenant
-> * Configured scopes for the web API
-> * Granted permissions to the web API
-> * Modified a web API code sample to work with your tenant
-
-Now that you've seen an SPA request a resource from a protected web API, gain a deeper understanding of how these application types interact with each other and with Azure AD B2C.
-
-> [!div class="nextstepaction"]
-> [Application types that can be used in Active Directory B2C >](application-types.md)
-
-<!-- Links - EXTERNAL -->
-[github-js-spa]: https://github.com/Azure-Samples/ms-identity-b2c-javascript-spa
active-directory-b2c Tutorial Single Page App https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tutorial-single-page-app.md
- Title: "Tutorial: Enable authentication in a single-page app"-
-description: In this tutorial, learn how to use Azure Active Directory B2C to provide user login for a JavaScript-based single-page application (SPA).
- Previously updated: 04/04/2020
-# Tutorial: Enable authentication in a single-page application with Azure AD B2C
-
-This tutorial shows you how to use Azure Active Directory B2C (Azure AD B2C) to sign up and sign in users in a single-page application (SPA) using the [OAuth 2.0 authorization code flow](./authorization-code-flow.md) via [MSAL.js](https://github.com/AzureAD/microsoft-authentication-library-for-js/tree/dev/lib/msal-browser).
-
-In this tutorial, the first in a two-part series:
-
-> [!div class="checklist"]
-> * Add a reply URL to an application registered in your Azure AD B2C tenant
-> * Download a code sample from GitHub
-> * Modify the sample application's code to work with your tenant
-> * Sign up using your sign-up/sign-in user flow
-
-The [next tutorial](tutorial-single-page-app-webapi.md) in the series enables the web API portion of the code sample.
--
-## Prerequisites
-
-You need the following Azure AD B2C resources in place before continuing with the steps in this tutorial:
-
-* [Azure AD B2C tenant](tutorial-create-tenant.md)
-* [Application registered](tutorial-register-spa.md) in your tenant
-* [User flows created](tutorial-create-user-flows.md) in your tenant
-
-Additionally, you need the following in your local development environment:
-
-* [Visual Studio Code](https://code.visualstudio.com/) or another code editor
-* [Node.js](https://nodejs.org/en/download/)
-
-## Update the application
-
-In the [second tutorial](./tutorial-register-spa.md) that you completed as part of the prerequisites, you registered a single-page application in Azure AD B2C. To enable communication with the code sample in this tutorial, add a reply URL (also called a redirect URI) to the application registration.
-
-To update an application in your Azure AD B2C tenant, you can use our new unified **App registrations** experience or our legacy **Applications (Legacy)** experience. [Learn more about the new experience](./app-registrations-training-guide.md).
-
-#### [App registrations](#tab/app-reg-auth/)
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Select the **Directory + subscription** filter in the top menu, and then select the directory that contains your Azure AD B2C tenant.
-1. In the left menu, select **Azure AD B2C**. Or, select **All services** and search for and select **Azure AD B2C**.
-1. Select **App registrations**, select the **Owned applications** tab, and then select the *spaapp1* application.
-1. Under **Single-page Application**, select the **Add URI** link, then enter `http://localhost:6420`.
-1. Select **Save**.
-1. Select **Overview**.
-1. Record the **Application (client) ID** for use in a later step when you update the code in the single-page web application.
-
-#### [Applications (Legacy)](#tab/applications-legacy/)
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Make sure you're using the directory that contains your Azure AD B2C tenant by selecting the **Directory + subscription** filter in the top menu and choosing the directory that contains your tenant.
-1. Select **All services** in the top-left corner of the Azure portal, and then search for and select **Azure AD B2C**.
-1. Select **Applications (Legacy)**, and then select the *spaapp1* application.
-1. Under **Reply URL**, add `http://localhost:6420`.
-1. Select **Save**.
-1. On the properties page, record the **Application ID**. You use the app ID in a later step when you update the code in the single-page web application.
-
-* * *
-
-## Get the sample code
-
-In this tutorial, you configure a code sample that you download from GitHub to work with your B2C tenant. The sample demonstrates how a single-page application can use Azure AD B2C for user sign-up and sign-in, and to call a protected web API (you enable the web API in the next tutorial in the series).
-
- [Download a zip file](https://github.com/Azure-Samples/ms-identity-b2c-javascript-spa/archive/main.zip) or clone the sample from GitHub:
-
- ```
- git clone https://github.com/Azure-Samples/ms-identity-b2c-javascript-spa.git
- ```
-
-## Update the sample
-
-Now that you've obtained the sample, update the code with your Azure AD B2C tenant name and the application ID you recorded in an earlier step.
-
-1. Open the *authConfig.js* file inside the *App* folder.
-1. In the `msalConfig` object, find the assignment for `clientId` and replace it with the **Application (client) ID** you recorded in an earlier step.
-1. Open the `policies.js` file.
-1. Find the entries under `names` and replace their assignment with the names of the user flows you created in an earlier step, for example `B2C_1_signupsignin1`.
-1. Find the entries under `authorities` and replace them as appropriate with the names of the user flows you created in an earlier step, for example `https://<your-tenant-name>.b2clogin.com/<your-tenant-name>.onmicrosoft.com/<your-sign-in-sign-up-policy>`.
-1. Find the assignment for `authorityDomain` and replace it with `<your-tenant-name>.b2clogin.com`.
-1. Open the `apiConfig.js` file.
-1. Find the assignment for `b2cScopes` and replace the URL with the scope URL you created for the Web API, for example `b2cScopes: ["https://<your-tenant-name>.onmicrosoft.com/helloapi/demo.read"]`.
-1. Find the assignment for `webApi` and replace the current URL with the URL where you deployed your Web API in Step 4, for example `webApi: http://localhost:5000/hello`.
--
-Your resulting code should look similar to the following:
-
-*authConfig.js*:
-
-```javascript
-const msalConfig = {
- auth: {
- clientId: "e760cab2-b9a1-4c0d-86fb-ff7084abd902",
- authority: b2cPolicies.authorities.signUpSignIn.authority,
- knownAuthorities: [b2cPolicies.authorityDomain],
- },
- cache: {
- cacheLocation: "localStorage",
- storeAuthStateInCookie: true
- }
-};
-
-const loginRequest = {
- scopes: ["openid", "profile"],
-};
-
-const tokenRequest = {
- scopes: apiConfig.b2cScopes // i.e. ["https://fabrikamb2c.onmicrosoft.com/helloapi/demo.read"]
-};
-```
-
-*policies.js*:
-
-```javascript
-const b2cPolicies = {
- names: {
- signUpSignIn: "b2c_1_susi",
- forgotPassword: "b2c_1_reset",
- editProfile: "b2c_1_edit_profile"
- },
- authorities: {
- signUpSignIn: {
- authority: "https://fabrikamb2c.b2clogin.com/fabrikamb2c.onmicrosoft.com/b2c_1_susi",
- },
- forgotPassword: {
- authority: "https://fabrikamb2c.b2clogin.com/fabrikamb2c.onmicrosoft.com/b2c_1_reset",
- },
- editProfile: {
- authority: "https://fabrikamb2c.b2clogin.com/fabrikamb2c.onmicrosoft.com/b2c_1_edit_profile"
- }
- },
- authorityDomain: "fabrikamb2c.b2clogin.com"
-}
-```
-
-*apiConfig.js*:
-
-```javascript
-const apiConfig = {
- b2cScopes: ["https://fabrikamb2c.onmicrosoft.com/helloapi/demo.read"],
- webApi: "https://fabrikamb2chello.azurewebsites.net/hello"
-};
-```
--
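For context, these configuration objects are consumed by MSAL.js 2.x (msal-browser) roughly as follows. This is a simplified sketch, not the sample's exact scripts.

```javascript
// Simplified sketch of how the configuration objects above are typically wired
// into MSAL.js 2.x; the sample's own scripts contain the full logic.
const msalInstance = new msal.PublicClientApplication(msalConfig);

// Interactive sign-in with the sign-up/sign-in authority from policies.js.
msalInstance.loginPopup(loginRequest)
  .then(response => console.log(`Signed in as ${response.account.username}`))
  .catch(error => console.error(error));

// tokenRequest (built from apiConfig.b2cScopes) is later passed to
// acquireTokenSilent or acquireTokenPopup when the app calls the web API.
```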
-## Run the sample
-
-1. Open a console window and navigate to the directory containing the sample.
-
- ```console
- cd ms-identity-b2c-javascript-spa
- ```
-
-1. Run the following commands:
-
- ```console
- npm install && npm update
- npm start
- ```
-
- The console window displays the port number of the locally running Node.js server:
-
- ```console
- Listening on port 6420...
- ```
-1. Browse to `http://localhost:6420` to view the web application running on your local machine.
-
- :::image type="content" source="media/tutorial-single-page-app/web-app-spa-01-not-logged-in.png" alt-text="Web browser showing single-page application running locally":::
-
-### Sign up using an email address
-
-This sample application supports sign up, sign in, and password reset. In this tutorial, you sign up using an email address.
-
-1. Select **Sign In** to initiate the *B2C_1_signupsignin1* user flow you specified in an earlier step.
-1. Azure AD B2C presents a sign-in page that includes a sign-up link. Since you don't yet have an account, select the **Sign up now** link.
-1. The sign-up workflow presents a page to collect and verify the user's identity using an email address. The sign-up workflow also collects the user's password and the requested attributes defined in the user flow.
-
- Use a valid email address and validate using the verification code. Set a password. Enter values for the requested attributes.
-
- :::image type="content" source="media/tutorial-single-page-app/user-flow-sign-up-workflow-01.png" alt-text="Sign up page displayed by Azure AD B2C user flow":::
-
-1. Select **Create** to create a local account in the Azure AD B2C directory.
-
-When you select **Create**, the application shows the name of the signed-in user.
--
-If you'd like to test sign-in, select the **Sign Out** button, then select **Sign In** and sign in with the email address and password you entered when you signed up.
-
-### What about calling the API?
-
-If you select the **Call API** button after signing in, you're presented with the sign-up/sign-in user flow page instead of the results of the API call. This is expected because you haven't yet configured the API portion of the application to communicate with a web API application registered in *your* Azure AD B2C tenant.
-
-At this point, the application is still trying to communicate with the API registered in the demo tenant (fabrikamb2c.onmicrosoft.com), and because you're not authenticated with that tenant, the sign-up/sign-in page is displayed.
-
-Move on to the next tutorial in the series to enable the protected API (see the [Next steps](#next-steps) section).
-
-## Next steps
-
-In this tutorial, you configured a single-page application to work with a user flow in your Azure AD B2C tenant to provide sign up and sign in capability. You completed these steps:
-
-> [!div class="checklist"]
-> * Added a reply URL to an application registered in your Azure AD B2C tenant
-> * Downloaded a code sample from GitHub
-> * Modified the sample application's code to work with your tenant
-> * Signed up using your sign-up/sign-in user flow
-
-Now move on to the next tutorial in the series to grant access to a protected web API from the SPA:
-
-> [!div class="nextstepaction"]
-> [Tutorial: Protect and grant access to web API from a single-page application >](tutorial-single-page-app-webapi.md)
active-directory-b2c Tutorial Web Api Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tutorial-web-api-dotnet.md
- Title: "Tutorial: Grant access to an ASP.NET web API"-
-description: Tutorial on how to use Active Directory B2C to protect an ASP.NET web API and call it from an ASP.NET web application.
- Previously updated: 10/14/2019
-# Tutorial: Grant access to an ASP.NET web API using Azure Active Directory B2C
-
-This tutorial shows you how to call a protected web API resource in Azure Active Directory B2C (Azure AD B2C) from an ASP.NET web application.
-
-In this tutorial, you learn how to:
-
-> [!div class="checklist"]
-> * Add a web API application
-> * Configure scopes for a web API
-> * Grant permissions to the web API
-> * Configure the sample to use the application
--
-## Prerequisites
-
-Complete the steps and prerequisites in [Tutorial: Enable authentication in a web application using Azure Active Directory B2C](tutorial-web-app-dotnet.md).
-
-## Add a web API application
-
-Web API resources need to be registered in your tenant before they can accept and respond to protected resource requests by client applications that present an access token.
-
-To register an application in your Azure AD B2C tenant, you can use our new unified **App registrations** experience or our legacy **Applications (Legacy)** experience. [Learn more about the new experience](./app-registrations-training-guide.md).
-
-#### [App registrations](#tab/app-reg-ga/)
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Select the **Directory + subscription** filter in the top menu, and then select the directory that contains your Azure AD B2C tenant.
-1. In the left menu, select **Azure AD B2C**. Or, select **All services** and search for and select **Azure AD B2C**.
-1. Select **App registrations**, and then select **New registration**.
-1. Enter a **Name** for the application. For example, *webapi1*.
-1. Under **Redirect URI**, select **Web**, and then enter an endpoint where Azure AD B2C should return any tokens that your application requests. In this tutorial, the sample runs locally and listens at `https://localhost:44332`.
-1. Select **Register**.
-1. Record the **Application (client) ID** for use in a later step.
-
-#### [Applications (Legacy)](#tab/applications-legacy/)
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-2. Make sure you're using the directory that contains your Azure AD B2C tenant by selecting the **Directory + subscription** filter in the top menu and choosing the directory that contains your tenant.
-3. Choose **All services** in the top-left corner of the Azure portal, and then search for and select **Azure AD B2C**.
-4. Select **Applications (Legacy)**, and then select **Add**.
-5. Enter a name for the application. For example, *webapi1*.
-6. For **Include web app/ web API**, select **Yes**.
-7. For **Reply URL**, enter an endpoint where Azure AD B2C should return any tokens that your application requests. In this tutorial, the sample runs locally and listens at `https://localhost:44332`.
-8. For **App ID URI**, enter the identifier used for your web API. The full identifier URI including the domain is generated for you. For example, `https://contosotenant.onmicrosoft.com/api`.
-9. Click **Create**.
-10. On the properties page, record the application ID that you'll use when you configure the web application.
-
-* * *
-
-## Configure scopes
-
-Scopes provide a way to govern access to protected resources. Scopes are used by the web API to implement scope-based access control. For example, users of the web API could have both read and write access, or users of the web API might have only read access. In this tutorial, you use scopes to define read and write permissions for the web API.
--
-## Grant permissions
-
-To call a protected web API from an application, you need to grant your application permissions to the API. In the prerequisite tutorial, you created a web application in Azure AD B2C named *webapp1*. You use this application to call the web API.
--
-Your application is registered to call the protected web API. A user authenticates with Azure AD B2C to use the application. The application obtains an authorization grant from Azure AD B2C to access the protected web API.
-
-## Configure the sample
-
-Now that the web API is registered and you have scopes defined, you configure the web API to use your Azure AD B2C tenant. In this tutorial, you configure a sample web API. The sample web API is included in the project you downloaded in the prerequisite tutorial.
-
-There are two projects in the sample solution:
-
-* **TaskWebApp** - Create and edit a task list. The sample uses the **sign-up or sign-in** user flow to sign up or sign in users.
-* **TaskService** - Supports the create, read, update, and delete task list functionality. The API is protected by Azure AD B2C and called by TaskWebApp.
-
-### Configure the web application
-
-1. Open the **B2C-WebAPI-DotNet** solution in Visual Studio.
-1. In the **TaskWebApp** project, open **Web.config**.
-1. To run the API locally, use the localhost setting for **api:TaskServiceUrl**. Change the Web.config as follows:
-
-    ```xml
- <add key="api:TaskServiceUrl" value="https://localhost:44332/"/>
- ```
-
-1. Configure the URI of the API. This is the URI the web application uses to make the API request. Also, configure the requested permissions.
-
-    ```xml
- <add key="api:ApiIdentifier" value="https://<Your tenant name>.onmicrosoft.com/api/" />
- <add key="api:ReadScope" value="demo.read" />
- <add key="api:WriteScope" value="demo.write" />
- ```
-
-### Configure the web API
-
-1. In the **TaskService** project, open **Web.config**.
-1. Configure the API to use your tenant.
-
-    ```xml
- <add key="ida:AadInstance" value="https://<Your tenant name>.b2clogin.com/{0}/{1}/v2.0/.well-known/openid-configuration" />
- <add key="ida:Tenant" value="<Your tenant name>.onmicrosoft.com" />
- ```
-
-1. Set the client ID to the Application ID of your registered web API application, *webapi1*.
-
-    ```xml
- <add key="ida:ClientId" value="<application-ID>"/>
- ```
-
-1. Update the user flow setting with the name of your sign-up and sign-in user flow, *B2C_1_signupsignin1*.
-
-    ```xml
- <add key="ida:SignUpSignInPolicyId" value="B2C_1_signupsignin1" />
- ```
-
-1. Configure the scopes setting to match those you created in the portal.
-
-    ```xml
- <add key="api:ReadScope" value="demo.read" />
- <add key="api:WriteScope" value="demo.write" />
- ```
-
-## Run the sample
-
-You need to run both the **TaskWebApp** and **TaskService** projects.
-
-1. In Solution Explorer, right-click on the solution and select **Set StartUp Projects...**.
-1. Select **Multiple startup projects**.
-1. Change the **Action** for both projects to **Start**.
-1. Click **OK** to save the configuration.
-1. Press **F5** to run both applications. Each application opens in its own browser window.
- * `https://localhost:44316/` is the web application.
- * `https://localhost:44332/` is the web API.
-
-1. In the web application, select **sign-up / sign-in** to sign in to the web application. Use the account that you previously created.
-1. After you sign in, select **To-do list** and create a to-do list item.
-
-When you create a to-do list item, the web application makes a request to the web API to generate the to-do list item. Your protected web application is calling the web API protected by Azure AD B2C.
-
-## Next steps
-
-In this tutorial, you learned how to:
-
-> [!div class="checklist"]
-> * Add a web API application
-> * Configure scopes for a web API
-> * Grant permissions to the web API
-> * Configure the sample to use the application
-
-> [!div class="nextstepaction"]
-> [Add identity providers to your applications in Azure Active Directory B2C](add-identity-provider.md)
active-directory-b2c Tutorial Web App Dotnet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tutorial-web-app-dotnet.md
- Title: "Tutorial: Enable authentication in a web application"-
-description: Tutorial on how to use Azure Active Directory B2C to provide user login for an ASP.NET web application.
- Previously updated: 10/02/2020
-# Tutorial: Enable authentication in a web application using Azure Active Directory B2C
-
-This tutorial shows you how to use Azure Active Directory B2C (Azure AD B2C) to sign in and sign up users in an ASP.NET web application. Azure AD B2C enables your applications to authenticate to social accounts, enterprise accounts, and Azure Active Directory accounts using open standard protocols.
-
-In this tutorial, you learn how to:
-
-> [!div class="checklist"]
-> * Update the application in Azure AD B2C
-> * Configure the sample to use the application
-> * Sign up using the user flow
--
-> [!NOTE]
-> This tutorial uses an ASP.NET sample web application. For other sample applications (including ASP.NET Core, Node.js, Python, and more), see [Azure Active Directory B2C code samples](integrate-with-app-code-samples.md).
-
-## Prerequisites
-
-* [Create user flows](tutorial-create-user-flows.md) to enable user experiences in your application.
-* Install [Visual Studio 2019](https://www.visualstudio.com/downloads/) with the **ASP.NET and web development** workload.
-
-## Update the application registration
-
-In the tutorial that you completed as part of the prerequisites, you registered a web application in Azure AD B2C. To enable communication with the sample in this tutorial, you need to add a redirect URI and create a client secret (key) for the registered application.
-
-### Add a redirect URI (reply URL)
-
-To update an application in your Azure AD B2C tenant, you can use our new unified **App registrations** experience or our legacy **Applications (Legacy)** experience. [Learn more about the new experience](./app-registrations-training-guide.md).
-
-#### [App registrations](#tab/app-reg-ga/)
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Select the **Directory + subscription** filter in the top menu, and then select the directory that contains your Azure AD B2C tenant.
-1. In the left menu, select **Azure AD B2C**. Or, select **All services** and search for and select **Azure AD B2C**.
-1. Select **App registrations**, select the **Owned applications** tab, and then select the *webapp1* application.
-1. Under **Web**, select the **Add URI** link, enter `https://localhost:44316`, and then select **Save**.
-1. Select **Overview**.
-1. Record the **Application (client) ID** for use in a later step when you configure the web application.
-
-#### [Applications (Legacy)](#tab/applications-legacy/)
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Make sure you're using the directory that contains your Azure AD B2C tenant by selecting the **Directory + subscription** filter in the top menu and choosing the directory that contains your tenant.
-1. Choose **All services** in the top-left corner of the Azure portal, and then search for and select **Azure AD B2C**.
-1. Select **Applications (Legacy)**, and then select the *webapp1* application.
-1. Under **Reply URL**, add `https://localhost:44316`.
-1. Select **Save**.
-1. On the properties page, record the application ID for use in a later step when you configure the web application.
-
-* * *
-
-### Create a client secret
-
-Next, create a client secret for the registered web application. The web application code sample uses this to prove its identity when requesting tokens.
--
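The sample itself is ASP.NET with OWIN middleware; purely as an illustration of the role the client secret plays, here is a minimal sketch using the MSAL library for Python. The authority, client ID, and secret values are placeholders.

```python
import msal

# Placeholders: your tenant name, user flow, application (client) ID, and the
# client secret (key) created in this step.
authority = "https://<tenant>.b2clogin.com/<tenant>.onmicrosoft.com/B2C_1_signupsignin1"

app = msal.ConfidentialClientApplication(
    client_id="<application-id>",
    client_credential="<client-secret>",  # proves the app's identity to Azure AD B2C
    authority=authority,
)
# When the app later redeems an authorization code for tokens, MSAL sends the
# client secret along with the token request.
```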
-## Configure the sample
-
-In this tutorial, you configure a sample that you can download from GitHub. The sample uses ASP.NET to provide a simple to-do list. The sample uses [Microsoft OWIN middleware components](/aspnet/aspnet/overview/owin-and-katana/). [Download a zip file](https://github.com/Azure-Samples/active-directory-b2c-dotnet-webapp-and-webapi/archive/master.zip) or clone the sample from GitHub. Make sure that you extract the sample file in a folder where the total character length of the path is less than 260.
-
-```console
-git clone https://github.com/Azure-Samples/active-directory-b2c-dotnet-webapp-and-webapi.git
-```
-
-The following two projects are in the sample solution:
-
-* **TaskWebApp** - Create and edit a task list. The sample uses the **sign-up or sign-in** user flow to sign up and sign in users.
-* **TaskService** - Supports the create, read, update, and delete task list functionality. The API is protected by Azure AD B2C and called by TaskWebApp.
-
-You change the sample to use the application that's registered in your tenant, which includes the application ID and the key that you previously recorded. You also configure the user flows that you created. The sample defines the configuration values as settings in the *Web.config* file.
-
-Update the settings in the Web.config file to work with your user flow:
-
-1. Open the **B2C-WebAPI-DotNet** solution in Visual Studio.
-1. In the **TaskWebApp** project, open the **Web.config** file.
- 1. Update the value of `ida:Tenant` and `ida:AadInstance` with the name of the Azure AD B2C tenant that you created. For example, replace `fabrikamb2c` with `contoso`.
- 1. Replace the value of `ida:TenantId` with the directory ID, which you can find in the properties for your Azure B2C tenant (in the Azure portal under **Azure Active Directory** > **Properties** > **Directory ID**).
- 1. Replace the value of `ida:ClientId` with the application ID that you recorded.
- 1. Replace the value of `ida:ClientSecret` with the key that you recorded. If the client secret contains any predefined XML entities, for example less than (`<`), greater than (`>`), ampersand (`&`), or double quote (`"`), you must escape those characters by XML-encoding the client secret before adding it to your Web.config (see the escaping sketch after this list).
- 1. Replace the value of `ida:SignUpSignInPolicyId` with `b2c_1_signupsignin1`.
- 1. Replace the value of `ida:EditProfilePolicyId` with `b2c_1_profileediting1`.
- 1. Replace the value of `ida:ResetPasswordPolicyId` with `b2c_1_passwordreset1`.
-
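For the client-secret step above, a small sketch of XML-encoding a secret before pasting it into *Web.config*, using Python's standard library (the secret shown is a made-up example):

```python
from xml.sax.saxutils import escape

raw_secret = 'p@ss"word<&>'  # made-up secret containing reserved XML characters

# escape() handles &, <, and > by default; add double quotes explicitly because
# the value sits inside an attribute in Web.config.
encoded_secret = escape(raw_secret, {'"': "&quot;"})
print(encoded_secret)  # p@ss&quot;word&lt;&amp;&gt;
```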
-## Run the sample
-
-1. In Solution Explorer, right-click the **TaskWebApp** project, and then click **Set as StartUp Project**.
-1. Press **F5**. The default browser opens the local website address `https://localhost:44316/`.
-
-### Sign up using an email address
-
-1. Select **Sign up / Sign in** to sign up as a user of the application. The **b2c_1_signupsignin1** user flow is used.
-1. Azure AD B2C presents a sign-in page with a sign-up link. Since you don't have an account yet, select **Sign up now**. The sign-up workflow presents a page to collect and verify the user's identity using an email address. The sign-up workflow also collects the user's password and the requested attributes defined in the user flow.
-1. Use a valid email address and validate using the verification code. Set a password. Enter values for the requested attributes.
-
- ![Sign-up page shown as part of sign-in/sign-up workflow](./media/tutorial-web-app-dotnet/sign-up-workflow.PNG)
-
-1. Select **Create** to create a local account in the Azure AD B2C tenant.
-
-The application user can now use their email address to sign in and use the web application.
-
-However, the **To-Do List** feature won't function until you complete the next tutorial in the series, [Tutorial: Use Azure AD B2C to protect an ASP.NET web API](tutorial-web-api-dotnet.md).
-
-## Next steps
-
-In this tutorial, you learned how to:
-
-> [!div class="checklist"]
-> * Update the application in Azure AD B2C
-> * Configure the sample to use the application
-> * Sign up using the user flow
-
-Now move on to the next tutorial to enable the **To-Do List** feature of the web application. In it, you register a web API application in your own Azure AD B2C tenant, and then modify the code sample to use your tenant for API authentication.
-
-> [!div class="nextstepaction"]
-> [Tutorial: Use Azure Active Directory B2C to protect an ASP.NET web API >](tutorial-web-api-dotnet.md)
active-directory-b2c Tutorial Web App Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/tutorial-web-app-python.md
- Title: "Tutorial: Enable authentication in a Python web application"
-description: In this tutorial, learn how to use Azure Active Directory B2C to provide user login for a Python Flask web application.
- Previously updated : 08/04/2021
-# Tutorial: Enable authentication in a Python web application with Azure AD B2C
-
-This tutorial shows you how to use Azure Active Directory B2C (Azure AD B2C) to sign up and sign in users in a Python Flask web application.
-
-In this tutorial:
-
-> [!div class="checklist"]
-> * Add a reply URL to an application registered in your Azure AD B2C tenant
-> * Download a code sample from GitHub
-> * Modify the sample application's code to work with your tenant
-> * Sign up using your sign-up/sign-in user flow
--
-## Prerequisites
-
-You need the following Azure AD B2C resources in place before continuing with the steps in this tutorial:
-
-* [Azure AD B2C tenant](tutorial-create-tenant.md)
-* [Application registered](tutorial-register-applications.md) in your tenant, and its *Application (client) ID* and *client secret*
-* [User flows created](tutorial-create-user-flows.md) in your tenant
-
-Additionally, you need the following in your local development environment:
-
-* [Visual Studio Code](https://code.visualstudio.com/) or another code editor
-* [Python](https://www.python.org/downloads/) 2.7+ or 3+
-
-## Add a redirect URI
-
-In the second tutorial that you completed as part of the prerequisites, you registered a web application in Azure AD B2C. To enable communication with the code sample in this tutorial, add a reply URL (also called a redirect URI) to the application registration.
-
-To update an application in your Azure AD B2C tenant, you can use our new unified **App registrations** experience or our legacy **Applications (Legacy)** experience. [Learn more about the new experience](./app-registrations-training-guide.md).
-
-#### [App registrations](#tab/app-reg-ga/)
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Select the **Directory + subscription** filter in the top menu, and then select the directory that contains your Azure AD B2C tenant.
-1. In the left menu, select **Azure AD B2C**. Or, select **All services** and search for and select **Azure AD B2C**.
-1. Select **App registrations**, select the **Owned applications** tab, and then select the *webapp1* application.
-1. Under **Manage**, select **Authentication**.
-1. Under **Web**, select the **Add URI** link, and then enter `http://localhost:5000/getAToken` in the text box.
-1. Select **Save**.
-
-#### [Applications (Legacy)](#tab/applications-legacy/)
-
-1. Sign in to the [Azure portal](https://portal.azure.com).
-1. Make sure you're using the directory that contains your Azure AD B2C tenant by selecting the **Directory + subscription** filter in the top menu and choosing the directory that contains your tenant.
-1. Select **All services** in the top-left corner of the Azure portal, and then search for and select **Azure AD B2C**.
-1. Select **Applications (Legacy)**, and then select the *webapp1* application.
-1. Under **Reply URL**, add `http://localhost:5000/getAToken`.
-1. Select **Save**.
-* * *
-
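To see why the reply URL matters, here is a heavily simplified sketch of what a Flask route at `/getAToken` could do with the authorization code that Azure AD B2C sends back to it. This is an illustration using MSAL for Python, not the sample's exact code; the authority, client ID, and secret are placeholders.

```python
import msal
from flask import Flask, request, session

app = Flask(__name__)
app.secret_key = "replace-with-a-random-value"  # Flask session signing key (placeholder)

# Placeholders; in the sample these values live in app_config.py.
AUTHORITY = "https://contosob2c.b2clogin.com/contosob2c.onmicrosoft.com/B2C_1_signupsignin1"
CLIENT_ID = "<application-id>"
CLIENT_SECRET = "<client-secret>"

@app.route("/getAToken")  # must match the reply URL registered above
def authorized():
    cca = msal.ConfidentialClientApplication(
        CLIENT_ID, authority=AUTHORITY, client_credential=CLIENT_SECRET)
    # Redeem the authorization code Azure AD B2C appended to the redirect.
    result = cca.acquire_token_by_authorization_code(
        request.args.get("code"),
        scopes=[],  # sign-in only; MSAL adds the OpenID Connect scopes itself
        redirect_uri="http://localhost:5000/getAToken")
    if "error" in result:
        return f"Sign-in failed: {result.get('error_description')}", 401
    session["user"] = result.get("id_token_claims")
    return f"Signed in as {session['user'].get('name')}"
```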
-## Get the sample code
-
-In this tutorial, you configure a code sample that you download from GitHub to work with your B2C tenant. The sample demonstrates how a Python Flask web application can use Azure AD B2C for user sign-up and sign-in.
-
-[Download a .ZIP archive](https://github.com/Azure-Samples/ms-identity-python-webapp/archive/master.zip) or clone the [code sample repository](https://github.com/Azure-Samples/ms-identity-python-webapp) from GitHub.
-
-```console
-git clone https://github.com/Azure-Samples/ms-identity-python-webapp.git
-```
-
-## Update the sample
-
-Once you've obtained the sample, configure the application to use your Azure AD B2C tenant, application registration, and user flows.
-
-In the project's root directory:
-
-1. Rename the *app_config.py* file to *app_config.py.OLD*.
-1. Rename the *app_config_b2c.py* file to *app_config.py*.
-
-Update the newly renamed *app_config.py* with values for your Azure AD B2C tenant and application registration you created as part of the prerequisites.
-
-1. Open the *app_config.py* file in your editor.
-1. Update the `b2c_tenant` value with the name of your Azure AD B2C tenant, for example *contosob2c*.
-1. Update each of the `*_user_flow` values to match the names of the user flows you created as part of the prerequisites.
-1. Update the `CLIENT_ID` value with the **Application (client) ID** of the web application you registered as part of the prerequisites.
-1. Update the `CLIENT_SECRET` value with the value of the **client secret** you created in the prerequisites. For increased security, consider storing it instead in an **environment variable**, as recommended in the comments.
-
-The top section of *app_config.py* should now look similar to the following code snippet:
-
-```python
-import os
-
-b2c_tenant = "contosob2c"
-signupsignin_user_flow = "B2C_1_signupsignin1"
-editprofile_user_flow = "B2C_1_profileediting1"
-resetpassword_user_flow = "B2C_1_passwordreset1"
-authority_template = "https://{tenant}.b2clogin.com/{tenant}.onmicrosoft.com/{user_flow}"
-
-CLIENT_ID = "11111111-1111-1111-1111-111111111111" # Application (client) ID of app registration
-
-CLIENT_SECRET = "22222222-2222-2222-2222-222222222222" # Placeholder - for use ONLY during testing.
-# In a production app, we recommend you use a more secure method of storing your secret,
-# like Azure Key Vault. Or, use an environment variable as described in Flask's documentation:
-# https://flask.palletsprojects.com/en/1.1.x/config/#configuring-from-environment-variables
-# CLIENT_SECRET = os.getenv("CLIENT_SECRET")
-# if not CLIENT_SECRET:
-# raise ValueError("Need to define CLIENT_SECRET environment variable")
-```
-
-> [!WARNING]
-> As noted in the code snippet comments, we recommend you **do not store secrets in plaintext** in your application code. The hardcoded variable is used in the code sample for *convenience only*. Consider using an environment variable or a secret store like Azure Key Vault.
-
-## Run the sample
-
-1. In your console or terminal, switch to the directory containing the sample. For example:
-
- ```console
- cd ms-identity-python-webapp
- ```
-1. Run the following commands to install the required packages from PyPI and run the web app on your local machine:
-
- ```console
- pip install -r requirements.txt
- flask run --host localhost --port 5000
- ```
-
- The console window displays the port number of the locally running application:
-
- ```console
- * Serving Flask app "app" (lazy loading)
- * Environment: production
- WARNING: This is a development server. Do not use it in a production deployment.
- Use a production WSGI server instead.
- * Debug mode: off
- * Running on http://localhost:5000/ (Press CTRL+C to quit)
- ```
-
-1. Browse to `http://localhost:5000` to view the web application running on your local machine.
-
- :::image type="content" source="media/tutorial-web-app-python/python-flask-web-app-01.png" alt-text="Web browser showing Python Flask web application running locally":::
-
-### Sign up using an email address
-
-This sample application supports sign up, sign in, and password reset. In this tutorial, you sign up using an email address.
-
-1. Select **Sign In** to initiate the *B2C_1_signupsignin1* user flow you specified in an earlier step.
-1. Azure AD B2C presents a sign-in page that includes a sign up link. Since you don't yet have an account, select the **Sign up now** link.
-1. The sign up workflow presents a page to collect and verify the user's identity using an email address. The sign up workflow also collects the user's password and the requested attributes defined in the user flow.
-
- Use a valid email address and validate using the verification code. Set a password. Enter values for the requested attributes.
-
- :::image type="content" source="media/tutorial-web-app-python/python-flask-web-app-02.png" alt-text="Sign up page displayed by Azure AD B2C user flow":::
-
-1. Select **Create** to create a local account in the Azure AD B2C directory.
-
-When you select **Create**, the application shows the name of the signed in user.
--
-If you'd like to test sign-in, select the **Logout** link, then select **Sign In** and sign in with the email address and password you entered when you signed up.
-
-## Next steps
-
-In this tutorial, you configured a Python Flask web application to work with a user flow in your Azure AD B2C tenant to provide sign up and sign in capability. You completed these steps:
-
-> [!div class="checklist"]
-> * Added a reply URL to an application registered in your Azure AD B2C tenant
-> * Downloaded a code sample from GitHub
-> * Modified the sample application's code to work with your tenant
-> * Signed up using your sign-up/sign-in user flow
-
-Next, learn how to customize the UI of the user flow pages displayed to your users by Azure AD B2C:
-
-> [!div class="nextstepaction"]
-> [Customize the interface of user experiences in Azure AD B2C >](customize-ui.md)
active-directory-b2c Whats New Docs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory-b2c/whats-new-docs.md
Welcome to what's new in Azure Active Directory B2C documentation. This article
- [Tutorial: Create user flows in Azure Active Directory B2C](tutorial-create-user-flows.md) - [Azure AD B2C custom policy overview](custom-policy-overview.md) - [User flows and custom policies overview](user-flow-overview.md)-- [Tutorial: Enable authentication in a single-page application with Azure AD B2C](tutorial-single-page-app.md) - [Set up phone sign-up and sign-in for user flows](phone-authentication-user-flows.md) - [Enable multi-factor authentication in Azure Active Directory B2C](multi-factor-authentication.md) - [User flow versions in Azure Active Directory B2C](user-flow-versions.md)
active-directory Msal B2c Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/develop/msal-b2c-overview.md
The [Microsoft Authentication Library for JavaScript (MSAL.js)](https://github.com/AzureAD/microsoft-authentication-library-for-js) enables JavaScript developers to authenticate users with social and local identities using [Azure Active Directory B2C](../../active-directory-b2c/overview.md) (Azure AD B2C).
-By using Azure AD B2C as an identity management service, you can customize and control how your customers sign up, sign in, and manage their profiles when they use your applications. Azure AD B2C also enables you to brand and customize the UI that your application displays during the authentication process.
+By using Azure AD B2C as an identity management service, you can customize and control how your customers sign up, sign in, and manage their profiles when they use your applications.
+
+Azure AD B2C also enables you to brand and customize the UI that your application displays during the authentication process.
## Supported app types and scenarios
MSAL.js enables [single-page applications](../../active-directory-b2c/applicatio
- Users **can** authenticate with their social and local identities. - Users **can** be authorized to access Azure AD B2C protected resources (but not Azure AD protected resources).-- Users **cannot** obtain tokens for Microsoft APIs (e.g. MS Graph API) using [delegated permissions](./v2-permissions-and-consent.md#permission-types).-- Users with administrator privileges **can** obtain tokens for Microsoft APIs (e.g. MS Graph API) using [delegated permissions](./v2-permissions-and-consent.md#permission-types).
+- Users **cannot** obtain tokens for Microsoft APIs (for example, MS Graph API) using [delegated permissions](./v2-permissions-and-consent.md#permission-types).
+- Users with administrator privileges **can** obtain tokens for Microsoft APIs (for example, MS Graph API) using [delegated permissions](./v2-permissions-and-consent.md#permission-types).
For more information, see: [Working with Azure AD B2C](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/working-with-b2c.md)
For more information, see: [Working with Azure AD B2C](https://github.com/AzureA
Follow the tutorial on how to: -- [Sign in users with Azure AD B2C in a single-page application](../../active-directory-b2c/tutorial-single-page-app.md)-- [Call an Azure AD B2C protected web API](../../active-directory-b2c/tutorial-single-page-app-webapi.md)
+- [Sign in users with Azure AD B2C in a single-page application](../../active-directory-b2c/configure-authentication-sample-spa-app.md)
+- [Call an Azure AD B2C protected web API](../../active-directory-b2c/enable-authentication-web-api.md)
active-directory Azureadjoin Plan https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/devices/azureadjoin-plan.md
Choose **Selected** and select the users you want to add to the local administr
### Require multi-factor authentication (MFA) to join devices
-Select **Yes** if you require users to perform MFA while joining devices to Azure AD. For the users joining devices to Azure AD using MFA, the device itself becomes a 2nd factor.
+Select **Yes** if you require users to perform MFA while joining devices to Azure AD.
![Require multi-factor Auth to join devices](./media/azureadjoin-plan/03.png)
active-directory User Properties https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/external-identities/user-properties.md
Title: Properties of a B2B guest user - Azure Active Directory | Microsoft Docs
-description: Azure Active Directory B2B guest user properties and states before and after invitation redemption
+description: Azure Active Directory B2B invited guest user properties and states before and after invitation redemption
Previously updated : 07/26/2021 Last updated : 08/04/2021
# Properties of an Azure Active Directory B2B collaboration user
-This article describes the properties and states of the B2B guest user object in Azure Active Directory (Azure AD) before and after invitation redemption. An Azure AD business-to-business (B2B) collaboration user is a user with UserType = Guest. This guest user typically is from a partner organization and has limited privileges in the inviting directory, by default.
+This article describes the properties and states of an invited Azure Active Directory B2B (Azure AD B2B) collaboration user object both before and after invitation redemption. An Azure AD B2B collaboration user is an external user, typically from a partner organization, that you invite to sign into your Azure AD organization using their own credentials. This B2B collaboration user (also generally referred to as a *guest user*) can then access the apps and resources you want to share with them. A user object is created for the B2B collaboration user in the same directory as your employees. B2B collaboration user objects have limited privileges in your directory by default, and they can be managed like employees, added to groups, and so on.
Depending on the inviting organization's needs, an Azure AD B2B collaboration user can be in one of the following account states:
active-directory Whats New Archive https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/fundamentals/whats-new-archive.md
For more information, see [Upgrade to monthly active users billing model](../../
In October 2019, we've added these 35 new apps with Federation support to the app gallery:
-[In Case of Crisis ΓÇô Mobile](../saas-apps/in-case-of-crisis-mobile-tutorial.md), [Juno Journey](../saas-apps/juno-journey-tutorial.md), [ExponentHR](../saas-apps/exponenthr-tutorial.md), [Tact](https://www.tact.ai/products/tact-assistant), [OpusCapita Cash Management](https://appsource.microsoft.com/product/web-apps/opuscapitagroupoy-1036255.opuscapita-cm), [Salestim](https://www.salestim.com/), [Learnster](../saas-apps/learnster-tutorial.md), [Dynatrace](../saas-apps/dynatrace-tutorial.md), [HunchBuzz](https://login.hunchbuzz.com/integrations/azure/process), [Freshworks](../saas-apps/freshworks-tutorial.md), [eCornell](../saas-apps/ecornell-tutorial.md), [ShipHazmat](../saas-apps/shiphazmat-tutorial.md), [Netskope Cloud Security](../saas-apps/netskope-cloud-security-tutorial.md), [Contentful](../saas-apps/contentful-tutorial.md), [Bindtuning](https://bindtuning.com/login), [HireVue Coordinate ΓÇô Europe](https://www.hirevue.com/), [HireVue Coordinate - USOnly](https://www.hirevue.com/), [HireVue Coordinate - US](https://www.hirevue.com/), [WittyParrot Knowledge Box](https://wittyapi.wittyparrot.com/wittyparrot/api/provision/trail/signup), [Cloudmore](../saas-apps/cloudmore-tutorial.md), [Visit.org](../saas-apps/visitorg-tutorial.md), [Cambium Xirrus EasyPass Portal](https://login.xirrus.com/azure-signup), [Paylocity](../saas-apps/paylocity-tutorial.md), [Mail Luck!](../saas-apps/mail-luck-tutorial.md), [Teamie](https://theteamie.com/), [Velocity for Teams](https://velocity.peakup.org/teams/login), [SIGNL4](https://account.signl4.com/manage), [EAB Navigate IMPL](../saas-apps/eab-navigate-impl-tutorial.md), [ScreenMeet](https://console.screenmeet.com/), [Omega Point](https://pi.ompnt.com/), [Speaking Email for Intune (iPhone)](https://speaking.email/FAQ/98/email-access-via-microsoft-intune), [Speaking Email for Office 365 Direct (iPhone/Android)](https://speaking.email/FAQ/126/email-access-via-microsoft-office-365-direct), [ExactCare SSO](../saas-apps/exactcare-sso-tutorial.md), [iHealthHome Care Navigation System](https://ihealthnav.com/account/signin), [Qubie](https://qubie.azurewebsites.net/static/adminTab/)
+[In Case of Crisis ΓÇô Mobile](../saas-apps/in-case-of-crisis-mobile-tutorial.md), [Juno Journey](../saas-apps/juno-journey-tutorial.md), [ExponentHR](../saas-apps/exponenthr-tutorial.md), [Tact](https://www.tact.ai/products/tact-assistant), [OpusCapita Cash Management](https://appsource.microsoft.com/product/web-apps/opuscapitagroupoy-1036255.opuscapita-cm), [Salestim](https://www.salestim.com/), [Learnster](../saas-apps/learnster-tutorial.md), [Dynatrace](../saas-apps/dynatrace-tutorial.md), [HunchBuzz](https://login.hunchbuzz.com/integrations/azure/process), [Freshworks](../saas-apps/freshworks-tutorial.md), [eCornell](../saas-apps/ecornell-tutorial.md), [ShipHazmat](../saas-apps/shiphazmat-tutorial.md), [Netskope Cloud Security](../saas-apps/netskope-cloud-security-tutorial.md), [Contentful](../saas-apps/contentful-tutorial.md), [Bindtuning](https://bindtuning.com/login), [HireVue Coordinate ΓÇô Europe](https://www.hirevue.com/), [HireVue Coordinate - USOnly](https://www.hirevue.com/), [HireVue Coordinate - US](https://www.hirevue.com/), [WittyParrot Knowledge Box](https://wittyapi.wittyparrot.com/wittyparrot/api/provision/trail/signup), [Cloudmore](../saas-apps/cloudmore-tutorial.md), [Visit.org](../saas-apps/visitorg-tutorial.md), [Cambium Xirrus EasyPass Portal](https://login.xirrus.com/azure-signup), [Paylocity](../saas-apps/paylocity-tutorial.md), [Mail Luck!](../saas-apps/mail-luck-tutorial.md), [Teamie](https://theteamie.com/), [Velocity for Teams](https://velocity.peakup.org/teams/login), [SIGNL4](https://account.signl4.com/manage), [EAB Navigate IMPL](../saas-apps/eab-navigate-impl-tutorial.md), [ScreenMeet](https://console.screenmeet.com/), [Omega Point](https://pi.ompnt.com/), [Speaking Email for Intune (iPhone)](https://speaking.email/FAQ/98/email-access-via-microsoft-intune), [Speaking Email for Office 365 Direct (iPhone/Android)](https://speaking.email/FAQ/126/email-access-via-microsoft-office-365-direct), [ExactCare SSO](../saas-apps/exactcare-sso-tutorial.md), [iHealthHome Care Navigation System](https://ihealthnav.com/account/signin), [Qubie](https://www.qubie.app/)
For more information about the apps, see [SaaS application integration with Azure Active Directory](../saas-apps/tutorial-list.md). For more information about listing your application in the Azure AD app gallery, see [List your application in the Azure Active Directory application gallery](../develop/v2-howto-app-gallery-listing.md).
active-directory Howto Export Risk Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/identity-protection/howto-export-risk-data.md
# How To: Export risk data
-Azure AD stores reports and security signals for a defined period of time. When it comes to risk information, 90 days may not be long enough.
+Azure AD stores reports and security signals for a defined period of time. When it comes to risk information, that may not be long enough.
| Report / Signal | Azure AD Free | Azure AD Premium P1 | Azure AD Premium P2 |
| --- | --- | --- | --- |
active-directory Assetbank Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/assetbank-tutorial.md
Previously updated : 01/19/2019 Last updated : 08/04/2021

# Tutorial: Azure Active Directory integration with Asset Bank
-In this tutorial, you learn how to integrate Asset Bank with Azure Active Directory (Azure AD).
-Integrating Asset Bank with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Asset Bank with Azure Active Directory (Azure AD). When you integrate Asset Bank with Azure AD, you can:
-* You can control in Azure AD who has access to Asset Bank.
-* You can enable your users to be automatically signed-in to Asset Bank (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Asset Bank.
+* Enable your users to be automatically signed-in to Asset Bank with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Asset Bank, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Asset Bank single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Asset Bank single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Asset Bank supports **SP** initiated SSO
-* Asset Bank supports **Just In Time** user provisioning
+* Asset Bank supports **SP** initiated SSO.
+* Asset Bank supports **Just In Time** user provisioning.
-## Adding Asset Bank from the gallery
+## Add Asset Bank from the gallery
To configure the integration of Asset Bank into Azure AD, you need to add Asset Bank from the gallery to your list of managed SaaS apps.
-**To add Asset Bank from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Asset Bank**, select **Asset Bank** from result panel then click **Add** button to add the application.
-
- ![Asset Bank in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with Asset Bank based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Asset Bank needs to be established.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Asset Bank** in the search box.
+1. Select **Asset Bank** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-To configure and test Azure AD single sign-on with Asset Bank, you need to complete the following building blocks:
+## Configure and test Azure AD SSO for Asset Bank
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Asset Bank Single Sign-On](#configure-asset-bank-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Asset Bank test user](#create-asset-bank-test-user)** - to have a counterpart of Britta Simon in Asset Bank that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+Configure and test Azure AD SSO with Asset Bank using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Asset Bank.
-### Configure Azure AD single sign-on
+To configure and test Azure AD SSO with Asset Bank, perform the following steps:
-In this section, you enable Azure AD single sign-on in the Azure portal.
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Asset Bank SSO](#configure-asset-bank-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Asset Bank test user](#create-asset-bank-test-user)** - to have a counterpart of B.Simon in Asset Bank that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-To configure Azure AD single sign-on with Asset Bank, perform the following steps:
+## Configure Azure AD SSO
-1. In the [Azure portal](https://portal.azure.com/), on the **Asset Bank** application integration page, select **Single sign-on**.
+Follow these steps to enable Azure AD SSO in the Azure portal.
- ![Configure single sign-on link](common/select-sso.png)
+1. In the Azure portal, on the **Asset Bank** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![Asset Bank Domain and URLs single sign-on information](common/sp-identifier.png)
+ a. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ `https://<companyname>.assetbank-server.com/shibboleth`
- a. In the **Sign on URL** text box, type a URL using the following pattern:
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
`https://<companyname>.assetbank-server.com`
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
- `https://<companyname>.assetbank-server.com/shibboleth`
- > [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [Asset Bank Client support team](mailto:support@assetbank.co.uk) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [Asset Bank Client support team](mailto:support@assetbank.co.uk) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Federation Metadata XML** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with Asset Bank, perform the following step
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure Ad Identifier
-
- c. Logout URL
-
-### Configure Asset Bank Single Sign-On
-
-To configure single sign-on on **Asset Bank** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Asset Bank support team](mailto:support@assetbank.co.uk). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
+In this section, you'll create a test user in the Azure portal called B.Simon.
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Asset Bank.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Asset Bank**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Asset Bank**.
-
- ![The Asset Bank link in the Applications list](common/all-applications.png)
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Asset Bank.
-3. In the menu on the left, select **Users and groups**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Asset Bank**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![The "Users and groups" link](common/users-groups-blade.png)
+## Configure Asset Bank SSO
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on **Asset Bank** side, you need to send the downloaded **Federation Metadata XML** and appropriate copied URLs from Azure portal to [Asset Bank support team](mailto:support@assetbank.co.uk). They set this setting to have the SAML SSO connection set properly on both sides.
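If you want to sanity-check the downloaded metadata before you send it, the following sketch parses a standard SAML 2.0 federation metadata file with Python's standard library and prints the entity ID and single sign-on endpoints it advertises. The file name is a placeholder.

```python
import xml.etree.ElementTree as ET

MD_NS = "urn:oasis:names:tc:SAML:2.0:metadata"

root = ET.parse("federation-metadata.xml").getroot()  # placeholder file name
print("Entity ID:", root.get("entityID"))

# List the IdP single sign-on endpoints advertised in the metadata.
for sso in root.iter(f"{{{MD_NS}}}SingleSignOnService"):
    print(sso.get("Binding"), "->", sso.get("Location"))
```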
### Create Asset Bank test user
In this section, a user called Britta Simon is created in Asset Bank. Asset Bank
> [!NOTE] > If you need to create a user manually, you need to contact the [Asset Bank support team](mailto:support@assetbank.co.uk).
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with following options.
-When you click the Asset Bank tile in the Access Panel, you should be automatically signed in to the Asset Bank for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in Azure portal. This will redirect to Asset Bank Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to Asset Bank Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Asset Bank tile in the My Apps, this will redirect to Asset Bank Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Asset Bank you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Central Desktop Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/central-desktop-tutorial.md
Previously updated : 02/12/2019 Last updated : 08/04/2021

# Tutorial: Azure Active Directory integration with Central Desktop
-In this tutorial, you learn how to integrate Central Desktop with Azure Active Directory (Azure AD).
-Integrating Central Desktop with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate Central Desktop with Azure Active Directory (Azure AD). When you integrate Central Desktop with Azure AD, you can:
-* You can control in Azure AD who has access to Central Desktop.
-* You can enable your users to be automatically signed-in to Central Desktop (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to Central Desktop.
+* Enable your users to be automatically signed-in to Central Desktop with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Central Desktop, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* Central Desktop single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Central Desktop single sign-on (SSO) enabled subscription.
## Scenario description

In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* Central Desktop supports **SP** initiated SSO
+* Central Desktop supports **SP** initiated SSO.
-## Adding Central Desktop from the gallery
+## Add Central Desktop from the gallery
To configure the integration of Central Desktop into Azure AD, you need to add Central Desktop from the gallery to your list of managed SaaS apps.
-**To add Central Desktop from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **Central Desktop**, select **Central Desktop** from result panel then click **Add** button to add the application.
-
- ![Central Desktop in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with Central Desktop based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in Central Desktop needs to be established.
-
-To configure and test Azure AD single sign-on with Central Desktop, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure Central Desktop Single Sign-On](#configure-central-desktop-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create Central Desktop test user](#create-central-desktop-test-user)** - to have a counterpart of Britta Simon in Central Desktop that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Central Desktop** in the search box.
+1. Select **Central Desktop** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-### Configure Azure AD single sign-on
+## Configure and test Azure AD SSO for Central Desktop
-In this section, you enable Azure AD single sign-on in the Azure portal.
+Configure and test Azure AD SSO with Central Desktop using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Central Desktop.
-To configure Azure AD single sign-on with Central Desktop, perform the following steps:
+To configure and test Azure AD SSO with Central Desktop, perform the following steps:
-1. In the [Azure portal](https://portal.azure.com/), on the **Central Desktop** application integration page, select **Single sign-on**.
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Central Desktop SSO](#configure-central-desktop-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Central Desktop test user](#create-central-desktop-test-user)** - to have a counterpart of B.Simon in Central Desktop that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
- ![Configure single sign-on link](common/select-sso.png)
+## Configure Azure AD SSO
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+Follow these steps to enable Azure AD SSO in the Azure portal.
- ![Single sign-on select mode](common/select-saml-option.png)
+1. In the Azure portal, on the **Central Desktop** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![Central Desktop Domain and URLs single sign-on information](common/sp-identifier-reply.png)
-
- a. In the **Sign-on URL** text box, type a URL using the following pattern:
- `https://<companyname>.centraldesktop.com`
-
- b. In the **Identifier** box, type a URL using the following pattern:
+ a. In the **Identifier** box, type a URL using one of the following patterns:
- ```http
- https://<companyname>.centraldesktop.com/saml2-metadata.php
- https://<companyname>.imeetcentral.com/saml2-metadata.php
- ```
+ | **Identifier** |
+ |-|
+ | `https://<companyname>.centraldesktop.com/saml2-metadata.php` |
+ | `https://<companyname>.imeetcentral.com/saml2-metadata.php` |
- c. In the **Reply URL** text box, type a URL using the following pattern:
+ b. In the **Reply URL** text box, type a URL using the following pattern:
`https://<companyname>.centraldesktop.com/saml2-assertion.php`
+ c. In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://<companyname>.centraldesktop.com`
+ > [!NOTE]
- > These values are not real. Update these values with the actual Sign-On URL, Identifier and Reply URL. Contact [Central Desktop Client support team](https://imeetcentral.com/contact-us) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier, Reply URL, and Sign on URL. Contact [Central Desktop Client support team](https://imeetcentral.com/contact-us) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Raw)** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with Central Desktop, perform the following
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
+### Create an Azure AD test user
- b. Azure Ad Identifier
+In this section, you'll create a test user in the Azure portal called B.Simon.
- c. Logout URL
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
-### Configure Central Desktop Single Sign-On
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Central Desktop.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Central Desktop**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Central Desktop SSO
1. Sign in to your **Central Desktop** tenant. 2. Go to **Settings**. Select **Advanced**, and then select **Single Sign On**.
- ![Setup - Advanced](./media/central-desktop-tutorial/ic769563.png "Setup - Advanced")
+ ![Setup - Advanced.](./media/central-desktop-tutorial/settings.png "Setup - Advanced")
-3. On the **Single Sign On Settings** page, take the following steps:
+3. On the **Single Sign On Settings** page, perform the following steps:
- ![Single sign-on settings](./media/central-desktop-tutorial/ic769564.png "Single Sign On Settings")
+ ![Single sign-on settings.](./media/central-desktop-tutorial/configuration.png "Single Sign On Settings")
a. Select **Enable SAML v2 Single Sign On**.
To configure Azure AD single sign-on with Central Desktop, perform the following
d. In the **SSO Logout URL** box, paste the **Logout URL** value that you copied from the Azure portal.
-4. In the **Message Signature Verification Method** section, take the following steps:
+4. In the **Message Signature Verification Method** section, perform the following steps:
- ![Message signature verification method](./media/central-desktop-tutorial/ic769565.png "Message Signature Verification Method")
+ ![Message signature verification method](./media/central-desktop-tutorial/certificate.png "Message Signature Verification Method")
a. Select **Certificate**.
To configure Azure AD single sign-on with Central Desktop, perform the following
e. Select **Update**.
-### Create an Azure AD test user
-
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon\@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
-
- d. Click **Create**.
-
-### Assign the Azure AD test user
-
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to Central Desktop.
-
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **Central Desktop**.
-
- ![Enterprise applications blade](common/enterprise-applications.png)
-
-2. In the applications list, select **Central Desktop**.
-
- ![The Central Desktop link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
- ### Create Central Desktop test user For Azure AD users to be able to sign in, they must be provisioned in the Central Desktop application. This section describes how to create Azure AD user accounts in Central Desktop.
For Azure AD users to be able to sign in, they must be provisioned in the Centra
2. Select **People** and then select **Add Internal Members**.
- ![People](./media/central-desktop-tutorial/ic781051.png "People")
+ ![People.](./media/central-desktop-tutorial/members.png "People")
3. In the **Email Address of New Members** box, type an Azure AD account that you want to provision, and then select **Next**.
- ![Email addresses of new members](./media/central-desktop-tutorial/ic781052.png "Email addresses of new members")
+ ![Email addresses of new members.](./media/central-desktop-tutorial/add-members.png "Email addresses of new members")
4. Select **Add Internal member(s)**.
- ![Add internal member](./media/central-desktop-tutorial/ic781053.png "Add internal member")
+ ![Add internal member.](./media/central-desktop-tutorial/account.png "Add internal member")
> [!NOTE]
> The users that you add receive an email that includes a confirmation link for activating their accounts.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you click the Central Desktop tile in the Access Panel, you should be automatically signed in to the Central Desktop for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in Azure portal. This will redirect to Central Desktop Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to Central Desktop Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Central Desktop tile in My Apps, you're redirected to the Central Desktop sign-on URL. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Central Desktop you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Eccentex Appbase For Azure Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/eccentex-appbase-for-azure-tutorial.md
+
+ Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with Eccentex AppBase for Azure | Microsoft Docs'
+description: Learn how to configure single sign-on between Azure Active Directory and Eccentex AppBase for Azure.
+Last updated : 08/02/2021
+# Tutorial: Azure Active Directory single sign-on (SSO) integration with Eccentex AppBase for Azure
+
+In this tutorial, you'll learn how to integrate Eccentex AppBase for Azure with Azure Active Directory (Azure AD). When you integrate Eccentex AppBase for Azure with Azure AD, you can:
+
+* Control in Azure AD who has access to Eccentex AppBase for Azure.
+* Enable your users to be automatically signed-in to Eccentex AppBase for Azure with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Eccentex AppBase for Azure single sign-on (SSO) enabled subscription.
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Eccentex AppBase for Azure supports **SP** initiated SSO.
+
+* Eccentex AppBase for Azure supports **Just In Time** user provisioning.
+
+## Add Eccentex AppBase for Azure from the gallery
+
+To configure the integration of Eccentex AppBase for Azure into Azure AD, you need to add Eccentex AppBase for Azure from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Eccentex AppBase for Azure** in the search box.
+1. Select **Eccentex AppBase for Azure** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Eccentex AppBase for Azure
+
+Configure and test Azure AD SSO with Eccentex AppBase for Azure using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Eccentex AppBase for Azure.
+
+To configure and test Azure AD SSO with Eccentex AppBase for Azure, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Eccentex AppBase for Azure SSO](#configure-eccentex-appbase-for-azure-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create Eccentex AppBase for Azure test user](#create-eccentex-appbase-for-azure-test-user)** - to have a counterpart of B.Simon in Eccentex AppBase for Azure that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Eccentex AppBase for Azure** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier (Entity ID)** text box, type a URL using one of the following patterns:
+
+ | **Identifier** |
+ |--|
+ | `https://<CustomerName>.appbase.com/Ecx.Web` |
+ | `https://<CustomerName>.eccentex.com:<PortNumber>/Ecx.Web` |
+
+ b. In the **Sign on URL** text box, type a URL using one of the following patterns:
+
+ | **Sign on URL** |
+    |--|
+ | `https://<CustomerName>.appbase.com/Ecx.Web/Account/sso?tenantCode=<TenantCode>&authCode=<AuthConfigurationCode>`|
+ | `https://<CustomerName>.eccentex.com:<PortNumber>/Ecx.Web/Account/sso?tenantCode=<TenantCode>&authCode=<AuthConfigurationCode>` |
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [Eccentex AppBase for Azure Client support team](mailto:eccentex.support@eccentex.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Raw)** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/certificateraw.png)
+
+1. On the **Set up Eccentex AppBase for Azure** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Eccentex AppBase for Azure.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Eccentex AppBase for Azure**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
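The assignment above can also be made programmatically through the Microsoft Graph `appRoleAssignedTo` endpoint on the app's service principal. The sketch below is an illustration under stated assumptions: the token, user object ID, and enterprise application (service principal) object ID are placeholders, and the all-zero `appRoleId` is used on the assumption that the app exposes only the Default Access role.

```python
import requests

# Placeholders: a Graph token with AppRoleAssignment.ReadWrite.All, the object ID
# of the B.Simon test user, and the object ID of the Eccentex AppBase for Azure
# enterprise application (service principal) in your tenant.
GRAPH_TOKEN = "<access-token>"
USER_OBJECT_ID = "<user-object-id>"
SERVICE_PRINCIPAL_ID = "<enterprise-app-object-id>"

assignment = {
    "principalId": USER_OBJECT_ID,        # the user being assigned
    "resourceId": SERVICE_PRINCIPAL_ID,   # the enterprise application
    # The all-zero GUID represents the Default Access role.
    "appRoleId": "00000000-0000-0000-0000-000000000000",
}

response = requests.post(
    f"https://graph.microsoft.com/v1.0/servicePrincipals/{SERVICE_PRINCIPAL_ID}/appRoleAssignedTo",
    headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
    json=assignment,
)
response.raise_for_status()
print("Assignment ID:", response.json()["id"])
```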
+## Configure Eccentex AppBase for Azure SSO
+
+1. Log in to your Eccentex AppBase for Azure company site as an administrator.
+
+1. Go to **Gear** icon and click **Manage Users**.
+
+ ![Screenshot shows settings of SAML account.](./media/eccentex-appbase-for-azure-tutorial/settings.png "Account")
+
+1. Navigate to **User Management** > **Auth Configurations** and click **Add SAML** button.
+
+ ![Screenshot shows SAML settings.](./media/eccentex-appbase-for-azure-tutorial/users.png "SAML settings")
+
+1. In the **New SAML Configuration** page, perform the following steps.
+
+ ![Screenshot shows the Azure SAML configuration.](./media/eccentex-appbase-for-azure-tutorial/configuration.png "SAML Configuration")
+
+ 1. In the **Name** textbox, type a short configuration name.
+
+ 1. In the **Issuer Url** textbox, enter the Azure **Application ID** which you have copied from the Azure portal.
+
+    1. Copy the **Application Url** value and paste it into the **Identifier (Entity ID)** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. In the **AppBase New Users Onboarding**, select **Invitation Only** from the dropdown.
+
+ 1. In the **AppBase Authentication Failure Behavior**, select **Display Error Page** from the dropdown.
+
+ 1. Select **Signature Digest Method** and **Signature Method** according to your certificate encryption.
+
+ 1. In the **Use Certificate**, select **Manual Uploading** from the dropdown.
+
+ 1. In the **Authentication Context Class Name**, select **Password** from the dropdown.
+
+ 1. In the **Service Provider to Identity Provider Binding**, select **HTTP-Redirect** from the dropdown.
+
+ > [!NOTE]
+    > Make sure the **Sign Outbound Requests** checkbox is not selected.
+
+    1. Copy the **Assertion Consumer Service Url** value and paste it into the **Reply URL** text box in the **Basic SAML Configuration** section in the Azure portal.
+
+ 1. In the **Auth Request Destination Url** textbox, paste the **Login URL** value which you have copied from the Azure portal.
+
+ 1. In the **Service Provider Resource URL** textbox, paste the **Login URL** value which you have copied from the Azure portal.
+
+ 1. In the **Artifact Identification Url** textbox, paste the **Login URL** value which you have copied from the Azure portal.
+
+ 1. In the **Auth Request Protocol Binding**, select **HTTP-POST** from the dropdown.
+
+ 1. In the **Auth Request Name ID Policy**, select **Persistent** from the dropdown.
+
+ 1. In the **Artifact Responder URL** textbox, paste the **Login URL** value which you have copied from the Azure portal.
+
+    1. Select the **Enforce Response Signature Verification** checkbox.
+
+    1. Open the downloaded **Certificate (Raw)** from the Azure portal in Notepad and paste its content into the **SAML Mutual Certificate Upload** textbox.
+
+ 1. In the **Logout Response Protocol Binding**, select **HTTP-POST** from the dropdown.
+
+ 1. In the **AppBase Custom Logout URL** textbox, paste the **Logout URL** value which you have copied from the Azure portal.
+
+ 1. Click **Save**.
+
+### Create Eccentex AppBase for Azure test user
+
+In this section, a user called Britta Simon is created in Eccentex AppBase for Azure. Eccentex AppBase for Azure supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Eccentex AppBase for Azure, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click on **Test this application** in Azure portal. This will redirect to Eccentex AppBase for Azure Sign-on URL where you can initiate the login flow.
+
+* Go to Eccentex AppBase for Azure Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the Eccentex AppBase for Azure tile in My Apps, you're redirected to the Eccentex AppBase for Azure sign-on URL. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Eccentex AppBase for Azure you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Github Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/github-tutorial.md
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Basic SAML Configuration** section, enter the values for the following fields:
- a. In the **Sign on URL** text box, type a URL using the following pattern:
- `https://github.com/orgs/<Organization ID>/sso`
-
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ a. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
`https://github.com/orgs/<Organization ID>`
- c. In the **Reply URL** text box, type a URL using the following pattern:
+ b. In the **Reply URL** text box, type a URL using the following pattern:
`https://github.com/orgs/<Organization ID>/saml/consume`
+ c. In the **Sign on URL** text box, type a URL using the following pattern:
+ `https://github.com/orgs/<Organization ID>/sso`
+   > [!NOTE]
+   > These values are not real. Update them with the actual Sign on URL, Identifier, and Reply URL. We recommend that you use a unique string value for the Identifier. Go to the GitHub Admin section to retrieve these values.
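All three values are derived from the same GitHub organization ID. The short sketch below only illustrates that relationship; `contoso-org` is a made-up placeholder, not a real organization.

```python
# Hypothetical organization ID used only for illustration.
organization_id = "contoso-org"

identifier = f"https://github.com/orgs/{organization_id}"
reply_url = f"https://github.com/orgs/{organization_id}/saml/consume"
sign_on_url = f"https://github.com/orgs/{organization_id}/sso"

print("Identifier: ", identifier)
print("Reply URL:  ", reply_url)
print("Sign on URL:", sign_on_url)
```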
In this section, you test your Azure AD single sign-on configuration with follow
## Next steps
-Once you configure GitHub you can enforce Session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad)
+Once you configure GitHub you can enforce Session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad)
active-directory Impacriskmanager Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/impacriskmanager-tutorial.md
Previously updated : 03/14/2019 Last updated : 08/04/2021

# Tutorial: Azure Active Directory integration with IMPAC Risk Manager
-In this tutorial, you learn how to integrate IMPAC Risk Manager with Azure Active Directory (Azure AD).
-Integrating IMPAC Risk Manager with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate IMPAC Risk Manager with Azure Active Directory (Azure AD). When you integrate IMPAC Risk Manager with Azure AD, you can:
-* You can control in Azure AD who has access to IMPAC Risk Manager.
-* You can enable your users to be automatically signed-in to IMPAC Risk Manager (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to IMPAC Risk Manager.
+* Enable your users to be automatically signed-in to IMPAC Risk Manager with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with IMPAC Risk Manager, you need the following items:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* IMPAC Risk Manager single sign-on enabled subscription
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* IMPAC Risk Manager single sign-on (SSO) enabled subscription.
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* IMPAC Risk Manager supports **SP and IDP** initiated SSO
-
-## Adding IMPAC Risk Manager from the gallery
-
-To configure the integration of IMPAC Risk Manager into Azure AD, you need to add IMPAC Risk Manager from the gallery to your list of managed SaaS apps.
-
-**To add IMPAC Risk Manager from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
+* IMPAC Risk Manager supports **SP and IDP** initiated SSO.
-4. In the search box, type **IMPAC Risk Manager**, select **IMPAC Risk Manager** from result panel then click **Add** button to add the application.
+> [!NOTE]
+> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
- ![IMPAC Risk Manager in the results list](common/search-new-app.png)
+## Add IMPAC Risk Manager from the gallery
-## Configure and test Azure AD single sign-on
-
-In this section, you configure and test Azure AD single sign-on with IMPAC Risk Manager based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in IMPAC Risk Manager needs to be established.
-
-To configure and test Azure AD single sign-on with IMPAC Risk Manager, you need to complete the following building blocks:
-
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure IMPAC Risk Manager Single Sign-On](#configure-impac-risk-manager-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create IMPAC Risk Manager test user](#create-impac-risk-manager-test-user)** - to have a counterpart of Britta Simon in IMPAC Risk Manager that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure the integration of IMPAC Risk Manager into Azure AD, you need to add IMPAC Risk Manager from the gallery to your list of managed SaaS apps.
-### Configure Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **IMPAC Risk Manager** in the search box.
+1. Select **IMPAC Risk Manager** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure and test Azure AD SSO for IMPAC Risk Manager
-To configure Azure AD single sign-on with IMPAC Risk Manager, perform the following steps:
+Configure and test Azure AD SSO with IMPAC Risk Manager using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in IMPAC Risk Manager.
-1. In the [Azure portal](https://portal.azure.com/), on the **IMPAC Risk Manager** application integration page, select **Single sign-on**.
+To configure and test Azure AD SSO with IMPAC Risk Manager, perform the following steps:
- ![Configure single sign-on link](common/select-sso.png)
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure IMPAC Risk Manager SSO](#configure-impac-risk-manager-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create IMPAC Risk Manager test user](#create-impac-risk-manager-test-user)** - to have a counterpart of B.Simon in IMPAC Risk Manager that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
+## Configure Azure AD SSO
- ![Single sign-on select mode](common/select-saml-option.png)
+Follow these steps to enable Azure AD SSO in the Azure portal.
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
+1. In the Azure portal, on the **IMPAC Risk Manager** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following steps:
- ![Screenshot shows the Basic SAML Configuration, where you can enter Identifier, Reply U R L, and select Save.](common/idp-intiated.png)
-
- a. In the **Identifier** text box, type a value provided by IMPAC
+ a. In the **Identifier** text box, type a value provided by IMPAC.
- b. In the **Reply URL** text box, type a URL using the following pattern:
+ b. In the **Reply URL** text box, type a URL using one of the following patterns:
| Environment | URL Pattern | | | |
To configure Azure AD single sign-on with IMPAC Risk Manager, perform the follow
5. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- ![Screenshot shows Set additional U R Ls where you can enter a Sign on U R L.](common/metadata-upload-additional-signon.png)
-
- In the **Sign-on URL** text box, type a URL using the following pattern:
+ In the **Sign-on URL** text box, type a URL using one of the following patterns:
| Environment | URL Pattern | | | |
To configure Azure AD single sign-on with IMPAC Risk Manager, perform the follow
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure AD Identifier
-
- c. Logout URL
-
-### Configure IMPAC Risk Manager Single Sign-On
-
-To configure single sign-on on **IMPAC Risk Manager** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [IMPAC Risk Manager support team](mailto:rmsupport@Impac.co.nz). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type **brittasimon@yourcompanydomain.extension**
- For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to IMPAC Risk Manager.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to IMPAC Risk Manager.
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **IMPAC Risk Manager**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **IMPAC Risk Manager**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![Enterprise applications blade](common/enterprise-applications.png)
+## Configure IMPAC Risk Manager SSO
-2. In the applications list, select **IMPAC Risk Manager**.
-
- ![The IMPAC Risk Manager link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
+To configure single sign-on on **IMPAC Risk Manager** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [IMPAC Risk Manager support team](mailto:rmsupport@Impac.co.nz). They set this setting to have the SAML SSO connection set properly on both sides.
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
+### Create IMPAC Risk Manager test user
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
+In this section, you create a user called Britta Simon in IMPAC Risk Manager. Work with [IMPAC Risk Manager support team](mailto:rmsupport@Impac.co.nz) to add the users in the IMPAC Risk Manager platform. Users must be created and activated before you use single sign-on.
-7. In the **Add Assignment** dialog click the **Assign** button.
+## Test SSO
-### Create IMPAC Risk Manager test user
+In this section, you test your Azure AD single sign-on configuration with the following options.
-In this section, you create a user called Britta Simon in IMPAC Risk Manager. Work with [IMPAC Risk Manager support team](mailto:rmsupport@Impac.co.nz) to add the users in the IMPAC Risk Manager platform. Users must be created and activated before you use single sign-on.
+#### SP initiated:
-### Test single sign-on
+* Click on **Test this application** in Azure portal. This will redirect to IMPAC Risk Manager Sign on URL where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to IMPAC Risk Manager Sign-on URL directly and initiate the login flow from there.
-When you click the IMPAC Risk Manager tile in the Access Panel, you should be automatically signed in to the IMPAC Risk Manager for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+#### IDP initiated:
-## Additional Resources
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the IMPAC Risk Manager for which you set up the SSO.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the IMPAC Risk Manager tile in My Apps, if the application is configured in SP mode, you're redirected to the application sign-on page to initiate the login flow; if it's configured in IDP mode, you're automatically signed in to the IMPAC Risk Manager instance for which you set up SSO. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure IMPAC Risk Manager you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Saphana Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/saphana-tutorial.md
Previously updated : 12/27/2020 Last updated : 08/02/2021

# Tutorial: Azure Active Directory integration with SAP HANA
To test the steps in this tutorial, follow these recommendations:
In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* SAP HANA supports **IDP** initiated SSO
-* SAP HANA supports **just-in-time** user provisioning
+* SAP HANA supports **IDP** initiated SSO.
+* SAP HANA supports **just-in-time** user provisioning.
> [!NOTE]
> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
If you need to create a user manually, take the following steps:
3. Select **Add** to add the SAML IDP. Select the appropriate SAML IDP, and then select **OK**.
-4. Add the **External Identity** (in this case, BrittaSimon) or choose **Any**. Then select **OK**.
+4. Add the **External Identity** (in this case, BrittaSimon). Then select **OK**.
> [!Note]
- > If the **Any** check box is not selected, then the user name in HANA needs to exactly match the name of the user in the UPN before the domain suffix. (For example, BrittaSimon@contoso.com becomes BrittaSimon in HANA.)
+ > You have to populate the **External Identity** field for the user, and it must match the **NameID** field in the SAML token from Azure AD. The **Any** checkbox should not be selected, because that option requires the IdP to send the SPProvidedID property in the NameID field, which is currently not supported by Azure AD. Please refer to [this](https://help.sap.com/viewer/b3ee5778bc2e4a089d3299b82ec762a7/2.0.05/en-US/db6db355bb571014b56eb25057daec5f.html) document for more details.
5. For testing purposes, assign all **XS** roles to the user.
active-directory Zylo Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/active-directory/saas-apps/zylo-tutorial.md
Previously updated : 08/12/2020 Last updated : 08/04/2021
In this tutorial, you'll learn how to integrate Zylo with Azure Active Directory
* Enable your users to be automatically signed-in to Zylo with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:
To get started, you need the following items:
In this tutorial, you configure and test Azure AD SSO in a test environment.
-* Zylo supports **SP and IDP** initiated SSO
-* Zylo supports **Just In Time** user provisioning
-* Once you configure Zylo you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-any-app).
+* Zylo supports **SP and IDP** initiated SSO.
+* Zylo supports **Just In Time** user provisioning.
> [!NOTE]
> Identifier of this application is a fixed string value so only one instance can be configured in one tenant.
-## Adding Zylo from the gallery
+## Add Zylo from the gallery
To configure the integration of Zylo into Azure AD, you need to add Zylo from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add new application, select **New application**.
1. In the **Add from the gallery** section, type **Zylo** in the search box.
1. Select **Zylo** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-
## Configure and test Azure AD SSO for Zylo

Configure and test Azure AD SSO with Zylo using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Zylo.
-To configure and test Azure AD SSO with Zylo, complete the following building blocks:
+To configure and test Azure AD SSO with Zylo, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
    1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
To configure and test Azure AD SSO with Zylo, complete the following building bl
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Zylo** application integration page, find the **Manage** section and select **single sign-on**.
+1. In the Azure portal, on the **Zylo** application integration page, find the **Manage** section and select **single sign-on**.
1. On the **Select a single sign-on method** page, select **SAML**.
-1. On the **Set up single sign-on with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
-1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, enter the values for the following fields:
+1. On the **Basic SAML Configuration** section, if you wish to configure the application in **IDP** initiated mode, perform the following step:
In the **Reply URL** text box, type a URL using the following pattern:
`https://api.zylo.com/saml/sso/azuread/<CUSTOMER_NAME>`
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **Zylo**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add User link](common/add-assign-user.png)
-
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
1. Click on **Menu** of Zylo at the top-right corner and select **Admin**
- ![Configuration for Zylo](./media/zylo-tutorial/click-admin.png)
+ ![Configuration for Zylo.](./media/zylo-tutorial/click-admin.png)
1. In the **Admin** page, go to the **Saml Info** tab and perform the following steps:
- ![Configuration](./media/zylo-tutorial/saml-configuration-zylo.png)
+ ![Zylo SAML Configuration.](./media/zylo-tutorial/configuration.png)
a. Change the **Zylo SAML Configuration** to **On**.
In this section, you'll enable B.Simon to use Azure single sign-on by granting a
   f. Click on **Save**.
-
### Create Zylo test user

In this section, a user called B.Simon is created in Zylo. Zylo supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Zylo, a new one is created after authentication.

## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
-When you click the Zylo tile in the Access Panel, you should be automatically signed in to the Zylo for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](../user-help/my-apps-portal-end-user-access.md).
+* Click on **Test this application** in Azure portal. This will redirect to Zylo Sign on URL where you can initiate the login flow.
-## Additional resources
+* Go to Zylo Sign-on URL directly and initiate the login flow from there.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+#### IDP initiated:
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+* Click on **Test this application** in Azure portal and you should be automatically signed in to the Zylo for which you set up the SSO.
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+You can also use Microsoft My Apps to test the application in any mode. When you click the Zylo tile in My Apps, if the application is configured in SP mode, you're redirected to the application sign-on page to initiate the login flow; if it's configured in IDP mode, you're automatically signed in to the Zylo instance for which you set up SSO. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [Try Zylo with Azure AD](https://aad.portal.azure.com/)
+## Next steps
-- [What is session control in Microsoft Cloud App Security?](/cloud-app-security/proxy-intro-aad)
+Once you configure Zylo you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
advisor Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/advisor/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure Advisor
description: Sample Azure Resource Graph queries for Azure Advisor showing use of resource types and tables to access Azure Advisor related resources and properties.
Previously updated : 07/21/2021 Last updated : 08/04/2021
aks Concepts Identity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/concepts-identity.md
When creating a cluster with specific attributes, you will need the following ad
| `Microsoft.Network/virtualNetworks/subnets/read` | Required if using an internal load balancer in another resource group. Required to verify if a subnet already exists for the internal load balancer in the resource group. | | `Microsoft.Network/privatednszones/*` | Required if using a private DNS zone in another resource group such as a custom privateDNSZone. |
+## AKS Node Access
+
+By default, node access is not required for AKS. The following access is needed for the node only if a specific component is used.
+
+| Access | Reason |
+|--|--|
+| `kubelet` | Required for customer to grant MSI access to ACR. |
+| `http app routing` | Required for write permission to "random name".aksapp.io. |
+| `container insights` | Required for customer to grant permission to the Log Analytics workspace. |
+ ## Kubernetes RBAC Kubernetes RBAC provides granular filtering of user actions. With this control mechanism:
aks Node Auto Repair https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/aks/node-auto-repair.md
If AKS identifies an unhealthy node that remains unhealthy for 10 minutes, AKS t
1. Reboot the node. 1. If the reboot is unsuccessful, reimage the node.
-1. If the reimage is unsuccessful, create and reimage a new node.
Alternative remediations are investigated by AKS engineers if auto-repair is unsuccessful.
app-service Manage Custom Dns Buy Domain https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/app-service/manage-custom-dns-buy-domain.md
After you purchase the App Service Domain, you have five days to cancel your pur
1. In the **App Service Domains** section, select the domain you want to configure.
-1. In the domain's left navigation, select **Hostname bindings**. The hostname bindings from all Azure services are listed here.
+1. In the domain's left navigation, select **Locks**.
- ![Screenshot that shows the Hostname bindings page.](./media/custom-dns-web-site-buydomains-web-app/dncmntask-cname-buydomains-hostname-bindings.png)
+ A delete lock has been created for your domain. As long as a delete lock exists, you can't delete the App Service domain.
-1. Delete each hostname binding by selecting **...** > **Delete**. After all the bindings are deleted, select **Save**.
-
- <!-- ![Screenshot that shows where to delete the hostname bindings.](./media/custom-dns-web-site-buydomains-web-app/dncmntask-cname-buydomains-delete-hostname-bindings.png) -->
+1. Click **Delete** to remove the lock.
1. In the domain's left navigation, select **Overview**.
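If you prefer to remove the delete lock programmatically instead of through the **Locks** page, a sketch using the `azure-mgmt-resource` management locks client might look like the following. The subscription, resource group, and domain name are placeholders, and because the lock name on your domain may differ, the code lists the locks and removes any delete lock it finds rather than assuming a specific name.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource.locks import ManagementLockClient

# Placeholders: substitute your own subscription, resource group, and domain name.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
DOMAIN_NAME = "contoso.com"  # hypothetical App Service domain name

client = ManagementLockClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Enumerate the locks on the App Service domain and delete any CanNotDelete lock.
locks = client.management_locks.list_at_resource_level(
    RESOURCE_GROUP, "Microsoft.DomainRegistration", "", "domains", DOMAIN_NAME
)
for lock in locks:
    if lock.level == "CanNotDelete":
        client.management_locks.delete_at_resource_level(
            RESOURCE_GROUP, "Microsoft.DomainRegistration", "", "domains",
            DOMAIN_NAME, lock.name,
        )
        print(f"Removed delete lock: {lock.name}")
```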
applied-ai-services Data Feeds From Different Sources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/applied-ai-services/metrics-advisor/data-feeds-from-different-sources.md
# How-to: Connect different data sources
-Use this article to find the settings and requirements for connecting different types of data sources to Metrics Advisor. Make sure to read how to [Onboard your data](how-tos/onboard-your-data.md) to learn about the key concepts for using your data with Metrics Advisor.
+Use this article to find the settings and requirements for connecting different types of data sources to Azure Metrics Advisor. To learn about using your data with Metrics Advisor, see [Onboard your data](how-tos/onboard-your-data.md).
## Supported authentication types | Authentication types | Description | | |-|
-|**Basic** | You need to provide basic parameters for accessing data sources. For example, a connection string or a password. Data feed admins can view these credentials. |
-| **Azure Managed Identity** | [Managed identities](../../active-directory/managed-identities-azure-resources/overview.md) for Azure resources is a feature of Azure Active Directory. It provides Azure services with an automatically managed identity in Azure AD. You can use the identity to authenticate to any service that supports Azure AD authentication.|
-| **Azure SQL Connection String**| Store your AzureSQL connection string as a **credential entity** in Metrics Advisor, and use it directly each time when onboarding metrics data. Only admins of the credential entity can view these credentials, but enables authorized viewers to create data feeds without needing to know details for the credentials. |
-| **Data Lake Gen2 Shared Key**| Store your data lake account key as a **credential entity** in Metrics Advisor and use it directly each time when onboarding metrics data. Only admins of the Credential entity can view these credentials, but enables authorized viewers to create data feed without needing to know the credential details.|
-| **Service principal**| Store your [Service Principal](../../active-directory/develop/app-objects-and-service-principals.md) as a **credential entity** in Metrics Advisor and use it directly each time when onboarding metrics data. Only admins of Credential entity can view the credentials, but enables authorized viewers to create data feed without needing to know the credential details.|
-| **Service principal from key vault**|Store your [Service Principal in a Key Vault](/azure-stack/user/azure-stack-key-vault-store-credentials) as a **credential entity** in Metrics Advisor and use it directly each time when onboarding metrics data. Only admins of a **credential entity** can view the credentials, but also leave viewers able to create data feed without needing to know detailed credentials. |
+|**Basic** | You need to provide basic parameters for accessing data sources. For example, you can use a connection string or a password. Data feed admins can view these credentials. |
+| **Azure managed identity** | [Managed identities](../../active-directory/managed-identities-azure-resources/overview.md) for Azure resources is a feature of Azure Active Directory (Azure AD). It provides Azure services with an automatically managed identity in Azure AD. You can use the identity to authenticate to any service that supports Azure AD authentication.|
+| **Azure SQL connection string**| Store your Azure SQL connection string as a credential entity in Metrics Advisor, and use it directly each time you import metrics data. Only admins of the credential entity can view these credentials, but authorized viewers can create data feeds without needing to know details for the credentials. |
+| **Azure Data Lake Storage Gen2 shared key**| Store your data lake account key as a credential entity in Metrics Advisor, and use it directly each time you import metrics data. Only admins of the credential entity can view these credentials, but authorized viewers can create data feeds without needing to know details for the credentials.|
+| **Service principal**| Store your [service principal](../../active-directory/develop/app-objects-and-service-principals.md) as a credential entity in Metrics Advisor, and use it directly each time you import metrics data. Only admins of the credential entity can view the credentials, but authorized viewers can create data feeds without needing to know details for the credentials.|
+| **Service principal from key vault**|Store your [service principal in a key vault](/azure-stack/user/azure-stack-key-vault-store-credentials) as a credential entity in Metrics Advisor, and use it directly each time you import metrics data. Only admins of a credential entity can view the credentials, but viewers can create data feeds without needing to know details for the credentials. |
-## Data sources supported and corresponding authentication types
+## Data sources and corresponding authentication types
-| Data sources | Authentication Types |
+| Data sources | Authentication types |
|-| |
-|[**Azure Application Insights**](#appinsights) | Basic |
-|[**Azure Blob Storage (JSON)**](#blob) | Basic<br>ManagedIdentity |
-|[**Azure Cosmos DB (SQL)**](#cosmosdb) | Basic |
-|[**Azure Data Explorer (Kusto)**](#kusto) | Basic<br>Managed Identity<br>Service principal<br>Service principal from key vault |
-|[**Azure Data Lake Storage Gen2**](#adl) | Basic<br>Data Lake Gen2 Shared Key<br>Service principal<br>Service principal from key vault |
-|[**Azure Event Hubs**](#eventhubs) | Basic |
-|[**Azure Log Analytics**](#log) | Basic<br>Service principal<br>Service principal from key vault |
-|[**Azure SQL Database / SQL Server**](#sql) | Basic<br>Managed Identity<br>Service principal<br>Service principal from key vault<br>Azure SQL Connection String |
-|[**Azure Table Storage**](#table) | Basic |
-|[**InfluxDB (InfluxQL)**](#influxdb) | Basic |
-|[**MongoDB**](#mongodb) | Basic |
-|[**MySQL**](#mysql) | Basic |
-|[**PostgreSQL**](#pgsql) | Basic|
-|[**Local files(CSV)**](#csv) | Basic|
+|[Application Insights](#appinsights) | Basic |
+|[Azure Blob Storage (JSON)](#blob) | Basic<br>Managed identity |
+|[Azure Cosmos DB (SQL)](#cosmosdb) | Basic |
+|[Azure Data Explorer (Kusto)](#kusto) | Basic<br>Managed identity<br>Service principal<br>Service principal from key vault |
+|[Azure Data Lake Storage Gen2](#adl) | Basic<br>Data Lake Storage Gen2 shared key<br>Service principal<br>Service principal from key vault |
+|[Azure Event Hubs](#eventhubs) | Basic |
+|[Azure Monitor Logs](#log) | Basic<br>Service principal<br>Service principal from key vault |
+|[Azure SQL Database / SQL Server](#sql) | Basic<br>Managed identity<br>Service principal<br>Service principal from key vault<br>Azure SQL connection string |
+|[Azure Table Storage](#table) | Basic |
+|[InfluxDB (InfluxQL)](#influxdb) | Basic |
+|[MongoDB](#mongodb) | Basic |
+|[MySQL](#mysql) | Basic |
+|[PostgreSQL](#pgsql) | Basic|
+|[Local files (CSV)](#csv) | Basic|
The following sections specify the parameters required for all authentication types within different data source scenarios.
-## <span id="appinsights">Azure Application Insights</span>
+## <span id="appinsights">Application Insights</span>
-* **Application ID**: This is used to identify this application when using the Application Insights API. To get the Application ID, take the following steps:
+* **Application ID**: This is used to identify this application when you're using the Application Insights API. To get the application ID, follow these steps:
- 1. From your Application Insights resource, click API Access.
+ 1. From your Application Insights resource, select **API Access**.
- ![Get application ID from your Application Insights resource](media/portal-app-insights-app-id.png)
+ ![Screenshot that shows how to get the application ID from your Application Insights resource.](media/portal-app-insights-app-id.png)
- 2. Copy the Application ID generated into **Application ID** field in Metrics Advisor.
+ 2. Copy the application ID generated into the **Application ID** field in Metrics Advisor.
-* **API Key**: API keys are used by applications outside the browser to access this resource. To get the API key, take the following steps:
+* **API key**: API keys are used by applications outside the browser to access this resource. To get the API key, follow these steps:
- 1. From the Application Insights resource, click **API Access**.
+ 1. From the Application Insights resource, select **API Access**.
- 2. Click **Create API Key**.
+ 2. Select **Create API key**.
- 3. Enter a short description, check the **Read telemetry** option, and click the **Generate key** button.
+ 3. Enter a short description, select the **Read telemetry** option, and select **Generate key**.
- ![Get API key in Azure portal](media/portal-app-insights-app-id-api-key.png)
+ ![Screenshot that shows how to get the API key in the Azure portal.](media/portal-app-insights-app-id-api-key.png)
- > [!WARNING]
- > Copy this **API key** and save it because this key will never be shown to you again. If you lose this key, you have to create a new one.
+ > [!IMPORTANT]
+ > Copy and save this API key. It will never be shown to you again. If you lose this key, you have to create a new one.
4. Copy the API key to the **API key** field in Metrics Advisor.
-* **Query**: Azure Application Insights logs are built on Azure Data Explorer, and Azure Monitor log queries use a version of the same Kusto query language. The [Kusto query language documentation](/azure/data-explorer/kusto/query) has all of the details for the language and should be your primary resource for writing a query against Application Insights.
+* **Query**: Application Insights logs are built on Azure Data Explorer, and Azure Monitor log queries use a version of the same Kusto query language. The [Kusto query language documentation](/azure/data-explorer/kusto/query) should be your primary resource for writing a query against Application Insights.
Sample query:
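    As a rough illustration only (the `requests` table, the column names, and the bin size here are assumptions, not the article's own sample), a Kusto query of this shape aggregates a per-dimension count over the queried interval:

    ```
    requests
    | where timestamp >= datetime(@IntervalStart) and timestamp < datetime(@IntervalEnd)
    | summarize request_count = count() by resultCode, bin(timestamp, 1h)
    ```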
## <span id="blob">Azure Blob Storage (JSON)</span>
-* **Connection String**: There are two authentication types for Azure Blob Storage(JSON), one is **Basic**, the other is **Managed Identity**.
+* **Connection string**: There are two authentication types for Azure Blob Storage (JSON):
- * **Basic**: See [Configure Azure Storage connection strings](../../storage/common/storage-configure-connection-string.md#configure-a-connection-string-for-an-azure-storage-account) for information on retrieving this string. Also, you can visit the Azure portal for your Azure Blob Storage resource, and find connection string directly in the **Settings > Access keys** section.
+ * **Basic**: See [Configure Azure Storage connection strings](../../storage/common/storage-configure-connection-string.md#configure-a-connection-string-for-an-azure-storage-account) for information on retrieving this string. Also, you can visit the Azure portal for your Azure Blob Storage resource, and find the connection string directly in **Settings** > **Access keys**.
- * **Managed Identity**: Managed identities for Azure resources can authorize access to blob and queue data using Azure AD credentials from applications running in Azure virtual machines (VMs), function apps, virtual machine scale sets, and other services.
+ * **Managed identity**: Managed identities for Azure resources can authorize access to blob and queue data. The feature uses Azure AD credentials from applications running in Azure virtual machines (VMs), function apps, virtual machine scale sets, and other services.
- You can create a managed identity in Azure portal for your Azure Blob Storage resource, and choose **role assignments** in **Access Control(IAM)** section, then click **add** to create. A suggested role type is: Storage Blob Data Reader. For more details, refer to [Use managed identity to access Azure Storage](../../active-directory/managed-identities-azure-resources/tutorial-vm-windows-access-storage.md#grant-access-1).
+ You can create a managed identity in the Azure portal for your Azure Blob Storage resource. In **Access Control (IAM)**, select **Role assignments**, and then select **Add**. A suggested role type is: **Storage Blob Data Reader**. For more details, refer to [Use managed identity to access Azure Storage](../../active-directory/managed-identities-azure-resources/tutorial-vm-windows-access-storage.md#grant-access-1).
- ![MI blob](media/managed-identity-blob.png)
+ ![Screenshot that shows a managed identity blob.](media/managed-identity-blob.png)
-* **Container**: Metrics Advisor expects time series data stored as Blob files (one Blob per timestamp) under a single container. This is the container name field.
+* **Container**: Metrics Advisor expects time series data to be stored as blob files (one blob per timestamp), under a single container. This is the container name field.
-* **Blob Template**: Metrics Advisor uses path to find the json file in your Blob storage. This is an example of a Blob file template, which is used to find the json file in your Blob storage: `%Y/%m/FileName_%Y-%m-%d-%h-%M.json`. "%Y/%m" is the path, if you have "%d" in your path, you can add after "%m". If your JSON file is named by date, you could also use `%Y-%m-%d-%h-%M.json`.
+* **Blob template**: Metrics Advisor uses a path to find the JSON file in Blob Storage. This is an example of a blob file template, which is used to find the JSON file in Blob Storage: `%Y/%m/FileName_%Y-%m-%d-%h-%M.json`. `%Y/%m` is the path, and if you have `%d` in your path, you can add it after `%m`. If your JSON file is named by date, you can also use `%Y-%m-%d-%h-%M.json`.
The following parameters are supported:
- * `%Y` is the year formatted as `yyyy`
- * `%m` is the month formatted as `MM`
- * `%d` is the day formatted as `dd`
- * `%h` is the hour formatted as `HH`
- * `%M` is the minute formatted as `mm`
+ * `%Y` is the year, formatted as `yyyy`.
+ * `%m` is the month, formatted as `MM`.
+ * `%d` is the day, formatted as `dd`.
+ * `%h` is the hour, formatted as `HH`.
+ * `%M` is the minute, formatted as `mm`.
- For example, in the following dataset, the blob template should be "%Y/%m/%d/00/JsonFormatV2.json".
+ For example, in the following dataset, the blob template should be `%Y/%m/%d/00/JsonFormatV2.json`.
- ![blob template](media/blob-template.png)
+ ![Screenshot that shows the blob template.](media/blob-template.png)
-* **JSON format version**: Defines the data schema in the JSON files. Currently Metrics Advisor supports two versions, you can choose one to fill in the field:
+* **JSON format version**: Defines the data schema in the JSON files. Metrics Advisor supports the following versions. You can choose one to fill in the field:
- * **v1** (Default value)
+ * **v1** (default value)
Only the metrics *Name* and *Value* are accepted. For example:
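    A minimal sketch of the v1 shape, using hypothetical metric names:

    ``` JSON
    {"count": 11, "revenue": 1.23}
    ```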
## <span id="cosmosdb">Azure Cosmos DB (SQL)</span>
-* **Connection String**: The connection string to access your Azure Cosmos DB. This can be found in the Cosmos DB resource in Azure portal, in **Keys**. Also, you can find more information in [Secure access to data in Azure Cosmos DB](../../cosmos-db/secure-access-to-data.md).
-* **Database**: The database to query against. This can be found in the **Browse** page under **Containers** section in the Azure portal.
-* **Collection ID**: The collection ID to query against. This can be found in the **Browse** page under **Containers** section in the Azure portal.
-* **SQL Query**: A SQL query to get and formulate data into multi-dimensional time series data. You can use the `@IntervalStart` and `@IntervalEnd` variables in your query. They should be formatted: `yyyy-MM-ddTHH:mm:ssZ`.
+* **Connection string**: The connection string to access your Azure Cosmos DB. This can be found in the Azure Cosmos DB resource in the Azure portal, in **Keys**. For more information, see [Secure access to data in Azure Cosmos DB](../../cosmos-db/secure-access-to-data.md).
+* **Database**: The database to query against. In the Azure portal, under **Containers**, go to **Browse** to find the database.
+* **Collection ID**: The collection ID to query against. In the Azure portal, under **Containers**, go to **Browse** to find the collection ID.
+* **SQL query**: A SQL query to get and formulate data into multi-dimensional time series data. You can use the `@IntervalStart` and `@IntervalEnd` variables in your query. They should be formatted as follows: `yyyy-MM-ddTHH:mm:ssZ`.
Sample query:
    SELECT [TimestampColumn], [DimensionColumn], [MetricColumn] FROM [TableName] WHERE [TimestampColumn] >= @IntervalStart and [TimestampColumn] < @IntervalEnd
    ```
- For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md) for more specific examples.
+ For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md).
## <span id="kusto">Azure Data Explorer (Kusto)</span>
-* **Connection String**: There are four authentication types for Azure Data Explorer (Kusto), they are **Basic**, **Service Principal**, **Service Principal From KeyVault**, and **Managed Identity**. The data source in connection string should be in URI format(starts with 'https'), you can find the URI in Azure portal.
+* **Connection string**: There are four authentication types for Azure Data Explorer (Kusto): basic, service principal, service principal from key vault, and managed identity. The data source in the connection string should be in the URI format (starts with "https"). You can find the URI in the Azure portal.
- * **Basic**: Metrics Advisor supports accessing Azure Data Explorer(Kusto) by using Azure AD application authentication. You need to create and register an Azure AD application and then authorize it to access an Azure Data Explorer database, see detail in [Create an AAD app registration in Azure Data Explorer](/azure/data-explorer/provision-azure-ad-app) documentation.
- Here's an example of connection string:
+ * **Basic**: Metrics Advisor supports accessing Azure Data Explorer (Kusto) by using Azure AD application authentication. You need to create and register an Azure AD application, and then authorize it to access an Azure Data Explorer database. For more information, see [Create an Azure AD app registration in Azure Data Explorer](/azure/data-explorer/provision-azure-ad-app). Here's an example of a connection string:
    ```
    Data Source=<URI Server>;Initial Catalog=<Database>;AAD Federated Security=True;Application Client ID=<Application Client ID>;Application Key=<Application Key>;Authority ID=<Tenant ID>
    ```
- * **Service Principal**: A service principal is a concrete instance created from the application object and inherits certain properties from that application object. The service principal object defines what the app can actually do in the specific tenant, who can access the app, and what resources the app can access. There are 3 steps to use service principal in Metrics Advisor.
+ * **Service principal**: A service principal is a concrete instance created from the application object. The service principal inherits certain properties from that application object. The service principal object defines what the app can actually do in the specific tenant, who can access the app, and what resources the app can access. To use a service principal in Metrics Advisor:
- **1. Create Azure AD application registration.** See first part in [Create an AAD app registration in Azure Data Explorer](/azure/data-explorer/provision-azure-ad-app).
+ 1. Create the Azure AD application registration. For more information, see [Create an Azure AD app registration in Azure Data Explorer](/azure/data-explorer/provision-azure-ad-app).
- **2. Manage Azure Data Explorer database permissions.** See [Manage Azure Data Explorer database permissions](/azure/data-explorer/manage-database-permissions) to know about Service Principal and manage permissions.
+ 1. Manage Azure Data Explorer database permissions. For more information, see [Manage Azure Data Explorer database permissions](/azure/data-explorer/manage-database-permissions).
- **3. Create a credential entity in Metrics Advisor.** See how to [create a credential entity](how-tos/credential-entity.md) in Metrics Advisor, so that you can choose that entity when adding data feed for Service Principal authentication type.
+ 1. Create a credential entity in Metrics Advisor. See how to [create a credential entity](how-tos/credential-entity.md) in Metrics Advisor, so that you can choose that entity when you're adding a data feed for the service principal authentication type.
Here's an example of a connection string:
    Data Source=<URI Server>;Initial Catalog=<Database>
    ```
- * **Service Principal From Key Vault**: Key Vault helps to safeguard cryptographic keys and secret values that cloud apps and services use. By using Key Vault, you can encrypt keys and secret values. You should create a service principal first, and then store the service principal inside Key Vault. You can go through [Create a credential entity for Service Principal from Key Vault](how-tos/credential-entity.md#sp-from-kv) to follow detailed procedure to set service principal from key vault.
- Here's an example of connection string:
+ * **Service principal from key vault**: Azure Key Vault helps to safeguard cryptographic keys and secret values that cloud apps and services use. By using Key Vault, you can encrypt keys and secret values. You should create a service principal first, and then store the service principal inside a key vault. For the detailed procedure, see [Create a credential entity for service principal from Key Vault](how-tos/credential-entity.md#sp-from-kv). Here's an example of a connection string:
    ```
    Data Source=<URI Server>;Initial Catalog=<Database>
    ```
- * **Managed Identity**: Managed identity for Azure resources can authorize access to blob and queue data using Azure AD credentials from applications running in Azure virtual machines (VMs), function apps, virtual machine scale sets, and other services. By using managed identity for Azure resources together with Azure AD authentication, you can avoid storing credentials with your applications that run in the cloud. Learn how to [authorize with a managed identity](../../storage/common/storage-auth-aad-msi.md#enable-managed-identities-on-a-vm).
+ * **Managed identity**: Managed identity for Azure resources can authorize access to blob and queue data. Managed identity uses Azure AD credentials from applications running in Azure virtual machines, function apps, virtual machine scale sets, and other services. By using managed identity for Azure resources and Azure AD authentication, you can avoid storing credentials with your applications that run in the cloud. Learn how to [authorize with a managed identity](../../storage/common/storage-auth-aad-msi.md#enable-managed-identities-on-a-vm).
- You can create a managed identity in Azure portal for your Azure Data Explorer (Kusto), choose **Permissions** section, and click **add** to create. The suggested role type is: admin / viewer.
+ You can create a managed identity in the Azure portal for your Azure Data Explorer (Kusto). Select **Permissions** > **Add**. The suggested role type is **admin** or **viewer**.
- ![MI kusto](media/managed-identity-kusto.png)
+ ![Screenshot that shows managed identity for Kusto.](media/managed-identity-kusto.png)
    Here's an example of a connection string:
    ```
    Data Source=<URI Server>;Initial Catalog=<Database>
    ```
- <!-- For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md) for more specific examples. -->
-
-* **Query**: See [Kusto Query Language](/azure/data-explorer/kusto/query) to get and formulate data into multi-dimensional time series data. You can use the `@IntervalStart` and `@IntervalEnd` variables in your query. They should be formatted: `yyyy-MM-ddTHH:mm:ssZ`.
+
+* **Query**: To get and formulate data into multi-dimensional time series data, see [Kusto Query Language](/azure/data-explorer/kusto/query). You can use the `@IntervalStart` and `@IntervalEnd` variables in your query. They should be formatted as follows: `yyyy-MM-ddTHH:mm:ssZ`.
Sample query:
    [TableName] | where [TimestampColumn] >= datetime(@IntervalStart) and [TimestampColumn] < datetime(@IntervalEnd);
    ```
- For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md) for more specific examples.
+ For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md).
## <span id="adl">Azure Data Lake Storage Gen2</span>
-* **Account Name**: There are four authentication types for Azure Data Lake Storage Gen2, they are **Basic**, **Azure Data Lake Storage Gen2 Shared Key**, **Service Principal**, and **Service Principal From KeyVault**.
+* **Account Name**: The authentication types for Azure Data Lake Storage Gen2 are basic, Azure Data Lake Storage Gen2 shared key, service principal, and service principal from Key Vault.
- * **Basic**: The **Account Name** of your Azure Data Lake Storage Gen2. This can be found in your Azure Storage Account (Azure Data Lake Storage Gen2) resource in **Access keys**.
+ * **Basic**: The **Account Name** of your Azure Data Lake Storage Gen2. You can find this in your Azure storage account (Azure Data Lake Storage Gen2) resource, in **Access keys**.
- * **Azure Data Lake Storage Gen2 Shared Key**: First, you should specify the account key to access your Azure Data Lake Storage Gen2 (the same as Account Key in *Basic* authentication type). This could be found in Azure Storage Account (Azure Data Lake Storage Gen2) resource in **Access keys** setting. Then you should [create a credential entity](how-tos/credential-entity.md) for *Azure Data Lake Storage Gen2 Shared Key* type and fill in the account key.
+ * **Azure Data Lake Storage Gen2 shared key**: First, you specify the account key to access your Azure Data Lake Storage Gen2 (this is the same as the account key in the basic authentication type). You can find this in your Azure storage account (Azure Data Lake Storage Gen2) resource, in **Access keys**. Then, you [create a credential entity](how-tos/credential-entity.md) for the Azure Data Lake Storage Gen2 shared key type, and fill in the account key.
- The account name is the same as *Basic* authentication type.
+ The account name is the same as the basic authentication type.
- * **Service Principal**: A service principal is a concrete instance created from the application object and inherits certain properties from that application object. A service principal is created in each tenant where the application is used and references the globally unique app object. The service principal object defines what the app can actually do in the specific tenant, who can access the app, and what resources the app can access.
+ * **Service principal**: A *service principal* is a concrete instance created from the application object, and it inherits certain properties from that application object. A service principal is created in each tenant where the application is used, and it references the globally unique app object. The service principal object defines what the app can actually do in the specific tenant, who can access the app, and what resources the app can access.
- The account name is the same as **Basic** authentication type.
+ The account name is the same as the basic authentication type.
- **Step 1:** Create and register an Azure AD application and then authorize it to access database, see detail in [Create an AAD app registration](/azure/data-explorer/provision-azure-ad-app) documentation.
+ **Step 1:** Create and register an Azure AD application, and then authorize it to access the database. For more information, see [Create an Azure AD app registration](/azure/data-explorer/provision-azure-ad-app).
    **Step 2:** Assign roles.
    1. In the Azure portal, go to the **Storage accounts** service.
- 2. Select the ADLS Gen2 account to use with this application registration.
+ 2. Select the Azure Data Lake Storage Gen2 account to use with this application registration.
- 3. Click **Access Control (IAM)**.
+ 3. Select **Access Control (IAM)**.
- 4. Click **+ Add** and select **Add role assignment** from the dropdown menu.
+ 4. Select **+ Add**, and select **Add role assignment** from the menu.
- 5. Set the **Select** field to the Azure AD application name and set role to **Storage Blob Data Contributor**. Click **Save**.
+ 5. Set the **Select** field to the Azure AD application name, and set the role to **Storage Blob Data Contributor**. Then select **Save**.
- ![lake-service-principals](media/datafeeds/adls-gen-2-app-reg-assign-roles.png)
+ ![Screenshot that shows the steps to assign roles.](media/datafeeds/adls-gen-2-app-reg-assign-roles.png)
- **Step 3:** [Create a credential entity](how-tos/credential-entity.md) in Metrics Advisor, so that you can choose that entity when adding data feed for Service Principal authentication type.
+ **Step 3:** [Create a credential entity](how-tos/credential-entity.md) in Metrics Advisor, so that you can choose that entity when you're adding a data feed for the service principal authentication type.
- * **Service Principal From Key Vault** authentication type: Key Vault helps to safeguard cryptographic keys and secret values that cloud apps and services use. By using Key Vault, you can encrypt keys and secret values. You should create a service principal first, and then store the service principal inside Key Vault. You can go through [Create a credential entity for Service Principal from Key Vault](how-tos/credential-entity.md#sp-from-kv) to follow detailed procedure to set service principal from key vault. The account name is the same as *Basic* authentication type.
+ * **Service principal from Key Vault**: Key Vault helps to safeguard cryptographic keys and secret values that cloud apps and services use. By using Key Vault, you can encrypt keys and secret values. Create a service principal first, and then store the service principal inside a key vault. For more details, see [Create a credential entity for service principal from Key Vault](how-tos/credential-entity.md#sp-from-kv). The account name is the same as the basic authentication type.
-* **Account Key** (only *Basic* needs): Specify the account key to access your Azure Data Lake Storage Gen2. This could be found in Azure Storage Account (Azure Data Lake Storage Gen2) resource in **Access keys** setting.
+* **Account Key** (only necessary for the basic authentication type): Specify the account key to access your Azure Data Lake Storage Gen2. You can find this in your Azure storage account (Azure Data Lake Storage Gen2) resource, in **Access keys**.
-* **File System Name (Container)**: Metrics Advisor will expect your time series data stored as Blob files (one Blob per timestamp) under a single container. This is the container name field. This can be found in your Azure storage account (Azure Data Lake Storage Gen2) instance, and click **'Containers'** in **'Data Lake Storage'** section, then you'll see the container name.
+* **File System Name (Container)**: For Metrics Advisor, you store your time series data as blob files (one blob per timestamp), under a single container. This is the container name field. You can find this in your Azure storage account (Azure Data Lake Storage Gen2) instance. In **Data Lake Storage**, select **Containers**, and then you see the container name.
-* **Directory Template**: This is the directory template of the Blob file. The following parameters are supported:
+* **Directory Template**: This is the directory template of the blob file. The following parameters are supported:
- * `%Y` is the year formatted as `yyyy`
- * `%m` is the month formatted as `MM`
- * `%d` is the day formatted as `dd`
- * `%h` is the hour formatted as `HH`
- * `%M` is the minute formatted as `mm`
+ * `%Y` is the year, formatted as `yyyy`.
+ * `%m` is the month, formatted as `MM`.
+ * `%d` is the day, formatted as `dd`.
+ * `%h` is the hour, formatted as `HH`.
+ * `%M` is the minute, formatted as `mm`.
    Query sample for a daily metric: `%Y/%m/%d`.
    Query sample for an hourly metric: `%Y/%m/%d/%h`.

* **File Template**:
- Metrics Advisor uses path to find the json file in your Blob storage. This is an example of a Blob file template, which is used to find the json file in your Blob storage: `%Y/%m/FileName_%Y-%m-%d-%h-%M.json`. `%Y/%m` is the path, if you have `%d` in your path, you can add after `%m`.
+ Metrics Advisor uses a path to find the JSON file in Blob Storage. The following is an example of a blob file template, which is used to find the JSON file in Blob Storage: `%Y/%m/FileName_%Y-%m-%d-%h-%M.json`. `%Y/%m` is the path, and if you have `%d` in your path, you can add it after `%m`.
The following parameters are supported:
- * `%Y` is the year formatted as `yyyy`
- * `%m` is the month formatted as `MM`
- * `%d` is the day formatted as `dd`
- * `%h` is the hour formatted as `HH`
- * `%M` is the minute formatted as `mm`
+ * `%Y` is the year, formatted as `yyyy`.
+ * `%m` is the month, formatted as `MM`.
+ * `%d` is the day, formatted as `dd`.
+ * `%h` is the hour, formatted as `HH`.
+ * `%M` is the minute, formatted as `mm`.
- Currently Metrics Advisor supports the data schema in the JSON files as follows. For example:
+ Metrics Advisor supports the data schema in the JSON files, as in the following example:
    ``` JSON
    [
## <span id="eventhubs">Azure Event Hubs</span>
-* **Limitations**: There are some limitations with Metrics Advisor Event Hub integration.
+* **Limitations**: Be aware of the following limitations with the integration.
- * Metrics Advisor Event Hubs integration doesn't currently support more than 3 active data feeds in one Metrics Advisor instance in public preview.
- * Metrics Advisor will always start consuming messages from the latest offset, including when re-activating a paused data feed.
+ * Metrics Advisor integration with Event Hubs doesn't currently support more than three active data feeds in one Metrics Advisor instance in public preview.
+ * Metrics Advisor will always start consuming messages from the latest offset, including when reactivating a paused data feed.
* Messages during the data feed pause period will be lost.
- * The data feed 'ingestion start time' is set to the current UTC timestamp automatically when created, and is for reference only.
+ * The data feed ingestion start time is set to the current Coordinated Universal Time timestamp automatically, when the data feed is created. This time is only for reference purposes.
- * Only one data feed can be used per consumer group. To reuse a consumer group from another deleted data feed, you need to wait at least 10 minutes after deletion.
- * The connection string and consumer group cannot be modified after the data feed is created.
- * About messages in Event Hubs: Only JSON is supported, and the JSON values cannot be a nested JSON object. The top-level element can be a JSON object or a JSON array.
+ * Only one data feed can be used per consumer group. To reuse a consumer group from another deleted data feed, you need to wait at least ten minutes after deletion.
+ * The connection string and consumer group can't be modified after the data feed is created.
+ * For Event Hubs messages, only JSON is supported, and the JSON values can't be a nested JSON object. The top-level element can be a JSON object or a JSON array.
- Valid messages as follows:
+ Valid messages are as follows:
    ``` JSON
    Single JSON object
```
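    For illustration (the field names here are hypothetical), both of the following shapes satisfy those constraints: a single flat JSON object, or a top-level JSON array of flat objects.

    ``` JSON
    {"timestamp": "2021-03-30T00:00:00Z", "market": "en-us", "count": 11, "revenue": 1.23}

    [
        {"timestamp": "2021-03-30T00:00:00Z", "market": "en-us", "count": 11, "revenue": 1.23},
        {"timestamp": "2021-03-30T00:00:00Z", "market": "zh-cn", "count": 22, "revenue": 4.56}
    ]
    ```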
-* **Connection String**: Navigate to the **Event Hubs Instance** first. Then add a new policy or choose an existing Shared access policy. Copy the connection string in the pop-up panel.
- ![eventhubs](media/datafeeds/entities-eventhubs.jpg)
+* **Connection String**: Go to the instance of Event Hubs. Then add a new policy or choose an existing shared access policy. Copy the connection string in the pop-up panel.
+ ![Screenshot of Event Hubs.](media/datafeeds/entities-eventhubs.jpg)
- ![shared access policies](media/datafeeds/shared-access-policies.jpg)
+ ![Screenshot of shared access policies.](media/datafeeds/shared-access-policies.jpg)
    Here's an example of a connection string:
    ```
    ```
* **Consumer Group**: A [consumer group](../../event-hubs/event-hubs-features.md#consumer-groups) is a view (state, position, or offset) of an entire event hub.
-This can be found on the "Consumer Groups" menu of an Azure Event Hubs instance. A consumer group can only serve one data feed, otherwise, onboard and ingestion will fail. It is recommended that you create a new consumer group for each data feed.
-* **Timestamp**(optional): Metrics Advisor uses the Event Hubs timestamp as the event timestamp if the user data source does not contain a timestamp field. The timestamp field is optional. If no timestamp column is chosen, we will use the enqueued time as the timestamp.
+You can find this on the **Consumer Groups** menu of an instance of Azure Event Hubs. A consumer group can only serve one data feed. Create a new consumer group for each data feed.
+* **Timestamp** (optional): Metrics Advisor uses the Event Hubs timestamp as the event timestamp, if the user data source doesn't contain a timestamp field. The timestamp field is optional. If no timestamp column is chosen, the service uses the enqueued time as the timestamp.
The timestamp field must match one of these two formats:
- * "YYYY-MM-DDTHH:MM:SSZ" format;
- * Number of seconds or milliseconds from the epoch of 1970-01-01T00:00:00Z.
- No matter which timestamp field it will left align to granularity. For example, if timestamp is "2019-01-01T00:03:00Z", granularity is 5 minutes, then Metrics Advisor aligns the timestamp to "2019-01-01T00:00:00Z". If the event timestamp is "2019-01-01T00:10:00Z", Metrics Advisor uses the timestamp directly without any alignment.
+ * `YYYY-MM-DDTHH:MM:SSZ`
+ * The number of seconds or milliseconds from the epoch of `1970-01-01T00:00:00Z`.
+
+ The timestamp is left-aligned to the granularity. For example, if the timestamp is `2019-01-01T00:03:00Z` and the granularity is 5 minutes, Metrics Advisor aligns the timestamp to `2019-01-01T00:00:00Z`. If the event timestamp is `2019-01-01T00:10:00Z`, Metrics Advisor uses the timestamp directly, without any alignment.
-## <span id="log">Azure Log Analytics</span>
+## <span id="log">Azure Monitor Logs</span>
-There are three authentication types for Azure Log Analytics, they are **Basic**, **Service Principal** and **Service Principal From KeyVault**.
-* **Basic**: You need to fill in **Tenant ID**, **Client ID**, **Client Secret**, **Workspace ID**.
- To get **Tenant ID**, **Client ID**, **Client Secret**, see [Register app or web API](../../active-directory/develop/quickstart-register-app.md).
+Azure Monitor Logs has the following authentication types: basic, service principal, and service principal from Key Vault.
+* **Basic**: You need to fill in **Tenant ID**, **Client ID**, **Client Secret**, and **Workspace ID**.
+ To get **Tenant ID**, **Client ID**, and **Client Secret**, see [Register app or web API](../../active-directory/develop/quickstart-register-app.md). You can find **Workspace ID** in the Azure portal.
- * **Tenant ID**: Specify the tenant ID to access your Log Analytics.
- * **Client ID**: Specify the client ID to access your Log Analytics.
- * **Client Secret**: Specify the client secret to access your Log Analytics.
- * **Workspace ID**: Specify the workspace ID of Log Analytics. For **Workspace ID**, you can find it in Azure portal.
-
- ![workspace id](media/workspace-id.png)
+ ![Screenshot that shows where to find the Workspace ID in the Azure portal.](media/workspace-id.png)
-* **Service Principal**: A service principal is a concrete instance created from the application object and inherits certain properties from that application object. A service principal is created in each tenant where the application is used and references the globally unique app object. The service principal object defines what the app can actually do in the specific tenant, who can access the app, and what resources the app can access.
+* **Service principal**: A service principal is a concrete instance created from the application object, and it inherits certain properties from that application object. A service principal is created in each tenant where the application is used, and it references the globally unique app object. The service principal object defines what the app can actually do in the specific tenant, who can access the app, and what resources the app can access.
- **Step 1:** Create and register an Azure AD application and then authorize it to access a database, see first part in [Create an AAD app registration](/azure/data-explorer/provision-azure-ad-app).
+ **Step 1:** Create and register an Azure AD application, and then authorize it to access a database. For more information, see [Create an Azure AD app registration](/azure/data-explorer/provision-azure-ad-app).
    **Step 2:** Assign roles.
    1. In the Azure portal, go to the **Storage accounts** service.
- 2. Click **Access Control (IAM)**.
- 3. Click **+ Add** and select **Add role assignment** from the dropdown menu.
- 4. Set the **Select** field to the Azure AD application name and set role to **Storage Blob Data Contributor**. Click **Save**.
+ 2. Select **Access Control (IAM)**.
+ 3. Select **+ Add**, and then select **Add role assignment** from the menu.
+ 4. Set the **Select** field to the Azure AD application name, and set the role to **Storage Blob Data Contributor**. Then select **Save**.
- ![lake-service-principals](media/datafeeds/adls-gen-2-app-reg-assign-roles.png)
+ ![Screenshot that shows how to assign roles.](media/datafeeds/adls-gen-2-app-reg-assign-roles.png)
- **Step 3:** [Create a credential entity](how-tos/credential-entity.md) in Metrics Advisor, so that you can choose that entity when adding data feed for Service Principal authentication type.
+ **Step 3:** [Create a credential entity](how-tos/credential-entity.md) in Metrics Advisor, so that you can choose that entity when you're adding a data feed for the service principal authentication type.
-* **Service Principal From Key Vault** authentication type: Key Vault helps to safeguard cryptographic keys and secret values that cloud apps and services use. By using Key Vault, you can encrypt keys and secret values. You should create a service principal first, and then store the service principal inside Key Vault. You can go through [Create a credential entity for Service Principal from Key Vault](how-tos/credential-entity.md#sp-from-kv) to follow detailed procedure to set service principal from key vault.
+* **Service principal from Key Vault**: Key Vault helps to safeguard cryptographic keys and secret values that cloud apps and services use. By using Key Vault, you can encrypt keys and secret values. Create a service principal first, and then store the service principal inside a key vault. For more details, see [Create a credential entity for service principal from Key Vault](how-tos/credential-entity.md#sp-from-kv).
-* **Query**: Specify the query of Log Analytics. For more information, see [Log queries in Azure Monitor](../../azure-monitor/logs/log-query-overview.md)
+* **Query**: Specify the query. For more information, see [Log queries in Azure Monitor](../../azure-monitor/logs/log-query-overview.md).
Sample query:
    | summarize [count_per_dimension]=count() by [Dimension]
    ```
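    For context, a fuller query of the same shape might look like the following sketch; `[TableName]`, `[TimestampColumn]`, and `[Dimension]` are placeholders, and the time filter follows the same `@IntervalStart`/`@IntervalEnd` pattern used by the other query-based sources:

    ```
    [TableName]
    | where [TimestampColumn] >= datetime(@IntervalStart) and [TimestampColumn] < datetime(@IntervalEnd)
    | summarize [count_per_dimension]=count() by [Dimension]
    ```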
- For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md) for more specific examples.
+ For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md).
## <span id="sql">Azure SQL Database | SQL Server</span>
-* **Connection String**: There are five authentication types for Azure SQL Database | SQL Server, they are **Basic**, **Managed Identity**, **Azure SQL Connection String**, **Service Principal** and **Service Principal From KeyVault**.
+* **Connection String**: The authentication types for Azure SQL Database and SQL Server are basic, managed identity, Azure SQL connection string, service principal, and service principal from key vault.
- * **Basic**: Metrics Advisor accepts an [ADO.NET Style Connection String](/dotnet/framework/data/adonet/connection-string-syntax) for sql server data source.
+ * **Basic**: Metrics Advisor accepts an [ADO.NET style connection string](/dotnet/framework/data/adonet/connection-string-syntax) for a SQL Server data source.
    Here's an example of a connection string:
    ```
    Data Source=<Server>;Initial Catalog=<db-name>;User ID=<user-name>;Password=<password>
    ```
- * <span id='jump'>**Managed Identity**</span>: Managed identity for Azure resources can authorize access to blob and queue data using Azure AD credentials from applications running in Azure virtual machines (VMs), function apps, virtual machine scale sets, and other services. By using managed identity for Azure resources together with Azure AD authentication, you can avoid storing credentials with your applications that run in the cloud. To [enable your managed entity](../../active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-sql.md), you can refer to following steps:
- 1. **Enabling a system-assigned managed identity is a one-click experience.** In Azure portal for your Metrics Advisor workspace, set the status as `on` in **Settings > Identity > System assigned**.
+ * <span id='jump'>**Managed identity**</span>: Managed identity for Azure resources can authorize access to blob and queue data. It does so by using Azure AD credentials from applications running in Azure virtual machines, function apps, virtual machine scale sets, and other services. By using managed identity for Azure resources and Azure AD authentication, you can avoid storing credentials with your applications that run in the cloud. To [enable your managed entity](../../active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-sql.md), follow these steps:
+ 1. Enabling a system-assigned managed identity is a one-click experience. In the Azure portal, for your Metrics Advisor workspace, go to **Settings** > **Identity** > **System assigned**. Then set the status as **on**.
- ![set status as on](media/datafeeds/set-identity-status.png)
+ ![Screenshot that shows how to set the status as on.](media/datafeeds/set-identity-status.png)
- 1. **Enable Azure AD authentication.** In the Azure portal for your data source, click **Set admin** in **Settings > Active Directory admin**, select an **Azure AD user account** to be made an administrator of the server, and click **Select**.
+ 1. Enable Azure AD authentication. In the Azure portal, for your data source, go to **Settings** > **Active Directory admin**. Select **Set admin**, and select an **Azure AD user account** to be made an administrator of the server. Then, choose **Select**.
- ![set admin](media/datafeeds/set-admin.png)
+ ![Screenshot that shows how to set the admin.](media/datafeeds/set-admin.png)
- 1. **Enable managed identity(MI) in Metrics Advisor.** There are 2 ways to choose: edit query in a **database management tool** or **Azure portal**.
+ 1. Enable managed identity in Metrics Advisor. You can edit a query in the database management tool or in the Azure portal.
- **Management tool**: In your database management tool, select **Active Directory - Universal with MFA support** in the authentication field. In the User name field, enter the name of the Azure AD account that you set as the server administrator in step 2, for example, test@contoso.com
+ **Management tool**: In your database management tool, select **Active Directory - Universal with MFA support** in the authentication field. In the **User name** field, enter the name of the Azure AD account that you set as the server administrator in step 2. For example, this might be `test@contoso.com`.
- ![set connection detail](media/datafeeds/connection-details.png)
+ ![Screenshot that shows how to set connection details.](media/datafeeds/connection-details.png)
- **Azure portal**: Select Query editor in your SQL database, sign in admin account.
- ![edit query in Azure Portal](media/datafeeds/query-editor.png)
+ **Azure portal**: In your SQL database, select **Query editor**, and sign in with the admin account.
+ ![Screenshot that shows how to edit your query in the Azure portal.](media/datafeeds/query-editor.png)
- Then in the query window, you should execute the following lines (same for management tool method):
+ Then in the query window, run the following (note that this is the same for the management tool method):
    ```
    CREATE USER [MI Name] FROM EXTERNAL PROVIDER
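    -- Illustrative follow-up (an assumption, not shown above): grant the new identity read access to the database.
    -- ALTER ROLE db_datareader ADD MEMBER [MI Name]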
    ```
    > [!NOTE]
- > The `MI Name` is the **Managed Identity Name** in Metrics Advisor (for service principal, it should be replaced with **Service Principal name**). Also, you can learn more detail in this document: [Authorize with a managed identity](../../storage/common/storage-auth-aad-msi.md#enable-managed-identities-on-a-vm).
+ > The `MI Name` is the managed identity name in Metrics Advisor (for service principal, it should be replaced with the service principal name). For more information, see [Authorize with a managed identity](../../storage/common/storage-auth-aad-msi.md#enable-managed-identities-on-a-vm).
- Here's an example of connection string:
+ Here's an example of a connection string:
    ```
    Data Source=<Server>;Initial Catalog=<Database>
    ```
- * **Azure SQL Connection String**:
+ * **Azure SQL connection string**:
- Here's an example of connection string:
+ Here's an example of a connection string:
    ```
    Data Source=<Server>;Initial Catalog=<Database>;User ID=<user-name>;Password=<password>
    ```
- * **Service Principal**: A service principal is a concrete instance created from the application object and inherits certain properties from that application object. A service principal is created in each tenant where the application is used and references the globally unique app object. The service principal object defines what the app can actually do in the specific tenant, who can access the app, and what resources the app can access.
+ * **Service principal**: A service principal is a concrete instance created from the application object, and it inherits certain properties from that application object. A service principal is created in each tenant where the application is used, and it references the globally unique app object. The service principal object defines what the app can actually do in the specific tenant, who can access the app, and what resources the app can access.
- **Step 1:** Create and register an Azure AD application and then authorize it to access a database, see detail in [Create an AAD app registration](/azure/data-explorer/provision-azure-ad-app) documentation.
+ **Step 1:** Create and register an Azure AD application, and then authorize it to access a database. For more information, see [Create an Azure AD app registration](/azure/data-explorer/provision-azure-ad-app).
- **Step 2:** Follow the same steps with [managed identity in SQL Server](#jump), which is mentioned above.
+ **Step 2:** Follow the steps documented previously, in [managed identity in SQL Server](#jump).
- **Step 3:** [Create a credential entity](how-tos/credential-entity.md) in Metrics Advisor, so that you can choose that entity when adding data feed for Service Principal authentication type.
+ **Step 3:** [Create a credential entity](how-tos/credential-entity.md) in Metrics Advisor, so that you can choose that entity when you're adding a data feed for the service principal authentication type.
- Here's an example of connection string:
+ Here's an example of a connection string:
    ```
    Data Source=<Server>;Initial Catalog=<Database>
    ```
- * **Service Principal From Key Vault**: Key Vault helps to safeguard cryptographic keys and secret values that cloud apps and services use. By using Key Vault, you can encrypt keys and secret values. You should create a service principal first, and then store the service principal inside Key Vault. You can go through [Create a credential entity for Service Principal from Key Vault](how-tos/credential-entity.md#sp-from-kv) to follow detailed procedure to set service principal from key vault. Also, your connection string could be found in Azure SQL Server resource in **Settings > Connection strings** section.
+ * **Service principal from Key Vault**: Key Vault helps to safeguard cryptographic keys and secret values that cloud apps and services use. By using Key Vault, you can encrypt keys and secret values. Create a service principal first, and then store the service principal inside a key vault. For more details, see [Create a credential entity for service principal from Key Vault](how-tos/credential-entity.md#sp-from-kv). You can also find your connection string in your Azure SQL Server resource, in **Settings** > **Connection strings**.
    Here's an example of a connection string:
    Data Source=<Server>;Initial Catalog=<Database>
    ```
-* **Query**: A SQL query to get and formulate data into multi-dimensional time series data. You can use `@IntervalStart` and `@IntervalEnd` in your query to help with getting expected metrics value in an interval. They should be formatted: `yyyy-MM-ddTHH:mm:ssZ`.
+* **Query**: Use a SQL query to get and formulate data into multi-dimensional time series data. You can use `@IntervalStart` and `@IntervalEnd` in your query to help with getting an expected metrics value in an interval. They should be formatted as follows: `yyyy-MM-ddTHH:mm:ssZ`.
Sample query:
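    The sample takes the same shape as the queries shown for the other SQL-style sources; the table and column names are placeholders:

    ```
    SELECT [TimestampColumn], [DimensionColumn], [MetricColumn]
    FROM [TableName]
    WHERE [TimestampColumn] >= @IntervalStart AND [TimestampColumn] < @IntervalEnd
    ```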
## <span id="table">Azure Table Storage</span>
-* **Connection String**: Create an SAS (shared access signature) URL and fill in here. The most straightforward way to generate a SAS URL is using the Azure portal. By using the Azure portal, you can navigate graphically. To create an SAS URL via the Azure portal, first, navigate to the storage account you'd like to access under the **Settings section** then click **Shared access signature**. Check at least "Table" and "Object" checkboxes, then click the Generate SAS and connection string button. Table service SAS URL is what you need to copy and fill in the text box in the Metrics Advisor workspace.
+* **Connection String**: Create a shared access signature (SAS) URL, and fill it in here. The most straightforward way to generate a SAS URL is by using the Azure portal. First, under **Settings**, go to the storage account you want to access. Then select **Shared access signature**. Select the **Table** and **Object** checkboxes, and then select **Generate SAS and connection string**. In the Metrics Advisor workspace, copy and paste the **Table service SAS URL** into the text box.
- ![azure table generate sas](media/azure-table-generate-sas.png)
+ ![Screenshot that shows how to generate the shared access signature in Azure Table Storage.](media/azure-table-generate-sas.png)
-* **Table Name**: Specify a table to query against. This can be found in your Azure Storage Account instance. Click **Tables** in the **Table Service** section.
+* **Table Name**: Specify a table to query against. You can find this in your Azure storage account instance. In the **Table Service** section, select **Tables**.
-* **Query**: You can use `@IntervalStart` and `@IntervalEnd` in your query to help with getting expected metrics value in an interval. They should be formatted: `yyyy-MM-ddTHH:mm:ssZ`.
+* **Query**: You can use `@IntervalStart` and `@IntervalEnd` in your query to help with getting an expected metrics value in an interval. They should be formatted as follows: `yyyy-MM-ddTHH:mm:ssZ`.
Sample query:
    PartitionKey ge '@IntervalStart' and PartitionKey lt '@IntervalEnd'
    ```
- For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md) for more specific examples.
-
-<!--
-## <span id="es">Elasticsearch</span>
-
-* **Host**: Specify the master host of Elasticsearch Cluster.
-* **Port**: Specify the master port of Elasticsearch Cluster.
-* **Authorization Header**: Specify the authorization header value of Elasticsearch Cluster.
-* **Query**: Specify the query to get data. Placeholder `@IntervalStart` is supported. For example, when data of `2020-06-21T00:00:00Z` is ingested, `@IntervalStart = 2020-06-21T00:00:00`.
-
+ For more information, see the [tutorial on writing a valid query](tutorials/write-a-valid-query.md).
-* **Request URL**: An HTTP url that can return a JSON. The placeholders %Y,%m,%d,%h,%M are supported: %Y=year in format yyyy, %m=month in format MM, %d=day in format dd, %h=hour in format HH, %M=minute in format mm. For example: `http://microsoft.com/ProjectA/%Y/%m/X_%Y-%m-%d-%h-%M`.
-* **Request HTTP method**: Use GET or POST.
-* **Request header**: Could add basic authentication.
-* **Request payload**: Only JSON payload is supported. Placeholder @IntervalStart is supported in the payload. The response should be in the following JSON format: `[{"timestamp": "2018-01-01T00:00:00Z", "market":"en-us", "count":11, "revenue":1.23}, {"timestamp": "2018-01-01T00:00:00Z", "market":"zh-cn", "count":22, "revenue":4.56}]`. For example, when data of `2020-06-21T00:00:00Z` is ingested, `@IntervalStart = 2020-06-21T00:00:00.0000000+00:00)`.
## <span id="influxdb">InfluxDB (InfluxQL)</span>
-* **Connection String**: The connection string to access your InfluxDB.
+* **Connection String**: The connection string to access InfluxDB.
* **Database**: The database to query against.
* **Query**: A query to get and formulate data into multi-dimensional time series data for ingestion.
    SELECT [TimestampColumn], [DimensionColumn], [MetricColumn] FROM [TableName] WHERE [TimestampColumn] >= @IntervalStart and [TimestampColumn] < @IntervalEnd
    ```
-For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md) for more specific examples.
+For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md).
* **User name**: This is optional for authentication.
* **Password**: This is optional for authentication.

## <span id="mongodb">MongoDB</span>
-* **Connection String**: The connection string to access your MongoDB.
+* **Connection String**: The connection string to access MongoDB.
* **Database**: The database to query against.
-* **Query**: A command to get and formulate data into multi-dimensional time series data for ingestion. We recommend the command is verified on [db.runCommand()](https://docs.mongodb.com/manual/reference/method/db.runCommand/https://docsupdatetracker.net/index.html).
+* **Query**: A command to get and formulate data into multi-dimensional time series data for ingestion. Verify the command on [db.runCommand()](https://docs.mongodb.com/manual/reference/method/db.runCommand/https://docsupdatetracker.net/index.html).
Sample query:
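    As a sketch only (the collection and field names are placeholders, and this isn't necessarily the article's own sample), a `find` command of the following shape can be passed to `db.runCommand()`, with the interval placeholders substituted at ingestion time:

    ```
    {
        "find": "[CollectionName]",
        "filter": {
            "[TimestampColumn]": { "$gte": "@IntervalStart", "$lt": "@IntervalEnd" }
        },
        "singleBatch": true
    }
    ```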
## <span id="mysql">MySQL</span>
-* **Connection String**: The connection string to access your MySQL DB.
+* **Connection String**: The connection string to access MySQL DB.
* **Query**: A query to get and formulate data into multi-dimensional time series data for ingestion.

    Sample query:
    SELECT [TimestampColumn], [DimensionColumn], [MetricColumn] FROM [TableName] WHERE [TimestampColumn] >= @IntervalStart and [TimestampColumn]< @IntervalEnd
    ```
- For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md) for more specific examples.
+ For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md).
## <span id="pgsql">PostgreSQL</span>
-* **Connection String**: The connection string to access your PostgreSQL DB.
+* **Connection String**: The connection string to access PostgreSQL DB.
* **Query**: A query to get and formulate data into multi-dimensional time series data for ingestion.

    Sample query:
    ``` SQL
    SELECT [TimestampColumn], [DimensionColumn], [MetricColumn] FROM [TableName] WHERE [TimestampColumn] >= @IntervalStart and [TimestampColumn] < @IntervalEnd
    ```
- For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md) for more specific examples.
+ For more information, refer to the [tutorial on writing a valid query](tutorials/write-a-valid-query.md).
-## <span id="csv">Local files(CSV)</span>
+## <span id="csv">Local files (CSV)</span>
> [!NOTE]
-> This feature is only used for quick system evaluation focusing on anomaly detection. It only accepts static data from a local CSV and performs anomaly detection on single time series data. However, for the full experience analyzing on multi-dimensional metrics including real-time data ingestion, anomaly notification, root cause analysis, cross-metric incident analysis, use other supported data sources.
+> This feature is only used for quick system evaluation focusing on anomaly detection. It only accepts static data from a local CSV, and performs anomaly detection on single time series data. For analyzing multi-dimensional metrics, including real-time data ingestion, anomaly notification, root cause analysis, and cross-metric incident analysis, use other supported data sources.
**Requirements on data in CSV:**
-- Have at least one column, which represents measurements to be analyzed. For better and quicker user experience, we recommend you try a CSV file containing two columns: (1) Timestamp column (2) Metric Column. (Timestamp format: 2021-03-30T00:00:00Z, the 'seconds' part is best to be ':00Z'), and the time granularity between every record should be the same.
-- Timestamp column is optional, if there's no timestamp, Metrics Advisor will use timestamp starting from today 00:00:00(UTC) and map each measure in the row at a one-hour interval. If there is timestamp column in CSV and you want to keep it, make sure the data time period follow this rule [historical data processing window].
-- There is no re-ordering or gap-filling happening during data ingestion, make sure your data in CSV is ordered by timestamp **ascending (ASC)**.
+- Have at least one column, which represents measurements to be analyzed. For a better and quicker user experience, try a CSV file that contains two columns: a timestamp column and a metric column. The timestamp format should be `2021-03-30T00:00:00Z`, ideally with the seconds part set to `:00Z`. The time granularity between every record should be the same.
+- The timestamp column is optional. If there's no timestamp, Metrics Advisor will use a timestamp starting from today (`00:00:00` Coordinated Universal Time) and map each measure in the row at a one-hour interval.
+- There is no re-ordering or gap-filling during data ingestion. Make sure that the data in your CSV file is ordered by timestamp in **ascending (ASC)** order.
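A minimal file that meets these requirements could look like the following sketch (hourly granularity, hypothetical values):

```
timestamp,value
2021-03-30T00:00:00Z,120
2021-03-30T01:00:00Z,135
2021-03-30T02:00:00Z,128
```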
## Next steps
-* While waiting for your metric data to be ingested into the system, read about [how to manage data feed configurations](how-tos/manage-data-feeds.md).
-* When your metric data is ingested, you can [Configure metrics and fine tune detection configuration](how-tos/configure-metrics.md).
+* While you're waiting for your metric data to be ingested into the system, read about [how to manage data feed configurations](how-tos/manage-data-feeds.md).
+* When your metric data is ingested, you can [configure metrics and fine tune detection configuration](how-tos/configure-metrics.md).
applied-ai-services Web Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/applied-ai-services/metrics-advisor/quickstarts/web-portal.md
- mode-portal
-# Quickstart: Monitor your first metric using the web portal
+# Quickstart: Monitor your first metric by using the web portal
-When you provision a Metrics Advisor instance, you can use the APIs and web-based workspace to work with the service. The web-based workspace can be used as a straightforward way to quickly get started with the service. It also provides a visual way to configure settings, customize your model, and perform root cause analysis.
-
-* Onboard your metric data
-* View your metrics and visualizations
-* Fine-tune detection configurations
-* Explore diagnostic insights
-* Create and subscribe to anomaly alerts
+When you provision an instance of Azure Metrics Advisor, you can use the APIs and web-based workspace to work with the service. The web-based workspace can be used as a straightforward way to quickly get started with the service. It also provides a visual way to configure settings, customize your model, and perform root cause analysis.
## Prerequisites
-* Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services)
-* Once you have your Azure subscription, <a href="https://go.microsoft.com/fwlink/?linkid=2142156" title="Create a Metrics Advisor resource" target="_blank">create a Metrics Advisor resource </a> in the Azure portal to deploy your Metrics Advisor instance.
+* An Azure subscription. [Create one for free](https://azure.microsoft.com/free/cognitive-services).
+* When you have your Azure subscription, <a href="https://go.microsoft.com/fwlink/?linkid=2142156" title="Create a Metrics Advisor resource" target="_blank">create a Metrics Advisor resource </a> in the Azure portal to deploy your instance of Metrics Advisor.
> [!TIP]
-> * It may 10 to 30 minutes for your Metrics Advisor resource to deploy. Select **Go to resource** once it successfully deploys.
-> * If you'd like to use the REST API to interact with the service, you will need the key and endpoint from the resource you create. You can find them in the **Keys and endpoints** tab in the created resource.
+> * It can take 10 to 30 minutes for your Metrics Advisor resource to deploy. Select **Go to resource** after it successfully deploys.
+> * If you want to use the REST API to interact with the service, you need the key and endpoint from the resource you create. You can find them on the **Keys and endpoints** tab in the created resource.
-This document uses a SQL Database as an example for creating your first monitor.
+This document uses a SQL database as an example for creating your first monitor.
## Sign in to your workspace
-After your resource is created, sign in to [Metrics Advisor portal](https://go.microsoft.com/fwlink/?linkid=2143774) with your Active Directory account. From the landing page, select your **Directory**, **Subscription** and **Workspace** that just created, then select **Get started**. For onboarding time series data, select **Add data feed** from the left menu.
+After your resource is created, sign in to the [Metrics Advisor portal](https://go.microsoft.com/fwlink/?linkid=2143774) with your Active Directory account. From the landing page, select your **Directory**, **Subscription**, and **Workspace** that you just created, and then select **Get started**. To use time series data, select **Add data feed** from the left menu.
-Currently you can create one Metrics Advisor resource at each available region. You can switch workspaces in Metrics Advisor portal at any time.
+Currently you can create one Metrics Advisor resource at each available region. You can switch workspaces in the Metrics Advisor portal at any time.
-## Onboard time series data
+## Time series data
-Metrics Advisor provides connectors for different data sources, such as SQL Database, Azure Data Explorer, and Azure Table Storage. The steps for connecting data are similar for different connectors, although some configuration parameters may vary. See [connect data different data feed sources](../data-feeds-from-different-sources.md) for different data connection settings.
+Metrics Advisor provides connectors for different data sources, such as Azure SQL Database, Azure Data Explorer, and Azure Table Storage. The steps for connecting data are similar for different connectors, although some configuration parameters might vary. For more information, see [Connect different data sources](../data-feeds-from-different-sources.md).
-This quickstart uses a SQL Database as an example. You can also ingest your own data follow the same steps.
+This quickstart uses a SQL database as an example. You can also ingest your own data by following the same steps.
### Data schema requirements and configuration
This quickstart uses a SQL Database as an example. You can also ingest your own
[Add the data feeds](../how-tos/onboard-your-data.md) by connecting to your time series data source. Start by selecting the following parameters:
* **Source Type**: The type of data source where your time series data is stored.
-* **Granularity**: The interval between consecutive data points in your time series data, for example Yearly, Monthly, Daily. The lowest interval customization supports is 60 seconds.
+* **Granularity**: The interval between consecutive data points in your time series data (for example, yearly, monthly, or daily). The shortest interval supported is 60 seconds.
* **Ingest data since (UTC)**: The start time for the first timestamp to be ingested.
-<!-- Next, specify the **Connection string** with the credentials for your data source, and a custom **Query**, see [how to write a valid query](../tutorials/write-a-valid-query.md) for more information. -->
- ### Load data
-After the connection string and query string are inputted, select **Load data**. Within this operation, Metrics Advisor will check connection and permission to load data, check necessary parameters (@IntervalStart and @IntervalEnd) which need to be used in query, and check the column name from data source.
+After you input the connection and query strings, select **Load data**. Metrics Advisor checks the connection and permission to load data, checks the necessary parameters used in the query, and checks the column name from the data source.
If there's an error at this step:
-1. First check if the connection string is valid.
-2. Then confirm that there's sufficient permissions and that the ingestion worker IP address is granted access.
-3. Next check if the required parameters (@IntervalStart and @IntervalEnd) are used in your query.
+1. Check if the connection string is valid.
+1. Confirm that there are sufficient permissions and that the ingestion worker IP address is granted access.
+1. Check if the required parameters (`@IntervalStart` and `@IntervalEnd`) are used in your query.
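To make the parameter requirement concrete, the following is a hypothetical query string for a SQL Database data feed. The table and column names are invented; Metrics Advisor substitutes `@IntervalStart` and `@IntervalEnd` for each ingestion interval.

```bash
# Sketch only: a query string you might paste into the data feed's query box.
# The table and columns are assumptions; only the @IntervalStart/@IntervalEnd pattern matters.
QUERY='SELECT timestamp, region, category, revenue
       FROM dbo.SampleRevenue
       WHERE timestamp >= @IntervalStart AND timestamp < @IntervalEnd'
echo "$QUERY"
```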
### Schema configuration
-Once the data is loaded by running the query and shown like below, select the appropriate fields.
+After the data is loaded by running the query, select the appropriate fields.
|Selection |Description |Notes |
| --- | --- | --- |
-|**Timestamp** | The timestamp of a data point. If omitted, Metrics Advisor will use the timestamp when the data point is ingested instead. For each data feed, you could specify at most one column as timestamp. | Optional. Should be specified with at most one column. |
-|**Measure** | The numeric values in the data feed. For each data feed, you could specify multiple measures but at least one column should be selected as measure. | Should be specified with at least one column. |
-|**Dimension** | Categorical values. A combination of different values identifies a particular single-dimension time series, for example: country, language, tenant. You could select none or arbitrary number of columns as dimensions. Note: if you're selecting a non-string column as dimension, be cautious with dimension explosion. | Optional. |
-|**Ignore** | Ignore the selected column. | Optional. For data sources support using a query to get data, there is no 'Ignore' option. |
+|**Timestamp** | The timestamp of a data point. If the timestamp is omitted, Metrics Advisor uses the timestamp when the data point is ingested instead. For each data feed, you can specify at most one column as timestamp. | Optional. Should be specified with at most one column. |
+|**Measure** | The numeric values in the data feed. For each data feed, you can specify multiple measures, but at least one column should be selected as measure. | Should be specified with at least one column. |
+|**Dimension** | Categorical values. A combination of different values identifies a particular single-dimension time series. Examples include country, language, and tenant. You can select none, or an arbitrary number of columns as dimensions. If you're selecting a non-string column as dimension, be cautious with dimension explosion. | Optional. |
+|**Ignore** | Ignore the selected column. | Optional. For data sources that support using a query to get data, there's no ignore option. |
-After configuring the schema, select **Verify schema**. Within this operation, Metrics Advisor will perform following checks:
-- Whether timestamp of queried data falls into one single interval.
-- Whether there's duplicate values returned for the same dimension combination within one metric interval.
+After configuring the schema, select **Verify schema**. Metrics Advisor performs the following checks:
+- Whether the timestamp of the queried data falls into one single interval.
+- Whether there are duplicate values returned for the same dimension combination within one metric interval.
-### Automatic roll up settings
+### Automatic roll-up settings
> [!IMPORTANT]
-> If you'd like to enable **root cause analysis** and other diagnostic capabilities, 'automatic roll up setting' needs to be configured.
-> Once enabled, the automatic roll up settings cannot be changed.
+> If you want to enable root cause analysis and other diagnostic capabilities, configure the automatic roll-up settings. After you enable the analysis, you can't change the automatic roll-up settings.
-Metrics Advisor can automatically perform aggregation(SUM/MAX/MIN...) on each dimension during ingestion, then builds a hierarchy, which will be used in root case analysis and other diagnostic features. See [Automatic roll up settings](../how-tos/onboard-your-data.md#automatic-roll-up-settings) for more details.
+Metrics Advisor can automatically perform aggregation on each dimension during ingestion. Then the service builds a hierarchy that you can use in root cause analysis and other diagnostic features. For more information, see [Automatic roll-up settings](../how-tos/onboard-your-data.md#automatic-roll-up-settings).
-Give a custom name for the data feed, which will be displayed in your workspace. Select **Submit**.
+Give a custom name for the data feed, which will be shown in your workspace. Select **Submit**.
## Tune detection configuration
-After the data feed is added, Metrics Advisor will attempt to ingest metric data from the specified start date. It will take some time for data to be fully ingested, and you can view the ingestion status by selecting **Ingestion progress** at the top of the data feed page. If data is ingested, Metrics Advisor will apply detection, and continue to monitor the source for new data.
+After the data feed is added, Metrics Advisor attempts to ingest metric data from the specified start date. It will take some time for data to be fully ingested, and you can view the ingestion status by selecting **Ingestion progress** at the top of the data feed page. If data is ingested, Metrics Advisor will apply detection, and continue to monitor the source for new data.
-When detection is applied, select one of the metrics listed in data feed to find the **Metric detail page** to:
-- View visualizations of all time series' slices under this metric
-- Update detection configuration to meet expected results
-- Set up notification for detected anomalies
+When detection is applied, select one of the metrics listed in the data feed to find the **Metric detail page**. Here, you can:
+- View visualizations of all time series' slices under this metric.
+- Update detection configuration to meet expected results.
+- Set up notification for detected anomalies.
-## View the diagnostic insights
+## View diagnostic insights
-After tuning the detection configuration, anomalies that are found should reflect actual anomalies in your data. Metrics Advisor performs analysis on multi-dimensional metrics to locate root cause into specific dimension and also cross-metrics analysis by using "Metrics graph".
+After tuning the detection configuration, you should find that detected anomalies reflect actual anomalies in your data. Metrics Advisor performs analysis on multidimensional metrics to locate the root cause to a specific dimension. The service also performs cross-metrics analysis by using the metrics graph feature.
-To view the diagnostic insights, select the red dots on time series visualizations, which represent detected anomalies. A window will appear with a link to incident analysis page.
+To view the diagnostic insights, select the red dots on time series visualizations. These red dots represent detected anomalies. A window will appear with a link to the incident analysis page.
-After selecting the link, you will be pivoted to the incident analysis page, which analyzes on a group of related anomalies with a bunch of diagnostics insights. There're 3 major steps to diagnose an incident:
+On the incident analysis page, you see a group of related anomalies and diagnostic insights. The following sections cover the major steps to diagnose an incident.
-### Check summary of current incident
+### Check the summary of the current incident
-At the top, there will be a summary including basic information, actions & tracings and an analyzed root cause. Basic information includes the "top impacted series" with a diagram, "impact start & end time", "incident severity" and "total anomalies included".
+You can find the summary at the top of the incident analysis page. This summary includes basic information, actions and tracings, and an analyzed root cause. Basic information includes the top impacted series with a diagram, the impact start and end time, the severity, and the total anomalies included.
-Analyzed root cause is an automatic analyzed result. Metrics Advisor analyzes on all anomalies that captured on time series within one metric with different dimension values at the same timestamp. Then performs correlation, clustering to group related anomalies together and generates a root cause advice.
+The analyzed root cause is an automatically generated result. Metrics Advisor analyzes all anomalies that are captured on time series within one metric, with different dimension values, at the same timestamp. The service then performs correlation and clustering to group related anomalies together, and generates advice about a root cause.
-Based on these, you can already get a straightforward view of current abnormal status and the impact of the incident and the most potential root cause. So that immediate action could be taken to resolve incident as soon as possible.
+Based on this information, you get a straightforward view of the current abnormal status, the impact of the incident, and the most likely root cause. You can then take immediate action to resolve the incident.
### View cross-dimension diagnostic insights
-After getting basic info and automatic analysis insight, you can get more detailed info on abnormal status on other dimensions within the same metric in a holistic way using **"Diagnostic tree"**.
+You can also use the diagnostic tree feature to get a holistic view of the abnormal status across other dimensions within the same metric.
+
+For metrics with multiple dimensions, Metrics Advisor categorizes the time series into a hierarchy (called a diagnostic tree). For example, a revenue metric is monitored by two dimensions: region and category. You need to have an aggregated dimension value, such as `SUM`. Then, the time series of `region = SUM` and `category = SUM` is categorized as the root node within the tree. Whenever there's an anomaly captured at the `SUM` dimension, you can analyze it to locate which specific dimension value has contributed the most to the parent node anomaly. Select each node to expand it for detailed information.
-For metrics with multiple dimensions, Metrics Advisor categorizes the time series into a hierarchy, which is named as "Diagnostic tree". For example, a "revenue" metric is monitored by two dimensions: "region" and "category". Despite concrete dimension values, there needs to have an **aggregated** dimension value, like **"SUM"**. Then time series of "region" = **"SUM"** and "category" = **"SUM"** will be categorized as the root node within the tree. Whenever there's an anomaly captured at **"SUM"** dimension, then it could be drilled down and analyzed to locate which specific dimension value has contributed the most to the parent node anomaly. Click on each node to expand detailed information.
+### View cross-metrics diagnostic insights
-### View cross-metrics diagnostic insights using "Metrics graph"
+Sometimes, it's hard to analyze an issue by checking the abnormal status of a single metric, and you need to correlate multiple metrics together. To do this, configure a metrics graph, which indicates the relationships between metrics.
-Sometimes, it's hard to analyze an issue by checking the abnormal status of a single metric, and you need to correlate multiple metrics together. Customers are able to configure a "Metrics graph" which indicates the relations between metrics.
-By leveraging above cross-dimension diagnostic result, the root cause is limited into specific dimension value. Then use "Metrics graph" and filter by the analyzed root cause dimension to check anomaly status on other metrics.
-After clicking the link, you will be pivoted to the incident analysis page which analyzes on corresponding anomaly, with a bunch of diagnostics insights. There are three sections in the incident detail page which correspond to three major steps to diagnosing an incident.
+By using the cross-dimension diagnostic result described in the previous section, you can identify that the root cause is limited to a specific dimension value. Then use a metrics graph to filter by the analyzed root cause dimension, to check the anomaly status on other metrics.
-But you can also pivot across more diagnostics insights leveraging additional features to drill down anomalies by dimension, view similar anomalies and do comparison across metrics. Please find more at [How to: diagnose an incident](../how-tos/diagnose-an-incident.md).
+You can also pivot across more diagnostic insights by using additional features. These features help you drill down on dimensions of anomalies, view similar anomalies, and compare across metrics. For more information, see [Diagnose an incident](../how-tos/diagnose-an-incident.md).
## Get notified when new anomalies are found
-If you'd like to get alerted when an anomaly is detected in your data, you can create a subscription for one or more of your metrics. Metrics Advisor uses hooks to send alerts. Three types of hooks are supported: email hook, web hook and Azure DevOps. We'll use web hook as an example.
+If you want to get alerted when an anomaly is detected in your data, you can create a subscription for one or more of your metrics. Metrics Advisor uses hooks to send alerts. Three types of hooks are supported: email hook, web hook, and Azure DevOps. We'll use web hook as an example.
### Create a web hook
-A web hook is the entry point to get anomaly noticed by a programmatic way from the Metrics Advisor service, which calls a user-provided API when an alert is triggered.For details on how to create a hook, refer to the **Create a hook** section in [How-to: Configure alerts and get notifications using a hook](../how-tos/alerts.md#create-a-hook).
+In Metrics Advisor, you can use a web hook to surface an anomaly programmatically. The service calls a user-provided API when an alert is triggered. For more information, see [Create a hook](../how-tos/alerts.md#create-a-hook).
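Before you register an endpoint as a web hook, it can help to confirm that it's reachable and returns a success code. The following sketch uses a placeholder URL, and the empty JSON body is only a connectivity check, not the alert payload that Metrics Advisor sends.

```bash
# Hypothetical connectivity check for the API you plan to register as a web hook.
curl -i -X POST "https://contoso.example.com/metrics-advisor/alert" \
  -H "Content-Type: application/json" \
  -d '{}'
```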
### Configure alert settings
-After creating a hook, an alert setting determines how and which alert notifications should be sent. You can set multiple alert settings for each metric. two important settings are **Alert for** which specifies the anomalies to be included, and **Filter anomaly options**, which define which anomalies to include in the alert. See the **Add or Edit alert settings** section in [How-to: Configure alerts and get notifications using a hook](../how-tos/alerts.md#add-or-edit-alert-settings) for more details.
+After creating a hook, an alert setting determines how and which alert notifications should be sent. You can set multiple alert settings for each metric. Two important settings are **Alert for**, which specifies the anomalies to be included, and **Filter anomaly options**, which defines which anomalies to include in the alert. For more information, see [Add or edit alert settings](../how-tos/alerts.md#add-or-edit-alert-settings).
## Next steps
-- [Onboard your data feeds](../how-tos/onboard-your-data.md)
- - [Manage data feeds](../how-tos/manage-data-feeds.md)
- - [Configurations for different data sources](../data-feeds-from-different-sources.md)
-- [Use the REST API or Client libraries](./rest-api-and-client-library.md)
-- [Configure metrics and fine tune detection configuration](../how-tos/configure-metrics.md)
+- [Add your metric data to Metrics Advisor](../how-tos/onboard-your-data.md)
+ - [Manage your data feeds](../how-tos/manage-data-feeds.md)
+ - [Connect different data sources](../data-feeds-from-different-sources.md)
+- [Use the client libraries or REST APIs to customize your solution](./rest-api-and-client-library.md)
+- [Configure metrics and fine-tune detection configuration](../how-tos/configure-metrics.md)
automanage Automanage Virtual Machines https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automanage/automanage-virtual-machines.md
There are several prerequisites to consider before trying to enable Azure Automa
- VMs must be in a supported region (see below)
- User must have correct permissions (see below)
- Automanage does not support Sandbox subscriptions at this time
+- Automanage does not support Windows 10 at this time
### Supported regions
Automanage only supports VMs located in the following regions:
automation Automation Linux Hrw Install https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/automation-linux-hrw-install.md
Title: Deploy a Linux Hybrid Runbook Worker in Azure Automation
description: This article tells how to install an Azure Automation Hybrid Runbook Worker to run runbooks on Linux-based machines in your local datacenter or cloud environment. Previously updated : 07/14/2021 Last updated : 08/05/2021
The minimum requirements for a Linux system and user Hybrid Runbook Worker are:
|Glibc |GNU C Library| 2.5-12 |
|Openssl| OpenSSL Libraries | 1.0 (TLS 1.1 and TLS 1.2 are supported)|
|Curl | cURL web client | 7.15.5|
-|Python-ctypes | Python 2.x or Python 3.x are required |
+|Python-ctypes | Foreign function library for Python| Python 2.x or Python 3.x are required |
|PAM | Pluggable Authentication Modules|
+| **Optional package** | **Description** | **Minimum version**|
+| | | -|
| PowerShell Core | To run PowerShell runbooks, PowerShell Core needs to be installed. See [Installing PowerShell Core on Linux](/powershell/scripting/install/installing-powershell-core-on-linux) to learn how to install it. | 6.0.0 |
### Adding a machine to a Hybrid Runbook Worker group
automation Quickstart Create Automation Account Template https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/quickstart-create-automation-account-template.md
Title: 'Quickstart: Create an Automation account - Azure template'
+ Title: 'Create an Automation account - Azure template'
-description: This quickstart shows how to create an Automation account by using the Azure Resource Manager template.
+description: This article shows how to create an Automation account by using the Azure Resource Manager template.
Last updated 07/20/2021-+ - mvc
# Customer intent: I want to create an Automation account by using an Azure Resource Manager template so that I can automate processes with runbooks.
-# Quickstart: Create an Automation account by using ARM template
+# Create an Automation account by using ARM template
-Azure Automation delivers a cloud-based automation and configuration service that supports consistent management across your Azure and non-Azure environments. This quickstart shows you how to deploy an Azure Resource Manager template (ARM template) that creates an Automation account. Using an ARM template takes fewer steps compared to other deployment methods.
+Azure Automation delivers a cloud-based automation and configuration service that supports consistent management across your Azure and non-Azure environments. This article shows you how to deploy an Azure Resource Manager template (ARM template) that creates an Automation account. Using an ARM template takes fewer steps compared to other deployment methods.
[!INCLUDE [About Azure Resource Manager](../../includes/resource-manager-quickstart-introduction.md)]
This sample template performs the following:
After you complete these steps, you need to [configure diagnostic settings](automation-manage-send-joblogs-log-analytics.md) for your Automation account to send runbook job status and job streams to the linked Log Analytics workspace.
-The template used in this quickstart is from [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/101-automation/).
+The template used in this article is from [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/101-automation/).
:::code language="json" source="~/quickstart-templates/quickstarts/microsoft.automation/101-automation/azuredeploy.json":::
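If you prefer the command line to the portal's deployment experience, a deployment might look like the following sketch. The raw template URI and the `automationAccountName` parameter are assumptions based on the quickstart template's repository layout, so confirm them against the template before you run the commands.

```azurecli
# Sketch: deploy the 101-automation quickstart template with the Azure CLI.
az group create --name myAutomationRG --location eastus2
az deployment group create \
  --resource-group myAutomationRG \
  --template-uri "https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/quickstarts/microsoft.automation/101-automation/azuredeploy.json" \
  --parameters automationAccountName=myAutomationAccount
```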
When you no longer need them, unlink the Automation account from the Log Analyti
## Next steps
-In this quickstart, you created an Automation account, a Log Analytics workspace, and linked them together.
+In this article, you created an Automation account, a Log Analytics workspace, and linked them together.
To learn more, continue to the tutorials for Azure Automation.
automation Update Agent Issues https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/troubleshoot/update-agent-issues.md
Proxy and firewall configurations must allow the Hybrid Runbook Worker agent to
This check determines if the Log Analytics agent for Windows (`healthservice`) is running on the machine. To learn more about troubleshooting the service, see [The Log Analytics agent for Windows isn't running](hybrid-runbook-worker.md#mma-not-running).
-To reinstall the Log Analytics agent for Windows, see [Install the agent for Windows](../../azure-monitor/vm/quick-collect-windows-computer.md#install-the-agent-for-windows).
+To reinstall the Log Analytics agent for Windows, see [Install the agent for Windows](../../azure-monitor/agents/agent-windows.md).
### Monitoring agent service events
automation Update Management https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/automation/troubleshoot/update-management.md
This issue can be caused by local configuration issues or by improperly configur
| summarize by Computer, Solutions
```
- If you don't see your machine in the query results, it hasn't recently checked in. There's probably a local configuration issue and you should [reinstall the agent](../../azure-monitor/vm/quick-collect-windows-computer.md#install-the-agent-for-windows).
+ If you don't see your machine in the query results, it hasn't recently checked in. There's probably a local configuration issue and you should [reinstall the agent](../../azure-monitor/agents/agent-windows.md).
If your machine is listed in the query results, verify under the **Solutions** property that **updates** is listed. This verifies it is registered with Update Management. If it is not, check for scope configuration problems. The [scope configuration](../update-management/scope-configuration.md) determines which machines are configured for Update Management. To configure the scope configuration for the target the machine, see [Enable machines in the workspace](../update-management/enable-from-automation-account.md#enable-machines-in-the-workspace).
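If you'd rather run this kind of check from the command line than the portal, a hedged sketch (not the article's exact query) looks like the following. It assumes the `log-analytics` extension for the Azure CLI and uses the workspace GUID.

```azurecli
# Sketch: check recent heartbeats and the solutions a machine reports.
az monitor log-analytics query \
  --workspace "<workspace-guid>" \
  --analytics-query "Heartbeat | where TimeGenerated > ago(30m) | summarize by Computer, Solutions" \
  --output table
```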
azure-app-configuration Enable Dynamic Configuration Dotnet Core Push Refresh https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-app-configuration/enable-dynamic-configuration-dotnet-core-push-refresh.md
Once the resources are created, add the following environment variables. These w
1. Click on `Create` to create the event subscription.
-1. Click on `Event Subscriptions` in the `Events` pane to validated that the subscription was created successfully.
+1. Click on `Event Subscriptions` in the `Events` pane to validate that the subscription was created successfully.
![App Configuration event subscriptions](./media/event-subscription-view.png)
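If you prefer to validate from the command line instead of the portal, one hedged approach is to list the event subscriptions scoped to the App Configuration store's resource ID. The resource ID below is a placeholder.

```azurecli
# Sketch: list Event Grid subscriptions on your App Configuration store.
az eventgrid event-subscription list \
  --source-resource-id "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.AppConfiguration/configurationStores/<store-name>" \
  --output table
```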
A random delay is added before the cached value is marked as dirty to reduce pot
In this tutorial, you enabled your .NET Core app to dynamically refresh configuration settings from App Configuration. To learn how to use an Azure managed identity to streamline the access to App Configuration, continue to the next tutorial. > [!div class="nextstepaction"]
-> [Managed identity integration](./howto-integrate-azure-managed-service-identity.md)
+> [Managed identity integration](./howto-integrate-azure-managed-service-identity.md)
azure-arc Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/kubernetes/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure Arc-enabled Kubernetes description: Sample Azure Resource Graph queries for Azure Arc-enabled Kubernetes showing use of resource types and tables to access Azure Arc-enabled Kubernetes related resources and properties. Previously updated : 07/21/2021 Last updated : 08/04/2021
azure-arc Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure Arc description: Sample Azure Resource Graph queries for Azure Arc showing use of resource types and tables to access Azure Arc related resources and properties. Previously updated : 07/21/2021 Last updated : 08/04/2021
azure-arc Data Residency https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/data-residency.md
Title: Data residency description: Data residency and information about Azure Arc-enabled servers. Previously updated : 07/16/2021 Last updated : 08/05/2021
Metadata information about the connected machine is also collected. Specifically
* Public key for managed identity
* Policy compliance status and details (if using Azure Policy Guest Configuration policies)
-Arc-enabled servers allow you to specify the region where your data will be stored. Microsoft may replicate to other regions for data resiliency, but Microsoft does not replicate or move data outside the geography. This data is stored in the region where the Azure Arc machine resource is configured. For example, if the machine is registered with Arc in the East US region, this data is stored in the US region.
+Arc-enabled servers allow you to specify the region where your data is stored. Microsoft may replicate to other regions for data resiliency, but Microsoft does not replicate or move data outside the geography. This data is stored in the region where the Azure Arc machine resource is configured. For example, if the machine is registered with Arc in the East US region, this data is stored in the US region.
+
+> [!NOTE]
+> For South East Asia, your data is not replicated outside of this region.
For more information about our regional resiliency and compliance support, see [Azure geography](https://azure.microsoft.com/global-infrastructure/geographies/).
azure-arc Manage Vm Extensions Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/manage-vm-extensions-cli.md
Title: Enable VM extension using Azure CLI description: This article describes how to deploy virtual machine extensions to Azure Arc-enabled servers running in hybrid cloud environments using the Azure CLI. Previously updated : 07/16/2021 Last updated : 08/05/2021
The following example enables the Custom Script Extension on an Arc-enabled serv
az connectedmachine extension create --machine-name "myMachineName" --name "CustomScriptExtension" --location "eastus" --type "CustomScriptExtension" --publisher "Microsoft.Compute" --settings "{\"commandToExecute\":\"powershell.exe -c \\\"Get-Process | Where-Object { $_.CPU -gt 10000 }\\\"\"}" --type-handler-version "1.10" --resource-group "myResourceGroup"
```
-The following example enables the Key Vault VM extension (preview) on an Arc-enabled server:
+The following example enables the Key Vault VM extension on an Arc-enabled server:
```azurecli
az connectedmachine extension create --resource-group "resourceGroupName" --machine-name "myMachineName" --location "regionName" --publisher "Microsoft.Azure.KeyVault" --type "KeyVaultForLinux or KeyVaultForWindows" --name "KeyVaultForLinux or KeyVaultForWindows" --settings '{"secretsManagementSettings": { "pollingIntervalInS": "60", "observedCertificates": ["observedCert1"] }, "authenticationSettings": { "msiEndpoint": "http://localhost:40342/metadata/identity" }}'
azure-arc Manage Vm Extensions Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/manage-vm-extensions-portal.md
Title: Enable VM extension from Azure portal description: This article describes how to deploy virtual machine extensions to Azure Arc-enabled servers running in hybrid cloud environments from the Azure portal. Previously updated : 07/16/2021 Last updated : 08/05/2021
This article shows you how to deploy and uninstall Azure VM extensions supported by Azure Arc-enabled servers, on a Linux or Windows hybrid machine through the Azure portal. > [!NOTE]
-> The Key Vault VM extension (preview) does not support deployment from the Azure portal, only using the Azure CLI, the Azure PowerShell, or using an Azure Resource Manager template.
+> The Key Vault VM extension does not support deployment from the Azure portal, only using the Azure CLI, the Azure PowerShell, or using an Azure Resource Manager template.
> [!NOTE] > Azure Arc-enabled servers does not support deploying and managing VM extensions to Azure virtual machines. For Azure VMs, see the following [VM extension overview](../../virtual-machines/extensions/overview.md) article.
azure-arc Manage Vm Extensions Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/manage-vm-extensions-powershell.md
Title: Enable VM extension using Azure PowerShell description: This article describes how to deploy virtual machine extensions to Azure Arc-enabled servers running in hybrid cloud environments using Azure PowerShell. Previously updated : 07/16/2021 Last updated : 08/05/2021
PS C:\> $Setting = @{ "commandToExecute" = "powershell.exe -c Get-Process" }
PS C:\> New-AzConnectedMachineExtension -Name custom -ResourceGroupName myResourceGroup -MachineName myMachineName -Location eastus -Publisher "Microsoft.Compute" -Settings $Setting -ExtensionType CustomScriptExtension
```
-### Key Vault VM extension (preview)
+### Key Vault VM extension
> [!WARNING] > PowerShell clients often add `\` to `"` in the settings.json which will cause akvvm_service fails with error: `[CertificateManagementConfiguration] Failed to parse the configuration settings with:not an object.`
-The following example enables the Key Vault VM extension (preview) on an Arc-enabled server:
+The following example enables the Key Vault VM extension on an Arc-enabled server:
```powershell
# Build settings
azure-arc Manage Vm Extensions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/manage-vm-extensions.md
Title: VM extension management with Azure Arc-enabled servers description: Azure Arc-enabled servers can manage deployment of virtual machine extensions that provide post-deployment configuration and automation tasks with non-Azure VMs. Previously updated : 07/16/2021 Last updated : 08/05/2021
Be sure to review the documentation for each VM extension referenced in the prev
The Log Analytics agent VM extension for Linux requires Python 2.x is installed on the target machine.
-### Azure Key Vault VM extension (preview)
+### Azure Key Vault VM extension
-The Key Vault VM extension (preview) doesn't support the following Linux operating systems:
+The Key Vault VM extension doesn't support the following Linux operating systems:
- CentOS Linux 7 (x64) - Red Hat Enterprise Linux (RHEL) 7 (x64) - Amazon Linux 2 (x64)
-Deploying the Key Vault VM extension (preview) is only supported using:
+Deploying the Key Vault VM extension is only supported using:
- The Azure CLI
- The Azure PowerShell
azure-arc Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-arc/servers/resource-graph-samples.md
+
+ Title: Azure Resource Graph sample queries for Azure Arc-enabled servers
+description: Sample Azure Resource Graph queries for Azure Arc-enabled servers showing use of resource types and tables to access Azure Arc-enabled servers related resources and properties.
Last updated : 08/04/2021+++
+# Azure Resource Graph sample queries for Azure Arc-enabled servers
+
+This page is a collection of [Azure Resource Graph](../../governance/resource-graph/overview.md)
+sample queries for Azure Arc-enabled servers. For a complete list of Azure Resource Graph samples,
+see
+[Resource Graph samples by Category](../../governance/resource-graph/samples/samples-by-category.md)
+and [Resource Graph samples by Table](../../governance/resource-graph/samples/samples-by-table.md).
+
+## Sample queries
++
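As a minimal illustration (not one of the article's curated samples), the following query lists Azure Arc-enabled servers. It assumes the `resource-graph` extension for the Azure CLI.

```azurecli
# Sketch: list Azure Arc-enabled servers with Azure Resource Graph.
az extension add --name resource-graph
az graph query -q "Resources | where type =~ 'microsoft.hybridcompute/machines' | project name, location, resourceGroup"
```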
+## Next steps
+
+- Learn more about the [query language](../../governance/resource-graph/concepts/query-language.md).
+- Learn more about how to [explore resources](../../governance/resource-graph/concepts/explore-resources.md).
+- See samples of [Starter language queries](../../governance/resource-graph/samples/starter.md).
+- See samples of [Advanced language queries](../../governance/resource-graph/samples/advanced.md).
azure-cache-for-redis Cache Remove Tls 10 11 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-cache-for-redis/cache-remove-tls-10-11.md
As a part of this effort, we'll be making the following changes to Azure Cache f
* **Phase 2:** We'll stop supporting TLS 1.1 and TLS 1.0. After this change, your application must use TLS 1.2 or later to communicate with your cache. The Azure Cache for Redis service is expected to be available while we migrate it to support only TLS 1.2 or later. > [!NOTE]
- > Phase 2 is tentatively planned to begin not earlier than December 31, 2020. However, we strongly recommend that you begin planning for this change now and proactively update clients to support TLS 1.2 or later.
+ > Phase 2 is postponed because of COVID-19. We strongly recommend that you begin planning for this change now and proactively update clients to support TLS 1.2 or later.
> As part of this change, we'll also remove support for older cypher suites that aren't secure. Our supported cypher suites are restricted to the following suites when the cache is configured with a minimum of TLS 1.2:
The dates when these changes take effect are:
| Azure China 21Vianet | March 13, 2020 | Postponed because of COVID-19 | > [!NOTE]
-> Phase 2 is tentatively planned to begin not earlier than December 31, 2020. This article will be updated when specific dates are set.
+> Phase 2 is postponed because of COVID-19. This article will be updated when specific dates are set.
>
## Check whether your application is already compliant
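Independently of the client-side checks that this section covers, you can also enforce and verify the TLS floor on the cache resource itself. The following is a hedged sketch; `minimumTlsVersion` is the resource property name, but confirm that your Azure CLI version accepts the generic `--set` argument on `az redis update` before relying on it.

```azurecli
# Sketch: enforce TLS 1.2 on the cache (server side) and verify the setting.
az redis update --name myCache --resource-group myResourceGroup --set minimumTlsVersion="1.2"
az redis show --name myCache --resource-group myResourceGroup --query minimumTlsVersion
```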
azure-government Compare Azure Government Global Azure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/compare-azure-government-global-azure.md
description: Describe feature differences between Azure Government and global Az
cloud: gov documentationcenter: ''-++ ms.devlang: na na Previously updated : 06/25/2021 Last updated : 08/04/2021
# Compare Azure Government and global Azure
Table below lists API endpoints in Azure vs. Azure Government for accessing and
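One way to see those endpoint differences programmatically is to query the cloud metadata that the Azure CLI ships with, as in this sketch.

```azurecli
# Sketch: compare the endpoint definitions for global Azure and Azure Government.
az cloud show --name AzureCloud --query endpoints
az cloud show --name AzureUSGovernment --query endpoints
```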
## Service availability
-Microsoft's goal for Azure Government is to match service availability in Azure. For service availability in Azure Government, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). Services available in Azure Government are listed by category and whether they are Generally Available or available through Preview. If a service is available in Azure Government, that fact is not reiterated in the rest of this article. Instead, you are encouraged to review [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia) for the latest, up-to-date information on service availability.
+Microsoft's goal for Azure Government is to match service availability in Azure. For service availability in Azure Government, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia). Services available in Azure Government are listed by category and whether they are Generally Available or available through Preview. If a service is available in Azure Government, that fact is not reiterated in the rest of this article. Instead, you are encouraged to review [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia) for the latest, up-to-date information on service availability.
In general, service availability in Azure Government implies that all corresponding service features are available to you. Variations to this approach and other applicable limitations are tracked and explained in this article based on the main service categories outlined in the [online directory of Azure services](https://azure.microsoft.com/services/). Other considerations for service deployment and usage in Azure Government are also provided.
## AI + Machine Learning
-This section outlines variations and considerations when using **Azure Bot Service**, **Azure Machine Learning**, and **Cognitive Services** in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=machine-learning-service,bot-service,cognitive-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using **Azure Bot Service**, **Azure Machine Learning**, and **Cognitive Services** in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=machine-learning-service,bot-service,cognitive-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure Bot Service](/azure/bot-service/)
The following Translator **features are not currently available** in Azure Gover
## Analytics
-This section outlines variations and considerations when using Analytics services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=data-share,power-bi-embedded,analysis-services,event-hubs,data-lake-analytics,storage,data-catalog,data-factory,synapse-analytics,stream-analytics,databricks,hdinsight&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Analytics services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=data-share,power-bi-embedded,analysis-services,event-hubs,data-lake-analytics,storage,data-catalog,data-factory,synapse-analytics,stream-analytics,databricks,hdinsight&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure Data Factory](../data-factory/index.yml)
The following Power BI Embedded **features are not yet available** in Azure Gove
## Compute
-This section outlines variations and considerations when using Compute services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=spring-cloud,azure-vmware-cloudsimple,cloud-services,batch,container-instances,app-service,service-fabric,functions,kubernetes-service,virtual-machine-scale-sets,virtual-machines&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Compute services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=spring-cloud,azure-vmware-cloudsimple,cloud-services,batch,container-instances,app-service,service-fabric,functions,kubernetes-service,virtual-machine-scale-sets,virtual-machines&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Virtual Machines](../virtual-machines/sizes.md)
When connecting your Functions app to Application Insights in Azure Government,
## Containers
-This section outlines variations and considerations when using Container services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=openshift,app-service-linux,container-registry,container-instances,kubernetes-service&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Container services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=openshift,app-service-linux,container-registry,container-instances,kubernetes-service&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure Kubernetes Service](../aks/intro-kubernetes.md)
The following Azure Kubernetes Service **features are not currently available**
## Databases
-This section outlines variations and considerations when using Databases services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-api-for-fhir,data-factory,sql-server-stretch-database,redis-cache,database-migration,synapse-analytics,postgresql,mariadb,mysql,sql-database,cosmos-db&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Databases services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-api-for-fhir,data-factory,sql-server-stretch-database,redis-cache,database-migration,synapse-analytics,postgresql,mariadb,mysql,sql-database,cosmos-db&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure Database for MySQL](../mysql/index.yml)
The following Azure SQL Managed Instance **features are not currently available*
## Developer Tools
-This section outlines variations and considerations when using Developer Tools services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=app-configuration,devtest-lab,lab-services,azure-devops&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Developer Tools services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=app-configuration,devtest-lab,lab-services,azure-devops&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure DevTest Labs](../devtest-labs/devtest-lab-overview.md)
The following Azure DevTest Labs **features are not currently available** in Azu
## Internet of Things
-This section outlines variations and considerations when using Internet of Things services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=api-management,cosmos-db,notification-hubs,logic-apps,stream-analytics,machine-learning-studio,machine-learning-service,event-grid,functions,azure-rtos,azure-maps,iot-central,iot-hub&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Internet of Things services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=api-management,cosmos-db,notification-hubs,logic-apps,stream-analytics,machine-learning-studio,machine-learning-service,event-grid,functions,azure-rtos,azure-maps,iot-central,iot-hub&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure IoT Hub](../iot-hub/index.yml)
If you are using the IoT Hub connection string (instead of the Event Hub-compati
## Management and Governance
-This section outlines variations and considerations when using Management and Governance services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=managed-applications,azure-policy,network-watcher,monitor,traffic-manager,automation,scheduler,site-recovery,cost-management,backup,blueprints,advisor&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Management and Governance services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=managed-applications,azure-policy,network-watcher,monitor,traffic-manager,automation,scheduler,site-recovery,cost-management,backup,blueprints,advisor&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
> [!NOTE] >This article has been updated to use the new Azure PowerShell Az module. You can still use the AzureRM module, which will continue to receive bug fixes until at least December 2020. To learn more about the new Az module and AzureRM compatibility, see [**Introducing the new Azure PowerShell Az module**](/powershell/azure/new-azureps-module-az). For Az module installation instructions, see [**Install Azure PowerShell**](/powershell/azure/install-az-ps).
The following Azure Monitor **features behave differently** in Azure Government:
## Media
This section outlines variations and considerations when using Media services in the Azure Government environment.
-For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=cdn,media-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia). For Azure Media Services v3 availability, see [Azure clouds and regions in which Media Services v3 exists](../media-services/latest/azure-clouds-regions.md).
+For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=cdn,media-services&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia). For Azure Media Services v3 availability, see [Azure clouds and regions in which Media Services v3 exists](../media-services/latest/azure-clouds-regions.md).
### [Media Services](../media-services/previous/index.yml)
For more information, see [Create a Video Indexer account](../azure-video-analyz
## Migration
-This section outlines variations and considerations when using Migration services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=database-migration,cost-management,azure-migrate,site-recovery&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Migration services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=database-migration,cost-management,azure-migrate,site-recovery&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure Migrate](../migrate/migrate-services-overview.md)
The following Azure Migrate **features are not currently available** in Azure Go
## Networking
-This section outlines variations and considerations when using Networking services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-bastion,frontdoor,virtual-wan,dns,ddos-protection,cdn,azure-firewall,network-watcher,load-balancer,vpn-gateway,expressroute,application-gateway,virtual-network&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Networking services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-bastion,frontdoor,virtual-wan,dns,ddos-protection,cdn,azure-firewall,network-watcher,load-balancer,vpn-gateway,expressroute,application-gateway,virtual-network&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure ExpressRoute](../expressroute/index.yml)
Traffic Manager health checks can originate from certain IP addresses for Azure
## Security
-This section outlines variations and considerations when using Security services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-sentinel,azure-dedicated-hsm,information-protection,application-gateway,vpn-gateway,security-center,key-vault,active-directory-ds,ddos-protection,active-directory&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Security services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=azure-sentinel,azure-dedicated-hsm,information-protection,application-gateway,vpn-gateway,security-center,key-vault,active-directory-ds,ddos-protection,active-directory&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure Active Directory Premium P1 and P2](../active-directory/index.yml)
The following features have known limitations in Azure Government:
- Limitations with Azure AD join:
  - Enterprise state roaming for Windows 10 devices is not available
-
-- Limitations with Azure AD self-service password reset (SSPR):
- - Azure AD SSPR from Windows 10 login screen is not available
### [Azure Information Protection](/azure/information-protection/what-is-information-protection)
For feature variations and limitations, see [Cloud feature availability for US G
## Storage
-This section outlines variations and considerations when using Storage services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=hpc-cache,managed-disks,storsimple,backup,storage&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Storage services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=hpc-cache,managed-disks,storsimple,backup,storage&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [Azure Backup](../backup/backup-overview.md)
For all jobs, we recommend that you rotate your storage account keys after the j
## Web
-This section outlines variations and considerations when using Web services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=spring-cloud,signalr-service,api-management,notification-hubs,search,cdn,app-service-linux,app-service&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia).
+This section outlines variations and considerations when using Web services in the Azure Government environment. For service availability, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=spring-cloud,signalr-service,api-management,notification-hubs,search,cdn,app-service-linux,app-service&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### [API Management](../api-management/index.yml)
azure-government Documentation Government Developer Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/documentation-government-developer-guide.md
description: This article compares features and provides guidance on developing
cloud: gov documentationcenter: ''-++ ms.devlang: na na Previously updated : 1/11/2021- Last updated : 8/04/2021

# Azure Government developer guide
-Azure Government is a separate instance of the Microsoft Azure service. It addresses the security and compliance needs of United States federal agencies, state and local governments, and their solution providers. Azure Government offers physical isolation from non-US government deployments and provides screened US personnel.
-Microsoft provides various tools to help developers create and deploy cloud applications to the global Microsoft Azure service ("global service") and Microsoft Azure Government services.
+Azure Government is a separate instance of the Microsoft Azure service. It addresses the security and compliance needs of United States federal agencies, state and local governments, and their solution providers. Azure Government enforces physical isolation from non-US government infrastructure and relies on [screened US personnel](./documentation-government-plan-security.md#screening) for operations.
+
+Microsoft provides various tools to help you create and deploy cloud applications on global Azure and Azure Government.
-When developers create and deploy applications to Azure Government services, as opposed to the global service, they need to know the key differences between the two services.
-The specific areas to understand are:
+When you create and deploy applications to Azure Government services, as opposed to global Azure, you need to know the key differences between the two cloud environments. The specific areas to understand are:
-- Setting up and configuring their programming environment
+- Setting up and configuring your programming environment
- Configuring endpoints
- Writing applications
- Deploying applications as services to Azure Government
-The information in this document summarizes the differences between the two services.
-It supplements the information that's available through the following sources:
+The information in this document summarizes the differences between the two cloud environments. It supplements the information that's available through the following sources:
-- [Azure Government](https://www.azure.com/gov "Azure Government") site -- [Microsoft Azure Trust Center](https://www.microsoft.com/trust-center/product-overview "Microsoft Azure Trust Center")
+- [Azure Government](https://azure.microsoft.com/global-infrastructure/government/) site
+- [Microsoft Trust Center](https://www.microsoft.com/trust-center/product-overview)
- [Azure Documentation Center](../index.yml)
-- [Azure Blogs](https://azure.microsoft.com/blog/ "Azure Blogs")
+- [Azure Blogs](https://azure.microsoft.com/blog/)
-This content is intended for partners and developers who are deploying to Microsoft Azure Government.
+This content is intended for partners and developers who are deploying to Azure Government.
## Guidance for developers
-Most of the currently available technical content assumes that applications are being developed for the global service rather than for Azure Government. For this reason, it's important to be aware of two key differences in applications that you develop for hosting in Azure Government.
-- Certain services and features that are in specific regions of the global service might not be available in Azure Government.
-- Feature configurations in Azure Government might differ from those in the global service.
+Most of the currently available technical content assumes that applications are being developed on global Azure rather than on Azure Government. For this reason, it's important to be aware of two key differences in applications that you develop for hosting in Azure Government.
+
+- Certain services and features that are in specific regions of global Azure might not be available in Azure Government.
+- Feature configurations in Azure Government might differ from those in global Azure.
Therefore, it's important to review your sample code, configurations, and steps to ensure that you are building and executing within the Azure Government cloud services environment.
-Currently, US DoD Central, US DoD East, US Gov Arizona, US Gov Texas, and US Gov Virginia are the regions that support Azure Government. For current regions and available services, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
+For current Azure Government regions and available services, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia).
### Quickstarts
-Navigate through the links below to get started using Azure Government.
-- [Login to Azure Government Portal](./documentation-government-get-started-connect-with-portal.md)
+Navigate through the links below to get started using Azure Government:
+
+- [Login to Azure Government portal](./documentation-government-get-started-connect-with-portal.md)
- [Connect with PowerShell](./documentation-government-get-started-connect-with-ps.md)
- [Connect with CLI](./documentation-government-get-started-connect-with-cli.md)
- [Connect with Visual Studio](./documentation-government-connect-vs.md)
Navigate through the links below to get started using Azure Government.
- [Connect with Azure SDK for Python](/azure/developer/python/azure-sdk-sovereign-domain)

### Azure Government Video Library
+
The [Azure Government video library](https://aka.ms/AzureGovVideos) contains many helpful videos to get you up and running with Azure Government.
-## Compliance - Azure Blueprint
-The [Azure Blueprint](../governance/blueprints/overview.md) program is designed to facilitate the secure and compliant use of Azure for government agencies and third-party providers building on behalf of government.
+## Compliance
For more information on Azure Government Compliance, refer to the [compliance documentation](./documentation-government-plan-compliance.md) and watch this [video](https://channel9.msdn.com/blogs/Azure-Government/Compliance-on-Azure-Government).
+### Azure Blueprints
+
+[Azure Blueprints](../governance/blueprints/overview.md) is a service that helps you deploy and update cloud environments in a repeatable manner using composable artifacts such as Azure Resource Manager templates to provision resources, role-based access controls, and policies. Resources provisioned through Azure Blueprints adhere to an organization's standards, patterns, and compliance requirements. The overarching goal of Azure Blueprints is to help automate compliance and cybersecurity risk management in cloud environments. To help you deploy a core set of policies for any Azure-based architecture that requires compliance with certain US government compliance requirements, see [Azure Blueprint samples](/azure/governance/blueprints/samples/).
+ ## Endpoint mapping
-Service endpoints in Azure Government are different than in Azure. For a mapping between Azure and Azure Government endpoints, see [Compare Azure Government and global Azure](./compare-azure-government-global-azure.md#guidance-for-developers).
+
+Service endpoints in Azure Government are different from those in global Azure. For a mapping between Azure and Azure Government endpoints, see [Compare Azure Government and global Azure](./compare-azure-government-global-azure.md#guidance-for-developers).
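To make the endpoint difference concrete, the following minimal sketch (not part of the linked article) shows one way to point the Azure SDK for Python at Azure Government. The subscription ID is a placeholder; the authority and Resource Manager endpoint values are the publicly documented Azure Government defaults.

```python
# Minimal sketch: target Azure Government endpoints with the Azure SDK for Python.
# The subscription ID below is a placeholder.
from azure.identity import AzureAuthorityHosts, DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Authenticate against the Azure Government authority (login.microsoftonline.us).
credential = DefaultAzureCredential(authority=AzureAuthorityHosts.AZURE_GOVERNMENT)

# Point Azure Resource Manager calls at the Azure Government endpoint.
client = ResourceManagementClient(
    credential,
    subscription_id="<subscription-id>",
    base_url="https://management.usgovcloudapi.net",
    credential_scopes=["https://management.usgovcloudapi.net/.default"],
)

# Listing resource groups should return Azure Government locations
# such as usgovvirginia or usgovtexas.
for group in client.resource_groups.list():
    print(group.name, group.location)
```

If the credential and client are left at their defaults, calls target global Azure endpoints instead, which is why the endpoint mapping referenced above matters when moving code between the two clouds.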
## Next steps
+
For more information about Azure Government, see the following resources:

- [Sign up for a trial](https://azure.microsoft.com/global-infrastructure/government/request/?ReqType=Trial)
-- [Acquiring and accessing Azure Government](https://azure.com/gov)
+- [Acquiring and accessing Azure Government](https://azure.microsoft.com/offers/azure-government/)
- [Ask questions via the azure-gov tag in StackOverflow](https://stackoverflow.com/tags/azure-gov)
- [Azure Government Overview](./documentation-government-welcome.md)
- [Azure Government Blog](https://blogs.msdn.microsoft.com/azuregov/)
-- [Azure Compliance](../compliance/index.yml)
+- [Azure Compliance](../compliance/index.yml)
azure-government Documentation Government Overview Dod https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/documentation-government-overview-dod.md
Title: Azure Government DoD Overview | Microsoft Docs
-description: In this article, you can learn about features and guidance on developing applications for Azure Government
+description: Features and guidance for using Azure Government DoD regions
cloud: gov documentationcenter: ''-++ ms.devlang: na na Previously updated : 05/18/2017-+ Last updated : 08/04/2021
-# Department of Defense (DoD) in Azure Government
-## Overview
-Azure Government is used by Department of Defense (DoD) entities to deploy a broad range of workloads and solutions, including those workloads covered by<a href="https://dl.dod.cyber.mil/wp-content/uploads/cloud/pdf/Cloud_Computing_SRG_v1r3_Revision_History.pdf"> The DoD Cloud Computing Security Requirements Guide, Version 1, Release 3</a> at Impact Level 4 (L4), and Impact Level 5 (L5).
-
-Azure Government is the first and only hyperscale commercial cloud service to be awarded an Information Impact Level 5 DoD Provisional Authorization by the Defense Information Systems Agency. In addition, Azure Government regions dedicated to US Department of Defense customer workloads are now generally available.
-
-One of the key drivers for the DoD in moving to the cloud is to enable organizations to focus on their missions and minimize the distractions of building and managing in-house IT solutions.
-
-Azure Government-based cloud architectures allow DoD personnel to focus on mission objectives, and managing IT commodity services such as SharePoint and other application workloads. This allows for the realignment of critical IT resources to focus on application development, analytics, and cyber security.
-
-The elasticity and flexibility delivered by Azure provides enormous benefits to DoD customers. It is simpler, quicker, and more cost-effective to scale-up a workload in the cloud than it is to go through traditional hardware and services procurement processes when working on-premises, or in DoD data centers. For example, to procure new multi-server hardware, even for a test environment, may take many months, and require the approval of significant capital expenditure. By contrast, using Azure, a test migration for an existing workload can be configured in weeks or even days, and in a cost-effective manner (when the test is over, the environment can be torn down with no ongoing costs).
-
-This flexibility is significant. By moving to Azure, DoD customers do not just save money; the cloud delivers new opportunities. For example, it is easy to spin up a test environment to gain insights into new technologies, you can migrate an application and test it in Azure before committing to a production deployment in the cloud. Mission owners can explore more cost effective options easier, and without risk.
-
-Security is another key area, and although any cloud deployment requires proper planning to ensure secure and reliable service delivery, in reality most properly configured cloud-based workloads (up to and including L4 workloads) in Azure Government will be more secure than many traditional deployments in DoD locations and data centers. This is because defense agencies have the experience and expertise to physically secure all assets; however, the IT surface areas present different challenges. Cyber security is a rapidly changing space, requiring specialist skills and the ability to rapidly develop and deploy counter-measures as required. The Azure platform, both commercial and Government, now supports hundreds of thousands of customers, and this scale enables Microsoft to quickly detect evolving attack vectors, and then direct its resources onto rapid development and implementation of the appropriate defenses.
-
-## DoD Region Q&A
-
-### What are the Azure Government DoD Regions? 
-The US DoD East and US DoD Central regions are physically separated regions of Microsoft Azure architected to meet US Department of Defense (DoD) security requirements for cloud computing, specifically for data designated as DoD Impact Level 5 per the DoD Cloud Computing Security Requirements Guide (SRG).   
-
-### What is the difference between Azure Government and the Azure Government DoD Regions? 
-Azure Government is a US government community cloud providing services for Federal, State and Local government customers, tribal, entities subject to ITAR, and solution providers performing work on their behalf. All Azure Government regions are architected and operated to meet the security requirements for DoD Impact Level 5 data and FedRAMP High standards.
-
-The Azure Government DoD regions are architected to support the physical separation requirements for Impact Level 5 data by providing dedicated compute and storage infrastructure for the use of DoD customers only.  
-
-#### What is the difference between Impact Level 4 and Impact Level 5 data?  
-Impact Level 4 data is controlled unclassified information (CUI) that may include data subject to export control, privacy information protected health information and other data requiring explicit CUI designation (e.g. For Official Use Only, Law Enforcement Sensitive, Sensitive Security Information).
-
-Impact Level 5 data includes controlled, unclassified information (CUI) that requires a higher level of protection as deemed necessary by the information owner, public law or government regulation.  Impact Level 5 data is inclusive of unclassified National Security Systems.  More information on the SRG impact levels, their distinguishing requirements and characteristics is available in section 3 of the DoD Cloud Computing Security Requirements Guide.  
-
-### What Data is categorized as Impact Level 5? 
-Level 5 accommodates controlled unclassified information (CUI) that requires a higher level of protection than that afforded by Level 4 as deemed necessary by the information owner, public law, or other government regulations. Level 5 also supports unclassified National Security Systems (NSSs).  This level accommodates NSS and CUI information categorizations based on CNSSI-1253 up to moderate confidentiality and moderate integrity (M-M-x).
-
-### What is Microsoft doing differently to support Impact Level 5 data? 
-Impact Level 5 data by definition can only be processed in a dedicated infrastructure that ensures physical separation of DoD customers from non-Federal government tenants.  In delivering the US DoD East and US DoD Central regions, Microsoft is providing an exclusive service for DoD customers that meets an even higher bar than DoD’s stated requirements and exceeds the level of protection and capability offered by any other hyperscale commercial cloud solution.
-### Do these regions support classified data requirements? 
-These Azure Government DoD regions support only unclassified data up to and including Impact Level 5.  Impact Level 6 data is defined as classified information up to Secret.
-
-### What organizations in the DoD can use the Azure Government DoD Regions? 
-The US DoD East and US DoD Central regions are built to support the US Department of Defense customer base. This includes:
-* The Office of the Secretary of Defense
-* The Joint Chiefs of Staff
-* The Joint Staff
-* The Defense Agencies
-* Department of Defense Field Activities
-* The Department of the Army
-* The Department of the Navy (including the United States Marine Corps)
-* The Department of the Air Force
-* The United States Coast Guard
-* The unified combatant commands
-* Other offices, agencies, activities, and commands under the control or supervision of any approved entity named above
-
-### Are the DoD regions more secure? 
-Microsoft operates all of its Azure datacenters and supporting infrastructure to comply with local and international standards for security and compliance – leading all commercial cloud platforms in compliance investment and achievements. These new DoD regions will provide specific assurances and commitments to meet the requirements defined in the DoD SRG for Cloud Computing.
-
-### Why are there multiple DoD regions? 
-By having multiple DoD regions, Microsoft provides customers with the opportunity to architect their solutions for disaster recovery scenarios across regions to ensure business continuity and satisfy requirements for system accreditation.  In addition, customers may optimize performance by deploying solutions in the geography within closest proximity to their physical location.
-
-### Are these DoD regions connected to the NIPRNet? 
-The DoD mandates that commercial cloud services used for CUI must be connected to customers through a Cloud Access Point (CAP).  Therefore, the Azure DoD regions are connected to the NIPRNet through redundant connections to multiple geographically distributed CAPs.  A DoD CAP is a system of network boundary protection and monitoring devices that offer protection to DoD information system network and services.
-
-### What Does General Availability Mean? 
-General Availability means that the DoD regions in Azure Government may be used to support production workloads and that financially backed SLAs for all services deployed in the regions and also generally available will be supported.
-
-### How does a DoD customer acquire Azure Government DoD services? 
-Azure Government DoD services may be purchased by qualified entities through the same reseller channels as Azure Government.  In keeping with Microsoft’s commitment to make cloud services acquisition planning and cost estimation simple, pricing for Azure Government DoD regions will be included in the Azure Pricing calculator at the time of general availability.  Azure Government DoD services can quickly scale up or down to match demand, so you only pay for what you use.
-No contractual modifications will be required for Enterprise Agreement customers already using Azure Government.  
-
-### How are the DoD regions priced? 
-The DoD regions utilize region-based pricing.  This means that service costs for validated DoD customers will be based on the Azure Government region in which you run your workloads.  For more specific pricing information, please consult your Microsoft Account Executive.  Pricing for the DoD regions will be provided through the Azure.com calculator at a future date.
-
-### How does a DoD organization get validated for the Azure Government DoD regions? 
-In order to gain access to the Azure DoD regions, customers must complete a pre-qualification process for verifying their organization and intended use of the Azure DoD environment.  After successful completion of the pre-qualification process, Microsoft will provide the organizational applicant with further instructions for creating a subscription, accessing the environment and providing role-based access control to other members of the organization.
-
-### Can independent software vendors and solution providers building on Azure deploy solutions in the Azure Government DoD regions? 
-Solution providers with cloud service offerings built on Azure may operate DoD-only single tenant and multi-tenant solutions in the Azure Government DoD regions.  These providers must first demonstrate eligibility by providing documented evidence of a contract with an approved DoD entity or have a sponsor letter from an approved DoD entity.  Providers offering services in the Azure Government DoD regions must include computer network defense, incident reporting and screened personnel for operating solutions handling Impact Level 5 information in their offering.  Additional guidance for solution providers may be found in the DoD Cloud Computing Security Requirements Guide.
-
-### Will Office 365 or Microsoft Dynamics 365 be a part of this offering? 
-Microsoft is providing Office 365 services for the DoD at Impact Level 5 in conjunction with this offering.  Dynamics 365 is planning to offer Impact Level 5 services from the Azure DoD regions at a future date.
-
-### How do I connect to the DoD Regions once I have a subscription? 
-The DoD regions for Azure Government are available through the Azure Government management portal. DoD customers approved for use will see the regions listed as available options when deploying available services.  For general guidance on managing your Azure Government subscriptions please consult our documentation.
-
-### What services are part of your Impact Level 5 accreditation scope? 
-Azure is an evergreen service where new services and capabilities are being added every week, so the number of services in scope is regularly expanding.  For the most up-to-date information, please visit our <a href="https://www.microsoft.com/en-us/TrustCenter/Compliance/DISA">Microsoft Trust Center</a>.
-
-## <a name="Next-steps"></a>Next steps:
-
-<a href="https://www.microsoft.com/en-us/TrustCenter/Compliance/DISA"> Microsoft Trust Center - DoD web page </a>
-
-<a href="https://dl.dod.cyber.mil/wp-content/uploads/cloud/SRG/index.html"> The DoD Cloud Computing Security Requirements Guide, Version 1, Release 2 </a>
-
-<a href="https://azure.microsoft.com/offers/azure-government/"> Azure Government Reseller Channels</a>
+# Department of Defense (DoD) in Azure Government
-<a href="https://blogs.msdn.microsoft.com/azuregov/">Microsoft Azure Government Blog. </a>
+## Overview
+Azure Government is used by US Department of Defense (DoD) entities to deploy a broad range of workloads and solutions, including workloads subject to the DoD Cloud Computing [Security Requirements Guide](https://dl.dod.cyber.mil/wp-content/uploads/cloud/SRG/index.html) (SRG) Impact Level 4 (IL4) and Impact Level 5 (IL5) restrictions. Azure Government was the first hyperscale cloud services platform to be awarded a DoD IL5 Provisional Authorization (PA) by the Defense Information Systems Agency (DISA). For more information about DISA and DoD IL5, see [Department of Defense (DoD) Impact Level 5](/azure/compliance/offerings/offering-dod-il5) compliance documentation.
+
+Azure Government offers the following regions to DoD mission owners and their partners:
+
+|Regions|Relevant authorizations|# of IL5 PA services|
+||||
+|US Gov Arizona <br/> US Gov Texas <br/> US Gov Virginia|FedRAMP High, DoD IL4, DoD IL5|138|
+|US DoD Central <br/> US DoD East|DoD IL5|64|
+
+**Azure Government regions** (US Gov Arizona, US Gov Texas, and US Gov Virginia) are intended for US federal (including DoD), state, and local government agencies, and their partners. **Azure Government DoD regions** (US DoD Central and US DoD East) are reserved for exclusive DoD use. Separate DoD IL5 PAs are in place for Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) vs. Azure Government DoD regions (US DoD Central and US DoD East).
+
+The primary differences between DoD IL5 PAs that are in place for Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) vs. Azure Government DoD regions (US DoD Central and US DoD East) are:
+
+- **IL5 compliance scope:** Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) have many more services authorized provisionally at DoD IL5, which in turn enables DoD mission owners and their partners to deploy more realistic applications in these regions. For a complete list of services in scope for DoD IL5 PA in Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia), see [Azure Government services by audit scope](./compliance/azure-services-in-fedramp-auditscope.md#azure-government-services-by-audit-scope). For a complete list of Azure Government DoD regions (US DoD Central and US DoD East) services in scope for DoD IL5 PA, see [Azure Government DoD regions IL5 audit scope](#azure-government-dod-regions-il5-audit-scope).
+- **IL5 configuration:** Azure Government DoD regions (US DoD Central and US DoD East) are physically isolated from the rest of Azure Government and reserved for exclusive DoD use. Therefore, no extra configuration is needed in DoD regions when deploying Azure services intended for IL5 workloads. In contrast, some Azure services deployed in Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) require extra configuration to meet DoD IL5 compute and storage isolation requirements, as explained in [Isolation guidelines for Impact Level 5 workloads](./documentation-government-impact-level-5.md).
+
+> [!NOTE]
+> If you are subject to DoD IL5 requirements, we recommend that you prioritize Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) for your workloads, as follows:
+>
+> - **New deployments:** Choose Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) for your new deployments. Doing so will allow you to benefit from the latest cloud innovations while meeting your DoD IL5 isolation requirements.
+> - **Existing deployments:** If you have existing deployments in Azure Government DoD regions (US DoD Central and US DoD East), we encourage you to migrate these workloads to Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) to take advantage of additional services.
+
+Azure provides [extensive support for tenant isolation](./azure-secure-isolation-guidance.md) across compute, storage, and networking services to segregate each customer's applications and data. This approach provides the scale and economic benefits of multi-tenant cloud services while rigorously helping prevent other customers from accessing your data or applications.
+
+Hyperscale cloud also offers a feature-rich environment incorporating the latest cloud innovations such as artificial intelligence, machine learning, IoT services, intelligent edge, and many more to help DoD mission owners implement their mission objectives. Using Azure Government cloud capabilities, you benefit from rapid feature growth, resiliency, and the cost-effective operation of the hyperscale cloud while still obtaining the levels of isolation, security, and confidence required to handle workloads subject to FedRAMP High, DoD IL4, and DoD IL5 requirements.
+
+## Azure Government regions IL5 audit scope
+
+For a complete list of services in scope for DoD IL5 PA in Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia), see [Azure Government services by audit scope](./compliance/azure-services-in-fedramp-auditscope.md#azure-government-services-by-audit-scope).
+
+## Azure Government DoD regions IL5 audit scope
+
+The following services are in scope for DoD IL5 PA in Azure Government DoD regions (US DoD Central and US DoD East):
+
+- [API Management](https://azure.microsoft.com/services/api-management/)
+- [Application Gateway](https://azure.microsoft.com/services/application-gateway/)
+- [Azure Active Directory (Free and Basic)](../active-directory/fundamentals/active-directory-whatis.md#what-are-the-azure-ad-licenses)
+- [Azure Active Directory (Premium P1 + P2)](../active-directory/fundamentals/active-directory-whatis.md#what-are-the-azure-ad-licenses)
+- [Azure Analysis Services](https://azure.microsoft.com/services/analysis-services/)
+- [Azure Backup](https://azure.microsoft.com/services/backup/)
+- [Azure Cache for Redis](https://azure.microsoft.com/services/cache/)
+- [Azure Cosmos DB](https://azure.microsoft.com/services/cosmos-db/)
+- [Azure Database for MySQL](https://azure.microsoft.com/services/mysql/)
+- [Azure Database for PostgreSQL](https://azure.microsoft.com/services/postgresql/)
+- [Azure DNS](https://azure.microsoft.com/services/dns/)
+- [Azure Firewall](https://azure.microsoft.com/services/azure-firewall/)
+- [Azure Front Door](https://azure.microsoft.com/services/frontdoor/)
+- [Azure Functions](https://azure.microsoft.com/services/functions/)
+- [Azure HDInsight](https://azure.microsoft.com/services/hdinsight/)
+- [Azure Lab Services](https://azure.microsoft.com/services/lab-services/)
+- [Azure Logic Apps](https://azure.microsoft.com/services/logic-apps/)
+- [Azure Managed Applications](https://azure.microsoft.com/services/managed-applications/)
+- [Azure Media Services](https://azure.microsoft.com/services/media-services/)
+- [Azure Monitor](https://azure.microsoft.com/services/monitor/)
+- [Azure Resource Manager](https://azure.microsoft.com/features/resource-manager/)
+- [Azure Scheduler](../scheduler/index.yml)
+- [Azure Service Fabric](https://azure.microsoft.com/services/service-fabric/)
+- [Azure Service Manager (RDFE)](/previous-versions/azure/ee460799(v=azure.100))
+- [Azure Site Recovery](https://azure.microsoft.com/services/site-recovery/)
+- [Azure SQL Database](https://azure.microsoft.com/products/azure-sql/database/) (incl. [Azure SQL MI](https://azure.microsoft.com/products/azure-sql/managed-instance/))
+- [Azure Synapse Analytics (formerly SQL Data Warehouse)](https://azure.microsoft.com/services/synapse-analytics/)
+- [Batch](https://azure.microsoft.com/services/batch/)
+- [Cloud Services](https://azure.microsoft.com/services/cloud-services/)
+- [Dynamics 365 Customer Service](/dynamics365/customer-service/overview)
+- [Dynamics 365 Field Service](/dynamics365/field-service/overview)
+- [Dynamics 365 Project Service Automation](/dynamics365/project-operations/psa/overview)
+- [Dynamics 365 Sales](/dynamics365/sales-enterprise/overview)
+- [Event Grid](https://azure.microsoft.com/services/event-grid/)
+- [Event Hubs](https://azure.microsoft.com/services/event-hubs/)
+- [ExpressRoute](https://azure.microsoft.com/services/expressroute/)
+- [Import/Export](https://azure.microsoft.com/services/storage/import-export/)
+- [Key Vault](https://azure.microsoft.com/services/key-vault/)
+- [Load Balancer](https://azure.microsoft.com/services/load-balancer/)
+- [Microsoft Azure portal](https://azure.microsoft.com/features/azure-portal/)
+- [Microsoft Dataverse (formerly Common Data Service)](/powerapps/maker/data-platform/data-platform-intro)
+- [Microsoft Defender for Endpoint (formerly Microsoft Defender Advanced Threat Protection)](/microsoft-365/security/defender-endpoint/microsoft-defender-endpoint)
+- [Microsoft Graph](/graph/overview)
+- [Microsoft Stream](/stream/overview)
+- [Network Watcher](https://azure.microsoft.com/services/network-watcher/)
+- [Network Watcher Traffic Analytics](../network-watcher/traffic-analytics.md)
+- [Power Apps](/powerapps/powerapps-overview)
+- [Power Apps portal](https://powerapps.microsoft.com/portals/)
+- [Power Automate (formerly Microsoft Flow)](/power-automate/getting-started)
+- [Power BI](https://powerbi.microsoft.com/)
+- [Power BI Embedded](https://azure.microsoft.com/services/power-bi-embedded/)
+- [Service Bus](https://azure.microsoft.com/services/service-bus/)
+- [SQL Server Stretch Database](https://azure.microsoft.com/services/sql-server-stretch-database/)
+- [Storage: Blobs](https://azure.microsoft.com/services/storage/blobs/) (incl. [Azure Data Lake Storage Gen2](../storage/blobs/data-lake-storage-introduction.md))
+- [Storage: Disks (incl. Managed Disks)](https://azure.microsoft.com/services/storage/disks/)
+- [Storage: Files](https://azure.microsoft.com/services/storage/files/)
+- [Storage: Queues](https://azure.microsoft.com/services/storage/queues/)
+- [Storage: Tables](https://azure.microsoft.com/services/storage/tables/)
+- [Traffic Manager](https://azure.microsoft.com/services/traffic-manager/)
+- [Virtual Machine Scale Sets](https://azure.microsoft.com/services/virtual-machine-scale-sets/)
+- [Virtual Machines](https://azure.microsoft.com/services/virtual-machines/)
+- [Virtual Network](https://azure.microsoft.com/services/virtual-network/)
+- [VPN Gateway](https://azure.microsoft.com/services/vpn-gateway/)
+- [Web Apps (App Service)](https://azure.microsoft.com/services/app-service/web/)
+
+## Frequently asked questions
+
+### What are the Azure Government DoD regions? 
+Azure Government DoD regions (US DoD Central and US DoD East) are physically separated Azure Government regions reserved for exclusive use by the DoD.
+
+### What is the difference between Azure Government and the Azure Government DoD regions? 
+Azure Government is a US government community cloud providing services for federal, state and local government customers, tribal entities, and other entities subject to various US government regulations such as CJIS, ITAR, and others. All Azure Government regions are designed to meet the security requirements for DoD IL5 workloads. Azure Government DoD regions (US DoD Central and US DoD East) achieve DoD IL5 tenant separation requirements by being dedicated exclusively to DoD. In Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia), some services require extra configuration to meet DoD IL5 compute and storage isolation requirements, as explained in [Isolation guidelines for Impact Level 5 workloads](./documentation-government-impact-level-5.md).
+
+### How do Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) support IL5 data?
+Azure provides [extensive support for tenant isolation](./azure-secure-isolation-guidance.md) across compute, storage, and networking services to segregate each customer's applications and data. This approach provides the scale and economic benefits of multi-tenant cloud services while rigorously helping prevent other customers from accessing your data or applications. Moreover, some Azure services deployed in Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia) require extra configuration to meet DoD IL5 compute and storage isolation requirements, as explained in [Isolation guidelines for Impact Level 5 workloads](./documentation-government-impact-level-5.md).
+
+### What is IL5 data? 
+IL5 accommodates controlled unclassified information (CUI) that requires a higher level of protection than that afforded by IL4 as deemed necessary by the information owner, public law, or other government regulations. IL5 also supports unclassified National Security Systems (NSS). This impact level accommodates NSS and CUI categorizations based on CNSSI 1253 up to moderate confidentiality and moderate integrity (M-M-x). For more information on IL5 data, see [DoD IL5 overview](/azure/compliance/offerings/offering-dod-il5#dod-il5-overview).
+
+### What is the difference between IL4 and IL5 data?  
+IL4 data is controlled unclassified information (CUI) that may include data subject to export control, protected health information, and other data requiring explicit CUI designation (for example, For Official Use Only, Law Enforcement Sensitive, and Sensitive Security Information).
+
+IL5 data includes CUI that requires a higher level of protection as deemed necessary by the information owner, public law, or government regulation. IL5 data is inclusive of unclassified National Security Systems.
+
+### Do Azure Government regions support classified data such as IL6? 
+No. Azure Government regions support only unclassified data up to and including IL5. In contrast, IL6 data is defined as classified information up to Secret, and can be accommodated in [Azure Government Secret](https://azure.microsoft.com/global-infrastructure/government/national-security/).
+
+### What DoD organizations can use Azure Government? 
+All Azure Government regions are built to support DoD customers, including:
+
+- The Office of the Secretary of Defense
+- The Joint Chiefs of Staff
+- The Joint Staff
+- The Defense Agencies
+- Department of Defense Field Activities
+- The Department of the Army
+- The Department of the Navy (including the United States Marine Corps)
+- The Department of the Air Force
+- The United States Coast Guard
+- The unified combatant commands
+- Other offices, agencies, activities, and commands under the control or supervision of any approved entity named above
+
+### What services are part of your IL5 authorization scope? 
+For a complete list of services in scope for DoD IL5 PA in Azure Government regions (US Gov Arizona, US Gov Texas, and US Gov Virginia), see [Azure Government services by audit scope](./compliance/azure-services-in-fedramp-auditscope.md#azure-government-services-by-audit-scope). For a complete list of services in scope for DoD IL5 PA in Azure Government DoD regions (US DoD Central and US DoD East), see [Azure Government DoD regions IL5 audit scope](#azure-government-dod-regions-il5-audit-scope).
+
+## Next steps
+
+- [Acquiring and accessing Azure Government](https://azure.microsoft.com/offers/azure-government/)
+- [How to buy Azure Government](https://azure.microsoft.com/global-infrastructure/government/how-to-buy/)
+- [Get started with Azure Government](./documentation-government-get-started-connect-with-portal.md)
+- [Azure Government Blog](https://devblogs.microsoft.com/azuregov/)
azure-government Documentation Government Plan Security https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/documentation-government-plan-security.md
Title: Azure Government Security
-description: Customer guidance and best practices for securing their workloads.
+description: Customer guidance and best practices for securing Azure workloads.
+ Previously updated : 02/26/2021- Last updated : 08/03/2021

# Azure Government security
Azure provides extensive options for [encrypting data at rest](../security/funda
Azure provides many options for [encrypting data in transit](../security/fundamentals/encryption-overview.md#encryption-of-data-in-transit). Data encryption in transit isolates customer network traffic from other traffic and helps protect data from interception. For more information, see [Data encryption in transit](./azure-secure-isolation-guidance.md#data-encryption-in-transit).
-The basic encryption available for connectivity to Azure Government supports Transport Layer Security (TLS) 1.2 protocol and X.509 certificates. Federal Information Processing Standard (FIPS) 140-2 validated cryptographic algorithms are also used for infrastructure network connections between Azure Government datacenters. Windows, Windows Server, and Azure File shares can use SMB 3.0 for encryption between the VM and the file share. Use client-side encryption to encrypt the data before it is transferred into storage in a client application, and to decrypt the data after it is transferred out of storage.
+The basic encryption available for connectivity to Azure Government supports Transport Layer Security (TLS) 1.2 protocol and X.509 certificates. Federal Information Processing Standard (FIPS) 140 validated cryptographic algorithms are also used for infrastructure network connections between Azure Government datacenters. Windows, Windows Server, and Azure File shares can use SMB 3.0 for encryption between the VM and the file share. Use client-side encryption to encrypt the data before it is transferred into storage in a client application, and to decrypt the data after it is transferred out of storage.
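As a hedged illustration of the client-side encryption guidance above (this sketch is not from the article, and the account, container, and key handling are simplified placeholders), data can be encrypted in the client application so that only ciphertext is transferred into storage, with decryption happening only after the data is transferred back out:

```python
# Illustrative sketch: encrypt client-side before uploading to Azure Government storage.
# Account, container, and blob names are placeholders; keep real keys in Key Vault.
from azure.identity import AzureAuthorityHosts, DefaultAzureCredential
from azure.storage.blob import BlobServiceClient
from cryptography.fernet import Fernet

# Symmetric key held by the client application.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"sensitive payload")

# Note the Azure Government blob endpoint suffix: core.usgovcloudapi.net.
service = BlobServiceClient(
    account_url="https://<account>.blob.core.usgovcloudapi.net",
    credential=DefaultAzureCredential(authority=AzureAuthorityHosts.AZURE_GOVERNMENT),
)
blob = service.get_blob_client(container="protected", blob="payload.bin")
blob.upload_blob(ciphertext, overwrite=True)  # the transfer itself still uses TLS

# Decrypt locally after the data is transferred out of storage.
plaintext = Fernet(key).decrypt(blob.download_blob().readall())
```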
### Best practices for encryption
The basic encryption available for connectivity to Azure Government supports Tra
## Managing secrets
-Proper protection and management of encryption keys is essential for data security. Customers should strive to simplify key management and maintain control of keys used by cloud applications and services to encrypt data. [Azure Key Vault](../key-vault/index.yml) is a cloud service for securely storing and managing secrets. Key Vault enables customers to store their encryption keys in hardware security modules (HSMs) that are FIPS 140-2 validated. For more information, see [Data encryption key management](./azure-secure-isolation-guidance.md#data-encryption-key-management).
+Proper protection and management of encryption keys is essential for data security. Customers should strive to simplify key management and maintain control of keys used by cloud applications and services to encrypt data. [Azure Key Vault](../key-vault/index.yml) is a cloud service for securely storing and managing secrets. Key Vault enables customers to store their encryption keys in hardware security modules (HSMs) that are [FIPS 140](/azure/compliance/offerings/offering-fips-140-2) validated. For more information, see [Data encryption key management](./azure-secure-isolation-guidance.md#data-encryption-key-management).
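As a small sketch of the URI-reference pattern recommended in the best practices below (the vault and secret names are assumptions, not values from the article), application configuration carries only the Key Vault URI and the secret name, and the secret value is retrieved at run time:

```python
# Sketch: code and configuration hold only a reference to the secret, never its value.
# The vault name and secret name are placeholders.
from azure.identity import AzureAuthorityHosts, DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Azure Government Key Vault endpoints use the vault.usgovcloudapi.net suffix.
VAULT_URI = "https://<vault-name>.vault.usgovcloudapi.net"  # typically read from configuration

credential = DefaultAzureCredential(authority=AzureAuthorityHosts.AZURE_GOVERNMENT)
client = SecretClient(vault_url=VAULT_URI, credential=credential)

# The secret value exists only in memory at run time and never in source control.
db_password = client.get_secret("db-password").value
```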
### Best practices for managing secrets

- Use Key Vault to minimize the risks of secrets being exposed through hard-coded configuration files, scripts, or in source code. For added assurance, you can import or generate keys in Azure Key Vault HSMs.
-- Application code and templates should only contain URI references to the secrets, which means the actual secrets are not in code, configuration, or source code repositories. This approach prevents key phishing attacks on internal or external repositories, such as harvest-bots in GitHub.
+- Application code and templates should only contain URI references to the secrets, meaning the actual secrets are not in code, configuration, or source code repositories. This approach prevents key phishing attacks on internal or external repositories, such as harvest-bots in GitHub.
- Utilize strong Azure role-based access control (RBAC) within Key Vault. If a trusted operator leaves the company or transfers to a new group within the company, they should be prevented from being able to access the secrets.

## Understanding isolation
Isolation in Azure Government is achieved through the implementation of trust bo
### Environment isolation
-The Azure Government multi-tenant cloud platform environment is an Internet standards-based Autonomous System (AS) that is physically isolated and separately administered from the rest of Azure public cloud. The AS as defined by [IETF RFC 4271](https://datatracker.ietf.org/doc/rfc4271/) is comprised of a set of switches and routers under a single technical administration, using an interior gateway protocol and common metrics to route packets within the AS, and using an exterior gateway protocol to route packets to other ASs though a single and clearly defined routing policy. In addition, Azure Government for DoD regions within Azure Government are geographically separated physical instances of compute, storage, SQL, and supporting services that store and/or process customer content in accordance with DoD Cloud Computing Security Requirements Guide (SRG) [Section 5.2.2.3](https://dl.dod.cyber.mil/wp-content/uploads/cloud/SRG/index.html#5.2LegalConsiderations) requirements.
+The Azure Government multi-tenant cloud platform environment is an Internet standards-based Autonomous System (AS) that is physically isolated and separately administered from the rest of Azure public cloud. The AS as defined by [IETF RFC 4271](https://datatracker.ietf.org/doc/rfc4271/) is composed of a set of switches and routers under a single technical administration, using an interior gateway protocol and common metrics to route packets within the AS, and using an exterior gateway protocol to route packets to other ASs through a single and clearly defined routing policy. In addition, Azure Government for DoD regions within Azure Government are geographically separated physical instances of compute, storage, SQL, and supporting services that store and/or process customer content in accordance with DoD Impact Level 5 (IL5) tenant separation requirements, as stated in the DoD Cloud Computing Security Requirements Guide (SRG) [Section 5.2.2.3](https://dl.dod.cyber.mil/wp-content/uploads/cloud/SRG/index.html#5.2LegalConsiderations).
The isolation of the Azure Government environment is achieved through a series of physical and logical controls, and associated capabilities that include:

- Physically isolated hardware
- Physical barriers to the hardware using biometric devices and cameras
- Conditional access (Azure RBAC, workflow)
-- Specific credentials and multi-factor authentication for logical access
+- Specific credentials and multifactor authentication for logical access
- Infrastructure for Azure Government is located within the United States

Within the Azure Government network, internal network system components are isolated from other system components through implementation of separate subnets and access control policies on management interfaces. Azure Government does not directly peer with the public internet or with the Microsoft corporate network. Azure Government directly peers to the commercial Microsoft Azure network, which has routing and transport capabilities to the Internet and the Microsoft Corporate network. Azure Government limits its exposed surface area by applying extra protections and communications capabilities of our commercial Azure network. In addition, Azure Government ExpressRoute (ER) uses peering with our customer's networks over non-Internet private circuits to route ER customer "DMZ" networks using specific Border Gateway Protocol (BGP)/AS peering as a trust boundary for application routing and associated policy enforcement.
Microsoft takes strong measures to protect customer data from inappropriate acce
Microsoft engineers can be granted access to customer data using temporary credentials via **Just-in-Time (JIT)** access. There must be an incident logged in the Azure Incident Management system that describes the reason for access, approval record, what data was accessed, etc. This approach ensures that there is appropriate oversight for all access to customer data and that all JIT actions (consent and access) are logged for audit. Evidence that procedures have been established for granting temporary access for Azure personnel to customer data and applications upon appropriate approval for customer support or incident handling purposes is available from the Azure [SOC 2 Type 2 attestation report](https://aka.ms/azuresoc2auditreport) produced by an independent third-party auditing firm.
-JIT access works with multi-factor authentication that requires Microsoft engineers to use a smartcard to confirm their identity. All access to production systems is performed using Secure Admin Workstations (SAWs) that are consistent with published guidance on [securing privileged access](/security/compass/overview). Use of SAWs for access to production systems is required by Microsoft policy and compliance with this policy is closely monitored. These workstations use a fixed image with all software fully managed – only select activities are allowed and users cannot accidentally circumvent the SAW design since they do not have admin privileges on these machines. Access is permitted only with a smartcard and access to each SAW is limited to specific set of users.
+JIT access works with multifactor authentication that requires Microsoft engineers to use a smartcard to confirm their identity. All access to production systems is performed using Secure Admin Workstations (SAWs) that are consistent with published guidance on [securing privileged access](/security/compass/overview). Use of SAWs for access to production systems is required by Microsoft policy and compliance with this policy is closely monitored. These workstations use a fixed image with all software fully managed – only select activities are allowed and users cannot accidentally circumvent the SAW design since they do not have admin privileges on these machines. Access is permitted only with a smartcard and access to each SAW is limited to a specific set of users.
### Customer Lockbox
With Azure Monitor, customers can get a 360-degree view of their applications, i
**[Azure Blueprints](../governance/blueprints/overview.md)** is a service that helps customers deploy and update cloud environments in a repeatable manner using composable artifacts such as Azure Resource Manager templates to provision resources, role-based access controls, and policies that adhere to an organization's standards, patterns, and requirements. Customers can use pre-defined standard blueprints and customize these solutions to meet specific requirements, including data encryption, host and service configuration, network and connectivity configuration, identity, and other security aspects of deployed resources. The overarching goal of Azure Blueprints is to help automate compliance and cybersecurity risk management in cloud environments. For more information on Azure Blueprints, including production-ready blueprint solutions for ISO 27001, NIST SP 800-53, PCI DSS, HITRUST, and other standards, see the [Azure Blueprint samples](../governance/blueprints/samples/index.md).

## Next steps
-For supplemental information and updates, subscribe to the
-<a href="https://devblogs.microsoft.com/azuregov/">Microsoft Azure Government Blog. </a>
+
+For supplemental information and updates, subscribe to the [Microsoft Azure Government Blog](https://devblogs.microsoft.com/azuregov/).
azure-government Documentation Government Welcome https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-government/documentation-government-welcome.md
Title: Azure Government Overview | Microsoft Docs
-description: 'This article provides an overview of the Azure Government Cloud capabilities and the trustworthy design and security used to support compliance applicable to federal, state, and local government organizations and their partners. '
+description: Overview of Azure Government capabilities, including security and compliance capabilities applicable to federal, state, and local government organizations and their partners
cloud: gov documentationcenter: ''-++ ms.devlang: na na Previously updated : 09/17/2018-
-#Customer intent: As the chairman of the municipal council, I want to find out if Azure Government will meet our security and compliance requirements.
Last updated : 08/02/2021
+
# What is Azure Government?
-US government agencies or their partners interested in cloud services that meet government security and compliance requirements, can be confident that [Microsoft Azure Government](https://azure.microsoft.com/global-infrastructure/government/) provides world-class [security, protection, and compliance services](https://www.microsoft.com/TrustCenter/Compliance/default.aspx). Azure Government delivers a dedicated cloud enabling government agencies and their partners to transform mission-critical workloads to the cloud. Azure Government services handle data that is subject to certain government regulations and requirements, such as FedRAMP, NIST 800.171 (DIB), ITAR, IRS 1075, DoD L4, and CJIS. In order to provide you with the highest level of security and compliance, Azure Government uses physically isolated datacenters and networks (located in U.S. only).
+US government agencies or their partners interested in cloud services that meet government security and compliance requirements can be confident that [Microsoft Azure Government](https://azure.microsoft.com/global-infrastructure/government/) provides world-class [security, protection, and compliance services](../compliance/index.yml). Azure Government delivers a dedicated cloud enabling government agencies and their partners to transform mission-critical workloads to the cloud. Azure Government services handle data that is subject to various government regulations and requirements, such as FedRAMP, DoD IL4 and IL5, CJIS, IRS 1075, ITAR, CMMC, NIST 800-171, and others. To provide you with the highest level of security and compliance, Azure Government uses physically isolated datacenters and networks (located in the US only).
Azure Government customers (US federal, state, and local government or their partners) are subject to validation of eligibility. If there is a question about eligibility for Azure Government, you should consult your account team. To sign up for trial, request your [trial subscription](https://azure.microsoft.com/global-infrastructure/government/request/?ReqType=Trial).
-The following video gives a good introduction of Azure Government.
+The following video provides a good introduction to Azure Government:
> [!VIDEO https://channel9.msdn.com/Shows/Azure-Friday/Azure-Government/player]

## Compare Azure Government and global Azure
-Azure Government uses same underlying technologies as global Azure, which includes the core components of [Infrastructure-as-a-Service (IaaS)](https://azure.microsoft.com/overview/what-is-iaas/), [Platform-as-a-Service (PaaS)](https://azure.microsoft.com/overview/what-is-paas/), and [Software-as-a-Service (SaaS)](https://azure.microsoft.com/overview/what-is-saas/). Azure Government includes Geo-Synchronous data replication, auto scaling, network, storage, data management, identity management, among other services. However, there are some key differences that developers working on applications hosted in Azure Government must be aware of. For detailed information, see [Guidance for developers](documentation-government-developer-guide.md).
+Azure Government uses the same underlying technologies as global Azure, which includes the core components of [Infrastructure-as-a-Service (IaaS)](https://azure.microsoft.com/overview/what-is-iaas/), [Platform-as-a-Service (PaaS)](https://azure.microsoft.com/overview/what-is-paas/), and [Software-as-a-Service (SaaS)](https://azure.microsoft.com/overview/what-is-saas/). Azure Government includes geo-synchronous data replication, auto scaling, network, storage, data management, identity management, and many other services. For service availability in Azure Government, see [Products available by region](https://azure.microsoft.com/global-infrastructure/services/?products=all&regions=non-regional,usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-texas,usgov-virginia). Services available in Azure Government are listed by category and whether they are Generally Available or available through Preview.
-As a developer, you must know how to connect to Azure Government and once you connect you will mostly have the same experience as global Azure. To see variations between Azure Government and global Azure, see [Compare Azure Government and global Azure](compare-azure-government-global-azure.md) and click on individual service.
+There are some key differences that developers working on applications hosted in Azure Government must be aware of. For detailed information, see [Guidance for developers](./documentation-government-developer-guide.md). As a developer, you must know how to connect to Azure Government, and once you connect, you will mostly have the same experience as in global Azure. To see variations between Azure Government and global Azure, see [Compare Azure Government and global Azure](./compare-azure-government-global-azure.md) and select the individual service.
## Get started
-To start using Azure Government, first check out [Guidance for developers](documentation-government-developer-guide.md). Then, use one of the following quickstarts that show you how to connect to Azure Government.
+To start using Azure Government, first check out [Guidance for developers](./documentation-government-developer-guide.md). Then, use one of the following guides that show you how to connect to Azure Government:
-* [Connect with the Azure Government portal](documentation-government-get-started-connect-with-portal.md)
-* [Connect with the Azure CLI](documentation-government-get-started-connect-with-cli.md)
-* [Connect with PowerShell](documentation-government-get-started-connect-with-ps.md)
-* [Deploy with Azure DevOps Services](connect-with-azure-pipelines.md)
-* [Connect with SQL Server Management Studio](documentation-government-connect-ssms.md)
-* [Connect to Storage in Azure Government](documentation-government-get-started-connect-to-storage.md)
+- [Connect with Azure Government portal](./documentation-government-get-started-connect-with-portal.md)
+- [Connect with Azure CLI](./documentation-government-get-started-connect-with-cli.md)
+- [Connect with PowerShell](./documentation-government-get-started-connect-with-ps.md)
+- [Deploy with Azure DevOps Services](./connect-with-azure-pipelines.md)
+- [Develop with SQL Server Management Studio](./documentation-government-connect-ssms.md)
+- [Develop with Storage API on Azure Government](./documentation-government-get-started-connect-to-storage.md)
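For example, the PowerShell path above amounts to signing in against the Azure Government environment instead of the default public cloud. A minimal sketch, assuming the Az PowerShell module is installed:

```powershell
# Sign in to the Azure Government cloud instead of global Azure.
Connect-AzAccount -Environment AzureUSGovernment

# Confirm that subsequent cmdlets target Azure Government endpoints.
(Get-AzContext).Environment
```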
## Next steps
-Learn more about Azure Government: [Acquiring and accessing Azure Government](https://azure.microsoft.com/offers/azure-government/)
-
-[Get started with Azure Government](documentation-government-get-started-connect-with-portal.md).
-
-View [YouTube videos](https://www.youtube.com/playlist?list=PLLasX02E8BPA5IgCPjqWms5ne5h4briK7).
+- [Acquiring and accessing Azure Government](https://azure.microsoft.com/offers/azure-government/)
+- [Get started with Azure Government](./documentation-government-get-started-connect-with-portal.md)
+- View [YouTube videos](https://www.youtube.com/playlist?list=PLLasX02E8BPA5IgCPjqWms5ne5h4briK7)
azure-monitor Agents Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/agents/agents-overview.md
Use the Azure Monitor agent if you need to:
- Use different [solutions](../monitor-reference.md#insights-and-core-solutions) to monitor a particular service or application. */ --> Limitations of the Azure Monitor Agent include:-- Cannot use the Log Analytics solutions in production (only available in preview, [see what's supported](../faq.yml#which-log-analytics-solutions-are-supported-on-the-new-azure-monitor-agent-)).
+- Cannot use the Log Analytics solutions in production (only available in preview, [see what's supported](./azure-monitor-agent-overview.md#supported-services-and-features)).
- No support yet for networking scenarios involving private links. - No support yet for collecting custom logs (files) or IIS log files. - No support yet for Event Hubs and Storage accounts as destinations.
azure-monitor Azure Monitor Agent Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/agents/azure-monitor-agent-overview.md
In addition to consolidating this functionality into a single agent, the Azure M
- Windows event filtering: Use XPATH queries to filter which Windows events are collected. - Improved extension management: Azure Monitor agent uses a new method of handling extensibility that is more transparent and controllable than management packs and Linux plug-ins in the current Log Analytics agents.
+### Current limitations
+When compared with the existing agents, this new agent does not yet have full parity:
+- **Comparison with Log Analytics Agents (MMA/OMS)**
+ - Not all Log Analytics Solutions are supported today. See [what's supported](#supported-services-and-features).
+ - No support for Private Links
+ - No support for collecting custom logs or IIS logs
+
+- **Comparison with Azure Diagnostic Extensions (WAD/LAD)**
+ - No support for Event Hubs and Storage accounts as destinations
+ ### Changes in data collection The methods for defining data collection for the existing agents are distinctly different from each other, and each have challenges that are addressed with Azure Monitor agent.
azure-monitor Azure Web Apps https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/app/azure-web-apps.md
Title: Monitor Azure app services performance | Microsoft Docs description: Application performance monitoring for Azure app services. Chart load and response time, dependency information, and set alerts on performance. Previously updated : 05/17/2021 Last updated : 08/04/2021
There are two ways to enable application monitoring for Azure App Services hoste
* **Manually instrumenting the application through code** by installing the Application Insights SDK.
- * This approach is much more customizable, but it requires [adding a dependency on the Application Insights SDK NuGet packages](./asp-net.md). This method, also means you have to manage the updates to the latest version of the packages yourself.
+ * This approach is much more customizable, but it requires adding a dependency: the SDK [for .NET Core](./asp-net-core.md), [.NET](./asp-net.md), [Node.js](./nodejs.md), or [Python](./opencensus-python.md), or a standalone agent for [Java](./java-in-process-agent.md). This method also means that you have to manage updates to the latest version of the packages yourself.
* If you need to make custom API calls to track events/dependencies not captured by default with agent-based monitoring, you would need to use this method. Check out the [API for custom events and metrics article](./api-custom-events-metrics.md) to learn more. This is also currently the only supported option for Linux based workloads. > [!NOTE]
-> If both agent-based monitoring and manual SDK-based instrumentation is detected, only the manual instrumentation settings will be honored. This is to prevent duplicate data from being sent. To learn more about this, check out the [troubleshooting section](#troubleshooting) below.
+> If both agent-based monitoring and manual SDK-based instrumentation are detected, in .NET only the manual instrumentation settings are honored, while in Java only the agent-based instrumentation emits the telemetry. This prevents duplicate data from being sent. To learn more, check out the [troubleshooting section](#troubleshooting) below.
+
+> [!NOTE]
+> Snapshot Debugger and Profiler are available only in .NET and .NET Core.
## Enable agent-based monitoring
There are two ways to enable application monitoring for Azure App Services hoste
![Under Settings, choose Application Insights](./media/azure-web-apps/settings-app-insights-01.png)
- * Choose to create a new resource, unless you already set up an Application Insights resource for this application.
+ * Choose to create a new resource, or select an existing Application Insights resource for this application.
- > [!NOTE]
- > When you click **OK** to create the new resource you will be prompted to **Apply monitoring settings**. Selecting **Continue** will link your new Application Insights resource to your app service, doing so will also **trigger a restart of your app service**.
+ > [!NOTE]
+ > When you click **OK** to create the new resource you will be prompted to **Apply monitoring settings**. Selecting **Continue** will link your new Application Insights resource to your app service, doing so will also **trigger a restart of your app service**.
- ![Instrument your web app](./media/azure-web-apps/create-resource-01.png)
+ >[!div class="mx-imgBorder"]
+ >![Instrument your web app](./media/azure-web-apps/ai-create-new.png)
2. After specifying which resource to use, you can choose how you want Application Insights to collect data per platform for your application. ASP.NET app monitoring is on by default with two different levels of collection.
Targeting the full framework from ASP.NET Core, self-contained deployment, and L
![Under Settings, choose Application Insights](./media/azure-web-apps/settings-app-insights-01.png)
- * Choose to create a new resource, unless you already set up an Application Insights resource for this application.
+ * Choose to create a new resource, or select an existing Application Insights resource for this application.
- > [!NOTE]
- > When you click **OK** to create the new resource you will be prompted to **Apply monitoring settings**. Selecting **Continue** will link your new Application Insights resource to your app service, doing so will also **trigger a restart of your app service**.
+ > [!NOTE]
+ > When you click **OK** to create the new resource you will be prompted to **Apply monitoring settings**. Selecting **Continue** will link your new Application Insights resource to your app service, doing so will also **trigger a restart of your app service**.
- ![Instrument your web app](./media/azure-web-apps/create-resource-01.png)
+ >[!div class="mx-imgBorder"]
+ >![Instrument your web app](./media/azure-web-apps/ai-create-new.png)
2. After specifying which resource to use, you can choose how you want Application Insights to collect data per platform for your application. ASP.NET Core offers **Recommended collection** or **Disabled** for ASP.NET Core 2.1 and 3.1.
You can monitor your Node.js apps running in Azure App Service without any code
* Choose to create a new resource, unless you already set up an Application Insights resource for this application.
- > [!NOTE]
- > When you click **OK** to create the new resource you will be prompted to **Apply monitoring settings**. Selecting **Continue** will link your new Application Insights resource to your app service, doing so will also **trigger a restart of your app service**.
+ > [!NOTE]
+ > When you click **OK** to create the new resource you will be prompted to **Apply monitoring settings**. Selecting **Continue** will link your new Application Insights resource to your app service, doing so will also **trigger a restart of your app service**.
- ![Instrument your web app.](./media/azure-web-apps/create-resource-01.png)
+ >[!div class="mx-imgBorder"]
+ >![Instrument your web app.](./media/azure-web-apps/ai-create-new.png)
2. Once you have specified which resource to use, you are all set to go.
You can turn on monitoring for your Java apps running in Azure App Service just
> [!div class="mx-imgBorder"] > ![Under Settings, choose Application Insights.](./media/azure-web-apps/ai-enable.png)
- * Choose to create a new resource, unless you already set up an application insights resource for this application.
+ * Choose to create a new resource, or select an existing Application Insights resource for this application.
- > [!NOTE]
- > When you click **OK** to create the new resource you will be prompted to **Apply monitoring settings**. Selecting **Continue** will link your new Application Insights resource to your app service, doing so will also **trigger a restart of your app service**.
+ > [!NOTE]
+ > When you click **OK** to create the new resource you will be prompted to **Apply monitoring settings**. Selecting **Continue** will link your new Application Insights resource to your app service, doing so will also **trigger a restart of your app service**.
- ![Instrument your web app.](./media/azure-web-apps/create-resource-01.png)
+ >[!div class="mx-imgBorder"]
+ >![Instrument your web app.](./media/azure-web-apps/ai-create-new.png)
-2. After specifying which resource to use, you can configure the Java agent. The full [set of configurations](./java-standalone-config.md) is available, you just need to paste a valid json file. Exclude the connection string and any configurations that are in preview - you will be able to add those as they become generally available.
+2. This step is optional. After specifying which resource to use, you can configure the Java agent. If you do not configure the Java agent, default configurations apply. The full [set of configurations](./java-standalone-config.md) is available; you just need to paste a valid JSON file. Exclude the connection string and any configurations that are in preview; you will be able to add those as they become generally available.
> [!div class="mx-imgBorder"] > ![Choose options per platform.](./media/azure-web-apps/create-app-service-ai.png)
$newAppSettings["ApplicationInsightsAgent_EXTENSION_VERSION"] = "~2"; # enable t
$app = Set-AzWebApp -AppSettings $newAppSettings -ResourceGroupName $app.ResourceGroup -Name $app.Name -ErrorAction Stop ```
-## Upgrade monitoring extension/agent
+## Upgrade monitoring extension/agent - .NET
### Upgrading from versions 2.8.9 and up
azure-monitor Customer Managed Keys https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/customer-managed-keys.md
N/A
When using REST, the response initially returns an HTTP status code 202 (Accepted) and header with *Azure-AsyncOperation* property: ```json
-"Azure-AsyncOperation": "https://management.azure.com/subscriptions/subscription-id/providers/Microsoft.OperationalInsights/locations/region-name/operationStatuses/operation-id?api-version=2020-08-01"
+"Azure-AsyncOperation": "https://management.azure.com/subscriptions/subscription-id/providers/Microsoft.OperationalInsights/locations/region-name/operationStatuses/operation-id?api-version=2021-06-01"
``` You can check the status of the asynchronous operation by sending a GET request to the endpoint in *Azure-AsyncOperation* header: ```rst
-GET https://management.azure.com/subscriptions/subscription-id/providers/microsoft.operationalInsights/locations/region-name/operationstatuses/operation-id?api-version=2020-08-01
+GET https://management.azure.com/subscriptions/subscription-id/providers/microsoft.operationalInsights/locations/region-name/operationstatuses/operation-id?api-version=2021-06-01
Authorization: Bearer <token> ```
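A minimal PowerShell sketch of that polling loop, assuming `$token` and the *Azure-AsyncOperation* URL were captured from the earlier response (status values follow the standard Azure Resource Manager pattern):

```powershell
# Poll the Azure-AsyncOperation endpoint until the operation leaves the InProgress state.
$headers = @{ Authorization = "Bearer $token" }

do {
    $operation = Invoke-RestMethod -Method Get -Uri $asyncOperationUrl -Headers $headers
    Write-Output "Operation status: $($operation.status)"
    if ($operation.status -eq "InProgress") { Start-Sleep -Seconds 30 }
} while ($operation.status -eq "InProgress")
```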
Update-AzOperationalInsightsCluster -ResourceGroupName "resource-group-name" -Cl
# [REST](#tab/rest) ```rst
-PATCH https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/cluster-name?api-version=2020-08-01
+PATCH https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/cluster-name?api-version=2021-06-01
Authorization: Bearer <token> Content-type: application/json
Content-type: application/json
**Response**
-It takes the propagation of the key a few minutes to complete. You can check the update state in two ways:
+Propagation of the key takes a while to complete. You can check the update state in two ways:
1. Copy the Azure-AsyncOperation URL value from the response and follow the [asynchronous operations status check](#asynchronous-operations-and-status-check). 2. Send a GET request on the cluster and look at the *KeyVaultProperties* properties. Your recently updated key should return in the response.
A response to GET request should look like this when the key update is complete:
"identity": { "type": "SystemAssigned", "tenantId": "tenant-id",
- "principalId": "principle-id"
- },
+ "principalId": "principal-id"
+ },
"sku": {
- "name": "capacityReservation",
- "capacity": 500,
- "lastSkuUpdate": "Sun, 22 Mar 2020 15:39:29 GMT"
- },
+ "name": "capacityreservation",
+ "capacity": 500
+ },
"properties": { "keyVaultProperties": { "keyVaultUri": "https://key-vault-name.vault.azure.net",
A response to GET request should look like this when the key update is complete:
"keyVersion": "current-version" }, "provisioningState": "Succeeded",
- "billingType": "cluster",
- "clusterId": "cluster-id"
+ "clusterId": "cluster-id",
+ "billingType": "Cluster",
+ "lastModifiedDate": "last-modified-date",
+ "createdDate": "created-date",
+ "isDoubleEncryptionEnabled": false,
+ "isAvailabilityZonesEnabled": false,
+ "capacityReservationProperties": {
+ "lastSkuUpdate": "last-sku-modified-date",
+ "minCapacity": 500
+ }
}, "id": "/subscriptions/subscription-id/resourceGroups/resource-group-name/providers/Microsoft.OperationalInsights/clusters/cluster-name", "name": "cluster-name", "type": "Microsoft.OperationalInsights/clusters",
- "location": "region-name"
+ "location": "cluster-region"
} ```
The cluster's storage periodically checks your Key Vault to attempt to unwrap th
## Key rotation Key rotation has two modes: -- Auto-rotation - when you you update your cluster with ```"keyVaultProperties"``` but omit ```"keyVersion"``` property, or set it to ```""```, storage will autoamatically use the latest versions.
+- Auto-rotation - when you update your cluster with ```"keyVaultProperties"``` but omit the ```"keyVersion"``` property, or set it to ```""```, storage will automatically use the latest key version.
- Explicit key version update - when you update your cluster and provide a key version in the ```"keyVersion"``` property, any new key versions require an explicit ```"keyVaultProperties"``` update in the cluster; see [Update cluster with Key identifier details](#update-cluster-with-key-identifier-details). If you generate a new key version in Key Vault but don't update it in the cluster, the Log Analytics cluster storage keeps using your previous key. If you disable or delete your old key before updating the new key in the cluster, you get into a [key revocation](#key-revocation) state. All your data remains accessible after the key rotation operation, because data is always encrypted with the Account Encryption Key (AEK), while the AEK is now encrypted with your new Key Encryption Key (KEK) version in Key Vault.
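For example, with the PowerShell cmdlet shown earlier you can opt in to auto-rotation by passing an empty key version. A sketch; the resource names are placeholders:

```powershell
# Omitting the key version (empty string) tells storage to use the latest enabled
# version of the key automatically whenever it is rotated in Key Vault.
Update-AzOperationalInsightsCluster -ResourceGroupName "resource-group-name" `
    -ClusterName "cluster-name" `
    -KeyVaultUri "https://key-vault-name.vault.azure.net" `
    -KeyName "key-name" `
    -KeyVersion ""
```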
New-AzOperationalInsightsLinkedStorageAccount -ResourceGroupName "resource-group
# [REST](#tab/rest) ```rst
-PUT https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>/linkedStorageAccounts/Query?api-version=2020-08-01
+PUT https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>/linkedStorageAccounts/Query?api-version=2021-06-01
Authorization: Bearer <token> Content-type: application/json
New-AzOperationalInsightsLinkedStorageAccount -ResourceGroupName "resource-group
# [REST](#tab/rest) ```rst
-PUT https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>/linkedStorageAccounts/Alerts?api-version=2020-08-01
+PUT https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>/linkedStorageAccounts/Alerts?api-version=2021-06-01
Authorization: Bearer <token> Content-type: application/json
Customer-Managed key is provided on dedicated cluster and these operations are r
- Lockbox isn't available in China currently. - [Double encryption](../../storage/common/storage-service-encryption.md#doubly-encrypt-data-with-infrastructure-encryption) is configured automatically for clusters created from October 2020 in supported regions. You can verify if your cluster is configured for double encryption by sending a GET request on the cluster and observing that the `isDoubleEncryptionEnabled` value is `true` for clusters with Double encryption enabled.
- - If you create a cluster and get an error "<region-name> doesnΓÇÖt support Double Encryption for clusters.", you can still create the cluster without Double encryption by adding `"properties": {"isDoubleEncryptionEnabled": false}` in the REST request body.
+ - If you create a cluster and get an error "region-name doesn't support Double Encryption for clusters.", you can still create the cluster without Double encryption by adding `"properties": {"isDoubleEncryptionEnabled": false}` in the REST request body.
- The double encryption setting cannot be changed after the cluster has been created.
- - Setting the cluster's `identity` `type` to `None` acks also revokes access to your data, but this approach isn't recommended since you can't revert it without contacting support. The recommended way to revoke access to your data is [key revocation](#key-revocation).
+ - Setting the cluster's `identity` `type` to `None` also revokes access to your data, but this approach isn't recommended since you can't revert it without contacting support. The recommended way to revoke access to your data is [key revocation](#key-revocation).
- You can't use Customer-managed key with User-assigned managed identity if your Key Vault is in Private-Link (vNet). You can use System-assigned managed identity in this scenario.
azure-monitor Data Collector Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/data-collector-api.md
Title: Azure Monitor HTTP Data Collector API | Microsoft Docs
-description: You can use the Azure Monitor HTTP Data Collector API to add POST JSON data to a Log Analytics workspace from any client that can call the REST API. This article describes how to use the API, and has examples of how to publish data by using different programming languages.
+description: You can use the Azure Monitor HTTP Data Collector API to post JSON data to a Log Analytics workspace from any client that can call the REST API. This article describes how to use the API, and it has examples of how to publish data by using various programming languages.
Last updated 07/14/2020
-# Send log data to Azure Monitor with the HTTP Data Collector API (public preview)
-This article shows you how to use the HTTP Data Collector API to send log data to Azure Monitor from a REST API client. It describes how to format data collected by your script or application, include it in a request, and have that request authorized by Azure Monitor. Examples are provided for PowerShell, C#, and Python.
+# Send log data to Azure Monitor by using the HTTP Data Collector API (preview)
+This article shows you how to use the HTTP Data Collector API to send log data to Azure Monitor from a REST API client. It describes how to format data that's collected by your script or application, include it in a request, and have that request authorized by Azure Monitor. We provide examples for Azure PowerShell, C#, and Python.
> [!NOTE] > The Azure Monitor HTTP Data Collector API is in public preview. ## Concepts
-You can use the HTTP Data Collector API to send log data to a Log Analytics workspace in Azure Monitor from any client that can call a REST API. This might be a runbook in Azure Automation that collects management data from Azure or another cloud, or it might be an alternate management system that uses Azure Monitor to consolidate and analyze log data.
-
-All data in the Log Analytics workspace is stored as a record with a particular record type. You format your data to send to the HTTP Data Collector API as multiple records in JSON. When you submit the data, an individual record is created in the repository for each record in the request payload.
--
-![HTTP Data Collector overview](media/data-collector-api/overview.png)
+You can use the HTTP Data Collector API to send log data to a Log Analytics workspace in Azure Monitor from any client that can call a REST API. The client might be a runbook in Azure Automation that collects management data from Azure or another cloud, or it might be an alternative management system that uses Azure Monitor to consolidate and analyze log data.
+All data in the Log Analytics workspace is stored as a record with a particular record type. You format your data to send to the HTTP Data Collector API as multiple records in JavaScript Object Notation (JSON). When you submit the data, an individual record is created in the repository for each record in the request payload.
+![Screenshot illustrating the HTTP Data Collector overview.](media/data-collector-api/overview.png)
## Create a request
-To use the HTTP Data Collector API, you create a POST request that includes the data to send in JavaScript Object Notation (JSON). The next three tables list the attributes that are required for each request. We describe each attribute in more detail later in the article.
+To use the HTTP Data Collector API, you create a POST request that includes the data to send in JSON. The next three tables list the attributes that are required for each request. We describe each attribute in more detail later in the article.
### Request URI | Attribute | Property |
To use the HTTP Data Collector API, you create a POST request that includes the
| Method |POST | | URI |https://\<CustomerId\>.ods.opinsights.azure.com/api/logs?api-version=2016-04-01 | | Content type |application/json |
+| | |
### Request URI parameters | Parameter | Description | |: |: | | CustomerID |The unique identifier for the Log Analytics workspace. | | Resource |The API resource name: /api/logs. |
-| API Version |The version of the API to use with this request. Currently, it's 2016-04-01. |
+| API Version |The version of the API to use with this request. Currently, the version is 2016-04-01. |
+| | |
### Request headers | Header | Description | |: |: | | Authorization |The authorization signature. Later in the article, you can read about how to create an HMAC-SHA256 header. |
-| Log-Type |Specify the record type of the data that is being submitted. Can only contain letters, numbers, and underscore (_), and may not exceed 100 characters. |
+| Log-Type |Specify the record type of the data that's being submitted. It can contain only letters, numbers, and the underscore (_) character, and it can't exceed 100 characters. |
| x-ms-date |The date that the request was processed, in RFC 7234 format. |
-| x-ms-AzureResourceId | Resource ID of the Azure resource the data should be associated with. This populates the [_ResourceId](./log-standard-columns.md#_resourceid) property and allows the data to be included in [resource-context](./design-logs-deployment.md#access-mode) queries. If this field isn't specified, the data will not be included in resource-context queries. |
-| time-generated-field | The name of a field in the data that contains the timestamp of the data item. If you specify a field then its contents are used for **TimeGenerated**. If this field isnΓÇÖt specified, the default for **TimeGenerated** is the time that the message is ingested. The contents of the message field should follow the ISO 8601 format YYYY-MM-DDThh:mm:ssZ. |
+| x-ms-AzureResourceId | The resource ID of the Azure resource that the data should be associated with. It populates the [_ResourceId](./log-standard-columns.md#_resourceid) property and allows the data to be included in [resource-context](./design-logs-deployment.md#access-mode) queries. If this field isn't specified, the data won't be included in resource-context queries. |
+| time-generated-field | The name of a field in the data that contains the timestamp of the data item. If you specify a field, its contents are used for **TimeGenerated**. If you don't specify this field, the default for **TimeGenerated** is the time that the message is ingested. The contents of the message field should follow the ISO 8601 format YYYY-MM-DDThh:mm:ssZ. |
+| | |
## Authorization
-Any request to the Azure Monitor HTTP Data Collector API must include an authorization header. To authenticate a request, you must sign the request with either the primary or the secondary key for the workspace that is making the request. Then, pass that signature as part of the request.
+Any request to the Azure Monitor HTTP Data Collector API must include an authorization header. To authenticate a request, sign the request with either the primary or the secondary key for the workspace that's making the request. Then, pass that signature as part of the request.
Here's the format for the authorization header:
Here's the format for the authorization header:
Authorization: SharedKey <WorkspaceID>:<Signature> ```
-*WorkspaceID* is the unique identifier for the Log Analytics workspace. *Signature* is a [Hash-based Message Authentication Code (HMAC)](/dotnet/api/system.security.cryptography.hmacsha256) that is constructed from the request and then computed by using the [SHA256 algorithm](/dotnet/api/system.security.cryptography.sha256). Then, you encode it by using Base64 encoding.
+*WorkspaceID* is the unique identifier for the Log Analytics workspace. *Signature* is a [Hash-based Message Authentication Code (HMAC)](/dotnet/api/system.security.cryptography.hmacsha256) that's constructed from the request and then computed by using the [SHA256 algorithm](/dotnet/api/system.security.cryptography.sha256). Then, you encode it by using Base64 encoding.
Use this format to encode the **SharedKey** signature string:
Signature=Base64(HMAC-SHA256(UTF8(StringToSign)))
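A condensed PowerShell sketch of that construction, mirroring the full samples referenced below (the string to sign concatenates the HTTP method, content length, content type, the x-ms-date header, and the resource path):

```powershell
# Build the SharedKey authorization value: Base64(HMAC-SHA256(UTF8(StringToSign))).
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders     = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource

    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes    = [Convert]::FromBase64String($sharedKey)

    $hmac     = New-Object System.Security.Cryptography.HMACSHA256
    $hmac.Key = $keyBytes
    $encodedHash = [Convert]::ToBase64String($hmac.ComputeHash($bytesToHash))

    return 'SharedKey {0}:{1}' -f $customerId, $encodedHash
}
```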
The samples in the next sections have sample code to help you create an authorization header. ## Request body
-The body of the message must be in JSON. It must include one or more records with the property name and value pairs in the following format. The property name can only contain letters, numbers, and underscore (_).
+The body of the message must be in JSON. It must include one or more records with the property name and value pairs in the following format. The property name can contain only letters, numbers, and the underscore (_) character.
```json [
You define a custom record type when you submit data through the Azure Monitor H
Each request to the Data Collector API must include a **Log-Type** header with the name for the record type. The suffix **_CL** is automatically appended to the name you enter to distinguish it from other log types as a custom log. For example, if you enter the name **MyNewRecordType**, Azure Monitor creates a record with the type **MyNewRecordType_CL**. This helps ensure that there are no conflicts between user-created type names and those shipped in current or future Microsoft solutions.
-To identify a property's data type, Azure Monitor adds a suffix to the property name. If a property contains a null value, the property is not included in that record. This table lists the property data type and corresponding suffix:
+To identify a property's data type, Azure Monitor adds a suffix to the property name. If a property contains a null value, the property isn't included in that record. This table lists the property data type and corresponding suffix:
| Property data type | Suffix | |: |: |
To identify a property's data type, Azure Monitor adds a suffix to the property
| Double |_d | | Date/time |_t | | GUID (stored as a string) |_g |
+| | |
> [!NOTE]
-> String values that appear to be GUIDs will be given the _g suffix and formatted as a GUID, even if the incoming value doesn't include dashes. For example, both "8145d822-13a7-44ad-859c-36f31a84f6dd" and "8145d82213a744ad859c36f31a84f6dd" will be stored as "8145d822-13a7-44ad-859c-36f31a84f6dd". The only differences between this and another string is the _g in the name and the insertion of dashes if they aren't provided in the input.
+> String values that appear to be GUIDs are given the _g suffix and formatted as a GUID, even if the incoming value doesn't include dashes. For example, both "8145d822-13a7-44ad-859c-36f31a84f6dd" and "8145d82213a744ad859c36f31a84f6dd" are stored as "8145d822-13a7-44ad-859c-36f31a84f6dd". The only differences between this and another string are the _g in the name and the insertion of dashes if they aren't provided in the input.
The data type that Azure Monitor uses for each property depends on whether the record type for the new record already exists.
-* If the record type does not exist, Azure Monitor creates a new one using the JSON type inference to determine the data type for each property for the new record.
+* If the record type doesn't exist, Azure Monitor creates a new one by using the JSON type inference to determine the data type for each property for the new record.
* If the record type does exist, Azure Monitor attempts to create a new record based on existing properties. If the data type for a property in the new record doesn't match and can't be converted to the existing type, or if the record includes a property that doesn't exist, Azure Monitor creates a new property that has the relevant suffix.
-For example, this submission entry would create a record with three properties, **number_d**, **boolean_b**, and **string_s**:
+For example, the following submission entry would create a record with three properties, **number_d**, **boolean_b**, and **string_s**:
-![Sample record 1](media/data-collector-api/record-01.png)
+![Screenshot of sample record 1.](media/data-collector-api/record-01.png)
-If you then submitted this next entry, with all values formatted as strings, the properties would not change. These values can be converted to existing data types:
+If you were to submit this next entry, with all values formatted as strings, the properties wouldn't change. You can convert the values to existing data types.
-![Sample record 2](media/data-collector-api/record-02.png)
+![Screenshot of sample record 2.](media/data-collector-api/record-02.png)
-But, if you then made this next submission, Azure Monitor would create the new properties **boolean_d** and **string_d**. These values can't be converted:
+But, if you then make this next submission, Azure Monitor would create the new properties **boolean_d** and **string_d**. You can't convert these values.
-![Sample record 3](media/data-collector-api/record-03.png)
+![Screenshot of sample record 3.](media/data-collector-api/record-03.png)
-If you then submitted the following entry, before the record type was created, Azure Monitor would create a record with three properties, **number_s**, **boolean_s**, and **string_s**. In this entry, each of the initial values is formatted as a string:
+If you then submit the following entry, before the record type is created, Azure Monitor would create a record with three properties, **number_s**, **boolean_s**, and **string_s**. In this entry, each of the initial values is formatted as a string:
-![Sample record 4](media/data-collector-api/record-04.png)
+![Screenshot of sample record 4.](media/data-collector-api/record-04.png)
## Reserved properties
-The following properties are reserved and should not be used in a custom record type. You will receive an error if your payload includes any of these property names.
+The following properties are reserved and shouldn't be used in a custom record type. You'll receive an error if your payload includes any of these property names:
- tenant ## Data limits
-There are some constraints around the data posted to the Azure Monitor Data collection API.
+The data posted to the Azure Monitor Data collection API is subject to certain constraints:
-* Maximum of 30 MB per post to Azure Monitor Data Collector API. This is a size limit for a single post. If the data from a single post that exceeds 30 MB, you should split the data up to smaller sized chunks and send them concurrently.
-* Maximum of 32 KB limit for field values. If the field value is greater than 32 KB, the data will be truncated.
-* Recommended maximum number of fields for a given type is 50. This is a practical limit from a usability and search experience perspective.
-* A table in a Log Analytics workspace only supports up to 500 columns (referred to as a field in this article).
-* The maximum number of characters for the column name is 500.
+* Maximum of 30 MB per post to Azure Monitor Data Collector API. This is a size limit for a single post. If the data from a single post exceeds 30 MB, you should split the data into smaller sized chunks and send them concurrently.
+* Maximum of 32 KB for field values. If the field value is greater than 32 KB, the data will be truncated.
+* Recommended maximum of 50 fields for a given type. This is a practical limit from a usability and search experience perspective.
+* Tables in Log Analytics workspaces support only up to 500 columns (referred to as fields in this article).
+* Maximum of 50 characters for column names.
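If a payload might exceed the 30-MB limit, one simple approach is to batch records before posting. A rough sketch; the `Send-LogAnalyticsData` helper is hypothetical and stands in for the POST shown in the samples later in this article:

```powershell
# Rough sketch: group records into batches that stay under ~30 MB when serialized as JSON.
# Note: a single record larger than the limit would still overflow; split such records separately.
$maxBytes = 30MB
$batch    = @()

foreach ($record in $records) {
    $candidateJson = ConvertTo-Json -InputObject (@($batch) + $record) -Depth 10
    $candidateSize = [Text.Encoding]::UTF8.GetBytes($candidateJson).Length
    if ($candidateSize -gt $maxBytes -and $batch.Count -gt 0) {
        Send-LogAnalyticsData -Records $batch   # hypothetical helper wrapping the POST request
        $batch = @()
    }
    $batch += $record
}
if ($batch.Count -gt 0) { Send-LogAnalyticsData -Records $batch }
```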
## Return codes
-The HTTP status code 200 means that the request has been received for processing. This indicates that the operation completed successfully.
+The HTTP status code 200 means that the request has been received for processing. This indicates that the operation finished successfully.
-This table lists the complete set of status codes that the service might return:
+The complete set of status codes that the service might return is listed in the following table:
| Code | Status | Error code | Description | |: |: |: |: | | 200 |OK | |The request was successfully accepted. | | 400 |Bad request |InactiveCustomer |The workspace has been closed. |
-| 400 |Bad request |InvalidApiVersion |The API version that you specified was not recognized by the service. |
-| 400 |Bad request |InvalidCustomerId |The workspace ID specified is invalid. |
-| 400 |Bad request |InvalidDataFormat |Invalid JSON was submitted. The response body might contain more information about how to resolve the error. |
-| 400 |Bad request |InvalidLogType |The log type specified contained special characters or numerics. |
+| 400 |Bad request |InvalidApiVersion |The API version that you specified wasn't recognized by the service. |
+| 400 |Bad request |InvalidCustomerId |The specified workspace ID is invalid. |
+| 400 |Bad request |InvalidDataFormat |An invalid JSON was submitted. The response body might contain more information about how to resolve the error. |
+| 400 |Bad request |InvalidLogType |The specified log type contained special characters or numerics. |
| 400 |Bad request |MissingApiVersion |The API version wasn't specified. | | 400 |Bad request |MissingContentType |The content type wasn't specified. | | 400 |Bad request |MissingLogType |The required value log type wasn't specified. |
-| 400 |Bad request |UnsupportedContentType |The content type was not set to **application/json**. |
+| 400 |Bad request |UnsupportedContentType |The content type wasn't set to **application/json**. |
| 403 |Forbidden |InvalidAuthorization |The service failed to authenticate the request. Verify that the workspace ID and connection key are valid. |
-| 404 |Not Found | | Either the URL provided is incorrect, or the request is too large. |
+| 404 |Not Found | | Either the provided URL is incorrect or the request is too large. |
| 429 |Too Many Requests | | The service is experiencing a high volume of data from your account. Please retry the request later. | | 500 |Internal Server Error |UnspecifiedError |The service encountered an internal error. Please retry the request. | | 503 |Service Unavailable |ServiceUnavailable |The service currently is unavailable to receive requests. Please retry your request. |
+| | |
## Query data
-To query data submitted by the Azure Monitor HTTP Data Collector API, search for records with **Type** that is equal to the **LogType** value that you specified, appended with **_CL**. For example, if you used **MyCustomLog**, then you'd return all records with `MyCustomLog_CL`.
+To query data submitted by the Azure Monitor HTTP Data Collector API, search for records whose **Type** is equal to the **LogType** value that you specified and appended with **_CL**. For example, if you used **MyCustomLog**, you would return all records with `MyCustomLog_CL`.
## Sample requests
-In the next sections, you'll find samples of how to submit data to the Azure Monitor HTTP Data Collector API by using different programming languages.
+In the next sections, you'll find samples that demonstrate how to submit data to the Azure Monitor HTTP Data Collector API by using various programming languages.
-For each sample, do these steps to set the variables for the authorization header:
+For each sample, set the variables for the authorization header by doing the following:
1. In the Azure portal, locate your Log Analytics workspace. 2. Select **Agents management**.
-2. To the right of **Workspace ID**, select the copy icon, and then paste the ID as the value of the **Customer ID** variable.
-3. To the right of **Primary Key**, select the copy icon, and then paste the ID as the value of the **Shared Key** variable.
+2. To the right of **Workspace ID**, select the **Copy** icon, and then paste the ID as the value of the **Customer ID** variable.
+3. To the right of **Primary Key**, select the **Copy** icon, and then paste the ID as the value of the **Shared Key** variable.
Alternatively, you can change the variables for the log type and JSON data.
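Putting the pieces together, a minimal end-to-end post from PowerShell might look like the following sketch. It reuses the `Build-Signature` helper sketched in the authorization section above; the workspace ID, key, and payload are placeholders, and the URI format comes from the request URI table earlier in this article.

```powershell
# Placeholders copied from Agents management, as described in the steps above.
$customerId = "<workspace-id>"
$sharedKey  = "<primary-key>"
$logType    = "MyCustomLog"

# Example payload: one record with a string, a number, and a boolean property.
$json = '[{ "StringValue": "hello", "NumberValue": 42, "BooleanValue": true }]'

$rfc1123date   = [DateTime]::UtcNow.ToString("r")
$contentLength = [Text.Encoding]::UTF8.GetBytes($json).Length
$signature     = Build-Signature $customerId $sharedKey $rfc1123date $contentLength "POST" "application/json" "/api/logs"

$uri     = "https://$customerId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
$headers = @{
    "Authorization" = $signature
    "Log-Type"      = $logType
    "x-ms-date"     = $rfc1123date
}

Invoke-WebRequest -Uri $uri -Method Post -ContentType "application/json" -Headers $headers -Body $json -UseBasicParsing
```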
public class ApiExample {
## Alternatives and considerations
-While the Data Collector API should cover most of your needs to collect free-form data into Azure Logs, there are instances where an alternative might be required to overcome some of the limitations of the API. All your options are as follows, major considerations included:
+
+Although the Data Collector API should cover most of your needs as you collect free-form data into Azure Logs, you might require an alternative approach to overcome some of the limitations of the API. Your options, including major considerations, are listed in the following table:
| Alternative | Description | Best suited for | ||||
-| [Custom events](../app/api-custom-events-metrics.md?toc=%2Fazure%2Fazure-monitor%2Ftoc.json#properties): Native SDK-based ingestion in Application Insights | Application Insights, typically instrumented through an SDK within your application, offers the ability for you to send custom data through Custom Events. | <ul><li> Data that is generated within your application, but not picked up by SDK through one of the default data types (requests, dependencies, exceptions, and so on).</li><li> Data that is most often correlated to other application data in Application Insights </li></ul> |
-| Data Collector API in Azure Monitor Logs | The Data Collector API in Azure Monitor Logs is a completely open-ended way to ingest data. Any data formatted in a JSON object can be sent here. Once sent, it will be processed, and available in Logs to be correlated with other data in Logs or against other Application Insights data. <br/><br/> It is fairly easy to upload the data as files to an Azure Blob blob, from where these files will be processed and uploaded to Log Analytics. Please see [this](./create-pipeline-datacollector-api.md) article for a sample implementation of such a pipeline. | <ul><li> Data that is not necessarily generated within an application instrumented within Application Insights.</li><li> Examples include lookup and fact tables, reference data, pre-aggregated statistics, and so on. </li><li> Intended for data that will be cross-referenced against other Azure Monitor data (Application Insights, other Logs data types, Security Center, Container insights/VMs, and so on). </li></ul> |
-| [Azure Data Explorer](/azure/data-explorer/ingest-data-overview) | Azure Data Explorer (ADX) is the data platform that powers Application Insights Analytics and Azure Monitor Logs. Now Generally Available ("GA"), using the data platform in its raw form provides you complete flexibility (but requiring the overhead of management) over the cluster (Kubernetes RBAC, retention rate, schema, and so on). ADX provides many [ingestion options](/azure/data-explorer/ingest-data-overview#ingestion-methods) including [CSV, TSV, and JSON](/azure/kusto/management/mappings) files. | <ul><li> Data that will not be correlated to any other data under Application Insights or Logs. </li><li> Data requiring advanced ingestion or processing capabilities not today available in Azure Monitor Logs. </li></ul> |
+| [Custom events](../app/api-custom-events-metrics.md?toc=%2Fazure%2Fazure-monitor%2Ftoc.json#properties): Native SDK-based ingestion in Application Insights | Application Insights, usually instrumented through an SDK within your application, gives you the ability to send custom data through Custom Events. | <ul><li> Data that's generated within your application, but not picked up by the SDK through one of the default data types (requests, dependencies, exceptions, and so on).</li><li> Data that's most often correlated with other application data in Application Insights. </li></ul> |
+| Data Collector API in Azure Monitor Logs | The Data Collector API in Azure Monitor Logs is a completely open-ended way to ingest data. Any data that's formatted in a JSON object can be sent here. After it's sent, it's processed and made available in Monitor Logs to be correlated with other data in Monitor Logs or against other Application Insights data. <br/><br/> It's fairly easy to upload the data as files to an Azure Blob Storage blob, where the files will be processed and then uploaded to Log Analytics. For a sample implementation, see [Create a data pipeline with the Data Collector API](./create-pipeline-datacollector-api.md). | <ul><li> Data that isn't necessarily generated within an application that's instrumented within Application Insights.<br>Examples include lookup and fact tables, reference data, pre-aggregated statistics, and so on. </li><li> Data that will be cross-referenced against other Azure Monitor data (Application Insights, other Monitor Logs data types, Security Center, Container insights and virtual machines, and so on). </li></ul> |
+| [Azure Data Explorer](/azure/data-explorer/ingest-data-overview) | Azure Data Explorer, now generally available to the public, is the data platform that powers Application Insights Analytics and Azure Monitor Logs. By using the data platform in its raw form, you have complete flexibility (but require the overhead of management) over the cluster (Kubernetes role-based access control (RBAC), retention rate, schema, and so on). Azure Data Explorer provides many [ingestion options](/azure/data-explorer/ingest-data-overview#ingestion-methods), including [CSV, TSV, and JSON](/azure/kusto/management/mappings) files. | <ul><li> Data that won't be correlated with any other data under Application Insights or Monitor Logs. </li><li> Data that requires advanced ingestion or processing capabilities that aren't available today in Azure Monitor Logs. </li></ul> |
## Next steps - Use the [Log Search API](./log-query-overview.md) to retrieve data from the Log Analytics workspace. -- Learn more about how [create a data pipeline with the Data Collector API](create-pipeline-datacollector-api.md) using Logic Apps workflow to Azure Monitor.
+- Learn more about how to [create a data pipeline with the Data Collector API](create-pipeline-datacollector-api.md) by using a Logic Apps workflow to send data to Azure Monitor.
azure-monitor Data Security https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/data-security.md
Title: Log Analytics data security | Microsoft Docs
-description: Learn about how Log Analytics protects your privacy and secures your data.
+ Title: Azure Monitor Logs data security | Microsoft Docs
+description: Learn about how [Azure Monitor Logs protects your privacy and secures your data.
Last updated 11/11/2020
-# Log Analytics data security
-This document is intended to provide information specific to Log Analytics, which is a feature of Azure Monitor, to supplement the information on [Azure Trust Center](https://www.microsoft.com/en-us/trust-center?rtc=1).
+# Azure Monitor Logs data security
+This document is intended to provide information specific to [Azure Monitor Logs](../logs/data-platform-logs.md) to supplement the information on [Azure Trust Center](https://www.microsoft.com/en-us/trust-center?rtc=1).
-This article explains how data is collected, processed, and secured by Log Analytics. You can use agents to connect to the web service, use System Center Operations Manager to collect operational data, or retrieve data from Azure diagnostics for use by Log Analytics.
+This article explains how log data is collected, processed, and secured by Azure Monitor. You can use agents to connect to the web service, use System Center Operations Manager to collect operational data, or retrieve data from Azure diagnostics for use by Azure Monitor.
-The Log Analytics service manages your cloud-based data securely by using the following methods:
+Azure Monitor Logs manages your cloud-based data securely by using the following methods:
* Data segregation * Data retention
The Log Analytics service manages your cloud-based data securely by using the fo
* Compliance * Security standards certifications
-You can also use additional security features built into Azure Monitor and Log Analytics. These features require more administrator management.
+You can also use additional security features built into Azure Monitor. These features require more administrator management.
* Customer-managed (security) keys * Azure Private Storage * Private Link networking
Contact us with any questions, suggestions, or issues about any of the following
## Sending data securely using TLS 1.2
-To ensure the security of data in transit to Log Analytics, we strongly encourage you to configure the agent to use at least Transport Layer Security (TLS) 1.2. Older versions of TLS/Secure Sockets Layer (SSL) have been found to be vulnerable and while they still currently work to allow backwards compatibility, they are **not recommended**, and the industry is quickly moving to abandon support for these older protocols.
+To ensure the security of data in transit to Azure Monitor, we strongly encourage you to configure the agent to use at least Transport Layer Security (TLS) 1.2. Older versions of TLS/Secure Sockets Layer (SSL) have been found to be vulnerable and while they still currently work to allow backwards compatibility, they are **not recommended**, and the industry is quickly moving to abandon support for these older protocols.
-The [PCI Security Standards Council](https://www.pcisecuritystandards.org/) has set a [deadline of June 30th, 2018](https://www.pcisecuritystandards.org/pdfs/PCI_SSC_Migrating_from_SSL_and_Early_TLS_Resource_Guide.pdf) to disable older versions of TLS/SSL and upgrade to more secure protocols. Once Azure drops legacy support, if your agents cannot communicate over at least TLS 1.2 you would not be able to send data to Log Analytics.
+The [PCI Security Standards Council](https://www.pcisecuritystandards.org/) has set a [deadline of June 30th, 2018](https://www.pcisecuritystandards.org/pdfs/PCI_SSC_Migrating_from_SSL_and_Early_TLS_Resource_Guide.pdf) to disable older versions of TLS/SSL and upgrade to more secure protocols. Once Azure drops legacy support, if your agents cannot communicate over at least TLS 1.2, you will not be able to send data to Azure Monitor Logs.
We do not recommend explicitly setting your agent to only use TLS 1.2 unless absolutely necessary, as it can break platform level security features that allow you to automatically detect and take advantage of newer more secure protocols as they become available, such as TLS 1.3.
We do not recommend explicitly setting your agent to only use TLS 1.2 unless abs
| Windows 7 SP1 and Windows Server 2008 R2 SP1 | Supported, but not enabled by default. | See the [Transport Layer Security (TLS) registry settings](/windows-server/security/tls/tls-registry-settings) page for details on how to enable. | ## Data segregation
-After your data is ingested by the Log Analytics service, the data is kept logically separate on each component throughout the service. All data is tagged per workspace. This tagging persists throughout the data lifecycle, and it is enforced at each layer of the service. Your data is stored in a dedicated database in the storage cluster in the region you have selected.
+After your data is ingested by Azure Monitor, the data is kept logically separate on each component throughout the service. All data is tagged per workspace. This tagging persists throughout the data lifecycle, and it is enforced at each layer of the service. Your data is stored in a dedicated database in the storage cluster in the region you have selected.
## Data retention Indexed log search data is stored and retained according to your pricing plan. For more information, see [Log Analytics Pricing](https://azure.microsoft.com/pricing/details/log-analytics/).
The following table shows examples of data types:
| State |StateChangeEventId, StateId, NewHealthState, OldHealthState, Context, TimeGenerated, TimeAdded, StateId2, BaseManagedEntityId, MonitorId, HealthState, LastModified, LastGreenAlertGenerated, DatabaseTimeModified | ## Physical security
-The Log Analytics service is managed by Microsoft personnel and all activities are logged and can be audited. Log Analytics is operated as an Azure Service and meets all Azure Compliance and Security requirements. You can view details about the physical security of Azure assets on page 18 of the [Microsoft Azure Security Overview](https://download.microsoft.com/download/6/0/2/6028B1AE-4AEE-46CE-9187-641DA97FC1EE/Windows%20Azure%20Security%20Overview%20v1.01.pdf). Physical access rights to secure areas are changed within one business day for anyone who no longer has responsibility for the Log Analytics service, including transfer and termination. You can read about the global physical infrastructure we use at [Microsoft Datacenters](https://azure.microsoft.com/global-infrastructure/).
+Azure Monitor is managed by Microsoft personnel and all activities are logged and can be audited. Azure Monitor is operated as an Azure Service and meets all Azure Compliance and Security requirements. You can view details about the physical security of Azure assets on page 18 of the [Microsoft Azure Security Overview](https://download.microsoft.com/download/6/0/2/6028B1AE-4AEE-46CE-9187-641DA97FC1EE/Windows%20Azure%20Security%20Overview%20v1.01.pdf). Physical access rights to secure areas are changed within one business day for anyone who no longer has responsibility for the Azure Monitor service, including transfer and termination. You can read about the global physical infrastructure we use at [Microsoft Datacenters](https://azure.microsoft.com/global-infrastructure/).
## Incident management
-Log Analytics has an incident management process that all Microsoft services adhere to. To summarize, we:
+Azure Monitor has an incident management process that all Microsoft services adhere to. To summarize, we:
* Use a shared responsibility model where a portion of security responsibility belongs to Microsoft and a portion belongs to the customer * Manage Azure security incidents:
While very rare, Microsoft will notify each customer within one day if significa
For more information about how Microsoft responds to security incidents, see [Microsoft Azure Security Response in the Cloud](https://gallery.technet.microsoft.com/Azure-Security-Response-in-dd18c678/file/150826/4/Microsoft%20Azure%20Security%20Response%20in%20the%20cloud.pdf). ## Compliance
-The Log Analytics software development and service team's information security and governance program supports its business requirements and adheres to laws and regulations as described at [Microsoft Azure Trust Center](https://azure.microsoft.com/support/trust-center/) and [Microsoft Trust Center Compliance](https://www.microsoft.com/en-us/trustcenter/compliance/default.aspx). How Log Analytics establishes security requirements, identifies security controls, manages, and monitors risks are also described there. Annually, we review polices, standards, procedures, and guidelines.
+The Azure Monitor software development and service team's information security and governance program supports its business requirements and adheres to laws and regulations as described at [Microsoft Azure Trust Center](https://azure.microsoft.com/support/trust-center/) and [Microsoft Trust Center Compliance](https://www.microsoft.com/en-us/trustcenter/compliance/default.aspx). How Azure Monitor Logs establishes security requirements, identifies security controls, and manages and monitors risks is also described there. Annually, we review policies, standards, procedures, and guidelines.
Each development team member receives formal application security training. Internally, we use a version control system for software development. Each software project is protected by the version control system.
Azure Log Analytics meets the following requirements:
* [HIPAA and HITECH](/compliance/regulatory/offering-hipaa-hitech) for companies that have a HIPAA Business Associate Agreement * Windows Common Engineering Criteria * Microsoft Trustworthy Computing
-* As an Azure service, the components that Log Analytics uses adhere to Azure compliance requirements. You can read more at [Microsoft Trust Center Compliance](https://www.microsoft.com/en-us/trustcenter/compliance/default.aspx).
+* As an Azure service, the components that Azure Monitor uses adhere to Azure compliance requirements. You can read more at [Microsoft Trust Center Compliance](https://www.microsoft.com/en-us/trustcenter/compliance/default.aspx).
> [!NOTE]
-> In some certifications/attestations, Log Analytics is listed under its former name of *Operational Insights*.
+> In some certifications/attestations, Azure Monitor Logs is listed under its former name of *Operational Insights*.
>
>

## Cloud computing security data flow
-The following diagram shows a cloud security architecture as the flow of information from your company and how it is secured as is moves to the Log Analytics service, ultimately seen by you in the Azure portal. More information about each step follows the diagram.
+The following diagram shows a cloud security architecture as the flow of information from your company and how it is secured as it moves to Azure Monitor, ultimately seen by you in the Azure portal. More information about each step follows the diagram.
-![Image of Log Analytics data collection and security](./media/data-security/log-analytics-data-security-diagram.png)
+![Image of Azure Monitor Logs data collection and security](./media/data-security/log-analytics-data-security-diagram.png)
-## 1. Sign up for Log Analytics and collect data
-For your organization to send data to Log Analytics, you configure a Windows or Linux agent running on Azure virtual machines, or on virtual or physical computers in your environment or other cloud provider. If you use Operations Manager, from the management group you configure the Operations Manager agent. Users (which might be you, other individual users, or a group of people) create one or more Log Analytics workspaces, and register agents by using one of the following accounts:
+## 1. Sign up for Azure Monitor and collect data
+For your organization to send data to Azure Monitor Logs, you configure a Windows or Linux agent running on Azure virtual machines, or on virtual or physical computers in your environment or other cloud provider. If you use Operations Manager, from the management group you configure the Operations Manager agent. Users (which might be you, other individual users, or a group of people) create one or more Log Analytics workspaces, and register agents by using one of the following accounts:
* [Organizational ID](../../active-directory/fundamentals/sign-up-organization.md)
* [Microsoft Account - Outlook, Office Live, MSN](https://account.microsoft.com/account)

A Log Analytics workspace is where data is collected, aggregated, analyzed, and presented. A workspace is primarily used as a means to partition data, and each workspace is unique. For example, you might want to have your production data managed with one workspace and your test data managed with another workspace. Workspaces also help an administrator control user access to the data. Each workspace can have multiple user accounts associated with it, and each user account can access multiple Log Analytics workspaces. You create workspaces based on datacenter region.
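Because each user account can be granted access to more than one workspace, a single log query can also span workspaces. The following is a minimal sketch, not part of the original walkthrough, assuming two hypothetical workspaces named *contoso-prod* and *contoso-test* that you have access to:

```Kusto
// Count heartbeat records from two workspaces; the workspace names are hypothetical.
union workspace("contoso-prod").Heartbeat, workspace("contoso-test").Heartbeat
| summarize count() by TenantId   // TenantId identifies the source workspace
```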
-For Operations Manager, the Operations Manager management group establishes a connection with the Log Analytics service. You then configure which agent-managed systems in the management group are allowed to collect and send data to the service. Depending on the solution you have enabled, data from these solutions are either sent directly from an Operations Manager management server to the Log Analytics service, or because of the volume of data collected by the agent-managed system, are sent directly from the agent to the service. For systems not monitored by Operations Manager, each connects securely to the Log Analytics service directly.
+For Operations Manager, the Operations Manager management group establishes a connection with the Azure Monitor service. You then configure which agent-managed systems in the management group are allowed to collect and send data to the service. Depending on the solution you have enabled, data from these solutions is either sent directly from an Operations Manager management server to the Azure Monitor service or, because of the volume of data collected on the agent-managed system, sent directly from the agent to the service. For systems not monitored by Operations Manager, each connects securely to the Azure Monitor service directly.
-All communication between connected systems and the Log Analytics service is encrypted. The TLS (HTTPS) protocol is used for encryption. The Microsoft SDL process is followed to ensure Log Analytics is up-to-date with the most recent advances in cryptographic protocols.
+All communication between connected systems and the Azure Monitor service is encrypted. The TLS (HTTPS) protocol is used for encryption. The Microsoft SDL process is followed to ensure Log Analytics is up-to-date with the most recent advances in cryptographic protocols.
-Each type of agent collects data for Log Analytics. The type of data that is collected is depends on the types of solutions used. You can see a summary of data collection at [Add Log Analytics solutions from the Solutions Gallery](../insights/solutions.md). Additionally, more detailed collection information is available for most solutions. A solution is a bundle of predefined views, log search queries, data collection rules, and processing logic. Only administrators can use Log Analytics to import a solution. After the solution is imported, it is moved to the Operations Manager management servers (if used), and then to any agents that you have chosen. Afterward, the agents collect the data.
+Each type of agent collects log data for Azure Monitor. The type of data that is collected depends on the configuration of your workspace and other features of Azure Monitor.
## 2. Send data from agents
-You register all agent types with an enrollment key and a secure connection is established between the agent and the Log Analytics service using certificate-based authentication and TLS with port 443. Log Analytics uses a secret store to generate and maintain keys. Private keys are rotated every 90 days and are stored in Azure and are managed by the Azure operations who follow strict regulatory and compliance practices.
+You register all agent types with an enrollment key, and a secure connection is established between the agent and the Azure Monitor service by using certificate-based authentication and TLS over port 443. Azure Monitor uses a secret store to generate and maintain keys. Private keys are rotated every 90 days, stored in Azure, and managed by the Azure operations team, which follows strict regulatory and compliance practices.
With Operations Manager, the management group registered with a Log Analytics workspace establishes a secure HTTPS connection with an Operations Manager management server. For Windows or Linux agents running on Azure virtual machines, a read-only storage key is used to read diagnostic events in Azure tables.
-With any agent reporting to an Operations Manager management group that is integrated with Log Analytics, if the management server is unable to communicate with the service for any reason, the collected data is stored locally in a temporary cache on the management server. They try to resend the data every eight minutes for two hours. For data that bypasses the management server and is sent directly to Log Analytics, the behavior is consistent with the Windows agent.
+With any agent reporting to an Operations Manager management group that is integrated with Azure Monitor, if the management server is unable to communicate with the service for any reason, the collected data is stored locally in a temporary cache on the management server. The management server tries to resend the data every eight minutes for two hours. For data that bypasses the management server and is sent directly to Azure Monitor, the behavior is consistent with the Windows agent.
The Windows or management server agent cached data is protected by the operating system's credential store. If the service cannot process the data after two hours, the agents will queue the data. If the queue becomes full, the agent starts dropping data types, starting with performance data. The agent queue limit is a registry key so you can modify it, if necessary. Collected data is compressed and sent to the service, bypassing the Operations Manager management group databases, so it does not add any load to them. After the collected data is sent, it is removed from the cache.

As described above, data from the management server or direct-connected agents is sent over TLS to Microsoft Azure datacenters. Optionally, you can use ExpressRoute to provide additional security for the data. ExpressRoute is a way to directly connect to Azure from your existing WAN network, such as a multi-protocol label switching (MPLS) VPN, provided by a network service provider. For more information, see [ExpressRoute](https://azure.microsoft.com/services/expressroute/).
-## 3. The Log Analytics service receives and processes data
-The Log Analytics service ensures that incoming data is from a trusted source by validating certificates and the data integrity with Azure authentication. The unprocessed raw data is then stored in an Azure Event Hub in the region the data will eventually be stored at rest. The type of data that is stored depends on the types of solutions that were imported and used to collect data. Then, the Log Analytics service processes the raw data and ingests it into the database.
+## 3. The Azure Monitor service receives and processes data
+The Azure Monitor service ensures that incoming data is from a trusted source by validating certificates and the data integrity with Azure authentication. The unprocessed raw data is then stored in an Azure Event Hub in the region the data will eventually be stored at rest. The type of data that is stored depends on the types of solutions that were imported and used to collect data. Then, the Azure Monitor service processes the raw data and ingests it into the database.
The retention period of collected data stored in the database depends on the selected pricing plan. For the *Free* tier, collected data is available for seven days. For the *Paid* tier, collected data is available for 31 days by default, but can be extended to 730 days. Data is stored encrypted at rest in Azure storage, to ensure data confidentiality, and the data is replicated within the local region using locally redundant storage (LRS). The last two weeks of data are also stored in SSD-based cache and this cache is encrypted. Data in database storage cannot be altered once ingested but can be deleted via [*purge* API path](personal-data-mgmt.md#delete). Although data cannot be altered, some certifications require that data is kept immutable and cannot be changed or deleted in storage. Data immutability can be achieved using [data export](logs-data-export.md) to a storage account that is configured as [immutable storage](../../storage/blobs/storage-blob-immutability-policies-manage.md).
-## 4. Use Log Analytics to access the data
-To access your Log Analytics workspace, you sign into the Azure portal using the organizational account or Microsoft account that you set up previously. All traffic between the portal and Log Analytics service is sent over a secure HTTPS channel. When using the portal, a session ID is generated on the user client (web browser) and data is stored in a local cache until the session is terminated. When terminated, the cache is deleted. Client-side cookies, which do not contain personally identifiable information, are not automatically removed. Session cookies are marked HTTPOnly and are secured. After a pre-determined idle period, the Azure portal session is terminated.
+## 4. Use Azure Monitor to access the data
+To access your Log Analytics workspace, you sign into the Azure portal using the organizational account or Microsoft account that you set up previously. All traffic between the portal and Azure Monitor service is sent over a secure HTTPS channel. When using the portal, a session ID is generated on the user client (web browser) and data is stored in a local cache until the session is terminated. When terminated, the cache is deleted. Client-side cookies, which do not contain personally identifiable information, are not automatically removed. Session cookies are marked HTTPOnly and are secured. After a pre-determined idle period, the Azure portal session is terminated.
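After you sign in, you explore the collected data with log queries. As one example, a minimal sketch of a query that checks which computers have reported recently, assuming agents are sending data to the workspace, might look like this:

```Kusto
// Show the most recent heartbeat per computer over the last hour.
Heartbeat
| where TimeGenerated > ago(1h)
| summarize LastHeartbeat = max(TimeGenerated) by Computer
```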
## Additional security features
-You can use these additional security features to further secure your Azure Monitor/Log Analytics environment. These features require more administrator management.
+You can use these additional security features to further secure your Azure Monitor environment. These features require more administrator management.
- [Customer-managed (security) keys](../logs/customer-managed-keys.md) - You can use customer-managed keys to encrypt data sent to your Log Analytics workspaces. It requires use of Azure Key Vault.
-- [Private / customer-managed Storage](./private-storage.md) - Manage your personally encrypted storage account and tell Log Analytics to use it to store monitoring data
+- [Private / customer-managed Storage](./private-storage.md) - Manage your personally encrypted storage account and tell Azure Monitor to use it to store monitoring data
- [Private Link networking](./private-link-security.md) - Azure Private Link allows you to securely link Azure PaaS services (including Azure Monitor) to your virtual network using private endpoints.
- [Azure customer Lockbox](../../security/fundamentals/customer-lockbox-overview.md#supported-services-and-scenarios-in-preview) - Customer Lockbox for Microsoft Azure provides an interface for customers to review and approve or reject customer data access requests. It is used in cases where a Microsoft engineer needs to access customer data during a support request.

## Next steps
-* Learn how to collect data with Log Analytics for your Azure VMs following the [Azure VM quickstart](../vm/quick-collect-azurevm.md).
+* [See the different kinds of data that you can collect in Azure Monitor](../monitor-reference.md).
-* If you are looking to collect data from physical or virtual Windows or Linux computers in your environment, see the [Quickstart for Linux computers](../vm/quick-collect-linux-computer.md) or [Quickstart for Windows computers](../vm/quick-collect-windows-computer.md)
azure-monitor Get Started Queries https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/get-started-queries.md
Last updated 10/24/2019
# Get started with log queries in Azure Monitor

> [!NOTE]
-> You can work through this exercise in your own environment if you are collecting data from at least one virtual machine. If not then use our [Demo environment](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring_Logs/DemoLogsBlade), which includes plenty of sample data. If you already know how to query in KQL, but just need to quickly create useful queries based on resource type(s), see the [saved example queries pane](../logs/queries.md).
+> If you're collecting data from at least one virtual machine, you can work through this exercise in your own environment. For other scenarios, use our [demo environment](https://ms.portal.azure.com/#blade/Microsoft_Azure_Monitoring_Logs/DemoLogsBlade), which includes plenty of sample data.
+>
+> If you already know how to query in Kusto query language, but need to quickly create useful queries based on resource types, see the saved example queries pane in the [Use queries in Azure Monitor Log Analytics](../logs/queries.md) article.
-In this tutorial you will learn to write log queries in Azure Monitor. It will teach you how to:
+In this tutorial, you'll learn to write log queries in Azure Monitor. The article shows you how to:
-- Understand query structure
-- Sort query results
-- Filter query results
-- Specify a time range
-- Select which fields to include in the results
-- Define and use custom fields
-- Aggregate and group results
+- Understand query structure.
+- Sort query results.
+- Filter query results.
+- Specify a time range.
+- Select which fields to include in the results.
+- Define and use custom fields.
+- Aggregate and group results.
-For a tutorial on using Log Analytics in the Azure portal, see [Get started with Azure Monitor Log Analytics](./log-analytics-tutorial.md).<br>
-For more details on log queries in Azure Monitor, see [Overview of log queries in Azure Monitor](../logs/log-query-overview.md).
+For a tutorial on using Log Analytics in the Azure portal, see [Get started with Azure Monitor Log Analytics](./log-analytics-tutorial.md).
-Follow along with a video version of this tutorial below:
+For more information about log queries in Azure Monitor, see [Overview of log queries in Azure Monitor](../logs/log-query-overview.md).
+
+Here's a video version of this tutorial:
> [!VIDEO https://www.microsoft.com/videoplayer/embed/RE42pGX]
-## Writing a new query
+## Write a new query
-Queries can start with either a table name or the *search* command. You should start with a table name, since it defines a clear scope for the query and improves both query performance and relevance of the results.
+Queries can start with either a table name or the *search* command. It's a good idea to start with a table name, because it defines a clear scope for the query and improves both query performance and the relevance of the results.
> [!NOTE]
-> The Kusto query language used by Azure Monitor is case-sensitive. Language keywords are typically written in lower-case. When using names of tables or columns in a query, make sure to use the correct case, as shown on the schema pane.
+> The Kusto query language, which is used by Azure Monitor, is case-sensitive. Language keywords are usually written in lowercase. When you use names of tables or columns in a query, be sure to use the correct case, as shown on the schema pane.
### Table-based queries
-Azure Monitor organizes log data in tables, each composed of multiple columns. All tables and columns are shown on the schema pane in Log Analytics in the Analytics portal. Identify a table that you're interested in and then take a look at a bit of data:
+Azure Monitor organizes log data in tables, each composed of multiple columns. All tables and columns are shown on the schema pane in Log Analytics in the Analytics portal. Identify a table that you're interested in, and then take a look at a bit of data:
```Kusto
SecurityEvent
| take 10
```
-The query shown above returns 10 results from the *SecurityEvent* table, in no specific order. This is a very common way to take a glance at a table and understand its structure and content. Let's examine how it's built:
+The preceding query returns 10 results from the *SecurityEvent* table, in no specific order. This is a common way to take a glance at a table and understand its structure and content. Let's examine how it's built:
-* The query starts with the table name *SecurityEvent* - this part defines the scope of the query.
-* The pipe (|) character separates commands, so the output of the first one is the input of the following command. You can add any number of piped elements.
+* The query starts with the table name *SecurityEvent*, which defines the scope of the query.
+* The pipe (|) character separates commands, so the output of the first command is the input of the next. You can add any number of piped elements.
* Following the pipe is the **take** command, which returns a specific number of arbitrary records from the table.
-We could actually run the query even without adding `| take 10` - that would still be valid, but it could return up to 10,000 results.
+We could actually run the query even without adding `| take 10`. The command would still be valid, but it could return up to 10,000 results.
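If you only want to know how large the table is before sampling it, you could use the **count** operator instead. This is a small sketch, not part of the original walkthrough:

```Kusto
// Return the total number of records in the SecurityEvent table.
SecurityEvent
| count
```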
### Search queries
-Search queries are less structured, and generally more suited for finding records that include a specific value in any of their columns:
+Search queries are less structured, and they're generally better suited for finding records that include a specific value in any of their columns:
```Kusto
search in (SecurityEvent) "Cryptographic"
| take 10
```
-This query searches the *SecurityEvent* table for records that contain the phrase "Cryptographic". Of those records, 10 records will be returned and displayed. If we omit the `in (SecurityEvent)` part and just run `search "Cryptographic"`, the search will go over *all* tables, which would take longer and be less efficient.
+This query searches the *SecurityEvent* table for records that contain the phrase "Cryptographic". Of those records, 10 records will be returned and displayed. If you omit the `in (SecurityEvent)` part and run only `search "Cryptographic"`, the search will go over *all* tables, which would take longer and be less efficient.
-> [!WARNING]
-> Search queries are typically slower than table-based queries because they have to process more data.
+> [!IMPORTANT]
+> Search queries are ordinarily slower than table-based queries because they have to process more data.
## Sort and top
-While **take** is useful to get a few records, the results are selected and displayed in no particular order. To get an ordered view, you could **sort** by the preferred column:
+Although **take** is useful for getting a few records, the results are selected and displayed in no particular order. To get an ordered view, you could **sort** by the preferred column:
```Kusto
SecurityEvent
| sort by TimeGenerated desc
```
-That could return too many results though and might also take some time. The above query sorts *the entire* SecurityEvent table by the TimeGenerated column. The Analytics portal then limits the display to show only 10,000 records. This approach is of course not optimal.
+The preceding query could return too many results, however, and might also take some time. The query sorts the *entire* SecurityEvent table by the **TimeGenerated** column. The Analytics portal then limits the display to only 10,000 records. This approach is of course not optimal.
The best way to get only the latest 10 records is to use **top**, which sorts the entire table on the server side and then returns the top records:
```Kusto
SecurityEvent
| top 10 by TimeGenerated
```
-Descending is the default sorting order, so we typically omit the **desc** argument. The output will look like this:
+Descending is the default sorting order, so you would usually omit the **desc** argument. The output looks like this:
-![Top 10](media/get-started-queries/top10.png)
+![Screenshot of the top 10 records, sorted in descending order.](media/get-started-queries/top10.png)
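If you instead wanted the oldest records first, you could state the ascending order explicitly. This is a small variation on the preceding query, shown here as a sketch:

```Kusto
// Return the 10 oldest records instead of the 10 newest.
SecurityEvent
| top 10 by TimeGenerated asc
```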
-## Where: filtering on a condition
+## The where operator: filtering on a condition
Filters, as indicated by their name, filter the data by a specific condition. This is the most common way to limit query results to relevant information. To add a filter to a query, use the **where** operator followed by one or more conditions. For example, the following query returns only *SecurityEvent* records where _Level_ equals _8_:
```Kusto
SecurityEvent
| where Level == 8
```
-When writing filter conditions, you can use the following expressions:
+When you write filter conditions, you can use the following expressions:
| Expression | Description | Example |
|:---|:---|:---|
When writing filter conditions, you can use the following expressions:
| !=, <> | Check inequality<br>(both expressions are identical) | `Level != 4` |
| *and*, *or* | Required between conditions | `Level == 16 or CommandLine != ""` |
-To filter by multiple conditions, you can either use **and**:
+To filter by multiple conditions, you can use either of the following approaches:
+
+Use **and**, as shown here:
```Kusto
SecurityEvent
| where Level == 8 and EventID == 4672
```
-or pipe multiple **where** elements one after the other:
+Pipe multiple **where** elements, one after the other, as shown here:
```Kusto SecurityEvent
SecurityEvent
```

> [!NOTE]
-> Values can have different types, so you might need to cast them to perform comparison on the correct type. For example, SecurityEvent *Level* column is of type String, so you must cast it to a numerical type such as *int* or *long*, before you can use numerical operators on it:
+> Values can have different types, so you might need to cast them to perform comparisons on the correct type. For example, the SecurityEvent *Level* column is of type String, so you must cast it to a numerical type, such as *int* or *long*, before you can use numerical operators on it, as shown here:
> `SecurityEvent | where toint(Level) >= 10`

## Specify a time range
-### Time picker
+### Use the time picker
-The time picker is next to the Run button and indicates weΓÇÖre querying only records from the last 24 hours. This is the default time range applied to all queries. To get only records from the last hour, select _Last hour_ and run the query again.
+The time picker is displayed next to the **Run** button and indicates that you're querying records from only the last 24 hours. This is the default time range applied to all queries. To get records from only the last hour, select _Last hour_, and then run the query again.
-![Time Picker](media/get-started-queries/timepicker.png)
+![Screenshot of the time picker and its list of time-range commands.](media/get-started-queries/timepicker.png)
-### Time filter in query
+### Add a time filter to the query
You can also define your own time range by adding a time filter to the query. It's best to place the time filter immediately after the table name:
SecurityEvent
| where toint(Level) >= 10 ```
-In the above time filter `ago(30m)` means "30 minutes ago" so this query only returns records from the last 30 minutes. Other units of time include days (2d), minutes (25m), and seconds (10s).
+In the preceding time filter, `ago(30m)` means "30 minutes ago," so the query returns only records from the last 30 minutes. Other units of time include days (for example, 2d) and seconds (for example, 10s).
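For example, a sketch of the same filter widened to the last two days might look like this:

```Kusto
// Same filter, but looking back two days instead of 30 minutes.
SecurityEvent
| where TimeGenerated > ago(2d)
| where toint(Level) >= 10
```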
-## Project and Extend: select and compute columns
+## Use project and extend to select and compute columns
Use **project** to select specific columns to include in the results:
SecurityEvent
| project TimeGenerated, Computer, Activity ```
-The preceding example generates this output:
+The preceding example generates the following output:
-![Query project results](media/get-started-queries/project.png)
+![Screenshot of the query "project" results list.](media/get-started-queries/project.png)
-You can also use **project** to rename columns and define new ones. The following example uses project to do the following:
+You can also use **project** to rename columns and define new ones. The next example uses **project** to do the following:
* Select only the *Computer* and *TimeGenerated* original columns.
-* Displays the *Activity* column as *EventDetails*.
+* Display the *Activity* column as *EventDetails*.
* Create a new column named *EventCode*. The **substring()** function is used to get only the first four characters from the Activity field.
SecurityEvent
| project Computer, TimeGenerated, EventDetails=Activity, EventCode=substring(Activity, 0, 4) ```
-**extend** keeps all original columns in the result set and defines additional ones. The following query uses **extend** to add the *EventCode* column. Note that this column may not display at the end of the table results, in which case you would need to expand the details of a record to view it.
+You can use **extend** to keep all original columns in the result set and define additional ones. The following query uses **extend** to add the *EventCode* column. This column might not be displayed at the end of the table results, in which case you would need to expand the details of a record to view it.
```Kusto SecurityEvent
SecurityEvent
| extend EventCode=substring(Activity, 0, 4) ```
-## Summarize: aggregate groups of rows
-Use **summarize** to identify groups of records, according to one or more columns, and apply aggregations to them. The most common use of **summarize** is *count*, which returns the number of results in each group.
+## Use summarize to aggregate groups of rows
+Use **summarize** to identify groups of records according to one or more columns, and apply aggregations to them. The most common use of **summarize** is *count*, which returns the number of results in each group.
The following query reviews all *Perf* records from the last hour, groups them by *ObjectName*, and counts the records in each group: + ```Kusto Perf | where TimeGenerated > ago(1h)
Perf
| summarize count() by ObjectName, CounterName ```
-Another common use is to perform mathematical or statistical calculations on each group. For example, the following calculates the average *CounterValue* for each computer:
+Another common use is to perform mathematical or statistical calculations on each group. The following example calculates the average *CounterValue* for each computer:
```Kusto Perf
Perf
| summarize avg(CounterValue) by Computer ```
-Unfortunately, the results of this query are meaningless since we mixed together different performance counters. To make this more meaningful, we should calculate the average separately for each combination of *CounterName* and *Computer*:
+Unfortunately, the results of this query are meaningless, because we mixed together a variety of performance counters. To make the results more meaningful, calculate the average separately for each combination of *CounterName* and *Computer*:
```Kusto Perf
Perf
```

### Summarize by a time column
-Grouping results can also be based on a time column, or another continuous value. Simply summarizing `by TimeGenerated` though would create groups for every single millisecond over the time range, since these are unique values.
+Grouping results can also be based on a time column, or another continuous value. Simply summarizing `by TimeGenerated`, though, would create groups for every single millisecond over the time range, because these are unique values.
-To create groups based on continuous values, it is best to break the range into manageable units using **bin**. The following query analyzes *Perf* records that measure free memory (*Available MBytes*) on a specific computer. It calculates the average value of each 1 hour period over the last 7 days:
+To create groups that are based on continuous values, it's best to break the range into manageable units by using **bin**. The following query analyzes *Perf* records that measure free memory (*Available MBytes*) on a specific computer. It calculates the average value of each 1-hour period over the last 7 days:
```Kusto Perf
Perf
| summarize avg(CounterValue) by bin(TimeGenerated, 1h) ```
-To make the output clearer, you select to display it as a time-chart, showing the available memory over time:
+To make the output clearer, you can choose to display it as a time chart, which shows the available memory over time:
-![Query memory over time](media/get-started-queries/chart.png)
+![Screenshot displaying the values of a query memory over time.](media/get-started-queries/chart.png)
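If you prefer to request the chart from the query itself rather than switching the view in the portal, you could append the **render** operator. This is a sketch that builds on the preceding query; the computer name used in the filter is hypothetical:

```Kusto
// Average available memory per hour over the last 7 days, rendered as a time chart.
Perf
| where TimeGenerated > ago(7d)
| where CounterName == "Available MBytes"
| where Computer == "ContosoVM01"   // hypothetical computer name
| summarize avg(CounterValue) by bin(TimeGenerated, 1h)
| render timechart
```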
## Next steps

-- Learn more about using string data in a log query with [Work with strings in Azure Monitor log queries](/azure/data-explorer/kusto/query/samples?&pivots=azuremonitor#string-operations).
-- Learn more about aggregating data in a log query with [Advanced aggregations in Azure Monitor log queries](/azure/data-explorer/write-queries#advanced-aggregations).
-- Learn how to join data from multiple tables with [Joins in Azure Monitor log queries](/azure/data-explorer/kusto/query/samples?&pivots=azuremonitor#joins).
+- To learn more about using string data in a log query, see [Work with strings in Azure Monitor log queries](/azure/data-explorer/kusto/query/samples?&pivots=azuremonitor#string-operations).
+- To learn more about aggregating data in a log query, see [Advanced aggregations in Azure Monitor log queries](/azure/data-explorer/write-queries#advanced-aggregations).
+- To learn how to join data from multiple tables, see [Joins in Azure Monitor log queries](/azure/data-explorer/kusto/query/samples?&pivots=azuremonitor#joins).
- Get documentation on the entire Kusto query language in the [KQL language reference](/azure/kusto/query/).
azure-monitor Logs Dedicated Clusters https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/logs-dedicated-clusters.md
Complete details about billing for Log Analytics dedicated clusters are available
Some of the configuration steps run asynchronously because they can't be completed quickly. The status in the response can be one of the following: 'InProgress', 'Updating', 'Deleting', 'Succeeded', or 'Failed' (with an error code). When using REST, the response initially returns an HTTP status code 202 (Accepted) and a header with the Azure-AsyncOperation property:
-"Azure-AsyncOperation": "https://management.azure.com/subscriptions/subscription-id/providers/Microsoft.OperationalInsights/locations/region-name/operationStatuses/operation-id?api-version=2020-08-01"
+"Azure-AsyncOperation": "https://management.azure.com/subscriptions/subscription-id/providers/Microsoft.OperationalInsights/locations/region-name/operationStatuses/operation-id?api-version=2021-06-01"
```

You can check the status of the asynchronous operation by sending a GET request to the Azure-AsyncOperation header value:

```rst
-GET https://management.azure.com/subscriptions/subscription-id/providers/microsoft.operationalInsights/locations/region-name/operationstatuses/operation-id?api-version=2020-08-01
+GET https://management.azure.com/subscriptions/subscription-id/providers/microsoft.operationalInsights/locations/region-name/operationstatuses/operation-id?api-version=2021-06-01
Authorization: Bearer <token> ```
Get-Job -Command "New-AzOperationalInsightsCluster*" | Format-List -Property *
*Call*

```rst
-PUT https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2020-08-01
+PUT https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2021-06-01
Authorization: Bearer <token> Content-type: application/json
Content-type: application/json
"Capacity": 500 }, "properties": {
- "billingType": "cluster",
+ "billingType": "Cluster",
},
- "location": "<region-name>",
+ "location": "<region>",
} ```
The provisioning of the Log Analytics cluster takes a while to complete. You can
- Send a GET request on the *Cluster* resource and look at the *provisioningState* value. The value is *ProvisioningAccount* while provisioning and *Succeeded* when completed.
- ```rst
- GET https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2020-08-01
- Authorization: Bearer <token>
- ```
-
- **Response**
-
- ```json
- {
- "identity": {
- "type": "SystemAssigned",
- "tenantId": "tenant-id",
- "principalId": "principal-id"
- },
- "sku": {
- "name": "capacityReservation",
- "capacity": 500,
- "lastSkuUpdate": "Sun, 22 Mar 2020 15:39:29 GMT"
- },
- "properties": {
- "provisioningState": "ProvisioningAccount",
- "billingType": "cluster",
- "clusterId": "cluster-id"
- },
- "id": "/subscriptions/subscription-id/resourceGroups/resource-group-name/providers/Microsoft.OperationalInsights/clusters/cluster-name",
- "name": "cluster-name",
- "type": "Microsoft.OperationalInsights/clusters",
- "location": "region-name"
- }
- ```
-
-The *principalId* GUID is generated by the managed identity service for the *Cluster* resource.
+ ```rst
+ GET https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2021-06-01
+ Authorization: Bearer <token>
+ ```
+
+ **Response**
+
+ ```json
+ {
+ "identity": {
+ "type": "SystemAssigned",
+ "tenantId": "tenant-id",
+ "principalId": "principal-id"
+ },
+ "sku": {
+ "name": "capacityreservation",
+ "capacity": 500
+ },
+ "properties": {
+ "provisioningState": "ProvisioningAccount",
+ "clusterId": "cluster-id",
+ "billingType": "Cluster",
+ "lastModifiedDate": "last-modified-date",
+ "createdDate": "created-date",
+ "isDoubleEncryptionEnabled": false,
+ "isAvailabilityZonesEnabled": false,
+ "capacityReservationProperties": {
+ "lastSkuUpdate": "last-sku-modified-date",
+ "minCapacity": 500
+ }
+ },
+ "id": "/subscriptions/subscription-id/resourceGroups/resource-group-name/providers/Microsoft.OperationalInsights/clusters/cluster-name",
+ "name": "cluster-name",
+ "type": "Microsoft.OperationalInsights/clusters",
+ "location": "cluster-region"
+ }
+ ```
+
+The *principalId* GUID is generated by the managed identity service at *Cluster* creation.
## Link a workspace to cluster
Use the following REST call to link to a cluster:
*Send*

```rst
-PUT https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>/linkedservices/cluster?api-version=2020-08-01
+PUT https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>/linkedservices/cluster?api-version=2021-06-01
Authorization: Bearer <token> Content-type: application/json
Content-type: application/json
### Check workspace link status
-If you use customer-managed keys, ingested data is stored encrypted with your managed key after the association operation, which can take up to 90 minutes to complete.
-
-You can check the workspace association state in two ways:
+When a cluster is configured with customer-managed keys, data ingested to the workspaces after the link operation completes is stored encrypted with your managed key. The link operation can take up to 90 minutes to complete, and you can check its state in two ways:
- Copy the Azure-AsyncOperation URL value from the response and follow the asynchronous operations status check.
You can check the workspace association state in two ways:
**CLI**

```azurecli
-az monitor log-analytics cluster show --resource-group "resource-group-name" --name "cluster-name"
+az monitor log-analytics workspace show --resource-group "resource-group-name" --workspace-name "workspace-name"
```

**PowerShell**
Get-AzOperationalInsightsWorkspace -ResourceGroupName "resource-group-name" -Nam
*Call*

```rest
-GET https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>?api-version=2020-08-01
+GET https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>?api-version=2021-06-01
Authorization: Bearer <token> ```
Authorization: Bearer <token>
"id": "/subscriptions/subscription-id/resourcegroups/resource-group-name/providers/microsoft.operationalinsights/workspaces/workspace-name", "name": "workspace-name", "type": "Microsoft.OperationalInsights/workspaces",
- "location": "region-name"
+ "location": "region"
} ```
Get-AzOperationalInsightsCluster -ResourceGroupName "resource-group-name"
*Call*
- ```rst
- GET https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters?api-version=2020-08-01
- Authorization: Bearer <token>
- ```
+```rst
+GET https://management.azure.com/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters?api-version=2021-06-01
+Authorization: Bearer <token>
+```
*Response*
- ```json
- {
- "value": [
- {
- "identity": {
- "type": "SystemAssigned",
- "tenantId": "tenant-id",
- "principalId": "principal-Id"
- },
- "sku": {
- "name": "capacityReservation",
- "capacity": 500,
- "lastSkuUpdate": "Sun, 22 Mar 2020 15:39:29 GMT"
- },
- "properties": {
- "keyVaultProperties": {
- "keyVaultUri": "https://key-vault-name.vault.azure.net",
- "keyName": "key-name",
- "keyVersion": "current-version"
- },
- "provisioningState": "Succeeded",
- "billingType": "cluster",
- "clusterId": "cluster-id"
- },
- "id": "/subscriptions/subscription-id/resourcegroups/resource-group-name/providers/microsoft.operationalinsights/workspaces/workspace-name",
- "name": "cluster-name",
- "type": "Microsoft.OperationalInsights/clusters",
- "location": "region-name"
- }
- ]
- }
- ```
+```json
+{
+ "value": [
+ {
+ "identity": {
+ "type": "SystemAssigned",
+ "tenantId": "tenant-id",
+ "principalId": "principal-id"
+ },
+ "sku": {
+ "name": "capacityreservation",
+ "capacity": 500
+ },
+ "properties": {
+ "provisioningState": "Succeeded",
+ "clusterId": "cluster-id",
+ "billingType": "Cluster",
+ "lastModifiedDate": "last-modified-date",
+ "createdDate": "created-date",
+ "isDoubleEncryptionEnabled": false,
+ "isAvailabilityZonesEnabled": false,
+ "capacityReservationProperties": {
+ "lastSkuUpdate": "last-sku-modified-date",
+ "minCapacity": 500
+ }
+ },
+ "id": "/subscriptions/subscription-id/resourceGroups/resource-group-name/providers/Microsoft.OperationalInsights/clusters/cluster-name",
+ "name": "cluster-name",
+ "type": "Microsoft.OperationalInsights/clusters",
+ "location": "cluster-region"
+ }
+ ]
+}
+```
### Get all clusters in subscription
Get-AzOperationalInsightsCluster
*Call*

```rst
-GET https://management.azure.com/subscriptions/<subscription-id>/providers/Microsoft.OperationalInsights/clusters?api-version=2020-08-01
+GET https://management.azure.com/subscriptions/<subscription-id>/providers/Microsoft.OperationalInsights/clusters?api-version=2021-06-01
Authorization: Bearer <token> ```
Authorization: Bearer <token>
The same as for 'clusters in a resource group', but in subscription scope.

### Update commitment tier in cluster

When the data volume to your linked workspaces changes over time, you might want to update the Commitment Tier level appropriately. The tier is specified in units of GB and can have values of 500, 1000, 2000 or 5000 GB/day. Note that you don't have to provide the full REST request body, but it should include the sku.
Update-AzOperationalInsightsCluster -ResourceGroupName "resource-group-name" -Cl
*Call*
- ```rst
- PATCH https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2020-08-01
- Authorization: Bearer <token>
- Content-type: application/json
+```rst
+PATCH https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2021-06-01
+Authorization: Bearer <token>
+Content-type: application/json
- {
- "sku": {
- "name": "capacityReservation",
- "Capacity": 2000
- }
+{
+ "sku": {
+ "name": "capacityReservation",
+ "Capacity": 2000
}
- ```
+}
+```
### Update billingType in cluster

The *billingType* property determines the billing attribution for the cluster and its data:
-- *cluster* (default) -- The billing is attributed to the subscription hosting your Cluster resource
-- *workspaces* -- The billing is attributed to the subscriptions hosting your workspaces proportionally
+- *Cluster* (default) -- The billing is attributed to the Cluster resource
+- *Workspaces* -- The billing is attributed to linked workspaces proportionally. When data volume from all workspaces is below the Commitment Tier level, the remaining volume is attributed to the cluster
**REST**

*Call*
- ```rst
- PATCH https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2020-08-01
- Authorization: Bearer <token>
- Content-type: application/json
+```rst
+PATCH https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2021-06-01
+Authorization: Bearer <token>
+Content-type: application/json
- {
- "properties": {
- "billingType": "cluster",
- }
- }
- ```
+{
+ "properties": {
+ "billingType": "Workspaces",
+ }
+}
+```
### Unlink a workspace from cluster
Old data of the unlinked workspace might be left on the cluster. If this data is
**CLI**

```azurecli
-az monitor log-analytics workspace linked-service delete --resource-group "resource-group-name" --workspace-name "MyWorkspace" --name cluster
+az monitor log-analytics workspace linked-service delete --resource-group "resource-group-name" --workspace-name "workspace-name" --name cluster
```

**PowerShell**
Within the 14 days after deletion, the cluster resource name is reserved and can
Use the following PowerShell command to delete a cluster:
- ```powershell
- Remove-AzOperationalInsightsCluster -ResourceGroupName "resource-group-name" -ClusterName "cluster-name"
- ```
+```powershell
+Remove-AzOperationalInsightsCluster -ResourceGroupName "resource-group-name" -ClusterName "cluster-name"
+```
**REST**

Use the following REST call to delete a cluster:
- ```rst
- DELETE https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2020-08-01
- Authorization: Bearer <token>
- ```
+```rst
+DELETE https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.OperationalInsights/clusters/<cluster-name>?api-version=2021-06-01
+Authorization: Bearer <token>
+```
**Response**
Use the following REST call to delete a cluster:
- Lockbox isn't available in China currently.
- [Double encryption](../../storage/common/storage-service-encryption.md#doubly-encrypt-data-with-infrastructure-encryption) is configured automatically for clusters created from October 2020 in supported regions. You can verify if your cluster is configured for double encryption by sending a GET request on the cluster and observing that the `isDoubleEncryptionEnabled` value is `true` for clusters with Double encryption enabled.
- - If you create a cluster and get an error "<region-name> doesnΓÇÖt support Double Encryption for clusters.", you can still create the cluster without Double encryption by adding `"properties": {"isDoubleEncryptionEnabled": false}` in the REST request body.
+ - If you create a cluster and get an error "region-name doesn't support Double Encryption for clusters.", you can still create the cluster without Double encryption by adding `"properties": {"isDoubleEncryptionEnabled": false}` in the REST request body.
- The double encryption setting cannot be changed after the cluster has been created.

## Troubleshooting
azure-monitor Quick Create Workspace Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/quick-create-workspace-cli.md
The Azure CLI 2.0 is used to create and manage Azure resources from the command
* Device collections from Configuration Manager
* Diagnostic or log data from Azure storage
-For other sources, such as Azure VMs and Windows or Linux VMs in your environment, see the following topics:
-
-* [Collect data from Azure virtual machines](../vm/quick-collect-azurevm.md)
-* [Collect data from hybrid Linux computer](../vm/quick-collect-linux-computer.md)
-* [Collect data from hybrid Windows computer](../vm/quick-collect-windows-computer.md)
[!INCLUDE [quickstarts-free-trial-note](../../../includes/quickstarts-free-trial-note.md)]
azure-monitor Quick Create Workspace https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/logs/quick-create-workspace.md
Use the **Log Analytics workspaces** menu to create a Log Analytics workspace us
* Device collections from Configuration Manager
* Diagnostics or log data from Azure storage
-For other sources, such as Azure VMs and Windows or Linux VMs in your environment, see the following topics:
-
-* [Collect data from Azure virtual machines](../vm/quick-collect-azurevm.md)
-* [Collect data from hybrid Linux computer](../vm/quick-collect-linux-computer.md)
-* [Collect data from hybrid Windows computer](../vm/quick-collect-windows-computer.md)
- If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.

## Sign in to Azure portal
azure-monitor Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure Monitor description: Sample Azure Resource Graph queries for Azure Monitor showing use of resource types and tables to access Azure Monitor related resources and properties. Previously updated : 07/21/2021 Last updated : 08/04/2021
azure-monitor Quick Collect Azurevm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/vm/quick-collect-azurevm.md
- Title: Collect data from an Azure virtual machine with Azure Monitor | Microsoft Docs
-description: Learn how to enable the Log Analytics agent VM Extension and enable collection of data from your Azure VMs with Log Analytics.
--- Previously updated : 08/19/2019----
-# Collect data from an Azure virtual machine with Azure Monitor
-
-[Azure Monitor](../overview.md) can collect data directly from your Azure virtual machines into a Log Analytics workspace for analysis of details and correlations. Installing the Log Analytics VM extension for [Windows](../../virtual-machines/extensions/oms-windows.md) and [Linux](../../virtual-machines/extensions/oms-linux.md) allows Azure Monitor to collect data from your Azure VMs. This quickstart shows you how to configure and collect data from your Azure Linux or Windows VMs using the VM extension with a few easy steps.
-
-This quickstart assumes you have an existing Azure virtual machine. If not you can [create a Windows VM](../../virtual-machines/windows/quick-create-portal.md) or [create a Linux VM](../../virtual-machines/linux/quick-create-cli.md) following our VM quickstarts.
-
-## Sign in to Azure portal
-
-Sign in to the Azure portal at [https://portal.azure.com](https://portal.azure.com).
-
-## Create a workspace
-
-1. In the Azure portal, select **All services**. In the list of resources, type **Log Analytics**. As you begin typing, the list filters based on your input. Select **Log Analytics workspaces**.
-
- ![Azure portal](media/quick-collect-azurevm/azure-portal-log-analytics-workspaces.png)<br>
-
-2. Select **Create**, and then select choices for the following items:
-
- * Provide a name for the new **Log Analytics workspace**, such as *DefaultLAWorkspace*.
- * Select a **Subscription** to link to by selecting from the drop-down list if the default selected is not appropriate.
- * For **Resource Group**, select an existing resource group that contains one or more Azure virtual machines.
- * Select the **Location** your VMs are deployed to. For additional information, see which [regions Log Analytics is available in](https://azure.microsoft.com/regions/services/).
- * If you are creating a workspace in a new subscription created after April 2, 2018, it will automatically use the *Per GB* pricing plan and the option to select a pricing tier will not be available. If you are creating a workspace for an existing subscription created before April 2, or to subscription that was tied to an existing EA enrollment, select your preferred pricing tier. For additional information about the particular tiers, see [Log Analytics Pricing Details](https://azure.microsoft.com/pricing/details/log-analytics/).
-
- ![Create Log Analytics resource blade](media/quick-collect-azurevm/create-log-analytics-workspace-azure-portal.png)
-
-3. After providing the required information on the **Log Analytics workspace** pane, select **OK**.
-
-While the information is verified and the workspace is created, you can track its progress under **Notifications** from the menu.
-
-## Enable the Log Analytics VM Extension
--
-For Windows and Linux virtual machines already deployed in Azure, you install the Log Analytics agent with the Log Analytics VM Extension. Using the extension simplifies the installation process and automatically configures the agent to send data to the Log Analytics workspace that you specify. The agent is also upgraded automatically when a newer version is released, ensuring that you have the latest features and fixes. Before proceeding, verify the VM is running otherwise the process will fail to complete successfully.
-
->[!NOTE]
->The Log Analytics agent for Linux cannot be configured to report to more than one Log Analytics workspace.
-
-1. In the Azure portal, select **All services** found in the upper left-hand corner. In the list of resources, type **Log Analytics**. As you begin typing, the list filters based on your input. Select **Log Analytics workspaces**.
-
-2. In your list of Log Analytics workspaces, select *DefaultLAWorkspace* created earlier.
-
-3. On the left-hand menu, under Workspace Data Sources, select **Virtual machines**.
-
-4. In the list of **Virtual machines**, select a virtual machine you want to install the agent on. Notice that the **Log Analytics connection status** for the VM indicates that it is **Not connected**.
-
-5. In the details for your virtual machine, select **Connect**. The agent is automatically installed and configured for your Log Analytics workspace. This process takes a few minutes, during which time the **Status** shows **Connecting**.
-
-6. After you install and connect the agent, the **Log Analytics connection status** will be updated with **This workspace**.
-
-## Collect event and performance data
-
-Azure Monitor can collect events from the Windows event logs or Linux Syslog and performance counters that you specify for longer term analysis and reporting, and take action when a particular condition is detected. Follow these steps to configure collection of events from the Windows system log and Linux Syslog, and several common performance counters to start with.
-
-### Data collection from Windows VM
-
-1. Select **Advanced settings**.
-
- ![Log Analytics Advance Settings](media/quick-collect-azurevm/log-analytics-advanced-settings-azure-portal.png)
-
-2. Select **Data**, and then select **Windows Event Logs**.
-
-3. You add an event log by typing in the name of the log. Type **System** and then select the plus sign **+**.
-
-4. In the table, check the severities **Error** and **Warning**.
-
-5. Select **Save** at the top of the page to save the configuration.
-
-6. Select **Windows Performance Data** to enable collection of performance counters on a Windows computer.
-
-7. When you first configure Windows Performance counters for a new Log Analytics workspace, you are given the option to quickly create several common counters. They are listed with a checkbox next to each.
-
- ![Screenshot of the Windows Performance Counters pane with a list of selected counters and the Add the selected performance counters button selected.](media/quick-collect-azurevm/windows-perfcounters-default.png)
-
- Select **Add the selected performance counters**. They are added and preset with a ten second collection sample interval.
-
-8. Select **Save** at the top of the page to save the configuration.
-
-### Data collection from Linux VM
-
-1. Select **Syslog**.
-
-2. You add an event log by typing in the name of the log. Type **Syslog** and then select the plus sign **+**.
-
-3. In the table, deselect the severities **Info**, **Notice** and **Debug**.
-
-4. Select **Save** at the top of the page to save the configuration.
-
-5. Select **Linux Performance Data** to enable collection of performance counters on a Linux computer.
-
-6. When you first configure Linux Performance counters for a new Log Analytics workspace, you are given the option to quickly create several common counters. They are listed with a checkbox next to each.
-
- ![Screenshot of the Linux Performance Counters pane with a list of selected counters and the Add the selected performance counters button selected.](media/quick-collect-azurevm/linux-perfcounters-azure-monitor.png)
-
- Select **Apply below configuration to to my machines** and then select **Add the selected performance counters**. They are added and preset with a ten second collection sample interval.
-
-7. Select **Save** at the top of the page to save the configuration.
-
-## View data collected
-
-Now that you have enabled data collection, lets run a simple log search example to see some data from the target VMs.
-
-1. In the selected workspace, from the left-hand pane, select **Logs**.
-
-2. On the Logs query page, type `Perf` in the query editor and select **Run**.
-
- ![Log Analytics log search query example](./media/quick-collect-windows-computer/log-analytics-portal-queryexample.png)
-
-    For example, the query in the following image returned 10,000 performance records. Your results will likely contain far fewer records.
-
- ![Log Analytics log search result](media/quick-collect-azurevm/log-analytics-search-perf.png)
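
If you prefer to check the collected data from a terminal instead of the portal, the Azure CLI can run the same kind of query against the workspace. The following is a minimal sketch (it may require the *log-analytics* CLI extension); `<workspace-customer-id>` is a placeholder for the Workspace ID (GUID) of your Log Analytics workspace.

```
# Run a Kusto query against the Log Analytics workspace from the Azure CLI
az monitor log-analytics query \
  --workspace "<workspace-customer-id>" \
  --analytics-query "Perf | summarize count() by Computer, CounterName | top 10 by count_"
```

An empty result usually means the agents haven't started sending performance data yet; wait a few minutes and run the query again.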
-
-## Clean up resources
-
-When no longer needed, delete the Log Analytics workspace. To do so, select the Log Analytics workspace you created earlier and on the resource page select **Delete**.
--
-![Delete Log Analytics resource](media/quick-collect-azurevm/log-analytics-portal-delete-resource.png)
-
-## Next steps
-
-Now that you are collecting operational and performance data from your Windows or Linux virtual machines, you can easily begin exploring, analyzing, and taking action on data that you collect for *free*.
-
-To learn how to view and analyze the data, continue to the tutorial.
-
-> [!div class="nextstepaction"]
-> [View or analyze data in Log Analytics](../logs/log-analytics-tutorial.md)
azure-monitor Quick Collect Linux Computer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/vm/quick-collect-linux-computer.md
- Title: 'Quickstart: Collect data from a hybrid Linux Computer with Azure Monitor'
-description: In this quickstart, you'll learn how to deploy the Log Analytics agent for Linux computers running outside of Azure and enable data collection with Azure Monitor Logs.
------ Previously updated : 12/24/2019----
-# Quickstart: Collect data from a Linux computer in a hybrid environment with Azure Monitor
-
-[Azure Monitor](../overview.md) can collect data directly from your physical or virtual Linux computers in your environment into a Log Analytics workspace for detailed analysis and correlation. Installing the [Log Analytics agent](../agents/log-analytics-agent.md) allows Azure Monitor to collect data from a datacenter or other cloud environment. This quickstart shows you how to configure and collect data from your Linux server with a few easy steps. For information about Azure Linux VMs, see [Collect data about Azure virtual machines](./quick-collect-azurevm.md).
-
-To understand the supported configuration, see [Supported operating systems](../agents/agents-overview.md#supported-operating-systems) and [Network firewall configuration](../agents/log-analytics-agent.md#network-requirements).
-
-If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-
-## Sign in to the Azure portal
-
-Sign in to the Azure portal at [https://portal.azure.com](https://portal.azure.com).
-
-## Create a workspace
-
-1. In the Azure portal, select **All services**. In the list of resources, type **Log Analytics**. As you begin typing, the list filters based on your input. Select **Log Analytics workspaces**.
-
- ![Finding Log Analytics workspace in the Azure portal](media/quick-collect-azurevm/azure-portal-log-analytics-workspaces.png)<br>
-
-2. Select **Create**, and then select choices for the following items:
-
- * Provide a name for the new **Log Analytics workspace**, such as *DefaultLAWorkspace*.
-    * Select the **Subscription** to link to from the drop-down list if the default selection isn't appropriate.
- * For **Resource Group**, select an existing resource group that contains one or more Azure virtual machines.
- * Select the **Location** your VMs are deployed to. For additional information, see which [regions Log Analytics is available in](https://azure.microsoft.com/regions/services/).
-    * If you are creating a workspace in a new subscription created after April 2, 2018, it will automatically use the *Per GB* pricing plan, and the option to select a pricing tier will not be available. If you are creating a workspace for an existing subscription created before April 2, or for a subscription tied to an existing EA enrollment, select your preferred pricing tier. For additional information about the particular tiers, see [Log Analytics Pricing Details](https://azure.microsoft.com/pricing/details/log-analytics/).
-
- ![Creating a Log Analytics workspace in the Azure portal](media/quick-collect-azurevm/create-log-analytics-workspace-azure-portal.png)
-
-3. After providing the required information on the **Log Analytics workspace** pane, select **OK**.
-
-While the information is verified and the workspace is created, you can track its progress under **Notifications** from the menu.
-
-## Obtain workspace ID and key
-
-Before installing the Log Analytics agent for Linux, you need the workspace ID and key for your Log Analytics workspace. This information is required by the agent wrapper script to properly configure the agent and ensure it can successfully communicate with Azure Monitor.
--
-1. In the upper-left corner of the Azure portal, select **All services**. In the search box, enter **Log Analytics**. As you type, the list filters based on your input. Select **Log Analytics workspaces**.
-
-2. In your list of Log Analytics workspaces, select the workspace you created earlier. (You might have named it **DefaultLAWorkspace**.)
-
-3. Select **Agents management**:
-
-4. Then select **Linux servers**.
-
-5. Copy the values to the right of **Workspace ID** and **Primary key**, and paste them into your favorite editor.
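
If you prefer the command line, the same values can be retrieved with the Azure CLI. This is a minimal sketch; the resource group is a placeholder, and the workspace name assumes the *DefaultLAWorkspace* example used earlier.

```
# Workspace ID (the customer ID GUID that the agent uses)
az monitor log-analytics workspace show \
  --resource-group <resource-group> \
  --workspace-name DefaultLAWorkspace \
  --query customerId --output tsv

# Primary and secondary shared keys for the workspace
az monitor log-analytics workspace get-shared-keys \
  --resource-group <resource-group> \
  --workspace-name DefaultLAWorkspace
```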
-
-## Install the agent for Linux
-
-The following steps configure and install the agent for Log Analytics in the Azure and Azure Government clouds.
-
->[!NOTE]
->The Log Analytics agent for Linux cannot be configured to report to more than one Log Analytics workspace.
-
-If your Linux computer needs to communicate through a proxy server to Log Analytics, the proxy configuration can be specified on the command line by including `-p [protocol://][user:password@]proxyhost[:port]`. The *proxyhost* property accepts a fully qualified domain name or IP address of the proxy server.
-
-For example: `https://user01:password@proxy01.contoso.com:30443`
-
-1. To configure the Linux computer to connect to a Log Analytics workspace, run the following command providing the workspace ID and primary key copied earlier. The following command downloads the agent, validates its checksum, and installs it.
-
- ```
- wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh && sh onboard_agent.sh -w <YOUR WORKSPACE ID> -s <YOUR WORKSPACE PRIMARY KEY>
- ```
-
- The following command includes the `-p` proxy parameter and example syntax when authentication is required by your proxy server:
-
- ```
- wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh && sh onboard_agent.sh -p [protocol://][user:password@]proxyhost[:port] -w <YOUR WORKSPACE ID> -s <YOUR WORKSPACE PRIMARY KEY>
- ```
-
-2. To configure the Linux computer to connect to Log Analytics workspace in Azure Government cloud, run the following command providing the workspace ID and primary key copied earlier. The following command downloads the agent, validates its checksum, and installs it.
-
- ```
- wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh && sh onboard_agent.sh -w <YOUR WORKSPACE ID> -s <YOUR WORKSPACE PRIMARY KEY> -d opinsights.azure.us
- ```
-
- The following command includes the `-p` proxy parameter and example syntax when authentication is required by your proxy server:
-
- ```
- wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh && sh onboard_agent.sh -p [protocol://][user:password@]proxyhost[:port] -w <YOUR WORKSPACE ID> -s <YOUR WORKSPACE PRIMARY KEY> -d opinsights.azure.us
- ```
-
-3. Restart the agent by running the following command:
-
- ```
- sudo /opt/microsoft/omsagent/bin/service_control restart [<workspace id>]
- ```
-
-## Collect event and performance data
-
-Azure Monitor can collect events from the Linux Syslog and performance counters that you specify for longer term analysis and reporting. It can also take action when it detects a particular condition. Follow these steps to configure collection of events from the Linux Syslog, and several common performance counters to start with.
-
-1. In the Azure portal, select **All services**. In the list of resources, type **Log Analytics**. As you type, the list filters based on your input. Select **Log Analytics workspaces**. In your list of Log Analytics workspaces, select the workspace you're looking for, and then select **Advanced settings**.
-
-2. Select **Data**, and then select **Syslog**.
-
-3. You add syslog by typing in the name of the log. Enter **Syslog** and then select the plus sign **+**.
-
-4. In the table, uncheck the severities **Info**, **Notice** and **Debug**.
-
-5. Select **Save** at the top of the page to save the configuration.
-
-6. Select **Linux Performance Data** to enable collection of performance counters on a Linux computer.
-
-7. When you first configure Linux Performance counters for a new Log Analytics workspace, you are given the option to quickly create several common counters. They are listed with a checkbox next to each.
-
- ![Default Linux performance counters selected in Azure Monitor](media/quick-collect-azurevm/linux-perfcounters-azure-monitor.png)
-
-    Select **Apply below configuration to my machines** and then select **Add the selected performance counters**. They are added and preset with a ten-second collection sample interval.
-
-8. Select **Save** at the top of the page to save the configuration.
-
-## View data collected
-
-Now that you have enabled data collection, let's run a simple log search to see some data from the target computer.
-
-1. In the selected workspace, from the left-hand pane, select **Logs**.
-
-2. On the Logs query page, type `Perf` in the query editor and select **Run**.
-
- ![Log Analytics log search](media/quick-collect-linux-computer/log-analytics-portal-queryexample.png)
-
-    For example, the query in the following image returned 10,000 Performance records. Your results will likely contain far fewer records.
-
- ![Log Analytics log search result](media/quick-collect-linux-computer/log-analytics-search-perf.png)
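
To confirm that Syslog events are also arriving, you can run a query scoped to the Syslog table. The sketch below uses the Azure CLI (which may require the *log-analytics* extension); the workspace ID is a placeholder.

```
# Return the most recent Syslog records that have error severity
az monitor log-analytics query \
  --workspace "<workspace-customer-id>" \
  --analytics-query "Syslog | where SeverityLevel == 'err' | take 10"
```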
-
-## Clean up resources
-
-When no longer needed, you can remove the agent from the Linux computer and delete the Log Analytics workspace.
-
-To remove the agent, run the following command on the Linux computer. The *--purge* argument completely removes the agent and its configuration.
-
- `wget https://raw.githubusercontent.com/Microsoft/OMS-Agent-for-Linux/master/installer/scripts/onboard_agent.sh && sh onboard_agent.sh --purge`
-
-To delete the workspace, select the Log Analytics workspace you created earlier and on the resource page select **Delete**.
-
-![Delete Log Analytics resource](media/quick-collect-linux-computer/log-analytics-portal-delete-resource.png)
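
The workspace can also be deleted from the command line. A minimal Azure CLI sketch with placeholder names:

```
# Delete the Log Analytics workspace (you're prompted to confirm)
az monitor log-analytics workspace delete \
  --resource-group <resource-group> \
  --workspace-name DefaultLAWorkspace
```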
-
-## Next steps
-
-Now that you are collecting operational and performance data from your on-premises Linux computer, you can easily begin exploring, analyzing, and taking action on data that you collect for *free*.
-
-To learn how to view and analyze the data, continue to the tutorial.
-
-> [!div class="nextstepaction"]
-> [View or analyze data in Log Analytics](../logs/log-analytics-tutorial.md)
azure-monitor Quick Collect Windows Computer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/vm/quick-collect-windows-computer.md
- Title: Collect data from hybrid Windows computer with Azure Monitor
-description: In this quickstart, you'll learn how to deploy the Log Analytics agent for Windows computers running outside of Azure and enable data collection with Azure Monitor Logs.
------ Previously updated : 08/22/2019----
-# Collect data from a Windows computer in a hybrid environment with Azure Monitor
-
-[Azure Monitor](../overview.md) can collect data directly from your physical or virtual Windows computers in your environment into a Log Analytics workspace for detailed analysis and correlation. Installing the [Log Analytics agent](../agents/log-analytics-agent.md) allows Azure Monitor to collect data from a datacenter or other cloud environment. This quickstart shows you how to configure and collect data from your Windows computer with a few easy steps. For information about Azure Windows VMs, see [Collect data about Azure virtual machines](./quick-collect-azurevm.md).
-
-To understand the supported configuration, see [Supported operating systems](../agents/agents-overview.md#supported-operating-systems) and [Network firewall configuration](../agents/log-analytics-agent.md#network-requirements).
-
-If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
-
-## Sign in to Azure portal
-
-Sign in to the Azure portal at [https://portal.azure.com](https://portal.azure.com).
-
-## Create a workspace
-
-1. In the Azure portal, select **All services**. In the list of resources, type **Log Analytics**. As you begin typing, the list filters based on your input. Select **Log Analytics workspaces**.
-
- ![Azure portal](media/quick-collect-azurevm/azure-portal-log-analytics-workspaces.png)<br>
-
-2. Select **Create**, and then select choices for the following items:
-
- * Provide a name for the new **Log Analytics workspace**, such as *DefaultLAWorkspace*.
-    * Select the **Subscription** to link to from the drop-down list if the default selection isn't appropriate.
- * For **Resource Group**, select an existing resource group that contains one or more Azure virtual machines.
- * Select the **Location** your VMs are deployed to. For additional information, see which [regions Log Analytics is available in](https://azure.microsoft.com/regions/services/).
-    * If you are creating a workspace in a new subscription created after April 2, 2018, it will automatically use the *Per GB* pricing plan, and the option to select a pricing tier will not be available. If you are creating a workspace for an existing subscription created before April 2, or for a subscription tied to an existing EA enrollment, select your preferred pricing tier. For additional information about the particular tiers, see [Log Analytics Pricing Details](https://azure.microsoft.com/pricing/details/log-analytics/).
-
- ![Create Log Analytics resource blade](media/quick-collect-azurevm/create-log-analytics-workspace-azure-portal.png)
-
-3. After providing the required information on the **Log Analytics workspace** pane, select **OK**.
-
-While the information is verified and the workspace is created, you can track its progress under **Notifications** from the menu.
--
-## Get the workspace ID and key
-
-Before you install the Log Analytics agent for Windows (also referred to as the Microsoft Monitoring Agent (MMA)), you need the workspace ID and key for your Log Analytics workspace. The setup wizard needs this information to properly configure the agent and ensure it can communicate with Azure Monitor.
-
-1. In the upper-left corner of the Azure portal, select **All services**. In the search box, enter **Log Analytics**. As you type, the list filters based on your input. Select **Log Analytics workspaces**.
-
-2. In your list of Log Analytics workspaces, select the workspace you created earlier. (You might have named it **DefaultLAWorkspace**.)
-
-3. Select **Advanced settings**:
-
- ![Log Analytics advanced settings](media/quick-collect-azurevm/log-analytics-advanced-settings-azure-portal.png)
-
-4. Select **Connected Sources**, and then select **Windows Servers**.
-
-5. Copy the values to the right of **Workspace ID** and **Primary Key**. Paste them into your favorite editor.
-
-## Install the agent for Windows
-
-The following steps install and configure the agent for Log Analytics in Azure and Azure Government. You'll use the Microsoft Monitoring Agent Setup program to install the agent on your computer.
-
-1. Continuing from the previous set of steps, on the **Windows Servers** page, select the **Download Windows Agent** version that you want to download. Select the appropriate version for the processor architecture of your Windows operating system.
-
-2. Run Setup to install the agent on your computer.
-
-3. On the **Welcome** page, select **Next**.
-
-4. On the **License Terms** page, read the license and then select **I Agree**.
-
-5. On the **Destination Folder** page, change or keep the default installation folder and then select **Next**.
-
-6. On the **Agent Setup Options** page, connect the agent to Azure Log Analytics and then select **Next**.
-
-7. On the **Azure Log Analytics** page, complete these steps:
-
- 1. Paste in the **Workspace ID** and **Workspace Key (Primary Key)** that you copied earlier. If the computer should report to a Log Analytics workspace in Azure Government, select **Azure US Government** in the **Azure Cloud** list.
- 2. If the computer needs to communicate through a proxy server to the Log Analytics service, select **Advanced** and provide the URL and port number of the proxy server. If your proxy server requires authentication, enter the user name and password for authentication with the proxy server and then select **Next**.
-
-8. Select **Next** after you've added the configuration settings:
-
- ![Microsoft Monitoring Agent Setup](media/quick-collect-windows-computer/log-analytics-mma-setup-laworkspace.png)
-
-9. On the **Ready to Install** page, review your choices and then select **Install**.
-
-10. On the **Configuration completed successfully** page, select **Finish**.
-
-When the installation and setup is finished, Microsoft Monitoring Agent appears in Control Panel. You can review your configuration and verify that the agent is connected to the Log Analytics workspace. When connected, on the **Azure Log Analytics** tab, the agent displays this message: **The Microsoft Monitoring Agent has successfully connected to the Microsoft Log Analytics service.**<br><br> ![MMA connection status](media/quick-collect-windows-computer/log-analytics-mma-laworkspace-status.png)
-
-## Collect event and performance data
-
-Azure Monitor can collect events that you specify from the Windows event log and performance counters for longer term analysis and reporting. It can also take action when it detects a particular condition. Follow these steps to configure collection of events from the Windows event log, and several common performance counters to start with.
-
-1. In the lower-left corner of the Azure portal, select **More services**. In the search box, enter **Log Analytics**. As you type, the list filters based on your input. Select **Log Analytics workspaces**.
-
-2. Select **Advanced settings**:
-
- ![Log Analytics advanced settings](media/quick-collect-azurevm/log-analytics-advanced-settings-azure-portal.png)
-
-3. Select **Data**, and then select **Windows Event Logs**.
-
-4. You add an event log by entering the name of the log. Enter **System**, and then select the plus sign (**+**).
-
-5. In the table, select the **Error** and **Warning** severities.
-
-6. Select **Save** at the top of the page.
-
-7. Select **Windows Performance Counters** to enable collection of performance counters on a Windows computer.
-
-8. When you first configure Windows performance counters for a new Log Analytics workspace, you're given the option to quickly create several common counters. Each option is listed, with a check box next to it:
-
- ![Windows performance counters](media/quick-collect-windows-computer/windows-perfcounters-default.png).
-
- Select **Add the selected performance counters**. The counters are added and preset with a ten-second collection sample interval.
-
-9. Select **Save** at the top of the page.
-
-## View collected data
-
-Now that you've enabled data collection, let's run a simple log search to see some data from the target computer.
-
-1. In the selected workspace, from the left-hand pane, select **Logs**.
-
-2. On the Logs query page, type `Perf` in the query editor and select **Run**.
-
- ![Log Analytics log search](media/quick-collect-windows-computer/log-analytics-portal-queryexample.png)
-
-    For example, the query in this image returned 10,000 Performance records. Your results will likely contain far fewer records.
-
- ![Log Analytics log search result](media/quick-collect-windows-computer/log-analytics-search-perf.png)
-
-## Clean up resources
-
-You can remove the agent from your computer and delete the Log Analytics workspace if you no longer need them.
-
-To remove the agent, complete these steps:
-
-1. Open Control Panel.
-
-2. Open **Programs and Features**.
-
-3. In **Programs and Features**, select **Microsoft Monitoring Agent** and then select **Uninstall**.
-
-To delete the Log Analytics workspace you created earlier, select it, and, on the resource page, select **Delete**:
-
-![Delete Log Analytics workspace](media/quick-collect-windows-computer/log-analytics-portal-delete-resource.png)
-
-## Next steps
-
-Now that you're collecting operational and performance data from your Windows computer, you can easily begin exploring, analyzing, and acting on the data you collect, for *free*.
-
-To learn how to view and analyze the data, continue to the tutorial:
-
-> [!div class="nextstepaction"]
-> [View or analyze data in Log Analytics](../logs/log-analytics-tutorial.md)
azure-monitor Quick Monitor Azure Vm https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-monitor/vm/quick-monitor-azure-vm.md
- Title: Monitor an Azure virtual machine with Azure Monitor
-description: Learn how to collect and analyze data for an Azure virtual machine in Azure Monitor.
---- Previously updated : 03/10/2020--
-# Quickstart: Monitor an Azure virtual machine with Azure Monitor
-[Azure Monitor](../overview.md) starts collecting data from Azure virtual machines the moment that they're created. In this quickstart, you'll take a brief walkthrough of the data that's automatically collected for an Azure VM and learn how to view it in the Azure portal. You'll then enable [VM insights](../vm/vminsights-overview.md) for your VM, which enables agents on the VM to collect and analyze data from the guest operating system, including processes and their dependencies.
-
-This quickstart assumes you have an existing Azure virtual machine. If not, you can create a [Windows VM](../../virtual-machines/windows/quick-create-portal.md) or create a [Linux VM](../../virtual-machines/linux/quick-create-cli.md) by following our VM quickstarts.
-
-For more detailed descriptions of the monitoring data collected from Azure resources, see [Monitoring Azure virtual machines with Azure Monitor](./monitor-vm-azure.md).
--
-## Complete the Monitor an Azure resource quickstart
-Complete [Monitor an Azure resource with Azure Monitor](../essentials/quick-monitor-azure-resource.md) to view the overview page, activity log, and metrics for a VM in your subscription. Azure VMs collect the same monitoring data as any other Azure resource, but this is only for the host VM. The rest of this quickstart will focus on monitoring the guest operating system and its workloads.
--
-## Enable VM insights
-While metrics and activity logs will be collected for the host VM, you need an agent and some configuration to collect and analyze monitoring data from the guest operating system and its workloads. VM insights installs these agents and provides additional powerful features for monitoring your virtual machines.
-
-1. Go to the menu for your virtual machine.
-2. Either click **Go to Insights** from the tile in the **Overview** page, or click on **Insights** from the **Monitoring** menu.
-
- ![Overview page](media/quick-monitor-azure-vm/overview-insights.png)
-
-3. If VM insights has not yet been enabled for the virtual machine, click **Enable**.
-
- ![Enable insights](media/quick-monitor-azure-vm/enable-insights.png)
-
-4. If the virtual machine isn't already attached to a Log Analytics workspace, you will be prompted to select an existing workspace or create a new one. Select the default, which is a workspace with a unique name in the same region as your virtual machine.
-
- ![Select workspace](media/quick-monitor-azure-vm/select-workspace.png)
-
-5. Onboarding will take a few minutes as extensions are enabled and agents are installed on your virtual machine. When it's complete, you get a message that insights have been successfully deployed. Click **Azure Monitor** to open VM insights.
-
- ![Open Azure Monitor](media/quick-monitor-azure-vm/azure-monitor.png)
-
-6. You'll see your VM with any other VMs in your subscription that are onboarded. Select the **Not monitored** tab if you want to view virtual machines in your subscription that aren't onboarded.
-
- ![Get started](media/quick-monitor-azure-vm/get-started.png)
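
Behind the scenes, enabling VM insights installs the Log Analytics agent and the Dependency agent as VM extensions. If you prefer to script that part yourself, the following Azure CLI sketch installs both extensions on a Linux VM; all names are placeholders, and the Windows equivalents are the *MicrosoftMonitoringAgent* and *DependencyAgentWindows* extensions.

```
# Install the Log Analytics (OMS) agent extension on a Linux VM
az vm extension set --resource-group <resource-group> --vm-name <vm-name> \
  --publisher Microsoft.EnterpriseCloud.Monitoring --name OmsAgentForLinux \
  --settings '{"workspaceId":"<workspace-id>"}' \
  --protected-settings '{"workspaceKey":"<workspace-key>"}'

# Install the Dependency agent extension used by the Map feature
az vm extension set --resource-group <resource-group> --vm-name <vm-name> \
  --publisher Microsoft.Azure.Monitoring.DependencyAgent --name DependencyAgentLinux
```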
--
-## Configure workspace
-When you create a new Log Analytics workspace, it needs to be configured to collect logs. This configuration needs to be performed only once, because it's sent to every virtual machine that connects to the workspace.
-
-1. Select **Workspace configuration** and then select your workspace.
-
-2. Select **Advanced settings**
-
- ![Log Analytics Advance Settings](../vm/media/quick-collect-azurevm/log-analytics-advanced-settings-azure-portal.png)
-
-### Data collection from Windows VM
--
-2. Select **Data**, and then select **Windows Event Logs**.
-
-3. Add an event log by typing in the name of the log. Type **System** and then select the plus sign **+**.
-
-4. In the table, check the severities **Error** and **Warning**.
-
-5. Select **Save** at the top of the page to save the configuration.
-
-### Data collection from Linux VM
-
-1. Select **Data**, and then select **Syslog**.
-
-2. Add an event log by typing in the name of the log. Type **Syslog** and then select the plus sign **+**.
-
-3. In the table, deselect the severities **Info**, **Notice** and **Debug**.
-
-4. Select **Save** at the top of the page to save the configuration.
-
-## View data collected
-
-1. Click on your virtual machine and then select the **Performance** tab, which is under the **Insights** option of the **Monitoring** menu. It shows a select group of performance counters collected from the guest operating system of your VM. Scroll down to view more counters, and hover over a graph to view average values and percentiles at different times.
-
- ![Screenshot shows the Performance pane.](media/quick-monitor-azure-vm/performance.png)
-
-2. Select **Map** to open the maps feature which shows the processes running on the virtual machine and their dependencies. Select **Properties** to open the property pane if it isn't already open.
-
- ![Screenshot shows the Map pane.](media/quick-monitor-azure-vm/map.png)
-
-3. Expand the processes for your virtual machine. Select one of the processes to view its details and to highlight its dependencies.
-
- ![Screenshot shows the Map pane with processes for a virtual machine expanded.](media/quick-monitor-azure-vm/processes.png)
-
-4. Select your virtual machine again and then select **Log Events**.
-
- ![Log events](media/quick-monitor-azure-vm/log-events.png)
-
-5. You see a list of tables that are stored in the Log Analytics workspace for the virtual machine. This list will differ depending on whether you're using a Windows or Linux virtual machine. Select the **Event** table, which includes all events from the Windows event log. Log Analytics opens with a simple query that retrieves event log entries.
-
- ![Log analytics](media/quick-monitor-azure-vm/log-analytics.png)
-
-## Next steps
-In this quickstart, you enabled VM insights for a virtual machine and configured the Log Analytics workspace to collect events for the guest operating system. To learn how to view and analyze the data, continue to the tutorial.
-
-> [!div class="nextstepaction"]
-> [View or analyze data in Log Analytics](../logs/log-analytics-tutorial.md)
azure-netapp-files Azure Netapp Files Create Volumes Smb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/azure-netapp-files-create-volumes-smb.md
na ms.devlang: na Previously updated : 07/12/2021 Last updated : 08/05/2021 # Create an SMB volume for Azure NetApp Files
Before creating an SMB volume, you need to create an Active Directory connection
- It can contain only letters, numbers, or dashes (`-`). - The length must not exceed 80 characters.
- * If you want to enable encryption for SMB3, select **Enable SMB3 Protocol Encryption**.
+ * <a name="smb3-encryption"></a>If you want to enable encryption for SMB3, select **Enable SMB3 Protocol Encryption**.
This feature enables encryption for in-flight SMB3 data. SMB clients not using SMB3 encryption will not be able to access this volume. Data at rest is encrypted regardless of this setting. See [SMB encryption](azure-netapp-files-smb-performance.md#smb-encryption) for additional information.
Before creating an SMB volume, you need to create an Active Directory connection
``` You can also use [Azure CLI commands](/cli/azure/feature?preserve-view=true&view=azure-cli-latest) `az feature register` and `az feature show` to register the feature and display the registration status.
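
For example, a minimal sketch of those two commands; `<feature-name>` is a placeholder for whichever Azure NetApp Files preview feature you're registering:

```
# Register a preview feature for the Microsoft.NetApp resource provider
az feature register --namespace Microsoft.NetApp --name <feature-name>

# Check the registration state (it can take several minutes to show "Registered")
az feature show --namespace Microsoft.NetApp --name <feature-name> --query properties.state
```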
- * If you want to enable Continuous Availability for the SMB volume, select **Enable Continuous Availability**.
+ * <a name="continuous-availability"></a>If you want to enable Continuous Availability for the SMB volume, select **Enable Continuous Availability**.
> [!IMPORTANT] > The SMB Continuous Availability feature is currently in public preview. You need to submit a waitlist request for accessing the feature through the **[Azure NetApp Files SMB Continuous Availability Shares Public Preview waitlist submission page](https://aka.ms/anfsmbcasharespreviewsignup)**. Wait for an official confirmation email from the Azure NetApp Files team before using the Continuous Availability feature.
azure-netapp-files Cross Region Replication Introduction https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/cross-region-replication-introduction.md
Azure NetApp Files volume replication is supported between various [Azure region
* Australia East and Southeast Asia * Germany West Central and UK South * Germany West Central and West Europe
+* Germany West Central and France Central
## Service-level objectives
azure-netapp-files Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-netapp-files/whats-new.md
na ms.devlang: na Previously updated : 07/12/2021 Last updated : 08/05/2021
Azure NetApp Files is updated regularly. This article provides a summary about t
Users have requested direct control over provisioned capacity. Users want to control and balance storage capacity and utilization. They also want to control cost along with the application-side and client-side visibility of available, used, and provisioned capacity and the performance of their application volumes. With this new behavior, all this capability has now been enabled.
-* [SMB Continuous Availability (CA) shares support for FSLogix user profile containers](azure-netapp-files-create-volumes-smb.md#add-an-smb-volume) (Preview)
+* [SMB Continuous Availability (CA) shares support for FSLogix user profile containers](azure-netapp-files-create-volumes-smb.md#continuous-availability) (Preview)
- [FSLogix](/fslogix/overview) is a set of solutions that enhance, enable, and simplify non-persistent Windows computing environments. FSLogix solutions are appropriate for virtual environments in both public and private clouds. FSLogix solutions can also be used to create more portable computing sessions when you use physical devices. FSLogix can be used to provide dynamic access to persistent user profile containers stored on SMB shared networked storage, including Azure NetApp Files. To further enhance FSLogix resiliency to storage service maintenance events, Azure NetApp Files has extended support for SMB Transparent Failover via [SMB Continuous Availability (CA) shares on Azure NetApp Files](azure-netapp-files-create-volumes-smb.md#add-an-smb-volume) for user profile containers. See Azure NetApp Files [Azure Virtual Desktop solutions](azure-netapp-files-solution-architectures.md#windows-virtual-desktop) for additional information.
+ [FSLogix](/fslogix/overview) is a set of solutions that enhance, enable, and simplify non-persistent Windows computing environments. FSLogix solutions are appropriate for virtual environments in both public and private clouds. FSLogix solutions can also be used to create more portable computing sessions when you use physical devices. FSLogix can be used to provide dynamic access to persistent user profile containers stored on SMB shared networked storage, including Azure NetApp Files. To further enhance FSLogix resiliency to storage service maintenance events, Azure NetApp Files has extended support for SMB Transparent Failover via [SMB Continuous Availability (CA) shares on Azure NetApp Files](azure-netapp-files-create-volumes-smb.md#continuous-availability) for user profile containers. See Azure NetApp Files [Azure Virtual Desktop solutions](azure-netapp-files-solution-architectures.md#windows-virtual-desktop) for additional information.
-* [SMB3 Protocol Encryption](azure-netapp-files-create-volumes-smb.md#add-an-smb-volume) (Preview)
+* [SMB3 Protocol Encryption](azure-netapp-files-create-volumes-smb.md#smb3-encryption) (Preview)
You can now enable SMB3 Protocol Encryption on Azure NetApp Files SMB and dual-protocol volumes. This feature enables encryption for in-flight SMB3 data, using the [AES-CCM algorithm on SMB 3.0, and the AES-GCM algorithm on SMB 3.1.1](/windows-server/storage/file-server/file-server-smb-overview#features-added-in-smb-311-with-windows-server-2016-and-windows-10-version-1607) connections. SMB clients not using SMB3 encryption will not be able to access this volume. Data at rest is encrypted regardless of this setting. SMB encryption further enhances security. However, it might impact the client (CPU overhead for encrypting and decrypting messages). It might also impact storage resource utilization (reductions in throughput). You should test the encryption performance impact against your applications before deploying workloads into production.
azure-percept How To Select Update Package https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-percept/how-to-select-update-package.md
Using the **model** and **swVersion** identified in the previous section, check
|model |swVersion |Update method |Download links |Note | ||||||
-|PE-101 |2020.108.101.105, <br>2020.108.114.120, <br>2020.109.101.122, <br>2020.109.116.120, <br>2021.101.106.118 |**USB only** |[2021.106.111.115 USB update package](https://go.microsoft.com/fwlink/?linkid=2167236) |June release (2106) |
-|PE-101 |2021.102.108.112, <br> |OTA or USB |[2021.106.111.115 OTA manifest (PE-101)](https://go.microsoft.com/fwlink/?linkid=2167127)<br>[2021.106.111.115 OTA update package](https://go.microsoft.com/fwlink/?linkid=2167128)<br>[2021.106.111.115 USB update package](https://go.microsoft.com/fwlink/?linkid=2167236) |June release (2106) |
-|APDK-101 |All swVersions |OTA or USB | [2021.106.111.115 OTA manifest (APDK-101)](https://go.microsoft.com/fwlink/?linkid=2167235)<br>[2021.106.111.115 OTA update package](https://go.microsoft.com/fwlink/?linkid=2167128)<br>[2021.106.111.115 USB update package](https://go.microsoft.com/fwlink/?linkid=2167236) |June release (2106) |
+|PE-101 |All swVersions |**USB only** |[2021.107.129.116 USB update package](https://go.microsoft.com/fwlink/?linkid=2169086) |July release (2107) |
+|APDK-101 |Any swVersion earlier than 2021.106.111.115 |**USB only** |[2021.107.129.116 USB update package](https://go.microsoft.com/fwlink/?linkid=2169086) |July release (2107) |
+|APDK-101 |2021.106.111.115 |OTA or USB |[2021.107.129.116 OTA update package](https://go.microsoft.com/fwlink/?linkid=2169245)<br>[2021.107.129.116 USB update package](https://go.microsoft.com/fwlink/?linkid=2169086) |July release (2107) |
## Next steps
azure-resource-manager Learn Bicep https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/bicep/learn-bicep.md
This path contains the following modules.
| Learn module | Description | | | -- | | [Introduction to infrastructure as code using Bicep](/learn/modules/introduction-to-infrastructure-as-code-using-bicep/) | This module describes the benefits of using infrastructure as code, Azure Resource Manager, and Bicep to quickly and confidently scale your cloud deployments. It helps you determine the types of deployments for which Bicep is a good deployment tool. |
-| [Build your first Bicep template](/learn/modules/deploy-azure-resources-by-using-bicep-templates/) | In this module, you define Azure resources within a Bicep template. You improve the consistency and reliability of your deployments, reduce the manual effort required, and scale your deployments across environments. Your template will be flexible and reusable by using parameters, variables, expressions, and modules. |
+| [Build your first Bicep template](/learn/modules/build-first-bicep-template/) | In this module, you define Azure resources within a Bicep template. You improve the consistency and reliability of your deployments, reduce the manual effort required, and scale your deployments across environments. Your template will be flexible and reusable by using parameters, variables, expressions, and modules. |
| [Build reusable Bicep templates by using parameters](/learn/modules/build-reusable-bicep-templates-parameters/) | This module describes how you can use Bicep parameters to provide information for your template during each deployment. You'll learn about parameter decorators, which make your parameters easy to understand and work with. You'll also learn about the different ways that you can provide parameter values and protect them when you're working with secure information. | | [Build flexible Bicep templates by using conditions and loops](/learn/modules/build-flexible-bicep-templates-conditions-loops/) | Learn how to use conditions to deploy resources only when specific constraints are in place. Also learn how to use loops to deploy multiple resources that have similar properties. | | [Deploy child and extension resources by using Bicep](/learn/modules/child-extension-bicep-templates/) | This module shows how to deploy various Azure resources in your Bicep code. Learn about child and extension resources, and how they can be defined and used within Bicep. Use Bicep to work with resources that you created outside a Bicep template or module. |
In addition to the preceding path, the following modules contain Bicep content.
| [Publish libraries of reusable infrastructure code by using template specs](/learn/modules/arm-template-specs/) | Template specs enable you to reuse and share your ARM templates across your organization. Learn how to create and publish template specs, and how to deploy them. You'll also learn how to manage template specs, including how to control access and how to safely update them by using versions. | | [Preview Azure deployment changes by using what-if](/learn/modules/arm-template-whatif/) | This module teaches you how to preview your changes with the what-if operation. By using what-if, you can make sure your Bicep file only makes changes that you expect. | | [Authenticate your Azure deployment pipeline by using service principals](/learn/modules/authenticate-azure-deployment-pipeline-service-principals/) | Service principals enable your deployment pipelines to authenticate securely with Azure. In this module, you'll learn what service principals are, how they work, and how to create them. You'll also learn how to grant them permission to your Azure resources so that your pipelines can deploy your Bicep files. |
+| [Build your first Bicep deployment pipeline by using Azure Pipelines](/learn/modules/build-first-bicep-deployment-pipeline-using-azure-pipelines/) | Build a basic deployment pipeline for Bicep code. Use a service connection to securely identify your pipeline to Azure. Configure when the pipeline runs by using triggers. |
## Next steps
azure-resource-manager Azure Services Resource Providers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/azure-services-resource-providers.md
Title: Resource providers by Azure services description: Lists all resource provider namespaces for Azure Resource Manager and shows the Azure service for that namespace. Previously updated : 06/14/2021 Last updated : 08/05/2021
The resources providers that are marked with **- registered** are registered by
| Microsoft.DBforPostgreSQL | [Azure Database for PostgreSQL](../../postgresql/index.yml) | | Microsoft.DesktopVirtualization | [Windows Virtual Desktop](../../virtual-desktop/index.yml) | | Microsoft.Devices | [Azure IoT Hub](../../iot-hub/index.yml)<br />[Azure IoT Hub Device Provisioning Service](../../iot-dps/index.yml) |
+| Microsoft.DeviceUpdate | [Device Update for IoT Hub](../../iot-hub-device-update/index.yml)
| Microsoft.DevOps | [Azure DevOps](/azure/devops/) | | Microsoft.DevSpaces | [Azure Dev Spaces](/previous-versions/azure/dev-spaces/) | | Microsoft.DevTestLab | [Azure Lab Services](../../lab-services/index.yml) |
ResourceType : Microsoft.KeyVault/vaults
## Next steps
-For more information about resource providers, including how to register a resource provider, see [Azure resource providers and types](resource-providers-and-types.md).
+For more information about resource providers, including how to register a resource provider, see [Azure resource providers and types](resource-providers-and-types.md).
azure-resource-manager Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-resource-manager/management/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure Resource Manager description: Sample Azure Resource Graph queries for Azure Resource Manager showing use of resource types and tables to access Azure Resource Manager related resources and properties. Previously updated : 07/21/2021 Last updated : 08/04/2021
azure-sql Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-sql/database/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure SQL Database description: Sample Azure Resource Graph queries for Azure SQL Database showing use of resource types and tables to access Azure SQL Database related resources and properties. Previously updated : 07/21/2021 Last updated : 08/04/2021
azure-vmware Deploy Azure Vmware Solution https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/deploy-azure-vmware-solution.md
In the planning phase, you defined whether to use an *existing* or *new* Express
| If | Then | | | |
-| You don't already have a virtual network... | Create the following:<ol><li><a href="tutorial-configure-networking.md#create-a-virtual-network">Virtual network</a></li><li><a href="../expressroute/expressroute-howto-add-gateway-portal-resource-manager.md#create-the-gateway-subnet">GatewaySubnet</a></li><li><a href="tutorial-configure-networking.md#create-a-virtual-network-gateway">Virtual network gateway</a></li><li><a href="tutorial-configure-networking.md#connect-expressroute-to-the-virtual-network-gateway">Connect ExpressRoute to the gateway</a></li></ol> |
+| You don't already have a virtual network... | Create the following:<ol><li><a href="tutorial-configure-networking.md#create-a-vnet-manually">Virtual network</a></li><li><a href="../expressroute/expressroute-howto-add-gateway-portal-resource-manager.md#create-the-gateway-subnet">GatewaySubnet</a></li><li><a href="tutorial-configure-networking.md#create-a-virtual-network-gateway">Virtual network gateway</a></li><li><a href="tutorial-configure-networking.md#connect-expressroute-to-the-virtual-network-gateway">Connect ExpressRoute to the gateway</a></li></ol> |
| You already have a virtual network **without** a GatewaySubnet... | Create the following: <ol><li><a href="../expressroute/expressroute-howto-add-gateway-portal-resource-manager.md#create-the-gateway-subnet">GatewaySubnet</a></li><li><a href="tutorial-configure-networking.md#create-a-virtual-network-gateway">Virtual network gateway</a></li><li><a href="tutorial-configure-networking.md#connect-expressroute-to-the-virtual-network-gateway">Connect ExpressRoute to the gateway</a></li></ol> | | You already have a virtual network **with** a GatewaySubnet... | Create the following: <ol><li><a href="tutorial-configure-networking.md#create-a-virtual-network-gateway">Virtual network gateway</a></li><li><a href="tutorial-configure-networking.md#connect-expressroute-to-the-virtual-network-gateway">Connect ExpressRoute to the gateway</a></li></ol> | - ### Use an existing virtual network gateway [!INCLUDE [connect-expressroute-to-vnet](includes/connect-expressroute-vnet.md)]
azure-vmware Tutorial Configure Networking https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-vmware/tutorial-configure-networking.md
Title: Tutorial - Configure networking for your VMware private cloud in Azure
description: Learn to create and configure the networking needed to deploy your private cloud in Azure Previously updated : 04/23/2021-
-#Customer intent: As a < type of user >, I want < what? > so that < why? >.
Last updated : 07/30/2021
An Azure VMware Solution private cloud requires an Azure Virtual Network. Becaus
[!INCLUDE [disk-pool-planning-note](includes/disk-pool-planning-note.md)] + In this tutorial, you learn how to: > [!div class="checklist"]
-> * Create a virtual network
+> * Create a virtual network
> * Create a virtual network gateway > * Connect your ExpressRoute circuit to the gateway
+>[!NOTE]
+>Before you create a new vNet, evaluate whether you already have an existing vNet in Azure that you plan to use to connect to Azure VMware Solution, or whether you need to create a new vNet entirely.
+>* To use an existing vNet, use the **[Azure vNet connect](#select-an-existing-vnet)** tab under **Connectivity**.
+>* To create a new vNet, use the **[Azure vNet connect](#create-a-new-vnet)** tab or create one [manually](#create-a-vnet-manually).
+
+## Connect with the Azure vNet connect feature
+
+You can use the **Azure vNet connect** feature to connect to Azure VMware Solution by using an existing vNet or by creating a new one.
+
+>[!NOTE]
+>Address space in the vNet cannot overlap with the Azure VMware Solution private cloud CIDR.
++
+### Select an existing vNet
+
+When you select an existing vNet, the Azure Resource Manager (ARM) template that creates the vNet and other resources gets redeployed. The resources in this case are the public IP, gateway, gateway connection, and ExpressRoute authorization key. If everything is set up, the deployment won't change anything. However, if anything is missing, it gets created automatically. For example, if the GatewaySubnet is missing, then it gets added during the deployment.
+
+1. In your Azure VMware Solution private cloud, under **Manage**, select **Connectivity**.
+
+2. Select the **Azure vNet connect** tab and then select the existing vNet.
+
+ :::image type="content" source="media/networking/azure-vnet-connect-tab.png" alt-text="Screenshot showing the Azure vNet connect tab under Connectivity with an existing vNet selected.":::
+
+3. Select **Save**.
+
+    At this point, the deployment validates whether the IP address spaces of Azure VMware Solution and the vNet overlap. If they overlap, change the network address of either the private cloud or the vNet so that they no longer overlap.
+
-## Create a virtual network
+### Create a new vNet
+
+When you create a new vNet, the components required to connect to Azure VMware Solution are created automatically.
+
+1. In your Azure VMware Solution private cloud, under **Manage**, select **Connectivity**.
+
+2. Select the **Azure vNet connect** tab and then select **Create new**.
+
+ :::image type="content" source="media/networking/azure-vnet-connect-tab-create-new.png" alt-text="Screenshot showing the Azure vNet connect tab under Connectivity.":::
+
+3. Provide or update the information for the new vNet and then select **OK**.
+
+    At this point, the deployment validates whether the IP address spaces of Azure VMware Solution and the vNet overlap. If they overlap, change the network address of either the private cloud or the vNet so that they no longer overlap.
+
+ :::image type="content" source="media/networking/create-new-virtual-network.png" alt-text="Screenshot showing the Create virtual network window.":::
+
+The vNet with the provided address range and GatewaySubnet is created in your subscription and resource group.
++
+## Connect to the private cloud manually
+
+### Create a vNet manually
1. Sign in to the [Azure portal](https://portal.azure.com).
In this tutorial, you learn how to:
1. Verify the information and select **Create**. Once the deployment is complete, you'll see your virtual network in the resource group.
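
If you'd rather script the virtual network and its GatewaySubnet instead of using the portal, here's a minimal Azure CLI sketch. The names and address prefixes are illustrative and must not overlap with your private cloud CIDR.

```
# Create the virtual network
az network vnet create --resource-group <resource-group> --name <vnet-name> \
  --address-prefixes 10.2.0.0/16

# Add the GatewaySubnet required by the virtual network gateway
az network vnet subnet create --resource-group <resource-group> --vnet-name <vnet-name> \
  --name GatewaySubnet --address-prefixes 10.2.1.0/24
```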
-## Create a virtual network gateway
++
+### Create a virtual network gateway
Now that you've created a virtual network, you'll create a virtual network gateway.
Now that you've created a virtual network, you'll create a virtual network gatew
1. Verify that the details are correct, and select **Create** to start the deployment of your virtual network gateway. 1. Once the deployment completes, move to the next section to connect your ExpressRoute connection to the virtual network gateway containing your Azure VMware Solution private cloud.
-## Connect ExpressRoute to the virtual network gateway
+### Connect ExpressRoute to the virtual network gateway
Now that you've deployed a virtual network gateway, you'll add a connection between it and your Azure VMware Solution private cloud.
Now that you've deployed a virtual network gateway, you'll add a connection betw
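
The connection can also be scripted. The following Azure CLI sketch assumes you've already created the gateway and copied the ExpressRoute circuit ID and authorization key from your private cloud's **Connectivity** page; all values are placeholders.

```
# Connect the Azure VMware Solution ExpressRoute circuit to the virtual network gateway
az network vpn-connection create --resource-group <resource-group> --name <connection-name> \
  --vnet-gateway1 <gateway-name> \
  --express-route-circuit2 <expressroute-circuit-resource-id> \
  --authorization-key <authorization-key>
```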
In this tutorial, you learned how to: > [!div class="checklist"]
-> * Create a virtual network
-> * Create a virtual network gateway
+> * Create a virtual network by using the Azure vNet connect feature
+> * Create a virtual network manually
+> * Create a virtual network gateway
> * Connect your ExpressRoute circuit to the gateway
azure-web-pubsub Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-web-pubsub/overview.md
Azure Web PubSub service is a bi-directional messaging service that allows diffe
There are many different ways to program with Azure Web PubSub service, as some of the samples listed here: - **Build serverless real-time applications**: Use Azure Functions' integration with Azure Web PubSub service to build serverless real-time applications in languages such as JavaScript, C#, Java and Python.
+- **Use the WebSocket subprotocol to do client-side-only Pub/Sub** - Azure Web PubSub service provides WebSocket subprotocols to empower authorized clients to publish to other clients in a convenient manner.
+- **Use the provided SDKs to manage WebSocket connections in self-hosted app servers** - Azure Web PubSub service provides SDKs in C#, JavaScript, Java, and Python to manage WebSocket connections easily, including broadcasting messages to connections, adding connections to groups, and closing connections.
- **Send messages from server to clients via REST API** - Azure Web PubSub service provides REST API to enable applications to post messages to clients connected, in any REST capable programming languages.
azure-web-pubsub Quickstart Live Demo https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/azure-web-pubsub/quickstart-live-demo.md
Title: Azure Web PubSub service live demo
+ Title: A simple Pub/Sub live demo
description: A quickstart for getting started with Azure Web PubSub service live demo.
Last updated 04/26/2021
# Quickstart: Get started with chatroom live demo
-The Azure Web PubSub service helps you build real-time messaging web applications using WebSockets and the publish-subscribe pattern easily. The [chatroom live demo](https://azure.github.io/azure-webpubsub/demos/clientpubsub.html) demonstrates the real-time messaging capability provided by Azure Web PubSub. With this live demo, you could easily join a chat group and send real-time message to a specific group.
+The Azure Web PubSub service helps you easily build real-time messaging web applications using WebSockets and the publish-subscribe pattern. The [pub/sub live demo](https://azure.github.io/azure-webpubsub/demos/clientpubsub.html) demonstrates the real-time messaging capability provided by Azure Web PubSub. With this live demo, you can easily join a chat group and send real-time messages to a specific group.
:::image type="content" source="media/quickstart-live-demo/chat-live-demo.gif" alt-text="Using the chatroom live demo.":::
With this live demo, you could join or leave a group and send messages to the gr
## Next steps
-In this quickstart, you learned the real-time messaging capability with the chatroom live demo. Now, you could start to build your own application.
+This quickstart gives you a basic idea of the Web PubSub service. In it, we use the *Client URL Generator* to generate a temporary client URL for connecting to the service. In real-world applications, SDKs in various languages are provided for you to generate the client URL from the *Connection String*. Besides using SDKs to talk to the Web PubSub service from your application servers, an Azure Functions extension is also provided for building serverless applications.
+
+Follow the quickstarts listed below to start building your own application.
> [!div class="nextstepaction"] > [Quick start: publish and subscribe messages in Azure Web PubSub](https://azure.github.io/azure-webpubsub/getting-started/publish-messages/js-publish-message)
cognitive-services Whats New https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Anomaly-Detector/whats-new.md
We've also added links to some user-generated content. Those items will be marke
### June 2021
-* Multivariate anomaly detection APIs available in more regions: West US2, West Europe, East US2, South Central US, East US, and UK South.
+* Multivariate anomaly detection APIs available in more regions: West US 2, West Europe, East US 2, South Central US, East US, and UK South.
* Anomaly Detector (univariate) available in Azure cloud for US Government. * Anomaly Detector (univariate) available in Azure China (China North 2).
We've also added links to some user-generated content. Those items will be marke
## Service updates
-[Azure update announcements for Cognitive Services](https://azure.microsoft.com/updates/?product=cognitive-services)
+[Azure update announcements for Cognitive Services](https://azure.microsoft.com/updates/?product=cognitive-services)
cognitive-services Language Support https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/Speech-Service/language-support.md
Custom Voice is available in the neural tier (a.k.a, Custom Neural Voice). Based
| Korean (Korea) | `ko-KR` | Yes | Yes | | Norwegian (Bokmål, Norway) | `nb-NO` | Yes | No | | Portuguese (Brazil) | `pt-BR` | Yes | Yes |
+| Russian (Russia) | `ru-RU` | Yes | Yes |
| Spanish (Mexico) | `es-MX` | Yes | Yes | | Spanish (Spain) | `es-ES` | Yes | Yes |
cognitive-services Text Analytics How To Call Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cognitive-services/text-analytics/how-tos/text-analytics-how-to-call-api.md
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
{ "parameters": { "model-version": "latest",
- "loggingOptOut": "false"
+ "loggingOptOut": false
} } ],
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
{ "parameters": { "model-version": "latest",
- "loggingOptOut": "true",
+ "loggingOptOut": true,
"domain": "phi", "piiCategories":["default"] }
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
{ "parameters": { "model-version": "latest",
- "loggingOptOut": "false"
+ "loggingOptOut": false
} } ],
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
{ "parameters": { "model-version": "latest",
- "loggingOptOut": "false"
+ "loggingOptOut": false
} } ],
The `/analyze` endpoint lets you choose which of the supported Text Analytics fe
{ "parameters": { "model-version": "latest",
- "loggingOptOut": "false",
- "opinionMining": "true"
+ "loggingOptOut": false,
+ "opinionMining": true
} } ]
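To show where these boolean parameters fit, the following is a hedged sketch of submitting one such task configuration to the asynchronous `/analyze` endpoint with Python's `requests`. The endpoint, key, and document text are placeholders, and the path and task names shown assume the v3.1 Text Analytics API:

```python
import requests

# Placeholders: replace with your Text Analytics resource endpoint and key.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-key>"

body = {
    "displayName": "Opinion mining sample",
    "analysisInput": {
        "documents": [
            {"id": "1", "language": "en",
             "text": "The food was great, but the service was slow."}
        ]
    },
    "tasks": {
        "sentimentAnalysisTasks": [
            # Note the JSON booleans (false/true), not the strings "false"/"true".
            {"parameters": {"model-version": "latest",
                            "loggingOptOut": False,
                            "opinionMining": True}}
        ]
    },
}

resp = requests.post(
    f"{endpoint}/text/analytics/v3.1/analyze",
    headers={"Ocp-Apim-Subscription-Key": key},
    json=body,
)
resp.raise_for_status()
# The async /analyze call returns 202; poll the operation-location URL for results.
print(resp.status_code, resp.headers.get("operation-location"))
```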
container-registry Container Registry Access Selected Networks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/container-registry-access-selected-networks.md
Title: Configure public registry access description: Configure IP rules to enable access to an Azure container registry from selected public IP addresses or address ranges. Previously updated : 07/28/2021 Last updated : 07/30/2021 # Configure public IP network rules
If you use Azure Pipelines with an Azure container registry that limits access t
One workaround is to change the agent used to run the pipeline from Microsoft-hosted to self-hosted. With a self-hosted agent running on a [Windows](/azure/devops/pipelines/agents/v2-windows) or [Linux](/azure/devops/pipelines/agents/v2-linux) machine that you manage, you control the outbound IP address of the pipeline, and you can add this address in a registry IP access rule.
+## Access from AKS
+
+If you use Azure Kubernetes Service (AKS) with an Azure container registry that limits access to specific IP addresses, you can't configure a fixed AKS IP address by default. The egress IP address from the AKS cluster is randomly assigned.
+
+To allow the AKS cluster to access the registry, you have these options:
+
+* If you use the Azure Basic Load Balancer, set up a [static IP address](../aks/egress.md) for the AKS cluster.
+* If you use the Azure Standard Load Balancer, see guidance to [control egress traffic](../aks/limit-egress-traffic.md) from the cluster.
+
## Next steps
* To restrict access to a registry using a private endpoint in a virtual network, see [Configure Azure Private Link for an Azure container registry](container-registry-private-link.md).
* If you need to set up registry access rules from behind a client firewall, see [Configure rules to access an Azure container registry behind a firewall](container-registry-firewall-access-rules.md).
+* For more troubleshooting guidance, see [Troubleshoot network issues with registry](container-registry-troubleshoot-access.md).
[az-acr-login]: /cli/azure/acr#az_acr_login [az-acr-network-rule-add]: /cli/azure/acr/network-rule/#az_acr_network_rule_add
container-registry Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/container-registry/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure Container Registry description: Sample Azure Resource Graph queries for Azure Container Registry showing use of resource types and tables to access Azure Container Registry related resources and properties. Previously updated : 07/21/2021 Last updated : 08/04/2021
cosmos-db Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cosmos-db/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure Cosmos DB description: Sample Azure Resource Graph queries for Azure Cosmos DB showing use of resource types and tables to access Azure Cosmos DB related resources and properties. Previously updated : 07/21/2021 Last updated : 08/04/2021
cost-management-billing Tutorial Export Acm Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/costs/tutorial-export-acm-data.md
Title: Tutorial - Create and manage exported data from Azure Cost Management
description: This article shows you how you can create and manage exported Azure Cost Management data so that you can use it in external systems. Previously updated : 07/26/2021 Last updated : 08/05/2021
Exports for management groups of other subscription types aren't supported.
### File partitioning for large datasets
-If you have a Microsoft Customer Agreement or a Microsoft Partner Agreement, you can enable Exports to chunk your file into multiple smaller file partitions to help with data ingestion. When you initially configure your export, set the **File Partitioning** setting to **On**. The setting is **Off** by default.
+If you have a Microsoft Customer Agreement, Microsoft Partner Agreement, or Enterprise Agreement, you can enable Exports to chunk your file into multiple smaller file partitions to help with data ingestion. When you initially configure your export, set the **File Partitioning** setting to **On**. The setting is **Off** by default.
:::image type="content" source="./media/tutorial-export-acm-data/file-partition.png" alt-text="Screenshot showing File Partitioning option." lightbox="./media/tutorial-export-acm-data/file-partition.png" :::
-If you don't have a Microsoft Customer Agreement or a Microsoft Partner Agreement, then you won't see the **File Partitioning** option.
+If you don't have a Microsoft Customer Agreement, Microsoft Partner Agreement, or Enterprise Agreement, then you won't see the **File Partitioning** option.
#### Update existing exports to use file partitioning
cost-management-billing Ea Transfers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/manage/ea-transfers.md
Other points to keep in mind before an account transfer:
- After the transfer is complete, the transferred account appears inactive under the source enrollment and appears active under the target enrollment. - The account shows the end date corresponding to the effective transfer date on the source enrollment and as a start date on the target enrollment. - Any usage occurred for the account before the effective transfer date remains under the source enrollment.-- There's no downtime during an enrollment transfer.-- Usage may take up to 24 - 48 hours to be reflected in the target enrollment.-- Cost view settings for Department Administrators or Account Owners don't carry over.
- - If previously enabled, settings must be enabled for the target enrollment.
-- Any API keys used in the source enrollment must be regenerated for the target enrollment.-- If the source and destination enrollments are on different cloud instances, the transfer will fail. Azure Support can transfer only within the same cloud instance.-- For reservations (reserved instances):
- - The enrollment or account transfer between different currencies affects monthly reservation purchases.
- - Whenever there's is a currency change during or after an enrollment transfer, reservations paid for monthly are canceled for the source enrollment. This is intentional and affects only the monthly reservation purchases.
- - You may have to repurchase the canceled monthly reservations from the source enrollment using the new enrollment in the local or new currency.
## Transfer enterprise enrollment to a new one
Other points to keep in mind before an enrollment transfer:
- Approval from both target and source enrollment EA Administrators is required. - If an enrollment transfer doesn't meet your requirements, consider an account transfer. - The source enrollment status will be updated to transferred and will only be available for historic usage reporting purposes.
+- There's no downtime during an enrollment transfer.
+- Usage may take up to 24 - 48 hours to be reflected in the target enrollment.
+- Cost view settings for Department Administrators or Account Owners don't carry over.
+ - If previously enabled, settings must be enabled for the target enrollment.
+- Any API keys used in the source enrollment must be regenerated for the target enrollment.
+- If the source and destination enrollments are on different cloud instances, the transfer will fail. Azure Support can transfer only within the same cloud instance.
+- For reservations (reserved instances):
+ - The enrollment or account transfer between different currencies affects monthly reservation purchases.
+ - Whenever there's a currency change during or after an enrollment transfer, reservations that are paid for monthly are canceled for the source enrollment. This is intentional and affects only the monthly reservation purchases.
+ - You may have to repurchase the canceled monthly reservations from the source enrollment using the new enrollment in the local or new currency.
+ ### Auto enrollment transfer
cost-management-billing Reservation Renew https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/cost-management-billing/reservations/reservation-renew.md
Previously updated : 07/24/2020 Last updated : 08/05/2020
You'll receive an email notification if any of the preceding conditions occur an
## Renewal notification
+Renewal notification emails are sent 30 days before expiration and again on the expiration date. The sending email address is `azure-noreply@microsoft.com`. You might want to add the email address to your safe senders or allow list.
+ Emails are sent to different people depending on your purchase method: - EA customers - Emails are sent to the notification contacts set on the EA portal.
data-factory Author Visually https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/author-visually.md
Title: Visual authoring+ description: Learn how to use visual authoring in Azure Data Factory +
data-factory Ci Cd Github Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/ci-cd-github-troubleshoot-guide.md
Title: Troubleshoot CI-CD, Azure DevOps, and GitHub issues in ADF+ description: Use different methods to troubleshoot CI-CD issues in ADF. + Last updated 06/27/2021
data-factory Compute Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/compute-linked-services.md
Title: Compute environments supported by Azure Data Factory + description: Compute environments that can be used with Azure Data Factory pipelines (such as Azure HDInsight) to transform or process data. Last updated 05/08/2019 -+ # Compute environments supported by Azure Data Factory
data-factory Concepts Data Flow Column Pattern https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-column-pattern.md
Title: Column patterns in Azure Data Factory mapping data flow+ description: Create generalized data transformation patterns using column patterns in Azure Data Factory mapping data flows + Last updated 05/21/2021
data-factory Concepts Data Flow Debug Mode https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-debug-mode.md
Title: Mapping data flow Debug Mode+ description: Start an interactive debug session when building data flows -+ Last updated 04/16/2021
data-factory Concepts Data Flow Expression Builder https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-expression-builder.md
Title: Expression builder in mapping data flow+ description: Build expressions by using Expression Builder in mapping data flows in Azure Data Factory + Last updated 04/29/2021
data-factory Concepts Data Flow Manage Graph https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-manage-graph.md
Title: Managing the mapping data flow graph+ description: How to effectively manage and edit the mapping data flow graph -+ Last updated 09/02/2020
data-factory Concepts Data Flow Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-monitoring.md
Title: Monitoring mapping data flows+ description: How to visually monitor mapping data flows in Azure Data Factory -+ Last updated 06/18/2021
data-factory Concepts Data Flow Performance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-performance.md
Title: Mapping data flow performance and tuning guide+ description: Learn about key factors that affect the performance of mapping data flows in Azure Data Factory. -+ Last updated 06/07/2021
data-factory Concepts Data Flow Schema Drift https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-data-flow-schema-drift.md
Title: Schema drift in mapping data flow+ description: Build resilient Data Flows in Azure Data Factory with Schema Drift -+ Last updated 04/15/2020
data-factory Concepts Datasets Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-datasets-linked-services.md
Title: Datasets+ description: 'Learn about datasets in Data Factory. Datasets represent input/output data.' -+ Last updated 08/24/2020
data-factory Concepts Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-integration-runtime.md
Title: Integration runtime+ description: 'Learn about integration runtime in Azure Data Factory.' -+ Last updated 06/16/2021
data-factory Concepts Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-linked-services.md
Title: Linked services in Azure Data Factory + description: 'Learn about linked services in Data Factory. Linked services link compute/data stores to data factory.' + Last updated 08/21/2020
data-factory Concepts Pipeline Execution Triggers https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-pipeline-execution-triggers.md
Title: Pipeline execution and triggers in Azure Data Factory + description: This article provides information about how to execute a pipeline in Azure Data Factory, either on-demand or by creating a trigger.
Last updated 07/05/2018 -+ # Pipeline execution and triggers in Azure Data Factory
data-factory Concepts Pipelines Activities https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/concepts-pipelines-activities.md
Title: Pipelines and activities in Azure Data Factory + description: 'Learn about pipelines and activities in Azure Data Factory.' + Last updated 06/19/2021
data-factory Connector Amazon Marketplace Web Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-amazon-marketplace-web-service.md
Title: Copy data from AWS Marketplace+ description: Learn how to copy data from Amazon Marketplace Web Service to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 08/01/2018
data-factory Connector Amazon Redshift https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-amazon-redshift.md
Title: Copy data from Amazon Redshift+ description: Learn about how to copy data from Amazon Redshift to supported sink data stores by using Azure Data Factory. + Last updated 12/09/2020
data-factory Connector Amazon S3 Compatible Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-amazon-s3-compatible-storage.md
Title: Copy data from Amazon Simple Storage Service (S3) Compatible Storage+ description: Learn about how to copy data from Amazon S3 Compatible Storage to supported sink data stores by using Azure Data Factory. -+ Last updated 05/11/2021
data-factory Connector Amazon Simple Storage Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-amazon-simple-storage-service.md
Title: Copy data from Amazon Simple Storage Service (S3)+ description: Learn about how to copy data from Amazon Simple Storage Service (S3) to supported sink data stores by using Azure Data Factory. -+ Last updated 03/17/2021
data-factory Connector Azure Blob Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-blob-storage.md
Title: Copy and transform data in Azure Blob storage+ description: Learn how to copy data to and from Blob storage, and transform data in Blob storage by using Data Factory. -+ Last updated 07/19/2021
data-factory Connector Azure Cosmos Db Mongodb Api https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-cosmos-db-mongodb-api.md
Title: Copy data from Azure Cosmos DB's API for MongoDB+ description: Learn how to copy data from supported source data stores to or from Azure Cosmos DB's API for MongoDB to supported sink stores by using Data Factory. -+ Last updated 11/20/2019
data-factory Connector Azure Cosmos Db https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-cosmos-db.md
Title: Copy and transform data in Azure Cosmos DB (SQL API)+ description: Learn how to copy data to and from Azure Cosmos DB (SQL API), and transform data in Azure Cosmos DB (SQL API) by using Data Factory. -+ Last updated 05/18/2021
data-factory Connector Azure Data Explorer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-data-explorer.md
Title: Copy data to or from Azure Data Explorer+ description: Learn how to copy data to or from Azure Data Explorer by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 07/19/2020
data-factory Connector Azure Data Lake Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-data-lake-storage.md
Title: Copy and transform data in Azure Data Lake Storage Gen2+ description: Learn how to copy data to and from Azure Data Lake Storage Gen2, and transform data in Azure Data Lake Storage Gen2 by using Azure Data Factory. -+ Last updated 07/19/2021
data-factory Connector Azure Data Lake Store https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-data-lake-store.md
Title: Copy data to or from Azure Data Lake Storage Gen1+ description: Learn how to copy data from supported source data stores to Azure Data Lake Store, or from Data Lake Store to supported sink stores, by using Data Factory. -+ Last updated 07/19/2021
data-factory Connector Azure Database For Mariadb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-database-for-mariadb.md
Title: Copy data from Azure Database for MariaDB+ description: Learn how to copy data from Azure Database for MariaDB to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 09/04/2019
data-factory Connector Azure Database For Mysql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-database-for-mysql.md
Title: Copy and transform data in Azure Database for MySQL+ description: Learn how to copy and transform data in Azure Database for MySQL by using Azure Data Factory. -+ Last updated 03/10/2021
data-factory Connector Azure Database For Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-database-for-postgresql.md
Title: Copy and transform data in Azure Database for PostgreSQL+ description: Learn how to copy and transform data in Azure Database for PostgreSQL by using Azure Data Factory. -+ Last updated 06/16/2021
data-factory Connector Azure Databricks Delta Lake https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-databricks-delta-lake.md
Title: Copy data to and from Azure Databricks Delta Lake+ description: Learn how to copy data to and from Azure Databricks Delta Lake by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 06/16/2021
data-factory Connector Azure File Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-file-storage.md
Title: Copy data from/to Azure File Storage+ description: Learn how to copy data from Azure File Storage to supported sink data stores (or) from supported source data stores to Azure File Storage by using Azure Data Factory. -+ Last updated 03/17/2021
data-factory Connector Azure Search https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-search.md
Title: Copy data to Search index+ description: Learn about how to push or copy data to an Azure search index by using the Copy Activity in an Azure Data Factory pipeline. -+ Last updated 03/17/2021
data-factory Connector Azure Sql Data Warehouse https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-sql-data-warehouse.md
Title: Copy and transform data in Azure Synapse Analytics+ description: Learn how to copy data to and from Azure Synapse Analytics, and transform data in Azure Synapse Analytics by using Data Factory. + Last updated 05/10/2021
data-factory Connector Azure Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-sql-database.md
Title: Copy and transform data in Azure SQL Database+ description: Learn how to copy data to and from Azure SQL Database, and transform data in Azure SQL Database by using Azure Data Factory. -+ Last updated 06/15/2021
data-factory Connector Azure Sql Managed Instance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-sql-managed-instance.md
Title: Copy and transform data in Azure SQL Managed Instance+ description: Learn how to copy and transform data in Azure SQL Managed Instance by using Azure Data Factory. -+ Last updated 06/15/2021
data-factory Connector Azure Table Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-azure-table-storage.md
Title: Copy data to and from Azure Table storage+ description: Learn how to copy data from supported source stores to Azure Table storage, or from Table storage to supported sink stores, by using Data Factory. -+ Last updated 03/17/2021
data-factory Connector Cassandra https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-cassandra.md
Title: Copy data from Cassandra using Azure Data Factory + description: Learn how to copy data from Cassandra to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 08/12/2019
data-factory Connector Concur https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-concur.md
Title: Copy data from Concur using Azure Data Factory (Preview) + description: Learn how to copy data from Concur to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 11/25/2020
data-factory Connector Couchbase https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-couchbase.md
Title: Copy data from Couchbase using Azure Data Factory (Preview) + description: Learn how to copy data from Couchbase to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 08/12/2019
data-factory Connector Db2 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-db2.md
Title: Copy data from DB2 using Azure Data Factory + description: Learn how to copy data from DB2 to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 05/26/2020
data-factory Connector Drill https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-drill.md
Title: Copy data from Drill using Azure Data Factory + description: Learn how to copy data from Drill to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 10/25/2019
data-factory Connector Dynamics Ax https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-dynamics-ax.md
Title: Copy data from Dynamics AX+ description: Learn how to copy data from Dynamics AX to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 06/12/2020
data-factory Connector Dynamics Crm Office 365 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-dynamics-crm-office-365.md
Title: Copy data in Dynamics (Microsoft Dataverse)+ description: Learn how to copy data from Microsoft Dynamics CRM or Microsoft Dynamics 365 (Microsoft Dataverse) to supported sink data stores or from supported source data stores to Dynamics CRM or Dynamics 365 by using a copy activity in a data factory pipeline. -+ Last updated 03/17/2021 # Copy data from and to Dynamics 365 (Microsoft Dataverse) or Dynamics CRM by using Azure Data Factory
data-factory Connector File System https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-file-system.md
Title: Copy data from/to a file system by using Azure Data Factory + description: Learn how to copy data from file system to supported sink data stores (or) from supported source data stores to file system by using Azure Data Factory. + Last updated 03/29/2021
data-factory Connector Ftp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-ftp.md
Title: Copy data from an FTP server by using Azure Data Factory + description: Learn how to copy data from an FTP server to a supported sink data store by using a copy activity in an Azure Data Factory pipeline. + Last updated 03/17/2021
data-factory Connector Github https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-github.md
Title: Connect to GitHub+ description: Use GitHub to specify your Common Data Model entity references + Last updated 06/03/2020
data-factory Connector Google Adwords https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-google-adwords.md
Title: Copy data from Google AdWords+ description: Learn how to copy data from Google AdWords to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 10/25/2019
data-factory Connector Google Bigquery https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-google-bigquery.md
Title: Copy data from Google BigQuery by using Azure Data Factory + description: Learn how to copy data from Google BigQuery to supported sink data stores by using a copy activity in a data factory pipeline. -+ Last updated 09/04/2019
data-factory Connector Google Cloud Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-google-cloud-storage.md
Title: Copy data from Google Cloud Storage by using Azure Data Factory + description: Learn about how to copy data from Google Cloud Storage to supported sink data stores by using Azure Data Factory. + Last updated 03/17/2021
data-factory Connector Greenplum https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-greenplum.md
Title: Copy data from Greenplum using Azure Data Factory + description: Learn how to copy data from Greenplum to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 09/04/2019
data-factory Connector Hbase https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-hbase.md
Title: Copy data from HBase using Azure Data Factory + description: Learn how to copy data from HBase to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 08/12/2019
data-factory Connector Hdfs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-hdfs.md
Title: Copy data from HDFS by using Azure Data Factory + description: Learn how to copy data from a cloud or on-premises HDFS source to supported sink data stores by using Copy activity in an Azure Data Factory pipeline. + Last updated 03/17/2021
data-factory Connector Hive https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-hive.md
Title: Copy data from Hive using Azure Data Factory + description: Learn how to copy data from Hive to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 11/17/2020
data-factory Connector Http https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-http.md
Title: Copy data from an HTTP source by using Azure Data Factory + description: Learn how to copy data from a cloud or on-premises HTTP source to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 03/17/2021
data-factory Connector Hubspot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-hubspot.md
Title: Copy data from HubSpot using Azure Data Factory + description: Learn how to copy data from HubSpot to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 12/18/2020
data-factory Connector Impala https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-impala.md
Title: Copy data from Impala by using Azure Data Factory + description: Learn how to copy data from Impala to supported sink data stores by using a copy activity in a data factory pipeline. + Last updated 09/04/2019
data-factory Connector Informix https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-informix.md
Title: Copy data from and to IBM Informix using Azure Data Factory + description: Learn how to copy data from and to IBM Informix by using a copy activity in an Azure Data Factory pipeline. + Last updated 03/17/2021
data-factory Connector Jira https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-jira.md
Title: Copy data from Jira using Azure Data Factory + description: Learn how to copy data from Jira to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 10/25/2019
data-factory Connector Magento https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-magento.md
Title: Copy data from Magento using Azure Data Factory (Preview) + description: Learn how to copy data from Magento to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 08/01/2019
data-factory Connector Mariadb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mariadb.md
Title: Copy data from MariaDB using Azure Data Factory + description: Learn how to copy data from MariaDB to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 08/12/2019
data-factory Connector Marketo https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-marketo.md
Title: Copy data from Marketo using Azure Data Factory (Preview) + description: Learn how to copy data from Marketo to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 06/04/2020
data-factory Connector Microsoft Access https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-microsoft-access.md
Title: Copy data from and to Microsoft Access+ description: Learn how to copy data from and to Microsoft Access by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 03/17/2021
data-factory Connector Mongodb Atlas https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mongodb-atlas.md
Title: Copy data from or to MongoDB Atlas+ description: Learn how to copy data from MongoDB Atlas to supported sink data stores, or from supported source data stores to MongoDB Atlas, by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 06/01/2021
data-factory Connector Mongodb Legacy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mongodb-legacy.md
Title: Copy data from MongoDB using legacy+ description: Learn how to copy data from Mongo DB to supported sink data stores by using a copy activity in a legacy Azure Data Factory pipeline. -+ Last updated 08/12/2019
data-factory Connector Mongodb https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mongodb.md
Title: Copy data from or to MongoDB+ description: Learn how to copy data from MongoDB to supported sink data stores, or from supported source data stores to MongoDB, by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 06/01/2021
data-factory Connector Mysql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-mysql.md
Title: Copy data from MySQL using Azure Data Factory + description: Learn about MySQL connector in Azure Data Factory that lets you copy data from a MySQL database to a data store supported as a sink. + Last updated 09/09/2020
data-factory Connector Netezza https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-netezza.md
Title: Copy data from Netezza by using Azure Data Factory + description: Learn how to copy data from Netezza to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 05/28/2020
data-factory Connector Odata https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-odata.md
Title: Copy data from OData sources by using Azure Data Factory + description: Learn how to copy data from OData sources to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 03/30/2021
data-factory Connector Odbc https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-odbc.md
Title: Copy data from and to ODBC data stores using Azure Data Factory + description: Learn how to copy data from and to ODBC data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 05/10/2021
data-factory Connector Office 365 https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-office-365.md
Title: Copy data from Office 365 using Azure Data Factory + description: Learn how to copy data from Office 365 to supported sink data stores by using copy activity in an Azure Data Factory pipeline. + Last updated 10/20/2019
data-factory Connector Oracle Cloud Storage https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle-cloud-storage.md
Title: Copy data from Oracle Cloud Storage by using Azure Data Factory + description: Learn about how to copy data from Oracle Cloud Storage to supported sink data stores by using Azure Data Factory. + Last updated 05/11/2021
data-factory Connector Oracle Eloqua https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle-eloqua.md
Title: Copy data from Oracle Eloqua (Preview)+ description: Learn how to copy data from Oracle Eloqua to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 08/01/2019
data-factory Connector Oracle Responsys https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle-responsys.md
Title: Copy data from Oracle Responsys (Preview)+ description: Learn how to copy data from Oracle Responsys to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 08/01/2019
data-factory Connector Oracle Service Cloud https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle-service-cloud.md
Title: Copy data from Oracle Service Cloud (Preview)+ description: Learn how to copy data from Oracle Service Cloud to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 08/01/2019
data-factory Connector Oracle https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-oracle.md
Title: Copy data to and from Oracle by using Azure Data Factory + description: Learn how to copy data from supported source stores to an Oracle database, or from Oracle to supported sink stores, by using Data Factory. + Last updated 03/17/2021
data-factory Connector Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-overview.md
Title: Azure Data Factory connector overview + description: Learn the supported connectors in Data Factory. + Last updated 05/26/2021
data-factory Connector Paypal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-paypal.md
Title: Copy data from PayPal using Azure Data Factory (Preview) + description: Learn how to copy data from PayPal to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 08/01/2019
data-factory Connector Phoenix https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-phoenix.md
Title: Copy data from Phoenix using Azure Data Factory + description: Learn how to copy data from Phoenix to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 09/04/2019
data-factory Connector Postgresql https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-postgresql.md
Title: Copy data From PostgreSQL using Azure Data Factory + description: Learn how to copy data from PostgreSQL to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 02/19/2020
data-factory Connector Presto https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-presto.md
Title: Copy data from Presto using Azure Data Factory+ description: Learn how to copy data from Presto to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 12/18/2020
data-factory Connector Quickbooks https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-quickbooks.md
Title: Copy data from QuickBooks Online using Azure Data Factory (Preview) + description: Learn how to copy data from QuickBooks Online to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 01/15/2021
data-factory Connector Rest https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-rest.md
Title: Copy data from and to a REST endpoint by using Azure Data Factory + description: Learn how to copy data from a cloud or on-premises REST source to supported sink data stores, or from supported source data store to a REST sink by using a copy activity in an Azure Data Factory pipeline. + Last updated 07/27/2021
data-factory Connector Salesforce Marketing Cloud https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-salesforce-marketing-cloud.md
Title: Copy data from Salesforce Marketing Cloud+ description: Learn how to copy data from Salesforce Marketing Cloud to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 07/17/2020
data-factory Connector Salesforce Service Cloud https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-salesforce-service-cloud.md
Title: Copy data from and to Salesforce Service Cloud+ description: Learn how to copy data from Salesforce Service Cloud to supported sink data stores or from supported source data stores to Salesforce Service Cloud by using a copy activity in a data factory pipeline. -+ Last updated 03/17/2021
data-factory Connector Salesforce https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-salesforce.md
Title: Copy data from and to Salesforce+ description: Learn how to copy data from Salesforce to supported sink data stores or from supported source data stores to Salesforce by using a copy activity in a data factory pipeline. -+ Last updated 03/17/2021
data-factory Connector Sap Business Warehouse Open Hub https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-business-warehouse-open-hub.md
Title: Copy data from SAP Business Warehouse via Open Hub+ description: Learn how to copy data from SAP Business Warehouse (BW) via Open Hub to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 07/30/2021
data-factory Connector Sap Business Warehouse https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-business-warehouse.md
Title: Copy data from SAP BW+ description: Learn how to copy data from SAP Business Warehouse to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 09/04/2019
data-factory Connector Sap Cloud For Customer https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-cloud-for-customer.md
Title: Copy data from/to SAP Cloud for Customer+ description: Learn how to copy data from SAP Cloud for Customer to supported sink data stores (or) from supported source data stores to SAP Cloud for Customer by using Data Factory. -+ Last updated 03/17/2021
data-factory Connector Sap Ecc https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-ecc.md
Title: Copy data from SAP ECC+ description: Learn how to copy data from SAP ECC to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 10/28/2020
data-factory Connector Sap Hana https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-hana.md
Title: Copy data from SAP HANA+ description: Learn how to copy data from SAP HANA to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 04/22/2020
data-factory Connector Sap Table https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sap-table.md
Title: Copy data from an SAP table+ description: Learn how to copy data from an SAP table to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 07/30/2021
data-factory Connector Servicenow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-servicenow.md
Title: Copy data from ServiceNow+ description: Learn how to copy data from ServiceNow to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 08/01/2019
data-factory Connector Sftp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sftp.md
Title: Copy data from and to SFTP server+ description: Learn how to copy data from and to SFTP server by using Azure Data Factory. -+ Last updated 03/17/2021
data-factory Connector Sharepoint Online List https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sharepoint-online-list.md
Title: Copy data from SharePoint Online List by using Azure Data Factory + description: Learn how to copy data from SharePoint Online List to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 05/19/2020
data-factory Connector Shopify https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-shopify.md
Title: Copy data from Shopify (Preview) + description: Learn how to copy data from Shopify to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 08/01/2019
data-factory Connector Snowflake https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-snowflake.md
Title: Copy and transform data in Snowflake+ description: Learn how to copy and transform data in Snowflake by using Data Factory. -+ Last updated 03/16/2021
data-factory Connector Spark https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-spark.md
Title: Copy data from Spark+ description: Learn how to copy data from Spark to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 09/04/2019
data-factory Connector Sql Server https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sql-server.md
Title: Copy and transform data to and from SQL Server+ description: Learn about how to copy and transform data to and from SQL Server database that is on-premises or in an Azure VM by using Azure Data Factory. -+ Last updated 06/08/2021
data-factory Connector Square https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-square.md
Title: Copy data from Square (Preview) + description: Learn how to copy data from Square to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. -+ Last updated 08/03/2020
data-factory Connector Sybase https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-sybase.md
Title: Copy data from Sybase using Azure Data Factory + description: Learn how to copy data from Sybase to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 06/10/2020
data-factory Connector Teradata https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-teradata.md
Title: Copy data from Teradata Vantage by using Azure Data Factory + description: The Teradata Connector of the Data Factory service lets you copy data from a Teradata Vantage to data stores supported by Data Factory as sinks. + Last updated 01/22/2021
data-factory Connector Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-troubleshoot-guide.md
Title: Troubleshoot Azure Data Factory connectors+ description: Learn how to troubleshoot connector issues in Azure Data Factory. Last updated 07/30/2021 -+ # Troubleshoot Azure Data Factory connectors
data-factory Connector Vertica https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-vertica.md
Title: Copy data from Vertica using Azure Data Factory + description: Learn how to copy data from Vertica to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 09/04/2019
data-factory Connector Web Table https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-web-table.md
Title: Copy data from Web Table using Azure Data Factory + description: Learn about Web Table Connector of Azure Data Factory that lets you copy data from a web table to data stores supported by Data Factory as sinks. + Last updated 08/01/2019
data-factory Connector Xero https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-xero.md
Title: Copy data from Xero using Azure Data Factory + description: Learn how to copy data from Xero to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 01/26/2021
data-factory Connector Zoho https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/connector-zoho.md
Title: Copy data from Zoho using Azure Data Factory (Preview) + description: Learn how to copy data from Zoho to supported sink data stores by using a copy activity in an Azure Data Factory pipeline. + Last updated 08/03/2020
data-factory Control Flow Append Variable Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-append-variable-activity.md
Title: Append Variable Activity in Azure Data Factory + description: Learn how to set the Append Variable activity to add a value to an existing array variable defined in a Data Factory pipeline +
data-factory Control Flow Azure Function Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-azure-function-activity.md
Title: Azure Function Activity in Azure Data Factory + description: Learn how to use the Azure Function activity to run an Azure Function in a Data Factory pipeline + Last updated 07/30/2021
data-factory Control Flow Execute Data Flow Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-execute-data-flow-activity.md
Title: Data Flow activity+ description: How to execute data flows from inside a data factory pipeline. + Last updated 05/20/2021
data-factory Control Flow Execute Pipeline Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-execute-pipeline-activity.md
Title: Execute Pipeline Activity in Azure Data Factory + description: Learn how you can use the Execute Pipeline Activity to invoke one Data Factory pipeline from another Data Factory pipeline. + Last updated 01/10/2018
data-factory Control Flow Expression Language Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-expression-language-functions.md
Title: Expression and functions in Azure Data Factory + description: This article provides information about expressions and functions that you can use in creating data factory entities. + Last updated 07/16/2021
data-factory Control Flow Filter Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-filter-activity.md
Title: Filter activity in Azure Data Factory + description: The Filter activity filters the inputs. + Last updated 05/04/2018
data-factory Control Flow For Each Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-for-each-activity.md
Title: ForEach activity in Azure Data Factory + description: The For Each Activity defines a repeating control flow in your pipeline. It is used for iterating over a collection and execute specified activities. + Last updated 01/23/2019
data-factory Control Flow Get Metadata Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-get-metadata-activity.md
Title: Get Metadata activity in Azure Data Factory + description: Learn how to use the Get Metadata activity in a Data Factory pipeline. + Last updated 02/25/2021
data-factory Control Flow If Condition Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-if-condition-activity.md
Title: If Condition activity in Azure Data Factory + description: The If Condition activity allows you to control the processing flow based on a condition.
Last updated 01/10/2018 -+ # If Condition activity in Azure Data Factory
data-factory Control Flow Lookup Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-lookup-activity.md
Title: Lookup activity in Azure Data Factory + description: Learn how to use Lookup activity to look up a value from an external source. This output can be further referenced by succeeding activities. + Last updated 02/25/2021
data-factory Control Flow Set Variable Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-set-variable-activity.md
Title: Set Variable Activity in Azure Data Factory + description: Learn how to use the Set Variable activity to set the value of an existing variable defined in a Data Factory pipeline + Last updated 04/07/2020
data-factory Control Flow System Variables https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-system-variables.md
Title: System variables in Azure Data Factory + description: This article describes system variables supported by Azure Data Factory. You can use these variables in expressions when defining Data Factory entities. + Last updated 06/12/2018
data-factory Control Flow Until Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-until-activity.md
Title: Until activity in Azure Data Factory + description: The Until activity executes a set of activities in a loop until the condition associated with the activity evaluates to true or it times out.
Last updated 01/10/2018 -+ # Until activity in Azure Data Factory
data-factory Control Flow Validation Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-validation-activity.md
Title: Validation activity in Azure Data Factory + description: The Validation activity does not continue execution of the pipeline until it validates the attached dataset with certain criteria the user specifies. + Last updated 03/25/2019
data-factory Control Flow Wait Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-wait-activity.md
Title: Wait activity in Azure Data Factory + description: The Wait activity pauses the execution of the pipeline for the specified period. + Last updated 01/12/2018
data-factory Control Flow Web Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-web-activity.md
Title: Web Activity in Azure Data Factory + description: Learn how you can use Web Activity, one of the control flow activities supported by Data Factory, to invoke a REST endpoint from a pipeline. + Last updated 12/19/2018
data-factory Control Flow Webhook Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/control-flow-webhook-activity.md
Title: Webhook activity in Azure Data Factory + description: The webhook activity doesn't continue execution of the pipeline until it validates the attached dataset with certain criteria the user specifies. + Last updated 03/25/2019
data-factory Copy Activity Data Consistency https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-data-consistency.md
Title: Data consistency verification in copy activity + description: 'Learn about how to enable data consistency verification in copy activity in Azure Data Factory.' + Last updated 3/27/2020
data-factory Copy Activity Fault Tolerance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-fault-tolerance.md
Title: Fault tolerance of copy activity in Azure Data Factory + description: 'Learn about how to add fault tolerance to copy activity in Azure Data Factory by skipping the incompatible data.' + Last updated 06/22/2020
data-factory Copy Activity Monitoring https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-monitoring.md
Title: Monitor copy activity+ description: Learn about how to monitor the copy activity execution in Azure Data Factory. + Last updated 03/22/2021
data-factory Copy Activity Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-overview.md
Title: Copy activity in Azure Data Factory + description: Learn about the Copy activity in Azure Data Factory. You can use it to copy data from a supported source data store to a supported sink data store. + Last updated 6/1/2021
data-factory Copy Activity Performance Features https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-performance-features.md
Title: Copy activity performance optimization features+ description: Learn about the key features that help you optimize the copy activity performance in Azure Data Factory. -+ Last updated 09/24/2020
data-factory Copy Activity Performance Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-performance-troubleshooting.md
Title: Troubleshoot copy activity performance+ description: Learn about how to troubleshoot copy activity performance in Azure Data Factory. -+ Last updated 01/07/2021
data-factory Copy Activity Performance https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-performance.md
Title: Copy activity performance and scalability guide+ description: Learn about key factors that affect the performance of data movement in Azure Data Factory when you use the copy activity. documentationcenter: ''
-+ Last updated 09/15/2020 # Copy activity performance and scalability guide
data-factory Copy Activity Preserve Metadata https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-preserve-metadata.md
Title: Preserve metadata and ACLs using copy activity in Azure Data Factory + description: 'Learn about how to preserve metadata and ACLs during copy using copy activity in Azure Data Factory.' + Last updated 09/23/2020
data-factory Copy Activity Schema And Type Mapping https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-activity-schema-and-type-mapping.md
Title: Schema and data type mapping in copy activity + description: Learn about how copy activity in Azure Data Factory maps schemas and data types from source data to sink data. + Last updated 06/22/2020
data-factory Copy Data Tool https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/copy-data-tool.md
Title: Copy Data tool Azure Data Factory + description: 'Provides information about the Copy Data tool in Azure Data Factory UI' + Last updated 06/04/2021
data-factory Create Azure Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/create-azure-integration-runtime.md
Title: Create Azure integration runtime in Azure Data Factory + description: Learn how to create Azure integration runtime in Azure Data Factory, which is used to copy data and dispatch transform activities. Last updated 06/04/2021 -+ # How to create and configure Azure Integration Runtime [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Create Self Hosted Integration Runtime https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/create-self-hosted-integration-runtime.md
Title: Create a self-hosted integration runtime+ description: Learn how to create a self-hosted integration runtime in Azure Data Factory, which lets data factories access data stores in a private network. Last updated 06/16/2021 -+ # Create and configure a self-hosted integration runtime
data-factory Data Factory Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-factory-troubleshoot-guide.md
Title: Troubleshoot Azure Data Factory | Microsoft Docs+ description: Learn how to troubleshoot external control activities in Azure Data Factory. + Last updated 06/18/2021
data-factory Data Flow Aggregate https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-aggregate.md
Title: Aggregate transformation in mapping data flow+ description: Learn how to aggregate data at scale in Azure Data Factory with the mapping data flow Aggregate transformation. -+ Last updated 09/14/2020
data-factory Data Flow Alter Row https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-alter-row.md
Title: Alter row transformation in mapping data flow+ description: How to update database target using the alter row transformation in mapping data flow -+ Last updated 05/06/2020
data-factory Data Flow Conditional Split https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-conditional-split.md
Title: Conditional split transformation in mapping data flow + description: Split data into different streams using the conditional split transformation in Azure Data Factory mapping data flow -+ Last updated 05/21/2020
data-factory Data Flow Derived Column https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-derived-column.md
Title: Derived column transformation in mapping data flow+ description: Learn how to transform data at scale in Azure Data Factory with the mapping data flow Derived Column transformation. -+ Last updated 09/14/2020
data-factory Data Flow Exists https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-exists.md
Title: Exists transformation in mapping data flow + description: Check for existing rows using the exists transformation in Azure Data Factory mapping data flow -+ Last updated 05/07/2020
data-factory Data Flow Expression Functions https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-expression-functions.md
Title: Expression functions in the mapping data flow+ description: Learn about expression functions in mapping data flow. + Last updated 07/04/2021
data-factory Data Flow Filter https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-filter.md
Title: Filter transformation in mapping data flow + description: Filter out rows using the filter transformation in Azure Data Factory mapping data flow -+ Last updated 05/26/2020
data-factory Data Flow Flatten https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-flatten.md
Title: Flatten transformation in mapping data flow+ description: Denormalize hierarchical data using the flatten transformation ms.review: daperlov + Last updated 03/09/2020
data-factory Data Flow Join https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-join.md
Title: Join transformation in mapping data flow + description: Combine data from two data sources using the join transformation in Azure Data Factory mapping data flow -+ Last updated 05/15/2020
data-factory Data Flow Lookup https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-lookup.md
Title: Lookup transformation in mapping data flow+ description: Reference data from another source using the lookup transformation in mapping data flow. -+ Last updated 02/19/2021
data-factory Data Flow New Branch https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-new-branch.md
Title: Multiple branches in mapping data flow+ description: Replicating data streams in mapping data flow with multiple branches -+ Last updated 04/16/2021
data-factory Data Flow Pivot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-pivot.md
Title: Pivot transformation in mapping data flow+ description: Pivot data from rows to columns using Azure Data Factory mapping data flow Pivot Transformation -+ Last updated 07/17/2020
data-factory Data Flow Rank https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-rank.md
Title: Rank transformation in mapping data flow + description: How to use Azure Data Factory's mapping data flow rank transformation to generate a ranking column -+ Last updated 10/05/2020
data-factory Data Flow Select https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-select.md
Title: Select transformation in mapping data flow+ description: Azure Data Factory mapping data flow Select Transformation -+ Last updated 06/02/2020
data-factory Data Flow Sink https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-sink.md
Title: Sink transformation in mapping data flow+ description: Learn how to configure a sink transformation in mapping data flow. -+ Last updated 07/20/2021
data-factory Data Flow Sort https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-sort.md
Title: Sort transformation in mapping data flow+ description: Azure Data Factory Mapping Data Sort Transformation -+ Last updated 04/14/2020
data-factory Data Flow Source https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-source.md
Title: Source transformation in mapping data flow+ description: Learn how to set up a source transformation in mapping data flow. -+ Last updated 03/10/2021
data-factory Data Flow Surrogate Key https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-surrogate-key.md
Title: Surrogate key transformation in mapping data flow + description: How to use Azure Data Factory's mapping data flow Surrogate Key Transformation to generate sequential key values -+ Last updated 10/30/2020
data-factory Data Flow Transformation Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-transformation-overview.md
Title: Mapping data flow transformation overview+ description: An overview of the different transformations available in mapping data flow + Last updated 10/27/2020
data-factory Data Flow Union https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-union.md
Title: Union transformation in mapping data flow+ description: Azure Data Factory mapping data flow Union Transformation -+ Last updated 04/27/2020
data-factory Data Flow Unpivot https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-unpivot.md
Title: Unpivot transformation in mapping data flow+ description: Azure Data Factory mapping data flow Unpivot Transformation -+ Last updated 07/14/2020
data-factory Data Flow Window https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/data-flow-window.md
Title: Window transformation in mapping data flow+ description: Azure Data Factory mapping data flow Window Transformation -+ Last updated 11/16/2020
data-factory Delete Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/delete-activity.md
Title: Delete Activity in Azure Data Factory + description: Learn how to delete files in various file stores with the Delete Activity in Azure Data Factory. + Last updated 08/12/2020
data-factory Format Avro https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-avro.md
Title: Avro format in Azure Data Factory + description: 'This topic describes how to deal with Avro format in Azure Data Factory.' + Last updated 09/15/2020
data-factory Format Binary https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-binary.md
Title: Binary format in Azure Data Factory + description: 'This topic describes how to deal with Binary format in Azure Data Factory.' + Last updated 10/29/2020
data-factory Format Common Data Model https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-common-data-model.md
Title: Common Data Model format+ description: Transform data using the Common Data Model metadata system + Last updated 02/04/2021
data-factory Format Delimited Text https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-delimited-text.md
Title: Delimited text format in Azure Data Factory + description: 'This topic describes how to deal with delimited text format in Azure Data Factory.' + Last updated 03/23/2021
data-factory Format Excel https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-excel.md
Title: Excel format in Azure Data Factory + description: 'This topic describes how to deal with Excel format in Azure Data Factory.' + Last updated 12/08/2020
data-factory Format Json https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-json.md
Title: JSON format in Azure Data Factory + description: 'This topic describes how to deal with JSON format in Azure Data Factory.' + Last updated 10/29/2020
data-factory Format Orc https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-orc.md
Title: ORC format in Azure Data Factory + description: 'This topic describes how to deal with ORC format in Azure Data Factory.' + Last updated 09/28/2020
data-factory Format Parquet https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-parquet.md
Title: Parquet format in Azure Data Factory + description: 'This topic describes how to deal with Parquet format in Azure Data Factory.' + Last updated 09/27/2020
data-factory Format Xml https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/format-xml.md
Title: XML format in Azure Data Factory + description: 'This topic describes how to deal with XML format in Azure Data Factory.' + Last updated 04/29/2021
data-factory How To Create Event Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-create-event-trigger.md
Title: Create event-based triggers in Azure Data Factory + description: Learn how to create a trigger in Azure Data Factory that runs a pipeline in response to an event. +
data-factory How To Create Schedule Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-create-schedule-trigger.md
Title: Create schedule triggers in Azure Data Factory + description: Learn how to create a trigger in Azure Data Factory that runs a pipeline on a schedule.
Last updated 10/30/2020-+ # Create a trigger that runs a pipeline on a schedule
data-factory How To Create Tumbling Window Trigger https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/how-to-create-tumbling-window-trigger.md
Title: Create tumbling window triggers in Azure Data Factory + description: Learn how to create a trigger in Azure Data Factory that runs a pipeline on a tumbling window. + Last updated 10/25/2020
data-factory Iterative Development Debugging https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/iterative-development-debugging.md
Title: Iterative development and debugging in Azure Data Factory + description: Learn how to develop and debug Data Factory pipelines iteratively in the ADF UX Last updated 04/21/2021 + documentationcenter: ''
data-factory Load Azure Sql Data Warehouse https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/load-azure-sql-data-warehouse.md
Title: Load data into Azure Synapse Analytics+ description: Use Azure Data Factory to copy data into Azure Synapse Analytics -+ Last updated 07/28/2021
data-factory Load Sap Bw Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/load-sap-bw-data.md
Title: Load data from SAP Business Warehouse+ description: 'Use Azure Data Factory to copy data from SAP Business Warehouse (BW)' Previously updated : 07/05/2021 Last updated : 08/04/2021 # Copy data from SAP Business Warehouse by using Azure Data Factory
In the Azure portal, go to your data factory. Select **Open** on the **Open Azur
1. On the home page, select **Ingest** to open the Copy Data tool.
-2. On the **Properties** page, specify a **Task name**, and then select **Next**.
+2. On the **Properties** page, choose **Built-in copy task** under **Task type**, and choose **Run once now** under **Task cadence or task schedule**, and then select **Next**.
-3. On the **Source data store** page, select **+Create new connection**. Select **SAP BW Open Hub** from the connector gallery, and then select **Continue**. To filter the connectors, you can type **SAP** in the search box.
+3. On the **Source data store** page, select **+ New connection**. Select **SAP BW Open Hub** from the connector gallery, and then select **Continue**. To filter the connectors, you can type **SAP** in the search box.
-4. On the **Specify SAP BW Open Hub connection** page, follow these steps to create a new connection.
-
- ![Create SAP BW Open Hub-linked service page](media/load-sap-bw-data/create-sap-bw-open-hub-linked-service.png)
+4. On the **New connection (SAP BW Open Hub)** page, follow these steps to create a new connection.
1. From the **Connect via integration runtime** list, select an existing self-hosted IR. Or, choose to create one if you don't have one yet.
In the Azure portal, go to your data factory. Select **Open** on the **Open Azur
As mentioned in [Prerequisites](#prerequisites), make sure that you have SAP Connector for Microsoft .NET 3.0 installed on the same computer where the self-hosted IR is running.
- 2. Fill in the SAP BW **Server name**, **System number**, **Client ID,** **Language** (if other than **EN**), **User name**, and **Password**.
+ 2. Fill in the SAP BW **Server name**, **System number**, **Client ID**, **Language** (if other than **EN**), **User name**, and **Password**.
- 3. Select **Test connection** to validate the settings, and then select **Finish**.
+ 3. Select **Test connection** to validate the settings, and then select **Create**.
- 4. A new connection is created. Select **Next**.
+ ![Create SAP BW Open Hub-linked service page](media/load-sap-bw-data/create-sap-bw-open-hub-linked-service.png)
-5. On the **Select Open Hub Destinations** page, browse the Open Hub Destinations that are available in your SAP BW. Select the OHD to copy data from, and then select **Next**.
+ 4. On the **Source data store** page, select the newly created connection in the **Connection** block.
- ![Select SAP BW Open Hub Destination table](media/load-sap-bw-data/select-sap-bw-open-hub-table.png)
+ 5. In the section for selecting Open Hub Destinations, browse the Open Hub Destinations that are available in your SAP BW. You can preview the data in each destination by selecting the preview button at the end of each row. Select the OHD to copy data from, and then select **Next**.
+
+ :::image type="content" source="./media/load-sap-bw-data/source-data-store-page.png" alt-text="Screenshot showing the 'Source data store' page.":::
-6. Specify a filter, if you need one. If your OHD only contains data from a single data-transfer process (DTP) execution with a single request ID, or you're sure that your DTP is finished and you want to copy the data, clear the **Exclude Last Request** check box.
+5. Specify a filter, if you need one. If your OHD only contains data from a single data-transfer process (DTP) execution with a single request ID, or you're sure that your DTP is finished and you want to copy the data, clear the **Exclude Last Request** check box in the **Advanced** section. You can preview the data by selecting the **Preview data** button.
- Learn more about these settings in the [SAP BW Open Hub Destination configurations](#sap-bw-open-hub-destination-configurations) section of this article. Select **Validate** to double-check what data will be returned. Then select **Next**.
+ Learn more about these settings in the [SAP BW Open Hub Destination configurations](#sap-bw-open-hub-destination-configurations) section of this article. Then select **Next**.
![Configure SAP BW Open Hub filter](media/load-sap-bw-data/configure-sap-bw-open-hub-filter.png)
-7. On the **Destination data store** page, select **+Create new connection** > **Azure Data Lake Storage Gen2** > **Continue**.
+6. On the **Destination data store** page, select **+ New connection** > **Azure Data Lake Storage Gen2** > **Continue**.
-8. On the **Specify Azure Data Lake Storage connection** page, follow these steps to create a connection.
+7. On the **New connection (Azure Data Lake Storage Gen2)** page, follow these steps to create a connection.
+ 1. Select your Data Lake Storage Gen2-capable account from the **Name** drop-down list.
+ 2. Select **Create** to create the connection.
![Create an ADLS Gen2 linked service page](media/load-sap-bw-data/create-adls-gen2-linked-service.png)
- 1. Select your Data Lake Storage Gen2-capable account from the **Name** drop-down list.
- 2. Select **Finish** to create the connection. Then select **Next**.
-
-9. On the **Choose the output file or folder** page, enter **copyfromopenhub** as the output folder name. Then select **Next**.
+8. On the **Destination data store** page, select the newly created connection in the **Connection** section, and enter **copyfromopenhub** as the output folder name. Then select **Next**.
- ![Choose output folder page](media/load-sap-bw-data/choose-output-folder.png)
+ :::image type="content" source="./media/load-sap-bw-data/destination-data-store-page.png" alt-text="Screenshot showing the 'Destination data store' page.":::
-10. On the **File format setting** page, select **Next** to use the default settings.
+9. On the **File format setting** page, select **Next** to use the default settings.
![Specify sink format page](media/load-sap-bw-data/specify-sink-format.png)
-11. On the **Settings** page, expand **Performance settings**. Enter a value for **Degree of copy parallelism** such as 5 to load from SAP BW in parallel. Then select **Next**.
+10. On the **Settings** page, specify a **Task name**, and expand **Advanced**. Enter a value for **Degree of copy parallelism** such as 5 to load from SAP BW in parallel. Then select **Next**.
![Configure copy settings](media/load-sap-bw-data/configure-copy-settings.png)
-12. On the **Summary** page, review the settings. Then select **Next**.
-
-13. On the **Deployment** page, select **Monitor** to monitor the pipeline.
+11. On the **Summary** page, review the settings. Then select **Next**.
- ![Deployment page](media/load-sap-bw-data/deployment.png)
+ :::image type="content" source="./media/load-sap-bw-data/summary-page.png" alt-text="Screenshot showing the Summary page.":::
-14. Notice that the **Monitor** tab on the left side of the page is automatically selected. The **Actions** column includes links to view activity-run details and to rerun the pipeline.
+12. On the **Deployment** page, select **Monitor** to monitor the pipeline.
- ![Pipeline monitoring view](media/load-sap-bw-data/pipeline-monitoring.png)
+13. Notice that the **Monitor** tab on the left side of the page is automatically selected. You can use the links under the **Pipeline name** column on the **Pipeline runs** page to view activity details and to rerun the pipeline.
-15. To view activity runs that are associated with the pipeline run, select **View Activity Runs** in the **Actions** column. There's only one activity (copy activity) in the pipeline, so you see only one entry. To switch back to the pipeline-runs view, select the **Pipelines** link at the top. Select **Refresh** to refresh the list.
+14. To view activity runs that are associated with the pipeline run, select links under the **Pipeline name** column. There's only one activity (copy activity) in the pipeline, so you see only one entry. To switch back to the pipeline-runs view, select the **All pipeline runs** link at the top. Select **Refresh** to refresh the list.
![Activity-monitoring screen](media/load-sap-bw-data/activity-monitoring.png)
-16. To monitor the execution details for each copy activity, select the **Details** link, which is an eyeglasses icon below **Actions** in the activity-monitoring view. Available details include the data volume copied from the source to the sink, data throughput, execution steps and duration, and configurations used.
+15. To monitor the execution details for each copy activity, select the **Details** link (the eyeglasses icon) in that copy activity's row in the activity-monitoring view. Available details include the data volume copied from the source to the sink, data throughput, execution steps and duration, and configurations used.
![Activity monitoring details](media/load-sap-bw-data/activity-monitoring-details.png)
-17. To view the **maximum Request ID**, go back to the activity-monitoring view and select **Output** under **Actions**.
+16. To view the **maximum Request ID** of each copy activity, go back to the activity-monitoring view and select **Output** in the same row of each copy activity.
![Activity output screen](media/load-sap-bw-data/activity-output.png)
data-factory Parameterize Linked Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/parameterize-linked-services.md
Title: Parameterize linked services in Azure Data Factory + description: Learn how to parameterize linked services in Azure Data Factory and pass dynamic values at run time. + Last updated 06/01/2021
data-factory Parameters Data Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/parameters-data-flow.md
Title: Parameterizing mapping data flows+ description: Learn how to parameterize a mapping data flow from data factory pipelines + Last updated 04/19/2021
data-factory Security And Access Control Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/security-and-access-control-troubleshoot-guide.md
Title: Troubleshoot security and access control issues+ description: Learn how to troubleshoot security and access control issues in Azure Data Factory. + Last updated 07/28/2021
data-factory Self Hosted Integration Runtime Troubleshoot Guide https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/self-hosted-integration-runtime-troubleshoot-guide.md
Title: Troubleshoot self-hosted integration runtime in Azure Data Factory+ description: Learn how to troubleshoot self-hosted integration runtime issues in Azure Data Factory. + Last updated 05/31/2021
data-factory Supported File Formats And Compression Codecs Legacy https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/supported-file-formats-and-compression-codecs-legacy.md
Title: Supported file formats in Azure Data Factory (legacy)+ description: 'This topic describes the file formats and compression codes that are supported by file-based connectors in Azure Data Factory.' + Last updated 12/10/2019
data-factory Supported File Formats And Compression Codecs https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/supported-file-formats-and-compression-codecs.md
Title: Supported file formats by copy activity in Azure Data Factory+ description: 'This topic describes the file formats and compression codes that are supported by copy activity in Azure Data Factory.' + Last updated 07/16/2020
data-factory Transform Data Databricks Jar https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-databricks-jar.md
Title: Transform data with Databricks Jar+ description: Learn how to process or transform data by running a Databricks Jar within an Azure Data Factory pipeline. +
data-factory Transform Data Databricks Notebook https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-databricks-notebook.md
Title: Transform data with Databricks Notebook + description: Learn how to process or transform data by running a Databricks notebook in Azure Data Factory. +
data-factory Transform Data Databricks Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-databricks-python.md
Title: Transform data with Databricks Python + description: Learn how to process or transform data by running a Databricks Python activity in an Azure Data Factory pipeline. Last updated 03/15/2018 -+ # Transform data by running a Python activity in Azure Databricks [!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
data-factory Transform Data Machine Learning Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-machine-learning-service.md
Title: Execute Azure Machine Learning pipelines + description: Learn how to run your Azure Machine Learning pipelines in your Azure Data Factory pipelines. +
data-factory Transform Data Using Data Lake Analytics https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-data-lake-analytics.md
Title: Transform data using U-SQL script+ description: Learn how to process or transform data by running U-SQL scripts on Azure Data Lake Analytics compute service. -+ Last updated 08/01/2018
data-factory Transform Data Using Dotnet Custom Activity https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-dotnet-custom-activity.md
Title: Use custom activities in a pipeline+ description: Learn how to create custom activities by using .NET, and then use the activities in an Azure Data Factory pipeline. -+ Last updated 11/26/2018
data-factory Transform Data Using Hadoop Hive https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-hadoop-hive.md
Title: Transform data using Hadoop Hive activity+ description: Learn how you can use the Hive Activity in an Azure data factory to run Hive queries on an on-demand/your own HDInsight cluster. -+ Last updated 05/08/2019
data-factory Transform Data Using Hadoop Map Reduce https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-hadoop-map-reduce.md
Title: Transform data using Hadoop MapReduce activity+ description: Learn how to process data by running Hadoop MapReduce programs on an Azure HDInsight cluster from an Azure data factory. -+ Last updated 05/08/2020
data-factory Transform Data Using Hadoop Pig https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-hadoop-pig.md
Title: Transform data using Hadoop Pig activity+ description: Learn how you can use the Pig Activity in an Azure data factory to run Pig scripts on an on-demand/your own HDInsight cluster. -+ Last updated 05/08/2020
data-factory Transform Data Using Hadoop Streaming https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-hadoop-streaming.md
Title: Transform data using Hadoop Streaming activity+ description: Explains how to use Hadoop Streaming Activity in Azure Data Factory to transform data by running Hadoop Streaming programs on a Hadoop cluster. -+ Last updated 05/08/2020
data-factory Transform Data Using Machine Learning https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-machine-learning.md
Title: Create predictive data pipelines+ description: Learn how to create a predictive pipeline by using Azure Machine Learning Studio (classic) - Batch Execution Activity in Azure Data Factory. -+ Last updated 07/16/2020
data-factory Transform Data Using Spark https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-spark.md
Title: Transform data using Spark activity+ description: Learn how to transform data by running Spark programs from an Azure Data Factory pipeline using the Spark Activity. -+ Last updated 06/09/2021
data-factory Transform Data Using Stored Procedure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data-using-stored-procedure.md
Title: Transform data by using the Stored Procedure activity+ description: Explains how to use SQL Server Stored Procedure Activity to invoke a stored procedure in an Azure SQL Database/Data Warehouse from a Data Factory pipeline. -+ Last updated 11/27/2018
data-factory Transform Data https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/transform-data.md
Title: Transform data+ description: Transform data or process data in Azure Data Factory using Hadoop, Azure Machine Learning Studio (classic), or Azure Data Lake Analytics. -+ Last updated 07/31/2018
data-factory Tumbling Window Trigger Dependency https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tumbling-window-trigger-dependency.md
Title: Create tumbling window trigger dependencies+ description: Learn how to create dependency on a tumbling window trigger in Azure Data Factory. -+ Last updated 09/03/2020
data-factory Tutorial Bulk Copy Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-bulk-copy-portal.md
- Last updated 07/06/2021
data-factory Tutorial Control Flow Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-control-flow-portal.md
- Last updated 06/07/2021
data-factory Tutorial Control Flow https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-control-flow.md
- Last updated 9/27/2019
data-factory Tutorial Hybrid Copy Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-hybrid-copy-portal.md
- Last updated 07/05/2021
data-factory Tutorial Hybrid Copy Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-hybrid-copy-powershell.md
-+ Last updated 02/18/2021
data-factory Tutorial Incremental Copy Change Tracking Feature Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-incremental-copy-change-tracking-feature-portal.md
- Last updated 07/05/2021
data-factory Tutorial Incremental Copy Change Tracking Feature Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-incremental-copy-change-tracking-feature-powershell.md
-+ Last updated 02/18/2021
data-factory Tutorial Incremental Copy Multiple Tables Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-incremental-copy-multiple-tables-portal.md
- Last updated 07/05/2021
data-factory Tutorial Incremental Copy Multiple Tables Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-incremental-copy-multiple-tables-powershell.md
-+ Last updated 07/05/2021
data-factory Tutorial Incremental Copy Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-incremental-copy-overview.md
For step-by-step instructions, see the following tutorial: <br/>
- [Incrementally copy data from Azure SQL Database to Azure Blob storage by using Change Tracking technology](tutorial-incremental-copy-change-tracking-feature-powershell.md) ## Loading new and changed files only by using LastModifiedDate
-You can copy the new and changed files only by using LastModifiedDate to the destination store. ADF will scan all the files from the source store, apply the file filter by their LastModifiedDate, and only copy the new and updated file since last time to the destination store. Please be aware if you let ADF scan huge amounts of files but only copy a few files to destination, you would still expect the long duration due to file scanning is time consuming as well.
+You can copy the new and changed files only by using LastModifiedDate to the destination store. ADF will scan all the files from the source store, apply the file filter by their LastModifiedDate, and only copy the new and updated file since last time to the destination store. Please be aware that if you let ADF scan huge amounts of files but you only copy a few files to the destination, this will still take a long time because of the file scanning process.
For step-by-step instructions, see the following tutorial: <br/> - [Incrementally copy new and changed files based on LastModifiedDate from Azure Blob storage to Azure Blob storage](tutorial-incremental-copy-lastmodified-copy-data-tool.md)
data-factory Tutorial Incremental Copy Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-incremental-copy-portal.md
- Last updated 07/05/2021
data-factory Tutorial Incremental Copy Powershell https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-incremental-copy-powershell.md
- Last updated 02/18/2021
data-factory Tutorial Transform Data Hive Virtual Network Portal https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-transform-data-hive-virtual-network-portal.md
- Last updated 06/07/2021
data-factory Tutorial Transform Data Hive Virtual Network https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/tutorial-transform-data-hive-virtual-network.md
-+ Last updated 01/22/2018
data-factory Update Machine Learning Models https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/data-factory/update-machine-learning-models.md
Title: Update Azure Machine Learning Studio (classic) models using Azure Data Factory + description: Describes how to create predictive pipelines using Azure Data Factory and Azure Machine Learning Studio (classic) + Last updated 07/16/2020
defender-for-iot Getting Started https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/getting-started.md
Research your network architecture, monitored bandwidth, and other network detai
Azure Defender for IoT supports both physical and virtual deployments. For the physical deployments, you can purchase various certified appliances. For more information, see [Identify required appliances](how-to-identify-required-appliances.md).
-We recommend that you calculate the approximate number of devices that will be monitored. Later, when you register your Azure subscription to the portal, you'll be asked to enter this number. Numbers can be added in intervals of 1,000 seconds. The numbers of monitored devices are called *committed devices*.
+We recommend that you calculate the approximate number of devices that will be monitored. Later, when you register your Azure subscription to the portal, you'll be asked to enter this number. Numbers can be added in intervals of 1,000, for example 1,000, 2,000, or 3,000. The numbers of monitored devices are called *committed devices*.
## Register with Azure Defender for IoT
defender-for-iot How To Manage Individual Sensors https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/how-to-manage-individual-sensors.md
The Defender for IoT sensor, and on-premises management console use SSL, and TLS
- Secure communications between the sensors and an on-premises management console.
-Once installed, the appliance generates a local self-signed certificate to allow preliminary access to the web console. Enterprise SSL, and TLS certificates may be installed using the [`cyberx-xsense-certificate-import`](#cli-commands) command-line tool.
+Once installed, the appliance generates a local self-signed certificate to allow preliminary access to the web console.
> [!NOTE] > For integrations and forwarding rules where the appliance is the client and initiator of the session, specific certificates are used and are not related to the system certificates.
defender-for-iot How To Set Up Your Network https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/defender-for-iot/organizations/how-to-set-up-your-network.md
Verify that your organizational security policy allows access to the following:
| SNMP | UDP | OUT | 161 | Monitoring | Remote SNMP collectors. | On-premises management console and Sensor | SNMP server | | WMI | UDP | OUT | 135 | monitoring | Windows Endpoint Monitoring | Sensor | Relevant network element | | Tunneling | TCP | IN | 9000 <br /><br />- on top of port 443 <br /><br />From end user to the on-premises management console. <br /><br />- Port 22 from sensor to the on-premises management console | monitoring | Tunneling | Sensor | On-premises management console |
+| HTTP| TCP | OUT | 80 | Certificate validation | Download CRL file | Sensor | CRL server |
### Planning rack installation
dns Private Dns Privatednszone https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/dns/private-dns-privatednszone.md
Azure Private DNS provides a reliable, secure DNS service to manage and resolve
The records contained in a private DNS zone aren't resolvable from the Internet. DNS resolution against a private DNS zone works only from virtual networks that are linked to it. You can link a private DNS zone to one or more virtual networks by creating [virtual network links](./private-dns-virtual-network-links.md).
-You can also enable the [autoregistration](./private-dns-autoregistration.md) feature to automatically manage the life cycle of the DNS records for the virtual machines that gets deployed in a virtual network.
+You can also enable the [autoregistration](./private-dns-autoregistration.md) feature to automatically manage the life cycle of the DNS records for the virtual machines that get deployed in a virtual network.
## Limits
To understand how many private DNS zones you can create in a subscription and ho
## Restrictions
-* Single labeled private DNS zones aren't supported. Your private DNS zone must have two or more labels. For example contoso.com has two labels separated by a dot. A private DNS zone can have a maximum 34 labels.
-* You can't create zone delegations (NS records) in a private DNS zone. If you intend to use a child domain, you can directly create the domain as a private DNS zone. Then you can link it to virtual network without setting up a nameserver delegation from the parent zone.
+* Single-labeled private DNS zones aren't supported. Your private DNS zone must have two or more labels. For example, contoso.com has two labels separated by a dot. A private DNS zone can have a maximum of 34 labels.
+* You can't create zone delegations (NS records) in a private DNS zone. If you intend to use a child domain, you can directly create the domain as a private DNS zone. Then you can link it to the virtual network without setting up a nameserver delegation from the parent zone.
## Next steps
To understand how many private DNS zones you can create in a subscription and ho
* Read about some common [private zone scenarios](./private-dns-scenarios.md) that can be realized with private zones in Azure DNS.
-* For common questions and answers about private zones in Azure DNS, see [Private DNS FAQ](./dns-faq-private.yml).
+* For common questions and answers about private zones in Azure DNS, see [Private DNS FAQ](./dns-faq-private.yml).
event-grid Add Identity Roles https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/event-grid/add-identity-roles.md
Currently, Azure event grid supports custom topics or domains configured with a
| Azure Blob storage | [Storage Blob Data Contributor](../storage/blobs/assign-azure-role-data-access.md) | | Azure Queue storage |[Storage Queue Data Message Sender](../storage/blobs/assign-azure-role-data-access.md) | - ## Use the Azure portal You can use the Azure portal to assign the custom topic or domain identity to an appropriate role so that the custom topic or domain can forward events to the destination.
The following example adds a managed identity for an event grid custom topic nam
1. Go to your **Service Bus namespace** in the [Azure portal](https://portal.azure.com). 1. Select **Access Control** in the left pane.
-1. Select **Add** in the **Add a role assignment** section.
-1. On the **Add a role assignment** page, do the following steps:
- 1. Select the role. In this case, it's **Azure Service Bus Data Sender**.
- 1. Select the **identity** for your event grid custom topic or domain.
- 1. Select **Save** to save the configuration.
+1. Select **Add** in the **Add role assignment (Preview)** section.
+
+ :::image type="content" source="./media/add-identity-roles/add-role-assignment-menu.png" alt-text="Image showing the selection of Add role assignment (Preview) menu":::
+1. On the **Add role assignment** page, select **Azure Service Bus Data Sender**, and select **Next**.
+
+ :::image type="content" source="./media/add-identity-roles/select-role.png" alt-text="Image showing the selection of the Azure Service Bus Data Sender role":::
+1. In the **Members** tab, follow these steps:
+ 1. Select **User, group, or service principal**, and click **+ Select members**. The **Managed identity** option doesn't support Event Grid identities yet.
+ 1. In the **Select members** window, search for and select the service principal with the same name as your custom topic. In the following example, it's **spcustomtopic0728**.
+
+ :::image type="content" source="./media/add-identity-roles/select-managed-identity-option.png" alt-text="Image showing the selection of the User, group, or service principal option":::
+ 1. In the **Select members** window, click **Select**.
+
+ :::image type="content" source="./media/add-identity-roles/managed-identity-selected.png" alt-text="Image showing the selection of the Managed identity option":::
+1. Now, back on the **Members** tab, select **Next**.
+
+ :::image type="content" source="./media/add-identity-roles/members-select-next.png" alt-text="Image showing the selection of the Next button on the Members page":::
+1. On the **Review + assign** page, select **Review + assign** after reviewing the settings.
The steps are similar for adding an identity to other roles mentioned in the table.
expressroute Monitor Expressroute https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/expressroute/monitor-expressroute.md
Here are some queries that you can enter into the Log search bar to help you mon
AzureDiagnostics | where TimeGenerated > ago(12h) | where ResourceType == "EXPRESSROUTECIRCUITS"
- | project TimeGenerated, ResourceType , network s, path s, OperationName
+ | project TimeGenerated, ResourceType, network_s, path_s, OperationName
``` * To query for BGP informational messages by level, resource type, and network.
Here are some queries that you can enter into the Log search bar to help you mon
AzureDiagnostics | where Level == "Informational" | where ResourceType == "EXPRESSROUTECIRCUITS"
- | project TimeGenerated, ResourceId , Level, ResourceType , network s, path s
+ | project TimeGenerated, ResourceId, Level, ResourceType, network_s, path_s
``` * To query for Traffic graph BitInPerSeconds in the last one hour.
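For readers following along, here is a minimal, self-contained sketch that consolidates the corrected BGP query above into one runnable block. It assumes an ExpressRoute circuit with diagnostic logs flowing into a Log Analytics workspace (the `AzureDiagnostics` table); it is not the traffic-graph query referenced in the preceding bullet.

```kusto
// Sketch: BGP informational messages for ExpressRoute circuits over the last 12 hours,
// using the corrected column names (network_s, path_s).
AzureDiagnostics
| where TimeGenerated > ago(12h)
| where Level == "Informational"
| where ResourceType == "EXPRESSROUTECIRCUITS"
| project TimeGenerated, ResourceId, Level, ResourceType, network_s, path_s
```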
In the **Alert Criteria**, you can select **Activity Log** for the Signal Type a
## Next steps * See [Monitoring ExpressRoute data reference](monitor-expressroute-reference.md) for a reference of the metrics, logs, and other important values created by ExpressRoute.
-* See [Monitoring Azure resources with Azure Monitor](../azure-monitor/overview.md) for details on monitoring Azure resources.
+* See [Monitoring Azure resources with Azure Monitor](../azure-monitor/overview.md) for details on monitoring Azure resources.
governance Definition Structure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/concepts/definition-structure.md
assignment is applied to a resource group, it's applicable to all the resources
group. The policy definition _policyRule_ schema is found here:
-[https://schema.management.azure.com/schemas/2019-09-01/policyDefinition.json](https://schema.management.azure.com/schemas/2019-09-01/policyDefinition.json)
+[https://schema.management.azure.com/schemas/2020-10-01/policyDefinition.json](https://schema.management.azure.com/schemas/2020-10-01/policyDefinition.json)
You use JSON to create a policy definition. The policy definition contains elements for:
governance Effects https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/concepts/effects.md
related resources to match.
- Default is _ResourceGroup_. - **EvaluationDelay** (optional) - Specifies when the existence of the related resources should be evaluated. The delay is only used for evaluations that are a result of a create or update resource request.
- - Allowed values are `AfterProvisioning`, `AfterProvisioningSuccess`, `AfterProvisioningFailure` or an ISO 8601 duration between 0 and 360 minutes.
+ - Allowed values are `AfterProvisioning`, `AfterProvisioningSuccess`, `AfterProvisioningFailure` or an ISO 8601 duration between 10 and 360 minutes.
- The _AfterProvisioning_ values inspect the provisioning result of the resource that was evaluated in the policy rule's IF condition. `AfterProvisioning` runs after provisioning is complete, regardless of outcome. If provisioning takes longer than 6 hours it will be treated as a failure when determining _AfterProvisioning_ evaluation delays. - Default is `PT10M` (10 minutes). - Specifying a long evaluation delay may cause the recorded compliance state of the resource to not update until the next [evaluation trigger](../how-to/get-compliance-data.md#evaluation-triggers).
governance Resource Graph Samples https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/policy/samples/resource-graph-samples.md
Title: Azure Resource Graph sample queries for Azure Policy description: Sample Azure Resource Graph queries for Azure Policy showing use of resource types and tables to access Azure Policy related resources and properties. Previously updated : 07/21/2021 Last updated : 08/04/2021
governance Query Language https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/concepts/query-language.md
Title: Understand the query language description: Describes Resource Graph tables and the available Kusto data types, operators, and functions usable with Azure Resource Graph. Previously updated : 07/20/2021 Last updated : 08/05/2021 # Understanding the Azure Resource Graph query language
properties from related resource types. Here is the list of tables available in
|Resource Graph table |Can `join` other tables? |Description | |||| |Resources |Yes |The default table if none defined in the query. Most Resource Manager resource types and properties are here. |
-|ResourceContainers |Yes |Includes subscription (`Microsoft.Resources/subscriptions`) and resource group (`Microsoft.Resources/subscriptions/resourcegroups`) resource types and data. |
+|ResourceContainers |Yes |Includes management group (`Microsoft.Management/managementGroups`), subscription (`Microsoft.Resources/subscriptions`) and resource group (`Microsoft.Resources/subscriptions/resourcegroups`) resource types and data. |
|AdvisorResources |Yes (preview) |Includes resources _related_ to `Microsoft.Advisor`. | |AlertsManagementResources |Yes (preview) |Includes resources _related_ to `Microsoft.AlertsManagement`. | |ExtendedLocationResources |No |Includes resources _related_ to `Microsoft.ExtendedLocation`. | |GuestConfigurationResources |No |Includes resources _related_ to `Microsoft.GuestConfiguration`. |
+|HealthResources|Yes |Includes resources _related_ to `Microsoft.ResourceHealth/availabilitystatuses`. |
|KubernetesConfigurationResources |No |Includes resources _related_ to `Microsoft.KubernetesConfiguration`. | |MaintenanceResources |Partial, join _to_ only. (preview) |Includes resources _related_ to `Microsoft.Maintenance`. | |PatchAssessmentResources|No |Includes resources _related_ to Azure Virtual Machines patch assessment. |
properties from related resource types. Here is the list of tables available in
|PolicyResources |Yes |Includes resources _related_ to `Microsoft.PolicyInsights`. | |RecoveryServicesResources |Partial, join _to_ only. (preview) |Includes resources _related_ to `Microsoft.DataProtection` and `Microsoft.RecoveryServices`. | |SecurityResources |Yes (preview) |Includes resources _related_ to `Microsoft.Security`. |
-|ServiceHealthResources |No (preview) |Includes resources _related_ to `Microsoft.ResourceHealth`. |
+|ServiceHealthResources |No (preview) |Includes resources _related_ to `Microsoft.ResourceHealth/events`. |
|WorkloadMonitorResources |No |Includes resources _related_ to `Microsoft.WorkloadMonitor`. | For a complete list, including resource types, see [Reference: Supported tables and resource types](../reference/supported-tables-resources.md).
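As a quick illustration of the table change called out above, here is a minimal Resource Graph query sketch that relies on `ResourceContainers` now surfacing management groups alongside subscriptions and resource groups; the counts it returns depend on your tenant.

```kusto
// Sketch: ResourceContainers now includes management groups, subscriptions,
// and resource groups, so all three container types can be counted from one table.
ResourceContainers
| where type in~ (
    'microsoft.management/managementgroups',
    'microsoft.resources/subscriptions',
    'microsoft.resources/subscriptions/resourcegroups')
| summarize containerCount = count() by type
```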
governance Supported Tables Resources https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/reference/supported-tables-resources.md
Title: Supported Azure Resource Manager resource types description: Provide a list of the Azure Resource Manager resource types supported by Azure Resource Graph and Change History. Previously updated : 07/07/2021 Last updated : 08/04/2021
For sample queries for this table, see [Resource Graph sample queries for adviso
For sample queries for this table, see [Resource Graph sample queries for extendedlocationresources](../samples/samples-by-table.md#extendedlocationresources). - microsoft.extendedlocation/customlocations/enabledresourcetypes
- - Sample query: [Get enabled resource types for Azure Arc enabled custom locations](../samples/samples-by-category.md#get-enabled-resource-types-for-azure-arc-enabled-custom-locations)
- - Sample query: [List Azure Arc enabled custom locations with VMware or SCVMM enabled](../samples/samples-by-category.md#list-azure-arc-enabled-custom-locations-with-vmware-or-scvmm-enabled)
+ - Sample query: [Get enabled resource types for Azure Arc-enabled custom locations](../samples/samples-by-category.md#get-enabled-resource-types-for-azure-arc-enabled-custom-locations)
+ - Sample query: [List Azure Arc-enabled custom locations with VMware or SCVMM enabled](../samples/samples-by-category.md#list-azure-arc-enabled-custom-locations-with-vmware-or-scvmm-enabled)
## guestconfigurationresources For sample queries for this table, see [Resource Graph sample queries for guestconfigurationresources](../samples/samples-by-table.md#guestconfigurationresources). - microsoft.guestconfiguration/guestconfigurationassignments
+ - Sample query: [Count machines in scope of guest configuration policies](../samples/samples-by-category.md#count-machines-in-scope-of-guest-configuration-policies)
+ - Sample query: [Count of non-compliant guest configuration assignments](../samples/samples-by-category.md#count-of-non-compliant-guest-configuration-assignments)
+ - Sample query: [Find all reasons a machine is non-compliant for guest configuration assignments](../samples/samples-by-category.md#find-all-reasons-a-machine-is-non-compliant-for-guest-configuration-assignments)
+ - Sample query: [Query details of guest configuration assignment reports](../samples/samples-by-category.md#query-details-of-guest-configuration-assignment-reports)
+
+## healthresources
+
+For sample queries for this table, see [Resource Graph sample queries for healthresources](../samples/samples-by-table.md#healthresources).
+
+- microsoft.resourcehealth/availabilitystatuses
+ - Sample query: [Count of virtual machines by availability state and Subscription Id](../samples/samples-by-category.md#count-of-virtual-machines-by-availability-state-and-subscription-id)
+ - Sample query: [List of virtual machines and associated availability states by Resource Ids](../samples/samples-by-category.md#list-of-virtual-machines-and-associated-availability-states-by-resource-ids)
+ - Sample query: [List of virtual machines by availability state and power state with Resource Ids and resource Groups](../samples/samples-by-category.md#list-of-virtual-machines-by-availability-state-and-power-state-with-resource-ids-and-resource-groups)
+ - Sample query: [List of virtual machines that are not Available by Resource Ids](../samples/samples-by-category.md#list-of-virtual-machines-that-are-not-available-by-resource-ids)
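In the spirit of the sample queries linked above, here is a minimal sketch (not one of the linked samples) that counts virtual machines by availability state from the new `HealthResources` table; the `properties.availabilityState` path is assumed from the `microsoft.resourcehealth/availabilitystatuses` resource type.

```kusto
// Sketch: count VM availability statuses by state (Available, Unavailable, and so on).
HealthResources
| where type =~ 'microsoft.resourcehealth/availabilitystatuses'
| extend availabilityState = tostring(properties.availabilityState)
| summarize vmCount = count() by availabilityState
```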
## kubernetesconfigurationresources For sample queries for this table, see [Resource Graph sample queries for kubernetesconfigurationresources](../samples/samples-by-table.md#kubernetesconfigurationresources). - microsoft.kubernetesconfiguration/extensions
- - Sample query: [List all Azure Arc enabled Kubernetes clusters with Azure Monitor extension](../samples/samples-by-category.md#list-all-azure-arc-enabled-kubernetes-clusters-with-azure-monitor-extension)
- - Sample query: [List all Azure Arc enabled Kubernetes clusters without Azure Monitor extension](../samples/samples-by-category.md#list-all-azure-arc-enabled-kubernetes-clusters-without-azure-monitor-extension)
+ - Sample query: [List all Azure Arc-enabled Kubernetes clusters with Azure Monitor extension](../samples/samples-by-category.md#list-all-azure-arc-enabled-kubernetes-clusters-with-azure-monitor-extension)
+ - Sample query: [List all Azure Arc-enabled Kubernetes clusters without Azure Monitor extension](../samples/samples-by-category.md#list-all-azure-arc-enabled-kubernetes-clusters-without-azure-monitor-extension)
- microsoft.kubernetesconfiguration/sourcecontrolconfigurations ## maintenanceresources
+- microsoft.maintenance/applyupdates
- microsoft.maintenance/configurationassignments - microsoft.maintenance/updates
For sample queries for this table, see [Resource Graph sample queries for policy
For sample queries for this table, see [Resource Graph sample queries for resourcecontainers](../samples/samples-by-table.md#resourcecontainers).
+- microsoft.management/managementgroups
- microsoft.resources/subscriptions (Subscriptions) - Sample query: [Key vaults with subscription name](../samples/samples-by-category.md#key-vaults-with-subscription-name) - Sample query: [Remove columns from results](../samples/samples-by-category.md#remove-columns-from-results)
For sample queries for this table, see [Resource Graph sample queries for resour
- microsoft.alertsmanagement/resourcehealthalertrules - microsoft.alertsmanagement/smartdetectoralertrules - Microsoft.AnalysisServices/servers (Analysis Services)-- microsoft.anybuild/clusters
+- Microsoft.AnyBuild/clusters (AnyBuild Clusters)
- Microsoft.ApiManagement/service (API Management services) - microsoft.appassessment/migrateprojects - Microsoft.AppConfiguration/configurationStores (App Configuration)
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.Authorization/resourceManagementPrivateLinks (Resource management private links) - microsoft.automanage/accounts - microsoft.automanage/configurationprofilepreferences
+- microsoft.automanage/configurationprofiles
- Microsoft.Automation/AutomationAccounts (Automation Accounts) - microsoft.automation/automationaccounts/configurations - Microsoft.Automation/automationAccounts/runbooks (Runbook)
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.AzureActiveDirectory/b2cDirectories (B2C Tenants) - Microsoft.AzureActiveDirectory/guestUsages (Guest Usages) - Microsoft.AzureArcData/dataControllers (Azure Arc data controllers)-- Microsoft.AzureArcData/postgresInstances (Azure Database for PostgreSQL server groups - Azure Arc)
+- Microsoft.AzureArcData/postgresInstances (Azure Arc-enabled PostgreSQL Hyperscale server groups)
- Microsoft.AzureArcData/sqlManagedInstances (SQL managed instances - Azure Arc) - Microsoft.AzureArcData/sqlServerInstances (SQL Server - Azure Arc) - microsoft.azurecis/autopilotenvironments
For sample queries for this table, see [Resource Graph sample queries for resour
- microsoft.azuredata/sqlmanagedinstances - microsoft.azuredata/sqlserverinstances - Microsoft.AzureData/sqlServerRegistrations (SQL Server registries)-- microsoft.azurepercept/accounts
+- Microsoft.AzurePercept/accounts (Azure Percept accounts)
- microsoft.azuresphere/catalogs - microsoft.azuresphere/catalogs/products - microsoft.azuresphere/catalogs/products/devicegroups
For sample queries for this table, see [Resource Graph sample queries for resour
- microsoft.azurestackhci/virtualharddisks - Microsoft.AzureStackHci/virtualMachines (Azure Stack HCI virtual machine - Azure Arc) - microsoft.azurestackhci/virtualnetworks
+- microsoft.backupsolutions/vmwareapplications
- microsoft.baremetal/consoleconnections - Microsoft.BareMetal/crayServers (Cray Servers) - Microsoft.BareMetal/monitoringServers (Monitoring Servers)
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.Cdn/Profiles/AfdEndpoints (Endpoints) - microsoft.cdn/profiles/endpoints (Endpoints) - Microsoft.CertificateRegistration/certificateOrders (App Service Certificates)-- Microsoft.chaos/chaosexperiments (Chaos Experiments)
+- microsoft.chaos/chaosexperiments (Chaos Experiments)
+- microsoft.chaos/experiments
- microsoft.classicCompute/domainNames (Cloud services (classic)) - Microsoft.ClassicCompute/VirtualMachines (Virtual machines (classic)) - Microsoft.ClassicNetwork/networkSecurityGroups (Network security groups (classic))
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.Compute/disks (Disks) - Microsoft.Compute/galleries (Shared image galleries) - Microsoft.Compute/galleries/applications (Gallery applications)-- microsoft.compute/galleries/applications/versions
+- Microsoft.Compute/galleries/applications/versions (Gallery application versions)
- Microsoft.Compute/galleries/images (Image definitions) - Microsoft.Compute/galleries/images/versions (Image versions) - Microsoft.Compute/hostgroups (Host groups)
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.Compute/sshPublicKeys (SSH keys) - microsoft.compute/swiftlets - Microsoft.Compute/VirtualMachines (Virtual machines)
+ - Sample query: [Count of virtual machines by power state](../samples/samples-by-category.md#count-of-virtual-machines-by-power-state)
- Sample query: [Count virtual machines by OS type](../samples/samples-by-category.md#count-virtual-machines-by-os-type) - Sample query: [Count virtual machines by OS type with extend](../samples/samples-by-category.md#count-virtual-machines-by-os-type-with-extend) - Sample query: [List all extensions installed on a virtual machine](../samples/samples-by-category.md#list-all-extensions-installed-on-a-virtual-machine)
+ - Sample query: [List of virtual machines by availability state and power state with Resource Ids and resource Groups](../samples/samples-by-category.md#list-of-virtual-machines-by-availability-state-and-power-state-with-resource-ids-and-resource-groups)
- Sample query: [List virtual machines with their network interface and public IP](../samples/samples-by-category.md#list-virtual-machines-with-their-network-interface-and-public-ip) - Sample query: [Show all virtual machines ordered by name in descending order](../samples/samples-by-category.md#show-all-virtual-machines-ordered-by-name-in-descending-order) - Sample query: [Show first five virtual machines by name and their OS type](../samples/samples-by-category.md#show-first-five-virtual-machines-by-name-and-their-os-type)
For sample queries for this table, see [Resource Graph sample queries for resour
- microsoft.costmanagement/connectors - microsoft.customproviders/resourceproviders - microsoft.d365customerinsights/instances
+- Microsoft.Dashboard/grafana (Grafana Workspaces)
- Microsoft.DataBox/jobs (Data Box) - Microsoft.DataBoxEdge/dataBoxEdgeDevices (Azure Stack Edge / Data Box Gateway) - Microsoft.Databricks/workspaces (Azure Databricks Services)
For sample queries for this table, see [Resource Graph sample queries for resour
- microsoft.datamigration/slots - microsoft.datamigration/sqlmigrationservices - Microsoft.DataProtection/BackupVaults (Backup vaults)-- microsoft.dataprotection/resourceguards
+- Microsoft.DataProtection/resourceGuards (Resource Guards (Preview))
- microsoft.dataprotection/resourceoperationgatekeepers - Microsoft.DataShare/accounts (Data Shares) - Microsoft.DBforMariaDB/servers (Azure Database for MariaDB servers)
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.DigitalTwins/digitalTwinsInstances (Azure Digital Twins) - Microsoft.DocumentDB/cassandraClusters (Azure Managed Instance for Apache Cassandra) - Microsoft.DocumentDb/databaseAccounts (Azure Cosmos DB accounts)
- - Sample query: [List Cosmos DB with specific write locations](../samples/samples-by-category.md#list-cosmos-db-with-specific-write-locations)
+ - Sample query: [List Azure Cosmos DB with specific write locations](../samples/samples-by-category.md#list-azure-cosmos-db-with-specific-write-locations)
- Microsoft.DomainRegistration/domains (App Service Domains) - microsoft.dynamics365fraudprotection/instances - Microsoft.EdgeOrder/addresses (Azure Edge Hardware Center Address)
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.EventHub/namespaces (Event Hubs Namespaces) - Microsoft.Experimentation/experimentWorkspaces (Experiment Workspaces) - Microsoft.ExtendedLocation/CustomLocations (Custom locations)
- - Sample query: [List Azure Arc enabled custom locations with VMware or SCVMM enabled](../samples/samples-by-category.md#list-azure-arc-enabled-custom-locations-with-vmware-or-scvmm-enabled)
+ - Sample query: [List Azure Arc-enabled custom locations with VMware or SCVMM enabled](../samples/samples-by-category.md#list-azure-arc-enabled-custom-locations-with-vmware-or-scvmm-enabled)
- microsoft.falcon/namespaces - Microsoft.Fidalgo/devcenters (Fidalgo DevCenters) - Microsoft.Fidalgo/projects (Fidalgo Projects)
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.HanaOnAzure/hanaInstances (SAP HANA on Azure) - Microsoft.HanaOnAzure/sapMonitors (Azure Monitors for SAP Solutions) - microsoft.hardwaresecuritymodules/dedicatedhsms-- microsoft.hdinsight/clusterpools-- microsoft.hdinsight/clusterpools/clusters
+- Microsoft.HDInsight/clusterpools (HDInsight cluster pools)
+- Microsoft.HDInsight/clusterpools/clusters (HDInsight gen2 clusters)
- Microsoft.HDInsight/clusters (HDInsight clusters) - Microsoft.HealthBot/healthBots (Azure Health Bot) - Microsoft.HealthcareApis/services (Azure API for FHIR) - microsoft.healthcareapis/services/privateendpointconnections - Microsoft.HealthcareApis/workspaces (Healthcare APIs Workspaces)-- Microsoft.HealthcareApis/workspaces/dicomservices (DICOM Services)-- Microsoft.HealthcareApis/workspaces/fhirservices (FHIR Services)
+- Microsoft.HealthcareApis/workspaces/dicomservices (DICOM services)
+- Microsoft.HealthcareApis/workspaces/fhirservices (FHIR services)
- Microsoft.HealthcareApis/workspaces/iotconnectors (IoT Connectors)
+- Microsoft.HpcWorkbench/instances (HPC Workbenches (preview))
- Microsoft.HybridCompute/machines (Servers - Azure Arc)
+ - Sample query: [Get count and percentage of Arc-enabled servers by domain](../samples/samples-by-category.md#get-count-and-percentage-of-arc-enabled-servers-by-domain)
- microsoft.hybridcompute/machines/extensions - Microsoft.HybridCompute/privateLinkScopes (Azure Arc Private Link Scopes) - Microsoft.HybridData/dataManagers (StorSimple Data Managers)
For sample queries for this table, see [Resource Graph sample queries for resour
- Sample query: [Count key vault resources](../samples/samples-by-category.md#count-key-vault-resources) - Sample query: [Key vaults with subscription name](../samples/samples-by-category.md#key-vaults-with-subscription-name) - Microsoft.Kubernetes/connectedClusters (Kubernetes - Azure Arc)
- - Sample query: [List all Azure Arc enabled Kubernetes clusters without Azure Monitor extension](../samples/samples-by-category.md#list-all-azure-arc-enabled-kubernetes-clusters-without-azure-monitor-extension)
- - Sample query: [List all Azure Arc enabled Kubernetes resources](../samples/samples-by-category.md#list-all-azure-arc-enabled-kubernetes-resources)
+ - Sample query: [List all Azure Arc-enabled Kubernetes clusters without Azure Monitor extension](../samples/samples-by-category.md#list-all-azure-arc-enabled-kubernetes-clusters-without-azure-monitor-extension)
+ - Sample query: [List all Azure Arc-enabled Kubernetes resources](../samples/samples-by-category.md#list-all-azure-arc-enabled-kubernetes-resources)
- Microsoft.Kusto/clusters (Azure Data Explorer Clusters) - Microsoft.Kusto/clusters/databases (Azure Data Explorer Databases) - Microsoft.LabServices/labAccounts (Lab Services)
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.Network/customIpPrefixes (Custom IP Prefixes) - microsoft.network/ddoscustompolicies - Microsoft.Network/ddosProtectionPlans (DDoS protection plans)
+- microsoft.network/dnsforwardingrulesets
- microsoft.network/dnsresolvers - Microsoft.Network/dnsZones (DNS zones) - microsoft.network/dscpconfigurations
For sample queries for this table, see [Resource Graph sample queries for resour
- microsoft.network/networkprofiles - Microsoft.Network/NetworkSecurityGroups (Network security groups) - Sample query: [Show unassociated network security groups](../samples/samples-by-category.md#show-unassociated-network-security-groups)
+- microsoft.network/networksecurityperimeters
- microsoft.network/networkvirtualappliances - microsoft.network/networkwatchers (Network Watchers) - microsoft.network/networkwatchers/connectionmonitors
For sample queries for this table, see [Resource Graph sample queries for resour
- Microsoft.TimeSeriesInsights/environments/referenceDataSets (Time Series Insights reference data sets) - microsoft.token/stores - microsoft.tokenvault/vaults
+- Microsoft.VideoIndexer/accounts (Video Analyzer for Media)
- Microsoft.VirtualMachineImages/imageTemplates (Image Templates) - microsoft.visualstudio/account (Azure DevOps organizations) - microsoft.visualstudio/account/extension
For sample queries for this table, see [Resource Graph sample queries for resour
- microsoft.workloadbuilder/migrationagents - microsoft.workloadbuilder/workloads - myget.packagemanagement/services
+- NGINX.NGINXPLUS/nginxDeployments (NGINX Deployment)
- Paraleap.CloudMonix/services (CloudMonix) - Pokitdok.Platform/services (PokitDok Platform)-- Providers.Test/statefulIbizaEngines (Application assessments)
+- Providers.Test/statefulIbizaEngines (Service Linkers)
- RavenHq.Db/databases (RavenHQ) - Raygun.CrashReporting/apps (Raygun) - Sendgrid.Email/accounts (SendGrid Accounts)
governance Samples By Category https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/samples/samples-by-category.md
Title: List of sample Azure Resource Graph queries by category description: List sample queries for Azure Resource Graph. Categories include Tags, Azure Advisor, Key Vault, Kubernetes, Guest Configuration, and more. Previously updated : 07/21/2021 Last updated : 08/04/2021
Otherwise, use <kbd>Ctrl</kbd>-<kbd>F</kbd> to use your browser's search feature
[!INCLUDE [azure-resource-graph-samples-cat-azure-arc-enabled-kubernetes](../../../../includes/resource-graph/samples/bycat/azure-arc-enabled-kubernetes.md)]
+## Azure Arc-enabled servers
++ ## Azure Container Registry [!INCLUDE [azure-resource-graph-samples-cat-azure-container-registry](../../../../includes/resource-graph/samples/bycat/azure-container-registry.md)]
Otherwise, use <kbd>Ctrl</kbd>-<kbd>F</kbd> to use your browser's search feature
[!INCLUDE [azure-resource-graph-samples-cat-azure-policy](../../../../includes/resource-graph/samples/bycat/azure-policy.md)]
-## Azure Policy Guest Configuration
+## Azure Policy guest configuration
[!INCLUDE [azure-resource-graph-samples-cat-azure-policy-guest-configuration](../../../../includes/resource-graph/samples/bycat/azure-policy-guest-configuration.md)]
Otherwise, use <kbd>Ctrl</kbd>-<kbd>F</kbd> to use your browser's search feature
[!INCLUDE [azure-resource-graph-samples-cat-networking](../../../../includes/resource-graph/samples/bycat/networking.md)]
+## Resource health
++ ## Tags [!INCLUDE [azure-resource-graph-samples-cat-tags](../../../../includes/resource-graph/samples/bycat/tags.md)]
governance Samples By Table https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/governance/resource-graph/samples/samples-by-table.md
Title: List of sample Azure Resource Graph queries by table description: List sample queries for Azure Resource Graph. Tables include Resources, ResourceContainers, PolicyResources, and more. Previously updated : 07/21/2021 Last updated : 08/04/2021
details, see [Resource Graph tables](../concepts/query-language.md#resource-grap
[!INCLUDE [azure-resource-graph-samples-table-guestconfigurationresources](../../../../includes/resource-graph/samples/bytable/guestconfigurationresources.md)]
+## HealthResources
++ ## KubernetesConfigurationResources [!INCLUDE [azure-resource-graph-samples-table-kubernetesconfigurationresources](../../../../includes/resource-graph/samples/bytable/kubernetesconfigurationresources.md)]
healthcare-apis Authentication Authorization https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/authentication-authorization.md
This article provides an overview of the authentication and authorization proces
## Authentication
-The Healthcare APIs is a collection of secured managed services using [Azure Active Directory (Azure AD)](https://review.docs.microsoft.com/azure/active-directory/), a global identity provider that supports [OAuth 2.0](https://oauth.net/2/).
+The Healthcare APIs is a collection of secured managed services using [Azure Active Directory (Azure AD)](https://docs.microsoft.com/azure/active-directory/), a global identity provider that supports [OAuth 2.0](https://oauth.net/2/).
For the Healthcare APIs services to access Azure resources, such as storage accounts and event hubs, you must **enable the system managed identity**, and **grant proper permissions** to the managed identity. For more information, see [Azure managed identities](../active-directory/managed-identities-azure-resources/overview.md).
-The Healthcare APIs does not support other identity providers. However, customers can use their own identity provider to secure applications, and enable them to interact with the Healthcare APIs by managing client applications and user data access controls.
+The Healthcare APIs do not support other identity providers. However, customers can use their own identity provider to secure applications, and enable them to interact with the Healthcare APIs by managing client applications and user data access controls.
The client applications are registered in the Azure AD and can be used to access the Healthcare APIs. User data access controls are done in the applications or services that implement business logic.
The FHIR service of the Healthcare APIs provides the following roles:
The DICOM service of the Healthcare APIs provides the following roles:
-* **DICOM Data Owner**: Can read, write and delete DICOM data.
+* **DICOM Data Owner**: Can read, write, and delete DICOM data.
* **DICOM Data Read**: Can read DICOM data. The IoT Connector does not require application roles, but it does rely on the "Azure Event Hubs Data Receiver" role to retrieve data stored in the event hub of the customer's subscription.
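As a rough sketch (not part of this article), the "Azure Event Hubs Data Receiver" role could be granted to a service's system-assigned managed identity with the Azure CLI; the principal ID and event hub resource ID below are placeholders:

```bash
# Placeholder values - substitute the managed identity's object ID and your event hub's resource ID.
principalId="<object-id-of-the-managed-identity>"
eventHubId="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.EventHub/namespaces/<namespace>/eventhubs/<event-hub>"

# Assign the built-in "Azure Event Hubs Data Receiver" role, scoped to the event hub.
az role assignment create \
  --assignee "$principalId" \
  --role "Azure Event Hubs Data Receiver" \
  --scope "$eventHubId"
```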
For obtaining an access token for the Healthcare APIs, these are the steps using
In the **client credentials flow**, permissions are granted directly to the application itself. When the application presents a token to a resource, the resource enforces that the application itself has authorization to perform an action since there is no user involved in the authentication. Therefore, it is different from the **authorization code flow** in the following ways: -- The user or the client does not need to login interactively
+- The user or the client does not need to log in interactively
- The authorization code is not required. - The access token is obtained directly through application permissions.
You can use online tools such as [https://jwt.ms](https://jwt.ms/) or [https://j
|appid |e97e1b8c-xxx |This is the application ID of the client using the token. The application can act as itself or on behalf of a user. The application ID typically represents an application object, but it can also represent a service principal object in Azure AD.| |appidacr |1 |Indicates how the client was authenticated. For a public client, the value is "0". If client ID and client secret are used, the value is "1". If a client certificate was used for authentication, the value is "2".| |idp |https://sts.windows.net/{tenantid}/|Records the identity provider that authenticated the subject of the token. This value is identical to the value of the Issuer claim unless the user account is not in the same tenant as the issuer - guests, for instance. If the claim is not present, it means that the value of iss can be used instead. For personal accounts being used in an organizational context (for instance, a personal account invited to an Azure AD tenant), the idp claim may be 'live.com' or an STS URI containing the Microsoft account tenant 9188040d-6c67-4c5b-b112-36a304b66dad.|
-|oid |e.g., tenantid |The is the immutable identifier for an object in the Microsoft identity system, in this case, a user account. This ID uniquely identifies the user across applications - two different applications signing in the same user will receive the same value in the oid claim. The Microsoft Graph will return this ID as the ID property for a given user account. Because the oid allows multiple apps to correlate users, the profile scope is required to receive this claim. Note: If a single user exists in multiple tenants, the user will contain a different object ID in each tenant - they are considered different accounts, even though the user logs into each account with the same credentials.|
+|oid |For example, tenantid |This is the immutable identifier for an object in the Microsoft identity system, in this case, a user account. This ID uniquely identifies the user across applications - two different applications signing in the same user will receive the same value in the oid claim. The Microsoft Graph will return this ID as the ID property for a given user account. Because the oid allows multiple apps to correlate users, the profile scope is required to receive this claim. Note: If a single user exists in multiple tenants, the user will contain a different object ID in each tenant - they are considered different accounts, even though the user logs into each account with the same credentials.|
|rh |0.ARoxxx |An internal claim used by Azure to revalidate tokens. It should be ignored.|
-|sub |e.g., tenantid |The principal about which the token asserts information, such as the user of an app. This value is immutable and cannot be reassigned or reused. The subject is a pairwise identifier - it is unique to a particular application ID. Therefore, if a single user signs into two different apps using two different client IDs, those apps will receive two different values for the subject claim. This may or may not be desired depending on your architecture and privacy requirements.|
-|tid |e.g., tenantid |A GUID that represents the Azure AD tenant that the user is from. For work and school accounts, the GUID is the immutable tenant ID of the organization that the user belongs to. For personal accounts, the value is 9188040d-6c67-4c5b-b112-36a304b66dad. The profile scope is required in order to receive this claim.
+|sub |For example, tenantid |The principal about which the token asserts information, such as the user of an app. This value is immutable and cannot be reassigned or reused. The subject is a pairwise identifier - it is unique to a particular application ID. Therefore, if a single user signs into two different apps using two different client IDs, those apps will receive two different values for the subject claim. This may or may not be desired depending on your architecture and privacy requirements.|
+|tid |For example, tenantid |A GUID that represents the Azure AD tenant that the user is from. For work and school accounts, the GUID is the immutable tenant ID of the organization that the user belongs to. For personal accounts, the value is 9188040d-6c67-4c5b-b112-36a304b66dad. The profile scope is required in order to receive this claim.
|uti |bY5glsxxx |An internal claim used by Azure to revalidate tokens. It should be ignored.| |ver |1 |Indicates the version of the token.| **The access token is valid for one hour by default. You can obtain a new token or renew it using the refresh token before it expires.**
-To obtain an access token, you can use tools such as Postman, the Rest Client extension in Visual Studio Code, PowerShell, CLI, curl and the [Azure AD authentication libraries](../active-directory/develop/reference-v2-libraries.md).
+To obtain an access token, you can use tools such as Postman, the Rest Client extension in Visual Studio Code, PowerShell, CLI, curl, and the [Azure AD authentication libraries](../active-directory/develop/reference-v2-libraries.md).
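For instance, a minimal curl sketch of the client credentials flow against the Azure AD v2.0 token endpoint could look like the following; the tenant ID, client ID, client secret, and scope are placeholders for your own values:

```bash
# Placeholder values - replace with your tenant, app registration, and service scope.
tenantId="<tenant-id>"
clientId="<client-id>"
clientSecret="<client-secret>"
scope="https://<your-service>.azurehealthcareapis.com/.default"

# Request a token using the client credentials grant.
curl -X POST "https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token" \
  --data-urlencode "grant_type=client_credentials" \
  --data-urlencode "client_id=${clientId}" \
  --data-urlencode "client_secret=${clientSecret}" \
  --data-urlencode "scope=${scope}"
```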
## Next steps
healthcare-apis Autoscale Azure Api Fhir https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/azure-api-for-fhir/autoscale-azure-api-fhir.md
+
+ Title: Autoscale for Azure API for FHIR
+description: This article describes the autoscale feature for Azure API for FHIR.
++++ Last updated : 07/26/2021+++
+# Autoscale for Azure API for FHIR
+
+Azure API for FHIR, as a managed service, allows customers to persist FHIR-compliant healthcare data and exchange it securely through the service API. To accommodate different transaction workloads, customers can use manual scale or autoscale.
+
+## What is autoscale?
+
+By default, the Azure API for FHIR is set to manual scale. This option works well when the transaction workloads are known and consistent. Customers can adjust the throughput (`RU/s`) through the portal up to 10,000, and can submit a support request to increase the limit beyond that.
+
+With autoscale, customers can run various workloads and the throughput `RU/s` are scaled up and down automatically without manual adjustments.
+
+## How to enable autoscale?
+
+To enable the autoscale feature, you can create a one-time support ticket to request it. The Microsoft support team will enable the autoscale feature based on the support priority.
+
+> [!NOTE]
+> The autoscale feature isn't available from the Azure portal.
+
+## How to adjust the maximum throughput RU/s?
+
+When autoscale is enabled, the system calculates and sets the initial `Tmax` value. Scalability is governed by the maximum throughput `RU/s` value, or `Tmax`, and runs between `0.1 * Tmax` (that is, 10% of `Tmax`) and `Tmax` `RU/s`.
+
+You can increase the max `RU/s` or `Tmax` value and go as high as the service supports. When the service is busy, the throughput `RU/s` are scaled up to the `Tmax` value. When the service is idle, the throughput `RU/s` are scaled down to 10% of the `Tmax` value.
+
+You can also decrease the max `RU/s` or `Tmax` value. When you lower the max `RU/s`, the minimum value you can set it to is: `MAX (4000, highest max RU/s ever provisioned / 10, current storage in GB * 400)`, rounded to the nearest 1000 `RU/s`.
+
+* **Example 1**: You have 1 GB of data and the highest provisioned `RU/s` is 10,000. The minimum value is Max (**4000**, 10,000/10, 1 x 400) = 4000. The first number, **4000**, is used.
+* **Example 2**: You have 20 GB of data and the highest provisioned `RU/s` is 100,000. The minimum value is Max (4000, **100,000/10**, 20 x 400) = 10,000. The second number, **100,000/10 = 10,000**, is used.
+* **Example 3**: You have 80 GB of data and the highest provisioned `RU/s` is 300,000. The minimum value is Max (4000, 300,000/10, **80 x 400**) = 32,000. The third number, **80 x 400 = 32,000**, is used.
+
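Putting the formula and the examples above together, here is a minimal, illustrative sketch of the calculation (not an official tool):

```bash
# Sketch: minimum allowed max RU/s (Tmax) when lowering the value,
# per MAX(4000, highest max RU/s ever provisioned / 10, current storage in GB * 400),
# rounded to the nearest 1000 RU/s.
min_tmax() {
  local highest_rus=$1 storage_gb=$2
  local a=4000 b=$(( highest_rus / 10 )) c=$(( storage_gb * 400 ))
  local m=$a
  (( b > m )) && m=$b
  (( c > m )) && m=$c
  echo $(( (m + 500) / 1000 * 1000 ))
}

min_tmax 10000 1     # Example 1 -> 4000
min_tmax 100000 20   # Example 2 -> 10000
min_tmax 300000 80   # Example 3 -> 32000
```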
+You can adjust the max `RU/s` or `Tmax` value through the portal if it is a valid number and no greater than 10,000 `RU/s`. You can create a support ticket to request a `Tmax` value larger than 10,000.
+
+## What is the cost impact of autoscale?
+
+The autoscale feature incurs costs because the provisioned throughput is managed and scaled automatically. This cost increase doesn't apply to storage and runtime costs. For information about pricing, see [Azure API for FHIR pricing](https://azure.microsoft.com/pricing/details/azure-api-for-fhir/).
+
+>[!div class="nextstepaction"]
+>[About Azure API for FHIR](overview.md)
healthcare-apis Disaster Recovery https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/azure-api-for-fhir/disaster-recovery.md
+
+ Title: Disaster recovery for Azure API for FHIR
+description: In this article, you'll learn how to enable disaster recovery features for Azure API for FHIR.
++++ Last updated : 08/03/2021+++
+# Disaster recovery for Azure API for FHIR
+
+The Azure API for FHIR® is a fully managed service, based on Fast Healthcare Interoperability Resources (FHIR®). To meet business and compliance requirements, you can use the disaster recovery (DR) feature for Azure API for FHIR.
+
+The DR feature provides a Recovery Point Objective (RPO) of 15 minutes and a Recovery Time Objective (RTO) of 60 minutes.
+
+ ## How to enable DR
+
+To enable the DR feature, create a one-time support ticket. You can choose an Azure paired region or another region where the Azure API for FHIR is supported. The Microsoft support team will enable the DR feature based on the support priority.
+
+## How the DR process works
+
+The DR process involves the following steps:
+* Data replication
+* Automatic failover
+* Affected region recovery
+* Manual failback
+
+### Data replication in the secondary region
+
+By default, the Azure API for FHIR offers data protection through backup and restore. When the disaster recovery feature is enabled, data replication begins. A data replica is automatically created and synchronized in the secondary Azure region. The initial data replication can take a few minutes to a few hours, or longer, depending on the amount of data. The secondary data replica is a replication of the primary data. It's used directly to recover the service, and it helps speed up the recovery process.
+
+Note that the throughput `RU/s` must be set to the same value in the primary and secondary regions.
+
+[ ![Azure Traffic Manager.](media/disaster-recovery/azure-traffic-manager.png) ](media/disaster-recovery/azure-traffic-manager.png#lightbox)
+
+### Automatic failover
+
+During a primary region outage, the Azure API for FHIR automatically fails over to the secondary region and the same service endpoint is used. The service is expected to resume in one hour or less, and potential data loss is up to 15 minutes' worth of data. Other configuration changes may be required. For more information, see [Configuration changes in DR](#configuration-changes-in-dr).
+
+[ ![Failover in disaster recovery.](media/disaster-recovery/failover-in-disaster-recovery.png) ](media/disaster-recovery/failover-in-disaster-recovery.png#lightbox)
+
+### Affected region recovery
+
+After the affected region recovers, it's automatically available as a secondary region and data replication restarts. You can start the data recovery process or wait until the failback step is completed.
+
+[ ![Replication in disaster recovery.](media/disaster-recovery/replication-in-disaster-recovery.png) ](media/disaster-recovery/replication-in-disaster-recovery.png#lightbox)
+
+When the compute has failed back to the recovered region but the data hasn't, there may be network latency. The main reason is that the compute and the data are in two different regions. The latency should disappear automatically as soon as the data fails back to the recovered region through a manual trigger.
+
+[ ![Network latency.](media/disaster-recovery/network-latency.png) ](media/disaster-recovery/network-latency.png#lightbox)
++
+### Manual failback
+
+The compute fails back automatically to the recovered region. The data is switched back to the recovered region manually by the Microsoft support team using the script.
+
+[ ![Failback in disaster recovery.](media/disaster-recovery/failback-in-disaster-recovery.png) ](media/disaster-recovery/failback-in-disaster-recovery.png#lightbox)
+
+## Configuration changes in DR
+
+Other configuration changes may be required when Private Link, Customer-Managed Key (CMK), IoMT FHIR Connector (Internet of Medical Things), and $export are used.
+
+### Private link
+
+You can enable the private link feature before or after the Azure API for FHIR has been provisioned. You can also provision private link before or after the DR feature has been enabled. Refer to the list below when you're ready to configure Private Link for DR.
+
+* Configure Azure Private Link in the primary region. This step isn't required in the secondary region. For more information, see [Configure private link](https://docs.microsoft.com/azure/healthcare-apis/fhir/configure-private-link)
+
+* Create one Azure VNet in the primary region and another VNet in the secondary region. For information, see [Create a virtual network using the Azure portal](https://docs.microsoft.com/azure/virtual-network/quick-create-portal).
+
+* From the primary region VNet, create a VNet peering to the secondary region VNet. For more information, see [Virtual network peering](https://docs.microsoft.com/azure/virtual-network/virtual-network-peering-overview).
+
+* Use the default settings, or tailor the configuration as needed. What matters is that traffic can flow between the two virtual networks.
+
+* When the private DNS is set up, the VNet in the secondary region needs to be manually added as a virtual network link. The primary VNet should have already been added as part of the Private Link endpoint creation flow. For more information, see [Virtual network links](https://docs.microsoft.com/azure/dns/private-dns-virtual-network-links).
+
+* Optionally, set up one VM in the primary region VNet and one in the secondary region VNet. You can access the Azure API for FHIR from both VMs.
+
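As a rough sketch under hypothetical names, regions, and address spaces, the virtual networks and peering from the steps above could be created with the Azure CLI:

```bash
# Placeholder resource group, names, regions, and address spaces - adjust to your environment.
rg="my-dr-rg"

az network vnet create --resource-group "$rg" --name vnet-primary \
  --location eastus2 --address-prefix 10.1.0.0/16

az network vnet create --resource-group "$rg" --name vnet-secondary \
  --location centralus --address-prefix 10.2.0.0/16

# Peer the two virtual networks in both directions so traffic can flow between them.
az network vnet peering create --resource-group "$rg" --name primary-to-secondary \
  --vnet-name vnet-primary --remote-vnet vnet-secondary --allow-vnet-access

az network vnet peering create --resource-group "$rg" --name secondary-to-primary \
  --vnet-name vnet-secondary --remote-vnet vnet-primary --allow-vnet-access
```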
+The private link feature should continue to work during a regional outage and after the failback has completed. For more information, see [Configure private link](https://docs.microsoft.com/azure/healthcare-apis/fhir/configure-private-link).
+
+> [!NOTE]
+> Configuring virtual networks and VNet peering does not affect data replication.
+
+### CMK
+
+Your access to the Azure API for FHIR will be maintained if the key vault hosting the managed key in your subscription is accessible. There's a possible temporary downtime as Key Vault can take up to 20 minutes to re-establish its connection. For more information, see [Azure Key Vault availability and redundancy](https://docs.microsoft.com/azure/key-vault/general/disaster-recovery-guidance).
+
+### $export
+
+The export job will be picked up from another region after 10 minutes without an update to the job status. Follow the guidance for Azure storage for recovering your storage account in the event of a regional outage. For more information, see [Disaster recovery and storage account failover](https://docs.microsoft.com/azure/storage/common/storage-disaster-recovery-guidance).
+
+Ensure that you grant the same permissions to the system identity of the Azure API for FHIR. Also, if the storage account is configured with selected networks, see [How to export FHIR data](https://docs.microsoft.com/azure/healthcare-apis/fhir/export-data).
+
+### IoMT FHIR Connector
+
+Any existing connection won't function until the failed region is restored. You can create a new connection once the failover has completed and your FHIR server is accessible. This new connection will continue to function when failback occurs.
+
+> [!NOTE]
+> IoMT Connector is a preview feature and does not provide support for disaster recovery.
+
+## How to test DR
+
+While not required, you can test the DR feature in a non-production environment. A DR test includes only the data; the compute isn't included.
+
+Consider the following steps for a DR test.
+
+* Prepare a test environment with test data. It's recommended that you use a service instance with small amounts of data to reduce the time to replicate the data.
+
+* Create a support ticket and provide your Azure subscription and the service name for the Azure API for FHIR for your test environment.
+
+* Come up with a test plan, as you would with any DR test.
+
+* The Microsoft support team enables the DR feature and confirms that the failover has taken place.
+
+* Conduct your DR test and record the testing results, which should include any data loss and network latency issues.
+
+* For failback, notify the Microsoft support team to complete the failback step.
+
+* (Optional) Share any feedback with the Microsoft support team.
++
+> [!NOTE]
+> The DR test will double the cost of your test environment during the test. No extra cost is incurred after the DR test is completed and the DR feature is disabled.
+
+## Cost of disaster recovery
+
+The disaster recovery feature incurs extra costs because the compute and the data replica run in the secondary region environment. For pricing details, refer to the [Azure API for FHIR pricing](https://azure.microsoft.com/pricing/details/azure-api-for-fhir) page.
+
+> [!NOTE]
+> The DR offering is subject to the [SLA for Azure API for FHIR](https://azure.microsoft.com/support/legal/sla/azure-api-for-fhir/v1_0/), 1.0.
++
+## Next steps
+
+In this article, you've learned how DR for Azure API for FHIR works and how to enable it. To learn about Azure API for FHIR's other supported features, see:
+
+>[!div class="nextstepaction"]
+>[FHIR supported features](fhir-features-supported.md)
healthcare-apis Overview Of Search https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/azure-api-for-fhir/overview-of-search.md
To help manage the returned resources, there are search result parameters that y
| - | -- | - | | | _elements | Yes | Yes | | _count | Yes | Yes | _count is limited to 1000 resources. If it's set higher than 1000, only 1000 will be returned and a warning will be returned in the bundle. |
-| _include | Yes | Yes | Included items are limited to 100. _include on PaaS and OSS on Cosmos DB do not include :iterate support [(#1313)](https://github.com/microsoft/fhir-server/issues/1313). |
-| _revinclude | Yes | Yes |Included items are limited to 100. _revinclude on PaaS and OSS on Cosmos DB do not include :iterate support [(#1313)](https://github.com/microsoft/fhir-server/issues/1313). Issue [#1319](https://github.com/microsoft/fhir-server/issues/1319) |
+| _include | Yes | Yes | Included items are limited to 100. _include on PaaS and OSS on Cosmos DB do not include :iterate support [(#2137)](https://github.com/microsoft/fhir-server/issues/2137). |
+| _revinclude | Yes | Yes |Included items are limited to 100. _revinclude on PaaS and OSS on Cosmos DB do not include :iterate support [(#2137)](https://github.com/microsoft/fhir-server/issues/2137). There is also an incorrect status code for a bad request [#1319](https://github.com/microsoft/fhir-server/issues/1319) |
| _summary | Yes | Yes | | _total | Partial | Partial | _total=none and _total=accurate | | _sort | Partial | Partial | sort=_lastUpdated is supported. For Azure API for FHIR and OSS Cosmos DB databases created after April 20, 2021 sort is also supported on first name, last name, and clinical date. The FHIR service and the OSS SQL DB database support sorting by strings and dates. |
healthcare-apis Api Versioning Dicom Service https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/api-versioning-dicom-service.md
+
+ Title: API Versioning for DICOM service - Azure Healthcare APIs
+description: This guide gives an overview of the API version policies for the DICOM service.
+++++ Last updated : 08/04/2021+++
+# API versioning for DICOM service
+
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+
+This reference guide provides you with an overview of the API version policies for the DICOM service.
+
+All versions of the DICOM APIs will always conform to the DICOMweb&trade; Standard specifications, but versions may expose different APIs based on the [DICOM Conformance Statement](dicom-services-conformance-statement.md).
+
+## Specifying version of REST API in requests
+
+The version of the REST API should be explicitly specified in the request URL as in the following example:
+
+`<service_url>/v<version>/studies`
+
+Currently, routes without a version are still supported. For example, `<service_url>/studies` has the same behavior as specifying the version as v1.0-prerelease. However, we strongly recommend that you specify the version in all requests via the URL.
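For example, a request that explicitly targets the v1.0-prerelease version of the studies route could look like this with curl; the workspace and DICOM service names are placeholders, and `$token` is assumed to be a valid access token:

```bash
# Placeholder service URL - replace with your own workspace and DICOM service names.
serviceUrl="https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com"

curl -X GET "${serviceUrl}/v1.0-prerelease/studies" \
  --header "Accept: application/dicom+json" \
  --header "Authorization: Bearer $token"
```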
+
+## Supported versions
+
+Currently the supported versions are:
+
+* v1.0-prerelease
+
+The OpenAPI document for the supported versions can be found at the following URL:
+
+`<service_url>/{version}/api.yaml`
+
+## Prerelease versions
+
+An API version with the label "prerelease" indicates that the version is not ready for production, and it should only be used in testing environments. These endpoints may experience breaking changes without notice.
+
+## How versions are incremented
+
+We currently increment only the major version, and only for breaking changes (that is, changes that are not backward compatible). All minor versions are implied to be 0. All versions are in the format `Major.0`.
+
+Below are some examples of breaking changes (Major version is incremented):
+
+1. Renaming or removing endpoints.
+2. Removing parameters or adding mandatory parameters.
+3. Changing status code.
+4. Deleting a property in a response, or altering the response type. (Adding properties to the response is not a breaking change.)
+5. Changing the type of a property.
+6. Changing the behavior of an API, such as business logic that used to do foo but now does bar.
+
+Non-breaking changes (Version is not incremented):
+
+1. Addition of properties that are nullable or have a default value.
+2. Addition of properties to a response model.
+3. Changing the order of properties.
+
+## Header in response
+
+`ReportApiVersions` is turned on, which means we will return the headers `api-supported-versions` and `api-deprecated-versions` when appropriate.
+
+* `api-supported-versions` will list which versions are supported for the requested API. It's only returned when calling an endpoint annotated with `ApiVersion("<someVersion>")`.
+
+* `api-deprecated-versions` will list which versions have been deprecated for the requested API. It's only returned when calling an endpoint annotated with `ApiVersion("<someVersion>", Deprecated = true)`.
+
+Example:
+
+`ApiVersion("1.0")`
+
+`ApiVersion("1.0-prerelease", Deprecated = true)`
+
+[ ![API supported and deprecated versions.](media/api-supported-deprecated-versions.png) ](media/api-supported-deprecated-versions.png#lightbox)
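A quick way to observe these headers from the client side is to include the response headers in a request; this sketch assumes the same placeholder service URL and a valid `$token`:

```bash
# -i prints the response headers; look for api-supported-versions
# (and api-deprecated-versions when the called version is deprecated).
curl -i -X GET "https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com/v1.0-prerelease/studies" \
  --header "Accept: application/dicom+json" \
  --header "Authorization: Bearer $token"
```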
+
healthcare-apis Deploy Dicom Services In Azure https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/deploy-dicom-services-in-azure.md
Previously updated : 07/10/2021 Last updated : 08/04/2021
In this quickstart, you'll learn how to deploy the DICOM Service using the Azure portal.
+Once deployment is complete, you can use the Azure portal to navigate to the newly created DICOM service to see the details, including your Service URL. The Service URL to access your DICOM service will be: ```https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com```. Make sure to specify the version as part of the URL when making requests. More information can be found in the [API Versioning for DICOM service documentation](api-versioning-dicom-service.md).
+ ## Prerequisite To deploy the DICOM service, you must have a workspace created in the Azure portal. For more information about creating a workspace, see **Deploy Workspace in the Azure portal**.
healthcare-apis Dicom Change Feed Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-change-feed-overview.md
Previously updated : 07/10/2021 Last updated : 08/04/2021
> [!IMPORTANT] > Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
-The Change Feed provides logs of all the changes that occur in the DICOM Service. The Change Feed provides ordered, guaranteed, immutable, and read-only logs of these changes. The Change Feed offers the ability to go through the history of the DICOM Service and acts upon the creates and deletes in the service.
+The Change Feed provides logs of all the changes that occur in the DICOM service. The Change Feed provides ordered, guaranteed, immutable, and read-only logs of these changes. The Change Feed offers the ability to go through the history of the DICOM service and acts upon the creates and deletes in the service.
Client applications can read these logs at any time, either in streaming, or in batch mode. The Change Feed enables you to build efficient and scalable solutions that process change events that occur in your DICOM service. You can process these change events asynchronously, incrementally or in-full. Any number of client applications can independently read the Change Feed, in parallel, and at their own pace.
+Make sure to specify the version as part of the URL when making requests. More information can be found in the [API Versioning for DICOM service Documentation](api-versioning-dicom-service.md).
+ ## API Design The API exposes two `GET` endpoints for interacting with the Change Feed. A typical flow for consuming the Change Feed is [provided below](#example-usage-flow).
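As a minimal sketch (the exact routes and query parameters, such as `offset` and `limit`, are assumptions based on the Change Feed design and may differ from the conformance documentation), reading the Change Feed with curl could look like this:

```bash
# Placeholder service URL and token - replace with your own values.
serviceUrl="https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com"

# Read a window of change feed entries (offset/limit are assumed query parameters).
curl -X GET "${serviceUrl}/v1.0-prerelease/changefeed?offset=0&limit=10" \
  --header "Accept: application/json" \
  --header "Authorization: Bearer $token"

# Read the latest change feed entry.
curl -X GET "${serviceUrl}/v1.0-prerelease/changefeed/latest" \
  --header "Accept: application/json" \
  --header "Authorization: Bearer $token"
```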
healthcare-apis Dicom Configure Azure Rbac https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-configure-azure-rbac.md
Last updated 07/13/2020
-# Configure Azure RBAC for the DICOM service
+# Configure Azure RBAC for the DICOM service
+
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
In this article, you will learn how to use [Azure role-based access control (Azure RBAC)](https://docs.microsoft.com/azure/role-based-access-control) to assign access to the DICOM service.
healthcare-apis Dicom Get Access Token Azure Cli https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-get-access-token-azure-cli.md
# Get access token for the DICOM service using Azure CLI
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+ In this article, you'll learn how to obtain an access token for the DICOM service using the Azure CLI. When you [deploy the DICOM service](deploy-dicom-services-in-azure.md), you configure a set of users or service principals that have access to the service. If your user object ID is in the list of allowed object IDs, you can access the service using a token obtained using the Azure CLI. ## Prerequisites
$token=$(az account get-access-token --resource=https://dicom.healthcareapis.azu
```Azure CLI
-curl -X GET --header "Authorization: Bearer $token" https://<workspace-service>.dicom.azurehealthcareapis.com/changefeed
+curl -X GET --header "Authorization: Bearer $token" https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com/v<version of REST API>/changefeed
``` ## Next steps
healthcare-apis Dicom Register Azure Active Directory Applications https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-register-azure-active-directory-applications.md
# Register Azure Active Directory applications for the DICOM service
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+ You have several options to choose from when you're setting up the DICOM service or the FHIR Server for Azure (OSS). For open source, you'll need to create your own resource application registration. For Azure API for FHIR, this resource application is created automatically. ## Application registrations
healthcare-apis Dicom Register Confidential Client Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-register-confidential-client-application.md
# Register a confidential client application
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+ In this tutorial, you'll learn how to register a confidential client application in Azure Active Directory (Azure AD). ## Register a new application
healthcare-apis Dicom Register Public Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-register-public-application.md
# Register a public client application
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+ In this article, you'll learn how to register a public application in the Azure Active Directory (Azure AD). Client application registrations are Azure AD representations of applications that can authenticate and ask for API permissions on behalf of a user. Public clients are applications such as mobile applications and single page JavaScript applications that can't keep secrets confidential. The procedure is similar to [registering a confidential client application](dicom-register-confidential-client-application.md), but since public clients can't be trusted to hold an application secret, there's no need to add one.
healthcare-apis Dicom Register Service Client Application https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-register-service-client-application.md
# Register a service client application
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+ In this article, you'll learn how to register a service client application in Azure Active Directory (Azure AD). ## Application registrations in the Azure portal
healthcare-apis Dicom Services Conformance Statement https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-services-conformance-statement.md
Previously updated : 06/21/2021 Last updated : 08/04/2021
Additionally, the following non-standard API(s) are supported:
- [Delete](#delete)
+Our service also makes use of REST API versioning. For information on how to specify the version when making requests visit the [API Versioning for DICOM service Documentation](api-versioning-dicom-service.md).
+ ## Store (STOW-RS) This transaction uses the POST method to store representations of studies, series, and instances contained in the request payload.
healthcare-apis Dicom Services Overview https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicom-services-overview.md
# Overview of the DICOM service
-This article describes the concepts of DICOM, Medical Imaging, and the DICOM service.
- > [!IMPORTANT] > Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+This article describes the concepts of DICOM, Medical Imaging, and the DICOM service.
+ ## Medical imaging Medical imaging is the technique and process of creating visual representations of the interior of a body for clinical analysis and medical intervention, as well as visual representation of the function of some organs or tissues (physiology). Medical imaging seeks to reveal internal structures hidden by the skin and bones, as well as to diagnose and treat disease. Medical imaging also establishes a database of normal anatomy and physiology to make it possible to identify abnormalities. Although imaging of removed organs and tissues can be performed for medical reasons, such procedures are usually part of pathology instead of medical imaging. [Wikipedia, Medical imaging](https://en.wikipedia.org/wiki/Medical_imaging)
healthcare-apis Dicomweb Standard Apis C Sharp https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicomweb-standard-apis-c-sharp.md
Previously updated : 07/16/2021 Last updated : 08/03/2021
After you've deployed an instance of the DICOM service, retrieve the URL for you
1. Sign into the [Azure portal](https://ms.portal.azure.com/). 1. Search **Recent resources** and select your DICOM service instance.
-1. Copy the **Service URL** of your DICOM service.
+1. Copy the **Service URL** of your DICOM service. Make sure to specify the version as part of the URL when making requests. More information can be found in the [API Versioning for DICOM service Documentation](api-versioning-dicom-service.md).
In your application, install the following NuGet packages:
healthcare-apis Dicomweb Standard Apis Curl https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicomweb-standard-apis-curl.md
Once you've deployed an instance of the DICOM service, retrieve the URL for your
1. Sign into the [Azure portal](https://ms.portal.azure.com/). 2. Search **Recent resources** and select your DICOM service instance.
-3. Copy the **Service URL** of your DICOM service.
+3. Copy the **Service URL** of your DICOM service.
4. If you haven't already obtained a token, see [Get access token for the DICOM service using Azure CLI](dicom-get-access-token-azure-cli.md). For this code, we'll be accessing a Public Preview Azure service. It is important that you don't upload any private health information (PHI).
The DICOMweb&trade; Standard makes heavy use of `multipart/related` HTTP request
The cURL commands each contain at least one, and sometimes two, variables that must be replaced. To simplify running the commands, search and replace the following variables by replacing them with your specific values:
-* {dicom-service-name} This is the service URL of the DICOM service that you provisioned in the Azure portal, for example, `http://{service-name}.dicom.azurehealthcareapis.com`.
+* {Service URL} - This is the URL to access your DICOM service that you provisioned in the Azure portal, for example, ```https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com```. Make sure to specify the version as part of the URL when making requests. More information can be found in the [API Versioning for DICOM service Documentation](api-versioning-dicom-service.md).
* {path-to-dicoms} - The path to the directory that contains the red-triangle.dcm file, such as `C:/dicom-server/docs/dcms` * Ensure to use forward slashes as separators and end the directory _without_ a trailing forward slash.
Some programming languages and tools behave differently. For instance, some requ
* Content-Type: multipart/related ```
-curl --location --request POST "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies"
+curl --location --request POST "{Service URL}/v{version}/studies"
--header "Accept: application/dicom+json" --header "Content-Type: multipart/related; type=\"application/dicom\""header "Authorization: Bearer {token value"
+--header "Authorization: Bearer {token value}"
--form "file1=@{path-to-dicoms}/red-triangle.dcm;type=application/dicom" --trace-ascii "trace.txt" ```
Some programming languages and tools behave differently. For instance, some requ
* Content-Type: multipart/related ```
-curl --request POST "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"
+curl --request POST "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"
--header "Accept: application/dicom+json" --header "Content-Type: multipart/related; type=\"application/dicom\" --header "Authorization: Bearer {token value}"
_Details:_
* Contains a single DICOM file as binary bytes. ```
-curl --location --request POST "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies"
+curl --location --request POST "{Service URL}/v{version}/studies"
--header "Accept: application/dicom+json" --header "Content-Type: application/dicom" --header "Authorization: Bearer {token value}"
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "https://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"
--header "Accept: multipart/related; type=\"application/dicom\"; transfer-syntax=*" --header "Authorization: Bearer {token value}" --output "suppressWarnings.txt"
_Details:_
This cURL command will show the downloaded bytes in the output file (suppressWarnings.txt), but these are not direct DICOM files, only a text representation of the multipart/related download. ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/metadata"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/metadata"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
This cURL command will show the downloaded bytes in the output file (suppressWarnings.txt), but it's not the DICOM file, only a text representation of the multipart/related download. ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"
--header "Accept: multipart/related; type=\"application/dicom\"; transfer-syntax=*" --header "Authorization: Bearer {token value}" --output "suppressWarnings.txt"
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/metadata"
+curl --request GET "{Service URL}/v{version}/studies1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/metadata"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances/1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances/1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
--header "Accept: application/dicom; transfer-syntax=*" --header "Authorization: Bearer {token value}" --output "suppressWarnings.txt"
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances/1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395/metadata"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances/1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395/metadata"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances/1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395/frames/1"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances/1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395/frames/1"
--header "Accept: multipart/related; type=\"application/octet-stream\"; transfer-syntax=1.2.840.10008.1.2.1" --header "Authorization: Bearer {token value}" --output "suppressWarnings.txt"
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies?StudyInstanceUID=1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"
+curl --request GET "{Service URL}/v{version}/studies?StudyInstanceUID=1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/series?SeriesInstanceUID=1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"
+curl --request GET "{Service URL}/v{version}/series?SeriesInstanceUID=1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series?SeriesInstanceUID=1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series?SeriesInstanceUID=1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/instances?SOPInstanceUID=1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
+curl --request GET "{Service URL}/v{version}/instances?SOPInstanceUID=1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/instances?SOPInstanceUID=1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/instances?SOPInstanceUID=1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request GET "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances?SOPInstanceUID=1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
+curl --request GET "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances?SOPInstanceUID=1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
--header "Accept: application/dicom+json" --header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request DELETE "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances/1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
+curl --request DELETE "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652/instances/1.2.826.0.1.3680043.8.498.47359123102728459884412887463296905395"
--header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request DELETE "http://{dicom-service-name}.dicom.azurehealthcareapis.com/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"
+curl --request DELETE "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420/series/1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"
--header "Authorization: Bearer {token value}" ```
_Details:_
* Authorization: Bearer {token value} ```
-curl --request DELETE "http://{dicom-service-name}.azurewebsites.net/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"
+curl --request DELETE "{Service URL}/v{version}/studies/1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"
--header "Authorization: Bearer {token value}" ```
healthcare-apis Dicomweb Standard Apis Python https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicomweb-standard-apis-python.md
After you've deployed an instance of the DICOM service, retrieve the URL for you
1. Sign in to the [Azure portal](https://ms.portal.azure.com/).
1. Search **Recent resources** and select your DICOM service instance.
-1. Copy the **Service URL** of your DICOM service.
+1. Copy the **Service URL** of your DICOM service.
2. If you haven't already obtained a token, see [Get access token for the DICOM service using Azure CLI](dicom-get-access-token-azure-cli.md). For this code, we'll be accessing a Public Preview Azure service. It is important that you don't upload any protected health information (PHI).
from azure.identity import DefaultAzureCredential
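As a minimal sketch of how the imported credential can be used (this is not part of the original sample; it assumes the `azure-identity` package is installed and that `https://dicom.healthcareapis.azure.com/.default` is the correct token scope for your DICOM service — confirm it against the access-token guidance linked above):

```python
from azure.identity import DefaultAzureCredential

# Assumption: the DICOM service token scope; verify before relying on it.
dicom_scope = "https://dicom.healthcareapis.azure.com/.default"

credential = DefaultAzureCredential()
access_token = credential.get_token(dicom_scope)
bearer_token = f"Bearer {access_token.token}"  # value for the Authorization header on each request
```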
### Configure user-defined variables to be used throughout
-Replace all variable values wrapped in { } with your own values. Additionally, validate that any constructed variables are correct. For instance, `base_url` is constructed using the default URL for Azure App service. If you're using a custom URL, you'll need to override that value with your own.
+Replace all variable values wrapped in { } with your own values. Additionally, validate that any constructed variables are correct. For instance, `base_url` is constructed using the Service URL and then appended with the version of the REST API being used. The Service URL of your DICOM service will be: ```https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com```. You can use the Azure portal to navigate to the DICOM service and obtain your Service URL. You can also visit the [API Versioning for DICOM service Documentation](api-versioning-dicom-service.md) for more information on versioning. If you're using a custom URL, you'll need to override that value with your own.
```python
dicom_service_name = "{server-name}"
path_to_dicoms_dir = "{path to the folder that includes green-square.dcm and other dcm files}"
-base_url = f"https://{dicom_service_name}.dicom.azurehealthcareapis.com"
+base_url = f"{Service URL}/v{version}"
study_uid = "1.2.826.0.1.3680043.8.498.13230779778012324449356534479549187420"; #StudyInstanceUID for all 3 examples series_uid = "1.2.826.0.1.3680043.8.498.45787841905473114233124723359129632652"; #SeriesInstanceUID for green-square and red-triangle
healthcare-apis Dicomweb Standard Apis With Dicom Services https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/dicomweb-standard-apis-with-dicom-services.md
Previously updated : 07/10/2021 Last updated : 08/04/2021
-# Using DICOMweb&trade;Standard APIs with DICOM Services
+# Using DICOMweb&trade; Standard APIs with DICOM services
> [!IMPORTANT]
> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.

This tutorial provides an overview of how to use the DICOMweb&trade; Standard APIs with the DICOM Services.
-The DICOM service supports a subset of the DICOMweb&trade; Standard. This support includes:
+The DICOM service supports a subset of the DICOMweb&trade; Standard. This support includes the following:
* Store (STOW-RS) * Retrieve (WADO-RS)
To learn more about our support of the DICOM Web Standard APIs, see the [DICOM C
## Prerequisites
-To use the DICOMweb&trade; Standard APIs, you must have an instance of the DICOM Services deployed. If you haven't already deployed an instance of the DICOM service, see [Deploy DICOM Service using the Azure portal](deploy-dicom-services-in-azure.md).
+To use the DICOMweb&trade; Standard APIs, you must have an instance of the DICOM Services deployed. If you haven't already deployed an instance of the DICOM service, see [Deploy DICOM service using the Azure portal](deploy-dicom-services-in-azure.md).
+
+Once deployment is complete, you can use the Azure portal to navigate to the newly created DICOM service to see the details, including your Service URL. The Service URL to access your DICOM service will be: ```https://<workspacename-dicomservicename>.dicom.azurehealthcareapis.com```. Make sure to specify the version as part of the URL when making requests. More information can be found in the [API Versioning for DICOM service Documentation](api-versioning-dicom-service.md).
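For example, with a hypothetical workspace named `myworkspace` and a DICOM service named `mydicom`, a request URL for the `v1` API might be assembled as in the sketch below (the names and version are assumptions; substitute your own values):

```python
# Hypothetical names; replace with your own workspace and DICOM service names.
service_url = "https://myworkspace-mydicom.dicom.azurehealthcareapis.com"
api_version = "v1"  # the version segment is required in every request URL

# A versioned endpoint for the studies resource (used by STOW-RS and QIDO-RS calls).
studies_url = f"{service_url}/{api_version}/studies"
print(studies_url)  # https://myworkspace-mydicom.dicom.azurehealthcareapis.com/v1/studies
```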
## Overview of various methods to use with DICOM service
To see language-specific examples, refer to the examples below. If you open the
### C#
-The C# examples use the library included in this repo to simplify access to the API. Refer to the [C# examples](dicomweb-standard-apis-c-sharp.md) to learn how to use C# with the DICOM service.
+Refer to the [Using DICOMweb&trade; Standard APIs with C#](dicomweb-standard-apis-c-sharp.md) tutorial to learn how to use C# with the DICOM service.
### cURL
-cURL is a common command-line tool for calling web endpoints that is available for nearly any operating system. [Download cURL](https://curl.haxx.se/download.html) to get started. To use the examples, you'll need to replace the server name with your instance name, and then you must download the [sample DICOM files](https://github.com/microsoft/dicom-server/tree/main/docs/dcms) in this repo to a known location on your local file system. To learn how to use cURL with the DICOM service, see [cURL examples](dicomweb-standard-apis-curl.md).
+cURL is a common command-line tool for calling web endpoints that is available for nearly any operating system. [Download cURL](https://curl.haxx.se/download.html) to get started.
+
+To learn how to use cURL with the DICOM service, see the [Using DICOMWeb&trade; Standard APIs with cURL](dicomweb-standard-apis-curl.md) tutorial.
+
+### Python
+
+Refer to the [Using DICOMWeb&trade; Standard APIs with Python](dicomweb-standard-apis-python.md) tutorial to learn how to use Python with the DICOM service.
### Postman
To use the Postman collection, you'll need to download the collection locally an
This tutorial provided an overview of the APIs supported by the DICOM service. Get started using these APIs with the following tools:
-- [Use DICOM Web Standard APIs with C#](dicomweb-standard-apis-c-sharp.md)
-- [Use DICOM Web Standard APIs with cURL](dicomweb-standard-apis-curl.md)
+- [Using DICOMweb&trade; Standard APIs with C#](dicomweb-standard-apis-c-sharp.md)
+- [Using DICOMWeb&trade; Standard APIs with cURL](dicomweb-standard-apis-curl.md)
+- [Using DICOMWeb&trade; Standard APIs with Python](dicomweb-standard-apis-python.md)
- [Use DICOM Web Standard APIs with Postman Example Collection](https://github.com/microsoft/dicom-server/blob/main/docs/resources/Conformance-as-Postman.postman_collection.json)

### Next Steps
healthcare-apis Enable Diagnostic Logging https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/enable-diagnostic-logging.md
# Enable Diagnostic Logging in the DICOM service
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+ In this article, you will learn how to enable diagnostic logging in the DICOM service and review some sample queries for these logs. Access to diagnostic logs is essential for any healthcare service where compliance with regulatory requirements is a must. The feature that enables diagnostic logs in the DICOM service is the [Diagnostic settings](../../azure-monitor/essentials/diagnostic-settings.md) option in the Azure portal.

## Enable audit logs
healthcare-apis Pull Dicom Changes From Change Feed https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/dicom/pull-dicom-changes-from-change-feed.md
Previously updated : 07/10/2021 Last updated : 08/04/2021
DICOM Change Feed offers customers the ability to go through the history of the DICOM Service and act on the create and delete events in the service. This how-to guide describes how to consume Change Feed.
-The Change Feed is accessed using REST APIs. These APIs along with sample usage of Change Feed are documented in the [Overview of DICOM Change Feed](dicom-change-feed-overview.md).
+The Change Feed is accessed using REST APIs. These APIs along with sample usage of Change Feed are documented in the [Overview of DICOM Change Feed](dicom-change-feed-overview.md). The version of the REST API should be explicitly specified in the request URL as called out in the [API Versioning for DICOM service Documentation](api-versioning-dicom-service.md).
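As a rough illustration only (not the official sample), a change feed request could look like the sketch below. It assumes the `/changefeed` route with `offset`, `limit`, and `includeMetadata` query parameters as described in the change feed overview; the service URL, version, and token values are placeholders you would replace with your own.

```python
import requests

service_url = "https://myworkspace-mydicom.dicom.azurehealthcareapis.com"  # hypothetical name
api_version = "v1"                  # the version must be specified explicitly in the URL
bearer_token = "Bearer {token value}"

# Read the first 10 change feed entries, including the DICOM metadata for each change.
response = requests.get(
    f"{service_url}/{api_version}/changefeed",
    params={"offset": 0, "limit": 10, "includeMetadata": "true"},
    headers={"Accept": "application/json", "Authorization": bearer_token},
)
response.raise_for_status()
for entry in response.json():
    print(entry)  # each entry describes a create or delete event on an instance
```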
## Consume Change Feed
healthcare-apis Azure Ad Hcapi Token Validation https://github.com/MicrosoftDocs/azure-docs/commits/master/articles/healthcare-apis/fhir/azure-ad-hcapi-token-validation.md
Previously updated : 02/19/2019 Last updated : 08/03/2021

# FHIR service access token validation
+> [!IMPORTANT]
+> Azure Healthcare APIs is currently in PREVIEW. The [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/) include additional legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
+ How FHIR service validates the access token will depend on implementation and configuration. In this article, we will walk through the validation steps, which can be helpful when tro