Updates from: 04/12/2022 01:11:36
Service Microsoft Docs article Related commit history on GitHub Change details
active-directory Active Directory Authentication Libraries https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/azuread-dev/active-directory-authentication-libraries.md
The Azure Active Directory Authentication Library (ADAL) v1.0 enables applicatio
| Platform | Library | Download | Source Code | Sample | Reference |
| --- | --- | --- | --- | --- | --- |
-| .NET |OWIN for AzureAD|[NuGet](https://www.nuget.org/packages/Microsoft.Owin.Security.ActiveDirectory/) |[GitHub](https://github.com/aspnet/AspNetKatan) | |
-| .NET |OWIN for OpenIDConnect |[NuGet](https://www.nuget.org/packages/Microsoft.Owin.Security.OpenIdConnect) |[GitHub](https://github.com/aspnet/AspNetKatana/tree/dev/src/Microsoft.Owin.Security.OpenIdConnect) |[Web App](https://github.com/AzureADSamples/WebApp-OpenIDConnect-DotNet) | |
-| .NET |OWIN for WS-Federation |[NuGet](https://www.nuget.org/packages/Microsoft.Owin.Security.WsFederation) |[GitHub](https://github.com/aspnet/AspNetKatana/tree/dev/src/Microsoft.Owin.Security.WsFederation) |[MVC Web App](https://github.com/AzureADSamples/WebApp-WSFederation-DotNet) | |
+| .NET |OWIN for AzureAD|[NuGet](https://www.nuget.org/packages/Microsoft.Owin.Security.ActiveDirectory/) |[GitHub](https://github.com/aspnet/AspNetKatan) | |
+| .NET |OWIN for OpenIDConnect |[NuGet](https://www.nuget.org/packages/Microsoft.Owin.Security.OpenIdConnect) |[GitHub](https://github.com/aspnet/AspNetKatana/tree/main/src/Microsoft.Owin.Security.OpenIdConnect) |[Web App](https://github.com/AzureADSamples/WebApp-OpenIDConnect-DotNet) | |
+| .NET |OWIN for WS-Federation |[NuGet](https://www.nuget.org/packages/Microsoft.Owin.Security.WsFederation) |[GitHub](https://github.com/aspnet/AspNetKatana/tree/main/src/Microsoft.Owin.Security.WsFederation) |[MVC Web App](https://github.com/AzureADSamples/WebApp-WSFederation-DotNet) | |
| .NET |Identity Protocol Extensions for .NET 4.5 |[NuGet](https://www.nuget.org/packages/Microsoft.IdentityModel.Protocol.Extensions) |[GitHub](https://github.com/AzureAD/azure-activedirectory-identitymodel-extensions-for-dotnet) | | |
| .NET |JWT Handler for .NET 4.5 |[NuGet](https://www.nuget.org/packages/System.IdentityModel.Tokens.Jwt) |[GitHub](https://github.com/AzureAD/azure-activedirectory-identitymodel-extensions-for-dotnet) | | |
| Node.js |Azure AD Passport |[npm](https://www.npmjs.com/package/passport-azure-ad) |[GitHub](https://github.com/AzureAD/passport-azure-ad) | [Web API](../develop/authentication-flows-app-scenarios.md)| |
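The flow these ADAL-era libraries wrap can be sketched as the OAuth 2.0 client-credentials token request against the Azure AD v1.0 endpoint. The following is an illustrative Python sketch, not code from the article or any of the listed libraries; the tenant, client ID, secret, and resource values are placeholders.

```python
# Illustrative only: the shape of the v1.0 client-credentials token request
# that ADAL-era libraries issue under the hood. All values are placeholders.
from urllib.parse import urlencode


def build_v1_token_request(tenant: str, client_id: str, client_secret: str, resource: str):
    """Return (url, form_body) for an Azure AD v1.0 client-credentials token request."""
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,  # the v1.0 endpoint uses 'resource', not 'scope'
    })
    return url, body


url, body = build_v1_token_request(
    "contoso.onmicrosoft.com", "app-id", "secret", "https://graph.microsoft.com"
)
```

A POST of `body` to `url` with a form-encoded content type would return a JSON document containing an `access_token`; the libraries above add caching, token refresh, and protocol middleware on top of this raw exchange.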
active-directory External Collaboration Settings Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/external-identities/external-collaboration-settings-configure.md
Previously updated : 01/31/2022 Last updated : 04/11/2022
For B2B collaboration with other Azure AD organizations, you should also review
1. Under **Guest user access**, choose the level of access you want guest users to have:
+ ![Screenshot showing Guest user access settings.](./media/external-collaboration-settings-configure/guest-user-access.png)
+ - **Guest users have the same access as members (most inclusive)**: This option gives guests the same access to Azure AD resources and directory data as member users.
+ - **Guest users have limited access to properties and memberships of directory objects**: (Default) This setting blocks guests from certain directory tasks, like enumerating users, groups, or other directory resources. Guests can see membership of all non-hidden groups.
For B2B collaboration with other Azure AD organizations, you should also review
1. Under **Guest invite settings**, choose the appropriate settings:
- ![Guest invite settings](./media/external-collaboration-settings-configure/guest-invite-settings.png)
+ ![Screenshot showing Guest invite settings.](./media/external-collaboration-settings-configure/guest-invite-settings.png)
- **Anyone in the organization can invite guest users including guests and non-admins (most inclusive)**: To allow guests in the organization to invite other guests including those who are not members of an organization, select this radio button.
- **Member users and users assigned to specific admin roles can invite guest users including guests with member permissions**: To allow member users and users who have specific administrator roles to invite guests, select this radio button.
For B2B collaboration with other Azure AD organizations, you should also review
1. Under **Enable guest self-service sign up via user flows**, select **Yes** if you want to be able to create user flows that let users sign up for apps. For more information about this setting, see [Add a self-service sign-up user flow to an app](self-service-sign-up-user-flow.md).
- ![Self-service sign up via user flows setting](./media/external-collaboration-settings-configure/self-service-sign-up-setting.png)
+ ![Screenshot showing Self-service sign up via user flows setting.](./media/external-collaboration-settings-configure/self-service-sign-up-setting.png)
1. Under **Collaboration restrictions**, you can choose whether to allow or deny invitations to the domains you specify and enter specific domain names in the text boxes. For multiple domains, enter each domain on a new line. For more information, see [Allow or block invitations to B2B users from specific organizations](allow-deny-list.md).
- ![Collaboration restrictions settings](./media/external-collaboration-settings-configure/collaboration-restrictions.png)
+ ![Screenshot showing Collaboration restrictions settings.](./media/external-collaboration-settings-configure/collaboration-restrictions.png)
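The allow/deny behavior described for collaboration restrictions can be sketched as a simple domain check. This is an illustrative Python sketch of the logic, not the service's implementation; in the portal, the allow list and deny list are alternative settings, which the sketch mirrors by letting whichever list is supplied decide.

```python
# Illustrative sketch of allow/deny collaboration restrictions (hypothetical
# helper, not the actual Azure AD implementation).
def invitation_allowed(email_domain, allow_list=None, deny_list=None):
    """Return True if an invitation to this domain would be permitted.

    allow_list and deny_list are mutually exclusive, matching the portal
    setting; when neither is set, all domains are allowed.
    """
    domain = email_domain.lower()
    if allow_list is not None:
        return domain in {d.lower() for d in allow_list}
    if deny_list is not None:
        return domain not in {d.lower() for d in deny_list}
    return True
```

For example, with an allow list of `["partner.com"]`, an invitation to `user@partner.com` is permitted and any other domain is blocked.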
## Assign the Guest Inviter role to a user

With the Guest Inviter role, you can give individual users the ability to invite guests without assigning them a global administrator or other admin role. Assign the Guest Inviter role to individuals. Then make sure you set **Admins and users in the guest inviter role can invite** to **Yes**.
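Under the hood, issuing a guest invitation corresponds to a Microsoft Graph `POST /invitations` call. The sketch below builds an example request body in Python; the email address and redirect URL are placeholders, and the sketch only constructs the JSON payload rather than calling the API.

```python
# Illustrative sketch: the request body for POST
# https://graph.microsoft.com/v1.0/invitations (values are placeholders).
import json


def build_invitation(email: str, redirect_url: str, send_message: bool = True) -> str:
    """Return a JSON request body for the Microsoft Graph invitations endpoint."""
    payload = {
        "invitedUserEmailAddress": email,   # the guest being invited
        "inviteRedirectUrl": redirect_url,  # where the guest lands after redeeming
        "sendInvitationMessage": send_message,
    }
    return json.dumps(payload)


body = build_invitation("guest@fabrikam.com", "https://myapps.microsoft.com")
```

A user holding the Guest Inviter role (with the invite setting enabled) can issue this call without any broader admin role.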
active-directory How To Connect Install Prerequisites https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/hybrid/how-to-connect-install-prerequisites.md
For more information on setting the PowerShell execution policy, see [Set-Execut
### Azure AD Connect server

The Azure AD Connect server contains critical identity data. It's important that administrative access to this server is properly secured. Follow the guidelines in [Securing privileged access](/windows-server/identity/securing-privileged-access/securing-privileged-access).
-The Azure AD Connect server must be treated as a Tier 0 component as documented in the [Active Directory administrative tier model](/windows-server/identity/securing-privileged-access/securing-privileged-access-reference-material)
+The Azure AD Connect server must be treated as a Tier 0 component as documented in the [Active Directory administrative tier model](/windows-server/identity/securing-privileged-access/securing-privileged-access-reference-material). We recommend hardening the Azure AD Connect server as a Control Plane asset by following the guidance provided in [Secure Privileged Access](https://docs.microsoft.com/security/compass/overview).
To read more about securing your Active Directory environment, see [Best practices for securing Active Directory](/windows-server/identity/ad-ds/plan/security-best-practices/best-practices-for-securing-active-directory).
To read more about securing your Active Directory environment, see [Best practic
### Harden your Azure AD Connect server

We recommend that you harden your Azure AD Connect server to decrease the security attack surface for this critical component of your IT environment. Following these recommendations will help to mitigate some security risks to your organization.

-- Treat Azure AD Connect the same as a domain controller and other Tier 0 resources. For more information, see [Active Directory administrative tier model](/windows-server/identity/securing-privileged-access/securing-privileged-access-reference-material).
+- We recommend hardening the Azure AD Connect server as a Control Plane (formerly Tier 0) asset by following the guidance provided in [Secure Privileged Access](https://docs.microsoft.com/security/compass/overview) and [Active Directory administrative tier model](/windows-server/identity/securing-privileged-access/securing-privileged-access-reference-material).
- Restrict administrative access to the Azure AD Connect server to only domain administrators or other tightly controlled security groups.
- Create a [dedicated account for all personnel with privileged access](/windows-server/identity/securing-privileged-access/securing-privileged-access). Administrators shouldn't be browsing the web, checking their email, and doing day-to-day productivity tasks with highly privileged accounts.
- Follow the guidance provided in [Securing privileged access](/windows-server/identity/securing-privileged-access/securing-privileged-access).
active-directory Pim Create Azure Ad Roles And Resource Roles Review https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/privileged-identity-management/pim-create-azure-ad-roles-and-resource-roles-review.md
To manage a series of access reviews, navigate to the access review, and you wil
Based on your selections in **Upon completion settings**, auto-apply runs after the review's end date or when you manually stop the review. The review's status changes from **Completed** through intermediate states such as **Applying** and finally to **Applied**. Denied users, if any, should be removed from their roles within a few minutes.
-> [!IMPORTANT]
-> If a group is assigned to **Azure resource roles**, the reviewer of the Azure resource role will see the expanded list of the indirect users with access assigned through a nested group. Should a reviewer deny a member of a nested group, that deny result will not be applied successfully for the role because the user will not be removed from the nested group. For **Azure AD roles**, [role-assignable groups](../roles/groups-concept.md) will show up in the review instead of expanding the members of the group, and a reviewer will either approve or deny access to the entire group.
+## Impact of groups assigned to Azure AD roles and Azure resource roles in access reviews
+
+- For **Azure AD roles**, groups can be assigned to the role using [role-assignable groups](../roles/groups-concept.md). When a review is created on an Azure AD role with role-assignable groups assigned, the group name shows up in the review without expanding the group membership. The reviewer can approve or deny access of the entire group to the role. Denied groups lose their assignment to the role when review results are applied.
+
+- For **Azure resource roles**, any security group can be assigned to the role. When a review is created on an Azure resource role with a security group assigned, the users assigned to that security group are fully expanded and shown to the reviewer of the role. When a reviewer denies a user who was assigned to the role via the security group, the user is not removed from the group, so applying the deny result will be unsuccessful.
+
+> [!NOTE]
+> It is possible for a security group to have other groups assigned to it. In this case, only the users assigned directly to the security group assigned to the role will appear in the review of the role.
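The expansion behavior in the note above can be sketched in a few lines: only users directly in the assigned security group appear in the review, while members of nested groups do not. This is an illustrative Python sketch with hypothetical data, not the service's implementation.

```python
# Illustrative sketch: how a review on an Azure resource role expands an
# assigned security group (hypothetical data model, not the actual service).
def review_principals(assigned_group):
    """Return the principals a reviewer sees for this assigned group.

    Direct user members are expanded into the review; members of any group
    nested inside the assigned group are not expanded further.
    """
    return [m["id"] for m in assigned_group["members"] if m["type"] == "user"]


group = {
    "members": [
        {"type": "user", "id": "alice"},
        {"type": "user", "id": "bob"},
        {"type": "group", "id": "nested-team"},  # its users do not appear
    ]
}
```

Here the reviewer would see only `alice` and `bob`; whoever belongs to `nested-team` is invisible to the review, which is also why a deny can't be applied through a nested group.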
## Update the access review
active-directory Ibmid Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/ibmid-tutorial.md
Title: 'Tutorial: Azure Active Directory single sign-on (SSO) integration with IBMid | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with IBMid'
description: Learn how to configure single sign-on between Azure Active Directory and IBMid.
Previously updated : 06/22/2021 Last updated : 04/08/2022
-# Tutorial: Azure Active Directory single sign-on (SSO) integration with IBMid
+# Tutorial: Azure AD SSO integration with IBMid
In this tutorial, you'll learn how to integrate IBMid with Azure Active Directory (Azure AD). When you integrate IBMid with Azure AD, you can:
Follow these steps to enable Azure AD SSO in the Azure portal.
1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:

    In the **Sign-on URL** text box, type the URL:
- `https://myibm.ibm.com/`
+ `https://login.ibm.com`
1. Click **Save**.
Follow these steps to enable Azure AD SSO in the Azure portal.
| lastName | user.surname |
| emailAddress | user.mail |

- 1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Federation Metadata XML** and select **Download** to download the certificate and save it on your computer. ![The Certificate download link](common/metadataxml.png)
In this section, you test your Azure AD single sign-on configuration with follow
You can also use Microsoft My Apps to test the application in any mode. When you click the IBMid tile in the My Apps, if configured in SP mode you would be redirected to the application sign on page for initiating the login flow and if configured in IDP mode, you should be automatically signed in to the IBMid for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).

- ## Next steps
-Once you configure IBMid you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-any-app).
+Once you configure IBMid you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-any-app).
active-directory Informatica Platform Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/informatica-platform-tutorial.md
Previously updated : 03/23/2022 Last updated : 04/08/2022
Follow these steps to enable Azure AD SSO in the Azure portal.
1. On the **Basic SAML Configuration** section, perform the following steps:
- a. In the **Identifier** text box, type the value:
- `Informatica`
+ a. In the **Identifier** text box, type the following value or URL pattern:
+
+ | App | URL |
+ |---|---|
+ | For EDC | `Informatica` |
+ | For Axon | `https://<host name: port number>/saml/metadata` |
b. In the **Reply URL** text box, type a URL using the following pattern:
    `https://<host name: port number>/administrator/Login.do`
- c. In the **Sign-on URL** text box, type a URL using the following pattern:
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type a URL using the following pattern:
`https://<host name: port number>/administrator/saml/login`

> [!NOTE]
- > These values are not real. Update these values with the actual Reply URL and Sign-on URL. Contact [Informatica Platform Client support team](mailto:support@informatica.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [Informatica Platform Client support team](mailto:support@informatica.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
1. Informatica Platform application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
active-directory Postbeyond Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/postbeyond-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with PostBeyond | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with PostBeyond'
description: Learn how to configure single sign-on between Azure Active Directory and PostBeyond.
Previously updated : 03/25/2019 Last updated : 03/29/2022
-# Tutorial: Azure Active Directory integration with PostBeyond
+# Tutorial: Azure AD SSO integration with PostBeyond
-In this tutorial, you learn how to integrate PostBeyond with Azure Active Directory (Azure AD).
-Integrating PostBeyond with Azure AD provides you with the following benefits:
+In this tutorial, you'll learn how to integrate PostBeyond with Azure Active Directory (Azure AD). When you integrate PostBeyond with Azure AD, you can:
-* You can control in Azure AD who has access to PostBeyond.
-* You can enable your users to be automatically signed-in to PostBeyond (Single Sign-On) with their Azure AD accounts.
-* You can manage your accounts in one central location - the Azure portal.
-
-If you want to know more details about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+* Control in Azure AD who has access to PostBeyond.
+* Enable your users to be automatically signed-in to PostBeyond with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with PostBeyond, you need the following items:
+To configure Azure AD integration with PostBeyond, you need to have:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can get one-month trial [here](https://azure.microsoft.com/pricing/free-trial/)
-* PostBeyond single sign-on enabled subscription
+* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/pricing/free-trial/).
+* PostBeyond subscription that has single sign-on enabled.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
## Scenario description In this tutorial, you configure and test Azure AD single sign-on in a test environment.
-* PostBeyond supports **SP** initiated SSO
+* PostBeyond supports **SP** initiated SSO.
-## Adding PostBeyond from the gallery
+## Add PostBeyond from the gallery
To configure the integration of PostBeyond into Azure AD, you need to add PostBeyond from the gallery to your list of managed SaaS apps.
-**To add PostBeyond from the gallery, perform the following steps:**
-
-1. In the **[Azure portal](https://portal.azure.com)**, on the left navigation panel, click **Azure Active Directory** icon.
-
- ![The Azure Active Directory button](common/select-azuread.png)
-
-2. Navigate to **Enterprise Applications** and then select the **All Applications** option.
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add new application, click **New application** button on the top of dialog.
-
- ![The New application button](common/add-new-app.png)
-
-4. In the search box, type **PostBeyond**, select **PostBeyond** from result panel then click **Add** button to add the application.
-
- ![PostBeyond in the results list](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **PostBeyond** in the search box.
+1. Select **PostBeyond** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you configure and test Azure AD single sign-on with PostBeyond based on a test user called **Britta Simon**.
-For single sign-on to work, a link relationship between an Azure AD user and the related user in PostBeyond needs to be established.
+## Configure and test Azure AD SSO for PostBeyond
-To configure and test Azure AD single sign-on with PostBeyond, you need to complete the following building blocks:
+Configure and test Azure AD SSO with PostBeyond using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in PostBeyond.
-1. **[Configure Azure AD Single Sign-On](#configure-azure-ad-single-sign-on)** - to enable your users to use this feature.
-2. **[Configure PostBeyond Single Sign-On](#configure-postbeyond-single-sign-on)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with Britta Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable Britta Simon to use Azure AD single sign-on.
-5. **[Create PostBeyond test user](#create-postbeyond-test-user)** - to have a counterpart of Britta Simon in PostBeyond that is linked to the Azure AD representation of user.
-6. **[Test single sign-on](#test-single-sign-on)** - to verify whether the configuration works.
+To configure and test Azure AD SSO with PostBeyond, perform the following steps:
-### Configure Azure AD single sign-on
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure PostBeyond SSO](#configure-postbeyond-sso)** - to configure the single sign-on settings on application side.
+ 1. **[Create PostBeyond test user](#create-postbeyond-test-user)** - to have a counterpart of B.Simon in PostBeyond that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-In this section, you enable Azure AD single sign-on in the Azure portal.
+## Configure Azure AD SSO
-To configure Azure AD single sign-on with PostBeyond, perform the following steps:
+Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **PostBeyond** application integration page, select **Single sign-on**.
+1. In the Azure portal, on the **PostBeyond** application integration page, find the **Manage** section and select **Single sign-on**.
+1. On the **Select a Single sign-on method** page, select **SAML**.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Configure single sign-on link](common/select-sso.png)
-
-2. On the **Select a Single sign-on method** dialog, select **SAML/WS-Fed** mode to enable single sign-on.
-
- ![Single sign-on select mode](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, click **Edit** icon to open **Basic SAML Configuration** dialog.
-
- ![Edit Basic SAML Configuration](common/edit-urls.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
4. On the **Basic SAML Configuration** section, perform the following steps:
- ![PostBeyond Domain and URLs single sign-on information](common/sp-identifier.png)
-
- a. In the **Sign on URL** text box, type a URL using the following pattern:
+ a. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
`https://<subdomain>.postbeyond.com`
- b. In the **Identifier (Entity ID)** text box, type a URL using the following pattern:
+ b. In the **Sign on URL** text box, type a URL using the following pattern:
`https://<subdomain>.postbeyond.com`

> [!NOTE]
- > These values are not real. Update these values with the actual Sign on URL and Identifier. Contact [PostBeyond Client support team](mailto:sso@postbeyond.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ > These values are not real. Update these values with the actual Identifier and Sign on URL. Contact [PostBeyond Client support team](mailto:sso@postbeyond.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
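Before saving, a URL pattern like `https://<subdomain>.postbeyond.com` can be sanity-checked with a simple regular expression. The helper below is a hypothetical Python sketch for illustrating the pattern; the portal performs its own validation and this is not part of the article.

```python
# Illustrative sketch: validate a candidate value against the documented
# `https://<subdomain>.postbeyond.com` pattern (hypothetical helper).
import re

POSTBEYOND_PATTERN = re.compile(r"^https://[a-z0-9-]+\.postbeyond\.com$")


def matches_postbeyond_pattern(url: str) -> bool:
    """Return True if url has the https://<subdomain>.postbeyond.com shape."""
    return POSTBEYOND_PATTERN.match(url) is not None
```

For example, `https://contoso.postbeyond.com` matches the pattern, while a value missing the subdomain or using plain HTTP does not.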
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, click **Download** to download the **Certificate (Base64)** from the given options as per your requirement and save it on your computer.
To configure Azure AD single sign-on with PostBeyond, perform the following step
![Copy configuration URLs](common/copy-configuration-urls.png)
- a. Login URL
-
- b. Azure AD Identifier
-
- c. Logout URL
-
-### Configure PostBeyond Single Sign-On
-
-To configure single sign-on on **PostBeyond** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [PostBeyond support team](mailto:sso@postbeyond.com). They set this setting to have the SAML SSO connection set properly on both sides.
- ### Create an Azure AD test user
-The objective of this section is to create a test user in the Azure portal called Britta Simon.
-
-1. In the Azure portal, in the left pane, select **Azure Active Directory**, select **Users**, and then select **All users**.
-
- ![The "Users and groups" and "All users" links](common/users.png)
-
-2. Select **New user** at the top of the screen.
-
- ![New user Button](common/new-user.png)
-
-3. In the User properties, perform the following steps.
-
- ![The User dialog box](common/user-properties.png)
-
- a. In the **Name** field enter **BrittaSimon**.
-
- b. In the **User name** field type brittasimon@yourcompanydomain.extension. For example, BrittaSimon@contoso.com
-
- c. Select **Show password** check box, and then write down the value that's displayed in the Password box.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- d. Click **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you enable Britta Simon to use Azure single sign-on by granting access to PostBeyond.
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to PostBeyond.
-1. In the Azure portal, select **Enterprise Applications**, select **All applications**, then select **PostBeyond**.
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **PostBeyond**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![Enterprise applications blade](common/enterprise-applications.png)
+## Configure PostBeyond SSO
-2. In the applications list, select **PostBeyond**.
-
- ![The PostBeyond link in the Applications list](common/all-applications.png)
-
-3. In the menu on the left, select **Users and groups**.
-
- ![The "Users and groups" link](common/users-groups-blade.png)
-
-4. Click the **Add user** button, then select **Users and groups** in the **Add Assignment** dialog.
-
- ![The Add Assignment pane](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog select **Britta Simon** in the Users list, then click the **Select** button at the bottom of the screen.
-
-6. If you are expecting any role value in the SAML assertion then in the **Select Role** dialog select the appropriate role for the user from the list, then click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog click the **Assign** button.
+To configure single sign-on on **PostBeyond** side, you need to send the downloaded **Certificate (Base64)** and appropriate copied URLs from Azure portal to [PostBeyond support team](mailto:sso@postbeyond.com). They set this setting to have the SAML SSO connection set properly on both sides.
### Create PostBeyond test user

In this section, you create a user called Britta Simon in PostBeyond. Work with [PostBeyond support team](mailto:sso@postbeyond.com) to add the users in the PostBeyond platform. Users must be created and activated before you use single sign-on.
-### Test single sign-on
+## Test SSO
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with following options.
-When you click the PostBeyond tile in the Access Panel, you should be automatically signed in to the PostBeyond for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+* Click on **Test this application** in Azure portal. This will redirect to PostBeyond Sign-on URL where you can initiate the login flow.
-## Additional Resources
+* Go to PostBeyond Sign-on URL directly and initiate the login flow from there.
-- [List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the PostBeyond tile in the My Apps, this will redirect to PostBeyond Sign-on URL. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure PostBeyond you can enforce session control, which protects exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Preciate Provisioning Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/preciate-provisioning-tutorial.md
# Tutorial: Configure Preciate for automatic user provisioning
-This tutorial describes the steps you need to perform in both Preciate and Azure Active Directory (Azure AD) to configure automatic user provisioning. When configured, Azure AD automatically provisions and de-provisions users and groups to [Preciate](https://www.preciate.org/) using the Azure AD Provisioning service. For important details on what this service does, how it works, and frequently asked questions, see [Automate user provisioning and deprovisioning to SaaS applications with Azure Active Directory](../app-provisioning/user-provisioning.md).
+This tutorial describes the steps you need to perform in both Preciate and Azure Active Directory (Azure AD) to configure automatic user provisioning. When configured, Azure AD automatically provisions and de-provisions users and groups to [Preciate](https://preciate.com/) using the Azure AD Provisioning service. For important details on what this service does, how it works, and frequently asked questions, see [Automate user provisioning and deprovisioning to SaaS applications with Azure Active Directory](../app-provisioning/user-provisioning.md).
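For context on what the provisioning service exchanges, here is a hedged sketch of a minimal SCIM 2.0 User resource, like the one Azure AD sends when creating a user. The attribute set and all values are illustrative; Preciate's actual SCIM endpoint and supported attributes may differ.

```python
import json

# Hypothetical sketch of a SCIM 2.0 "create user" payload such as the one
# the Azure AD provisioning service POSTs to an app's SCIM endpoint.
# The endpoint, token, and attribute values are placeholders, not Preciate's.
def build_scim_user(user_name, given_name, family_name, active=True):
    """Build a minimal SCIM 2.0 User resource as a dict."""
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "name": {"givenName": given_name, "familyName": family_name},
        "active": active,
    }

payload = build_scim_user("B.Simon@contoso.com", "B", "Simon")
print(json.dumps(payload, indent=2))
```

In practice the provisioning service sends richer payloads (emails, externalId, enterprise extension attributes) according to the attribute mappings you configure in the portal.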
## Capabilities supported
Once you've configured provisioning, use the following resources to monitor your
## Next steps
-* [Learn how to review logs and get reports on provisioning activity](../app-provisioning/check-status-user-account-provisioning.md)
+* [Learn how to review logs and get reports on provisioning activity](../app-provisioning/check-status-user-account-provisioning.md)
active-directory Predictixordering Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/predictixordering-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Predictix Ordering | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Predictix Ordering'
description: In this tutorial, you'll learn how to configure single sign-on between Azure Active Directory and Predictix Ordering.
Previously updated : 03/26/2019 Last updated : 03/29/2022
-# Tutorial: Azure Active Directory integration with Predictix Ordering
+# Tutorial: Azure AD SSO integration with Predictix Ordering
-In this tutorial, you'll learn how to integrate Predictix Ordering with Azure Active Directory (Azure AD).
-This integration provides these benefits:
+In this tutorial, you'll learn how to integrate Predictix Ordering with Azure Active Directory (Azure AD). When you integrate Predictix Ordering with Azure AD, you can:
-* You can use Azure AD to control who has access to Predictix Ordering.
-* You can enable your users to be automatically signed in to Predictix Ordering (single sign-on) with their Azure AD accounts.
-* You can manage your accounts in one central location: the Azure portal.
-
-To learn more about SaaS app integration with Azure AD, see [Single sign-on to applications in Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you start.
+* Control in Azure AD who has access to Predictix Ordering.
+* Enable your users to be automatically signed-in to Predictix Ordering with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
To configure Azure AD integration with Predictix Ordering, you need to have:
* An Azure AD subscription. If you don't have an Azure AD environment, you can get a [free account](https://azure.microsoft.com/pricing/free-trial/). * A Predictix Ordering subscription that has single sign-on enabled.
+* In addition to Cloud Application Administrator, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
## Scenario description
In this tutorial, you'll configure and test Azure AD single sign-on in a test en
## Add Predictix Ordering from the gallery
-To set up the integration of Predictix Ordering into Azure AD, you need to add Predictix Ordering from the gallery to your list of managed SaaS apps.
-
-1. In the [Azure portal](https://portal.azure.com), in the left pane, select **Azure Active Directory**:
-
- ![Select Azure Active Directory](common/select-azuread.png)
-
-2. Go to **Enterprise applications** > **All applications**:
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add an application, select **New application** at the top of the window:
-
- ![Select New application](common/add-new-app.png)
-
-4. In the search box, enter **Predictix Ordering**. Select **Predictix Ordering** in the search results and then select **Add**.
-
- ![Search results](common/search-new-app.png)
-
-## Configure and test Azure AD single sign-on
+To configure the integration of Predictix Ordering into Azure AD, you need to add Predictix Ordering from the gallery to your list of managed SaaS apps.
-In this section, you'll configure and test Azure AD single sign-on with Predictix Ordering by using a test user named Britta Simon.
-To enable single sign-on, you need to establish a relationship between an Azure AD user and the corresponding user in Predictix Ordering.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Predictix Ordering** in the search box.
+1. Select **Predictix Ordering** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-To configure and test Azure AD single sign-on with Predictix Ordering, you need to complete these steps:
+## Configure and test Azure AD SSO for Predictix Ordering
-1. **[Configure Azure AD single sign-on](#configure-azure-ad-single-sign-on)** to enable the feature for your users.
-2. **[Configure Predictix Ordering single sign-on](#configure-predictix-ordering-single-sign-on)** on the application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable Azure AD single sign-on for the user.
-5. **[Create a Predictix Ordering test user](#create-a-predictix-ordering-test-user)** that's linked to the Azure AD representation of the user.
-6. **[Test single sign-on](#test-single-sign-on)** to verify that the configuration works.
+Configure and test Azure AD SSO with Predictix Ordering using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Predictix Ordering.
-### Configure Azure AD single sign-on
+To configure and test Azure AD SSO with Predictix Ordering, perform the following steps:
-In this section, you'll enable Azure AD single sign-on in the Azure portal.
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Predictix Ordering SSO](#configure-predictix-ordering-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create a Predictix Ordering test user](#create-a-predictix-ordering-test-user)** - to have a counterpart of B.Simon in Predictix Ordering that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-To configure Azure AD single sign-on with Predictix Ordering, take these steps:
+## Configure Azure AD SSO
-1. In the [Azure portal](https://portal.azure.com/), on the **Predictix Ordering** application integration page, select **Single sign-on**:
+Follow these steps to enable Azure AD SSO in the Azure portal.
- ![Select Single sign-on](common/select-sso.png)
+1. In the Azure portal, on the **Predictix Ordering** application integration page, find the **Manage** section and select **Single sign-on**.
+1. On the **Select a Single sign-on method** page, select **SAML**.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
-2. In the **Select a single sign-on method** dialog box, select **SAML/WS-Fed** mode to enable single sign-on:
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
- ![Select a single sign-on method](common/select-saml-option.png)
-
-3. On the **Set up Single Sign-On with SAML** page, select the **Edit** icon to open the **Basic SAML Configuration** dialog box:
-
- ![Edit icon](common/edit-urls.png)
-
-4. In the **Basic SAML Configuration** dialog box, complete the following steps.
-
- ![Basic SAML Configuration dialog box](common/sp-identifier.png)
-
- 1. In the **Sign on URL** box, enter a URL in this pattern:
-
- `https://<companyname-pricing>.ordering.predictix.com/sso/request`
-
- 1. In the **Identifier (Entity ID)** box, enter a URL in this pattern:
-
- ```https
- https://<companyname-pricing>.dev.ordering.predictix.com
- https://<companyname-pricing>.ordering.predictix.com
- ```
+4. In the **Basic SAML Configuration** dialog box, perform the following steps:
+
+ a. In the **Identifier (Entity ID)** box, type a URL using one of the following patterns:
+
+ | **Identifier** |
+ |--|
+ | `https://<companyname-pricing>.dev.ordering.predictix.com` |
+ | `https://<companyname-pricing>.ordering.predictix.com` |
+
+ b. In the **Sign on URL** box, type a URL using the following pattern:
+ `https://<companyname-pricing>.ordering.predictix.com/sso/request`
> [!NOTE]
- > These values are placeholders. You need to use the actual sign-on URL and identifier. Contact the [Predictix Ordering support team](https://www.predix.io/support/) to get the values. You can also refer to the patterns shown in the **Basic SAML Configuration** dialog box in the Azure portal.
+ > These values are placeholders. Update these values with the actual Identifier and Sign on URL. Contact the [Predictix Ordering support team](https://www.predix.io/support/) to get the values. You can also refer to the patterns shown in the **Basic SAML Configuration** dialog box in the Azure portal.
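To illustrate the placeholder substitution above, here is a minimal Python sketch that fills the documented patterns for a hypothetical tenant name (`contoso-pricing` is illustrative; the real values must come from the Predictix Ordering support team):

```python
# Hypothetical helper that substitutes a tenant name into the documented
# URL patterns. "contoso-pricing" and the resulting URLs are illustrative
# only -- use the actual values supplied by the support team.
def predictix_ordering_urls(company):
    identifier = f"https://{company}.ordering.predictix.com"
    sign_on_url = f"https://{company}.ordering.predictix.com/sso/request"
    return identifier, sign_on_url

identifier, sign_on = predictix_ordering_urls("contoso-pricing")
print(identifier)  # https://contoso-pricing.ordering.predictix.com
print(sign_on)     # https://contoso-pricing.ordering.predictix.com/sso/request
```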
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, select the **Download** link next to **Certificate (Base64)**, per your requirements, and save the certificate on your computer:
To configure Azure AD single sign-on with Predictix Ordering, take these steps:
![Copy the configuration URLs](common/copy-configuration-urls.png)
- 1. **Login URL**.
-
- 2. **Azure AD Identifier**.
-
- 3. **Logout URL**.
-
-### Configure Predictix Ordering single sign-on
-
-To configure single sign-on on the Predictix Ordering side, you need to send the certificate that you downloaded and the URLs that you copied from the Azure portal to the [Predictix Ordering support team](https://www.predix.io/support/). This team ensures the SAML SSO connection is set properly on both sides.
- ### Create an Azure AD test user
-In this section, you'll create a test user named Britta Simon in the Azure portal.
-
-1. In the Azure portal, select **Azure Active Directory** in the left pane, select **Users**, and then select **All users**:
-
- ![Select All users](common/users.png)
-
-2. Select **New user** at the top of the screen:
-
- ![Select New user](common/new-user.png)
-
-3. In the **User** dialog box, take the following steps.
-
- ![User dialog box](common/user-properties.png)
-
- 1. In the **Name** box, enter **BrittaSimon**.
-
- 1. In the **User name** box, enter **BrittaSimon@\<yourcompanydomain>.\<extension>**. (For example, BrittaSimon@contoso.com.)
-
- 1. Select **Show Password**, and then write down the value that's in the **Password** box.
+In this section, you'll create a test user in the Azure portal called B.Simon.
- 1. Select **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you'll enable Britta Simon to use Azure AD single sign-on by granting her access to Predictix Ordering.
-
-1. In the Azure portal, select **Enterprise applications**, select **All applications**, and then select **Predictix Ordering**:
-
- ![Enterprise applications](common/enterprise-applications.png)
-
-2. In the list of applications, select **Predictix Ordering**.
-
- ![List of applications](common/all-applications.png)
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Predictix Ordering.
-3. In the left pane, select **Users and groups**:
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Predictix Ordering**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![Select Users and groups](common/users-groups-blade.png)
+## Configure Predictix Ordering SSO
-4. Select **Add user**, and then select **Users and groups** in the **Add Assignment** dialog box.
-
- ![Select Add user](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog box, select **Britta Simon** in the users list, and then click the **Select** button at the bottom of the screen.
-
-6. If you expect a role value in the SAML assertion, in the **Select Role** dialog box, select the appropriate role for the user from the list. Click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog box, select **Assign**.
+To configure single sign-on on the Predictix Ordering side, you need to send the certificate that you downloaded and the URLs that you copied from the Azure portal to the [Predictix Ordering support team](https://www.predix.io/support/). This team ensures the SAML SSO connection is set properly on both sides.
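Before sending the downloaded **Certificate (Base64)** file, you can sanity-check that it decodes to DER. A minimal sketch using only the Python standard library (the sample payload below is synthetic, not a real certificate):

```python
import base64

# Quick shape check (a sketch): a Base64/PEM certificate file should decode
# to DER, and an X.509 certificate in DER starts with a SEQUENCE tag (0x30).
def looks_like_der_certificate(pem_text):
    body = "".join(
        line for line in pem_text.splitlines()
        if line and not line.startswith("-----")  # drop PEM header/footer lines
    )
    der = base64.b64decode(body, validate=True)
    return len(der) > 2 and der[0] == 0x30

# Synthetic DER-shaped payload for demonstration only -- not a real certificate.
sample = (
    "-----BEGIN CERTIFICATE-----\n"
    + base64.b64encode(b"\x30\x82\x01\x00not-a-real-cert").decode()
    + "\n-----END CERTIFICATE-----\n"
)
print(looks_like_der_certificate(sample))  # True for this synthetic payload
```

This does not validate the certificate cryptographically; it only catches a truncated or mis-saved download before you hand the file off.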
### Create a Predictix Ordering test user Next, you need to create a user named Britta Simon in Predictix Ordering. Work with the [Predictix Ordering support team](https://www.predix.io/support/) to add users. Users need to be created and activated before you use single sign-on.
-### Test single sign-on
+## Test SSO
-Now you need to test your Azure AD single sign-on configuration by using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you select the Predictix Ordering tile in the Access Panel, you should be automatically signed in to the Predictix Ordering instance for which you set up SSO. For more information, see [Access and use apps on the My Apps portal](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+* Click **Test this application** in the Azure portal. This redirects to the Predictix Ordering Sign-on URL, where you can initiate the login flow.
-## Additional resources
+* Go to the Predictix Ordering Sign-on URL directly and initiate the login flow from there.
-- [Tutorials for integrating SaaS applications with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Predictix Ordering tile in My Apps, you're redirected to the Predictix Ordering Sign-on URL. For more information about My Apps, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Predictix Ordering, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Predictixpricereporting Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/predictixpricereporting-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with Predictix Price Reporting | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with Predictix Price Reporting'
description: In this tutorial, you'll learn how to configure single sign-on between Azure Active Directory and Predictix Price Reporting.
Previously updated : 03/26/2019 Last updated : 03/29/2022
-# Tutorial: Azure Active Directory integration with Predictix Price Reporting
+# Tutorial: Azure AD SSO integration with Predictix Price Reporting
-In this tutorial, you'll learn how to integrate Predictix Price Reporting with Azure Active Directory (Azure AD).
+In this tutorial, you'll learn how to integrate Predictix Price Reporting with Azure Active Directory (Azure AD). When you integrate Predictix Price Reporting with Azure AD, you can:
-This integration provides these benefits:
-
-* You can use Azure AD to control who has access to Predictix Price Reporting.
-* You can enable your users to be automatically signed in to Predictix Price Reporting (single sign-on) with their Azure AD accounts.
-* You can manage your accounts in one central location: the Azure portal.
-
-To learn more about SaaS app integration with Azure AD, see [Single sign-on to applications in Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you start.
+* Control in Azure AD who has access to Predictix Price Reporting.
+* Enable your users to be automatically signed-in to Predictix Price Reporting with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
## Prerequisites
-To configure Azure AD integration with Predictix Price Reporting, you need:
+To get started, you need the following items:
-* An Azure AD subscription. If you don't have an Azure AD environment, you can sign up for a [one-month trial](https://azure.microsoft.com/pricing/free-trial/) subscription.
-* A Predictix Price Reporting subscription that has single sign-on enabled.
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Predictix Price Reporting single sign-on (SSO) enabled subscription.
+* In addition to Cloud Application Administrator, the Application Administrator role can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
## Scenario description
In this tutorial, you'll configure and test Azure AD single sign-on in a test en
* Predictix Price Reporting supports SP-initiated SSO.
-## Adding Predictix Price Reporting from the gallery
-
-To set up the integration of Predictix Price Reporting into Azure AD, you need to add Predictix Price Reporting from the gallery to your list of managed SaaS apps.
-
-1. In the [Azure portal](https://portal.azure.com), in the left pane, select **Azure Active Directory**:
-
- ![Select Azure Active Directory](common/select-azuread.png)
-
-2. Go to **Enterprise applications** > **All applications**:
-
- ![The Enterprise applications blade](common/enterprise-applications.png)
-
-3. To add an application, select **New application** at the top of the window:
-
- ![Select New application](common/add-new-app.png)
-
-4. In the search box, enter **Predictix Price Reporting**. Select **Predictix Price Reporting** in the search results and then select **Add**.
+## Add Predictix Price Reporting from the gallery
- ![Search results](common/search-new-app.png)
+To configure the integration of Predictix Price Reporting into Azure AD, you need to add Predictix Price Reporting from the gallery to your list of managed SaaS apps.
-## Configure and test Azure AD single sign-on
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Predictix Price Reporting** in the search box.
+1. Select **Predictix Price Reporting** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-In this section, you'll configure and test Azure AD single sign-on with Predictix Price Reporting by using a test user named Britta Simon.
-To enable single sign-on, you need to establish a relationship between an Azure AD user and the corresponding user in Predictix Price Reporting.
+## Configure and test Azure AD SSO for Predictix Price Reporting
-To configure and test Azure AD single sign-on with Predictix Price Reporting, you need to complete these steps:
+Configure and test Azure AD SSO with Predictix Price Reporting using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Predictix Price Reporting.
-1. **[Configure Azure AD single sign-on](#configure-azure-ad-single-sign-on)** to enable the feature for your users.
-2. **[Configure Predictix Price Reporting single sign-on](#configure-predictix-price-reporting-single-sign-on)** on the application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** to test Azure AD single sign-on.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** to enable Azure AD single sign-on for the user.
-5. **[Create a Predictix Price Reporting test user](#create-a-predictix-price-reporting-test-user)** that's linked to the Azure AD representation of the user.
-6. **[Test single sign-on](#test-single-sign-on)** to verify that the configuration works.
+To configure and test Azure AD SSO with Predictix Price Reporting, perform the following steps:
-### Configure Azure AD single sign-on
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Predictix Price Reporting SSO](#configure-predictix-price-reporting-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create a Predictix Price Reporting test user](#create-a-predictix-price-reporting-test-user)** - to have a counterpart of B.Simon in Predictix Price Reporting that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-In this section, you'll enable Azure AD single sign-on in the Azure portal.
+## Configure Azure AD SSO
-To configure Azure AD single sign-on with Predictix Price Reporting, take these steps:
+Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **Predictix Price Reporting** application integration page, select **Single sign-on**:
+1. In the Azure portal, on the **Predictix Price Reporting** application integration page, find the **Manage** section and select **Single sign-on**.
+1. On the **Select a Single sign-on method** page, select **SAML**.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
- ![Select Single sign-on](common/select-sso.png)
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
-2. In the **Select a single sign-on method** dialog box, select **SAML/WS-Fed** mode to enable single sign-on:
+4. In the **Basic SAML Configuration** dialog box, perform the following steps:
- ![Select a single sign-on method](common/select-saml-option.png)
+ a. In the **Identifier (Entity ID)** box, type a URL using one of the following patterns:
+
+ | **Identifier** |
+ |-|
+ | `https://<companyname-pricing>.predictix.com` |
+ | `https://<companyname-pricing>.dev.predictix.com` |
-3. On the **Set up Single Sign-On with SAML** page, select the **Edit** icon to open the **Basic SAML Configuration** dialog box:
-
- ![Edit icon](common/edit-urls.png)
-
-4. In the **Basic SAML Configuration** dialog box, complete the following steps.
-
- ![Basic SAML Configuration dialog box](common/sp-identifier.png)
-
- 1. In the **Sign on URL** box, enter a URL in this pattern:
-
- `https://<companyname-pricing>.predictix.com/sso/request`
-
- 1. In the **Identifier (Entity ID)** box, enter a URL in this pattern:
-
- ```https
- https://<companyname-pricing>.predictix.com
- https://<companyname-pricing>.dev.predictix.com
- ```
+ b. In the **Sign on URL** box, type a URL using the following pattern:
+ `https://<companyname-pricing>.predictix.com/sso/request`
> [!NOTE]
- > These values are placeholders. You need to use the actual sign-on URL and identifier. Contact the [Predictix Price Reporting support team](https://www.infor.com/company/customer-center/) to get the values. You can also refer to the patterns shown in the **Basic SAML Configuration** dialog box in the Azure portal.
+ > These values are placeholders. Update these values with the actual Identifier and Sign on URL. Contact the [Predictix Price Reporting support team](https://www.infor.com/company/customer-center/) to get the values. You can also refer to the patterns shown in the **Basic SAML Configuration** dialog box in the Azure portal.
5. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, select the **Download** link next to **Certificate (Base64)**, per your requirements, and save the certificate on your computer:
To configure Azure AD single sign-on with Predictix Price Reporting, take these
![Copy the configuration URLs](common/copy-configuration-urls.png)
- 1. **Login URL**.
-
- 1. **Azure AD Identifier**.
-
- 1. **Logout URL**.
-
-### Configure Predictix Price Reporting single sign-on
-
-To configure single sign-on on the Predictix Price Reporting side, you need to send the certificate that you downloaded and the URLs that you copied from the Azure portal to the [Predictix Price Reporting support team](https://www.infor.com/company/customer-center/). This team ensures the SAML SSO connection is set properly on both sides.
- ### Create an Azure AD test user
-In this section, you'll create a test user named Britta Simon in the Azure portal.
-
-1. In the Azure portal, select **Azure Active Directory** in the left pane, select **Users**, and then select **All users**:
-
- ![Select All users](common/users.png)
-
-2. Select **New user** at the top of the screen:
-
- ![Select New user](common/new-user.png)
-
-3. In the **User** dialog box, take the following steps.
-
- ![User dialog box](common/user-properties.png)
-
- 1. In the **Name** box, enter **BrittaSimon**.
-
- 1. In the **User name** box, enter **BrittaSimon@\<yourcompanydomain>.\<extension>**. (For example, BrittaSimon@contoso.com.)
+In this section, you'll create a test user in the Azure portal called B.Simon.
- 1. Select **Show Password**, and then write down the value that's in the **Password** box.
-
- 1. Select **Create**.
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
### Assign the Azure AD test user
-In this section, you'll enable Britta Simon to use Azure AD single sign-on by granting her access to Predictix Price Reporting.
-
-1. In the Azure portal, select **Enterprise applications**, select **All applications**, and then select **Predictix Price Reporting**.
-
- ![Enterprise applications](common/enterprise-applications.png)
-
-2. In the list of applications, select **Predictix Price Reporting**.
-
- ![List of applications](common/all-applications.png)
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Predictix Price Reporting.
-3. In the left pane, select **Users and groups**:
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Predictix Price Reporting**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
- ![Select Users and groups](common/users-groups-blade.png)
+## Configure Predictix Price Reporting SSO
-4. Select **Add user**, and then select **Users and groups** in the **Add Assignment** dialog box.
-
- ![Select Add user](common/add-assign-user.png)
-
-5. In the **Users and groups** dialog box, select **Britta Simon** in the users list, and then click the **Select** button at the bottom of the screen.
-
-6. If you expect a role value in the SAML assertion, in the **Select Role** dialog box, select the appropriate role for the user from the list. Click the **Select** button at the bottom of the screen.
-
-7. In the **Add Assignment** dialog box, select **Assign**.
+To configure single sign-on on the Predictix Price Reporting side, you need to send the certificate that you downloaded and the URLs that you copied from the Azure portal to the [Predictix Price Reporting support team](https://www.infor.com/company/customer-center/). This team ensures the SAML SSO connection is set properly on both sides.
### Create a Predictix Price Reporting test user Next, you need to create a user named Britta Simon in Predictix Price Reporting. Work with the [Predictix Price Reporting support team](https://www.infor.com/company/customer-center/) to add users. Users need to be created and activated before you use single sign-on.
-### Test single sign-on
+## Test SSO
-Now you need to test your Azure AD single sign-on configuration by using the Access Panel.
+In this section, you test your Azure AD single sign-on configuration with the following options.
-When you select the Predictix Price Reporting tile in the Access Panel, you should be automatically signed in to the Predictix Price Reporting instance for which you set up SSO. For more information, see [Access and use apps on the My Apps portal](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+* Click **Test this application** in the Azure portal. This redirects to the Predictix Price Reporting Sign-on URL, where you can initiate the login flow.
-## Additional resources
+* Go to the Predictix Price Reporting Sign-on URL directly and initiate the login flow from there.
-- [Tutorials for integrating SaaS applications with Azure Active Directory](./tutorial-list.md)
+* You can use Microsoft My Apps. When you click the Predictix Price Reporting tile in My Apps, you are redirected to the Predictix Price Reporting Sign-on URL. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
-- [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is Conditional Access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure Predictix Price Reporting, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Pronovos Ops Manager Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/pronovos-ops-manager-tutorial.md
Title: 'Tutorial: Azure Active Directory integration with ProNovos Ops Manager | Microsoft Docs'
+ Title: 'Tutorial: Azure AD SSO integration with ProNovos Ops Manager'
description: Learn how to configure single sign-on between Azure Active Directory and ProNovos Ops Manager.
Previously updated : 07/18/2019 Last updated : 03/29/2022
-# Tutorial: Integrate ProNovos Ops Manager with Azure Active Directory
+# Tutorial: Azure AD SSO integration with ProNovos Ops Manager
In this tutorial, you'll learn how to integrate ProNovos Ops Manager with Azure Active Directory (Azure AD). When you integrate ProNovos Ops Manager with Azure AD, you can:
* Enable your users to be automatically signed-in to ProNovos Ops Manager with their Azure AD accounts.
* Manage your accounts in one central location - the Azure portal.
-To learn more about SaaS app integration with Azure AD, see [What is application access and single sign-on with Azure Active Directory](../manage-apps/what-is-single-sign-on.md).
-
## Prerequisites

To get started, you need the following items:

* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
* ProNovos Ops Manager single sign-on (SSO) enabled subscription.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
## Scenario description

In this tutorial, you configure and test Azure AD SSO in a test environment.
-* ProNovos Ops Manager supports **SP and IDP** initiated SSO
+* ProNovos Ops Manager supports **SP and IDP** initiated SSO.
-## Adding ProNovos Ops Manager from the gallery
+## Add ProNovos Ops Manager from the gallery
To configure the integration of ProNovos Ops Manager into Azure AD, you need to add ProNovos Ops Manager from the gallery to your list of managed SaaS apps.
-1. Sign in to the [Azure portal](https://portal.azure.com) using either a work or school account, or a personal Microsoft account.
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
1. On the left navigation pane, select the **Azure Active Directory** service.
1. Navigate to **Enterprise Applications** and then select **All Applications**.
1. To add a new application, select **New application**.
1. In the **Add from the gallery** section, type **ProNovos Ops Manager** in the search box.
1. Select **ProNovos Ops Manager** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
-## Configure and test Azure AD single sign-on
+## Configure and test Azure AD SSO for ProNovos Ops Manager
Configure and test Azure AD SSO with ProNovos Ops Manager using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in ProNovos Ops Manager.
-To configure and test Azure AD SSO with ProNovos Ops Manager, complete the following building blocks:
+To configure and test Azure AD SSO with ProNovos Ops Manager, perform the following steps:
1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
-2. **[Configure ProNovos Ops Manager SSO](#configure-pronovos-ops-manager-sso)** - to configure the Single Sign-On settings on application side.
-3. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
-4. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
-5. **[Create ProNovos Ops Manager test user](#create-pronovos-ops-manager-test-user)** - to have a counterpart of B.Simon in ProNovos Ops Manager that is linked to the Azure AD representation of user.
-6. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure ProNovos Ops Manager SSO](#configure-pronovos-ops-manager-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create ProNovos Ops Manager test user](#create-pronovos-ops-manager-test-user)** - to have a counterpart of B.Simon in ProNovos Ops Manager that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
-### Configure Azure AD SSO
+## Configure Azure AD SSO
Follow these steps to enable Azure AD SSO in the Azure portal.
-1. In the [Azure portal](https://portal.azure.com/), on the **ProNovos Ops Manager** application integration page, find the **Manage** section and select **Single sign-on**.
+1. In the Azure portal, on the **ProNovos Ops Manager** application integration page, find the **Manage** section and select **Single sign-on**.
1. On the **Select a Single sign-on method** page, select **SAML**.
-1. On the **Set up Single Sign-On with SAML** page, click the edit/pen icon for **Basic SAML Configuration** to edit the settings.
+1. On the **Set up Single Sign-On with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
![Edit Basic SAML Configuration](common/edit-urls.png)
1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
- In the **Sign-on URL** text box, type a URL:
+ In the **Sign-on URL** text box, type the URL:
   `https://gly.smartsubz.com/saml2/acs`

4. On the **Set up Single Sign-On with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Raw)** and select **Download** to download the certificate and save it on your computer.

    ![The Certificate download link](common/certificateraw.png)
![Copy configuration URLs](common/copy-configuration-urls.png)
-### Configure ProNovos Ops Manager SSO
-
-To configure single sign-on on **ProNovos Ops Manager** side, you need to send the downloaded **Certificate (Raw)** and appropriate copied URLs from Azure portal to [ProNovos Ops Manager support team](mailto:support@pronovos.com). They set this setting to have the SAML SSO connection set properly on both sides.
### Create an Azure AD test user

In this section, you'll create a test user in the Azure portal called B.Simon.
In this section, you'll enable B.Simon to use Azure single sign-on by granting access to ProNovos Ops Manager.
1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
1. In the applications list, select **ProNovos Ops Manager**.
1. In the app's overview page, find the **Manage** section and select **Users and groups**.
- ![The "Users and groups" link](common/users-groups-blade.png)
1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
- ![The Add User link](common/add-assign-user.png)
1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
1. In the **Add Assignment** dialog, click the **Assign** button.
+## Configure ProNovos Ops Manager SSO
+
+To configure single sign-on on the **ProNovos Ops Manager** side, you need to send the downloaded **Certificate (Raw)** and the appropriate copied URLs from the Azure portal to the [ProNovos Ops Manager support team](mailto:support@pronovos.com). They configure this setting to have the SAML SSO connection set properly on both sides.
+
### Create ProNovos Ops Manager test user

In this section, you create a user called B.Simon in ProNovos Ops Manager. Work with the [ProNovos Ops Manager support team](mailto:support@pronovos.com) to add the users in the ProNovos Ops Manager platform. Users must be created and activated before you use single sign-on.
-### Test SSO
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click **Test this application** in the Azure portal. This redirects to the ProNovos Ops Manager Sign-on URL, where you can initiate the login flow.
-In this section, you test your Azure AD single sign-on configuration using the Access Panel.
+* Go to the ProNovos Ops Manager Sign-on URL directly and initiate the login flow from there.
-When you click the ProNovos Ops Manager tile in the Access Panel, you should be automatically signed in to the ProNovos Ops Manager for which you set up SSO. For more information about the Access Panel, see [Introduction to the Access Panel](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+#### IDP initiated:
-## Additional Resources
+* Click **Test this application** in the Azure portal and you should be automatically signed in to the ProNovos Ops Manager instance for which you set up SSO.
-- [ List of Tutorials on How to Integrate SaaS Apps with Azure Active Directory ](./tutorial-list.md)
+You can also use Microsoft My Apps to test the application in either mode. When you click the ProNovos Ops Manager tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode, you are automatically signed in to the ProNovos Ops Manager instance for which you set up SSO. For more information, see [Introduction to My Apps](../user-help/my-apps-portal-end-user-access.md).
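In the SP-initiated flow above, the service provider sends the browser to Azure AD with a deflated, base64-encoded SAML `AuthnRequest` in the query string (the HTTP-Redirect binding). A rough sketch of that encoding follows — the request XML and the tenant login URL are illustrative placeholders, not values from this tutorial's actual configuration:

```python
import base64
import zlib
from urllib.parse import parse_qs, urlencode, urlparse

# Illustrative AuthnRequest; the real one is generated by the service provider.
authn_request = (
    '<samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" '
    'ID="_example" Version="2.0" '
    'AssertionConsumerServiceURL="https://gly.smartsubz.com/saml2/acs"/>'
)

def redirect_binding_url(idp_login_url: str, request_xml: str) -> str:
    """DEFLATE (raw, no zlib header) and base64-encode the request, then
    attach it as the SAMLRequest query parameter (HTTP-Redirect binding)."""
    raw_deflate = zlib.compress(request_xml.encode())[2:-4]  # strip zlib header/checksum
    saml_request = base64.b64encode(raw_deflate).decode()
    return idp_login_url + "?" + urlencode({"SAMLRequest": saml_request})

# Hypothetical tenant login URL; yours comes from the "Set up" section in the portal.
url = redirect_binding_url("https://login.microsoftonline.com/contoso/saml2", authn_request)

# Round-trip check: the IdP inflates the parameter back to the original XML.
echoed = zlib.decompress(
    base64.b64decode(parse_qs(urlparse(url).query)["SAMLRequest"][0]), -15
).decode()
```

The raw-deflate trick (stripping the 2-byte zlib header and 4-byte checksum) matches what SAML's redirect binding expects; IDP-initiated sign-in skips this step entirely, since Azure AD posts an unsolicited response to the ACS URL.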
-- [What is application access and single sign-on with Azure Active Directory? ](../manage-apps/what-is-single-sign-on.md)
+## Next steps
-- [What is conditional access in Azure Active Directory?](../conditional-access/overview.md)
+Once you configure ProNovos Ops Manager, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
active-directory Speexx Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/speexx-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with Speexx'
+description: Learn how to configure single sign-on between Azure Active Directory and Speexx.
+Last updated : 03/28/2022
+# Tutorial: Azure AD SSO integration with Speexx
+
+In this tutorial, you'll learn how to integrate Speexx with Azure Active Directory (Azure AD). When you integrate Speexx with Azure AD, you can:
+
+* Control in Azure AD who has access to Speexx.
+* Enable your users to be automatically signed-in to Speexx with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Speexx single sign-on (SSO) enabled subscription.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Speexx supports **SP** initiated SSO.
+* Speexx supports **Just In Time** user provisioning.
+
+## Add Speexx from the gallery
+
+To configure the integration of Speexx into Azure AD, you need to add Speexx from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Speexx** in the search box.
+1. Select **Speexx** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Speexx
+
+Configure and test Azure AD SSO with Speexx using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Speexx.
+
+To configure and test Azure AD SSO with Speexx, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Speexx SSO](#configure-speexx-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create Speexx test user](#create-speexx-test-user)** - to have a counterpart of B.Simon in Speexx that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Speexx** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** text box, type a URL using the following pattern:
+ `https://portal.speexx.com/auth/saml/<customername>`
+
+ b. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://portal.speexx.com/auth/saml/<customername>/adfs/postResponse`
+
+ c. In the **Sign-on URL** text box, type a URL using the following pattern:
+ `https://portal.speexx.com/auth/saml/<customername>`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier, Reply URL and Sign-on URL. Contact [Speexx Client support team](mailto:support@speexx.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url**, and save it on your computer.
+
+ ![The Certificate download link](common/copy-metadataurl.png)
+
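The **App Federation Metadata Url** copied above serves an XML document that contains, among other things, the Azure AD token-signing certificate that Speexx needs. A sketch of extracting that certificate with the Python standard library — the metadata below is a trimmed, made-up sample, not real tenant metadata:

```python
import xml.etree.ElementTree as ET

# Trimmed, made-up federation metadata; a real document is served from the
# App Federation Metadata Url copied in the step above.
metadata = """<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata"
    entityID="https://sts.windows.net/00000000-0000-0000-0000-000000000000/">
  <IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <KeyDescriptor use="signing">
      <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
        <X509Data><X509Certificate>MIIC...base64...</X509Certificate></X509Data>
      </KeyInfo>
    </KeyDescriptor>
  </IDPSSODescriptor>
</EntityDescriptor>"""

def signing_certificates(metadata_xml: str) -> list[str]:
    """Return the base64 bodies of all signing certificates in the metadata."""
    ns = {"md": "urn:oasis:names:tc:SAML:2.0:metadata",
          "ds": "http://www.w3.org/2000/09/xmldsig#"}
    root = ET.fromstring(metadata_xml)
    return [
        cert.text.strip()
        for key in root.iterfind(".//md:KeyDescriptor[@use='signing']", ns)
        for cert in key.iterfind(".//ds:X509Certificate", ns)
    ]

print(signing_certificates(metadata))
```

Azure AD may advertise more than one signing certificate during a rollover, which is why the helper returns a list rather than a single value.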
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Speexx.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Speexx**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Speexx SSO
+
+To configure single sign-on on the **Speexx** side, you need to send the **App Federation Metadata Url** to the [Speexx support team](mailto:support@speexx.com). They configure this setting to have the SAML SSO connection set properly on both sides.
+
+### Create Speexx test user
+
+In this section, a user called B.Simon is created in Speexx. Speexx supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Speexx, a new one is created after authentication.
+
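Just-in-time provisioning means the application creates the account on first successful SSO rather than requiring it up front. A minimal sketch of the pattern, with an in-memory dictionary standing in for the application's user store — illustrative only, since Speexx's actual implementation is internal to its service:

```python
# In-memory stand-in for the application's user store.
users: dict[str, dict] = {}

def on_saml_login(name_id: str, attributes: dict) -> dict:
    """Return the existing user, or create one from the SAML assertion
    attributes the first time this subject signs in (just-in-time)."""
    if name_id not in users:
        users[name_id] = {
            "email": name_id,
            "first_name": attributes.get("FirstName", ""),
            "last_name": attributes.get("LastName", ""),
        }
    return users[name_id]

user = on_saml_login("B.Simon@contoso.com", {"FirstName": "B", "LastName": "Simon"})
print(user["last_name"])  # → Simon
```

A second sign-in by the same subject returns the existing record instead of creating a duplicate, which is the behavior the "no action item for you" note above relies on.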
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click **Test this application** in the Azure portal. This redirects to the Speexx Sign-on URL, where you can initiate the login flow.
+
+* Go to the Speexx Sign-on URL directly and initiate the login flow from there.
+
+* You can use Microsoft My Apps. When you click the Speexx tile in My Apps, you are redirected to the Speexx Sign-on URL. For more information, see [Introduction to My Apps](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+
+## Next steps
+
+Once you configure Speexx, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-any-app).
active-directory Twilio Sendgrid Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/twilio-sendgrid-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with Twilio Sendgrid'
+description: Learn how to configure single sign-on between Azure Active Directory and Twilio Sendgrid.
+Last updated : 04/04/2022
+# Tutorial: Azure AD SSO integration with Twilio Sendgrid
+
+In this tutorial, you'll learn how to integrate Twilio Sendgrid with Azure Active Directory (Azure AD). When you integrate Twilio Sendgrid with Azure AD, you can:
+
+* Control in Azure AD who has access to Twilio Sendgrid.
+* Enable your users to be automatically signed-in to Twilio Sendgrid with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Twilio Sendgrid single sign-on (SSO) enabled subscription.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Twilio Sendgrid supports **SP and IDP** initiated SSO.
+* Twilio Sendgrid supports **Just In Time** user provisioning.
+
+## Add Twilio Sendgrid from the gallery
+
+To configure the integration of Twilio Sendgrid into Azure AD, you need to add Twilio Sendgrid from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add new application, select **New application**.
+1. In the **Add from the gallery** section, type **Twilio Sendgrid** in the search box.
+1. Select **Twilio Sendgrid** from results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Twilio Sendgrid
+
+Configure and test Azure AD SSO with Twilio Sendgrid using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Twilio Sendgrid.
+
+To configure and test Azure AD SSO with Twilio Sendgrid, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Twilio Sendgrid SSO](#configure-twilio-sendgrid-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create Twilio Sendgrid test user](#create-twilio-sendgrid-test-user)** - to have a counterpart of B.Simon in Twilio Sendgrid that is linked to the Azure AD representation of user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Twilio Sendgrid** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, perform the following steps:
+
+ a. In the **Identifier** textbox, type a URL using the following pattern:
+ `https://api.sendgrid.com/v3/public/sso/saml/response/id/<uuid>`
+
+ b. In the **Reply URL** textbox, type a URL using the following pattern:
+ `https://api.sendgrid.com/v3/public/sso/saml/response/id/<uuid>`
+
+1. Click **Set additional URLs** and perform the following step if you wish to configure the application in **SP** initiated mode:
+
+ In the **Sign-on URL** text box, type the URL:
+ `https://app.sendgrid.com/ssologin`
+
+ > [!NOTE]
+ > These values are not real. Update these values with the actual Identifier and Reply URL. Contact [Twilio Sendgrid Client support team](mailto:help@sendgrid.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. Click **Save**.
+
+1. The Twilio Sendgrid application expects the SAML assertions in a specific format, which requires you to add custom attribute mappings to your SAML token attributes configuration. The following screenshot shows the list of default attributes.
+
+ ![image](common/default-attributes.png)
+
+1. In addition to the above, the Twilio Sendgrid application expects a few more attributes to be passed back in the SAML response, which are shown below. These attributes are also pre-populated, but you can review them as per your requirements.
+
+ | Name | Source Attribute|
+ | | |
+ | FirstName | user.givenname |
+ | LastName | user.surname |
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/certificatebase64.png)
+
+1. On the **Set up Twilio Sendgrid** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
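The FirstName and LastName mappings in the table above are ultimately emitted as `<Attribute>` elements inside the assertion's `AttributeStatement`. A sketch of what that fragment looks like, built with the Python standard library from illustrative user values:

```python
import xml.etree.ElementTree as ET

SAML_NS = "urn:oasis:names:tc:SAML:2.0:assertion"

def attribute_statement(claims: dict[str, str]) -> ET.Element:
    """Build an AttributeStatement element carrying the mapped claims."""
    stmt = ET.Element(f"{{{SAML_NS}}}AttributeStatement")
    for name, value in claims.items():
        attr = ET.SubElement(stmt, f"{{{SAML_NS}}}Attribute", Name=name)
        ET.SubElement(attr, f"{{{SAML_NS}}}AttributeValue").text = value
    return stmt

# Values mirroring the user.givenname / user.surname mappings in the table above.
stmt = attribute_statement({"FirstName": "B", "LastName": "Simon"})
print(ET.tostring(stmt, encoding="unicode"))
```

Azure AD builds this element for you from the claim mappings; the sketch only shows the shape the service provider receives.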
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Twilio Sendgrid.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Twilio Sendgrid**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you're expecting any role value in the SAML assertion, in the **Select Role** dialog, select the appropriate role for the user from the list and then click the **Select** button at the bottom of the screen.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Twilio Sendgrid SSO
+
+To configure single sign-on on the **Twilio Sendgrid** side, you need to send the **Certificate (Base64)** to the [Twilio Sendgrid support team](mailto:help@sendgrid.com). They configure this setting to have the SAML SSO connection set properly on both sides.
+
+### Create Twilio Sendgrid test user
+
+In this section, a user called B.Simon is created in Twilio Sendgrid. Twilio Sendgrid supports just-in-time user provisioning, which is enabled by default. There is no action item for you in this section. If a user doesn't already exist in Twilio Sendgrid, a new one is created after authentication.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+#### SP initiated:
+
+* Click **Test this application** in the Azure portal. This redirects to the Twilio Sendgrid Sign-on URL, where you can initiate the login flow.
+
+* Go to the Twilio Sendgrid Sign-on URL directly and initiate the login flow from there.
+
+#### IDP initiated:
+
+* Click **Test this application** in the Azure portal and you should be automatically signed in to the Twilio Sendgrid instance for which you set up SSO.
+
+You can also use Microsoft My Apps to test the application in either mode. When you click the Twilio Sendgrid tile in My Apps, if configured in SP mode you are redirected to the application sign-on page to initiate the login flow, and if configured in IDP mode, you are automatically signed in to the Twilio Sendgrid instance for which you set up SSO. For more information, see [Introduction to My Apps](https://support.microsoft.com/account-billing/sign-in-and-start-apps-from-the-my-apps-portal-2f3b1bae-0e5a-4a86-a33e-876fbd2a4510).
+
+## Next steps
+
+Once you configure Twilio Sendgrid, you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Defender for Cloud Apps](/cloud-app-security/proxy-deployment-aad).
active-directory Us Bank Prepaid Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/us-bank-prepaid-tutorial.md
Previously updated : 03/04/2022 Last updated : 04/08/2022
Follow these steps to enable Azure AD SSO in the Azure portal.
a. In the **Identifier** text box, type the value: `USBank:SAML2.0:Prepaid_SP`
- b. In the **Reply URL** text box, type a URL using the following pattern:
- `https://<Environment>.usbank.com/sp/ACS.saml2`
+ b. In the **Reply URL** text box, type the URL:
+ `https://federation.usbank.com/sp/ACS.saml2`
- c. In the **Sign-on URL** text box, type a URL using the following pattern:
- `https://<Environment>.usbank.com/sp/startSSO.ping?PartnerIdpId=<ID>`
-
- > [!NOTE]
- > These values are not real. Update these values with the actual Reply URL and Sign-on URL. Contact [U.S. Bank Prepaid Client support team](mailto:web.access.management@usbank.com) to get these values. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+ c. In the **Sign-on URL** text box, type the URL:
+ `https://federation.usbank.com/sp/startSSO.ping?PartnerIdpId=<ID>`
+ > [!NOTE]
+ > The Sign-on URL value is not real. Update this value with the actual Sign-on URL. Contact [U.S. Bank Prepaid Client support team](mailto:web.access.management@usbank.com) to get the value. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, click the copy button to copy the **App Federation Metadata Url**, and save it on your computer.
+
+    ![The Certificate download link](common/copy-metadataurl.png)
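The SP-initiated Sign-on URL above carries the partner IdP identifier as a query parameter. A sketch of assembling it with `urllib.parse` so reserved characters are percent-encoded correctly — the `PartnerIdpId` value here is a made-up placeholder; the real one comes from the U.S. Bank Prepaid support team:

```python
from urllib.parse import urlencode

BASE = "https://federation.usbank.com/sp/startSSO.ping"

def sign_on_url(partner_idp_id: str) -> str:
    """Attach the tenant's PartnerIdpId, percent-encoding reserved characters."""
    return BASE + "?" + urlencode({"PartnerIdpId": partner_idp_id})

# Hypothetical ID for illustration only; the real value comes from U.S. Bank.
url = sign_on_url("urn:example:contoso-idp")
print(url)
```

Using `urlencode` instead of string concatenation matters if the ID contains characters such as `:` or `/`, which must be escaped inside a query string.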
active-directory Vault Platform Tutorial https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/active-directory/saas-apps/vault-platform-tutorial.md
+
+ Title: 'Tutorial: Azure AD SSO integration with Vault Platform'
+description: Learn how to configure single sign-on between Azure Active Directory and Vault Platform.
+Last updated : 03/28/2022
+# Tutorial: Azure AD SSO integration with Vault Platform
+
+In this tutorial, you'll learn how to integrate Vault Platform with Azure Active Directory (Azure AD). When you integrate Vault Platform with Azure AD, you can:
+
+* Control in Azure AD who has access to Vault Platform.
+* Enable your users to be automatically signed-in to Vault Platform with their Azure AD accounts.
+* Manage your accounts in one central location - the Azure portal.
+
+## Prerequisites
+
+To get started, you need the following items:
+
+* An Azure AD subscription. If you don't have a subscription, you can get a [free account](https://azure.microsoft.com/free/).
+* Vault Platform single sign-on (SSO) enabled subscription.
+* Along with Cloud Application Administrator, Application Administrator can also add or manage applications in Azure AD.
+For more information, see [Azure built-in roles](../roles/permissions-reference.md).
+
+## Scenario description
+
+In this tutorial, you configure and test Azure AD SSO in a test environment.
+
+* Vault Platform supports **IDP** initiated SSO.
+
+> [!NOTE]
+> The Identifier of this application is a fixed string value, so only one instance can be configured in one tenant.
+
+## Add Vault Platform from the gallery
+
+To configure the integration of Vault Platform into Azure AD, you need to add Vault Platform from the gallery to your list of managed SaaS apps.
+
+1. Sign in to the Azure portal using either a work or school account, or a personal Microsoft account.
+1. On the left navigation pane, select the **Azure Active Directory** service.
+1. Navigate to **Enterprise Applications** and then select **All Applications**.
+1. To add a new application, select **New application**.
+1. In the **Add from the gallery** section, type **Vault Platform** in the search box.
+1. Select **Vault Platform** from the results panel and then add the app. Wait a few seconds while the app is added to your tenant.
+
+## Configure and test Azure AD SSO for Vault Platform
+
+Configure and test Azure AD SSO with Vault Platform using a test user called **B.Simon**. For SSO to work, you need to establish a link relationship between an Azure AD user and the related user in Vault Platform.
+
+To configure and test Azure AD SSO with Vault Platform, perform the following steps:
+
+1. **[Configure Azure AD SSO](#configure-azure-ad-sso)** - to enable your users to use this feature.
+ 1. **[Create an Azure AD test user](#create-an-azure-ad-test-user)** - to test Azure AD single sign-on with B.Simon.
+ 1. **[Assign the Azure AD test user](#assign-the-azure-ad-test-user)** - to enable B.Simon to use Azure AD single sign-on.
+1. **[Configure Vault Platform SSO](#configure-vault-platform-sso)** - to configure the single sign-on settings on the application side.
+ 1. **[Create Vault Platform test user](#create-vault-platform-test-user)** - to have a counterpart of B.Simon in Vault Platform that is linked to the Azure AD representation of the user.
+1. **[Test SSO](#test-sso)** - to verify whether the configuration works.
+
+## Configure Azure AD SSO
+
+Follow these steps to enable Azure AD SSO in the Azure portal.
+
+1. In the Azure portal, on the **Vault Platform** application integration page, find the **Manage** section and select **single sign-on**.
+1. On the **Select a single sign-on method** page, select **SAML**.
+1. On the **Set up single sign-on with SAML** page, click the pencil icon for **Basic SAML Configuration** to edit the settings.
+
+ ![Edit Basic SAML Configuration](common/edit-urls.png)
+
+1. On the **Basic SAML Configuration** section, perform the following step:
+
+ a. In the **Reply URL** text box, type a URL using the following pattern:
+ `https://vaultplatform.com/api/portal/sessions/saml/<tenant-identifier>`
+
+ > [!NOTE]
+ > The Reply URL value is not real. Update the value with the actual Reply URL. Contact [Vault Platform Client support team](mailto:azure@vaultplatform.com) to get the value. You can also refer to the patterns shown in the **Basic SAML Configuration** section in the Azure portal.
+
+1. On the **Set up single sign-on with SAML** page, in the **SAML Signing Certificate** section, find **Certificate (Base64)** and select **Download** to download the certificate and save it on your computer.
+
+ ![The Certificate download link](common/certificatebase64.png)
+
+1. On the **Set up Vault Platform** section, copy the appropriate URL(s) based on your requirement.
+
+ ![Copy configuration URLs](common/copy-configuration-urls.png)
+
+### Create an Azure AD test user
+
+In this section, you'll create a test user in the Azure portal called B.Simon.
+
+1. From the left pane in the Azure portal, select **Azure Active Directory**, select **Users**, and then select **All users**.
+1. Select **New user** at the top of the screen.
+1. In the **User** properties, follow these steps:
+ 1. In the **Name** field, enter `B.Simon`.
+ 1. In the **User name** field, enter the username@companydomain.extension. For example, `B.Simon@contoso.com`.
+ 1. Select the **Show password** check box, and then write down the value that's displayed in the **Password** box.
+ 1. Click **Create**.
+
+### Assign the Azure AD test user
+
+In this section, you'll enable B.Simon to use Azure single sign-on by granting access to Vault Platform.
+
+1. In the Azure portal, select **Enterprise Applications**, and then select **All applications**.
+1. In the applications list, select **Vault Platform**.
+1. In the app's overview page, find the **Manage** section and select **Users and groups**.
+1. Select **Add user**, then select **Users and groups** in the **Add Assignment** dialog.
+1. In the **Users and groups** dialog, select **B.Simon** from the Users list, then click the **Select** button at the bottom of the screen.
+1. If you are expecting a role to be assigned to the users, you can select it from the **Select a role** dropdown. If no role has been set up for this app, you see the "Default Access" role selected.
+1. In the **Add Assignment** dialog, click the **Assign** button.
+
+## Configure Vault Platform SSO
+
+To configure single sign-on on the **Vault Platform** side, you need to send the **Certificate (Base64)** to the [Vault Platform support team](mailto:azure@vaultplatform.com). They set this value so that the SAML SSO connection is configured properly on both sides.
+
+### Create Vault Platform test user
+
+In this section, you create a user called Britta Simon in Vault Platform. Work with the [Vault Platform support team](mailto:azure@vaultplatform.com) to add the users to Vault Platform. Users must be created and activated before you use single sign-on.
+
+## Test SSO
+
+In this section, you test your Azure AD single sign-on configuration with the following options.
+
+* Click on **Test this application** in the Azure portal and you should be automatically signed in to the Vault Platform for which you set up the SSO.
+
+* You can use Microsoft My Apps. When you click the Vault Platform tile in the My Apps, you should be automatically signed in to the Vault Platform for which you set up the SSO. For more information about the My Apps, see [Introduction to the My Apps](../user-help/my-apps-portal-end-user-access.md).
+
+## Next steps
+
+Once you configure Vault Platform you can enforce session control, which protects against exfiltration and infiltration of your organization's sensitive data in real time. Session control extends from Conditional Access. [Learn how to enforce session control with Microsoft Cloud App Security](/cloud-app-security/proxy-deployment-aad).
aks Use Azure Dedicated Hosts https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/aks/use-azure-dedicated-hosts.md
A host group is a resource that represents a collection of dedicated hosts. You
* Span across multiple availability zones. In this case, you are required to have a host group in each of the zones you wish to use.
* Span across multiple fault domains, which are mapped to physical racks.
-In either case, you are need to provide the fault domain count for your host group. If you do not want to span fault domains in your group, use a fault domain count of 1.
+In either case, you need to provide the fault domain count for your host group. If you do not want to span fault domains in your group, use a fault domain count of 1.
You can also decide to use both availability zones and fault domains.
app-service Configure Common https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/configure-common.md
At runtime, connection strings are available as environment variables, prefixed
* Custom: `CUSTOMCONNSTR_` * PostgreSQL: `POSTGRESQLCONNSTR_`
-For example, a MySql connection string named *connectionstring1* can be accessed as the environment variable `MYSQLCONNSTR_connectionString1`. For language-stack specific steps, see:
+For example, a MySQL connection string named *connectionstring1* can be accessed as the environment variable `MYSQLCONNSTR_connectionString1`. For language-stack specific steps, see:
- [ASP.NET Core](configure-language-dotnetcore.md#access-environment-variables) - [Node.js](configure-language-nodejs.md#access-environment-variables)
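As an illustration of the prefix-plus-name scheme above, a minimal Python sketch (the connection string value and the helper function are made-up placeholders, not real credentials or an official API):

```python
import os

# Simulate what App Service does at runtime: each connection string is
# injected as an environment variable named <TYPE-PREFIX> + <setting name>.
# The value below is a made-up placeholder.
os.environ["MYSQLCONNSTR_connectionString1"] = "Server=myserver;Database=mydb;"

# Prefixes as listed above; illustrative lookup helper only.
PREFIXES = {
    "MySQL": "MYSQLCONNSTR_",
    "Custom": "CUSTOMCONNSTR_",
    "PostgreSQL": "POSTGRESQLCONNSTR_",
}

def get_connection_string(name, conn_type="MySQL"):
    """Read a connection string via its prefixed environment variable."""
    return os.environ.get(PREFIXES[conn_type] + name)

print(get_connection_string("connectionString1"))
```

The same lookup works in any language stack, since the values are plain environment variables.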
app-service Configure Connect To Azure Storage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/configure-connect-to-azure-storage.md
description: Learn how to attach custom network share in a containerized app in
Previously updated : 12/03/2021 Last updated : 3/10/2022
-zone_pivot_groups: app-service-containers-windows-linux
+zone_pivot_groups: app-service-containers-code
# Mount Azure Storage as a local share in a custom container in App Service
+> [!NOTE]
+> Mounting Azure Storage as a local share for App Service on Windows code is currently in preview.
+>
+This guide shows how to mount Azure Storage Files as a network share in Windows code in App Service. Only [Azure Files Shares](../storage/files/storage-how-to-use-files-portal.md) and [Premium Files Shares](../storage/files/storage-how-to-create-file-share.md) are supported. The benefits of custom-mounted storage include:
+
+- Configure persistent storage for your App Service app and manage the storage separately.
+- Make static content like video and images readily available for your App Service app.
+- Write application log files or archive older application logs to Azure File shares.
+- Share content across multiple apps or with other Azure services.
+
+The following features are supported for Windows code:
+
+- Secured access to storage accounts with [private endpoints](../storage/common/storage-private-endpoints.md) and [service endpoints](../storage/common/storage-network-security.md#grant-access-from-a-virtual-network) (when [VNET integration](./overview-vnet-integration.md) is used).
+- Azure Files (read/write).
+- Up to five mount points per app.
+- Mount Azure Storage file shares using "/mounts/`<path-name>`".
+ ::: zone pivot="container-windows" This guide shows how to mount Azure Storage Files as a network share in a Windows container in App Service. Only [Azure Files Shares](../storage/files/storage-how-to-use-files-portal.md) and [Premium Files Shares](../storage/files/storage-how-to-create-file-share.md) are supported. The benefits of custom-mounted storage include:
The following features are supported for Linux containers:
::: zone-end
-<!-- ::: zone pivot="container-windows"
-
+## Prerequisites
+- [An existing Windows code app in App Service](quickstart-dotnetcore.md)
+- [Create Azure file share](../storage/files/storage-how-to-use-files-portal.md)
+- [Upload files to Azure File share](../storage/files/storage-how-to-create-file-share.md)
-## Prerequisites
::: zone pivot="container-windows"
The following features are supported for Linux containers:
## Limitations +
+- [Storage firewall](../storage/common/storage-network-security.md) is supported only through [private endpoints](../storage/common/storage-private-endpoints.md) and [service endpoints](../storage/common/storage-network-security.md#grant-access-from-a-virtual-network) (when [VNET integration](./overview-vnet-integration.md) is used).
+- Azure blobs are not supported when configuring Azure storage mounts for Windows code apps deployed to App Service.
+- FTP/FTPS access to mounted storage is not supported (use [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/)).
+- Mapping `/mounts`, `mounts/foo/bar`, `/`, and `/mounts/foo.bar/` to custom-mounted storage is not supported (you can only use `/mounts/pathname` for mounting custom storage to your web app).
+- Storage mounts cannot be used together with clone settings option during [deployment slot](deploy-staging-slots.md) creation.
+- Storage mounts are not backed up when you [back up your app](manage-backup.md). Be sure to follow best practices to back up the Azure Storage accounts.
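The mount-path constraints above can be sketched as a small validation helper. This is an illustrative check under the stated limitations (a single plain segment directly under `/mounts`), not the exact rule App Service enforces:

```python
import re

# Illustrative only: accepts exactly one path segment under /mounts,
# mirroring the limitation that only /mounts/<path-name> is supported.
_MOUNT_PATH = re.compile(r"^/mounts/[^/.]+$")

def is_valid_mount_path(path: str) -> bool:
    return bool(_MOUNT_PATH.match(path))

print(is_valid_mount_path("/mounts/media"))     # True
print(is_valid_mount_path("/mounts/foo/bar"))   # False: nested segment
print(is_valid_mount_path("/mounts/foo.bar/"))  # False
print(is_valid_mount_path("/"))                 # False
```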
++ ::: zone pivot="container-windows" - Storage mounts are not supported for native Windows (non-containerized) apps.
The following features are supported for Linux containers:
> ::: zone-end
+## Mount storage to Windows code
::: zone pivot="container-windows" ## Mount storage to Windows container ::: zone-end
To validate that the Azure Storage is mounted successfully for the app:
## Best practices - To avoid potential issues related to latency, place the app and the Azure Storage account in the same Azure region. Note, however, if the app and Azure Storage account are in same Azure region, and if you grant access from App Service IP addresses in the [Azure Storage firewall configuration](../storage/common/storage-network-security.md), then these IP restrictions are not honored.-- The mount directory in the custom container should be empty. Any content stored at this path is deleted when the Azure Storage is mounted (if you specify a directory under `/home`, for example). If you are migrating files for an existing app, make a backup of the app and its content before you begin. -- Mounting the storage to `/home` is not recommended because it may result in performance bottlenecks for the app. - In the Azure Storage account, avoid [regenerating the access key](../storage/common/storage-account-keys-manage.md) that's used to mount the storage in the app. The storage account contains two different keys. Use a stepwise approach to ensure that the storage mount remains available to the app during key regeneration. For example, assuming that you used **key1** to configure storage mount in your app: 1. Regenerate **key2**.
To validate that the Azure Storage is mounted successfully for the app:
- If you delete an Azure Storage account, container, or share, remove the corresponding storage mount configuration in the app to avoid possible error scenarios.
+- The mounted Azure Storage account can be either Standard or Premium performance tier. Based on the app capacity and throughput requirements, choose the appropriate performance tier for the storage account. See the [scalability and performance targets for Files](../storage/files/storage-files-scale-targets.md).
+
+- If your app [scales to multiple instances](../azure-monitor/autoscale/autoscale-get-started.md), all the instances connect to the same mounted Azure Storage account. To avoid performance bottlenecks and throughput issues, choose the appropriate performance tier for the storage account.
+
+- It's not recommended to use storage mounts for local databases (such as SQLite) or for any other applications and components that rely on file handles and locks.
+
+- If you [initiate a storage failover](../storage/common/storage-initiate-account-failover.md) and the storage account is mounted to the app, the mount will fail to connect until you either restart the app or remove and add the Azure Storage mount.
+
+- When using Azure Storage [private endpoints](../storage/common/storage-private-endpoints.md) with the app, you need to [enable the **Route All** setting](configure-vnet-integration-routing.md).
+
+- When VNET integration is used, ensure the app setting `WEBSITE_CONTENTOVERVNET` is set to `1` and that the following ports are open:
+ - Azure Files: 80 and 445
+ - The mounted Azure Storage account can be either Standard or Premium performance tier. Based on the app capacity and throughput requirements, choose the appropriate performance tier for the storage account. See [the scalability and performance targets for Files](../storage/files/storage-files-scale-targets.md) ::: zone-end-- The mounted Azure Storage account can be either Standard or Premium performance tier. Based on the app capacity and throughput requirements, choose the appropriate performance tier for the storage account. See the scalability and performance targets that correspond to the storage type:
- - [For Files](../storage/files/storage-files-scale-targets.md)
- - [For Blobs](../storage/blobs/scalability-targets.md)
+- To avoid potential issues related to latency, place the app and the Azure Storage account in the same Azure region. Note, however, if the app and Azure Storage account are in same Azure region, and if you grant access from App Service IP addresses in the [Azure Storage firewall configuration](../storage/common/storage-network-security.md), then these IP restrictions are not honored.
+
+- In the Azure Storage account, avoid [regenerating the access key](../storage/common/storage-account-keys-manage.md) that's used to mount the storage in the app. The storage account contains two different keys. Use a stepwise approach to ensure that the storage mount remains available to the app during key regeneration. For example, assuming that you used **key1** to configure storage mount in your app:
+
+ 1. Regenerate **key2**.
+ 1. In the storage mount configuration, update the access key to use the regenerated **key2**.
+ 1. Regenerate **key1**.
+
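The stepwise approach above can be modeled with a short Python sketch. This is a toy simulation (not the Azure SDK or CLI): its point is the invariant that the key the mount is configured with is never the key being regenerated, so the mount stays valid at every step.

```python
# Toy model of stepwise access-key rotation (illustration only).
import secrets

class StorageAccount:
    """Stand-in for a storage account with two independent access keys."""
    def __init__(self):
        self.keys = {"key1": secrets.token_hex(8), "key2": secrets.token_hex(8)}

    def regenerate(self, name):
        self.keys[name] = secrets.token_hex(8)

account = StorageAccount()
# The app's storage mount was configured with key1.
mount_key_name, mount_key = "key1", account.keys["key1"]

# 1. Regenerate key2 (the key the mount is NOT using).
account.regenerate("key2")
assert mount_key == account.keys[mount_key_name]  # mount still valid

# 2. Point the storage mount at the fresh key2.
mount_key_name, mount_key = "key2", account.keys["key2"]

# 3. Now it is safe to regenerate key1.
account.regenerate("key1")
assert mount_key == account.keys[mount_key_name]  # mount still valid
print("mount stayed valid throughout rotation")
```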
+- If you delete an Azure Storage account, container, or share, remove the corresponding storage mount configuration in the app to avoid possible error scenarios.
+
+- The mounted Azure Storage account can be either Standard or Premium performance tier. Based on the app capacity and throughput requirements, choose the appropriate performance tier for the storage account. See the [scalability and performance targets for Files](../storage/files/storage-files-scale-targets.md).
- If your app [scales to multiple instances](../azure-monitor/autoscale/autoscale-get-started.md), all the instances connect to the same mounted Azure Storage account. To avoid performance bottlenecks and throughput issues, choose the appropriate performance tier for the storage account. - It's not recommended to use storage mounts for local databases (such as SQLite) or for any other applications and components that rely on file handles and locks.
+- If you [initiate a storage failover](../storage/common/storage-initiate-account-failover.md) and the storage account is mounted to the app, the mount will fail to connect until you either restart the app or remove and add the Azure Storage mount.
+
- When using Azure Storage [private endpoints](../storage/common/storage-private-endpoints.md) with the app, you need to [enable the **Route All** setting](configure-vnet-integration-routing.md). > [!NOTE] > In App Service environment V3, the **Route All** setting is disabled by default and must be explicitly enabled. ::: zone-end+ ::: zone pivot="container-linux"
+- To avoid potential issues related to latency, place the app and the Azure Storage account in the same Azure region. Note, however, if the app and Azure Storage account are in same Azure region, and if you grant access from App Service IP addresses in the [Azure Storage firewall configuration](../storage/common/storage-network-security.md), then these IP restrictions are not honored.
+
+- The mount directory in the custom container should be empty. Any content stored at this path is deleted when the Azure Storage is mounted (if you specify a directory under `/home`, for example). If you are migrating files for an existing app, make a backup of the app and its content before you begin.
+
+- Mounting the storage to `/home` is not recommended because it may result in performance bottlenecks for the app.
+
+- In the Azure Storage account, avoid [regenerating the access key](../storage/common/storage-account-keys-manage.md) that's used to mount the storage in the app. The storage account contains two different keys. Use a stepwise approach to ensure that the storage mount remains available to the app during key regeneration. For example, assuming that you used **key1** to configure storage mount in your app:
+
+ 1. Regenerate **key2**.
+ 1. In the storage mount configuration, update the access key to use the regenerated **key2**.
+ 1. Regenerate **key1**.
+
+- If you delete an Azure Storage account, container, or share, remove the corresponding storage mount configuration in the app to avoid possible error scenarios.
+
+- The mounted Azure Storage account can be either Standard or Premium performance tier. Based on the app capacity and throughput requirements, choose the appropriate performance tier for the storage account. See the scalability and performance targets that correspond to the storage type:
+
+ - [For Files](../storage/files/storage-files-scale-targets.md)
+ - [For Blobs](../storage/blobs/scalability-targets.md)
+
+- If your app [scales to multiple instances](../azure-monitor/autoscale/autoscale-get-started.md), all the instances connect to the same mounted Azure Storage account. To avoid performance bottlenecks and throughput issues, choose the appropriate performance tier for the storage account.
+
+- It's not recommended to use storage mounts for local databases (such as SQLite) or for any other applications and components that rely on file handles and locks.
+ - When using Azure Storage [private endpoints](../storage/common/storage-private-endpoints.md) with the app, you need to [enable the **Route All** setting](configure-vnet-integration-routing.md). - If you [initiate a storage failover](../storage/common/storage-initiate-account-failover.md) and the storage account is mounted to the app, the mount will fail to connect until you either restart the app or remove and add the Azure Storage mount. ++ ## Next steps +
+- [Migrate custom software to Azure App Service using a custom container](tutorial-custom-container.md?pivots=container-windows).
++ ::: zone pivot="container-windows" - [Migrate custom software to Azure App Service using a custom container](tutorial-custom-container.md?pivots=container-windows).
app-service Configure Ssl Certificate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/configure-ssl-certificate.md
To secure a custom domain in a TLS binding, the certificate has additional requi
The free App Service managed certificate is a turn-key solution for securing your custom DNS name in App Service. It's a TLS/SSL server certificate that's fully managed by App Service and renewed continuously and automatically in six-month increments, 45 days before expiration, as long as the prerequisites set-up remain the same without any action required from you. All the associated bindings will be updated with the renewed certificate. You create the certificate and bind it to a custom domain, and let App Service do the rest.
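As a rough illustration of that cadence (the dates below are hypothetical; App Service controls the actual schedule), the renewal point for a given expiration date falls 45 days earlier:

```python
from datetime import date, timedelta

def renewal_date(expires_on: date, lead_days: int = 45) -> date:
    """Renewal is triggered this many days before the certificate expires."""
    return expires_on - timedelta(days=lead_days)

# Hypothetical six-month certificate issued 2022-01-01, expiring 2022-07-01.
print(renewal_date(date(2022, 7, 1)))  # 2022-05-17
```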
+> [!IMPORTANT]
+> Because Azure fully manages the certificates on your behalf, any aspect of the managed certificate, including the root issuer, can be changed at any time. These changes are outside of your control. You should avoid having a hard dependency on, or practicing certificate "pinning" to, the managed certificate or any part of its certificate hierarchy. If you need the certificate pinning behavior, add a certificate to your custom domain using any other available method in this article.
+ The free certificate comes with the following limitations: - Does not support wildcard certificates.
app-service How To Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/how-to-migrate.md
Title: Use the migration feature to migrate App Service Environment v2 to App Se
description: Learn how to migrate your App Service Environment v2 to App Service Environment v3 using the migration feature Previously updated : 4/5/2022 Last updated : 4/11/2022 zone_pivot_groups: app-service-cli-portal
ASE_ID=$(az appservice ase show --name $ASE_NAME --resource-group $ASE_RG --quer
## 2. Validate migration is supported
-The following command will check whether your App Service Environment is supported for migration. If you receive an error or if your App Service Environment is in an unhealthy or suspended state, you can't migrate at this time. If your environment [won't be supported for migration](migrate.md#supported-scenarios) or you want to migrate to App Service Environment v3 without using the migration feature, see the [manual migration options](migration-alternatives.md).
+The following command will check whether your App Service Environment is supported for migration. If you receive an error or if your App Service Environment is in an unhealthy or suspended state, you can't migrate at this time. See the [troubleshooting](migrate.md#troubleshooting) section for descriptions of the potential error messages you may get. If your environment [won't be supported for migration](migrate.md#supported-scenarios) or you want to migrate to App Service Environment v3 without using the migration feature, see the [manual migration options](migration-alternatives.md).
```azurecli
az rest --method post --uri "${ASE_ID}/migrate?api-version=2021-02-01&phase=validation"
```
az appservice ase show --name $ASE_NAME --resource-group $ASE_RG
## 1. Validate migration is supported
-From the [Azure portal](https://portal.azure.com), navigate to the **Overview** page for the App Service Environment you'll be migrating. The platform will validate if migration is supported for your App Service Environment. Wait a couple seconds after the page loads for this validation to take place.
+From the [Azure portal](https://portal.azure.com), navigate to the **Overview** page for the App Service Environment you'll be migrating. The platform will validate if migration is supported for your App Service Environment. Wait a couple seconds after the page loads for this validation to take place. See the [troubleshooting](migrate.md#troubleshooting) section for descriptions of the potential error messages if migration for your environment isn't supported by the migration feature.
If migration is supported for your App Service Environment, there are three ways to access the migration feature. These methods include a banner at the top of the Overview page, a new item in the left-hand side menu called **Migration**, and an info box on the **Configuration** page. Select any of these methods to move on to the next step in the migration process.
app-service Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/environment/migrate.md
Title: Migrate to App Service Environment v3 by using the migration feature
description: Overview of the migration feature for migration to App Service Environment v3 Previously updated : 4/5/2022 Last updated : 4/11/2022
The following scenarios aren't supported in this version of the feature:
- [Zone pinned](zone-redundancy.md) App Service Environment v2 - App Service Environment in a region not listed in the supported regions
-The migration feature doesn't plan on supporting App Service Environment v1 within a classic VNet. See the [manual migration options](migration-alternatives.md) if your App Service Environment falls into this category.
+There are no plans for the migration feature to support App Service Environment v1 within a Classic VNet. See the [manual migration options](migration-alternatives.md) if your App Service Environment falls into this category.
The App Service platform will review your App Service Environment to confirm migration support. If your scenario doesn't pass all validation checks, you won't be able to migrate at this time using the migration feature. If your environment is in an unhealthy or suspended state, you won't be able to migrate until you make the needed updates.
+### Troubleshooting
+
+If your App Service Environment doesn't pass the validation checks or you try to perform a migration step in the incorrect order, you may see one of the following error messages:
+
+|Error message |Description |
+|||
+|Migrate can only be called on an ASE in ARM VNET and this ASE is in Classic VNET. |App Service Environments in Classic VNets can't migrate using the migration feature. |
+|ASEv3 Migration is not yet ready. |The underlying infrastructure isn't ready to support App Service Environment v3. |
+|Migration cannot be called on this ASE, please contact support for help migrating. |Support will need to be engaged for migrating this App Service Environment. This is potentially due to custom settings used by this environment. |
+|Migrate cannot be called on Zone Pinned ASEs. |App Service Environment v2s that are zone pinned can't be migrated using the migration feature at this time. |
+|Migrate cannot be called if IP SSL is enabled on any of the sites|App Service Environments that have sites with IP SSL enabled can't be migrated using the migration feature at this time. |
+|Migrate is not available for this kind|App Service Environment v1 can't be migrated using the migration feature at this time. |
+|Full migration cannot be called before IP addresses are generated|You'll see this error if you attempt to migrate before finishing the pre-migration steps. |
+|Migration to ASEv3 is not allowed for this ASE|You won't be able to migrate using the migration feature. |
+|Subscription has too many App Service Environments. Please remove some before trying to create more.|The App Service Environment [quota for your subscription](/azure/azure-resource-manager/management/azure-subscription-service-limits#app-service-limits) has been met. You'll need to remove unneeded environments or contact support to review your options.|
+|`<ZoneRedundant><DedicatedHosts><ASEv3/ASE>` is not available in this location|You'll see this error if you're trying to migrate an App Service Environment in a region that doesn't support one of your requested features. |
+ ## Overview of the migration process using the migration feature Migration consists of a series of steps that must be followed in order. Key points are given for a subset of the steps. It's important to understand what will happen during these steps and how your environment and apps will be impacted. After reviewing the following information and when you're ready to migrate, follow the [step-by-step guide](how-to-migrate.md).
app-service Tutorial Troubleshoot Monitor https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/app-service/tutorial-troubleshoot-monitor.md
Browse to `http://<app-name>.azurewebsites.net`.
The sample app, ImageConverter, converts included images from `JPG` to `PNG`. A bug has been deliberately placed in the code for this tutorial. If you select enough images, the app produces an HTTP 500 error during image conversion. Imagine this scenario wasn't considered during the development phase. You'll use Azure Monitor to troubleshoot the error.
-### Verify the app is works
+### Verify the app works
To convert images, click `Tools` and select `Convert to PNG`.
What you learned:
* [Query logs with Azure Monitor](../azure-monitor/logs/log-query-overview.md) * [Troubleshooting Azure App Service in Visual Studio](troubleshoot-dotnet-visual-studio.md) * [Analyze app Logs in HDInsight](https://gallery.technet.microsoft.com/scriptcenter/Analyses-Windows-Azure-web-0b27d413)
-* [Tutorial: Run a load test to identify performance bottlenecks in a web app](../load-testing/tutorial-identify-bottlenecks-azure-portal.md)
+* [Tutorial: Run a load test to identify performance bottlenecks in a web app](../load-testing/tutorial-identify-bottlenecks-azure-portal.md)
application-gateway Add Http Header Rewrite Rule Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/add-http-header-rewrite-rule-powershell.md
Title: Rewrite HTTP headers in Azure Application Gateway description: This article provides information on how to rewrite HTTP headers in Azure Application Gateway by using Azure PowerShell -+ Last updated 04/12/2019-+ # Rewrite HTTP request and response headers with Azure Application Gateway - Azure PowerShell
application-gateway Application Gateway Autoscaling Zone Redundant https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-autoscaling-zone-redundant.md
Title: Scaling and Zone-redundant Application Gateway v2 description: This article introduces the Azure Application Standard_v2 and WAF_v2 SKU Autoscaling and Zone-redundant features. -+ Last updated 03/01/2022-+
application-gateway Application Gateway Backend Health Troubleshooting https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-backend-health-troubleshooting.md
Title: Troubleshoot backend health issues in Azure Application Gateway description: Describes how to troubleshoot backend health issues for Azure Application Gateway -+ Last updated 03/17/2022-+
application-gateway Application Gateway Configure Ssl Policy Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-configure-ssl-policy-powershell.md
Title: Configure TLS policy using PowerShell
description: This article provides instructions to configure TLS Policy on Azure Application Gateway -+ Last updated 11/14/2019-+
application-gateway Application Gateway Create Probe Classic Ps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-create-probe-classic-ps.md
Title: Create a custom probe using the Classic deployment model - Azure Application Gateway description: Learn how to create a custom probe for Application Gateway by using PowerShell in the classic deployment model -+ Last updated 11/13/2019-+ # Create a custom probe for Azure Application Gateway (classic) by using PowerShell
application-gateway Application Gateway Create Probe Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-create-probe-portal.md
Title: Create a custom probe using the portal
description: Learn how to create a custom probe for Application Gateway by using the portal -+ Last updated 07/09/2020-+ # Create a custom probe for Application Gateway by using the portal
application-gateway Application Gateway Create Probe Ps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-create-probe-ps.md
Title: Create a custom probe using PowerShell
description: Learn how to create a custom probe for Application Gateway by using PowerShell in Resource Manager -+ Last updated 07/09/2020-+
application-gateway Application Gateway Diagnostics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-diagnostics.md
Title: Back-end health and diagnostic logs
description: Learn how to enable and manage access logs and performance logs for Azure Application Gateway -+ Last updated 02/25/2022-+
application-gateway Application Gateway End To End Ssl Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-end-to-end-ssl-powershell.md
Title: Configure end-to-end TLS with Azure Application Gateway description: This article describes how to configure end-to-end TLS with Azure Application Gateway by using PowerShell -+ Last updated 06/09/2020-+
application-gateway Application Gateway Ilb Arm https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-ilb-arm.md
Title: Use with Internal Load Balancer - Azure Application Gateway description: This article provides instructions to create, configure, start, and delete an Azure application gateway with internal load balancer (ILB) -+ Last updated 01/11/2022-+
application-gateway Application Gateway Probe Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-probe-overview.md
Title: Health monitoring overview for Azure Application Gateway description: Azure Application Gateway monitors the health of all resources in its back-end pool and automatically removes any resource considered unhealthy from the pool. -+ Last updated 07/09/2020-+
application-gateway Application Gateway Troubleshooting 502 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-troubleshooting-502.md
Title: Troubleshoot Bad Gateway errors - Azure Application Gateway description: 'Learn how to troubleshoot Application Gateway Server Error: 502 - Web server received an invalid response while acting as a gateway or proxy server.' -+ Last updated 11/16/2019
application-gateway Application Gateway Websocket https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/application-gateway-websocket.md
Title: WebSocket support in Azure Application Gateway description: Application Gateway provides native support for WebSocket across all gateway sizes. There are no user-configurable settings.-+
application-gateway Certificates For Backend Authentication https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/certificates-for-backend-authentication.md
Title: Certificates required to allow backend servers
description: This article provides examples of how a TLS/SSL certificate can be converted to authentication certificate and trusted root certificate that are required to allow backend instances in Azure Application Gateway -+ Last updated 07/30/2021-+ # Create certificates to allow the backend with Azure Application Gateway
application-gateway Cli Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/cli-samples.md
Title: Azure CLI examples for Azure Application Gateway description: This article has links to Azure CLI examples so you can quickly deploy Azure Application Gateway configured in various ways. -+ Last updated 11/16/2019-+
application-gateway Configuration Front End Ip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configuration-front-end-ip.md
Title: Azure Application Gateway front-end IP address configuration description: This article describes how to configure the Azure Application Gateway front-end IP address. -+ Last updated 09/09/2020
application-gateway Configuration Http Settings https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configuration-http-settings.md
Title: Azure Application Gateway HTTP settings configuration description: This article describes how to configure Azure Application Gateway HTTP settings. -+ Last updated 02/17/2022-+ # Application Gateway HTTP settings configuration
application-gateway Configuration Infrastructure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configuration-infrastructure.md
Title: Azure Application Gateway infrastructure configuration description: This article describes how to configure the Azure Application Gateway infrastructure. -+ Last updated 06/14/2021
application-gateway Configuration Listeners https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configuration-listeners.md
Title: Azure Application Gateway listener configuration description: This article describes how to configure Azure Application Gateway listeners. -+ Last updated 09/09/2020
application-gateway Configuration Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configuration-overview.md
Title: Azure Application Gateway configuration overview description: This article describes how to configure the components of Azure Application Gateway -+ Last updated 09/09/2020
application-gateway Configuration Request Routing Rules https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configuration-request-routing-rules.md
Title: Azure Application Gateway request routing rules configuration description: This article describes how to configure the Azure Application Gateway request routing rules. -+ Last updated 09/09/2020
application-gateway Configure Alerts With Templates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configure-alerts-with-templates.md
Title: Configure Azure Monitor alerts for Application Gateway description: Learn how to use ARM templates to configure Azure Monitor alerts for Application Gateway -+ Last updated 03/03/2022
application-gateway Configure Application Gateway With Private Frontend Ip https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configure-application-gateway-with-private-frontend-ip.md
Last updated 01/11/2022-+ # Configure an application gateway with an internal load balancer (ILB) endpoint
application-gateway Configure Keyvault Ps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/configure-keyvault-ps.md
Title: Configure TLS termination with Key Vault certificates - PowerShell
description: Learn how to use an Azure PowerShell script to integrate your key vault with your application gateway for TLS/SSL termination certificates. -+ Last updated 05/26/2020-+
application-gateway Create Multiple Sites Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/create-multiple-sites-portal.md
Title: 'Tutorial: Host multiple web sites using the Azure portal'
description: In this tutorial, you learn how to create an application gateway that hosts multiple web sites using the Azure portal. -+ Last updated 03/19/2021-+ #Customer intent: As an IT administrator, I want to use the Azure portal to set up an application gateway so I can host multiple sites.
application-gateway Create Ssl Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/create-ssl-portal.md
Title: 'Tutorial: Configure TLS termination in portal - Azure Application Gateway' description: In this tutorial, you learn how to configure an application gateway and add a certificate for TLS termination using the Azure portal. -+ Last updated 01/28/2021-+ #Customer intent: As an IT administrator, I want to use the Azure portal to configure Application Gateway with TLS termination so I can secure my application traffic.
application-gateway Create Url Route Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/create-url-route-portal.md
Title: 'Tutorial: URL path-based routing rules using portal - Azure Application Gateway' description: In this tutorial, you learn how to create URL path-based routing rules for an application gateway and virtual machine scale set using the Azure portal. -+ Last updated 02/23/2021-+ #Customer intent: As an IT administrator, I want to use the Azure portal to set up an application gateway so I can route my app traffic based on path-based routing rules.
application-gateway Custom Error https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/custom-error.md
Title: Create Azure Application Gateway custom error pages description: This article shows you how to create Application Gateway custom error pages. You can use your own branding and layout using a custom error page. -+ Last updated 04/04/2022-+
application-gateway End To End Ssl Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/end-to-end-ssl-portal.md
Title: Configure end-to-end TLS encryption using the portal
description: Learn how to use the Azure portal to create an application gateway with end-to-end TLS encryption. -+ Last updated 11/14/2019-+
application-gateway Features https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/features.md
Title: Azure Application Gateway features description: Learn about Azure Application Gateway features -+ Last updated 01/18/2022-+ # Azure Application Gateway features
application-gateway How Application Gateway Works https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/how-application-gateway-works.md
Title: How an application gateway works description: This article provides information about how an application gateway accepts incoming requests and routes them to the backend. -+ Last updated 11/16/2019-+ # How an application gateway works
application-gateway How To Troubleshoot Application Gateway Session Affinity Issues https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/how-to-troubleshoot-application-gateway-session-affinity-issues.md
Title: Troubleshoot session affinity issues
description: This article provides information on how to troubleshoot session affinity issues in Azure Application Gateway -+ Last updated 01/24/2022-+ # Troubleshoot Azure Application Gateway session affinity issues
application-gateway Key Vault Certs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/key-vault-certs.md
Title: TLS termination with Azure Key Vault certificates description: Learn how you can integrate Azure Application Gateway with Key Vault for server certificates that are attached to HTTPS-enabled listeners. -+ Last updated 03/04/2022-+ # TLS termination with Key Vault certificates
application-gateway Log Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/log-analytics.md
Title: Examine WAF logs using Azure Log Analytics
description: This article shows you how you can use Azure Log Analytics to examine Application Gateway Web Application Firewall (WAF) logs. -+ Last updated 11/14/2019-+ # Use Log Analytics to examine Application Gateway Web Application Firewall (WAF) Logs
application-gateway Migrate V1 V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/migrate-v1-v2.md
Title: Migrate from v1 to v2 - Azure Application Gateway description: This article shows you how to migrate Azure Application Gateway and Web Application Firewall from v1 to v2 -+ Last updated 03/31/2020-+ # Migrate Azure Application Gateway and Web Application Firewall from v1 to v2
application-gateway Monitor Application Gateway Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/monitor-application-gateway-reference.md
Title: Monitoring Azure Application Gateway data reference description: Important reference material needed when you monitor Application Gateway --++
application-gateway Monitor Application Gateway https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/monitor-application-gateway.md
Title: Monitoring Azure Application Gateway description: Start here to learn how to monitor Azure Application Gateway --++ Last updated 06/10/2021
application-gateway Multiple Site Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/multiple-site-overview.md
Title: Hosting multiple sites on Azure Application Gateway description: This article provides an overview of the Azure Application Gateway multi-site support. -+ Last updated 08/31/2021
application-gateway Overview V2 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/overview-v2.md
Title: What is Azure Application Gateway v2? description: Learn about Azure Application Gateway v2 features -+ Last updated 02/07/2022-+
application-gateway Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/overview.md
Title: What is Azure Application Gateway description: Learn how you can use an Azure application gateway to manage web traffic to your application. -+ Last updated 08/26/2020-+ #Customer intent: As an IT administrator, I want to learn about Azure Application Gateways and what I can use them for.
application-gateway Powershell Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/powershell-samples.md
Title: Azure PowerShell examples for Azure Application Gateway description: This article has links to Azure PowerShell examples so you can quickly deploy Azure Application Gateway configured in various ways. -+ Last updated 11/16/2019-+ # Azure PowerShell examples for Azure Application Gateway (AG)
application-gateway Quick Create Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/quick-create-cli.md
Title: 'Quickstart: Direct web traffic using CLI'
description: In this quickstart, you learn how to use the Azure CLI to create an Azure Application Gateway that directs web traffic to virtual machines in a backend pool. -+ Last updated 06/14/2021-+
application-gateway Quick Create Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/quick-create-portal.md
Title: 'Quickstart: Direct web traffic using the portal'
description: In this quickstart, you learn how to use the Azure portal to create an Azure Application Gateway that directs web traffic to virtual machines in a backend pool. --++ Last updated 06/14/2021
application-gateway Quick Create Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/quick-create-powershell.md
Title: 'Quickstart: Direct web traffic using PowerShell'
description: In this quickstart, you learn how to use Azure PowerShell to create an Azure Application Gateway that directs web traffic to virtual machines in a backend pool. --++ Last updated 06/14/2021
application-gateway Quick Create Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/quick-create-template.md
Title: 'Quickstart: Direct web traffic using a Resource Manager template'
description: In this quickstart, you learn how to use a Resource Manager template to create an Azure Application Gateway that directs web traffic to virtual machines in a backend pool. --++ Last updated 06/14/2021
application-gateway Redirect External Site Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-external-site-cli.md
Title: External traffic redirection using CLI - Azure Application Gateway description: Learn how to create an application gateway that redirects external web traffic to the appropriate pool using the Azure CLI. -+ Last updated 09/24/2020-+ # Create an application gateway with external redirection using the Azure CLI
application-gateway Redirect External Site Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-external-site-powershell.md
Title: External redirection using PowerShell
description: Learn how to create an application gateway that redirects web traffic to an external site using Azure PowerShell. -+ Last updated 09/24/2020-+
application-gateway Redirect Http To Https Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-http-to-https-cli.md
Title: HTTP to HTTPS redirection using CLI
description: Learn how to create an HTTP to HTTPS redirection and add a certificate for TLS termination using the Azure CLI. -+ Last updated 09/24/2020-+ # Create an application gateway with HTTP to HTTPS redirection using the Azure CLI
application-gateway Redirect Http To Https Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-http-to-https-portal.md
Title: HTTP to HTTPS redirection in portal - Azure Application Gateway description: Learn how to create an application gateway with redirected traffic from HTTP to HTTPS using the Azure portal. -+ Last updated 11/13/2019-+
application-gateway Redirect Http To Https Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-http-to-https-powershell.md
Title: HTTP to HTTPS redirection using PowerShell - Azure Application Gateway description: Learn how to create an application gateway with redirected traffic from HTTP to HTTPS using Azure PowerShell. -+ Last updated 09/28/2020-+
application-gateway Redirect Internal Site Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-internal-site-cli.md
Title: Internal redirection using CLI
description: Learn how to create an application gateway that redirects internal web traffic to the appropriate pool using the Azure CLI. -+ Last updated 11/14/2019-+ # Create an application gateway with internal redirection using the Azure CLI
application-gateway Redirect Internal Site Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/redirect-internal-site-powershell.md
Title: Internal redirection using PowerShell
description: Learn how to create an application gateway that redirects internal web traffic to the appropriate backend pool of servers using Azure PowerShell. -+ Last updated 09/28/2020-+
application-gateway Renew Certificates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/renew-certificates.md
Title: Renew an Azure Application Gateway certificate description: Learn how to renew a certificate associated with an application gateway listener. -+ Last updated 01/25/2022-+ ms.devlang: azurecli
application-gateway Resource Health Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/resource-health-overview.md
Title: Azure Application Gateway Resource Health overview description: This article is an overview of the resource health feature for Azure Application Gateway -+ Last updated 7/9/2019-+ # Azure Application Gateway Resource Health overview
application-gateway Resource Manager Template Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/resource-manager-template-samples.md
Title: Azure Resource Manager templates
description: This article has links to Azure Resource Manager template examples so you can quickly deploy Azure Application Gateway configured in various ways. -+ Last updated 11/16/2019-+ # Azure Resource Manager templates for Azure Application Gateway
application-gateway Rewrite Http Headers Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/rewrite-http-headers-portal.md
Title: Rewrite HTTP request and response headers in portal - Azure Application Gateway description: Learn how to use the Azure portal to configure an Azure Application Gateway to rewrite the HTTP headers in the requests and responses passing through the gateway -+ Last updated 11/13/2019-+ # Rewrite HTTP request and response headers with Azure Application Gateway - Azure portal
application-gateway Create Vmss Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/scripts/create-vmss-cli.md
Title: Azure CLI Script Sample - Manage web traffic | Microsoft Docs
description: Azure CLI Script Sample - Manage web traffic with an application gateway and a virtual machine scale set. documentationcenter: networking-+ tags: azure-resource-manager
vm-windows Last updated 01/29/2018-+
application-gateway Create Vmss Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/scripts/create-vmss-powershell.md
Title: Azure PowerShell Script Sample - Manage web traffic | Microsoft Docs
description: Azure PowerShell Script Sample - Manage web traffic with an application gateway and a virtual machine scale set. documentationcenter: networking-+ tags: azure-resource-manager
vm-windows Last updated 01/29/2018-+
application-gateway Create Vmss Waf Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/scripts/create-vmss-waf-cli.md
Title: Azure CLI Script Sample - Restrict web traffic | Microsoft Docs
description: Azure CLI Script Sample - Create an application gateway with a web application firewall and a virtual machine scale set that uses OWASP rules to restrict traffic. documentationcenter: networking-+ tags: azure-resource-manager
vm-windows Last updated 01/29/2018-+
application-gateway Create Vmss Waf Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/scripts/create-vmss-waf-powershell.md
Title: Azure PowerShell Script Sample - Restrict web traffic | Microsoft Docs
description: Azure PowerShell Script Sample - Create an application gateway with a web application firewall and a virtual machine scale set that uses OWASP rules to restrict traffic. documentationcenter: networking-+ tags: azure-resource-manager
vm-windows Last updated 01/29/2018-+
application-gateway Waf Custom Rules Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/scripts/waf-custom-rules-powershell.md
Title: Azure PowerShell Script Sample - Create WAF custom rules description: Azure PowerShell Script Sample - Create Web Application Firewall custom rules-+ Last updated 6/7/2019-+
application-gateway Self Signed Certificates https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/self-signed-certificates.md
Title: Generate self-signed certificate with a custom root CA
description: Learn how to generate an Azure Application Gateway self-signed certificate with a custom root CA -+ Last updated 07/23/2019-+
application-gateway Ssl Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/ssl-overview.md
Last updated 06/03/2021-+ # Overview of TLS termination and end to end TLS with Application Gateway
application-gateway Tutorial Autoscale Ps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-autoscale-ps.md
Title: 'Tutorial: Improve web application access - Azure Application Gateway' description: In this tutorial, learn how to create an autoscaling, zone-redundant application gateway with a reserved IP address using Azure PowerShell. -+ Last updated 03/08/2021-+ #Customer intent: As an IT administrator new to Application Gateway, I want to configure the service in a way that automatically scales based on customer demand and is highly available across availability zones to ensure my customers can access their web applications when they need them.
application-gateway Tutorial Http Header Rewrite Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-http-header-rewrite-powershell.md
Title: Create an Azure Application Gateway & rewrite HTTP headers description: This article provides information on how to create an Azure Application Gateway and rewrite HTTP headers using Azure PowerShell -+ Last updated 11/19/2019-+
application-gateway Tutorial Manage Web Traffic Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-manage-web-traffic-cli.md
Title: Manage web traffic - Azure CLI description: Learn how to create an application gateway with a virtual machine scale set to manage web traffic using the Azure CLI. -+ Last updated 07/20/2019-+
application-gateway Tutorial Manage Web Traffic Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-manage-web-traffic-powershell.md
Title: Manage web traffic - Azure PowerShell description: Learn how to create an application gateway with a virtual machine scale set to manage web traffic using Azure PowerShell. -+ Last updated 07/19/2019-+ # Manage web traffic with an application gateway using Azure PowerShell
application-gateway Tutorial Multiple Sites Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-multiple-sites-cli.md
Title: Host multiple web sites using CLI
description: Learn how to create an application gateway that hosts multiple web sites using the Azure CLI. -+ Last updated 11/13/2019-+ #Customer intent: As an IT administrator, I want to use Azure CLI to configure Application Gateway to host multiple web sites, so I can ensure my customers can access the web information they need.
application-gateway Tutorial Multiple Sites Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-multiple-sites-powershell.md
Title: Host multiple web sites using PowerShell
description: Learn how to create an application gateway that hosts multiple web sites using Azure PowerShell. -+ Last updated 07/20/2020-+ #Customer intent: As an IT administrator, I want to use Azure PowerShell to configure Application Gateway to host multiple web sites, so I can ensure my customers can access the web information they need.
application-gateway Tutorial Ssl Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-ssl-cli.md
Title: TLS termination using CLI - Azure Application Gateway description: Learn how to create an application gateway and add a certificate for TLS termination using the Azure CLI. -+ Last updated 11/14/2019-+
application-gateway Tutorial Ssl Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-ssl-powershell.md
Title: TLS termination using PowerShell
description: Learn how to create an application gateway and add a certificate for TLS termination using Azure PowerShell. -+ Last updated 11/14/2019-+
application-gateway Tutorial Url Redirect Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-url-redirect-cli.md
Title: 'Tutorial: URL path-based redirection using CLI'
description: In this tutorial, you learn how to create an application gateway with URL path-based redirected traffic using the Azure CLI. -+ Last updated 03/05/2021-+ #Customer intent: As an IT administrator, I want to use Azure CLI to set up URL path redirection of web traffic to specific pools of servers so I can ensure my customers have access to the information they need.
application-gateway Tutorial Url Redirect Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-url-redirect-powershell.md
Title: URL path-based redirection using PowerShell - Azure Application Gateway description: Learn how to create an application gateway with URL path-based redirected traffic using Azure PowerShell. -+ Last updated 03/24/2021-+ #Customer intent: As an IT administrator, I want to use Azure PowerShell to set up URL path redirection of web traffic to specific pools of servers so I can ensure my customers have access to the information they need.
application-gateway Tutorial Url Route Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-url-route-cli.md
Title: Route web traffic based on the URL - Azure CLI description: In this article, learn how to route web traffic based on the URL to specific scalable pools of servers using the Azure CLI. -+ Last updated 08/01/2019-+ #Customer intent: As an IT administrator, I want to use Azure CLI to set up routing of web traffic to specific pools of servers based on the URL that the customer uses, so I can ensure my customers have the most efficient route to the information they need.
application-gateway Tutorial Url Route Powershell https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/tutorial-url-route-powershell.md
Title: Route web traffic based on the URL - Azure PowerShell description: Learn how to route web traffic based on the URL to specific scalable pools of servers using Azure PowerShell. -+ Last updated 07/31/2019-+ #Customer intent: As an IT administrator, I want to use PowerShell to set up routing of web traffic to specific pools of servers based on the URL that the customer uses, so I can ensure my customers have the most efficient route to the information they need.
application-gateway Url Route Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/application-gateway/url-route-overview.md
Title: Azure Application Gateway URL-based content routing overview description: This article provides an overview of the Azure Application Gateway URL-based content routing, UrlPathMap configuration and PathBasedRouting rule. -+ Last updated 01/14/2022-+
applied-ai-services Concept Read https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/concept-read.md
# Form Recognizer read model
-The Form Recognizer v3.0 preview includes the new Read API. Read extracts text lines, words, their locations, detected languages, and handwritten style if detected from documents (PDF and TIFF) and images (JPG, PNG, and BMP).
+The prebuilt-read model extracts printed and handwritten textual elements including lines, words, locations, and detected languages from documents (PDF and TIFF) and images (JPG, PNG, and BMP). The read model is the foundation for all Form Recognizer models. Layout, general document, custom, and prebuilt models use the prebuilt-read model as a basis for extracting text from documents.
## Development options
applied-ai-services Try V3 Python Sdk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/applied-ai-services/form-recognizer/quickstarts/try-v3-python-sdk.md
recommendations: false
[Reference documentation](/python/api/azure-ai-formrecognizer/azure.ai.formrecognizer?view=azure-python-preview&preserve-view=true) | [Library source code](https://github.com/Azure/azure-sdk-for-python/tree/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/) | [Package (PyPi)](https://pypi.org/project/azure-ai-formrecognizer/3.2.0b3/) | [Samples](https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-formrecognizer_3.2.0b3/sdk/formrecognizer/azure-ai-formrecognizer/samples/README.md)
- Get started with Azure Form Recognizer using the Python programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDks into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
+Get started with Azure Form Recognizer using the Python programming language. Azure Form Recognizer is a cloud-based Azure Applied AI Service that uses machine learning to extract key-value pairs, text, and tables from your documents. You can easily call Form Recognizer models by integrating our client library SDKs into your workflows and applications. We recommend that you use the free service when you're learning the technology. Remember that the number of free pages is limited to 500 per month.
To learn more about Form Recognizer features and development options, visit our [Overview](../overview.md#form-recognizer-features-and-development-options) page.
automanage Automanage Windows Server Services Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automanage/automanage-windows-server-services-overview.md
Azure Extended Network enables you to stretch an on-premises subnet into Azure t
It's important to consider up front which Automanage for Windows Server capabilities you would like to use, and then choose a corresponding VM image that supports all of those capabilities. Some of the _Windows Server Azure Edition_ images support only a subset of capabilities; see the table below for more details.
+> [!NOTE]
+> If you would like to preview the upcoming version of **Windows Server Azure Edition**, see [Windows Server VNext Datacenter: Azure Edition](windows-server-azure-edition-vnext.md).
+ ### Deciding which image to use |Image|Capabilities|
automanage Virtual Machines Custom Profile https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automanage/virtual-machines-custom-profile.md
Azure Automanage for machine best practices has default best practice profiles that cannot be edited. However, if you need more flexibility, you can pick and choose the set of services and settings by creating a custom profile.
-We support toggling services ON and OFF. We also currently support customizing settings on [Azure Backup](..\backup\backup-azure-arm-vms-prepare.md#create-a-custom-policy) and [Microsoft Antimalware](../security/fundamentals/antimalware.md#default-and-custom-antimalware-configuration). Also, for Windows machines only, you can modify the audit modes for the [Azure security baselines in Guest Configuration](../governance/policy/concepts/guest-configuration.md). Check out the [ARM template](#create-a-custom-profile-using-azure-resource-manager-templates) for modifying the **azureSecurityBaselineAssignmentType**.
+Automanage supports toggling services ON and OFF. It also currently supports customizing settings on [Azure Backup](../backup/backup-azure-arm-vms-prepare.md#create-a-custom-policy) and [Microsoft Antimalware](../security/fundamentals/antimalware.md#default-and-custom-antimalware-configuration). You can also specify an existing Log Analytics workspace. For Windows machines only, you can additionally modify the audit modes for the [Azure security baselines in Guest Configuration](../governance/policy/concepts/guest-configuration.md).
+Automanage allows you to tag the following resources in the custom profile:
+* Resource Group
+* Automation Account
+* Log Analytics Workspace
+* Recovery Vault
+Check out the [ARM template](#create-a-custom-profile-using-azure-resource-manager-templates) for modifying these settings.
## Create a custom profile in the Azure portal
Sign in to the [Azure portal](https://portal.azure.com/).
## Create a custom profile using Azure Resource Manager Templates The following ARM template will create an Automanage custom profile. Details on the ARM template and steps on how to deploy are located in the ARM template deployment [section](#arm-template-deployment).+
+> [!NOTE]
+> If you want to use a specific Log Analytics workspace, specify the ID of the workspace like this: "/subscriptions/**subscriptionId**/resourceGroups/**resourceGroupName**/providers/Microsoft.OperationalInsights/workspaces/**workspaceName**"
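For illustration only, the expected workspace ID can be assembled from its parts. The helper below is hypothetical (it is not part of any Azure SDK) and simply mirrors the path format shown in the note above:

```python
def workspace_resource_id(subscription_id, resource_group, workspace_name):
    # Hypothetical helper: assembles the full ARM resource ID for a
    # Log Analytics workspace in the format the custom profile expects.
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.OperationalInsights"
        f"/workspaces/{workspace_name}"
    )

print(workspace_resource_id(
    "00000000-0000-0000-0000-000000000000", "myResourceGroup", "myWorkspace"))
```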
+ ```json { "$schema": "http://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json",
The following ARM template will create an Automanage custom profile. Details on
"ApplyAndMonitor", "Audit" ]
- }
+ },
+ "logAnalyticsWorkspace": {
+ "type": "String"
+ },
+ "LogAnalyticsBehavior": {
+ "defaultValue": false,
+ "type": "Bool"
+ }
}, "resources": [ {
The following ARM template will create an Automanage custom profile. Details on
"BootDiagnostics/Enable": true, "ChangeTrackingAndInventory/Enable": true, "LogAnalytics/Enable": true,
+ "LogAnalytics/Reprovision": "[parameters('LogAnalyticsBehavior')]",
+ "LogAnalytics/Workspace": "[parameters('logAnalyticsWorkspace')]",
"UpdateManagement/Enable": true,
- "VMInsights/Enable": true
+ "VMInsights/Enable": true,
+ "Tags/ResourceGroup": {
+ "foo": "rg"
+ },
+ "Tags/AzureAutomation": {
+ "foo": "automationAccount"
+ },
+ "Tags/LogAnalyticsWorkspace": {
+ "foo": "workspace"
+ },
+ "Tags/RecoveryVault": {
+ "foo": "recoveryVault"
+ }
} } }
The `azureSecurityBaselineAssignmentType` is the audit mode that you can choose
* ApplyAndMonitor: This will apply the Azure security baseline through the Guest Configuration extension when you first assign this profile to each machine. After it is applied, the Guest Configuration service will monitor the server baseline and report any drift from the desired state. However, it will not auto-remediate.
* Audit: This will install the Azure security baseline using the Guest Configuration extension. You will be able to see where your machine is out of compliance with the baseline, but noncompliance won't be automatically remediated.
+You can also specify an existing Log Analytics workspace by adding these settings to the configuration section of the properties below:
+* "LogAnalytics/Workspace": "/subscriptions/**subscriptionId**/resourceGroups/**resourceGroupName**/providers/Microsoft.OperationalInsights/workspaces/**workspaceName**"
+* "LogAnalytics/Reprovision": false
+Specify your existing workspace in the `LogAnalytics/Workspace` line. Set the `LogAnalytics/Reprovision` setting to true if you would like this Log Analytics workspace to be used in all cases; any machine with this custom profile will then use this workspace, even if it is already connected to one. By default, `LogAnalytics/Reprovision` is set to false: if your machine is already connected to a workspace, that workspace will continue to be used, and if it is not connected to a workspace, the workspace specified in `LogAnalytics/Workspace` will be used.
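The selection behavior described above can be sketched in a few lines. This is a hypothetical illustration of the `LogAnalytics/Reprovision` semantics, not Automanage code:

```python
def select_workspace(current_workspace, profile_workspace, reprovision=False):
    # With reprovision=True the profile's workspace always wins;
    # otherwise an existing workspace attachment is kept.
    if reprovision:
        return profile_workspace
    return current_workspace if current_workspace else profile_workspace

profile_ws = "profile-workspace"

print(select_workspace(None, profile_ws))           # not yet attached -> profile workspace
print(select_workspace("existing-ws", profile_ws))  # default: existing workspace kept
print(select_workspace("existing-ws", profile_ws, reprovision=True))  # profile workspace wins
```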
+
+Also, you can add tags to resources specified in the custom profile, as shown below:
+
+```json
+"Tags/ResourceGroup": {
+ "foo": "rg"
+},
+"Tags/ResourceGroup/Behavior": "Preserve",
+"Tags/AzureAutomation": {
+ "foo": "automationAccount"
+},
+"Tags/AzureAutomation/Behavior": "Replace",
+"Tags/LogAnalyticsWorkspace": {
+ "foo": "workspace"
+},
+"Tags/LogAnalyticsWorkspace/Behavior": "Replace",
+"Tags/RecoveryVault": {
+ "foo": "recoveryVault"
+},
+"Tags/RecoveryVault/Behavior": "Preserve"
+```
+The `Tags/.../Behavior` setting can be either Preserve or Replace. If the resource you are tagging already has the same tag key in the key/value pair, the *Replace* behavior overwrites that key's value with the value specified in the configuration profile. By default, the behavior is set to *Preserve*, meaning that the tag key already associated with that resource is kept and not overwritten by the key/value pair specified in the configuration profile.
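The Preserve/Replace semantics described above can be illustrated with a small sketch. This is a hypothetical helper (not Automanage code) showing how existing tags on a resource would be merged with the tags from a profile:

```python
def merge_tags(existing, profile_tags, behavior="Preserve"):
    # 'Replace' lets profile values overwrite existing keys;
    # 'Preserve' (the default) keeps existing values and only adds new keys.
    merged = dict(existing)
    for key, value in profile_tags.items():
        if behavior == "Replace" or key not in merged:
            merged[key] = value
    return merged

existing = {"foo": "oldValue", "env": "prod"}
profile = {"foo": "rg"}

print(merge_tags(existing, profile))                      # {'foo': 'oldValue', 'env': 'prod'}
print(merge_tags(existing, profile, behavior="Replace"))  # {'foo': 'rg', 'env': 'prod'}
```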
+ Follow these steps to deploy the ARM template:
1. Save this ARM template as `azuredeploy.json`.
1. Run the deployment with `az deployment group create --resource-group myResourceGroup --template-file azuredeploy.json`.
automanage Windows Server Azure Edition Vnext https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automanage/windows-server-azure-edition-vnext.md
+
+ Title: "Preview: Windows Server VNext Datacenter (Azure Edition) for Azure Automanage"
+description: Learn how to preview the upcoming version of Windows Server Azure Edition and know what to expect.
++++ Last updated : 04/02/2022+++
+# Preview: Windows Server VNext Datacenter (Azure Edition) for Azure Automanage
+
+> [!IMPORTANT]
+> **Windows Server VNext Datacenter Azure Edition** is currently in public preview.
+> This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities.
+> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
+
+[Windows Server: Azure Edition (WSAE)](https://aka.ms/wsae) is a new edition of Windows Server focused on innovation and efficiency. Featuring an annual release cadence and optimized to run on Azure properties, WSAE brings new functionality to Windows Server users faster than the traditional Long-Term Servicing Channel (LTSC) editions of Windows Server (2016, 2019, 2022, etc.). The first version of this new variant is Windows Server 2022 Datacenter: Azure Edition, announced at Microsoft Ignite in November 2021.
+
+The annual WSAE releases are delivered using Windows Update, rather than a full OS upgrade. As part of this annual release cadence, the WSAE Insider preview program will spin up each spring with the opportunity to access early builds of the next release - leading to general availability in the fall. Install the preview to get early access to all the new features and functionality prior to general availability. If you are a registered Microsoft Server Insider, you have access to create and use virtual machine images from this preview. For more information and to manage your Insider membership, visit the [Windows Insider home page](https://insider.windows.com/) or [Windows Insiders for Business home page.](https://insider.windows.com/for-business/)
+
+## What's in the preview?
+
+Much like the [Dev Channel of the insider program for Windows 11](https://insider.windows.com/understand-flighting), the WSAE Preview comes directly from an active, still-in-development codebase. As a result, you may encounter some features that are not yet complete, and in some cases, you may see features and functionality that will not appear in the Azure Edition annual release.
+Details regarding each preview will be shared in release announcements posted to the [Windows Server Insiders](https://techcommunity.microsoft.com/t5/windows-server-insiders/bd-p/WindowsServerInsiders) space on Microsoft Tech Community.
+
+## How do I get started?
+
+To create a Windows Server: Azure Edition preview virtual machine in Azure, visit the [Microsoft Server Operating Systems Preview](https://aka.ms/createWSAEpreview) offer in the Azure marketplace.
+
+## Next steps
+
+To learn more about the features and functionality in the Windows Server: Azure Edition preview, [view the most recent preview announcement](https://aka.ms/currentWSAEpreview).
automation Automation Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/automation/automation-services.md
Provides a serverless event-driven compute platform for automation that allows y
- You can write functions in a language of your choice, such as C#, Java, JavaScript, PowerShell, or Python, and focus on specific pieces of code. The Functions runtime is open source.
- You can choose the hosting plan according to your function app's scaling requirements, functionality, and required resources.
- - You can orchestrate complex workflows through [durable functions](/azure-functions/durable/durable-functions-overview?tabs=csharp).
+ - You can orchestrate complex workflows through [durable functions](/azure/azure-functions/durable/durable-functions-overview?tabs=csharp).
- You should avoid large, long-running functions that can cause unexpected timeout issues. [Learn more](/azure/azure-functions/functions-best-practices?tabs=csharp#write-robust-functions).
- When you write PowerShell scripts within Function Apps, you must tweak the scripts to define how the function behaves, such as how it's triggered and its input and output parameters. [Learn more](/azure/azure-functions/functions-reference-powershell?tabs=portal).
Provides a serverless event-driven compute platform for automation that allows y
- You can write functions in a language of your choice, such as C#, Java, JavaScript, PowerShell, or Python, and focus on specific pieces of code. The Functions runtime is open source.
- You can choose the hosting plan according to your function app's scaling requirements, functionality, and required resources.
- - You can orchestrate complex workflows through [durable functions](/azure-functions/durable/durable-functions-overview?tabs=csharp).
+ - You can orchestrate complex workflows through [durable functions](/azure/azure-functions/durable/durable-functions-overview?tabs=csharp).
- You should avoid large, long-running functions that can cause unexpected timeout issues. [Learn more](/azure/azure-functions/functions-best-practices?tabs=csharp#write-robust-functions).
- When you write PowerShell scripts within Function Apps, you must tweak the scripts to define how the function behaves, such as how it's triggered and its input and output parameters. [Learn more](/azure/azure-functions/functions-reference-powershell?tabs=portal).
azure-arc Manage Agent https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/manage-agent.md
# Managing and maintaining the Connected Machine agent
-After initial deployment of the Azure Connected Machine agent, you may need to reconfigure the agent, upgrade it, or remove it from the computer. You can easily manage these routine maintenance tasks manually or through automation, which reduces both operational error and expenses.
+After initial deployment of the Azure Connected Machine agent, you may need to reconfigure the agent, upgrade it, or remove it from the computer. These routine maintenance tasks can be done manually or through automation (which reduces both operational error and expenses).
## About the azcmagent tool
-The azcmagent tool is used to configure the Azure Connected Machine agent during installation, or modify the initial configuration of the agent after installation. azcmagent.exe provides command-line parameters to customize the agent and view its status:
-
-* **check** - To troubleshoot network connectivity issues
-
-* **connect** - To connect the machine to Azure Arc
-
-* **disconnect** - To disconnect the machine from Azure Arc
+The azcmagent tool is used to configure the Azure Connected Machine agent during installation, or modify the initial configuration of the agent after installation. azcmagent.exe provides the following command-line parameters to customize the agent and view its status:
+* **check** - Troubleshoot network connectivity issues.
+* **connect** - Connect the machine to Azure Arc.
+* **disconnect** - Disconnect the machine from Azure Arc.
* **show** - View agent status and its configuration properties (Resource Group name, Subscription ID, version, etc.), which can help when troubleshooting an issue with the agent. Include the `-j` parameter to output the results in JSON format.
* **config** - View and change settings to enable features and control agent behavior.
+* **logs** - Create a .zip file in the current directory containing logs to assist you while troubleshooting.
+* **version** - Show the Connected Machine agent version.
+* **-useStderr** - Direct error and verbose output to stderr. Include the `-json` parameter to output the results in JSON format.
+* **-h or --help** - Show available command-line parameters. For example, to see detailed help for the **Connect** parameter, type `azcmagent connect -h`.
+* **-v or --verbose** - Enable verbose logging.
-* **logs** - Creates a .zip file in the current directory containing logs to assist you while troubleshooting.
-
-* **version** - Shows the Connected Machine agent version.
-
-* **-useStderr** - Directs error and verbose output to stderr. Include the `-json` parameter to output the results in JSON format.
-
-* **-h or --help** - Shows available command-line parameters
-
- For example, to see detailed help for the **Connect** parameter, type `azcmagent connect -h`.
-
-* **-v or --verbose** - Enable verbose logging
-
-You can perform a **Connect** and **Disconnect** manually while logged on interactively, or automate using the same service principal you used to onboard multiple agents or with a Microsoft identity platform [access token](../../active-directory/develop/access-tokens.md). If you didn't use a service principal to register the machine with Azure Arc-enabled servers, see the following [article](onboard-service-principal.md#create-a-service-principal-for-onboarding-at-scale) to create a service principal.
+You can perform a **connect** and **disconnect** manually while logged on interactively, or with a Microsoft identity platform [access token](../../active-directory/develop/access-tokens.md), or by using the same service principal you used to onboard multiple agents. If you didn't use a service principal to register the machine with Azure Arc-enabled servers, you can [create a service principal now](onboard-service-principal.md#create-a-service-principal-for-onboarding-at-scale).
>[!NOTE]
>You must have *Administrator* permissions on Windows or *root* access permissions on Linux machines to run **azcmagent**.
-### Check
+### check
-This parameter allows you to run the network connectivity tests to troubleshoot networking issues between the agent and Azure services. The network connectivity check includes all [required Azure Arc network endpoints](network-requirements.md#urls), but does not include endpoints accessed by extensions you install.
+This parameter allows you to run network connectivity tests to troubleshoot networking issues between the agent and Azure services. The network connectivity check includes all [required Azure Arc network endpoints](network-requirements.md#urls), but does not include endpoints accessed by extensions you install.
-When running a network connectivity check, you must provide the name of the Azure region (for example, eastus) that you want to test. It's also recommended to use the `--verbose` parameter to see the results of both successful and unsuccessful tests.
+When running a network connectivity check, you must provide the name of the Azure region (for example, eastus) that you want to test. It's also recommended to use the `--verbose` parameter to see the results of both successful and unsuccessful tests:
`azcmagent check --location <regionName> --verbose`
-### Connect
+### connect
-This parameter specifies a resource in Azure Resource Manager representing the machine is created in Azure. The resource is in the subscription and resource group specified, and data about the machine is stored in the Azure region specified by the `--location` setting. The default resource name is the hostname of the machine if not specified.
+This parameter specifies a resource in Azure Resource Manager and connects it to Azure Arc. You must specify the subscription and resource group of the resource to connect. Data about the machine is stored in the Azure region specified by the `--location` setting. The default resource name is the hostname of the machine unless otherwise specified.
A certificate corresponding to the system-assigned identity of the machine is then downloaded and stored locally. Once this step is completed, the Azure Connected Machine Metadata Service and guest configuration agent service begin synchronizing with Azure Arc-enabled servers.
To connect with your elevated logged-on credentials (interactive), run the follo
`azcmagent connect --tenant-id <TenantID> --subscription-id <subscriptionID> --resource-group <ResourceGroupName> --location <resourceLocation>`
-### Disconnect
+### disconnect
-This parameter specifies a resource in Azure Resource Manager representing the machine is deleted in Azure. It doesn't remove the agent from the machine, you uninstall the agent separately. After the machine is disconnected, if you want to re-register it with Azure Arc-enabled servers, use `azcmagent connect` so a new resource is created for it in Azure.
+This parameter specifies a resource in Azure Resource Manager to delete from Azure Arc. Running this parameter doesn't remove the agent from the machine; you must uninstall the agent separately. After the machine is disconnected, you can re-register it with Azure Arc-enabled servers by using `azcmagent connect` so a new resource is created for it in Azure.
> [!NOTE]
-> If you have deployed one or more of the Azure VM extensions to your Azure Arc-enabled server and you delete its registration in Azure, the extensions are still installed. It's important to understand that depending on the extension installed, it's actively performing its function. Machines that are intended to be retired or no longer managed by Azure Arc-enabled servers should first have the extensions removed before removing its registration from Azure.
+> If you have deployed one or more Azure VM extensions to your Azure Arc-enabled server and you delete its registration in Azure, the extensions remain installed and may continue performing their functions. Any machine intended to be retired or no longer managed by Azure Arc-enabled servers should first have its [extensions removed](#step-1-remove-vm-extensions) before removing its registration from Azure.
To disconnect using a service principal, run the following command:
To disconnect with your elevated logged-on credentials (interactive), run the fo
`azcmagent disconnect`
-### Config
+### config
This parameter allows you to view and configure settings that control agent behavior.
To change a configuration property, run the following command:
`azcmagent config set <propertyName> <propertyValue>`
-If the property you're changing supports a list of values, you can use the `--add` and `--remove` flags to add or remove specific items without having to re-type the entire list.
+If the property you're changing supports a list of values, you can use the `--add` and `--remove` flags to add or remove specific items without having to re-type the entire list:
`azcmagent config set <propertyName> <propertyValue> --add`
To clear a configuration property's value, run the following command:
`azcmagent config clear <propertyName>`
-## Upgrading agent
+## Upgrade the agent
-The Azure Connected Machine agent is updated regularly to address bug fixes, stability enhancements, and new functionality. [Azure Advisor](../../advisor/advisor-overview.md) identifies resources that are not using the latest version of machine agent and recommends that you upgrade to the latest version. It will notify you when you select the Azure Arc-enabled server by presenting a banner on the **Overview** page or when you access Advisor through the Azure portal.
+The Azure Connected Machine agent is updated regularly to address bug fixes, stability enhancements, and new functionality. [Azure Advisor](../../advisor/advisor-overview.md) identifies resources that are not using the latest version of the machine agent and recommends that you upgrade to the latest version. It will notify you when you select the Azure Arc-enabled server by presenting a banner on the **Overview** page or when you access Advisor through the Azure portal.
The Azure Connected Machine agent for Windows and Linux can be upgraded to the latest release manually or automatically depending on your requirements. Installing, upgrading, or uninstalling the Azure Connected Machine Agent will not require you to restart your server.
-The following table describes the methods supported to perform the agent upgrade.
+The following table describes the methods supported to perform the agent upgrade:
| Operating system | Upgrade method |
|--|--|
The latest version of the Azure Connected Machine agent for Windows-based machin
The recommended way of keeping the Windows agent up to date is to automatically obtain the latest version through Microsoft Update. This allows you to utilize your existing update infrastructure (such as Microsoft Endpoint Configuration Manager or Windows Server Update Services) and include Azure Connected Machine agent updates with your regular OS update schedule.
-Windows Server doesn't check for updates in Microsoft Update by default. You need to configure the Windows Update client on the machine to also check for other Microsoft products in order to receive automatic updates for the Azure Connected Machine Agent.
+Windows Server doesn't check for updates in Microsoft Update by default. To receive automatic updates for the Azure Connected Machine Agent, you must configure the Windows Update client on the machine to check for other Microsoft products.
For Windows Servers that belong to a workgroup and connect to the Internet to check for updates, you can enable Microsoft Update by running the following commands in PowerShell as an administrator:
$ServiceManager.AddService2($ServiceId,7,"")

For Windows Servers that belong to a domain and connect to the Internet to check for updates, you can configure this setting at scale by using Group Policy:
For Windows Servers that belong to a domain and connect to the Internet to check for updates, you can configure this setting at-scale using Group Policy:
-1. Sign into a computer used for server administration with an account that can manage Group Policy Objects (GPO) for your organization
-1. Open the **Group Policy Management Console**
+1. Sign into a computer used for server administration with an account that can manage Group Policy Objects (GPO) for your organization.
+
+1. Open the **Group Policy Management Console**.
+ 1. Expand the forest, domain, and organizational unit(s) to select the appropriate scope for your new GPO. If you already have a GPO you wish to modify, skip to step 6.
-1. Right click the container and select **Create a GPO in this domain, and Link it here...**
-1. Provide a name for your policy such as "Enable Microsoft Update"
-1. Right click the policy and select **Edit**
-1. Navigate to **Computer Configuration > Administrative Templates > Windows Components > Windows Update**
-1. Double click the **Configure Automatic Updates** setting to edit it
-1. Select the **Enabled** radio button to allow the policy to take effect
-1. In the Options section, check the box for **Install updates for other Microsoft products** at the bottom
-1. Select **OK**
+
+1. Right-click the container and select **Create a GPO in this domain, and Link it here...**.
+
+1. Provide a name for your policy such as "Enable Microsoft Update".
+
+1. Right-click the policy and select **Edit**.
+
+1. Navigate to **Computer Configuration > Administrative Templates > Windows Components > Windows Update**.
+
+1. Select the **Configure Automatic Updates** setting to edit it.
+
+1. Select the **Enabled** radio button to allow the policy to take effect.
+
+1. In the **Options** section, check the box for **Install updates for other Microsoft products** at the bottom.
+
+1. Select **OK**.
The next time computers in your selected scope refresh their policy, they will start to check for updates in both Windows Update and Microsoft Update.
Once the updates are being synchronized, you can optionally add the Azure Connec
#### To manually upgrade using the Setup Wizard
-1. Sign on to the computer with an account that has administrative rights.
+1. Sign in to the computer with an account that has administrative rights.
-2. Download the latest agent installer from https://aka.ms/AzureConnectedMachineAgent
+1. Download the latest agent installer from https://aka.ms/AzureConnectedMachineAgent
-3. Execute **AzureConnectedMachineAgent.msi** to start the Setup Wizard.
+1. Run **AzureConnectedMachineAgent.msi** to start the Setup Wizard.
-The Setup Wizard discovers if a previous version exists, and then it automatically performs an upgrade of the agent. When the upgrade completes, the Setup Wizard automatically closes.
+If the Setup Wizard discovers a previous version of the agent, it will upgrade it automatically. When the upgrade completes, the Setup Wizard closes automatically.
#### To upgrade from the command line
If you're unfamiliar with the command-line options for Windows Installer package
1. Sign on to the computer with an account that has administrative rights.
-2. Download the latest agent installer from https://aka.ms/AzureConnectedMachineAgent
+1. Download the latest agent installer from https://aka.ms/AzureConnectedMachineAgent
-3. To upgrade the agent silently and create a setup log file in the `C:\Support\Logs` folder, run the following command.
+1. To upgrade the agent silently and create a setup log file in the `C:\Support\Logs` folder, run the following command:
```dos
msiexec.exe /i AzureConnectedMachineAgent.msi /qn /l*v "C:\Support\Logs\azcmagentupgradesetup.log"
If you're unfamiliar with the command-line options for Windows Installer package
### Linux agent
-To update the agent on a Linux machine to the latest version, it involves two commands. One command to update the local package index with the list of latest available packages from the repositories, and one command to upgrade the local package.
+Updating the agent on a Linux machine involves two commands: one to update the local package index with the list of latest available packages from the repositories, and another to upgrade the local package.
You can download the latest agent package from Microsoft's [package repository](https://packages.microsoft.com/). > [!NOTE]
-> To upgrade the agent, you must have *root* access permissions or with an account that has elevated rights using Sudo.
+> To upgrade the agent, you must have *root* access permissions or an account that has elevated rights using Sudo.
#### Upgrade the agent on Ubuntu
Actions of the [zypper](https://en.opensuse.org/Portal:Zypper) command, such as
## Renaming an Azure Arc-enabled server resource
-When you change the name of the Linux or Windows machine connected to Azure Arc-enabled servers, the new name is not recognized automatically because the resource name in Azure is immutable. As with other Azure resources, you have to delete the resource and re-create it in order to use the new name.
+When you change the name of a Linux or Windows machine connected to Azure Arc-enabled servers, the new name is not recognized automatically because the resource name in Azure is immutable. As with other Azure resources, you must delete the resource and re-create it in order to use the new name.
+
+For Azure Arc-enabled servers, you must remove the VM extensions before renaming the machine:
-For Azure Arc-enabled servers, before you rename the machine, it's necessary to remove the VM extensions before proceeding.
+1. Audit the VM extensions installed on the machine and note their configuration using the [Azure CLI](manage-vm-extensions-cli.md#list-extensions-installed) or [Azure PowerShell](manage-vm-extensions-powershell.md#list-extensions-installed).
-1. Audit the VM extensions installed on the machine and note their configuration, using the [Azure CLI](manage-vm-extensions-cli.md#list-extensions-installed) or using [Azure PowerShell](manage-vm-extensions-powershell.md#list-extensions-installed).
+2. Remove any VM extensions installed on the machine. You can do this using the [Azure portal](manage-vm-extensions-portal.md#remove-extensions), the [Azure CLI](manage-vm-extensions-cli.md#remove-extensions), or [Azure PowerShell](manage-vm-extensions-powershell.md#remove-extensions).
-2. Remove VM extensions installed from the [Azure portal](manage-vm-extensions-portal.md#remove-extensions), using the [Azure CLI](manage-vm-extensions-cli.md#remove-extensions), or using [Azure PowerShell](manage-vm-extensions-powershell.md#remove-extensions).
+3. Use the **azcmagent** tool with the [Disconnect](manage-agent.md#disconnect) parameter to disconnect the machine from Azure Arc and delete the machine resource from Azure. You can run this manually while logged on interactively, with a Microsoft identity [access token](../../active-directory/develop/access-tokens.md), or with the service principal you used for onboarding (or with a [new service principal that you create](onboard-service-principal.md#create-a-service-principal-for-onboarding-at-scale)).
-3. Use the **azcmagent** tool with the [Disconnect](manage-agent.md#disconnect) parameter to disconnect the machine from Azure Arc and delete the machine resource from Azure. Disconnecting the machine from Azure Arc-enabled servers doesn't remove the Connected Machine agent, and you do not need to remove the agent as part of this process. You can run azcmagent manually while logged on interactively, or automate using the same service principal you used to onboard multiple agents, or with a Microsoft identity platform [access token](../../active-directory/develop/access-tokens.md). If you didn't use a service principal to register the machine with Azure Arc-enabled servers, see the following [article](onboard-service-principal.md#create-a-service-principal-for-onboarding-at-scale) to create a service principal.
+ Disconnecting the machine from Azure Arc-enabled servers doesn't remove the Connected Machine agent, and you do not need to remove the agent as part of this process.
-4. Re-register the Connected Machine agent with Azure Arc-enabled servers. Run the `azcmagent` tool with the [Connect](manage-agent.md#connect) parameter complete this step. The agent will default to using the computer's current hostname, but you can choose your own resource name by passing the `--resource-name` parameter to the connect command.
+4. Re-register the Connected Machine agent with Azure Arc-enabled servers. Run the `azcmagent` tool with the [Connect](manage-agent.md#connect) parameter to complete this step. The agent will default to using the computer's current hostname, but you can choose your own resource name by passing the `--resource-name` parameter to the connect command.
5. Redeploy the VM extensions that were originally deployed to the machine from Azure Arc-enabled servers. If you deployed the Azure Monitor for VMs (insights) agent or the Log Analytics agent using an Azure Policy definition, the agents are redeployed after the next [evaluation cycle](../../governance/policy/how-to/get-compliance-data.md#evaluation-triggers).

## Uninstall the agent
-For servers you no longer want to manage with Azure Arc-enabled servers, follow the steps below to remove any VM extensions from the server, disconnect the agent, and uninstall the software from your server. it's important to complete each of the 3 steps to fully remove all related software components from your system.
+For servers you no longer want to manage with Azure Arc-enabled servers, follow the steps below to remove any VM extensions from the server, disconnect the agent, and uninstall the software from your server. It's important to complete all of these steps to fully remove all related software components from your system.
### Step 1: Remove VM extensions
-If you have deployed Azure VM extensions to an Azure Arc-enabled server, you must uninstall the extensions before disconnecting the agent or uninstalling the software. Uninstalling the Azure Connected Machine agent doesn't automatically remove extensions, and they won't be recognized if you later connect the server to Azure Arc again.
+If you have deployed Azure VM extensions to an Azure Arc-enabled server, you must uninstall the extensions before disconnecting the agent or uninstalling the software. Uninstalling the Azure Connected Machine agent doesn't automatically remove extensions, and these extensions won't be recognized if you reconnect the server to Azure Arc.
For guidance on how to identify and remove any extensions on your Azure Arc-enabled server, see the following resources:
For guidance on how to identify and remove any extensions on your Azure Arc-enab
### Step 2: Disconnect the server from Azure Arc
-Disconnecting the agent deletes the corresponding Azure resource for the server and clears the local state of the agent. The recommended way to disconnect the agent is to run the `azcmagent disconnect` command as an administrator on the server. You'll be prompted to log in with an Azure account that has permission to delete the resource in your subscription. If the resource has already been deleted in Azure, you'll need to pass an additional flag to only clean up the local state: `azcmagent disconnect --force-local-only`.
+Disconnecting the agent deletes the corresponding Azure resource for the server and clears the local state of the agent. To disconnect the agent, run the `azcmagent disconnect` command as an administrator on the server. You'll be prompted to log in with an Azure account that has permission to delete the resource in your subscription. If the resource has already been deleted in Azure, you'll need to pass an additional flag to clean up the local state: `azcmagent disconnect --force-local-only`.
### Step 3a: Uninstall the Windows agent
Both of the following methods remove the agent, but they do not remove the *C:\P
#### Uninstall from Control Panel
-1. To uninstall the Windows agent from the machine, do the following:
+Follow these steps to uninstall the Windows agent from the machine:
+
+1. Sign in to the computer with an account that has administrator permissions.
- a. Sign in to the computer with an account that has administrator permissions.
- b. In **Control Panel**, select **Programs and Features**.
- c. In **Programs and Features**, select **Azure Connected Machine Agent**, select **Uninstall**, and then select **Yes**.
+1. In **Control Panel**, select **Programs and Features**.
- >[!NOTE]
- > You can also run the agent setup wizard by double-clicking the **AzureConnectedMachineAgent.msi** installer package.
+1. In **Programs and Features**, select **Azure Connected Machine Agent**, select **Uninstall**, and then select **Yes**.
+
+You can also uninstall the Windows agent directly from the agent setup wizard by running the **AzureConnectedMachineAgent.msi** installer package.
#### Uninstall from the command line
-To uninstall the agent manually from the Command Prompt or to use an automated method, such as a script, you can use the following example. First you need to retrieve the product code, which is a GUID that is the principal identifier of the application package, from the operating system. The uninstall is performed by using the Msiexec.exe command line - `msiexec /x {Product Code}`.
+You can uninstall the agent manually from the Command Prompt, or by using an automated method such as a script, by following the example below. First, retrieve the product code, a GUID that is the principal identifier of the application package, from the operating system. The uninstall is then performed with the Msiexec.exe command line: `msiexec /x {Product Code}`.
1. Open the Registry Editor.
2. Under registry key `HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\Uninstall`, look for and copy the product code GUID.
-3. You can then uninstall the agent by using Msiexec using the following examples:
+3. Uninstall the agent using Msiexec, as in the following examples:
* From the command-line type:
To uninstall the agent manually from the Command Prompt or to use an automated m
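The command composition described above can be sketched generically. This is an illustrative helper only, not part of the agent tooling; the braced-GUID validation and function name are assumptions, while the `/x` (uninstall) and `/qn` (silent) flags match the msiexec usage shown elsewhere in this article:

```python
import re

# Braced GUID, as copied from the Uninstall registry key (hypothetical helper).
_GUID_RE = re.compile(r"^\{[0-9A-Fa-f]{8}(-[0-9A-Fa-f]{4}){3}-[0-9A-Fa-f]{12}\}$")

def build_uninstall_command(product_code: str) -> str:
    """Compose the silent msiexec uninstall command line from a product code GUID."""
    if not _GUID_RE.match(product_code):
        raise ValueError("not a braced GUID: %r" % product_code)
    return "msiexec /x %s /qn" % product_code
```

For example, `build_uninstall_command("{12345678-1234-1234-1234-123456789ABC}")` yields a silent uninstall command line for that product code.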
### Step 3b: Uninstall the Linux agent

> [!NOTE]
-> To uninstall the agent, you must have *root* access permissions or with an account that has elevated rights using sudo.
+> To uninstall the agent, you must have *root* access permissions or an account that has elevated rights using sudo.
-To uninstall the Linux agent, the command to use depends on the Linux operating system.
+The command used to uninstall the Linux agent depends on the Linux operating system.
* For Ubuntu, run the following command:
To uninstall the Linux agent, the command to use depends on the Linux operating
## Update or remove proxy settings
-To configure the agent to communicate to the service through a proxy server or remove this configuration after deployment, or use one of the following methods to complete this task. The agent communicates outbound using the HTTP protocol under this scenario.
+To configure the agent to communicate with the service through a proxy server, or to remove this configuration after deployment, use one of the methods described below. Note that the agent communicates outbound using the HTTP protocol in this scenario.
-As of agent version 1.13, proxy settings can be configured using the `azcmagent config` command or system environment variables. If a proxy server is specified in both the agent configuration and system environment variables, the agent configuration will take precedence and become the effective setting. `azcmagent show` returns the effective proxy configuration for the agent.
+As of agent version 1.13, proxy settings can be configured using the `azcmagent config` command or system environment variables. If a proxy server is specified in both the agent configuration and system environment variables, the agent configuration will take precedence and become the effective setting. Use `azcmagent show` to view the effective proxy configuration for the agent.
> [!NOTE]
> Azure Arc-enabled servers doesn't support using proxy servers that require authentication, TLS (HTTPS) connections, or a [Log Analytics gateway](../../azure-monitor/agents/gateway.md) as a proxy for the Connected Machine agent.
sudo /opt/azcmagent/bin/azcmagent_proxy remove
If you're already using environment variables to configure the proxy server for the Azure Connected Machine agent and want to migrate to the agent-specific proxy configuration based on local agent settings, follow these steps:
-1. [Upgrade the Azure Connected Machine agent](#upgrading-agent) to the latest version (starting with version 1.13) to use the new proxy configuration settings
-1. Configure the agent with your proxy server information by running `azcmagent config set proxy.url "http://ProxyServerFQDN:port"`
-1. Remove the unused environment variables by following the steps for [Windows](#windows-environment-variables) or [Linux](#linux-environment-variables)
+1. [Upgrade the Azure Connected Machine agent](#upgrade-the-agent) to the latest version (starting with version 1.13) to use the new proxy configuration settings.
+
+1. Configure the agent with your proxy server information by running `azcmagent config set proxy.url "http://ProxyServerFQDN:port"`.
+
+1. Remove the unused environment variables by following the steps for [Windows](#windows-environment-variables) or [Linux](#linux-environment-variables).
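The precedence rule described above (the agent-level proxy configuration wins over environment variables) can be sketched as follows. This is a minimal illustration, not agent code; the function name and argument are assumptions:

```python
import os
from typing import Optional

def effective_proxy(agent_config_proxy: Optional[str], env=None) -> Optional[str]:
    """Illustrative only: the agent-level proxy setting, when present,
    takes precedence over proxy environment variables."""
    if env is None:
        env = os.environ
    if agent_config_proxy:
        return agent_config_proxy
    # Fall back to the conventional proxy environment variables.
    return env.get("HTTPS_PROXY") or env.get("HTTP_PROXY")
```

For example, with both `proxy.url` set in the agent configuration and `HTTPS_PROXY` set in the environment, the agent-configured value is the effective setting.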
## Next steps
azure-arc Manage Vm Extensions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-arc/servers/manage-vm-extensions.md
Verify your machine matches the [supported versions](prerequisites.md#supported-
The minimum version of the Connected Machine agent that is supported with this feature on Windows and Linux is the 1.0 release.
-To upgrade your machine to the version of the agent required, see [Upgrade agent](manage-agent.md#upgrading-agent).
+To upgrade your machine to the version of the agent required, see [Upgrade agent](manage-agent.md#upgrade-the-agent).
## Next steps
azure-functions Functions Bindings Storage Table Input https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-functions/functions-bindings-storage-table-input.md
Write-Host "Person entity name: $($PersonEntity.Name)"
::: zone-end

::: zone pivot="programming-language-python"
-The following function uses a queue trigger to read a single table row as input to a function.
+The following function uses an HTTP trigger to read a single table row as input to a function.
-In this example, binding configuration specifies an explicit value for the table's `partitionKey` and uses an expression to pass to the `rowKey`. The `rowKey` expression, `{id}` indicates that the row key comes from the queue message string.
+In this example, the binding configuration specifies an explicit value for the table's `partitionKey` and uses an expression to pass to the `rowKey`. The `rowKey` expression, `{id}`, indicates that the row key comes from the `{id}` part of the route in the request.
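The binding-expression substitution described above can be sketched generically: a `{name}` placeholder in the expression is replaced by the matching route value. This is an illustrative sketch of the idea, not the Functions runtime's implementation; the helper name is hypothetical:

```python
import re

def resolve_binding(expression: str, route_params: dict) -> str:
    """Replace {name} placeholders in a binding expression with route values."""
    return re.sub(r"\{(\w+)\}", lambda m: route_params[m.group(1)], expression)
```

For example, a request routed with `id=42` resolves the `rowKey` expression `{id}` to `42`.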
Binding configuration in the _function.json_ file:
For specific usage details, see [Example](#example).
[CloudTable]: /dotnet/api/microsoft.azure.cosmos.table.cloudtable
[TableEntity]: /dotnet/api/azure.data.tables.tableentity
[`IQueryable<T>`]: /dotnet/api/system.linq.iqueryable-1
-[`IEnumerable<T>`]: /dotnet/api/system.collections.generic.ienumerable-1
+[`IEnumerable<T>`]: /dotnet/api/system.collections.generic.ienumerable-1
azure-maps Weather Coverage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-maps/weather-coverage.md
Azure Maps [Severe weather alerts][severe-weather-alerts] service returns severe
[current-conditions]: /rest/api/maps/weather/get-current-conditions
-[dh-records]: /rest/api/maps/weather/get-dh-records
-[dh-actuals]: /rest/api/maps/weather/get-dh-actuals
-[dh-normals]: /rest/api/maps/weather/get-dh-normals
+[dh-records]: /rest/api/maps/weather/get-daily-historical-records
+[dh-actuals]: /rest/api/maps/weather/get-daily-historical-actuals
+[dh-normals]: /rest/api/maps/weather/get-daily-historical-normals
[tropical-storm-active]: /rest/api/maps/weather/get-tropical-storm-active
[tropical-storm-forecasts]: /rest/api/maps/weather/get-tropical-storm-forecast
azure-monitor Azure Monitor Agent Extension Versions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-extension-versions.md
+
+ Title: Azure Monitor agent extension versions
+description: This article describes the version details for the Azure Monitor agent virtual machine extension.
+++ Last updated : 4/11/2022++++
+# Azure Monitor agent extension versions
+This article describes the version details for the Azure Monitor agent virtual machine extension. This extension deploys the agent on virtual machines, scale sets, and Arc-enabled servers (on-premises servers with the Azure Arc agent installed).
+
+We strongly recommend updating to the latest version at all times, or opting in to the [Automatic Extension Update](../../virtual-machines/automatic-extension-upgrade.md) feature.
++
+## Version details
+| Release Date | Release notes | Windows | Linux |
+|:|:|:|:|
+| March 2022 | <ul><li>Fixed timestamp and XML format bugs in Windows Event logs</li><li>Full Windows OS information in Log Analytics Heartbeat table</li><li>Fixed Linux performance counters to collect instance values instead of 'total' only</li></ul> | 1.3.0.0 | 1.17.5.0 |
+| February 2022 | <ul><li>Bugfixes for the AMA Client installer (private preview)</li><li>Versioning fix to reflect appropriate Windows major/minor/hotfix versions</li><li>Internal test improvement on Linux</li></ul> | 1.2.0.0 | 1.15.3 |
+| January 2022 | <ul><li>Syslog RFC compliance for Linux</li><li>Fixed issue for Linux perf counters not flowing on restart</li><li>Fixed installation failure on Windows Server 2008 R2 SP1</li></ul> | 1.1.5.1<sup>Hotfix</sup> | 1.15.2.0<sup>Hotfix</sup> |
+| December 2021 | <ul><li>Fixed issues impacting Linux Arc-enabled servers</li><li>'Heartbeat' table > 'Category' column reports "Azure Monitor Agent" in Log Analytics for Windows</li></ul> | 1.1.4.0 | 1.14.7.0<sup>2</sup> |
+| September 2021 | <ul><li>Fixed issue causing data loss on restarting the agent</li><li>Fixed issue for Arc Windows servers</li></ul> | 1.1.3.2<sup>Hotfix</sup> | 1.12.2.0 <sup>1</sup> |
+| August 2021 | Fixed issue allowing Azure Monitor Metrics as the only destination | 1.1.2.0 | 1.10.9.0<sup>Hotfix</sup> |
+| July 2021 | <ul><li>Support for direct proxies</li><li>Support for Log Analytics gateway</li></ul> [Learn more](https://azure.microsoft.com/updates/general-availability-azure-monitor-agent-and-data-collection-rules-now-support-direct-proxies-and-log-analytics-gateway/) | 1.1.1.0 | 1.10.5.0 |
+| June 2021 | General availability announced. <ul><li>All features except metrics destination now generally available</li><li>Production quality, security and compliance</li><li>Availability in all public regions</li><li>Performance and scale improvements for higher EPS</li></ul> [Learn more](https://azure.microsoft.com/updates/azure-monitor-agent-and-data-collection-rules-now-generally-available/) | 1.0.12.0 | 1.9.1.0 |
+
+<sup>Hotfix</sup> Do not use AMA Linux versions v1.10.7 or v1.15.1, or AMA Windows versions v1.1.3.1 or v1.1.5.0. Use the hotfix versions listed above instead.
+<sup>1</sup> Known issue: No data collected from Linux Arc-enabled servers
+<sup>2</sup> Known issue: Linux performance counters data stops flowing on restarting/rebooting the machine(s)
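To check whether an installed agent is behind a version listed in the table, plain dotted version strings (without the `Hotfix` superscript markers) can be compared numerically. A generic sketch, not part of any agent tooling:

```python
def parse_version(version: str) -> tuple:
    """Split a dotted version string into a tuple of ints so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

def is_outdated(installed: str, latest: str) -> bool:
    """True when the installed version sorts below the latest listed version."""
    return parse_version(installed) < parse_version(latest)
```

For example, a Linux agent at 1.15.2.0 is outdated relative to the March 2022 release, 1.17.5.0.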
+
+## Next steps
+
+- [Install and manage the extension](azure-monitor-agent-manage.md).
+- [Create a data collection rule](data-collection-rule-azure-monitor-agent.md) to collect data from the agent and send it to Azure Monitor.
azure-monitor Azure Monitor Agent Manage https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-manage.md
The Azure Monitor agent is implemented as an [Azure VM extension](../../virtual-
| TypeHandlerVersion | 1.2 | 1.15 |

## Extension versions
-We strongly recommended to update to generally available versions listed as follows instead of using preview or intermediate versions.
-
-| Release Date | Release notes | Windows | Linux |
-|:|:|:|:|
-| June 2021 | General availability announced. <ul><li>All features except metrics destination now generally available</li><li>Production quality, security and compliance</li><li>Availability in all public regions</li><li>Performance and scale improvements for higher EPS</li></ul> [Learn more](https://azure.microsoft.com/updates/azure-monitor-agent-and-data-collection-rules-now-generally-available/) | 1.0.12.0 | 1.9.1.0 |
-| July 2021 | <ul><li>Support for direct proxies</li><li>Support for Log Analytics gateway</li></ul> [Learn more](https://azure.microsoft.com/updates/general-availability-azure-monitor-agent-and-data-collection-rules-now-support-direct-proxies-and-log-analytics-gateway/) | 1.1.1.0 | 1.10.5.0 |
-| August 2021 | Fixed issue allowing Azure Monitor Metrics as the only destination | 1.1.2.0 | 1.10.9.0<sup>Hotfix</sup> |
-| September 2021 | <ul><li>Fixed issue causing data loss on restarting the agent</li><li>Fixed issue for Arc Windows servers</li></ul> | 1.1.3.2<sup>Hotfix</sup> | 1.12.2.0 <sup>1</sup> |
-| December 2021 | <ul><li>Fixed issues impacting Linux Arc-enabled servers</li><li>'Heartbeat' table > 'Category' column reports "Azure Monitor Agent" in Log Analytics for Windows</li></ul> | 1.1.4.0 | 1.14.7.0<sup>2</sup> |
-| January 2022 | <ul><li>Syslog RFC compliance for Linux</li><li>Fixed issue for Linux perf counters not flowing on restart</li><li>Fixed installation failure on Windows Server 2008 R2 SP1</li></ul> | 1.1.5.1<sup>Hotfix</sup> | 1.15.2.0<sup>Hotfix</sup> |
-| Feburary 2022 | <ul><li>Bugfixes for the AMA Client installer (private preview)</li><li>Versioning fix to reflect appropriate Windows major/minor/hotfix versions</li></ul> | 1.2.0.0 | Not yet available |
-
-<sup>Hotfix</sup> Do not use AMA Linux versions v1.10.7, v1.15.1 and AMA Windows v1.1.3.1, v1.1.5.0. Please use hotfixed versions listed above.
-<sup>1</sup> Known issue: No data collected from Linux Arc-enabled servers
-<sup>2</sup> Known issue: Linux performance counters data stops flowing on restarting/rebooting the machine(s)
+[View Azure Monitor Agent extension versions](./azure-monitor-agent-extension-versions.md).
## Prerequisites
azure-monitor Azure Monitor Agent Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/agents/azure-monitor-agent-overview.md
The Azure Monitor agent doesn't require any keys but instead requires a [system-
The Azure Monitor agent supports Azure service tags (both AzureMonitor and AzureResourceManager tags are required). It supports connecting via **direct proxies, Log Analytics gateway, and private links** as described below.

### Firewall requirements
-|Endpoint |Purpose |Port |Direction |Bypass HTTPS inspection|
-||||--|--|
-|global.handler.control.monitor.azure.com |Access control service|Port 443 |Outbound|Yes |
-|`<virtual-machine-region-name>`.handler.control.monitor.azure.com |Fetch data collection rules for specific machine |Port 443 |Outbound|Yes |
-|`<log-analytics-workspace-id>`.ods.opinsights.azure.com |Ingest logs data |Port 443 |Outbound|Yes |
+| Cloud |Endpoint |Purpose |Port |Direction |Bypass HTTPS inspection|
+|||||--|--|
+| Azure Commercial |global.handler.control.monitor.azure.com |Access control service|Port 443 |Outbound|Yes |
+| Azure Commercial |`<virtual-machine-region-name>`.handler.control.monitor.azure.com |Fetch data collection rules for specific machine |Port 443 |Outbound|Yes |
+| Azure Commercial |`<log-analytics-workspace-id>`.ods.opinsights.azure.com |Ingest logs data |Port 443 |Outbound|Yes |
+| Azure Government |global.handler.control.monitor.azure.us |Access control service|Port 443 |Outbound|Yes |
+| Azure Government |`<virtual-machine-region-name>`.handler.control.monitor.azure.us |Fetch data collection rules for specific machine |Port 443 |Outbound|Yes |
+| Azure Government |`<log-analytics-workspace-id>`.ods.opinsights.azure.us |Ingest logs data |Port 443 |Outbound|Yes |
+| Azure China |global.handler.control.monitor.azure.cn |Access control service|Port 443 |Outbound|Yes |
+| Azure China |`<virtual-machine-region-name>`.handler.control.monitor.azure.cn |Fetch data collection rules for specific machine |Port 443 |Outbound|Yes |
+| Azure China |`<log-analytics-workspace-id>`.ods.opinsights.azure.cn |Ingest logs data |Port 443 |Outbound|Yes |
+ If using private links on the agent, you must also add the [dce endpoints](../essentials/data-collection-endpoint-overview.md#components-of-a-data-collection-endpoint)
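The per-cloud endpoint patterns in the table above can be composed programmatically. This is an illustrative sketch; the dictionary keys and helper name are assumptions, while the suffixes and hostname patterns come from the table:

```python
# Cloud-to-DNS-suffix mapping, as listed in the firewall requirements table.
ENDPOINT_SUFFIX = {
    "AzureCommercial": "azure.com",
    "AzureGovernment": "azure.us",
    "AzureChina": "azure.cn",
}

def agent_endpoints(cloud: str, region: str, workspace_id: str) -> list:
    """Compose the three outbound (port 443) endpoints for a given cloud."""
    suffix = ENDPOINT_SUFFIX[cloud]
    return [
        f"global.handler.control.monitor.{suffix}",            # access control service
        f"{region}.handler.control.monitor.{suffix}",          # fetch data collection rules
        f"{workspace_id}.ods.opinsights.{suffix}",             # ingest logs data
    ]
```

For example, a machine in an Azure Government region needs the three `.azure.us` endpoints shown in the table.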
azure-monitor Action Groups Create Resource Manager Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/alerts/action-groups-create-resource-manager-template.md
First template, describes how to create a Resource Manager template for an actio
```json
{
- "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0", "parameters": { "actionGroupName": {
First template, describes how to create a Resource Manager template for an actio
```json
{
- "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0", "parameters": { "actionGroupName": {
azure-monitor Api Filtering Sampling https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/api-filtering-sampling.md
To filter telemetry, you write a telemetry processor and register it with `Telem
> >
-### Create a telemetry processor (C#)
+### Create a telemetry processor
+
+### C#
1. To create a filter, implement `ITelemetryProcessor`.
public void Process(ITelemetry item)
<a name="add-properties"></a>
+### Java
+
+To learn more about telemetry processors and their implementation in Java, see the [Java telemetry processors documentation](./java-standalone-telemetry-processors.md).
+
### JavaScript web applications

**Filter by using ITelemetryInitializer**
azure-monitor Asp Net Core https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/asp-net-core.md
For Visual Studio for Mac, use the [manual guidance](#enable-application-insight
If you want to store the connection string in ASP.NET Core user secrets or retrieve it from another configuration provider, you can use the overload with a `Microsoft.Extensions.Configuration.IConfiguration` parameter. For example, `services.AddApplicationInsightsTelemetry(Configuration);`. In Microsoft.ApplicationInsights.AspNetCore version [2.15.0](https://www.nuget.org/packages/Microsoft.ApplicationInsights.AspNetCore) and later, calling `services.AddApplicationInsightsTelemetry()` automatically reads the connection string from `Microsoft.Extensions.Configuration.IConfiguration` of the application. There's no need to explicitly provide the `IConfiguration`.
+If `IConfiguration` has loaded configuration from multiple providers, then `services.AddApplicationInsightsTelemetry` prioritizes configuration from `appsettings.json`, irrespective of the order in which providers are added. To read configuration from `IConfiguration` without this preferential treatment for `appsettings.json`, use the `services.AddApplicationInsightsTelemetry(IConfiguration)` overload.
+
## Run your application

Run your application and make requests to it. Telemetry should now flow to Application Insights. The Application Insights SDK automatically collects incoming web requests to your application, along with the following telemetry.
Support for [performance counters](./performance-counters.md) in ASP.NET Core is
By default, `EventCounterCollectionModule` is enabled. To learn how to configure the list of counters to be collected, see [EventCounters introduction](eventcounters.md).
+### Enrich data through HTTP
+
+```csharp
+HttpContext.Features.Get<RequestTelemetry>().Properties["myProp"] = someData;
+```
+
## Enable client-side telemetry for web applications

The preceding steps are enough to help you start collecting server-side telemetry. If your application has client-side components, follow the next steps to start collecting [usage telemetry](./usage-overview.md).
For the latest updates and bug fixes, see the [release notes](./release-notes.md
* [Configure a snapshot collection](./snapshot-debugger.md) to see the state of source code and variables at the moment an exception is thrown. * [Use the API](./api-custom-events-metrics.md) to send your own events and metrics for a detailed view of your app's performance and usage. * Use [availability tests](./monitor-web-app-availability.md) to check your app constantly from around the world.
-* [Dependency Injection in ASP.NET Core](/aspnet/core/fundamentals/dependency-injection)
+* [Dependency Injection in ASP.NET Core](/aspnet/core/fundamentals/dependency-injection)
azure-monitor Java 2X Micrometer https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/java-2x-micrometer.md
Micrometer application monitoring measures metrics for JVM-based application cod
## Using Spring Boot 1.5x Add the following dependencies to your pom.xml or build.gradle file:
-* [Application Insights spring-boot-starter](https://github.com/Azure/azure-sdk-for-java/tree/master/sdk/spring/azure-spring-boot-starter)
+* Application Insights spring-boot-starter
2.5.0 or later
* Micrometer Azure Registry 1.1.0 or above
* [Micrometer Spring Legacy](https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#production-ready-metrics) 1.1.0 or above (this backports the autoconfig code in the Spring framework).
Steps
`azure.application-insights.instrumentation-key=<your-instrumentation-key-here>`
1. Build your application and run
-2. The above should get you up and running with pre-aggregated metrics auto collected to Azure Monitor. For details on how to fine-tune Application Insights Spring Boot starter refer to the [readme on GitHub](https://github.com/Azure/azure-sdk-for-jav).
+2. The above should get you up and running with pre-aggregated metrics auto collected to Azure Monitor.
[!INCLUDE [azure-monitor-log-analytics-rebrand](../../../includes/azure-monitor-instrumentation-key-deprecation.md)]
azure-monitor Java Standalone Telemetry Processors https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/app/java-standalone-telemetry-processors.md
# Telemetry processors (preview) - Azure Monitor Application Insights for Java > [!NOTE]
-> The telemetry processors feature is in preview.
+> The telemetry processors feature is designated as preview because we cannot guarantee backwards compatibility from release to release due to the experimental state of the attribute [semantic conventions](https://opentelemetry.io/docs/reference/specification/trace/semantic_conventions). However, the feature has been tested and is supported in production.
Application Insights Java 3.x can process telemetry data before the data is exported.
azure-monitor Activity Log Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/activity-log-schema.md
This category contains the record of any resource health events that have occurr
| eventDataId |Unique identifier of the alert event. |
| category | Always "ResourceHealth" |
| eventTimestamp |Timestamp when the event was generated by the Azure service processing the request corresponding to the event. |
-| level |Level of the event. One of the following values: ΓÇ£CriticalΓÇ¥, ΓÇ£ErrorΓÇ¥, ΓÇ£WarningΓÇ¥, ΓÇ£InformationalΓÇ¥, and ΓÇ£VerboseΓÇ¥ |
+| level |Level of the event. One of the following values: "Critical" or "Informational" (other levels are not supported) |
| operationId |A GUID shared among the events that correspond to a single operation. |
| operationName |Name of the operation. |
| resourceGroupName |Name of the resource group that contains the resource. |
azure-monitor Resource Logs Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/essentials/resource-logs-schema.md
The schema for resource logs varies depending on the resource and log category.
| Azure Load Balancer |[Log Analytics for Azure Load Balancer](../../load-balancer/monitor-load-balancer.md) | | Azure Logic Apps |[Logic Apps B2B custom tracking schema](../../logic-apps/logic-apps-track-integration-account-custom-tracking-schema.md) | | Azure Machine Learning | [Diagnostic logging in Azure Machine Learning](../../machine-learning/monitor-resource-reference.md) |
-| Azure Media Services | [Media Services monitoring schemas](/media-services/latest/monitoring/monitor-media-services-data-reference#schemas) |
+| Azure Media Services | [Media Services monitoring schemas](/azure/media-services/latest/monitoring/monitor-media-services-data-reference#schemas) |
| Network security groups |[Log Analytics for network security groups (NSGs)](../../virtual-network/virtual-network-nsg-manage-log.md) | | Azure Power BI Embedded | [Logging for Power BI Embedded in Azure](/power-bi/developer/azure-pbie-diag-logs) | | Recovery Services | [Data model for Azure Backup](../../backup/backup-azure-reports-data-model.md)|
azure-monitor Cost Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/cost-logs.md
If your linked workspace is using legacy Per Node pricing tier, it will be bille
See [Create a dedicated cluster](logs-dedicated-clusters.md#create-a-dedicated-cluster) for details on creating a dedicated cluster and specifying its billing type. ## Basic Logs
-You can configure certain tables in a Log Analytics workspace to use [Basic Logs](basic-logs-configure.md). Data in these tables has a significantly reduced ingestion charge and a limited retention period. There is a charge though to query against these tables. Basic Logs are intended for high-volume verbose logs you use for debugging, troubleshooting and auditing, but not for analytics and alerts.
+You can configure certain tables in a Log Analytics workspace to use [Basic Logs](basic-logs-configure.md). Data in these tables has a significantly reduced ingestion charge and a limited retention period. There is, however, a charge to search against these tables. Basic Logs are intended for high-volume verbose logs you use for debugging, troubleshooting, and auditing, but not for analytics and alerts.
+
+The charge for searching against Basic Logs is based on the GB of data scanned in performing the search.
See [Configure Basic Logs in Azure Monitor](basic-logs-configure.md) for details on Basic Logs including how to configure them and query their data.
-## Data retention and Archive Logs
-In addition to data ingestion, there is a charge for the retention of data in each Log Analytics workspace. You can set the retention period for the entire workspace or for each table. After this period, the data is either removed or archived. Archived Logs have a reduced retention charge, but there is a charge to restore or search against them. Use Archive Logs to reduce your costs for data that you must store for compliance or occasional investigation.
+
+## Log data retention and archive
+In addition to data ingestion, there is a charge for the retention of data in each Log Analytics workspace. You can set the retention period for the entire workspace or for each table. After this period, the data is either removed or archived. Archived Logs have a reduced retention charge, and there is a charge to search against them. Use Archive Logs to reduce your costs for data that you must store for compliance or occasional investigation.
See [Configure data retention and archive policies in Azure Monitor Logs](data-retention-archive.md) for details on data retention and archiving, including how to configure these settings and access archived data.
+## Search jobs
+Searching against Archived Logs uses [search jobs](search-jobs.md). Search jobs are asynchronous queries that fetch records into a new search table within your workspace for further analytics. Search jobs are billed by the number of GB of data scanned on each day that the search accesses.
+
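The per-day GB-scanned rule above can be sketched as follows. This is a hypothetical illustration, not an Azure API; the helper name and inputs are invented, and GB is decimal (10^9 bytes), matching the pricing text.

```python
# Hypothetical sketch only: totals billable GB for a search job, given the
# volume of data scanned on each day the search accessed.
GB = 10**9  # decimal gigabytes, per the pricing definition

def search_job_billed_gb(bytes_scanned_per_day: dict) -> float:
    """Sum the GB scanned across every day the search job accessed."""
    return sum(bytes_scanned_per_day.values()) / GB

scanned = {"2022-04-10": 5 * GB, "2022-04-11": 3 * GB}
print(search_job_billed_gb(scanned))  # 8.0
```

A search touching two days at 5 GB and 3 GB scanned would thus accrue 8 GB of billable scanning.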
+## Log data restore
+For situations in which older or archived logs need to be intensively queried with the full analytics query capabilities, the [data restore](restore.md) feature is a powerful tool. The restore operation makes a specific time range of data in a table available in the hot cache for high-performance queries. You can later dismiss the data when you're done. Log data restore is billed by the amount of data restored and by the time the restore is kept active. The minimum values billed for any data restore are 2 TB and 12 hours. Restores of more than 2 TB or longer than 12 hours are billed on a pro-rated basis.
+
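One plausible reading of the pro-rating rule above can be sketched as follows. The combination of volume and active time into TB-hours, and all names here, are assumptions for illustration only, not the actual Azure billing meter.

```python
# Illustrative sketch, not the actual billing meter. Assumes decimal
# terabytes and that volume and active time combine multiplicatively.
TB = 10**12

def restore_billed_units(restored_bytes: int, hours_active: float) -> float:
    """Apply the stated minimums (2 TB, 12 hours), pro-rated beyond them."""
    billed_tb = max(restored_bytes / TB, 2.0)   # 2 TB minimum
    billed_hours = max(hours_active, 12.0)      # 12-hour minimum
    return billed_tb * billed_hours

# A 1 TB restore kept 6 hours still bills at the 2 TB / 12 h floor:
print(restore_billed_units(1 * TB, 6))   # 24.0
print(restore_billed_units(3 * TB, 24))  # 72.0
```

The key point is the floor: a small, short restore is billed as if it were 2 TB held for 12 hours.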
+## Log data export
+[Data export](logs-data-export.md) in a Log Analytics workspace lets you continuously export data from selected tables in your workspace to an Azure Storage account or Azure Event Hubs as it arrives in the Azure Monitor pipeline. Charges for the use of data export are based on the amount of data exported. The size of data exported is the number of bytes in the exported JSON-formatted data.
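Since the exported size is defined as the bytes of the JSON-formatted data, it can be approximated as sketched below. The serialization details (separators, encoding) and the helper itself are assumptions for illustration, not the exact pipeline behavior.

```python
import json

GB = 10**9  # the text defines data volume as GB = 10^9 bytes

def exported_volume_gb(records) -> float:
    """Approximate billable export volume from the JSON-encoded byte size."""
    total_bytes = sum(len(json.dumps(r).encode("utf-8")) for r in records)
    return total_bytes / GB

rows = [{"TimeGenerated": "2022-04-11T01:11:36Z", "Level": "Informational"}] * 1000
print(f"{exported_volume_gb(rows):.6f} GB")
```

Actual billed bytes depend on how the pipeline serializes each record, so treat this as an order-of-magnitude estimate.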
+
+## Application Insights billing
+Since [workspace-based Application Insights resources](../app/create-workspace-resource.md) store their data in a Log Analytics workspace, the billing for data ingestion and retention is done by the workspace where the Application Insights data is located. This enables you to leverage all options of the Log Analytics pricing model, including [commitment tiers](#commitment-tiers) in addition to Pay-As-You-Go.
This query isn't an exact replication of how usage is calculated, but it provide
- See [Azure Monitor cost and usage](../usage-estimated-costs.md) for a description of the different types of Azure Monitor charges and how to analyze them on your Azure bill. - See [Analyze usage in Log Analytics workspace](analyze-usage.md) for details on analyzing the data in your workspace to determine to source of any higher than expected usage and opportunities to reduce your amount of data collected. - See [Set daily cap on Log Analytics workspace](daily-cap.md) to control your costs by configuring a maximum volume that may be ingested in a workspace each day.-- See [Azure Monitor best practices - Cost management](../best-practices-cost.md) for best practices on configuring and managing Azure Monitor to minimize your charges.
+- See [Azure Monitor best practices - Cost management](../best-practices-cost.md) for best practices on configuring and managing Azure Monitor to minimize your charges.
azure-monitor Logs Data Export https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/logs/logs-data-export.md
Log Analytics workspace data export continuously exports data that is sent to yo
## Data completeness Data export is optimized for moving large data volume to your destinations, and in certain retry conditions, can include a fraction of duplicated records. The export operation could fail when ingress limits are reached; see details under [Create or update data export rule](#create-or-update-data-export-rule). In that case, a retry continues for up to 30 minutes, and if the destination is still unavailable, data will be discarded until the destination becomes available.
-## Cost
-Billing for the Log Analytics Data Export feature is not enabled yet. View more details in [pricing page](https://azure.microsoft.com/pricing/details/monitor/).
+## Pricing model
+Data export charges are based on the volume of data exported, measured in bytes. The size of data exported by Log Analytics Data Export is the number of bytes in the exported JSON-formatted data. Data volume is measured in GB (10^9 bytes).
+
+For more information, including the data export billing timeline, see [Azure Monitor pricing](https://azure.microsoft.com/pricing/details/monitor/).
## Export destinations
azure-monitor Monitor Reference https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-monitor/monitor-reference.md
The following table lists Azure services and the data they collect into Azure Mo
| [Azure Logic Apps](../logic-apps/index.yml) | Microsoft.Logic/workflows | [**Yes**](./essentials/metrics-supported.md#microsoftlogicworkflows) | [**Yes**](./essentials/resource-logs-categories.md#microsoftlogicworkflows) | | | | [Azure Machine Learning](../machine-learning/index.yml) | Microsoft.MachineLearningServices/workspaces | [**Yes**](./essentials/metrics-supported.md#microsoftmachinelearningservicesworkspaces) | [**Yes**](./essentials/resource-logs-categories.md#microsoftmachinelearningservicesworkspaces) | | | | [Azure Maps](../azure-maps/index.yml) | Microsoft.Maps/accounts | [**Yes**](./essentials/metrics-supported.md#microsoftmapsaccounts) | No | | |
 - | [Azure Media Services](/media-services/) | Microsoft.Media/mediaservices | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservices) | | |
 - | [Azure Media Services](/media-services/) | Microsoft.Media/mediaservices/liveEvents | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservicesliveevents) | No | | |
 - | [Azure Media Services](/media-services/) | Microsoft.Media/mediaservices/streamingEndpoints | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservicesstreamingendpoints) | No | | |
 - | [Azure Media Services](/media-services/) | Microsoft.Media/videoAnalyzers | [**Yes**](./essentials/metrics-supported.md#microsoftmediavideoanalyzers) | | |
 + | [Azure Media Services](/azure/media-services/) | Microsoft.Media/mediaservices | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservices) | | |
 + | [Azure Media Services](/azure/media-services/) | Microsoft.Media/mediaservices/liveEvents | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservicesliveevents) | No | | |
 + | [Azure Media Services](/azure/media-services/) | Microsoft.Media/mediaservices/streamingEndpoints | [**Yes**](./essentials/metrics-supported.md#microsoftmediamediaservicesstreamingendpoints) | No | | |
 + | [Azure Media Services](/azure/media-services/) | Microsoft.Media/videoAnalyzers | [**Yes**](./essentials/metrics-supported.md#microsoftmediavideoanalyzers) | | |
| [Azure Spatial Anchors](../spatial-anchors/index.yml) | Microsoft.MixedReality/remoteRenderingAccounts | [**Yes**](./essentials/metrics-supported.md#microsoftmixedrealityremoterenderingaccounts) | No | | | | [Azure Spatial Anchors](../spatial-anchors/index.yml) | Microsoft.MixedReality/spatialAnchorsAccounts | [**Yes**](./essentials/metrics-supported.md#microsoftmixedrealityspatialanchorsaccounts) | No | | | | [Azure NetApp Files](../azure-netapp-files/index.yml) | Microsoft.NetApp/netAppAccounts/capacityPools | [**Yes**](./essentials/metrics-supported.md#microsoftnetappnetappaccountscapacitypools) | No | | |
The following table lists Azure services and the data they collect into Azure Mo
- Read more about the [Azure Monitor data platform which stores the logs and metrics collected by insights and solutions](data-platform.md). - Complete a [tutorial on monitoring an Azure resource](essentials/tutorial-resource-logs.md). - Complete a [tutorial on writing a log query to analyze data in Azure Monitor Logs](essentials/tutorial-resource-logs.md).-- Complete a [tutorial on creating a metrics chart to analyze data in Azure Monitor Metrics](essentials/tutorial-metrics.md).
+- Complete a [tutorial on creating a metrics chart to analyze data in Azure Monitor Metrics](essentials/tutorial-metrics.md).
azure-netapp-files Azure Netapp Files Network Topologies https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/azure-netapp-files-network-topologies.md
na Previously updated : 04/01/2022 Last updated : 04/11/2022 # Guidelines for Azure NetApp Files network planning
Azure NetApp Files standard network features are supported for the following reg
* Australia Central * East US 2 * France Central
+* Germany West Central
* North Central US * North Europe * South Central US
azure-netapp-files Azure Netapp Files Solution Architectures https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-netapp-files/azure-netapp-files-solution-architectures.md
na Previously updated : 04/07/2022 Last updated : 04/11/2022 # Solution architectures using Azure NetApp Files
This section provides references for solutions for Linux OSS applications and da
* [Oracle Database with Azure NetApp Files - Azure Example Scenarios](/azure/architecture/example-scenario/file-storage/oracle-azure-netapp-files) * [Oracle Databases on Microsoft Azure Using Azure NetApp Files](https://www.netapp.com/media/17105-tr4780.pdf) * [Oracle VM images and their deployment on Microsoft Azure: Shared storage configuration options](../virtual-machines/workloads/oracle/oracle-vm-solutions.md#shared-storage-configuration-options)
+* [Run Your Most Demanding Oracle Workloads in Azure without Sacrificing Performance or Scalability](https://techcommunity.microsoft.com/t5/azure-architecture-blog/run-your-most-demanding-oracle-workloads-in-azure-without/ba-p/3264545)
* [Oracle database performance on Azure NetApp Files single volumes](performance-oracle-single-volumes.md) * [Benefits of using Azure NetApp Files with Oracle Database](solutions-benefits-azure-netapp-files-oracle-database.md)
azure-resource-manager Control Plane And Data Plane https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/control-plane-and-data-plane.md
All requests for control plane operations are sent to the Azure Resource Manager
* For Azure Germany, the URL is `https://management.microsoftazure.de/`. * For Microsoft Azure China 21Vianet, the URL is `https://management.chinacloudapi.cn`.
-To discover which operations use the Azure Resource Manager URL, see the [Azure REST API](/rest/api/azure/). For example, the [create or update operation](/rest/api/mysql/singleserver/databases/create-or-update) for MySql is a control plane operation because the request URL is:
+To discover which operations use the Azure Resource Manager URL, see the [Azure REST API](/rest/api/azure/). For example, the [create or update operation](/rest/api/mysql/singleserver/databases/create-or-update) for MySQL is a control plane operation because the request URL is:
```http
PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DBforMySQL/servers/{serverName}/databases/{databaseName}?api-version=2017-12-01
```
azure-resource-manager Classic Model Move Limitations https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-resource-manager/management/move-limitations/classic-model-move-limitations.md
Title: Move Azure Classic deployment resources
-description: Use Azure Resource Manager to move Classic deployment resources to a new resource group or subscription.
+ Title: Move Azure classic deployment resources
+description: Use Azure Resource Manager to move classic deployment resources to a new resource group or subscription.
Previously updated : 07/09/2019 Last updated : 04/11/2022+
-# Move guidance for Classic deployment model resources
+# Move guidance for classic deployment model resources
The steps to move resources deployed through the classic model differ based on whether you're moving the resources within a subscription or to a new subscription.
The steps to move resources deployed through the classic model differ based on w
When moving resources from one resource group to another resource group within the same subscription, the following restrictions apply:
-* Virtual networks (classic) can't be moved.
-* Virtual machines (classic) must be moved with the cloud service.
-* Cloud service can only be moved when the move includes all its virtual machines.
-* Only one cloud service can be moved at a time.
-* Only one storage account (classic) can be moved at a time.
-* Storage account (classic) can't be moved in the same operation with a virtual machine or a cloud service.
+- Virtual networks (classic) can't be moved.
+- Virtual machines (classic) must be moved with the cloud service.
+- Cloud service can only be moved when the move includes all its virtual machines.
+- Only one cloud service can be moved at a time.
+- Only one storage account (classic) can be moved at a time.
+- Storage account (classic) can't be moved in the same operation with a virtual machine or a cloud service.
To move classic resources to a new resource group within the same subscription, use the [standard move operations](../move-resource-group-and-subscription.md) through the portal, Azure PowerShell, Azure CLI, or REST API. You use the same operations as you use for moving Resource Manager resources. ## Move across subscriptions
-When moving resources to a new subscription, the following restrictions apply:
+When moving classic cloud services to a new subscription, the following restrictions apply:
-* All classic resources in the subscription must be moved in the same operation.
-* The target subscription must not have any other classic resources.
-* The move can only be requested through a separate REST API for classic moves. The standard Resource Manager move commands don't work when moving classic resources to a new subscription.
+- The source and target subscriptions need to be under the same Azure AD tenant.
+- Cloud Solution Provider (CSP) subscriptions do not support migrating classic cloud services.
+- All classic resources in the subscription must be moved in the same operation.
+- The target subscription must not have any other classic resources.
+- The move can only be requested through a separate REST API for classic moves. The standard Resource Manager move commands don't work when moving classic resources to a new subscription.
To move classic resources to a new subscription, use the REST operations that are specific to classic resources. To use REST, do the following steps:
To move classic resources to a new subscription, use the REST operations that ar
The operation may run for several minutes.
+## Possible error messages in the source subscription validation stage
+
+### "Subscription migration for SubscriptionId {subscription ID} cannot continue as IaaS classic to ARM migration is in progress for the following deployment resource: _xx in HostedService {classic-cloud-service-name}_"
+
+This message means that a classic cloud service is currently being migrated to Cloud Services (extended support). Users should abort that ARM migration operation and then retry validation.
+
+### "Source subscription _{subscription ID}_ is empty"
+
+The source subscription cannot be empty, disabled, deleted, or currently undergoing migration. During the migration period, write operations are not allowed on resources within the subscription.
+
+### "Source subscription contains application(s) which doesn't support migration: _{application name}_"
+
+### "Source subscription contains following cloud service(s) which doesn't support migration: _{cloud service name}_"
+
+The resources mentioned in the error message cannot be migrated, so users should delete these resources before triggering the migration.
+
+### More information
+
+The domain name and the public IP are still the same as before migration. Under normal circumstances, there should be no downtime for the cloud service during the migration.
+ ## Next steps If you have trouble moving classic resources, contact [Support](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/overview).
azure-sql Sql Data Sync Sync Data Between Azure Onprem https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/scripts/sql-data-sync-sync-data-between-azure-onprem.md
ms.devlang: PowerShell --++ Last updated 03/12/2019
azure-sql Sql Data Sync Sync Data Between Sql Databases Rest Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/scripts/sql-data-sync-sync-data-between-sql-databases-rest-api.md
ms.devlang: rest-api --++ Last updated 03/12/2019
azure-sql Sql Data Sync Sync Data Between Sql Databases https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/scripts/sql-data-sync-sync-data-between-sql-databases.md
ms.devlang: PowerShell --++ Last updated 03/12/2019
azure-sql Update Sync Schema In Sync Group https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/scripts/update-sync-schema-in-sync-group.md
ms.devlang: PowerShell --++ Last updated 03/12/2019
azure-sql Sql Data Sync Agent Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/sql-data-sync-agent-overview.md
ms.devlang: --++ Last updated 12/20/2018
azure-sql Sql Data Sync Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/sql-data-sync-best-practices.md
--++ Last updated 12/20/2018
azure-sql Sql Data Sync Data Sql Server Sql Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/sql-data-sync-data-sql-server-sql-database.md
ms.devlang: --++ Last updated 2/2/2022
azure-sql Sql Data Sync Sql Server Configure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/sql-data-sync-sql-server-configure.md
ms.devlang: --++ Last updated 01/14/2019
azure-sql Sql Data Sync Troubleshoot https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/sql-data-sync-troubleshoot.md
ms.devlang: --++ Last updated 12/20/2018
azure-sql Sql Data Sync Update Sync Schema https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-sql/database/sql-data-sync-update-sync-schema.md
ms.devlang: --++ Last updated 11/14/2018
azure-video-analyzer Monitor Log Edge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-docs/edge/monitor-log-edge.md
Using [Prometheus endpoint](https://prometheus.io/docs/practices/naming/) along
[![Diagram that shows the metrics collection using Log Analytics.](./media/telemetry-schema/log-analytics.svg)](./media/telemetry-schema/log-analytics.svg#lightbox)
-1. Learn how to [collect metrics](https://github.com/Azure/iotedge/tree/master/edge-modules/MetricsCollector)
-1. Use Docker CLI commands to build the [Docker file](https://github.com/Azure/iotedge/tree/master/edge-modules/MetricsCollector/docker/linux) and publish the image to your Azure container registry.
+1. Learn how to [collect metrics](https://github.com/Azure/iotedge/blob/main/test/modules/TestMetricsCollector/Program.cs)
+1. Use Docker CLI commands to build the [Docker file](https://github.com/Azure/iotedge/blob/main/mqtt/docker/linux/amd64/Dockerfile) and publish the image to your Azure container registry.
For more information about using the Docker CLI to push to a container registry, see [Push and pull Docker images](../../../container-registry/container-registry-get-started-docker-cli.md). For other information about Azure Container Registry, see the [documentation](../../../container-registry/index.yml). 1. After the push to Azure Container Registry is complete, the following is inserted into the deployment manifest:
azure-video-analyzer Compare Video Indexer With Media Services Presets https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/compare-video-indexer-with-media-services-presets.md
Currently, there is an overlap between features offered by the [Video Analyzer f
|Feature|Video Analyzer for Media APIs |Video Analyzer and Audio Analyzer Presets<br/>in Media Services v3 APIs| ||||
-|Media Insights|[Enhanced](video-indexer-output-json-v2.md) |[Fundamentals](/media-services/latest/analyze-video-audio-files-concept)|
+|Media Insights|[Enhanced](video-indexer-output-json-v2.md) |[Fundamentals](/azure/media-services/latest/analyze-video-audio-files-concept)|
|Experiences|See the full list of supported features: <br/> [Overview](video-indexer-overview.md)|Returns video insights only| |Billing|[Media Services pricing](https://azure.microsoft.com/pricing/details/media-services/#analytics)|[Media Services pricing](https://azure.microsoft.com/pricing/details/media-services/#analytics)| |Compliance|For the most current compliance updates, visit [Azure Compliance Offerings.pdf](https://gallery.technet.microsoft.com/Overview-of-Azure-c1be3942/file/178110/23/Microsoft%20Azure%20Compliance%20Offerings.pdf) and search for "Video Analyzer for Media" to see if it complies with a certificate of interest.|For the most current compliance updates, visit [Azure Compliance Offerings.pdf](https://gallery.technet.microsoft.com/Overview-of-Azure-c1be3942/file/178110/23/Microsoft%20Azure%20Compliance%20Offerings.pdf) and search for "Media Services" to see if it complies with a certificate of interest.|
Currently, there is an overlap between features offered by the [Video Analyzer f
[Video Analyzer for Media overview](video-indexer-overview.md)
-[Media Services v3 overview](/media-services/latest/media-services-overview)
+[Media Services v3 overview](/azure/media-services/latest/media-services-overview)
azure-video-analyzer Connect To Azure https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/connect-to-azure.md
If the connection to Azure failed, you can attempt to troubleshoot the problem b
### Create and configure a Media Services account
-1. Use the [Azure](https://portal.azure.com/) portal to create an Azure Media Services account, as described in [Create an account](/media-services/previous/media-services-portal-create-account).
+1. Use the [Azure](https://portal.azure.com/) portal to create an Azure Media Services account, as described in [Create an account](/azure/media-services/previous/media-services-portal-create-account).
Make sure the Media Services account was created with the classic APIs.
If the connection to Azure failed, you can attempt to troubleshoot the problem b
In the new Media Services account, select **Streaming endpoints**. Then select the streaming endpoint and press start. ![Streaming endpoints](./media/create-account/create-ams-account-se.png)
-4. For Video Analyzer for Media to authenticate with Media Services API, an AD app needs to be created. The following steps guide you through the Azure AD authentication process described in [Get started with Azure AD authentication by using the Azure portal](/media-services/previous/media-services-portal-get-started-with-aad):
+4. For Video Analyzer for Media to authenticate with Media Services API, an AD app needs to be created. The following steps guide you through the Azure AD authentication process described in [Get started with Azure AD authentication by using the Azure portal](/azure/media-services/previous/media-services-portal-get-started-with-aad):
1. In the new Media Services account, select **API access**.
- 2. Select [Service principal authentication method](/media-services/previous/media-services-portal-get-started-with-aad).
+ 2. Select [Service principal authentication method](/azure/media-services/previous/media-services-portal-get-started-with-aad).
3. Get the client ID and client secret After you select **Settings**->**Keys**, add **Description**, press **Save**, and the key value gets populated.
azure-video-analyzer Considerations When Use At Scale https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/considerations-when-use-at-scale.md
To see an example of how to upload videos using URL, check out [this example](up
## Automatic Scaling of Media Reserved Units
-Starting August 1st 2021, Azure Video Analyzer for Media (formerly Video Indexer) enabled [Reserved Units](/media-services/latest/concept-media-reserved-units)(MRUs) auto scaling by [Azure Media Services](/media-services/latest/media-services-overview) (AMS), as a result you do not need to manage them through Azure Video Analyzer for Media. That will allow price optimization, e.g. price reduction in many cases, based on your business needs as it is being auto scaled.
+Starting August 1st 2021, Azure Video Analyzer for Media (formerly Video Indexer) enabled [Reserved Units](/azure/media-services/latest/concept-media-reserved-units) (MRUs) auto scaling by [Azure Media Services](/azure/media-services/latest/media-services-overview) (AMS); as a result, you do not need to manage them through Azure Video Analyzer for Media. This allows price optimization, for example price reduction in many cases, based on your business needs as scaling happens automatically.
## Respect throttling
azure-video-analyzer Create Video Analyzer For Media Account https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/create-video-analyzer-for-media-account.md
Learn how to [Upload a video using C#](https://github.com/Azure-Samples/media-se
<!-- links --> [docs-uami]: ../../active-directory/managed-identities-azure-resources/overview.md
-[docs-ms]: /media-services/latest/media-services-overview
+[docs-ms]: /azure/media-services/latest/media-services-overview
[docs-role-contributor]: ../../role-based-access-control/built-in-roles.md#contributor [docs-contributor-on-ms]: ./add-contributor-role-on-the-media-service.md
azure-video-analyzer Deploy With Arm Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/deploy-with-arm-template.md
The resource will be deployed to your subscription and will create the Azure Vid
## Prerequisites
-* An Azure Media Services (AMS) account. You can create one for free through the [Create AMS Account](/media-services/latest/account-create-how-to).
+* An Azure Media Services (AMS) account. You can create one for free through the [Create AMS Account](/azure/media-services/latest/account-create-how-to).
## Deploy the sample
azure-video-analyzer Odrv Download https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/odrv-download.md
This article shows how to index videos stored on OneDrive by using the Azure Vid
## Supported file formats
-For a list of file formats that you can use with Video Analyzer for Media, see [Standard Encoder formats and codecs](/media-services/latest/encode-media-encoder-standard-formats-reference).
+For a list of file formats that you can use with Video Analyzer for Media, see [Standard Encoder formats and codecs](/azure/media-services/latest/encode-media-encoder-standard-formats-reference).
## Index a video by using the website
When you're using the [Upload Video](https://api-portal.videoindexer.ai/api-deta
After the indexing and encoding jobs are done, the video is published so you can also stream your video. The streaming endpoint from which you want to stream the video must be in the **Running** state. For `SingleBitrate`, the standard encoder cost will apply for the output. If the video height is greater than or equal to 720, Video Analyzer for Media encodes it as 1280 x 720. Otherwise, it's encoded as 640 x 468.
-The default setting is [content-aware encoding](/media-services/latest/encode-content-aware-concept).
+The default setting is [content-aware encoding](/azure/media-services/latest/encode-content-aware-concept).
If you only want to index your video and not encode it, set `streamingPreset` to `NoStreaming`.
azure-video-analyzer Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/release-notes.md
Fixed bugs related to CSS, theming and accessibility:
### Automatic Scaling of Media Reserved Units
-Starting August 1st 2021, Azure Video Analyzer for Media (formerly Video Indexer) enabled [Media Reserved Units (MRUs)](/media-services/latest/concept-media-reserved-units) auto scaling by [Azure Media Services](/media-services/latest/media-services-overview), as a result you do not need to manage them through Azure Video Analyzer for Media. That will allow price optimization, for example price reduction in many cases, based on your business needs as it is being auto scaled.
+Starting August 1st 2021, Azure Video Analyzer for Media (formerly Video Indexer) enabled [Media Reserved Units (MRUs)](/azure/media-services/latest/concept-media-reserved-units) auto scaling by [Azure Media Services](/azure/media-services/latest/media-services-overview); as a result, you do not need to manage them through Azure Video Analyzer for Media. This allows price optimization, for example price reduction in many cases, based on your business needs as scaling happens automatically.
## June 2021
azure-video-analyzer Upload Index Videos https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/upload-index-videos.md
When you're uploading videos by using the API, you have the following options:
* Upload your video from a URL (preferred).
* Send the video file as a byte array in the request body.
-* Use existing an Azure Media Services asset by providing the [asset ID](/media-services/latest/assets-concept). This option is supported in paid accounts only.
+* Use an existing Azure Media Services asset by providing the [asset ID](/azure/media-services/latest/assets-concept). This option is supported in paid accounts only.
## Supported file formats
-For a list of file formats that you can use with Video Analyzer for Media, see [Standard Encoder formats and codecs](/media-services/latest/encode-media-encoder-standard-formats-reference).
+For a list of file formats that you can use with Video Analyzer for Media, see [Standard Encoder formats and codecs](/azure/media-services/latest/encode-media-encoder-standard-formats-reference).
## Storage of video files
When you're using the [Upload Video](https://api-portal.videoindexer.ai/api-deta
After the indexing and encoding jobs are done, the video is published so you can also stream your video. The streaming endpoint from which you want to stream the video must be in the **Running** state. For `SingleBitrate`, the standard encoder cost will apply for the output. If the video height is greater than or equal to 720, Video Analyzer for Media encodes it as 1280 x 720. Otherwise, it's encoded as 640 x 468.
-The default setting is [content-aware encoding](/media-services/latest/encode-content-aware-concept).
+The default setting is [content-aware encoding](/azure/media-services/latest/encode-content-aware-concept).
If you only want to index your video and not encode it, set `streamingPreset` to `NoStreaming`.
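The upload and `streamingPreset` behavior described above can be sketched as follows. This is a hedged illustration only: the host, path, and parameter names other than `videoUrl` and `streamingPreset` are assumptions based on the API portal linked above, not taken from this article, and the helper names are hypothetical.

```python
from urllib.parse import urlencode

def build_upload_url(location, account_id, access_token, video_url,
                     streaming_preset="Default"):
    """Hypothetical helper: builds an Upload Video request URL.

    The endpoint shape is an assumption; consult the Video Indexer
    API portal for the authoritative request format.
    """
    base = f"https://api.videoindexer.ai/{location}/Accounts/{account_id}/Videos"
    query = urlencode({
        "accessToken": access_token,
        "videoUrl": video_url,                # preferred: upload from a URL
        "streamingPreset": streaming_preset,  # e.g. SingleBitrate, NoStreaming
    })
    return f"{base}?{query}"

def single_bitrate_resolution(source_height):
    """Encoding rule stated in the text for SingleBitrate output:
    height >= 720 encodes as 1280 x 720, otherwise 640 x 468."""
    return (1280, 720) if source_height >= 720 else (640, 468)
```

Passing `streaming_preset="NoStreaming"` corresponds to the index-only option described above.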
azure-video-analyzer Video Indexer Get Started https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-video-analyzer/video-analyzer-for-media-docs/video-indexer-get-started.md
Once you start using Video Analyzer for Media, all your stored data and uploaded
### Supported file formats for Video Analyzer for Media
-See the [input container/file formats](/media-services/latest/encode-media-encoder-standard-formats-reference) article for a list of file formats that you can use with Video Analyzer for Media.
+See the [input container/file formats](/azure/media-services/latest/encode-media-encoder-standard-formats-reference) article for a list of file formats that you can use with Video Analyzer for Media.
### Upload a video
azure-vmware Configure Dns Azure Vmware Solution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-dns-azure-vmware-solution.md
Title: Configure DNS forwarder for Azure VMware Solution
description: Learn how to configure DNS forwarder for Azure VMware Solution using the Azure portal. Previously updated : 07/15/2021 Last updated : 04/11/2022
-#Customer intent: As an Azure service administrator, I want to <define conditional forwarding rules for a desired domain name to a desired set of private DNS servers via the NSX-T DNS Service.>
+#Customer intent: As an Azure service administrator, I want to <define conditional forwarding rules for a desired domain name to a desired set of private DNS servers via the NSX-T Data Center DNS Service.>
Last updated 07/15/2021
>[!IMPORTANT]
>For Azure VMware Solution private clouds created on or after July 1, 2021, you now have the ability to configure private DNS resolution. For private clouds created before July 1, 2021 that need private DNS resolution, open a [support request](https://rc.portal.azure.com/#create/Microsoft.Support) and request Private DNS configuration.
-By default, Azure VMware Solution management components such as vCenter can only resolve name records available through Public DNS. However, certain hybrid use cases require Azure VMware Solution management components to resolve name records from privately hosted DNS to properly function, including customer-managed systems such as vCenter and Active Directory.
+By default, Azure VMware Solution management components such as vCenter Server can only resolve name records available through Public DNS. However, certain hybrid use cases require Azure VMware Solution management components to resolve name records from privately hosted DNS to properly function, including customer-managed systems such as vCenter Server and Active Directory.
-Private DNS for Azure VMware Solution management components lets you define conditional forwarding rules for the desired domain name to a selected set of private DNS servers through the NSX-T DNS Service.
+Private DNS for Azure VMware Solution management components lets you define conditional forwarding rules for the desired domain name to a selected set of private DNS servers through the NSX-T Data Center DNS Service.
-This capability uses the DNS Forwarder Service in NSX-T. A DNS service and default DNS zone are provided as part of your private cloud. To enable Azure VMware Solution management components to resolve records from your private DNS systems, you must define an FQDN zone and apply it to the NSX-T DNS Service. The DNS Service conditionally forwards DNS queries for each zone based on the external DNS servers defined in that zone.
+This capability uses the DNS Forwarder Service in NSX-T Data Center. A DNS service and default DNS zone are provided as part of your private cloud. To enable Azure VMware Solution management components to resolve records from your private DNS systems, you must define an FQDN zone and apply it to the NSX-T Data Center DNS Service. The DNS Service conditionally forwards DNS queries for each zone based on the external DNS servers defined in that zone.
>[!NOTE]
>The DNS Service is associated with up to five FQDN zones. Each FQDN zone is associated with up to three DNS servers.

>[!TIP]
->If desired, you can also use the conditional forwarding rules for workload segments by configuring virtual machines on those segments to use the NSX-T DNS Service IP address as their DNS server.
+>If desired, you can also use the conditional forwarding rules for workload segments by configuring virtual machines on those segments to use the NSX-T Data Center DNS Service IP address as their DNS server.
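The conditional forwarding described above, including the five-zone and three-server limits from the note, can be illustrated with a small conceptual sketch. This is purely a model of the behavior for readers; it is not the NSX-T API, and every name in it is hypothetical.

```python
# Conceptual model only: an FQDN zone forwards matching queries to its
# private DNS servers; everything else falls through to the default zone.
MAX_ZONES = 5             # a DNS service holds up to five FQDN zones
MAX_SERVERS_PER_ZONE = 3  # each FQDN zone holds up to three DNS servers

def add_zone(zones, fqdn, servers):
    """Register an FQDN zone, enforcing the documented limits."""
    if len(zones) >= MAX_ZONES:
        raise ValueError("a DNS service supports at most five FQDN zones")
    if len(servers) > MAX_SERVERS_PER_ZONE:
        raise ValueError("an FQDN zone supports at most three DNS servers")
    zones[fqdn.lower()] = list(servers)

def resolve_targets(zones, query, default_servers):
    """Pick the DNS servers for a query: the most specific matching
    FQDN zone wins, otherwise the default zone's servers are used."""
    q = query.lower()
    matches = [z for z in zones if q == z or q.endswith("." + z)]
    if not matches:
        return default_servers
    return zones[max(matches, key=len)]
```

For example, a query for a host under `corp.contoso.com` would be forwarded to that zone's servers, while other names fall back to the default public resolvers.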
## Architecture
-The diagram shows that the NSX-T DNS Service can forward DNS queries to DNS systems hosted in Azure and on-premises environments.
+The diagram shows that the NSX-T Data Center DNS Service can forward DNS queries to DNS systems hosted in Azure and on-premises environments.
:::image type="content" source="media/networking/dns/dns-forwarder-diagram.png" alt-text="Diagram showing that the NSX-T DNS Service can forward DNS queries to DNS systems hosted in Azure and on-premises environments." border="false":::
The diagram shows that the NSX-T DNS Service can forward DNS queries to DNS syst
:::image type="content" source="media/networking/dns/nsxt-workload-networking-configure-fqdn-zone.png" alt-text="Screenshot showing the required information needed to add an FQDN zone.":::

>[!IMPORTANT]
- >While NSX-T allows spaces and other non-alphanumeric characters in a DNS zone name, certain NSX resources such as a DNS Zone are mapped to an Azure resource whose names don't permit certain characters.
+ >While NSX-T Data Center allows spaces and other non-alphanumeric characters in a DNS zone name, certain NSX-T Data Center resources such as a DNS Zone are mapped to an Azure resource whose names don't permit certain characters.
>
- >As a result, DNS zone names that would otherwise be valid in NSX-T may need adjustment to adhere to the [Azure resource naming conventions](../azure-resource-manager/management/resource-name-rules.md#microsoftresources).
+ >As a result, DNS zone names that would otherwise be valid in NSX-T Data Center may need adjustment to adhere to the [Azure resource naming conventions](../azure-resource-manager/management/resource-name-rules.md#microsoftresources).
It takes several minutes to complete, and you can follow the progress from **Notifications**. You'll see a message in the Notifications when the DNS zone has been created.
The diagram shows that the NSX-T DNS Service can forward DNS queries to DNS syst
:::image type="content" source="media/networking/dns/configure-dns-forwarder-3.png" alt-text="Screenshot showing the selected FQDN for the DNS service.":::
- It takes several minutes to complete, and once finished, you'll see the *Completed* message from **Notifications**. At this point, management components in your private cloud should be able to resolve DNS entries from the FQDN zone provided to the NSX-T DNS Service.
+ It takes several minutes to complete, and once finished, you'll see the *Completed* message from **Notifications**. At this point, management components in your private cloud should be able to resolve DNS entries from the FQDN zone provided to the NSX-T Data Center DNS Service.
1. Repeat the above steps for other FQDN zones, including any applicable reverse lookup zones.
NSX-T Manager provides the DNS Forwarder Service statistics at the global servic
### PowerCLI
-The NSX-T Policy API lets you run nslookup commands from the NSX-T DNS Forwarder Service. The required cmdlets are part of the `VMware.VimAutomation.Nsxt` module in PowerCLI. The following example demonstrates output from version 12.3.0 of that module.
+The NSX-T Policy API lets you run nslookup commands from the NSX-T Data Center DNS Forwarder Service. The required cmdlets are part of the `VMware.VimAutomation.Nsxt` module in PowerCLI. The following example demonstrates output from version 12.3.0 of that module.
-1. Connect to your NSX-T Server.
+1. Connect to your NSX-T Manager cluster.
>[!TIP]
- >You can obtain the IP address of your NSX-T Server from the Azure portal under **Manage** > **Identity**.
+ >You can obtain the IP address of your NSX-T Manager cluster from the Azure portal under **Manage** > **Identity**.
```powershell
Connect-NsxtServer -Server 10.103.64.3
```
azure-vmware Configure L2 Stretched Vmware Hcx Networks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-l2-stretched-vmware-hcx-networks.md
Title: Configure DHCP on L2 stretched VMware HCX networks
description: Learn how to send DHCP requests from your Azure VMware Solution VMs to a non-NSX-T DHCP server. Previously updated : 05/28/2021 Last updated : 04/11/2022 # Customer intent: As an Azure service administrator, I want to configure DHCP on L2 stretched VMware HCX networks to send DHCP requests from my Azure VMware Solution VMs to a non-NSX-T DHCP server.
Last updated 05/28/2021
# Configure DHCP on L2 stretched VMware HCX networks
-DHCP does not work for virtual machines (VMs) on the VMware HCX L2 stretch network when the DHCP server is in the on-premises datacenter. This is because NSX, by default, blocks all DHCP requests from traversing the L2 stretch. Therefore, to send DHCP requests from your Azure VMware Solution VMs to a non-NSX-T DHCP server, you'll need to configure DHCP on L2 stretched VMware HCX networks.
+DHCP does not work for virtual machines (VMs) on the VMware HCX L2 stretch network when the DHCP server is in the on-premises data center. This is because NSX-T Data Center, by default, blocks all DHCP requests from traversing the L2 stretch. Therefore, to send DHCP requests from your Azure VMware Solution VMs to a non-NSX-T Data Center DHCP server, you'll need to configure DHCP on L2 stretched VMware HCX networks.
1. (Optional) If you need to locate the segment name of the L2 extension:
- 1. Sign in to your on-premises vCenter, and under **Home**, select **HCX**.
+ 1. Sign in to your on-premises vCenter Server, and under **Home**, select **HCX**.
1. Select **Network Extension** under **Services**.
DHCP does not work for virtual machines (VMs) on the VMware HCX L2 stretch netwo
1. Select **Add Segment Profile** and then **Segment Security**.
- :::image type="content" source="media/manage-dhcp/add-segment-profile.png" alt-text="Screenshot of how to add a segment profile in NSX-T." lightbox="media/manage-dhcp/add-segment-profile.png":::
+ :::image type="content" source="media/manage-dhcp/add-segment-profile.png" alt-text="Screenshot of how to add a segment profile in NSX-T Data Center." lightbox="media/manage-dhcp/add-segment-profile.png":::
1. Provide a name and a tag, and then set the **BPDU Filter** toggle to ON and all the DHCP toggles to OFF.
azure-vmware Configure Nsx Network Components Azure Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-nsx-network-components-azure-portal.md
Title: Configure NSX network components using Azure VMware Solution
-description: Learn how to use the Azure VMware Solution to configure NSX-T network segments.
+ Title: Configure NSX-T Data Center network components using Azure VMware Solution
+description: Learn how to use the Azure VMware Solution to configure NSX-T Data Center network segments.
Previously updated : 09/13/2021 Last updated : 04/11/2022
-# Customer intent: As an Azure service administrator, I want to configure NSX network components using a simplified view of NSX-T operations a VMware administrator needs daily. The simplified view is targeted at users unfamiliar with NSX-T Manager.
+# Customer intent: As an Azure service administrator, I want to configure NSX-T Data Center network components using a simplified view of NSX-T Data Center operations a VMware administrator needs daily. The simplified view is targeted at users unfamiliar with NSX-T Manager.
-# Configure NSX network components using Azure VMware Solution
+# Configure NSX-T Data Center network components using Azure VMware Solution
-An Azure VMware Solution private cloud comes with NSX-T by default. The private cloud comes pre-provisioned with an NSX-T Tier-0 gateway in **Active/Active** mode and a default NSX-T Tier-1 gateway in Active/Standby mode. These gateways let you connect the segments (logical switches) and provide East-West and North-South connectivity.
+An Azure VMware Solution private cloud comes with NSX-T Data Center by default. The private cloud comes pre-provisioned with an NSX-T Data Center Tier-0 gateway in **Active/Active** mode and a default NSX-T Data Center Tier-1 gateway in Active/Standby mode. These gateways let you connect the segments (logical switches) and provide East-West and North-South connectivity.
-After deploying Azure VMware Solution, you can configure the necessary NSX-T objects from the Azure portal. It presents a simplified view of NSX-T operations a VMware administrator needs daily and targeted at users not familiar with NSX-T Manager.
+After deploying Azure VMware Solution, you can configure the necessary NSX-T Data Center objects from the Azure portal. It presents a simplified view of NSX-T Data Center operations a VMware administrator needs daily and is targeted at users not familiar with NSX-T Manager.
-You'll have four options to configure NSX-T components in the Azure VMware Solution console:
+You'll have four options to configure NSX-T Data Center components in the Azure VMware Solution console:
-- **Segments** - Create segments that display in NSX-T Manager and vCenter. For more information, see [Add an NSX-T segment using the Azure portal](tutorial-nsx-t-network-segment.md#use-azure-portal-to-add-an-nsx-t-segment).
+- **Segments** - Create segments that display in NSX-T Manager and vCenter Server. For more information, see [Add an NSX-T Data Center segment using the Azure portal](tutorial-nsx-t-network-segment.md#use-azure-portal-to-add-an-nsx-t-segment).
- **DHCP** - Create a DHCP server or DHCP relay if you plan to use DHCP. For more information, see [Use the Azure portal to create a DHCP server or relay](configure-dhcp-azure-vmware-solution.md#use-the-azure-portal-to-create-a-dhcp-server-or-relay).
You'll have four options to configure NSX-T components in the Azure VMware Solut
- **DNS** - Create a DNS forwarder to send DNS requests to a designated DNS server for resolution. For more information, see [Configure a DNS forwarder in the Azure portal](configure-dns-azure-vmware-solution.md).

>[!IMPORTANT]
->You'll still have access to the NSX-T Manager console, where you can use the advanced settings mentioned and other NSX-T features.
+>You'll still have access to the NSX-T Manager console, where you can use the advanced settings mentioned and other NSX-T Data Center features.
azure-vmware Configure Port Mirroring Azure Vmware Solution https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-port-mirroring-azure-vmware-solution.md
Title: Configure port mirroring for Azure VMware Solution
description: Learn how to configure port mirroring to monitor network traffic that involves forwarding a copy of each packet from one network switch port to another. Previously updated : 07/16/2021 Last updated : 04/11/2022 # Customer intent: As an Azure service administrator, I want to configure port mirroring to monitor network traffic that involves forwarding a copy of each packet from one network switch port to another.
In this how-to, you'll configure port mirroring to monitor network traffic, whic
## Prerequisites
-An Azure VMware Solution private cloud with access to the vCenter and NSX-T Manager interfaces. For more information, see the [Configure networking](tutorial-configure-networking.md) tutorial.
+An Azure VMware Solution private cloud with access to the vCenter Server and NSX-T Manager interfaces. For more information, see the [Configure networking](tutorial-configure-networking.md) tutorial.
## Create the VMs or VM groups
azure-vmware Configure Site To Site Vpn Gateway https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-site-to-site-vpn-gateway.md
Title: Configure a site-to-site VPN in vWAN for Azure VMware Solution
description: Learn how to establish a VPN (IPsec IKEv1 and IKEv2) site-to-site tunnel into Azure VMware Solutions. Previously updated : 06/30/2021 Last updated : 04/11/2022 # Configure a site-to-site VPN in vWAN for Azure VMware Solution
A virtual hub is a virtual network that is created and used by Virtual WAN. It's
1. Select **Add** to establish the link.
-1. Test your connection by [creating an NSX-T segment](./tutorial-nsx-t-network-segment.md) and provisioning a VM on the network. Ping both the on-premise and Azure VMware Solution endpoints.
+1. Test your connection by [creating an NSX-T Data Center segment](./tutorial-nsx-t-network-segment.md) and provisioning a VM on the network. Ping both the on-premises and Azure VMware Solution endpoints.
>[!NOTE]
>Wait approximately 5 minutes before you test connectivity from a client behind your ExpressRoute circuit, for example, a VM in the VNet that you created earlier.
azure-vmware Configure Storage Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-storage-policy.md
Title: Configure storage policy description: Learn how to configure storage policy for your Azure VMware Solution virtual machines. Previously updated : 08/31/2021 Last updated : 04/11/2022
-#Customer intent: As an Azure service administrator, I want set the vSAN storage policies to determine how storage is allocated to the VM.
+#Customer intent: As an Azure service administrator, I want to set the VMware vSAN storage policies to determine how storage is allocated to the VM.
# Configure storage policy
-vSAN storage policies define storage requirements for your virtual machines (VMs). These policies guarantee the required level of service for your VMs because they determine how storage is allocated to the VM. Each VM deployed to a vSAN datastore is assigned at least one VM storage policy.
+VMware vSAN storage policies define storage requirements for your virtual machines (VMs). These policies guarantee the required level of service for your VMs because they determine how storage is allocated to the VM. Each VM deployed to a vSAN datastore is assigned at least one VM storage policy.
You can assign a VM storage policy in an initial deployment of a VM or when you do other VM operations, such as cloning or migrating. Post-deployment cloudadmin users or equivalent roles can't change the default storage policy for a VM. However, **VM storage policy** changes per disk are permitted.
You'll run the `Set-ClusterDefaultStoragePolicy` cmdlet to specify default stora
## Next steps
-Now that you've learned how to configure vSAN storage policies, you can learn more about:
+Now that you've learned how to configure VMware vSAN storage policies, you can learn more about:
- [How to attach disk pools to Azure VMware Solution hosts (Preview)](attach-disk-pools-to-azure-vmware-solution-hosts.md) - You can use disks as the persistent storage for Azure VMware Solution for optimal cost and performance.
-- [How to configure external identity for vCenter](configure-identity-source-vcenter.md) - vCenter has a built-in local user called cloudadmin and assigned to the CloudAdmin role. The local cloudadmin user is used to set up users in Active Directory (AD). With the Run command feature, you can configure Active Directory over LDAP or LDAPS for vCenter as an external identity source.
+- [How to configure external identity for vCenter](configure-identity-source-vcenter.md) - vCenter Server has a built-in local user called cloudadmin that is assigned the CloudAdmin role. The local cloudadmin user is used to set up users in Active Directory (AD). With the Run command feature, you can configure Active Directory over LDAP or LDAPS for vCenter Server as an external identity source.
azure-vmware Configure Vmware Syslogs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-vmware-syslogs.md
Title: Configure VMware syslogs for Azure VMware Solution description: Learn how to configure diagnostic settings to collect VMware syslogs for your Azure VMware Solution private cloud. Previously updated : 09/24/2021 Last updated : 04/11/2022
-#Customer intent: As an Azure service administrator, I want to collect VMWare syslogs and store it in my storage account so that I can view the vCenter logs and analyze for any diagnostic purposes.
+#Customer intent: As an Azure service administrator, I want to collect VMware syslogs and store it in my storage account so that I can view the vCenter Server logs and analyze for any diagnostic purposes.
Last updated 09/24/2021
Diagnostic settings are used to configure streaming export of platform logs and metrics for a resource to the destination of your choice. You can create up to five different diagnostic settings to send different logs and metrics to independent destinations.
-In this article, you'll configure a diagnostic setting to collect VMware syslogs for your Azure VMware Solution private cloud. You'll store the syslog to a storage account to view the vCenter logs and analyze for diagnostic purposes.
+In this article, you'll configure a diagnostic setting to collect VMware syslogs for your Azure VMware Solution private cloud. You'll store the syslog to a storage account to view the vCenter Server logs and analyze for diagnostic purposes.
## Prerequisites
-Make sure you have an Azure VMware Solution private cloud with access to the vCenter and NSX-T Manager interfaces.
+Make sure you have an Azure VMware Solution private cloud with access to the vCenter Server and NSX-T Manager interfaces.
## Configure diagnostic settings
azure-vmware Configure Windows Server Failover Cluster https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/configure-windows-server-failover-cluster.md
Title: Configure Windows Server Failover Cluster on Azure VMware Solution vSAN description: Learn how to configure Windows Server Failover Cluster (WSFC) on Azure VMware Solution vSAN with native shared disks. Previously updated : 05/04/2021 Last updated : 04/11/2022 # Configure Windows Server Failover Cluster on Azure VMware Solution vSAN
You'll need first to [create a WSFC](/windows-server/failover-clustering/create-
Azure VMware Solution provides native support for virtualized WSFC. It supports SCSI-3 Persistent Reservations (SCSI3PR) on a virtual disk level. WSFC requires this support to arbitrate access to a shared disk between nodes. Support of SCSI3PRs enables configuration of WSFC with a disk resource shared between VMs natively on vSAN datastores.
-The following diagram illustrates the architecture of WSFC virtual nodes on an Azure VMware Solution private cloud. It shows where Azure VMware Solution resides, including the WSFC virtual servers (red box), in relation to the broader Azure platform. This diagram illustrates a typical hub-spoke architecture, but a similar setup is possible using Azure Virtual WAN. Both offer all the value other Azure services can bring you.
+The following diagram illustrates the architecture of WSFC virtual nodes on an Azure VMware Solution private cloud. It shows where Azure VMware Solution resides, including the WSFC virtual servers (blue box), in relation to the broader Azure platform. This diagram illustrates a typical hub-spoke architecture, but a similar setup is possible using Azure Virtual WAN. Both offer all the value other Azure services can bring you.
:::image type="content" source="media/windows-server-failover-cluster/windows-server-failover-architecture.svg" alt-text="Diagram of Windows Server Failover Cluster virtual nodes on an Azure VMware Solution private cloud." border="false" lightbox="media/windows-server-failover-cluster/windows-server-failover-architecture.svg":::
The following activities aren't supported and might cause WSFC node failover:
3. Power on all VMs, configure the hostname and IP addresses, join all VMs to an Active Directory domain, and install the latest available OS updates.
4. Install the latest VMware Tools.
5. Enable and configure the Windows Server Failover Cluster feature on each VM.
-6. Configure a Cluster Witness for quorum (a file share witness works fine).
+6. Configure a Cluster Witness for quorum (this can be a file share witness).
7. Power off all nodes of the WSFC cluster.
8. Add one or more Paravirtual SCSI controllers (up to four) to each VM that is part of the WSFC. Use the settings per the previous paragraphs.
9. On the first cluster node, add all needed shared disks using **Add New Device** > **Hard Disk**. Leave Disk sharing as **Unspecified** (default) and Disk mode as **Independent - Persistent**. Then attach them to the controller(s) created in the previous steps.
The following activities aren't supported and might cause WSFC node failover:
- **Validate Storage Spaces Persistent Reservation**. If you aren't using Storage Spaces with your cluster (such as on Azure VMware Solution vSAN), this test isn't applicable. You can ignore any results of the Validate Storage Spaces Persistent Reservation test including this warning. To avoid warnings, you can exclude this test.
- - **Validate Network Communication**. The Cluster Validation test displays a warning indicating that only one network interface per cluster node is available. You can ignore this warning. Azure VMware Solution provides the required availability and performance needed, since the nodes are connected to one of the NSX-T segments. However, keep this item as part of the Cluster Validation test, as it validates other aspects of network communication.
+ - **Validate Network Communication**. The Cluster Validation test displays a warning indicating that only one network interface per cluster node is available. You can ignore this warning. Azure VMware Solution provides the required availability and performance needed, since the nodes are connected to one of the NSX-T Data Center segments. However, keep this item as part of the Cluster Validation test, as it validates other aspects of network communication.
-16. Create a DRS rule to place the WSFC VMs on the same Azure VMware Solution nodes. To do so, you need a host-to-VM affinity rule. This way, cluster nodes will run on the same Azure VMware Solution host. Again, it's for pilot purposes until placement policies are available.
+16. Create a Placement Policy to situate the WSFC VMs on the same Azure VMware Solution nodes. To do so, you need a host-to-VM affinity rule. This way, cluster nodes will run on the same Azure VMware Solution host.
>[!NOTE]
> For this you need to create a Support Request ticket. Our Azure support organization will be able to help you with this.
azure-vmware Deploy Disaster Recovery Using Jetstream https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/deploy-disaster-recovery-using-jetstream.md
Title: Deploy disaster recovery using JetStream DR description: Learn how to implement JetStream DR for your Azure VMware Solution private cloud and on-premises VMware workloads. Previously updated : 11/17/2021 Last updated : 04/11/2022
To learn more about JetStream DR, see:
- [JetStream Solution brief](https://www.jetstreamsoft.com/2020/09/28/solution-brief-disaster-recovery-for-avs/)
-- [JetStream DR on Azure Marketplace](https://portal.azure.com/#create/jetstreamsoftware1596597632545.jsdravs-093020)
+- [JetStream DR on Azure Marketplace](https://ms.portal.azure.com/#blade/Microsoft_Azure_Marketplace/GalleryItemDetailsBladeNopdl/id/jetstreamsoftware1596597632545.jsdravs-111721)
- [JetStream knowledge base articles](https://www.jetstreamsoft.com/resources/knowledge-base/)
To learn more about JetStream DR, see:
| Items | Description |
| --- | --- |
-| **JetStream Management Server Virtual Appliance (MSA)** | MSA enables both Day 0 and Day 2 configuration, such as primary sites, protection domains, and recovering VMs. MSA is installed on a vSphere node by the cloud admin. The MSA implements a vCenter plugin that allows you to manage JetStream DR natively from vCenter. The MSA doesn't handle replication data of protected VMs. |
+| **JetStream Management Server Virtual Appliance (MSA)** | MSA enables both Day 0 and Day 2 configuration, such as primary sites, protection domains, and recovering VMs. MSA is installed on a vSphere node by the cloud admin. The MSA implements a vCenter Server plugin that allows you to manage JetStream DR natively from vCenter Server. The MSA doesn't handle replication data of protected VMs. |
| **JetStream DR Virtual Appliance (DRVA)** | Linux-based Virtual Machine appliance receives protected VMs replication data from the source ESXi host. It's responsible for storing the replication data at the DR site, typically in an object store such as Azure Blob Storage. Depending on the number of protected VMs and the amount of storage to replicate, the private cloudadmin can create one or more DRVA instances. |
-| **JetStream ESXi host components (IO Filter packages)** | JetStream software installed on each ESXi host configured for JetStream DR. The host driver intercepts a vSphere VMs IO and sends the replication data to the DRVA. |
+| **JetStream ESXi host components (IO Filter packages)** | JetStream software installed on each ESXi host configured for JetStream DR. The host driver intercepts the vSphere VMs IO and sends the replication data to the DRVA. |
| **JetStream protection domain** | Logical group of VMs that will be protected together using the same policies and run book. The data for all VMs in a protection domain is stored in the same Azure Blob container instance. The same DRVA instance handles replication to remote DR storage for all VMs in a protection domain. |
| **Azure Blob Storage containers** | The protected VMs replicated data is stored in Azure Blobs. JetStream software creates one Azure Blob container instance for each JetStream protection domain. |
To learn more about JetStream DR, see:
## JetStream scenarios on Azure VMware Solution

You can use JetStream DR with Azure VMware Solution for the following two scenarios:

-- On-premises VMware to Azure VMware Solution DR
+- On-premises VMware vSphere to Azure VMware Solution DR
- Azure VMware Solution to Azure VMware Solution DR
-### Scenario 1: On-premises VMware to Azure VMware Solution DR
+### Scenario 1: On-premises VMware vSphere to Azure VMware Solution DR
-In this scenario, the primary site is your on-premises VMware environment and the DR site is an Azure VMware Solution private cloud.
+In this scenario, the primary site is your on-premises VMware vSphere environment and the DR site is an Azure VMware Solution private cloud.
:::image type="content" source="media/jetstream-disaster-recovery/jetstream-on-premises-to-cloud-diagram.png" alt-text="Diagram showing the on-premises to Azure VMware Solution private cloud JetStream deployment." border="false" lightbox="media/jetstream-disaster-recovery/jetstream-on-premises-to-cloud-diagram.png":::
In this scenario, the primary site is an Azure VMware Solution private cloud in
## Prerequisites
-### Scenario 1: On-premises VMware to Azure VMware Solution DR
+### Scenario 1: On-premises VMware vSphere to Azure VMware Solution DR
- Azure VMware Solution private cloud deployed with a minimum of three nodes in the target DR region.
In this scenario, the primary site is an Azure VMware Solution private cloud in
- Network connectivity configured between the primary site JetStream appliances and the Azure Storage blob instance.

-- [Setup and Subscribe to JetStream DR](https://portal.azure.com/#create/jetstreamsoftware1596597632545.jsdravs-093020) from the Azure Marketplace to download the JetStream DR software.
+- [Setup and Subscribe to JetStream DR](https://ms.portal.azure.com/#blade/Microsoft_Azure_Marketplace/GalleryItemDetailsBladeNopdl/id/jetstreamsoftware1596597632545.jsdravs-111721) from the Azure Marketplace to download the JetStream DR software.
- [Azure Blob Storage account](../storage/common/storage-account-create.md) created using either Standard or Premium Performance tier. For [access tier, select **Hot**](../storage/blobs/access-tiers-overview.md).

  >[!NOTE]
  >The **Enable hierarchical namespace** option on the blob isn't supported.

-- An NSX-T network segment configured on Azure VMware Solution private cloud and optionally enable DHCP on the segment for the JetStream Virtual appliances.
+- An NSX-T Data Center network segment configured on the Azure VMware Solution private cloud, with DHCP optionally enabled on the segment for the JetStream virtual appliances.
-- A DNS server configured to resolve the IP addresses of Azure VMware Solution vCenter, Azure VMware Solution ESXi hosts, Azure Storage account, and the JetStream Marketplace service for the JetStream virtual appliances.
+- A DNS server configured to resolve the IP addresses of Azure VMware Solution vCenter Server, Azure VMware Solution ESXi hosts, Azure Storage account, and the JetStream Marketplace service for the JetStream virtual appliances.
In this scenario, the primary site is an Azure VMware Solution private cloud in
- Network connectivity configured between the primary site JetStream appliances and the Azure Storage blob instance.

-- [Setup and Subscribe to JetStream DR](https://portal.azure.com/#create/jetstreamsoftware1596597632545.jsdravs-093020) from the Azure Marketplace to download the JetStream DR software.
+- [Setup and Subscribe to JetStream DR](https://ms.portal.azure.com/#blade/Microsoft_Azure_Marketplace/GalleryItemDetailsBladeNopdl/id/jetstreamsoftware1596597632545.jsdravs-111721) from the Azure Marketplace to download the JetStream DR software.
- [Azure Blob Storage account](../storage/common/storage-account-create.md) created using either Standard or Premium Performance tier. For [access tier, select **Hot**](../storage/blobs/access-tiers-overview.md).

  >[!NOTE]
  >The **Enable hierarchical namespace** option on the blob isn't supported.

-- An NSX-T network segment configured on Azure VMware Solution private cloud and optionally enable DHCP on the segment for the JetStream Virtual appliances.
+- An NSX-T Data Center network segment configured on the Azure VMware Solution private cloud, with DHCP optionally enabled on the segment for the JetStream virtual appliances.
-- A DNS server configured on both the primary and DR sites to resolve the IP addresses of Azure VMware Solution vCenter, Azure VMware Solution ESXi hosts, Azure Storage account, and the JetStream Marketplace service for the JetStream virtual appliances.
+- A DNS server configured on both the primary and DR sites to resolve the IP addresses of Azure VMware Solution vCenter Server, Azure VMware Solution ESXi hosts, Azure Storage account, and the JetStream Marketplace service for the JetStream virtual appliances.
For more on-premises JetStream DR prerequisites, see the [JetStream Pre-Installation Guide](https://www.jetstreamsoft.com/portal/jetstream-knowledge-base/pre-installation-guidelines/).
You can follow these steps for both supported scenarios.
| **Field** | **Value** |
| --- | --- |
- | **Network** | Name of the NSX-T network segment where you must deploy the JetStream MSA. |
+ | **Network** | Name of the NSX-T Data Center network segment where you must deploy the JetStream MSA. |
| **Datastore** | Name of the datastore where you'll deploy the MSA. |
| **ProtectedCluster** | Name of the Azure VMware Solution private cloud cluster to be protected, for example, **Cluster-1**. You can only provide one cluster name. |
| **Cluster** | Name of the Azure VMware Solution private cluster where the JetStream MSA is deployed, for example, **Cluster-1**. |
Azure VMware Solution supports the installation of JetStream using either static
| **Gateway** | IP address of the network gateway for the JetStream MSA VM. |
| **Credential** | Credentials of the root user of the JetStream MSA VM. |
| **HostName** | Hostname (FQDN) of the JetStream MSA VM. |
- | **Network** | Name of the NSX-T network segment where you must deploy the JetStream MSA. |
+ | **Network** | Name of the NSX-T Data Center network segment where you must deploy the JetStream MSA. |
| **Specify name for execution** | Alphanumeric name of the execution, for example, **Install-JetDRWithStaticIP-Exec1**. It's used to verify if the cmdlet ran successfully. |
This step also installs JetStream vSphere Installation Bundle (VIB) on the clust
| **Cluster** | Name of the Azure VMware Solution private cluster where the JetStream MSA is deployed, for example, **Cluster-1**. |
| **Credential** | Credentials of the root user of the JetStream MSA VM. |
| **HostName** | Hostname (FQDN) of the JetStream MSA VM. |
- | **Network** | Name of the NSX-T network segment where you must deploy the JetStream MSA. |
+ | **Network** | Name of the NSX-T Data Center network segment where you must deploy the JetStream MSA. |
| **Specify name for execution** | Alphanumeric name of the execution, for example, **Install-JetDRWithDHCP-Exec1**. It's used to verify if the cmdlet ran successfully. |
azure-vmware Vmware Hcx Mon Guidance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/azure-vmware/vmware-hcx-mon-guidance.md
Title: VMware HCX Mobility Optimized Networking (MON) guidance description: Learn about Azure VMware Solution-specific use cases for Mobility Optimized Networking (MON). Previously updated : 10/04/2021 Last updated : 04/11/2022 # VMware HCX Mobility Optimized Networking (MON) guidance
In this scenario, we assume a VM from on-premises has been migrated to Azure VMw
By default and without using MON, a VM in Azure VMware Solution on a stretched network communicates back to on-premises using the ExpressRoute preferred path. Ideally, based on the customer's use case, you should evaluate whether a VM on a MON-enabled Azure VMware Solution stretched segment should traverse back to on-premises over the NE or over the T0 gateway via ExpressRoute, while keeping traffic flows symmetric.
-If choosing the NE path for example, the MON policy routes have to specifically address the subnet on the on-premises side; otherwise, the 0.0/0 route is used. Policy routes can be found under the NE segment, selecting advanced. By default, all RFC1918 IP addresses are included in the MON policy routes definition.
+If choosing the NE path for example, the MON policy routes have to specifically address the subnet on the on-premises side; otherwise, the 0.0.0.0/0 default route is used. Policy routes can be found under the NE segment by selecting **Advanced**. By default, all RFC1918 IP addresses are included in the MON policy routes definition.
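As an illustrative sketch only (not part of the HCX product), the default policy-route matching can be modeled with Python's standard `ipaddress` module; the function name and structure here are assumptions for demonstration:

```python
import ipaddress

# Default MON policy routes cover the three RFC1918 private ranges.
RFC1918_NETWORKS = [
    ipaddress.ip_network(cidr)
    for cidr in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")
]

def matches_default_policy_routes(destination: str) -> bool:
    """Return True if a destination IP falls under the default MON policy routes."""
    ip = ipaddress.ip_address(destination)
    return any(ip in network for network in RFC1918_NETWORKS)
```

With the defaults, traffic to a private address such as `192.168.10.5` would match a policy route, while traffic to a public address would fall through to the `0.0.0.0/0` route.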
:::image type="content" source="media/tutorial-vmware-hcx/default-hcx-mon-policy-based-routes.png" alt-text="Screenshot showing the default policy-based routes.":::
backup Backup Azure Delete Vault https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-delete-vault.md
Title: Delete a Microsoft Azure Recovery Services vault description: In this article, learn how to remove dependencies and then delete an Azure Backup Recovery Services vault. Previously updated : 01/28/2022 Last updated : 04/11/2022
If you try to delete the vault without removing the dependencies, you'll encount
## Delete a Recovery Services vault
-> [!VIDEO https://www.microsoft.com/videoplayer/embed/RWQGC5]
+> [!VIDEO https://www.youtube.com/embed/xg_TnyhK34o]
Choose a client:
backup Backup Azure Policy Supported Skus https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-policy-supported-skus.md
Title: Supported VM SKUs for Azure Policy description: 'An article describing the supported VM SKUs (by Publisher, Image Offer and Image SKU) which are supported for the built-in Azure Policies provided by Backup' Previously updated : 03/15/2022 Last updated : 04/08/2022
MicrosoftWindowsServer | WindowsServer | Windows Server 2016 Datacenter (2016-Da
MicrosoftWindowsServer | WindowsServer | Windows Server 2016 Datacenter - Server Core (2016-Datacenter-Server-Core)
MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2016 Datacenter - Server Core (2016-Datacenter-Server-Core-smalldisk)
MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2016 Datacenter (2016-Datacenter-smalldisk)
+MicrosoftWindowsServer | WindowsServer | Windows Server 2016 Datacenter - Gen 2 (2016-Datacenter-gensecond)
MicrosoftWindowsServer | WindowsServer | Windows Server 2016 Datacenter with Containers (2016-Datacenter-with-Containers)
MicrosoftWindowsServer | WindowsServer | Windows Server 2016 Remote Desktop Session Host (2016-Datacenter-with-RDSH)
MicrosoftWindowsServer | WindowsServer | Windows Server 2019 Datacenter (2019-Datacenter)
MicrosoftWindowsServer | WindowsServer | Windows Server 2019 Datacenter (zh-cn)
MicrosoftWindowsServer | WindowsServerSemiAnnual | Datacenter-Core-1709-smalldisk
MicrosoftWindowsServer | WindowsServerSemiAnnual | Datacenter-Core-1709-with-Containers-smalldisk
MicrosoftWindowsServer | WindowsServerSemiAnnual | Datacenter-Core-1803-with-Containers-smalldisk
-MicrosoftWindowsServer | WindowsServer | Windows Server 2019 Datacenter gen2(2019-Datacenter- gensecond)
-MicrosoftWindowsServer | WindowsServer | Windows Server 2022 Datacenter - Gen 2(2022-datacenter-g2)
+MicrosoftWindowsServer | WindowsServer | Windows Server 2019 Datacenter - Gen 2 (2019-Datacenter-gensecond)
+MicrosoftWindowsServer | WindowsServer | Windows Server 2022 Datacenter - Gen 2 (2022-datacenter-g2)
MicrosoftWindowsServer | WindowsServer | Windows Server 2022 Datacenter (2022-datacenter)
MicrosoftWindowsServer | WindowsServer | Windows Server 2022 Datacenter: Azure Edition - Gen 2 (2022-datacenter-azure-edition)
-MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter: Azure Edition - Gen 2(2022-datacenter-azure-edition-smalldisk)
+MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter: Azure Edition - Gen 2 (2022-datacenter-azure-edition-smalldisk)
MicrosoftWindowsServer | WindowsServer | Windows Server 2022 Datacenter: Azure Edition Core - Gen 2 (2022-datacenter-azure-edition-core)
-MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter: Azure Edition Core-Gen 2 (2022-datacenter-azure-edition-core-smalldisk)
-MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter-Gen 2 (2022-datacenter-smalldisk-g2)
-MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter-Gen 1 (2022-datacenter-smalldisk)
+MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter: Azure Edition Core - Gen 2 (2022-datacenter-azure-edition-core-smalldisk)
+MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter - Gen 2 (2022-datacenter-smalldisk-g2)
+MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter - Gen 1 (2022-datacenter-smalldisk)
MicrosoftWindowsServer | WindowsServer | Windows Server 2022 Datacenter Server Core - Gen 2 (2022-datacenter-core-g2)
MicrosoftWindowsServer | WindowsServer | Windows Server 2022 Datacenter Server Core - Gen 1 (2022-datacenter-core)
MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter Server Core - Gen 2 (2022-datacenter-core-smalldisk-g2)
-MicrosoftWindowsServer | WindowsServer | [smalldisk]Windows Server 2022 Datacenter Server Core -Gen 1(2022-datacenter-core-smalldisk)
+MicrosoftWindowsServer | WindowsServer | [smalldisk] Windows Server 2022 Datacenter Server Core - Gen 1 (2022-datacenter-core-smalldisk)
MicrosoftWindowsServerHPCPack | WindowsServerHPCPack | All Image SKUs
MicrosoftSQLServer | SQL2016SP1-WS2016 | All Image SKUs
MicrosoftSQLServer | SQL2016-WS2016 | All Image SKUs
backup Backup Azure Sap Hana Database https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/backup/backup-azure-sap-hana-database.md
More details around using these options are shared below:
Private endpoints allow you to connect securely from servers inside a virtual network to your Recovery Services vault. The private endpoint uses an IP address from the VNet address space for your vault. The network traffic between your resources inside the virtual network and the vault travels over your virtual network and a private link on the Microsoft backbone network. This eliminates exposure from the public internet. For more information, see [private endpoints for Azure Backup](./private-endpoints.md).
+> [!NOTE]
+> Private endpoints are supported for Azure Backup and Azure Storage. Azure AD supports private endpoints in private preview. Until they're generally available, Azure Backup supports setting up a proxy for Azure AD so that no outbound connectivity is required for HANA VMs. For more information, see the [proxy support section](#use-an-http-proxy-server-to-route-traffic).
+
#### NSG tags

If you use Network Security Groups (NSG), use the *AzureBackup* service tag to allow outbound access to Azure Backup. In addition to the Azure Backup tag, you also need to allow connectivity for authentication and data transfer by creating similar [NSG rules](../virtual-network/network-security-groups-overview.md#service-tags) for Azure AD (*AzureActiveDirectory*) and Azure Storage (*Storage*). The following steps describe the process to create a rule for the Azure Backup tag:
You can also use the following FQDNs to allow access to the required services fr
> [!NOTE] > Currently, we only support HTTP Proxy for Azure Active Directory (Azure AD) traffic for SAP HANA. If you need to remove outbound connectivity requirements (for Azure Backup and Azure Storage traffic) for database backups via Azure Backup in HANA VMs, use other options, such as private endpoints.
+##### Using an HTTP proxy server for Azure AD traffic
+
+1. Go to the `/opt/msawb/bin` folder.
+2. Create a new JSON file named `ExtensionSettingsOverrides.json`.
+3. Add key-value pairs to the JSON file as follows:
+
+ ```json
+ {
+ "UseProxyForAAD":true,
+ "UseProxyForAzureBackup":false,
+ "UseProxyForAzureStorage":false,
+ "ProxyServerAddress":"http://xx.yy.zz.mm:port"
+ }
+ ```
+
+4. Change the permissions and ownership of the file as follows:
+
+ ```bash
+ chmod 750 ExtensionSettingsOverrides.json
+ chown root:msawb ExtensionSettingsOverrides.json
+ ```
+
+5. No restart of any service is required. The Azure Backup service will attempt to route the Azure AD traffic via the proxy server mentioned in the JSON file.
+ [!INCLUDE [How to create a Recovery Services vault](../../includes/backup-create-rs-vault.md)]

## Enable Cross Region Restore
cdn Cdn Caching Policy https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-caching-policy.md
Azure Media Services provides [integrated CDN](https://azure.microsoft.com/updat
## Configuring cache headers with Azure Media Services

You can use the Azure portal or Azure Media Services APIs to configure cache header values.
-1. To configure cache headers using Azure portal, refer to [How to Manage Streaming Endpoints](/media-services/previous/media-services-portal-manage-streaming-endpoints) section Configuring the Streaming Endpoint.
+1. To configure cache headers using Azure portal, refer to [How to Manage Streaming Endpoints](/azure/media-services/previous/media-services-portal-manage-streaming-endpoints) section Configuring the Streaming Endpoint.
2. Azure Media Services REST API, [StreamingEndpoint](/rest/api/media/operations/streamingendpoint#StreamingEndpointCacheControl).
3. Azure Media Services .NET SDK, [StreamingEndpointCacheControl Properties](/dotnet/api/microsoft.windowsazure.mediaservices.client.streamingendpointcachecontrol).
cdn Cdn Create New Endpoint https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cdn/cdn-create-new-endpoint.md
After you've created a CDN profile, you use it to create an endpoint.
1. In the Azure portal, select the CDN profile that you created from your dashboard. If you can't find it, you can either open the resource group in which you created it, or use the search bar at the top of the portal, enter the profile name, and select the profile from the results.
1. On the CDN profile page, select **+ Endpoint**.
-
- ![CDN profile](./media/cdn-create-new-endpoint/cdn-select-endpoint.png)
+
+ :::image type="content" source="./media/cdn-create-new-endpoint/cdn-select-endpoint.png" alt-text="Create CDN endpoint.":::
The **Add an endpoint** pane appears.
After you've created a CDN profile, you use it to create an endpoint.
| **Origin port** | Leave the default port values. |
| **Optimized for** | Leave the default selection, **General web delivery**. |
- ![Add endpoint pane](./media/cdn-create-new-endpoint/cdn-add-endpoint.png)
+ :::image type="content" source="./media/cdn-create-new-endpoint/cdn-add-endpoint.png" alt-text="Add endpoint pane.":::
+
3. Select **Add** to create the new endpoint. After the endpoint is created, it appears in the list of endpoints for the profile.
-
- ![CDN endpoint](./media/cdn-create-new-endpoint/cdn-endpoint-success.png)
+
+ :::image type="content" source="./media/cdn-create-new-endpoint/cdn-endpoint-success.png" alt-text="View added endpoint.":::
The time it takes for the endpoint to propagate depends on the pricing tier selected when you created the profile. **Standard Akamai** usually completes within one minute, **Standard Microsoft** in 10 minutes, and **Standard Verizon** and **Premium Verizon** in up to 30 minutes.
cognitive-services Howtocallvisionapi https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Computer-vision/Vision-API-How-to-Topics/HowToCallVisionAPI.md
Previously updated : 01/05/2022 Last updated : 04/11/2022 # Call the Image Analysis API
-This article demonstrates how to call the Image Analysis API to return information about an image's visual features.
+This article demonstrates how to call the Image Analysis API to return information about an image's visual features. It also shows you how to parse the returned information using the client SDKs or REST API.
-This guide assumes you have already <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision" title="created a Computer Vision resource" target="_blank">created a Computer Vision resource </a> and obtained a subscription key and endpoint URL. If you haven't, follow a [quickstart](../quickstarts-sdk/image-analysis-client-library.md) to get started.
+This guide assumes you have already <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision" title="created a Computer Vision resource" target="_blank">created a Computer Vision resource </a> and obtained a subscription key and endpoint URL. If you're using a client SDK, you'll also need to authenticate a client object. If you haven't done these steps, follow the [quickstart](../quickstarts-sdk/image-analysis-client-library.md) to get started.
## Submit data to the service
-You submit either a local image or a remote image to the Analyze API. For local, you put the binary image data in the HTTP request body. For remote, you specify the image's URL by formatting the request body like the following: `{"url":"http://example.com/images/test.jpg"}`.
+The code in this guide uses remote images referenced by URL. You may want to try different images on your own to see the full capability of the Image Analysis features.
+
+#### [REST](#tab/rest)
+
+When analyzing a local image, you put the binary image data in the HTTP request body. For a remote image, you specify the image's URL by formatting the request body like this: `{"url":"http://example.com/images/test.jpg"}`.
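As a hedged sketch under stated assumptions (the endpoint and key values are placeholders you must supply, and the `v3.2` API version is assumed), such a request could be composed in Python; the actual network call is left commented out:

```python
# Sketch of an Analyze call for a remote image; endpoint and key are placeholders.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
subscription_key = "<your-subscription-key>"

analyze_url = endpoint + "/vision/v3.2/analyze"

# For a remote image, the request body is a small JSON object holding the URL.
body = {"url": "http://example.com/images/test.jpg"}

headers = {
    "Ocp-Apim-Subscription-Key": subscription_key,
    "Content-Type": "application/json",
}

# Uncomment to send the request (requires the `requests` package and a valid key):
# import requests
# response = requests.post(analyze_url, headers=headers, json=body)
# print(response.json())
```

For a local image, you would instead send the binary file contents with `Content-Type: application/octet-stream`.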
+
+#### [C#](#tab/csharp)
+
+In your main class, save a reference to the URL of the image you want to analyze.
+
+[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/ComputerVision/ImageAnalysisQuickstart.cs?name=snippet_analyze_url)]
+
+#### [Java](#tab/java)
+
+In your main class, save a reference to the URL of the image you want to analyze.
+
+[!code-java[](~/cognitive-services-quickstart-code/java/ComputerVision/src/main/java/ImageAnalysisQuickstart.java?name=snippet_urlimage)]
+
+#### [JavaScript](#tab/javascript)
+
+In your main function, save a reference to the URL of the image you want to analyze.
+
+[!code-javascript[](~/cognitive-services-quickstart-code/javascript/ComputerVision/ImageAnalysisQuickstart.js?name=snippet_describe_image)]
+
+#### [Python](#tab/python)
+
+Save a reference to the URL of the image you want to analyze.
+
+[!code-python[](~/cognitive-services-quickstart-code/python/ComputerVision/ImageAnalysisQuickstart.py?name=snippet_remoteimage)]
+++

## Determine how to process the data
-### Select visual features
+### Select visual features
+
+The Analyze API gives you access to all of the service's image analysis features. Choose which operations to do based on your own use case. See the [overview](../overview.md) for a description of each feature. The examples below add all of the available visual features, but for practical usage you'll likely only need one or two.
-The [Analyze API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2/operations/56f91f2e778daf14a499f21b) gives you access to all of the service's image analysis features. You need to specify which features you want to use by setting the URL query parameters. A parameter can have multiple values, separated by commas. Each feature you specify will require additional computation time, so only specify what you need.
+#### [REST](#tab/rest)
+
+You can specify which features you want to use by setting the URL query parameters of the [Analyze API](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2/operations/56f91f2e778daf14a499f21b). A parameter can have multiple values, separated by commas. Each feature you specify will require more computation time, so only specify what you need.
|URL parameter | Value | Description|
|---|---|---|
-|`visualFeatures`|`Adult` | detects if the image is pornographic in nature (depicts nudity or a sex act), or is gory (depicts extreme violence or blood). Sexually suggestive content (aka racy content) is also detected.|
+|`visualFeatures`|`Adult` | detects if the image is pornographic in nature (depicts nudity or a sex act), or is gory (depicts extreme violence or blood). Sexually suggestive content ("racy" content) is also detected.|
|`visualFeatures`|`Brands` | detects various brands within an image, including the approximate location. The Brands argument is only available in English.|
-|`visualFeatures`|`Categories` | categorizes image content according to a taxonomy defined in documentation. This is the default value of `visualFeatures`.|
+|`visualFeatures`|`Categories` | categorizes image content according to a taxonomy defined in documentation. This value is the default value of `visualFeatures`.|
|`visualFeatures`|`Color` | determines the accent color, dominant color, and whether an image is black&white.|
|`visualFeatures`|`Description` | describes the image content with a complete sentence in supported languages.|
|`visualFeatures`|`Faces` | detects if faces are present. If present, generate coordinates, gender and age.|
The [Analyze API](https://westus.dev.cognitive.microsoft.com/docs/services/compu
|`details`| `Celebrities` | identifies celebrities if detected in the image.|
|`details`|`Landmarks` | identifies landmarks if detected in the image.|
-A populated URL might look like the following:
+A populated URL might look like this:
`https://{endpoint}/vision/v2.1/analyze?visualFeatures=Description,Tags&details=Celebrities`
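As a small illustration, the query string of such a URL can be assembled with Python's standard `urllib.parse.urlencode`, keeping the commas unescaped via the `safe` argument:

```python
from urllib.parse import urlencode

# Build the visualFeatures/details query string; safe="," keeps commas literal.
params = {"visualFeatures": "Description,Tags", "details": "Celebrities"}
query = urlencode(params, safe=",")
# query == "visualFeatures=Description,Tags&details=Celebrities"
```

You would then append `?` plus this query string to the Analyze endpoint URL.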
+#### [C#](#tab/csharp)
+
+Define your new method for image analysis. Add the code below, which specifies visual features you'd like to extract in your analysis. See the **[VisualFeatureTypes](/dotnet/api/microsoft.azure.cognitiveservices.vision.computervision.models.visualfeaturetypes)** enum for a complete list.
+
+[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/ComputerVision/ImageAnalysisQuickstart.cs?name=snippet_visualfeatures)]
++
+#### [Java](#tab/java)
+
+Specify which visual features you'd like to extract in your analysis. See the [VisualFeatureTypes](/java/api/com.microsoft.azure.cognitiveservices.vision.computervision.models.visualfeaturetypes) enum for a complete list.
+
+[!code-java[](~/cognitive-services-quickstart-code/java/ComputerVision/src/main/java/ImageAnalysisQuickstart.java?name=snippet_features_remote)]
+
+#### [JavaScript](#tab/javascript)
+
+Specify which visual features you'd like to extract in your analysis. See the [VisualFeatureTypes](/javascript/api/@azure/cognitiveservices-computervision/visualfeaturetypes?view=azure-node-latest) enum for a complete list.
+
+[!code-javascript[](~/cognitive-services-quickstart-code/javascript/ComputerVision/ImageAnalysisQuickstart.js?name=snippet_features_remote)]
+
+#### [Python](#tab/python)
+
+Specify which visual features you'd like to extract in your analysis. See the [VisualFeatureTypes](/python/api/azure-cognitiveservices-vision-computervision/azure.cognitiveservices.vision.computervision.models.visualfeaturetypes?view=azure-python) enum for a complete list.
+
+[!code-python[](~/cognitive-services-quickstart-code/python/ComputerVision/ImageAnalysisQuickstart.py?name=snippet_features_remote)]
+++++

### Specify languages
-You can also specify the language of the returned data. The following URL query parameter specifies the language. The default value is `en`.
+You can also specify the language of the returned data.
+
+#### [REST](#tab/rest)
+
+The following URL query parameter specifies the language. The default value is `en`.
|URL parameter | Value | Description|
|---|---|---|
You can also specify the language of the returned data. The following URL query
|`language`|`pt` | Portuguese|
|`language`|`zh` | Simplified Chinese|
-A populated URL might look like the following:
+A populated URL might look like this:
`https://{endpoint}/vision/v2.1/analyze?visualFeatures=Description,Tags&details=Celebrities&language=en`
+#### [C#](#tab/csharp)
+
+Use the *language* parameter of [AnalyzeImageAsync](/dotnet/api/microsoft.azure.cognitiveservices.vision.computervision.computervisionclientextensions.analyzeimageasync?view=azure-dotnet#microsoft-azure-cognitiveservices-vision-computervision-computervisionclientextensions-analyzeimageasync(microsoft-azure-cognitiveservices-vision-computervision-icomputervisionclient-system-string-system-collections-generic-ilist((system-nullable((microsoft-azure-cognitiveservices-vision-computervision-models-visualfeaturetypes))))-system-collections-generic-ilist((system-nullable((microsoft-azure-cognitiveservices-vision-computervision-models-details))))-system-string-system-collections-generic-ilist((system-nullable((microsoft-azure-cognitiveservices-vision-computervision-models-descriptionexclude))))-system-string-system-threading-cancellationtoken)) call to specify a language. A method call that specifies a language might look like the following.
+
+```csharp
+ImageAnalysis results = await client.AnalyzeImageAsync(imageUrl, visualFeatures: features, language: "en");
+```
+
+#### [Java](#tab/java)
+
+Use the [AnalyzeImageOptionalParameter](/java/api/com.microsoft.azure.cognitiveservices.vision.computervision.models.analyzeimageoptionalparameter) input in your Analyze call to specify a language. A method call that specifies a language might look like the following.
++
+```java
+ImageAnalysis analysis = compVisClient.computerVision().analyzeImage().withUrl(pathToRemoteImage)
+ .withVisualFeatures(featuresToExtractFromLocalImage)
+ .language("en")
+ .execute();
+```
+
+#### [JavaScript](#tab/javascript)
+
+Use the **language** property of the [ComputerVisionClientAnalyzeImageOptionalParams](/javascript/api/@azure/cognitiveservices-computervision/computervisionclientanalyzeimageoptionalparams) input in your Analyze call to specify a language. A method call that specifies a language might look like the following.
+
+```javascript
+const result = (await computerVisionClient.analyzeImage(imageURL,{visualFeatures: features, language: 'en'}));
+```
+
+#### [Python](#tab/python)
+
+Use the *language* parameter of your [analyze_image](/python/api/azure-cognitiveservices-vision-computervision/azure.cognitiveservices.vision.computervision.operations.computervisionclientoperationsmixin?view=azure-python#azure-cognitiveservices-vision-computervision-operations-computervisionclientoperationsmixin-analyze-image) call to specify a language. A method call that specifies a language might look like the following.
+
+```python
+results_remote = computervision_client.analyze_image(remote_image_url , remote_image_features, remote_image_details, 'en')
+```
++++
+## Get results from the service
+
+This section shows you how to parse the results of the API call. It includes the API call itself.
+
> [!NOTE]
> **Scoped API calls**
>
-> Some of the features in Image Analysis can be called directly as well as through the Analyze API call. For example, you can do a scoped analysis of only image tags by making a request to `https://{endpoint}/vision/v3.2/tag`. See the [reference documentation](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2/operations/56f91f2e778daf14a499f21b) for other features that can be called separately.
+> Some of the features in Image Analysis can be called directly as well as through the Analyze API call. For example, you can do a scoped analysis of only image tags by making a request to `https://{endpoint}/vision/v3.2/tag` (or to the corresponding method in the SDK). See the [reference documentation](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2/operations/56f91f2e778daf14a499f21b) for other features that can be called separately.
-## Get results from the service
+#### [REST](#tab/rest)
-The service returns a `200` HTTP response, and the body contains the returned data in the form of a JSON string. The following is an example of a JSON response.
+The service returns a `200` HTTP response, and the body contains the returned data in the form of a JSON string. The following text is an example of a JSON response.
```json
{
See the following list of possible errors and their causes:
* `InvalidImageUrl` - Image URL is badly formatted or not accessible.
* `InvalidImageFormat` - Input data is not a valid image.
* `InvalidImageSize` - Input image is too large.
- * `NotSupportedVisualFeature` - Specified feature type is not valid.
+ * `NotSupportedVisualFeature` - Specified feature type isn't valid.
* `NotSupportedImage` - Unsupported image, for example child pornography.
* `InvalidDetails` - Unsupported `detail` parameter value.
- * `NotSupportedLanguage` - The requested operation is not supported in the language specified.
- * `BadArgument` - Additional details are provided in the error message.
-* 415 - Unsupported media type error. The Content-Type is not in the allowed types:
+ * `NotSupportedLanguage` - The requested operation isn't supported in the language specified.
+ * `BadArgument` - More details are provided in the error message.
+* 415 - Unsupported media type error. The Content-Type isn't in the allowed types:
  * For an image URL, Content-Type should be `application/json`
  * For binary image data, Content-Type should be `application/octet-stream` or `multipart/form-data`
* 500
See the following list of possible errors and their causes:
* `Timeout` - Image processing timed out.
* `InternalServerError`
+
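The error codes above can be turned into readable diagnostics on the client side. The following helper is illustrative only (it is not part of any SDK), and the shape of the parsed JSON error body (`error.code`, `error.message`) is an assumption for the sketch.

```python
def describe_analyze_error(status_code, body):
    """Map an Image Analysis error response to a readable message.

    `body` is the parsed JSON error payload; the field names used here
    are simplified for illustration.
    """
    causes = {
        "InvalidImageUrl": "Image URL is badly formatted or not accessible.",
        "InvalidImageFormat": "Input data is not a valid image.",
        "InvalidImageSize": "Input image is too large.",
        "NotSupportedVisualFeature": "Specified feature type isn't valid.",
        "InvalidDetails": "Unsupported 'detail' parameter value.",
        "NotSupportedLanguage": "Operation isn't supported in that language.",
        "Timeout": "Image processing timed out.",
    }
    if status_code == 200:
        return "OK"
    if status_code == 415:
        return "Unsupported media type: check the Content-Type header."
    code = body.get("error", {}).get("code", "Unknown")
    detail = causes.get(code, body.get("error", {}).get("message", ""))
    return f"{status_code} {code}: {detail}"
```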
+#### [C#](#tab/csharp)
+
+The following code calls the Image Analysis API and prints the results to the console.
+
+[!code-csharp[](~/cognitive-services-quickstart-code/dotnet/ComputerVision/ImageAnalysisQuickstart.cs?name=snippet_analyze)]
+
+#### [Java](#tab/java)
+
+The following code calls the Image Analysis API and prints the results to the console.
+
+[!code-java[](~/cognitive-services-quickstart-code/java/ComputerVision/src/main/java/ImageAnalysisQuickstart.java?name=snippet_analyze)]
+
+#### [JavaScript](#tab/javascript)
+
+The following code calls the Image Analysis API and prints the results to the console.
+
+[!code-javascript[](~/cognitive-services-quickstart-code/javascript/ComputerVision/ImageAnalysisQuickstart.js?name=snippet_analyze)]
+
+#### [Python](#tab/python)
+
+The following code calls the Image Analysis API and prints the results to the console.
+
+[!code-python[](~/cognitive-services-quickstart-code/python/ComputerVision/ImageAnalysisQuickstart.py?name=snippet_analyze)]
++++
> [!TIP]
> While working with Computer Vision, you might encounter transient failures caused by [rate limits](https://azure.microsoft.com/pricing/details/cognitive-services/computer-vision/) enforced by the service, or other transient problems like network outages. For information about handling these types of failures, see [Retry pattern](/azure/architecture/patterns/retry) in the Cloud Design Patterns guide, and the related [Circuit Breaker pattern](/azure/architecture/patterns/circuit-breaker).

## Next steps
-To try out the REST API, go to the [Image Analysis API Reference](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2/operations/56f91f2e778daf14a499f21b).
+* Explore the [concept articles](../concept-object-detection.md) to learn more about each feature.
+* See the [API reference](https://westus.dev.cognitive.microsoft.com/docs/services/computer-vision-v3-2/operations/56f91f2e778daf14a499f21b) to learn more about the API functionality.
cognitive-services Client Library https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Computer-vision/quickstarts-sdk/client-library.md
Previously updated : 02/02/2022 Last updated : 03/02/2022 ms.devlang: csharp, golang, java, javascript, python
Get started with the Computer Vision Read REST API or client libraries. The Read
::: zone-end --- ::: zone pivot="programming-language-rest-api" [!INCLUDE [REST API quickstart](../includes/curl-quickstart.md)]
cognitive-services Image Analysis Client Library https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Computer-vision/quickstarts-sdk/image-analysis-client-library.md
keywords: computer vision, computer vision service
# Quickstart: Use the Image Analysis client library or REST API
-Get started with the Image Analysis REST API or client libraries. The Analyze Image service provides you with AI algorithms for processing images and returning information on their visual features. Follow these steps to install a package to your application and try out the sample code for basic tasks.
--
+Get started with the Image Analysis REST API or client libraries. The Analyze Image service provides you with AI algorithms for processing images and returning information on their visual features. Follow these steps to install a package to your application and try out the sample code for a basic task.
::: zone pivot="programming-language-csharp"
Get started with the Image Analysis REST API or client libraries. The Analyze Im
::: zone-end --- ::: zone pivot="programming-language-rest-api" [!INCLUDE [REST API quickstart](../includes/image-analysis-curl-quickstart.md)]
cognitive-services Video Moderation Api https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Content-Moderator/video-moderation-api.md
The Content Moderator's video moderation capability is available as a free publi
### Create an Azure Media Services account
-Follow the instructions in [Create an Azure Media Services account](/media-services/previous/media-services-portal-create-account) to subscribe to AMS and create an associated Azure storage account. In that storage account, create a new Blob storage container.
+Follow the instructions in [Create an Azure Media Services account](/azure/media-services/previous/media-services-portal-create-account) to subscribe to AMS and create an associated Azure storage account. In that storage account, create a new Blob storage container.
### Create an Azure Active Directory application
In the **Azure AD app** section, select **Create New** and name your new Azure A
Select your app registration and click the **Manage application** button below it. Note the value in the **Application ID** field; you will need this later. Select **Settings** > **Keys**, and enter a description for a new key (such as "VideoModKey"). Click **Save**, and then notice the new key value. Copy this string and save it somewhere secure.
-For a more thorough walkthrough of the above process, See [Get started with Azure AD authentication](/media-services/previous/media-services-portal-get-started-with-aad).
+For a more thorough walkthrough of the above process, See [Get started with Azure AD authentication](/azure/media-services/previous/media-services-portal-get-started-with-aad).
Once you've done this, you can use the video moderation media processor in two different ways.
After the Content Moderation job is completed, analyze the JSON response. It con
## Next steps
-[Download the Visual Studio solution](https://github.com/Azure-Samples/cognitive-services-dotnet-sdk-samples/tree/master/ContentModerator) for this and other Content Moderator quickstarts for .NET.
+[Download the Visual Studio solution](https://github.com/Azure-Samples/cognitive-services-dotnet-sdk-samples/tree/master/ContentModerator) for this and other Content Moderator quickstarts for .NET.
cognitive-services Get Started Build Detector https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Custom-Vision-Service/get-started-build-detector.md
To upload another set of images, return to the top of this section and repeat th
## Train the detector
-To train the detector model, select the **Train** button. The detector uses all of the current images and their tags to create a model that identifies each tagged object.
+To train the detector model, select the **Train** button. The detector uses all of the current images and their tags to create a model that identifies each tagged object. This process can take several minutes.
![The train button in the top right of the web page's header toolbar](./media/getting-started-build-a-classifier/train01.png)
cognitive-services Getting Started Build A Classifier https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Custom-Vision-Service/getting-started-build-a-classifier.md
If you don't have an Azure subscription, create a [free account](https://azure.m
## Prerequisites

-- A set of images with which to train your classifier. See below for tips on choosing images.
+- A set of images with which to train your classifier. You can use the set of [sample images](https://github.com/Azure-Samples/cognitive-services-sample-data-files/tree/master/CustomVision/ImageClassification/Images) on GitHub. Or, you can choose your own images using the tips below.
- A [supported web browser](overview.md#supported-browsers-for-custom-vision-web-portal)
To upload another set of images, return to the top of this section and repeat th
## Train the classifier
-To train the classifier, select the **Train** button. The classifier uses all of the current images to create a model that identifies the visual qualities of each tag.
+To train the classifier, select the **Train** button. The classifier uses all of the current images to create a model that identifies the visual qualities of each tag. This process can take several minutes.
![The train button in the top right of the web page's header toolbar](./media/getting-started-build-a-classifier/train01.png)
cognitive-services How To Custom Voice Create Voice https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/how-to-custom-voice-create-voice.md
The issues are divided into three types. Refer to the following tables to check
**Auto-rejected**
-Data with these errors will be excluded during training.
+Data with these errors will not be used for training. Imported data with errors is ignored, so you don't need to delete it. You can resubmit the corrected data for training.
| Category | Name | Description | | | -- | |
cognitive-services Record Custom Voice Samples https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cognitive-services/Speech-Service/record-custom-voice-samples.md
Below are some general guidelines that you can follow to create a good corpus (r
With that, make sure your voice talent pronounces these words in the expected way. Keep your script and recordings consistent during the training process.
- > [!NOTE]
- > The scripts prepared for your voice talent need to follow the native reading conventions, such as 50% and $45, while the scripts used for training need to be normalized to make sure that the scripts match the audio content, such as *fifty percent* and *forty-five dollars*. Check the scripts used for training against the recordings of your voice talent, to make sure they match.
- Your script should include many different words and sentences with different kinds of sentence lengths, structures, and moods.
- Check the script carefully for errors. If possible, have someone else check it too. When you run through the script with your talent, you'll probably catch a few more mistakes.
+### Difference between voice talent script and training script
+
+The training script can differ from the voice talent script, especially for scripts that contain digits, symbols, abbreviations, date, and time. Scripts prepared for the voice talent must follow the native reading conventions, such as 50% and $45. The scripts used for training must be normalized to match the audio recording, such as *fifty percent* and *forty-five dollars*.
+
+> [!NOTE]
+> We provide some example scripts for the voice talent on [GitHub](https://github.com/Azure-Samples/Cognitive-Speech-TTS/tree/master/CustomVoice/script). To use the example scripts for training, you must normalize them according to the recordings of your voice talent before uploading the file.
+
+The following table shows the difference between scripts for voice talent and the normalized script for training.
+
+| Category |Voice talent script example | Training script example (normalized) |
+| | | |
+| Digits |123| one hundred and twenty-three |
+| Symbols |50%| fifty percent|
+| Abbreviation |ASAP| as soon as possible|
+| Date and time |March 3rd at 5:00 PM| March third at five PM|
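The normalization in the table above can be sketched as a simple substitution pass. The replacement table below is illustrative only and covers just the examples above; a real normalizer needs full text-normalization rules for digits, currency, dates, and times.

```python
# Illustrative sketch: turn a voice talent script line into a normalized
# training script line using a hand-built replacement table.
REPLACEMENTS = {
    "123": "one hundred and twenty-three",
    "50%": "fifty percent",
    "ASAP": "as soon as possible",
    "March 3rd at 5:00 PM": "March third at five PM",
}

def normalize_script(line):
    # Apply longer patterns first so a date phrase is replaced as a whole
    # before its digits are touched.
    for pattern in sorted(REPLACEMENTS, key=len, reverse=True):
        line = line.replace(pattern, REPLACEMENTS[pattern])
    return line

print(normalize_script("Delivery is at 50% capacity, ASAP."))
```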
+ ### Typical defects of a script The script's poor quality can adversely affect the training results. To achieve high-quality training results, it's crucial to avoid the defects.
cosmos-db Glowroot Cassandra https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/cassandra/glowroot-cassandra.md
Glowroot is an application performance management tool used to optimize and moni
## Prerequisites and Setup * [Create an Azure Cosmos DB Cassandra API account](manage-data-java.md#create-a-database-account).
-* [Install JAVA (version 8) for Windows](https://developers.redhat.com/products/openjdk/download)
+* [Install Java (version 8) for Windows](https://developers.redhat.com/products/openjdk/download)
> [!NOTE]
-> Note that there are certain known incompatible build targets with newer versions. If you already have a newer version of JAVA, you can still download JDK8.
-> If you have newer JAVA installed in addition to JDK8: Set the %JAVA_HOME% variable in the local command prompt to target JDK8. This will only change Java version for the current session and leave global machine settings intact.
+> Note that there are certain known incompatible build targets with newer versions. If you already have a newer version of Java, you can still download JDK8.
+> If you have newer Java installed in addition to JDK8: Set the %JAVA_HOME% variable in the local command prompt to target JDK8. This will only change Java version for the current session and leave global machine settings intact.
* [Install maven](https://maven.apache.org/download.cgi) * Verify successful installation by running: `mvn --version`
cosmos-db Feature Support 36 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/feature-support-36.md
description: Learn about Azure Cosmos DB's API for MongoDB (3.6 version) support
Previously updated : 03/02/2021 Last updated : 04/04/2022
Azure Cosmos DB's API for MongoDB supports the following database commands:
||| | TTL | Yes | | Unique | Yes |
-| Partial | No |
+| Partial | Only supported with unique indexes |
| Case Insensitive | No | | Sparse | No | | Background | Yes |
Azure Cosmos DB's API for MongoDB supports the following database commands:
| Command | Supported | |||
-| $expr | No |
+| $expr | Yes |
| $jsonSchema | No | | $mod | Yes | | $regex | Yes |
cosmos-db Feature Support 40 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/feature-support-40.md
description: Learn about Azure Cosmos DB's API for MongoDB 4.0 server version su
Previously updated : 08/26/2021 Last updated : 04/05/2022
In an [upgrade scenario](upgrade-mongodb-version.md), documents written prior to
||| | TTL | Yes | | Unique | Yes |
-| Partial | No |
+| Partial | Only supported with unique indexes |
| Case Insensitive | No | | Sparse | No | | Background | Yes |
In an [upgrade scenario](upgrade-mongodb-version.md), documents written prior to
| Command | Supported | |||
-| $expr | No |
+| $expr | Yes |
| $jsonSchema | No | | $mod | Yes | | $regex | Yes |
Azure Cosmos DB does not yet support server-side sessions commands.
## Time-to-live (TTL)
-Azure Cosmos DB supports a time-to-live (TTL) based on the timestamp of the document. TTL can be enabled for collections by going to the [Azure portal](https://portal.azure.com).
+Azure Cosmos DB supports a time-to-live (TTL) based on the timestamp of the document. TTL can be enabled for collections from the [Azure portal](https://portal.azure.com).
## Transactions
Some applications rely on a [Write Concern](https://docs.mongodb.com/manual/refe
- Learn how to [use Robo 3T](connect-using-robomongo.md) with Azure Cosmos DB's API for MongoDB. - Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB's API for MongoDB. - Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
- - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
+ - If all you know is the number of vCores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md)
- If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md)
cosmos-db Feature Support 42 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/feature-support-42.md
description: Learn about Azure Cosmos DB API for MongoDB 4.2 server version supp
Previously updated : 02/23/2022 Last updated : 04/05/2022
In an [upgrade scenario](upgrade-mongodb-version.md), documents written prior to
||| | TTL | Yes | | Unique | Yes |
-| Partial | No |
+| Partial | Only supported with unique indexes |
| Case Insensitive | No | | Sparse | No | | Background | Yes |
In an [upgrade scenario](upgrade-mongodb-version.md), documents written prior to
| Command | Supported | |||
-| $expr | No |
+| $expr | Yes |
| $jsonSchema | No | | $mod | Yes | | $regex | Yes |
Azure Cosmos DB does not yet support server-side sessions commands.
## Time-to-live (TTL)
-Azure Cosmos DB supports a time-to-live (TTL) based on the timestamp of the document. TTL can be enabled for collections by going to the [Azure portal](https://portal.azure.com).
+Azure Cosmos DB supports a time-to-live (TTL) based on the timestamp of the document. TTL can be enabled for collections from the [Azure portal](https://portal.azure.com).
## Transactions
Some applications rely on a [Write Concern](https://docs.mongodb.com/manual/refe
- Learn how to [use Robo 3T](connect-using-robomongo.md) with Azure Cosmos DB API for MongoDB. - Explore MongoDB [samples](nodejs-console-app.md) with Azure Cosmos DB API for MongoDB. - Trying to do capacity planning for a migration to Azure Cosmos DB? You can use information about your existing database cluster for capacity planning.
- - If all you know is the number of vcores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md).
+ - If all you know is the number of vCores and servers in your existing database cluster, read about [estimating request units using vCores or vCPUs](../convert-vcore-to-request-unit.md).
- If you know typical request rates for your current database workload, read about [estimating request units using Azure Cosmos DB capacity planner](estimate-ru-capacity-planner.md).
cosmos-db Mongodb Indexing https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/mongodb/mongodb-indexing.md
ms.devlang: javascript Previously updated : 10/13/2021 Last updated : 4/5/2022
The following operations are common for accounts serving wire protocol version 4
[Unique indexes](../unique-keys.md) are useful for enforcing that two or more documents do not contain the same value for indexed fields.
-> [!IMPORTANT]
-> Unique indexes can be created only when the collection is empty (contains no documents).
- The following command creates a unique index on the field `student_id`: ```shell
In the preceding example, omitting the ```"university":1``` clause returns an er
`cannot create unique index over {student_id : 1.0} with shard key pattern { university : 1.0 }`
+#### Limitations
+
+On accounts that have continuous backup or Azure Synapse Link enabled, unique indexes must be created while the collection is empty.
+
+#### Unique partial indexes
+
+Unique partial indexes can be created by specifying a `partialFilterExpression` along with the `unique` constraint in the index. The unique constraint is then applied only to the documents that meet the specified filter expression.
+
+The unique constraint isn't enforced for documents that don't meet the filter criteria, so those documents can still be inserted into the collection.
+
+This feature is supported with the Cosmos DB API for MongoDB versions 3.6 and above.
+
+To create a unique partial index from the Mongo shell, use the command `db.collection.createIndex()` with the `partialFilterExpression` option and the `unique` constraint.
+The `partialFilterExpression` option accepts a JSON document that specifies the filter condition using:
+
+* equality expressions (that is, `field: value` or the `$eq` operator)
+* `$exists: true` expressions
+* `$gt`, `$gte`, `$lt`, `$lte` expressions
+* `$type` expressions
+* the `$and` operator, at the top level only
+
+The following command creates an index on collection `books` that specifies a unique constraint on the `title` field and a partial filter expression `rating: { $gte: 3 }`:
+
+```shell
+db.books.createIndex(
+   { title: 1 },
+ { unique: true, partialFilterExpression: { rating: { $gte: 3 } } }
+)
+```
+
+To delete a partial unique index from the Mongo shell, first use the command `getIndexes()` to list the indexes in the collection.
+Then drop the index with the following command:
+
+```shell
+db.books.dropIndex("indexName")
+```
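The semantics above can be illustrated outside the database. The following is a plain-Python sketch of partial unique index enforcement (not Cosmos DB or driver code), assuming a minimal filter of the form `{field: {"$gte": value}}`: a duplicate key is rejected only when both the new and the existing document satisfy the filter.

```python
def matches_filter(doc, partial_filter):
    """Evaluate a minimal partialFilterExpression: {field: {"$gte": value}}."""
    for field, cond in partial_filter.items():
        if field not in doc or doc[field] < cond["$gte"]:
            return False
    return True

def insert_with_partial_unique(collection, doc, key, partial_filter):
    """Reject the insert only when the unique key collides among documents
    that satisfy the partial filter."""
    if matches_filter(doc, partial_filter):
        for existing in collection:
            if matches_filter(existing, partial_filter) and existing.get(key) == doc.get(key):
                raise ValueError(f"duplicate key: {key}={doc.get(key)!r}")
    collection.append(doc)

books = []
flt = {"rating": {"$gte": 3}}
insert_with_partial_unique(books, {"title": "A", "rating": 5}, "title", flt)
# Allowed: rating 1 doesn't meet the filter, so uniqueness isn't enforced.
insert_with_partial_unique(books, {"title": "A", "rating": 1}, "title", flt)
```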
+
### TTL indexes

To enable document expiration in a particular collection, you need to create a [time-to-live (TTL) index](../time-to-live.md). A TTL index is an index on the `_ts` field with an `expireAfterSeconds` value.
cosmos-db Best Practice Java https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cosmos-db/sql/best-practice-java.md
This article walks through the best practices for using the Azure Cosmos DB Java
| <input type="checkbox"/> | Indexing | The Azure Cosmos DB indexing policy also allows you to specify which document paths to include or exclude from indexing by using indexing paths `IndexingPolicy#getIncludedPaths()` and `IndexingPolicy#getExcludedPaths()`. Ensure that you exclude unused paths from indexing for faster writes. For a sample on how to create indexes using the SDK [visit here](performance-tips-java-sdk-v4-sql.md#indexing-policy) | | <input type="checkbox"/> | Document Size | The request charge of a specified operation correlates directly to the size of the document. We recommend reducing the size of your documents as operations on large documents cost more than operations on smaller documents. | | <input type="checkbox"/> | Enabling Query Metrics | For additional logging of your backend query executions, follow instructions on how to capture SQL Query Metrics using [Java SDK](troubleshoot-java-sdk-v4-sql.md#query-operations) |
-| <input type="checkbox"/> | SDK Logging | Use SDK logging to capture additional diagnostics information and troubleshoot latency issues. Log the [CosmosDiagnostics](/jav#capture-the-diagnostics) |
+| <input type="checkbox"/> | SDK Logging | Use SDK logging to capture additional diagnostics information and troubleshoot latency issues. Log the [CosmosDiagnostics](/java/api/com.azure.cosmos.cosmosdiagnostics?view=azure-java-stable&preserve-view=true) in Java SDK for more detailed cosmos diagnostic information for the current request to the service. As an example use case, capture Diagnostics on any exception and on completed operations if the `CosmosDiagnostics#getDuration()` is greater than a designated threshold value (i.e. if you have an SLA of 10 seconds, then capture diagnostics when `getDuration()` > 10 seconds). It's advised to only use these diagnostics during performance testing. For more information, follow [capture diagnostics on Java SDK](/azure/cosmos-db/sql/troubleshoot-java-sdk-v4-sql#capture-the-diagnostics) |
## Best practices when using Gateway mode Azure Cosmos DB requests are made over HTTPS/REST when you use Gateway mode. They're subject to the default connection limit per hostname or IP address. You might need to tweak [maxConnectionPoolSize](/java/api/com.azure.cosmos.gatewayconnectionconfig.setmaxconnectionpoolsize?view=azure-java-stable#com-azure-cosmos-gatewayconnectionconfig-setmaxconnectionpoolsize(int)&preserve-view=true) to a different value (from 100 through 1,000) so that the client library can use multiple simultaneous connections to Azure Cosmos DB. In Java v4 SDK, the default value for `GatewayConnectionConfig#maxConnectionPoolSize` is 1000. To change the value, you can set `GatewayConnectionConfig#maxConnectionPoolSize` to a different value.
cost-management-billing Analyze Cost Data Azure Cost Management Power Bi Template App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/cost-management-billing/costs/analyze-cost-data-azure-cost-management-power-bi-template-app.md
Title: Analyze Azure costs with the Power BI App
description: This article explains how to install and use the Cost Management Power BI App. Previously updated : 04/05/2022 Last updated : 04/08/2022
The following reports are available in the app.
- Azure Marketplace charges - Overages and total charges
+The Billing account overview page might show costs that differ from costs shown in the EA portal.
+
+>[!NOTE]
+>The **Select date range** selector doesn't affect or change overview tiles. Instead, the overview tiles show the costs for the current billing month. This behavior is intentional.
+
+Data shown in the bar graph is determined by the date selection.
+
+Here's how values in the overview tiles are calculated.
+
+- The value shown in the **Charges against credit** tile is calculated as the sum of `adjustments`.
+- The value shown in the **Service overage** tile is calculated as the sum of `ServiceOverage`.
+- The value shown in the **Billed separately** tile is calculated as the sum of `chargesBilledseparately`.
+- The value shown in the **Azure Marketplace** tile is calculated as the sum of `azureMarketplaceServiceCharges`.
+- The value shown in the **New purchase amount** tile is calculated as the sum of `newPurchases`.
+- The value shown in the **Total charges** tile is calculated as the sum of (`adjustments` + `ServiceOverage` + `chargesBilledseparately` + `azureMarketplaceServiceCharges`).
+
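The tile calculations above amount to simple sums over the billing records. The following sketch shows the aggregation using the field names from the list; the sample records are made up for illustration.

```python
# Hypothetical billing records with the fields named in the tile list above.
records = [
    {"adjustments": 100.0, "ServiceOverage": 20.0,
     "chargesBilledseparately": 5.0, "azureMarketplaceServiceCharges": 2.5,
     "newPurchases": 0.0},
    {"adjustments": 50.0, "ServiceOverage": 10.0,
     "chargesBilledseparately": 0.0, "azureMarketplaceServiceCharges": 7.5,
     "newPurchases": 30.0},
]

def tile(field):
    """Sum one billing field across all records, as a tile does."""
    return sum(r[field] for r in records)

# Total charges tile: sum of the four component fields.
total_charges = (tile("adjustments") + tile("ServiceOverage")
                 + tile("chargesBilledseparately")
                 + tile("azureMarketplaceServiceCharges"))
print(total_charges)
```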
+The EA portal doesn't show the Total charges column. The Power BI template app includes Adjustments, Service Overage, Charges billed separately, and Azure Marketplace service charges as Total charges.
+
+The Prepayment Usage shown in the EA portal isn't available in the Template app as part of the total charges.
+ **Usage by Subscriptions and Resource Groups** - Provides a cost over time view and charts showing cost by subscription and resource group. **Usage by Services** - Provides a view over time of usage by MeterCategory. You can track your usage data and drill into any anomalies to understand usage spikes or dips.
data-factory Tutorial Data Flow Delta Lake https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/data-factory/tutorial-data-flow-delta-lake.md
In this step, you'll create a pipeline that contains a data flow activity.
## Build transformation logic in the data flow canvas
-You will generate two data flows in this tutorial. The fist data flow is a simple source to sink to generate a new Delta Lake from the movies CSV file from above. Lastly, you'll create this flow design below to update data in Delta Lake.
+You will generate two data flows in this tutorial. The first data flow is a simple source to sink to generate a new Delta Lake from the movies CSV file from above. Lastly, you'll create this flow design below to update data in Delta Lake.
:::image type="content" source="media/data-flow/data-flow-tutorial-6.png" alt-text="Final flow":::
databox-online Azure Stack Edge Gpu Deploy Virtual Machine Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/databox-online/azure-stack-edge-gpu-deploy-virtual-machine-portal.md
Previously updated : 08/03/2021 Last updated : 04/11/2022 # Customer intent: As an IT admin, I need to understand how to configure compute on an Azure Stack Edge Pro GPU device so that I can use it to transform data before I send it to Azure.
Follow these steps to connect to a Windows VM.
## Next steps
+- [Deploy a cloud managed VM via a script](https://github.com/Azure-Samples/azure-stack-edge-deploy-vms/blob/master/dotnetSamples/CloudManaged/README.md)
- [Deploy a GPU VM](azure-stack-edge-gpu-deploy-gpu-virtual-machine.md) - [Troubleshoot VM deployment](azure-stack-edge-gpu-troubleshoot-virtual-machine-provisioning.md) - [Monitor VM activity on your device](azure-stack-edge-gpu-monitor-virtual-machine-activity.md)
defender-for-cloud Enhanced Security Features Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/enhanced-security-features-overview.md
Title: Understand the enhanced security features of Microsoft Defender for Cloud description: Learn about the benefits of enabling enhanced security in Microsoft Defender for Cloud -- Previously updated : 02/24/2022 Last updated : 04/11/2022+++ # Microsoft Defender for Cloud's enhanced security features
This article explained Defender for Cloud's pricing options. For related materia
- [How to optimize your Azure workload costs](https://azure.microsoft.com/blog/how-to-optimize-your-azure-workload-costs/) - [Pricing details according to currency or region](https://azure.microsoft.com/pricing/details/defender-for-cloud/) - You may want to manage your costs and limit the amount of data collected for a solution by limiting it to a particular set of agents. Use [solution targeting](../azure-monitor/insights/solution-targeting.md) to apply a scope to the solution and target a subset of computers in the workspace. If you're using solution targeting, Defender for Cloud lists the workspace as not having a solution.
+> [!IMPORTANT]
+> Solution targeting has been deprecated because the Log Analytics agent is being replaced with the Azure Monitor agent and solutions in Azure Monitor are being replaced with insights. You can continue to use solution targeting if you already have it configured, but it is not available in new regions.
+> The feature will not be supported after August 31, 2024.
+> Regions that support solution targeting until the deprecation date are:
+>
+> | Region code | Region name |
+> | : | :- |
+> | CCAN | canadacentral |
+> | CHN | switzerlandnorth |
+> | CID | centralindia |
+> | CQ | brazilsouth |
+> | CUS | centralus |
+> | DEWC | germanywestcentral |
+> | DXB | UAENorth |
+> | EA | eastasia |
+> | EAU | australiaeast |
+> | EJP | japaneast |
+> | EUS | eastus |
+> | EUS2 | eastus2 |
+> | NCUS | northcentralus |
+> | NEU | NorthEurope |
+> | NOE | norwayeast |
+> | PAR | FranceCentral |
+> | SCUS | southcentralus |
+> | SE | KoreaCentral |
+> | SEA | southeastasia |
+> | SEAU | australiasoutheast |
+> | SUK | uksouth |
+> | WCUS | westcentralus |
+> | WEU | westeurope |
+> | WUS | westus |
+> | WUS2 | westus2 |
+>
+> | Air-gapped clouds | Region code | Region name |
+> | :- | :- | :- |
+> | UsNat | EXE | usnateast |
+> | UsNat | EXW | usnatwest |
+> | UsGov | FF | usgovvirginia |
+> | China | MC | ChinaEast2 |
+> | UsGov | PHX | usgovarizona |
+> | UsSec | RXE | usseceast |
+> | UsSec | RXW | ussecwest |
defender-for-cloud Export To Siem https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/export-to-siem.md
description: Learn how to stream your security alerts to Microsoft Sentinel, thi
Previously updated : 03/29/2022 Last updated : 04/04/2022 # Stream alerts to a SIEM, SOAR, or IT Service Management solution [!INCLUDE [Banner for top of topics](./includes/banner.md)]
-Microsoft Defender for Cloud can stream your security alerts into the most popular Security Information and Event Management (SIEM), Security Orchestration Automated Response (SOAR), and IT Service Management (ITSM) solutions.
+Microsoft Defender for Cloud can stream your security alerts into the most popular Security Information and Event Management (SIEM),
+Security Orchestration Automated Response (SOAR), and IT Service Management (ITSM) solutions.
+Security alerts are notifications that Defender for Cloud generates when it detects threats on your resources.
+Defender for Cloud prioritizes and lists the alerts, along with the information needed for you to quickly investigate the problem.
+Defender for Cloud also provides detailed steps to help you remediate attacks.
+Alerts data is retained for 90 days.
-There are Azure-native tools for ensuring you can view your alert data in all of the most popular solutions in use today, including:
+There are built-in Azure tools for ensuring you can view your alert data in all of the most popular solutions in use today, including:
- **Microsoft Sentinel** - **Splunk Enterprise and Splunk Cloud**
There are Azure-native tools for ensuring you can view your alert data in all of
- **Power BI** - **Palo Alto Networks**
-## Stream alerts to Microsoft Sentinel
+## Stream alerts to Microsoft Sentinel
-Defender for Cloud natively integrates with Microsoft Sentinel, Azure's cloud-native SIEM and SOAR solution.
+Defender for Cloud natively integrates with Microsoft Sentinel, Azure's cloud-native SIEM and SOAR solution.
[Learn more about Microsoft Sentinel](../sentinel/overview.md).
Defender for Cloud natively integrates with Microsoft Sentinel, Azure's cloud-na
Microsoft Sentinel includes built-in connectors for Microsoft Defender for Cloud at the subscription and tenant levels:

- [Stream alerts to Microsoft Sentinel at the subscription level](../sentinel/connect-azure-security-center.md)
-- [Connect all subscriptions in your tenant to Microsoft Sentinel](https://techcommunity.microsoft.com/t5/azure-sentinel/azure-security-center-auto-connect-to-sentinel/ba-p/1387539)
+- [Connect all subscriptions in your tenant to Microsoft Sentinel](https://techcommunity.microsoft.com/t5/azure-sentinel/azure-security-center-auto-connect-to-sentinel/ba-p/1387539)
When you connect Defender for Cloud to Microsoft Sentinel, the status of Defender for Cloud alerts that get ingested into Microsoft Sentinel is synchronized between the two services. So, for example, when an alert is closed in Defender for Cloud, that alert is also shown as closed in Microsoft Sentinel. If you change the status of an alert in Defender for Cloud, the status of the alert in Microsoft Sentinel is also updated, but the statuses of any Microsoft Sentinel **incidents** that contain the synchronized Microsoft Sentinel alert aren't updated.
Another alternative for investigating Defender for Cloud alerts in Microsoft Sen
> Microsoft Sentinel is billed based on the volume of data that it ingests for analysis in Microsoft Sentinel and stores in the Azure Monitor Log Analytics workspace. Microsoft Sentinel offers a flexible and predictable pricing model. [Learn more at the Microsoft Sentinel pricing page](https://azure.microsoft.com/pricing/details/azure-sentinel/).
+## Stream alerts to QRadar and Splunk
-## Stream alerts with Azure Monitor
+The export of security alerts to Splunk and QRadar uses Event Hubs and a built-in connector.
+You can either use a PowerShell script or the Azure portal to set up the requirements for exporting security alerts for your subscription or tenant.
+Then you'll need to use the procedure specific to each SIEM to install the solution in the SIEM platform.
-To stream alerts into **ArcSight**, **Splunk**, **QRadar**, **SumoLogic**, **Syslog servers**, **LogRhythm**, **Logz.io Cloud Observability Platform**, and other monitoring solutions, connect Defender for Cloud to Azure monitor using Azure Event Hubs:
+### Prerequisites
+
+Before you set up the Azure services for exporting alerts, make sure you have:
+
+- Azure subscription ([Create a free account](https://azure.microsoft.com/free/))
+- Azure resource group ([Create a resource group](../azure-resource-manager/management/manage-resource-groups-portal.md))
+- **Owner** role on the alerts scope (subscription, management group, or tenant), or these specific permissions:
+ - Write permissions for event hubs and the Event Hub Policy
+ - Create permissions for [Azure AD applications](../active-directory/develop/howto-create-service-principal-portal.md#permissions-required-for-registering-an-app), if you aren't using an existing Azure AD application
+ - Assign permissions for policies, if you're using the Azure Policy 'DeployIfNotExist'
+ <!--
+ - if it **has the SecurityCenterFree solution**, you'll need a minimum of read permissions for the workspace solution: `Microsoft.OperationsManagement/solutions/read`
+ - if it **doesn't have the SecurityCenterFree solution**, you'll need write permissions for the workspace solution: `Microsoft.OperationsManagement/solutions/action` -->
+
+### Step 1. Set up the Azure services
+
+You can set up your Azure environment to support continuous export using either:
+
+- A PowerShell script (Recommended)
+
+ Download and run [the PowerShell script](https://github.com/Azure/Microsoft-Defender-for-Cloud/tree/main/Powershell%20scripts/3rd%20party%20SIEM%20integration).
+ Enter the required parameters and the script performs all of the steps for you.
+ When the script finishes, it outputs the information you'll use to install the solution in the SIEM platform.
+
+- In the Azure portal
+
+ Here's an overview of the steps you'll do in the Azure portal:
+
+ 1. Create an Event Hubs namespace and event hub.
+ 2. Define a policy for the event hub with "Send" permissions.
+ 3. **If you are streaming your alerts to QRadar SIEM** - Create an event hub "Listen" policy, then copy and save the connection string of the policy that you'll use in QRadar.
+ 4. Create a consumer group, then copy and save the name that you'll use in the SIEM platform.
+ 5. Enable continuous export of your security alerts to the defined event hub.
+ 6. **If you are streaming your alerts to QRadar SIEM** - Create a storage account, then copy and save the connection string to the account that you'll use in QRadar.
+ 7. **If you are streaming your alerts to Splunk SIEM**:
+ 1. Create a Microsoft Azure Active Directory application.
+ 2. Save the Tenant, App ID, and App password.
+ 3. Give permissions to the Azure AD Application to read from the event hub you created before.
+
+ For more detailed instructions, see [Prepare Azure resources for exporting to Splunk and QRadar](export-to-splunk-or-qradar.md).
+
+### Step 2. Connect the event hub to your preferred solution using the built-in connectors
+
+Each SIEM platform has a tool to enable it to receive alerts from Azure Event Hubs. Install the tool for your platform to start receiving alerts.
+
+| Tool | Hosted in Azure | Description |
+| :-- | :-- | :-- |
+| IBM QRadar | No | The Microsoft Azure DSM and Microsoft Azure Event Hubs Protocol are available for download from [the IBM support website](https://www.ibm.com/docs/en/qsip/7.4?topic=microsoft-azure-platform). |
+| Splunk | No | [Splunk Add-on for Microsoft Cloud Services](https://splunkbase.splunk.com/app/3110/) is an open source project available in Splunkbase. <br><br> If you can't install an add-on in your Splunk instance, for example if you're using a proxy or running on Splunk Cloud, you can forward these events to the Splunk HTTP Event Collector using [Azure Function For Splunk](https://github.com/Microsoft/AzureFunctionforSplunkVS), which is triggered by new messages in the event hub. |
+
+## Stream alerts with continuous export
+
+To stream alerts into **ArcSight**, **SumoLogic**, **Syslog servers**, **LogRhythm**, **Logz.io Cloud Observability Platform**, and other monitoring solutions, connect Defender for Cloud using continuous export and Azure Event Hubs:
> [!NOTE] > To stream alerts at the tenant level, use this Azure policy and set the scope at the root management group. You'll need permissions for the root management group as explained in [Defender for Cloud permissions](permissions.md): [Deploy export to an event hub for Microsoft Defender for Cloud alerts and recommendations](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2fproviders%2fMicrosoft.Authorization%2fpolicyDefinitions%2fcdfcce10-4578-4ecd-9703-530938e4abcb).
-1. Enable [continuous export](continuous-export.md) to stream Defender for Cloud alerts into a dedicated event hub at the subscription level. To do this at the Management Group level using Azure Policy, see [Create continuous export automation configurations at scale](continuous-export.md?tabs=azure-policy#configure-continuous-export-at-scale-using-the-supplied-policies)
+1. Enable [continuous export](continuous-export.md) to stream Defender for Cloud alerts into a dedicated event hub at the subscription level. To do this at the Management Group level using Azure Policy, see [Create continuous export automation configurations at scale](continuous-export.md?tabs=azure-policy#configure-continuous-export-at-scale-using-the-supplied-policies).
-1. [Connect the event hub to your preferred solution using Azure Monitor's built-in connectors](../azure-monitor/essentials/stream-monitoring-data-event-hubs.md#partner-tools-with-azure-monitor-integration).
+2. Connect the event hub to your preferred solution using the built-in connectors:
-1. Optionally, stream the raw logs to the event hub and connect to your preferred solution. Learn more in [Monitoring data available](../azure-monitor/essentials/stream-monitoring-data-event-hubs.md#monitoring-data-available).
+ | Tool | Hosted in Azure | Description |
+ | :-- | :-- | :-- |
+ | SumoLogic | No | Instructions for setting up SumoLogic to consume data from an event hub are available at [Collect Logs for the Azure Audit App from Event Hubs](https://help.sumologic.com/Send-Data/Applications-and-Other-Data-Sources/Azure-Audit/02Collect-Logs-for-Azure-Audit-from-Event-Hub). |
+ | ArcSight | No | The ArcSight Azure Event Hubs smart connector is available as part of [the ArcSight smart connector collection](https://community.microfocus.com/cyberres/arcsight/f/arcsight-product-announcements/163662/announcing-general-availability-of-arcsight-smart-connectors-7-10-0-8114-0). |
+ | Syslog server | No | If you want to stream Azure Monitor data directly to a syslog server, you can use a [solution based on an Azure function](https://github.com/miguelangelopereira/azuremonitor2syslog/). |
+ | LogRhythm | No | Instructions to set up LogRhythm to collect logs from an event hub are available [here](https://logrhythm.com/six-tips-for-securing-your-azure-cloud-environment/). |
+ | Logz.io | Yes | For more information, see [Getting started with monitoring and logging using Logz.io for Java apps running on Azure](/azure/developer/java/fundamentals/java-get-started-with-logzio). |
-To view the event schemas of the exported data types, visit the [Event Hubs event schemas](https://aka.ms/ASCAutomationSchemas).
+3. Optionally, stream the raw logs to the event hub and connect to your preferred solution. Learn more in [Monitoring data available](../azure-monitor/essentials/stream-monitoring-data-event-hubs.md#monitoring-data-available).
+To view the event schemas of the exported data types, visit the [Event Hubs event schemas](https://aka.ms/ASCAutomationSchemas).
-## Other streaming options
+## Use the Microsoft Graph Security API to stream alerts to third-party applications
-As an alternative to Sentinel and Azure Monitor, you can use Defender for Cloud's built-in integration with [Microsoft Graph Security API](https://www.microsoft.com/security/business/graph-security-api). No configuration is required and there are no additional costs.
+As an alternative to Sentinel and Azure Monitor, you can use Defender for Cloud's built-in integration with [Microsoft Graph Security API](https://www.microsoft.com/security/business/graph-security-api). No configuration is required and there are no additional costs.
-You can use this API to stream alerts from your **entire tenant** (and data from many other Microsoft Security products) into third-party SIEMs and other popular platforms:
+You can use this API to stream alerts from your **entire tenant** (and data from many Microsoft Security products) into third-party SIEMs and other popular platforms:
-- **Splunk Enterprise and Splunk Cloud** - [Use the Microsoft Graph Security API Add-On for Splunk](https://splunkbase.splunk.com/app/4564/)
+- **Splunk Enterprise and Splunk Cloud** - [Use the Microsoft Graph Security API Add-On for Splunk](https://splunkbase.splunk.com/app/4564/)
- **Power BI** - [Connect to the Microsoft Graph Security API in Power BI Desktop](/power-bi/connect-data/desktop-connect-graph-security).
- **ServiceNow** - [Install and configure the Microsoft Graph Security API application from the ServiceNow Store](https://docs.servicenow.com/bundle/sandiego-security-management/page/product/secops-integration-sir/secops-integration-ms-graph/task/ms-graph-install.html?cshalt=yes).
- **QRadar** - [Use IBM's Device Support Module for Microsoft Defender for Cloud via Microsoft Graph API](https://www.ibm.com/support/knowledgecenter/SS42VS_DSM/com.ibm.dsm.doc/c_dsm_guide_ms_azure_security_center_overview.html).
- **Palo Alto Networks**, **Anomali**, **Lookout**, **InSpark**, and more - [Use the Microsoft Graph Security API](https://www.microsoft.com/security/business/graph-security-api#office-MultiFeatureCarousel-09jr2ji).

## Next steps

This page explained how to ensure your Microsoft Defender for Cloud alert data is available in your SIEM, SOAR, or ITSM tool of choice. For related material, see:
defender-for-cloud Export To Splunk Or Qradar https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/export-to-splunk-or-qradar.md
+
+ Title: Set up the required Azure resources to export security alerts to IBM QRadar and Splunk
+description: Learn how to configure the required Azure resources in the Azure portal to stream security alerts to IBM QRadar and Splunk
+++ Last updated : 04/04/2022++
+# Prepare Azure resources for exporting to Splunk and QRadar
+
+In order to stream Microsoft Defender for Cloud security alerts to IBM QRadar and Splunk, you have to set up resources in Azure, such as Event Hubs and Azure Active Directory (Azure AD). Here are the instructions for configuring these resources in the Azure portal, but you can also configure them using a PowerShell script. Make sure you review [Stream alerts to QRadar and Splunk](export-to-siem.md#stream-alerts-to-qradar-and-splunk) before you configure the Azure resources for exporting alerts to QRadar and Splunk.
+
+To configure the Azure resources for QRadar and Splunk in the Azure portal:
+
+## Step 1. Create an Event Hubs namespace and event hub with send permissions
+
+1. In the [Event Hubs service](../event-hubs/event-hubs-create.md), create an Event Hubs namespace:
+ 1. Select **Create**.
+ 1. Enter the details of the namespace, select **Review + create**, and select **Create**.
+
+ :::image type="content" source="media/export-to-siem/create-event-hub-namespace.png" alt-text="Screenshot of creating an Event Hubs namespace in Microsoft Event Hubs." lightbox="media/export-to-siem/create-event-hub-namespace.png":::
+
+1. Create an event hub:
+ 1. In the namespace that you create, select **+ Event Hub**.
+ 1. Enter the details of the event hub, select **Review + create**, and then select **Create**.
+1. Create a shared access policy.
+ 1. In the Event Hub menu, select the Event Hubs namespace you created.
+ 1. In the Event Hub namespace menu, select **Event Hubs**.
+ 1. Select the event hub that you just created.
+ 1. In the event hub menu, select **Shared access policies**.
+ 1. Select **Add**, enter a unique policy name, and select **Send**.
+ 1. Select **Create** to create the policy.
+ :::image type="content" source="media/export-to-siem/create-shared-access-policy.png" alt-text="Screenshot of creating a shared policy in Microsoft Event Hubs." lightbox="media/export-to-siem/create-shared-access-policy.png":::
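If you'd rather script this step, the portal actions above can be sketched with the Azure CLI. This is a minimal sketch, assuming you're signed in with `az login`; the resource group, namespace, event hub, and policy names (`siem-export-rg`, `siem-export-ns`, `defender-alerts`, `SendPolicy`) are placeholders, not values from this article:

```shell
# Hypothetical names -- substitute your own values.
az group create --name siem-export-rg --location westeurope

# Create the Event Hubs namespace.
az eventhubs namespace create \
  --resource-group siem-export-rg \
  --name siem-export-ns \
  --location westeurope

# Create the event hub inside the namespace.
az eventhubs eventhub create \
  --resource-group siem-export-rg \
  --namespace-name siem-export-ns \
  --name defender-alerts

# Create a shared access policy on the event hub with "Send" permissions.
az eventhubs eventhub authorization-rule create \
  --resource-group siem-export-rg \
  --namespace-name siem-export-ns \
  --eventhub-name defender-alerts \
  --name SendPolicy \
  --rights Send
```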
+
+## Step 2. **For streaming to QRadar SIEM** - Create a Listen policy
+
+1. Select **Add**, enter a unique policy name, and select **Listen**.
+1. Select **Create** to create the policy.
+1. After the listen policy is created, copy the **Connection string primary key** and save it to use later.
+
+ :::image type="content" source="media/export-to-siem/create-shared-listen-policy.png" alt-text="Screenshot of creating a listen policy in Microsoft Event Hubs." lightbox="media/export-to-siem/create-shared-listen-policy.png":::
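The Listen policy for QRadar can also be created from the command line. A minimal Azure CLI sketch, assuming the placeholder resource names used in the portal steps (`siem-export-rg`, `siem-export-ns`, `defender-alerts`, `ListenPolicy` are hypothetical):

```shell
# Create a "Listen" shared access policy on the event hub.
az eventhubs eventhub authorization-rule create \
  --resource-group siem-export-rg \
  --namespace-name siem-export-ns \
  --eventhub-name defender-alerts \
  --name ListenPolicy \
  --rights Listen

# Retrieve the primary connection string to use later in QRadar.
az eventhubs eventhub authorization-rule keys list \
  --resource-group siem-export-rg \
  --namespace-name siem-export-ns \
  --eventhub-name defender-alerts \
  --name ListenPolicy \
  --query primaryConnectionString --output tsv
```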
+
+## Step 3. Create a consumer group, then copy and save the name to use in the SIEM platform
+
+1. In the **Entities** section of the Event Hubs namespace menu, select **Event Hubs** and select the event hub you created.
+
+ :::image type="content" source="media/export-to-siem/open-event-hub.png" alt-text="Screenshot of opening the event hub in Microsoft Event Hubs." lightbox="media/export-to-siem/open-event-hub.png":::
+
+1. Select **Consumer group**.
+
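The consumer group can likewise be scripted. A minimal Azure CLI sketch using the same hypothetical resource names as earlier (`qradar-consumer-group` is a placeholder name you'd reuse in the SIEM configuration):

```shell
# Create a dedicated consumer group for the SIEM to read from.
az eventhubs eventhub consumer-group create \
  --resource-group siem-export-rg \
  --namespace-name siem-export-ns \
  --eventhub-name defender-alerts \
  --name qradar-consumer-group
```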
+## Step 4. Enable continuous export for the scope of the alerts
+
+1. In the Azure search box, search for "policy" and go to the **Policy** service.
+1. In the Policy menu, select **Definitions**.
+1. Search for "deploy export" and select the **Deploy export to Event Hub for Azure Security Center data** built-in policy.
+1. Select **Assign**.
+1. Define the basic policy options:
+ 1. In Scope, select the **...** to select the scope to apply the policy to.
+ 1. Find the root management group (for tenant scope), management group, subscription, or resource group in the scope and select **Select**.
+ - To select the tenant root management group, you need tenant-level permissions.
+ 1. (Optional) In Exclusions you can define specific subscriptions to exclude from the export.
+ 1. Enter an assignment name.
+ 1. Make sure policy enforcement is enabled.
+
+ :::image type="content" source="media/export-to-siem/create-export-policy.png" alt-text="Screenshot of assignment for the export policy." lightbox="media/export-to-siem/create-export-policy.png":::
+
+1. In the policy parameters:
+ 1. Enter the resource group where the automation resource is saved.
+ 1. Select resource group location.
+ 1. Select the **...** next to the **Event Hub details** and enter the details for the event hub, including:
+ - Subscription.
+ - The Event Hubs namespace you created.
+ - The event hub you created.
+ - In **authorizationrules**, select the shared access policy that you created to send alerts.
+
+ :::image type="content" source="media/export-to-siem/create-export-policy-parameters.png" alt-text="Screenshot of parameters for the export policy." lightbox="media/export-to-siem/create-export-policy-parameters.png":::
+
+1. Select **Review and Create** and **Create** to finish the process of defining the continuous export to Event Hubs.
+ - Notice that when you assign the continuous export policy at the tenant (root management group) level, it automatically streams alerts for any **new** subscription created under the tenant.
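The policy assignment above can also be created from the command line. A minimal Azure CLI sketch: the definition ID `cdfcce10-4578-4ecd-9703-530938e4abcb` is the built-in export policy referenced in this doc set, while the assignment name, scope placeholder, and `export-params.json` file (carrying the resource group and event hub details the portal prompts for) are hypothetical:

```shell
# DeployIfNotExists policies need a managed identity to remediate resources.
az policy assignment create \
  --name deploy-defender-alerts-export \
  --scope "/subscriptions/<subscription-id>" \
  --policy "cdfcce10-4578-4ecd-9703-530938e4abcb" \
  --mi-system-assigned \
  --location westeurope \
  --params @export-params.json
```

Use the root management group's resource ID as the `--scope` to assign at the tenant level, subject to the tenant-level permissions noted above.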
+
+## Step 5. **For streaming alerts to QRadar SIEM** - Create a storage account
+
+1. Go to the Azure portal, select **Create a resource**, and select **Storage account**. If that option isn't shown, search for "storage account".
+1. Select **Create**.
+1. Enter the details for the storage account, select **Review and Create**, and then **Create**.
+
+ :::image type="content" source="media/export-to-siem/create-storage-account.png" alt-text="Screenshot of creating storage account." lightbox="media/export-to-siem/create-storage-account.png":::
+
+1. After you create your storage account and go to the resource, in the menu select **Access Keys**.
+1. Select **Show keys** to see the keys, and copy the connection string of Key 1.
+
+ :::image type="content" source="media/export-to-siem/copy-storage-account-key.png" alt-text="Screenshot of copying storage account key." lightbox="media/export-to-siem/copy-storage-account-key.png":::
+
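The storage account for QRadar can also be provisioned from the command line. A minimal Azure CLI sketch; the account name `qradarexportstore` and resource group are placeholders:

```shell
# Create the storage account (name must be globally unique, lowercase).
az storage account create \
  --resource-group siem-export-rg \
  --name qradarexportstore \
  --location westeurope \
  --sku Standard_LRS

# Retrieve the connection string to use later in QRadar.
az storage account show-connection-string \
  --resource-group siem-export-rg \
  --name qradarexportstore \
  --query connectionString --output tsv
```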
+## Step 6. **For streaming alerts to Splunk SIEM** - Create an Azure AD application
+
+1. In the Azure portal search box, search for "Azure Active Directory" and go to **Azure Active Directory**.
+1. In the menu, select **App registrations**.
+1. Select **New registration**.
+1. Enter a unique name for the application and select **Register**.
+
+ :::image type="content" source="media/export-to-siem/register-application.png" alt-text="Screenshot of registering application." lightbox="media/export-to-siem/register-application.png":::
+
+1. Copy to Clipboard and save the **Application (client) ID** and **Directory (tenant) ID**.
+1. Create the client secret for the application:
+ 1. In the menu, go to **Certificates & secrets**.
+ 1. Create a password for the application to prove its identity when requesting a token:
+ 1. Select **New client secret**.
+ 1. Enter a short description, choose the expiration time of the secret, and select **Add**.
+
+ :::image type="content" source="media/export-to-siem/create-client-secret.png" alt-text="Screenshot of creating client secret." lightbox="media/export-to-siem/create-client-secret.png":::
+
+1. After the secret is created, copy the Secret ID and save it for later use together with the Application ID and Directory (tenant) ID.
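The app registration and secret can also be created from the command line. A minimal Azure CLI sketch; the display name `splunk-eventhub-reader` is a placeholder, and `<app-id>` is the application (client) ID returned by the first command:

```shell
# Register the application; the command prints its Application (client) ID.
az ad app create --display-name splunk-eventhub-reader --query appId --output tsv

# The Directory (tenant) ID you'll also need in Splunk.
az account show --query tenantId --output tsv

# Create a client secret; the generated password is shown once in the output.
az ad app credential reset --id <app-id>
```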
+
+## Step 7. **For streaming alerts to Splunk SIEM** - Allow Azure AD to read from the event hub
+
+1. Go to the Event Hubs namespace you created.
+1. In the menu, go to **Access control**.
+1. Select **Add** > **Add role assignment**.
+
+ :::image type="content" source="media/export-to-siem/add-role-assignment.png" alt-text="Screenshot of adding a role assignment." lightbox="media/export-to-siem/add-role-assignment.png":::
+
+1. In the Roles tab, search for **Azure Event Hubs Data Receiver**.
+1. Select **Next**.
+1. Select **Select Members**.
+1. Search for the Azure AD application you created before and select it.
+1. Select **Close**.
+
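The role assignment above can also be granted from the command line. A minimal Azure CLI sketch using the placeholder names from earlier steps; `<app-id>` is the application (client) ID of the Azure AD app you registered:

```shell
# Hypothetical values from earlier steps.
APP_ID=<app-id>

# Ensure a service principal exists for the app registration.
az ad sp create --id "$APP_ID"

# Resource ID of the Event Hubs namespace to scope the assignment to.
NS_ID=$(az eventhubs namespace show \
  --resource-group siem-export-rg \
  --name siem-export-ns \
  --query id --output tsv)

# Grant the app read access on the namespace.
az role assignment create \
  --assignee "$APP_ID" \
  --role "Azure Event Hubs Data Receiver" \
  --scope "$NS_ID"
```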
+To continue setting up export of alerts, [install the built-in connectors](export-to-siem.md#step-2-connect-the-event-hub-to-your-preferred-solution-using-the-built-in-connectors) for the SIEM you're using.
defender-for-cloud Release Notes Archive https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/release-notes-archive.md
description: A description of what's new and changed in Microsoft Defender for C
Previously updated : 04/04/2022 Last updated : 04/11/2022

# Archive for what's new in Defender for Cloud?

[!INCLUDE [Banner for top of topics](./includes/banner.md)]
This page provides you with information about:
- Bug fixes - Deprecated functionality
+## October 2021
+
+Updates in October include:
+
+- [Microsoft Threat and Vulnerability Management added as vulnerability assessment solution (in preview)](#microsoft-threat-and-vulnerability-management-added-as-vulnerability-assessment-solution-in-preview)
+- [Vulnerability assessment solutions can now be auto enabled (in preview)](#vulnerability-assessment-solutions-can-now-be-auto-enabled-in-preview)
+- [Software inventory filters added to asset inventory (in preview)](#software-inventory-filters-added-to-asset-inventory-in-preview)
+- [Changed prefix of some alert types from "ARM_" to "VM_"](#changed-prefix-of-some-alert-types-from-arm_-to-vm_)
+- [Changes to the logic of a security recommendation for Kubernetes clusters](#changes-to-the-logic-of-a-security-recommendation-for-kubernetes-clusters)
+- [Recommendations details pages now show related recommendations](#recommendations-details-pages-now-show-related-recommendations)
+- [New alerts for Azure Defender for Kubernetes (in preview)](#new-alerts-for-azure-defender-for-kubernetes-in-preview)
++
+### Microsoft Threat and Vulnerability Management added as vulnerability assessment solution (in preview)
+
+We've extended the integration between [Azure Defender for Servers](defender-for-servers-introduction.md) and Microsoft Defender for Endpoint, to support a new vulnerability assessment provider for your machines: [Microsoft threat and vulnerability management](/microsoft-365/security/defender-endpoint/next-gen-threat-and-vuln-mgt).
+
+Use **threat and vulnerability management** to discover vulnerabilities and misconfigurations in near real time with the [integration with Microsoft Defender for Endpoint](integration-defender-for-endpoint.md) enabled, and without the need for additional agents or periodic scans. Threat and vulnerability management prioritizes vulnerabilities based on the threat landscape and detections in your organization.
+
+Use the security recommendation "[A vulnerability assessment solution should be enabled on your virtual machines](https://portal.azure.com/#blade/Microsoft_Azure_Security/RecommendationsBlade/assessmentKey/ffff0522-1e88-47fc-8382-2a80ba848f5d)" to surface the vulnerabilities detected by threat and vulnerability management for your [supported machines](/microsoft-365/security/defender-endpoint/tvm-supported-os?view=o365-worldwide&preserve-view=true).
+
+To automatically surface the vulnerabilities, on existing and new machines, without the need to manually remediate the recommendation, see [Vulnerability assessment solutions can now be auto enabled (in preview)](#vulnerability-assessment-solutions-can-now-be-auto-enabled-in-preview).
+
+Learn more in [Investigate weaknesses with Microsoft Defender for Endpoint's threat and vulnerability management](deploy-vulnerability-assessment-tvm.md).
+
+### Vulnerability assessment solutions can now be auto enabled (in preview)
+
+Security Center's auto provisioning page now includes the option to automatically enable a vulnerability assessment solution to Azure virtual machines and Azure Arc machines on subscriptions protected by [Azure Defender for Servers](defender-for-servers-introduction.md).
+
+If the [integration with Microsoft Defender for Endpoint](integration-defender-for-endpoint.md) is enabled, Defender for Cloud presents a choice of vulnerability assessment solutions:
+
+- (**NEW**) The Microsoft threat and vulnerability management module of Microsoft Defender for Endpoint (see [the release note](#microsoft-threat-and-vulnerability-management-added-as-vulnerability-assessment-solution-in-preview))
+- The integrated Qualys agent
++
+Your chosen solution will be automatically enabled on supported machines.
+
+Learn more in [Automatically configure vulnerability assessment for your machines](auto-deploy-vulnerability-assessment.md).
+
+### Software inventory filters added to asset inventory (in preview)
+
+The [asset inventory](asset-inventory.md) page now includes a filter to select machines running specific software - and even specify the versions of interest.
+
+Additionally, you can query the software inventory data in **Azure Resource Graph Explorer**.
+
+To use these new features, you'll need to enable the [integration with Microsoft Defender for Endpoint](integration-defender-for-endpoint.md).
+
+For full details, including sample Kusto queries for Azure Resource Graph, see [Access a software inventory](asset-inventory.md#access-a-software-inventory).
++
+### Changed prefix of some alert types from "ARM_" to "VM_"
+
+In July 2021, we announced a [logical reorganization of Azure Defender for Resource Manager alerts](release-notes-archive.md#logical-reorganization-of-azure-defender-for-resource-manager-alerts).
+
+As part of a logical reorganization of some of the Azure Defender plans, we moved twenty-one alerts from [Azure Defender for Resource Manager](defender-for-resource-manager-introduction.md) to [Azure Defender for Servers](defender-for-servers-introduction.md).
+
+With this update, we've changed the prefixes of these alerts to match this reassignment and replaced "ARM_" with "VM_" as shown in the following table:
+
+| Original name | From this change |
+||--|
+| ARM_AmBroadFilesExclusion | VM_AmBroadFilesExclusion |
+| ARM_AmDisablementAndCodeExecution | VM_AmDisablementAndCodeExecution |
+| ARM_AmDisablement | VM_AmDisablement |
+| ARM_AmFileExclusionAndCodeExecution | VM_AmFileExclusionAndCodeExecution |
+| ARM_AmTempFileExclusionAndCodeExecution | VM_AmTempFileExclusionAndCodeExecution |
+| ARM_AmTempFileExclusion | VM_AmTempFileExclusion |
+| ARM_AmRealtimeProtectionDisabled | VM_AmRealtimeProtectionDisabled |
+| ARM_AmTempRealtimeProtectionDisablement | VM_AmTempRealtimeProtectionDisablement |
+| ARM_AmRealtimeProtectionDisablementAndCodeExec | VM_AmRealtimeProtectionDisablementAndCodeExec |
+| ARM_AmMalwareCampaignRelatedExclusion | VM_AmMalwareCampaignRelatedExclusion |
+| ARM_AmTemporarilyDisablement | VM_AmTemporarilyDisablement |
+| ARM_UnusualAmFileExclusion | VM_UnusualAmFileExclusion |
+| ARM_CustomScriptExtensionSuspiciousCmd | VM_CustomScriptExtensionSuspiciousCmd |
+| ARM_CustomScriptExtensionSuspiciousEntryPoint | VM_CustomScriptExtensionSuspiciousEntryPoint |
+| ARM_CustomScriptExtensionSuspiciousPayload | VM_CustomScriptExtensionSuspiciousPayload |
+| ARM_CustomScriptExtensionSuspiciousFailure | VM_CustomScriptExtensionSuspiciousFailure |
+| ARM_CustomScriptExtensionUnusualDeletion | VM_CustomScriptExtensionUnusualDeletion |
+| ARM_CustomScriptExtensionUnusualExecution | VM_CustomScriptExtensionUnusualExecution |
+| ARM_VMAccessUnusualConfigReset | VM_VMAccessUnusualConfigReset |
+| ARM_VMAccessUnusualPasswordReset | VM_VMAccessUnusualPasswordReset |
+| ARM_VMAccessUnusualSSHReset | VM_VMAccessUnusualSSHReset |
++
+Learn more about the [Azure Defender for Resource Manager](defender-for-resource-manager-introduction.md) and [Azure Defender for Servers](defender-for-servers-introduction.md) plans.
+
+### Changes to the logic of a security recommendation for Kubernetes clusters
+
+The recommendation "Kubernetes clusters should not use the default namespace" prevents usage of the default namespace for a range of resource types. Two of the resource types that were included in this recommendation have been removed: ConfigMap and Secret.
+
+Learn more about this recommendation and hardening your Kubernetes clusters in [Understand Azure Policy for Kubernetes clusters](../governance/policy/concepts/policy-for-kubernetes.md).
+
+### Recommendations details pages now show related recommendations
+
+To clarify the relationships between different recommendations, we've added a **Related recommendations** area to the details pages of many recommendations.
+
+The three relationship types that are shown on these pages are:
+
+- **Prerequisite** - A recommendation that must be completed before the selected recommendation
+- **Alternative** - A different recommendation which provides another way of achieving the goals of the selected recommendation
+- **Dependent** - A recommendation for which the selected recommendation is a prerequisite
+
+For each related recommendation, the number of unhealthy resources is shown in the "Affected resources" column.
+
+> [!TIP]
+> If a related recommendation is grayed out, its dependency isn't yet completed and so isn't available.
+
+An example of related recommendations:
+
+1. Security Center checks your machines for supported vulnerability assessment solutions:<br>
+ **A vulnerability assessment solution should be enabled on your virtual machines**
+
+1. If one is found, you'll get notified about discovered vulnerabilities:<br>
+ **Vulnerabilities in your virtual machines should be remediated**
+
+Obviously, Security Center can't notify you about discovered vulnerabilities unless it finds a supported vulnerability assessment solution.
+
+Therefore:
+
+ - Recommendation #1 is a prerequisite for recommendation #2
+ - Recommendation #2 depends upon recommendation #1
+++++
+### New alerts for Azure Defender for Kubernetes (in preview)
+
+To expand the threat protections provided by Azure Defender for Kubernetes, we've added two preview alerts.
+
+These alerts are generated based on a new machine learning model and Kubernetes advanced analytics, measuring multiple deployment and role assignment attributes against previous activities in the cluster and across all clusters monitored by Azure Defender.
+
+| Alert (alert type) | Description | MITRE tactic | Severity |
+|||:--:|-|
+| **Anomalous pod deployment (Preview)**<br>(K8S_AnomalousPodDeployment) | Kubernetes audit log analysis detected a pod deployment that is anomalous based on previous pod deployment activity. This activity is considered an anomaly when taking into account how the different features seen in the deployment operation relate to one another. The features monitored by this analysis include the container image registry used, the account performing the deployment, the day of the week, how often the account performs pod deployments, the user agent used in the operation, and whether the namespace is one in which pod deployments often occur, among other features. The top contributing reasons for raising this alert as anomalous activity are detailed under the alert's extended properties. | Execution | Medium |
+| **Excessive role permissions assigned in Kubernetes cluster (Preview)**<br>(K8S_ServiceAcountPermissionAnomaly) | Analysis of the Kubernetes audit logs detected a role assignment with excessive permissions in your cluster. Based on an examination of role assignments, the listed permissions are uncommon for the specific service account. This detection considers previous role assignments to the same service account across clusters monitored by Azure Defender, the volume per permission, and the impact of the specific permission. The anomaly detection model used for this alert takes into account how this permission is used across all clusters monitored by Azure Defender. | Privilege Escalation | Low |
++
+For a full list of the Kubernetes alerts, see [Alerts for Kubernetes clusters](alerts-reference.md#alerts-k8scluster).
+ ## September 2021 In September, the following update was released:
defender-for-cloud Release Notes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/release-notes.md
Title: Release notes for Microsoft Defender for Cloud description: A description of what's new and changed in Microsoft Defender for Cloud Previously updated : 04/06/2022 Last updated : 04/11/2022 + # What's new in Microsoft Defender for Cloud? [!INCLUDE [Banner for top of topics](./includes/banner.md)]
Updates in April include:
- [New Defender for Servers plans](#new-defender-for-servers-plans) - [Relocation of custom recommendations](#relocation-of-custom-recommendations)
+- [PowerShell script to stream alerts to Splunk and QRadar](#powershell-script-to-stream-alerts-to-splunk-and-ibm-qradar)
+
+### PowerShell script to stream alerts to Splunk and IBM QRadar
+
+We recommend that you use Event Hubs and a built-in connector to export security alerts to Splunk and IBM QRadar. Now you can use a PowerShell script to set up the Azure resources needed to export security alerts for your subscription or tenant.
+
+Just download and run the PowerShell script. After you provide a few details of your environment, the script configures the resources for you. The script then produces output that you use in the SIEM platform to complete the integration.
+
+To learn more, see [Stream alerts to Splunk and QRadar](export-to-siem.md#stream-alerts-to-qradar-and-splunk).
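If you prefer to see what the script sets up, the core export target is an Event Hubs namespace and event hub. A sketch of creating those resources manually with the Azure CLI (all names here are illustrative placeholders, not values the script uses):

```azurecli
# Create an Event Hubs namespace to receive the exported security alerts
az eventhubs namespace create --resource-group <your-resource-group> --name <your-namespace-name> --location <region>

# Create the event hub that Splunk or QRadar will read the alerts from
az eventhubs eventhub create --resource-group <your-resource-group> --namespace-name <your-namespace-name> --name <your-event-hub-name>
```

The script additionally wires up the continuous export rule and access policies; the commands above only cover the Event Hubs resources themselves.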
### New Defender for Servers plans
While Microsoft Defender for Servers Plan 2 continues to provide complete protec
If you have been using Defender for Servers until now, no action is required.
-In addition, Defender for Cloud also begins gradual support for the [Defender for Endpoint unified agent for Windows Server 2012 R2 and 2016 (Preview)](https://techcommunity.microsoft.com/t5/microsoft-defender-for-endpoint/defending-windows-server-2012-r2-and-2016/ba-p/2783292). Defender for Servers Plan 1 deploys the new unified agent to Windows Server 2012 R2 and 2016 workloads. Defender for Servers Plan 2 deploy the legacy agent to Windows Server 2012 R2 and 2016 workloads, and will deploy the unified agent soon after it is approved for general use.
+In addition, Defender for Cloud also begins gradual support for the [Defender for Endpoint unified agent for Windows Server 2012 R2 and 2016 (Preview)](https://techcommunity.microsoft.com/t5/microsoft-defender-for-endpoint/defending-windows-server-2012-r2-and-2016/ba-p/2783292). Defender for Servers Plan 1 deploys the new unified agent to Windows Server 2012 R2 and 2016 workloads. Defender for Servers Plan 2 deploys the legacy agent to Windows Server 2012 R2 and 2016 workloads, and will deploy the unified agent soon after it is approved for general use.
### Relocation of custom recommendations
Learn more in [Review your security recommendations](review-security-recommendat
### Microsoft Threat and Vulnerability Management added as vulnerability assessment solution - released for general availability (GA)
-In October, [we announced](#microsoft-threat-and-vulnerability-management-added-as-vulnerability-assessment-solution-in-preview) an extension to the integration between [Microsoft Defender for Servers](defender-for-servers-introduction.md) and Microsoft Defender for Endpoint, to support a new vulnerability assessment provider for your machines: [Microsoft threat and vulnerability management](/microsoft-365/security/defender-endpoint/next-gen-threat-and-vuln-mgt). This feature is now released for general availability (GA).
+In October, [we announced](release-notes-archive.md#microsoft-threat-and-vulnerability-management-added-as-vulnerability-assessment-solution-in-preview) an extension to the integration between [Microsoft Defender for Servers](defender-for-servers-introduction.md) and Microsoft Defender for Endpoint, to support a new vulnerability assessment provider for your machines: [Microsoft threat and vulnerability management](/microsoft-365/security/defender-endpoint/next-gen-threat-and-vuln-mgt). This feature is now released for general availability (GA).
Use **threat and vulnerability management** to discover vulnerabilities and misconfigurations in near real time with the [integration with Microsoft Defender for Endpoint](integration-defender-for-endpoint.md) enabled, and without the need for additional agents or periodic scans. Threat and vulnerability management prioritizes vulnerabilities based on the threat landscape and detections in your organization. Use the security recommendation "[A vulnerability assessment solution should be enabled on your virtual machines](https://portal.azure.com/#blade/Microsoft_Azure_Security/RecommendationsBlade/assessmentKey/ffff0522-1e88-47fc-8382-2a80ba848f5d)" to surface the vulnerabilities detected by threat and vulnerability management for your [supported machines](/microsoft-365/security/defender-endpoint/tvm-supported-os?view=o365-worldwide&preserve-view=true).
-To automatically surface the vulnerabilities, on existing and new machines, without the need to manually remediate the recommendation, see [Vulnerability assessment solutions can now be auto enabled (in preview)](#vulnerability-assessment-solutions-can-now-be-auto-enabled-in-preview).
+To automatically surface the vulnerabilities, on existing and new machines, without the need to manually remediate the recommendation, see [Vulnerability assessment solutions can now be auto enabled (in preview)](release-notes-archive.md#vulnerability-assessment-solutions-can-now-be-auto-enabled-in-preview).
Learn more in [Investigate weaknesses with Microsoft Defender for Endpoint's threat and vulnerability management](deploy-vulnerability-assessment-tvm.md).
Even though the feature is called *continuous*, there's also an option to export
### Auto provisioning of vulnerability assessment solutions released for general availability (GA)
-In October, [we announced](#vulnerability-assessment-solutions-can-now-be-auto-enabled-in-preview) the addition of vulnerability assessment solutions to Defender for Cloud's auto provisioning page. This is relevant to Azure virtual machines and Azure Arc machines on subscriptions protected by [Azure Defender for Servers](defender-for-servers-introduction.md). This feature is now released for general availability (GA).
+In October, [we announced](release-notes-archive.md#vulnerability-assessment-solutions-can-now-be-auto-enabled-in-preview) the addition of vulnerability assessment solutions to Defender for Cloud's auto provisioning page. This is relevant to Azure virtual machines and Azure Arc machines on subscriptions protected by [Azure Defender for Servers](defender-for-servers-introduction.md). This feature is now released for general availability (GA).
If the [integration with Microsoft Defender for Endpoint](integration-defender-for-endpoint.md) is enabled, Defender for Cloud presents a choice of vulnerability assessment solutions:
Learn more in [Automatically configure vulnerability assessment for your machine
### Software inventory filters in asset inventory released for general availability (GA)
-In October, [we announced](#software-inventory-filters-added-to-asset-inventory-in-preview) new filters for the [asset inventory](asset-inventory.md) page to select machines running specific software - and even specify the versions of interest. This feature is now released for general availability (GA).
+In October, [we announced](release-notes-archive.md#software-inventory-filters-added-to-asset-inventory-in-preview) new filters for the [asset inventory](asset-inventory.md) page to select machines running specific software - and even specify the versions of interest. This feature is now released for general availability (GA).
You can query the software inventory data in **Azure Resource Graph Explorer**.
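The same data can also be queried from the Azure CLI. A minimal sketch, assuming the Resource Graph CLI extension is installed and using the documented software inventory resource type:

```azurecli
# Query the software inventory data exposed through Azure Resource Graph
az graph query -q "securityresources | where type == 'microsoft.security/softwareinventories' | project id, vendor=properties.vendor, softwareName=properties.softwareName, version=properties.version"
```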
To improve the presentation of resources in the [Asset inventory](asset-inventor
- **Previous format:** ``machine-name_source-computer-id_VMUUID``
- **From this update:** ``machine-name_VMUUID``
-## October 2021
-
-Updates in October include:
-
-- [Microsoft Threat and Vulnerability Management added as vulnerability assessment solution (in preview)](#microsoft-threat-and-vulnerability-management-added-as-vulnerability-assessment-solution-in-preview)
-- [Vulnerability assessment solutions can now be auto enabled (in preview)](#vulnerability-assessment-solutions-can-now-be-auto-enabled-in-preview)
-- [Software inventory filters added to asset inventory (in preview)](#software-inventory-filters-added-to-asset-inventory-in-preview)
-- [Changed prefix of some alert types from "ARM_" to "VM_"](#changed-prefix-of-some-alert-types-from-arm_-to-vm_)
-- [Changes to the logic of a security recommendation for Kubernetes clusters](#changes-to-the-logic-of-a-security-recommendation-for-kubernetes-clusters)
-- [Recommendations details pages now show related recommendations](#recommendations-details-pages-now-show-related-recommendations)
-- [New alerts for Azure Defender for Kubernetes (in preview)](#new-alerts-for-azure-defender-for-kubernetes-in-preview)
-
-### Microsoft Threat and Vulnerability Management added as vulnerability assessment solution (in preview)
-
-We've extended the integration between [Azure Defender for Servers](defender-for-servers-introduction.md) and Microsoft Defender for Endpoint, to support a new vulnerability assessment provider for your machines: [Microsoft threat and vulnerability management](/microsoft-365/security/defender-endpoint/next-gen-threat-and-vuln-mgt).
-
-Use **threat and vulnerability management** to discover vulnerabilities and misconfigurations in near real time with the [integration with Microsoft Defender for Endpoint](integration-defender-for-endpoint.md) enabled, and without the need for additional agents or periodic scans. Threat and vulnerability management prioritizes vulnerabilities based on the threat landscape and detections in your organization.
-
-Use the security recommendation "[A vulnerability assessment solution should be enabled on your virtual machines](https://portal.azure.com/#blade/Microsoft_Azure_Security/RecommendationsBlade/assessmentKey/ffff0522-1e88-47fc-8382-2a80ba848f5d)" to surface the vulnerabilities detected by threat and vulnerability management for your [supported machines](/microsoft-365/security/defender-endpoint/tvm-supported-os?view=o365-worldwide&preserve-view=true).
-
-To automatically surface the vulnerabilities, on existing and new machines, without the need to manually remediate the recommendation, see [Vulnerability assessment solutions can now be auto enabled (in preview)](#vulnerability-assessment-solutions-can-now-be-auto-enabled-in-preview).
-
-Learn more in [Investigate weaknesses with Microsoft Defender for Endpoint's threat and vulnerability management](deploy-vulnerability-assessment-tvm.md).
-
-### Vulnerability assessment solutions can now be auto enabled (in preview)
-
-Security Center's auto provisioning page now includes the option to automatically enable a vulnerability assessment solution to Azure virtual machines and Azure Arc machines on subscriptions protected by [Azure Defender for Servers](defender-for-servers-introduction.md).
-
-If the [integration with Microsoft Defender for Endpoint](integration-defender-for-endpoint.md) is enabled, Defender for Cloud presents a choice of vulnerability assessment solutions:
-
-- (**NEW**) The Microsoft threat and vulnerability management module of Microsoft Defender for Endpoint (see [the release note](#microsoft-threat-and-vulnerability-management-added-as-vulnerability-assessment-solution-in-preview))
-- The integrated Qualys agent
-
-Your chosen solution will be automatically enabled on supported machines.
-
-Learn more in [Automatically configure vulnerability assessment for your machines](auto-deploy-vulnerability-assessment.md).
-
-### Software inventory filters added to asset inventory (in preview)
-
-The [asset inventory](asset-inventory.md) page now includes a filter to select machines running specific software - and even specify the versions of interest.
-
-Additionally, you can query the software inventory data in **Azure Resource Graph Explorer**.
-
-To use these new features, you'll need to enable the [integration with Microsoft Defender for Endpoint](integration-defender-for-endpoint.md).
-
-For full details, including sample Kusto queries for Azure Resource Graph, see [Access a software inventory](asset-inventory.md#access-a-software-inventory).
--
-### Changed prefix of some alert types from "ARM_" to "VM_"
-
-In July 2021, we announced a [logical reorganization of Azure Defender for Resource Manager alerts](release-notes-archive.md#logical-reorganization-of-azure-defender-for-resource-manager-alerts)
-
-As part of a logical reorganization of some of the Azure Defender plans, we moved twenty-one alerts from [Azure Defender for Resource Manager](defender-for-resource-manager-introduction.md) to [Azure Defender for Servers](defender-for-servers-introduction.md).
-
-With this update, we've changed the prefixes of these alerts to match this reassignment and replaced "ARM_" with "VM_" as shown in the following table:
-
-| Original name | From this change |
-||--|
-| ARM_AmBroadFilesExclusion | VM_AmBroadFilesExclusion |
-| ARM_AmDisablementAndCodeExecution | VM_AmDisablementAndCodeExecution |
-| ARM_AmDisablement | VM_AmDisablement |
-| ARM_AmFileExclusionAndCodeExecution | VM_AmFileExclusionAndCodeExecution |
-| ARM_AmTempFileExclusionAndCodeExecution | VM_AmTempFileExclusionAndCodeExecution |
-| ARM_AmTempFileExclusion | VM_AmTempFileExclusion |
-| ARM_AmRealtimeProtectionDisabled | VM_AmRealtimeProtectionDisabled |
-| ARM_AmTempRealtimeProtectionDisablement | VM_AmTempRealtimeProtectionDisablement |
-| ARM_AmRealtimeProtectionDisablementAndCodeExec | VM_AmRealtimeProtectionDisablementAndCodeExec |
-| ARM_AmMalwareCampaignRelatedExclusion | VM_AmMalwareCampaignRelatedExclusion |
-| ARM_AmTemporarilyDisablement | VM_AmTemporarilyDisablement |
-| ARM_UnusualAmFileExclusion | VM_UnusualAmFileExclusion |
-| ARM_CustomScriptExtensionSuspiciousCmd | VM_CustomScriptExtensionSuspiciousCmd |
-| ARM_CustomScriptExtensionSuspiciousEntryPoint | VM_CustomScriptExtensionSuspiciousEntryPoint |
-| ARM_CustomScriptExtensionSuspiciousPayload | VM_CustomScriptExtensionSuspiciousPayload |
-| ARM_CustomScriptExtensionSuspiciousFailure | VM_CustomScriptExtensionSuspiciousFailure |
-| ARM_CustomScriptExtensionUnusualDeletion | VM_CustomScriptExtensionUnusualDeletion |
-| ARM_CustomScriptExtensionUnusualExecution | VM_CustomScriptExtensionUnusualExecution |
-| ARM_VMAccessUnusualConfigReset | VM_VMAccessUnusualConfigReset |
-| ARM_VMAccessUnusualPasswordReset | VM_VMAccessUnusualPasswordReset |
-| ARM_VMAccessUnusualSSHReset | VM_VMAccessUnusualSSHReset |
--
-Learn more about the [Azure Defender for Resource Manager](defender-for-resource-manager-introduction.md) and [Azure Defender for Servers](defender-for-servers-introduction.md) plans.
-
-### Changes to the logic of a security recommendation for Kubernetes clusters
-
-The recommendation "Kubernetes clusters should not use the default namespace" prevents usage of the default namespace for a range of resource types. Two of the resource types that were included in this recommendation have been removed: ConfigMap and Secret.
-
-Learn more about this recommendation and hardening your Kubernetes clusters in [Understand Azure Policy for Kubernetes clusters](../governance/policy/concepts/policy-for-kubernetes.md).
-
-### Recommendations details pages now show related recommendations
-
-To clarify the relationships between different recommendations, we've added a **Related recommendations** area to the details pages of many recommendations.
-
-The three relationship types that are shown on these pages are:
-
-- **Prerequisite** - A recommendation that must be completed before the selected recommendation
-- **Alternative** - A different recommendation which provides another way of achieving the goals of the selected recommendation
-- **Dependent** - A recommendation for which the selected recommendation is a prerequisite
-
-For each related recommendation, the number of unhealthy resources is shown in the "Affected resources" column.
-
-> [!TIP]
-> If a related recommendation is grayed out, its dependency isn't yet completed and so isn't available.
-
-An example of related recommendations:
-
-1. Security Center checks your machines for supported vulnerability assessment solutions:<br>
- **A vulnerability assessment solution should be enabled on your virtual machines**
-
-1. If one is found, you'll get notified about discovered vulnerabilities:<br>
- **Vulnerabilities in your virtual machines should be remediated**
-
-Obviously, Security Center can't notify you about discovered vulnerabilities unless it finds a supported vulnerability assessment solution.
-
-Therefore:
-
- - Recommendation #1 is a prerequisite for recommendation #2
- - Recommendation #2 depends upon recommendation #1
-
-### New alerts for Azure Defender for Kubernetes (in preview)
-
-To expand the threat protections provided by Azure Defender for Kubernetes, we've added two preview alerts.
-
-These alerts are generated based on a new machine learning model and Kubernetes advanced analytics, measuring multiple deployment and role assignment attributes against previous activities in the cluster and across all clusters monitored by Azure Defender.
-
-| Alert (alert type) | Description | MITRE tactic | Severity |
-|||:--:|-|
-| **Anomalous pod deployment (Preview)**<br>(K8S_AnomalousPodDeployment) | Kubernetes audit log analysis detected pod deployment that is anomalous based on previous pod deployment activity. This activity is considered an anomaly when taking into account how the different features seen in the deployment operation are in relations to one another. The features monitored by this analytics include the container image registry used, the account performing the deployment, day of the week, how often does this account performs pod deployments, user agent used in the operation, is this a namespace which is pod deployment occur to often, or other feature. Top contributing reasons for raising this alert as anomalous activity are detailed under the alert extended properties. | Execution | Medium |
-| **Excessive role permissions assigned in Kubernetes cluster (Preview)**<br>(K8S_ServiceAcountPermissionAnomaly) | Analysis of the Kubernetes audit logs detected an excessive permissions role assignment to your cluster. From examining role assignments, the listed permissions are uncommon to the specific service account. This detection considers previous role assignments to the same service account across clusters monitored by Azure, volume per permission, and the impact of the specific permission. The anomaly detection model used for this alert takes into account how this permission is used across all clusters monitored by Azure Defender. | Privilege Escalation | Low |
--
-For a full list of the Kubernetes alerts, see [Alerts for Kubernetes clusters](alerts-reference.md#alerts-k8scluster).
defender-for-cloud Upcoming Changes https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-cloud/upcoming-changes.md
Title: Important changes coming to Microsoft Defender for Cloud description: Upcoming changes to Microsoft Defender for Cloud that you might need to be aware of and for which you might need to plan Previously updated : 04/06/2022 Last updated : 04/11/2022 # Important upcoming changes to Microsoft Defender for Cloud
If you're looking for the latest release notes, you'll find them in the [What's
| Planned change | Estimated date for change | |--|--|
-| [Changes to recommendations for managing endpoint protection solutions](#changes-to-recommendations-for-managing-endpoint-protection-solutions) | March 2022 |
+| [Changes to recommendations for managing endpoint protection solutions](#changes-to-recommendations-for-managing-endpoint-protection-solutions) | May 2022 |
| [Multiple changes to identity recommendations](#multiple-changes-to-identity-recommendations) | May 2022 |
+| [Changes to vulnerability assessment](#changes-to-vulnerability-assessment) | May 2022 |
### Changes to recommendations for managing endpoint protection solutions
-**Estimated date for change:** March 2022
+**Estimated date for change:** May 2022
In August 2021, we added two new **preview** recommendations to deploy and maintain the endpoint protection solutions on your machines. For full details, [see the release note](release-notes-archive.md#two-new-recommendations-for-managing-endpoint-protection-solutions-in-preview).
Defender for Cloud includes multiple recommendations for improving the managemen
|Description |User accounts that have been blocked from signing in, should be removed from your subscriptions.<br>These accounts can be targets for attackers looking to find ways to access your data without being noticed.|User accounts that have been blocked from signing into Active Directory, should be removed from your subscriptions. These accounts can be targets for attackers looking to find ways to access your data without being noticed.<br>Learn more about securing the identity perimeter in [Azure Identity Management and access control security best practices](../security/fundamentals/identity-management-best-practices.md).| |Related policy |[Deprecated accounts should be removed from your subscription](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2fproviders%2fMicrosoft.Authorization%2fpolicyDefinitions%2f6b1cbf55-e8b6-442f-ba4c-7246b6381474)|Subscriptions should be purged of accounts that are blocked in Active Directory and have read and write permissions|
+### Changes to vulnerability assessment
+
+**Estimated date for change:** May 2022
+
+Currently, Defender for Containers doesn't show medium- and low-severity vulnerabilities that aren't patchable.
+
+As part of this update, medium- and low-severity vulnerabilities that don't have patches will be shown as well. This update provides maximum visibility, while still allowing you to filter out undesired vulnerabilities by using the provided Disable rule.
++
+Learn more about [vulnerability management](deploy-vulnerability-assessment-tvm.md).
## Next steps
defender-for-iot Concept Micro Agent Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/device-builders/concept-micro-agent-configuration.md
The micro agent's behavior is configured by a set of module twin properties. You
After any change in configuration, the collector will immediately send all unsent event data. After the data is sent, the changes will be applied, and all the collectors will restart.
-> [!Note]
-> Aggregation mode is supported, but it is not configurable.
- ## General configuration Define the frequency in which messages are sent for each priority level. All values are required.
defender-for-iot How To Identify Required Appliances https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/how-to-identify-required-appliances.md
Title: Identify required appliances
-description: Learn about hardware and virtual appliances for certified Defender for IoT sensors and the on-premises management console.
Previously updated : 03/23/2022
+ Title: OT monitoring appliance catalog - Microsoft Defender for IoT
+description: Learn about hardware and virtual appliances for certified Microsoft Defender for IoT sensors and the on-premises management console.
Last updated : 04/10/2022
-# Identify required appliances
+# OT monitoring appliance catalog
This article provides information on certified Defender for IoT sensor appliances. Defender for IoT can be deployed on physical and virtual appliances.
defender-for-iot How To Install Software https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/defender-for-iot/organizations/how-to-install-software.md
To install the software:
## Legacy appliances
-This section describes installation procedures for *legacy* appliances only. See [Identify required appliances](how-to-identify-required-appliances.md#identify-required-appliances), if you are buying a new appliance.
+This section describes installation procedures for *legacy* appliances only. See [Identify required appliances](how-to-identify-required-appliances.md), if you are buying a new appliance.
### Nuvo 5006LP installation
digital-twins Concepts Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/concepts-cli.md
In many twin queries, the `$` character is used to reference the `$dtId` propert
Here is an example of querying for a twin with a CLI command in the Cloud Shell Bash environment: ```azurecli
-az dt twin query -n <instance-name> -q "SELECT * FROM DigitalTwins T Where T.\$dtId = 'room0'"
+az dt twin query --dt-name <instance-hostname-or-name> --query-command "SELECT * FROM DigitalTwins T Where T.\$dtId = 'room0'"
``` ### PowerShell
Some commands, like [az dt twin create](/cli/azure/dt/twin#az-dt-twin-create), a
Here is an example of creating a twin with a CLI command in PowerShell: ```azurecli
-az dt twin create --dt-name <instance-name> --dtmi "dtmi:contosocom:DigitalTwins:Thermostat;1" --twin-id thermostat67 --properties '{\"Temperature\": 0.0}'
+az dt twin create --dt-name <instance-hostname-or-name> --dtmi "dtmi:contosocom:DigitalTwins:Thermostat;1" --twin-id thermostat67 --properties '{\"Temperature\": 0.0}'
``` >[!TIP]
In many twin queries, the `$` character is used to reference the `$dtId` propert
Here is an example of querying for a twin with a CLI command in PowerShell: ```azurecli
-az dt twin query -n <instance-name> -q "SELECT * FROM DigitalTwins T Where T.`$dtId = 'room0'"
+az dt twin query --dt-name <instance-hostname-or-name> --query-command "SELECT * FROM DigitalTwins T Where T.`$dtId = 'room0'"
``` ### Windows CMD
Some commands, like [az dt twin create](/cli/azure/dt/twin#az-dt-twin-create), a
Here is an example of creating a twin with a CLI command in the local Windows CMD: ```azurecli
-az dt twin create --dt-name <instance-name> --dtmi "dtmi:contosocom:DigitalTwins:Thermostat;1" --twin-id thermostat67 --properties "{\"Temperature\": 0.0}"
+az dt twin create --dt-name <instance-hostname-or-name> --dtmi "dtmi:contosocom:DigitalTwins:Thermostat;1" --twin-id thermostat67 --properties "{\"Temperature\": 0.0}"
``` >[!TIP]
digital-twins How To Ingest Iot Hub Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-ingest-iot-hub-data.md
To create a thermostat-type twin, you'll first need to upload the thermostat [mo
[!INCLUDE [digital-twins-thermostat-model-upload.md](../../includes/digital-twins-thermostat-model-upload.md)]
-You'll then need to create one twin using this model. Use the following command to create a thermostat twin named thermostat67, and set 0.0 as an initial temperature value.
+You'll then need to create one twin using this model. Use the following command to create a thermostat twin named thermostat67, and set 0.0 as an initial temperature value. There's one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
-az dt twin create --dt-name <instance-name> --dtmi "dtmi:contosocom:DigitalTwins:Thermostat;1" --twin-id thermostat67 --properties '{"Temperature": 0.0}'
+az dt twin create --dt-name <instance-hostname-or-name> --dtmi "dtmi:contosocom:DigitalTwins:Thermostat;1" --twin-id thermostat67 --properties '{"Temperature": 0.0}'
``` >[!NOTE]
In the end-to-end tutorial, complete the following steps:
## Validate your results
-While running the device simulator above, the temperature value of your digital twin will be changing. In the Azure CLI, run the following command to see the temperature value.
+While running the device simulator above, the temperature value of your digital twin will be changing. In the Azure CLI, run the following command to see the temperature value. There's one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
-az dt twin query --query-command "select * from digitaltwins" --dt-name <Digital-Twins-instance-name>
+az dt twin query --query-command "select * from digitaltwins" --dt-name <instance-hostname-or-name>
``` Your output should contain a temperature value like this:
digital-twins How To Integrate Maps https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-integrate-maps.md
Azure Digital Twins instances can emit twin update events whenever a twin's stat
This pattern reads from the room twin directly, rather than the IoT device, which gives you the flexibility to change the underlying data source for temperature without needing to update your mapping logic. For example, you can add multiple thermometers or set this room to share a thermometer with another room, all without needing to update your map logic.
-1. Create an Event Grid topic, which will receive events from your Azure Digital Twins instance.
+1. Create an Event Grid topic, which will receive events from your Azure Digital Twins instance, using the CLI command below:
```azurecli-interactive az eventgrid topic create --resource-group <your-resource-group-name> --name <your-topic-name> --location <region> ```
-2. Create an endpoint to link your Event Grid topic to Azure Digital Twins.
+2. Create an endpoint to link your Event Grid topic to Azure Digital Twins, using the CLI command below:
```azurecli-interactive az dt endpoint create eventgrid --endpoint-name <Event-Grid-endpoint-name> --eventgrid-resource-group <Event-Grid-resource-group-name> --eventgrid-topic <your-Event-Grid-topic-name> --dt-name <your-Azure-Digital-Twins-instance-name> ```
-3. Create a route in Azure Digital Twins to send twin update events to your endpoint.
+3. Create a route in Azure Digital Twins to send twin update events to your endpoint, using the CLI command below. For the Azure Digital Twins instance name placeholder in this command, you can use the instance's host name, or its friendly name with a slight decrease in performance.
>[!NOTE] >There is currently a known issue in Cloud Shell affecting these command groups: `az dt route`, `az dt model`, `az dt twin`.
This pattern reads from the room twin directly, rather than the IoT device, whic
>To resolve, either run `az login` in Cloud Shell prior to running the command, or use the [local CLI](/cli/azure/install-azure-cli) instead of Cloud Shell. For more detail on this, see [Troubleshoot known issues](troubleshoot-known-issues.md#400-client-error-bad-request-in-cloud-shell).
```azurecli-interactive
- az dt route create --dt-name <your-Azure-Digital-Twins-instance-name> --endpoint-name <Event-Grid-endpoint-name> --route-name <my-route> --filter "type = 'Microsoft.DigitalTwins.Twin.Update'"
+ az dt route create --dt-name <your-Azure-Digital-Twins-instance-hostname-or-name> --endpoint-name <Event-Grid-endpoint-name> --route-name <my-route> --filter "type = 'Microsoft.DigitalTwins.Twin.Update'"
``` ## Create a function to update maps
digital-twins How To Integrate Time Series Insights https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-integrate-time-series-insights.md
az dt endpoint create eventhub --dt-name <your-Azure-Digital-Twins-instance-name
Azure Digital Twins instances can emit [twin update events](./concepts-event-notifications.md) whenever a twin's state is updated. In this section, you'll create an Azure Digital Twins event route that will direct these update events to the twins hub for further processing.
-Create a [route](concepts-route-events.md#create-an-event-route) in Azure Digital Twins to send twin update events to your endpoint from above. The filter in this route will only allow twin update messages to be passed to your endpoint. Specify a name for the twins hub event route.
+Create a [route](concepts-route-events.md#create-an-event-route) in Azure Digital Twins to send twin update events to your endpoint from above. The filter in this route will only allow twin update messages to be passed to your endpoint. Specify a name for the twins hub event route. For the Azure Digital Twins instance name placeholder in this command, you can use the instance's host name for better performance, or its friendly name.
```azurecli-interactive
-az dt route create --dt-name <your-Azure-Digital-Twins-instance-name> --endpoint-name <your-twins-hub-endpoint-from-earlier> --route-name <name-for-your-twins-hub-event-route> --filter "type = 'Microsoft.DigitalTwins.Twin.Update'"
+az dt route create --dt-name <your-Azure-Digital-Twins-instance-hostname-or-name> --endpoint-name <your-twins-hub-endpoint-from-earlier> --route-name <name-for-your-twins-hub-event-route> --filter "type = 'Microsoft.DigitalTwins.Twin.Update'"
``` ### Get twins hub connection string
In this section, you'll set up Time Series Insights instance to receive data fro
To begin sending data to Time Series Insights, you'll need to start updating the digital twin properties in Azure Digital Twins with changing data values.
-Use the [az dt twin update](/cli/azure/dt/twin#az-dt-twin-update) CLI command to update a property on the twin you added in the [Prerequisites](#prerequisites) section. If you used the twin creation instructions from [Ingest telemetry from IoT Hub](how-to-ingest-iot-hub-data.md)), you can use the following command in the local CLI or the Cloud Shell bash terminal to update the temperature property on the thermostat67 twin.
+Use the [az dt twin update](/cli/azure/dt/twin#az-dt-twin-update) CLI command to update a property on the twin you added in the [Prerequisites](#prerequisites) section. If you used the twin creation instructions from [Ingest telemetry from IoT Hub](how-to-ingest-iot-hub-data.md), you can use the following command in the local CLI or the Cloud Shell bash terminal to update the temperature property on the thermostat67 twin. There's one placeholder for the Azure Digital Twins instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
-az dt twin update --dt-name <your-Azure-Digital-Twins-instance-name> --twin-id thermostat67 --json-patch '{"op":"replace", "path":"/Temperature", "value": 20.5}'
+az dt twin update --dt-name <your-Azure-Digital-Twins-instance-hostname-or-name> --twin-id thermostat67 --json-patch '{"op":"replace", "path":"/Temperature", "value": 20.5}'
``` Repeat the command at least 4 more times with different property values to create several data points that can be observed later in the Time Series Insights environment.
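Rather than editing the value by hand each time, the repeated updates can be scripted. This is a sketch for the bash terminal, assuming the same instance placeholder as above; the temperature values are arbitrary:

```azurecli-interactive
# Send five twin updates with varying temperatures to generate data points.
for temp in 21.0 22.5 19.8 23.1 20.0; do
  az dt twin update --dt-name <your-Azure-Digital-Twins-instance-hostname-or-name> --twin-id thermostat67 --json-patch '{"op":"replace", "path":"/Temperature", "value": '"$temp"'}'
done
```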
digital-twins How To Provision Using Device Provisioning Service https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-provision-using-device-provisioning-service.md
You should see the device being registered and connected to IoT Hub, and then st
### Validate
-The flow you've set up in this article will result in the device automatically being registered in Azure Digital Twins. Use the following [Azure Digital Twins CLI](/cli/azure/dt/twin#az-dt-twin-show) command to find the twin of the device in the Azure Digital Twins instance you created.
+The flow you've set up in this article will result in the device automatically being registered in Azure Digital Twins. Use the following [Azure Digital Twins CLI](/cli/azure/dt/twin#az-dt-twin-show) command to find the twin of the device in the Azure Digital Twins instance you created. There's a placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance), and a placeholder for the device registration ID.
```azurecli-interactive
-az dt twin show --dt-name <Digital-Twins-instance-name> --twin-id "<Device-Registration-ID>"
+az dt twin show --dt-name <instance-hostname-or-name> --twin-id "<device-registration-ID>"
``` You should see the twin of the device being found in the Azure Digital Twins instance.
Follow the steps below to delete the device in the Azure portal:
It might take a few minutes to see the changes reflected in Azure Digital Twins.
-Use the following [Azure Digital Twins CLI](/cli/azure/dt/twin#az-dt-twin-show) command to verify the twin of the device in the Azure Digital Twins instance was deleted.
+Use the following [Azure Digital Twins CLI](/cli/azure/dt/twin#az-dt-twin-show) command to verify the twin of the device in the Azure Digital Twins instance was deleted. There's a placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance), and a placeholder for the device registration ID.
```azurecli-interactive
-az dt twin show --dt-name <Digital-Twins-instance-name> --twin-id "<Device-Registration-ID>"
+az dt twin show --dt-name <instance-hostname-or-name> --twin-id "<device-registration-ID>"
``` You should see that the twin of the device cannot be found in the Azure Digital Twins instance anymore.
digital-twins How To Route With Managed Identity https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/how-to-route-with-managed-identity.md
You can add the `--scopes` parameter onto the `az dt create` command to assign t
Here's an example that creates an instance with a system-assigned managed identity, and assigns that identity a custom role called `MyCustomRole` in an event hub.
```azurecli-interactive
-az dt create --dt-name <instance-name> --resource-group <resource-group> --assign-identity --scopes "/subscriptions/<subscription ID>/resourceGroups/<resource-group>/providers/Microsoft.EventHub/namespaces/<Event-Hubs-namespace>/eventhubs/<event-hub-name>" --role MyCustomRole
+az dt create --dt-name <new-instance-name> --resource-group <resource-group> --assign-identity --scopes "/subscriptions/<subscription ID>/resourceGroups/<resource-group>/providers/Microsoft.EventHub/namespaces/<Event-Hubs-namespace>/eventhubs/<event-hub-name>" --role MyCustomRole
``` For more examples of role assignments with this command, see the [az dt create reference documentation](/cli/azure/dt#az-dt-create).
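As a variation, a built-in role can be assigned instead of a custom one. This sketch assumes the built-in **Azure Event Hubs Data Sender** role and reuses the same placeholders as the example above:

```azurecli-interactive
# Create the instance with a system-assigned identity and grant it a built-in role on the event hub.
az dt create --dt-name <new-instance-name> --resource-group <resource-group> --assign-identity --scopes "/subscriptions/<subscription ID>/resourceGroups/<resource-group>/providers/Microsoft.EventHub/namespaces/<Event-Hubs-namespace>/eventhubs/<event-hub-name>" --role "Azure Event Hubs Data Sender"
```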
digital-twins Tutorial Command Line Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/digital-twins/tutorial-command-line-cli.md
After you set up your Azure Digital Twins instance, make a note of the following
* The **Azure subscription** that you used to create the instance >[!TIP]
->If you know the name of your instance, you can use the following CLI command to get the host name and subscription values:
+>If you know the friendly name of your instance, you can use the following CLI command to get the host name and subscription values:
>
>```azurecli-interactive
>az dt show --dt-name <Azure-Digital-Twins-instance-name>
>```
After designing models, you need to upload them to your Azure Digital Twins inst
Navigate to the *Room.json* file on your machine and select "Open." Then, repeat this step for *Floor.json*.
-1. Next, use the [az dt model create](/cli/azure/dt/model#az-dt-model-create) command as shown below to upload your updated Room model to your Azure Digital Twins instance. The second command uploads another model, Floor, which you'll also use in the next section to create different types of twins. If you're using Cloud Shell, *Room.json* and *Floor.json* are in the main storage directory, so you can just use the file names directly in the command below where a path is required.
+1. Next, use the [az dt model create](/cli/azure/dt/model#az-dt-model-create) command as shown below to upload your updated Room model to your Azure Digital Twins instance. The second command uploads another model, Floor, which you'll also use in the next section to create different types of twins. There's a placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance), and a placeholder for a path to each model file. If you're using Cloud Shell, *Room.json* and *Floor.json* are in the main storage directory, so you can just use the file names directly in the command below where a path is required.
```azurecli-interactive
- az dt model create --dt-name <Azure-Digital-Twins-instance-name> --models <path-to-Room.json>
- az dt model create --dt-name <Azure-Digital-Twins-instance-name> --models <path-to-Floor.json>
+ az dt model create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --models <path-to-Room.json>
+ az dt model create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --models <path-to-Floor.json>
``` The output from each command will show information about the successfully uploaded model.
After designing models, you need to upload them to your Azure Digital Twins inst
>[!TIP] >You can also upload all models within a directory at the same time, by using the `--from-directory` option for the model create command. For more information, see [Optional parameters for az dt model create](/cli/azure/dt/model#az-dt-model-create-optional-parameters).
-1. Verify the models were created with the [az dt model list](/cli/azure/dt/model#az-dt-model-list) command as shown below. Doing so will print a list of all models that have been uploaded to the Azure Digital Twins instance with their full information.
+1. Verify the models were created with the [az dt model list](/cli/azure/dt/model#az-dt-model-list) command as shown below. Doing so will print a list of all models that have been uploaded to the Azure Digital Twins instance with their full information. There's one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
- az dt model list --dt-name <Azure-Digital-Twins-instance-name> --definition
+ az dt model list --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --definition
``` Look for the edited Room model in the results:
The CLI also handles errors from the service.
Rerun the `az dt model create` command to try uploading one of the same models a second time:
```azurecli-interactive
-az dt model create --dt-name <Azure-Digital-Twins-instance-name> --models Room.json
+az dt model create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --models Room.json
``` As models cannot be overwritten, running this command on the same model will now return an error code of `ModelIdAlreadyExists`.
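If you do need to re-upload a model with the same ID, one option (a sketch; note that deleting a model affects any twins that still use it) is to delete the existing model first with [az dt model delete](/cli/azure/dt/model#az-dt-model-delete):

```azurecli-interactive
# Delete the existing Room model by its DTMI so the ID can be reused.
az dt model delete --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --dtmi "dtmi:example:Room;2"
```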
Now that some models have been uploaded to your Azure Digital Twins instance, yo
To create a digital twin, you use the [az dt twin create](/cli/azure/dt/twin#az-dt-twin-create) command. You must reference the model that the twin is based on, and can optionally define initial values for any properties in the model. You don't have to pass any relationship information at this stage.
-1. Run this code in the CLI to create several twins, based on the Room model you updated earlier and another model, Floor. Recall that Room has three properties, so you can provide arguments with the initial values for these properties. (Initializing property values is optional in general, but they're needed for this tutorial.)
+1. Run this code in the CLI to create several twins, based on the Room model you updated earlier and another model, Floor. Recall that Room has three properties, so you can provide arguments with the initial values for these properties. (Initializing property values is optional in general, but they're needed for this tutorial.) There's one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
- az dt twin create --dt-name <Azure-Digital-Twins-instance-name> --dtmi "dtmi:example:Room;2" --twin-id room0 --properties '{"RoomName":"Room0", "Temperature":70, "HumidityLevel":30}'
- az dt twin create --dt-name <Azure-Digital-Twins-instance-name> --dtmi "dtmi:example:Room;2" --twin-id room1 --properties '{"RoomName":"Room1", "Temperature":80, "HumidityLevel":60}'
- az dt twin create --dt-name <Azure-Digital-Twins-instance-name> --dtmi "dtmi:example:Floor;1" --twin-id floor0
- az dt twin create --dt-name <Azure-Digital-Twins-instance-name> --dtmi "dtmi:example:Floor;1" --twin-id floor1
+ az dt twin create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --dtmi "dtmi:example:Room;2" --twin-id room0 --properties '{"RoomName":"Room0", "Temperature":70, "HumidityLevel":30}'
+ az dt twin create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --dtmi "dtmi:example:Room;2" --twin-id room1 --properties '{"RoomName":"Room1", "Temperature":80, "HumidityLevel":60}'
+ az dt twin create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --dtmi "dtmi:example:Floor;1" --twin-id floor0
+ az dt twin create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --dtmi "dtmi:example:Floor;1" --twin-id floor1
``` >[!NOTE]
To create a digital twin, you use the [az dt twin create](/cli/azure/dt/twin#az-
The output from each command will show information about the successfully created twin (including properties for the room twins that were initialized with them).
-1. You can verify that the twins were created with the [az dt twin query](/cli/azure/dt/twin#az-dt-twin-query) command as shown below. The query shown finds all the digital twins in your Azure Digital Twins instance.
+1. You can verify that the twins were created with the [az dt twin query](/cli/azure/dt/twin#az-dt-twin-query) command as shown below. The query shown finds all the digital twins in your Azure Digital Twins instance. There's one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
- az dt twin query --dt-name <Azure-Digital-Twins-instance-name> --query-command "SELECT * FROM DIGITALTWINS"
+ az dt twin query --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --query-command "SELECT * FROM DIGITALTWINS"
``` Look for the room0, room1, floor0, and floor1 twins in the results. Here's an excerpt showing part of the result of this query.
To create a digital twin, you use the [az dt twin create](/cli/azure/dt/twin#az-
You can also modify the properties of a twin you've created.
-1. Run this [az dt twin update](/cli/azure/dt/twin#az-dt-twin-update) command to change room0's RoomName from Room0 to PresidentialSuite:
+1. Run the following [az dt twin update](/cli/azure/dt/twin#az-dt-twin-update) command to change room0's RoomName from Room0 to PresidentialSuite. There's one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
- az dt twin update --dt-name <Azure-Digital-Twins-instance-name> --twin-id room0 --json-patch '{"op":"add", "path":"/RoomName", "value": "PresidentialSuite"}'
+ az dt twin update --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id room0 --json-patch '{"op":"add", "path":"/RoomName", "value": "PresidentialSuite"}'
``` >[!NOTE]
You can also modify the properties of a twin you've created.
:::image type="content" source="media/tutorial-command-line/cli/output-update-twin.png" alt-text="Screenshot of Cloud Shell showing result of the update command, which includes a RoomName of PresidentialSuite." lightbox="media/tutorial-command-line/cli/output-update-twin.png":::
-1. You can verify the update succeeded by running the [az dt twin show](/cli/azure/dt/twin#az-dt-twin-show) command to see room0's information:
+1. You can verify the update succeeded by running the [az dt twin show](/cli/azure/dt/twin#az-dt-twin-show) command to see room0's information. There's one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
- az dt twin show --dt-name <Azure-Digital-Twins-instance-name> --twin-id room0
+ az dt twin show --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id room0
``` The output should reflect the updated name.
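To pull out just the updated property instead of the full twin JSON, one option (a sketch using the CLI's built-in JMESPath support) is the global `--query` argument:

```azurecli-interactive
# Print only the RoomName property of room0 as plain text.
az dt twin show --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id room0 --query "RoomName" --output tsv
```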
The types of relationships that you can create from one twin to another are defi
To add a relationship, use the [az dt twin relationship create](/cli/azure/dt/twin/relationship#az-dt-twin-relationship-create) command. Specify the twin that the relationship is coming from, the type of relationship, and the twin that the relationship is connecting to. Lastly, give the relationship a unique ID. If a relationship was defined to have properties, you can initialize the relationship properties in this command as well.
-1. Run the following code to add a `contains`-type relationship from each of the Floor twins you created earlier to the corresponding Room twin. The relationships are named relationship0 and relationship1.
+1. Run the following code to add a `contains`-type relationship from each of the Floor twins you created earlier to the corresponding Room twin. The relationships are named relationship0 and relationship1. There's one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
```azurecli-interactive
- az dt twin relationship create --dt-name <Azure-Digital-Twins-instance-name> --relationship-id relationship0 --relationship contains --twin-id floor0 --target room0
- az dt twin relationship create --dt-name <Azure-Digital-Twins-instance-name> --relationship-id relationship1 --relationship contains --twin-id floor1 --target room1
+ az dt twin relationship create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --relationship-id relationship0 --relationship contains --twin-id floor0 --target room0
+ az dt twin relationship create --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --relationship-id relationship1 --relationship contains --twin-id floor1 --target room1
``` >[!TIP]
To add a relationship, use the [az dt twin relationship create](/cli/azure/dt/tw
The output from each command will show information about the successfully created relationship.
-1. You can verify the relationships with any of the following commands, which print the relationships in your Azure Digital Twins instance.
+1. You can verify the relationships with any of the following commands, which print the relationships in your Azure Digital Twins instance. Each command has one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
* To see all relationships coming off of each floor (viewing the relationships from one side): ```azurecli-interactive
- az dt twin relationship list --dt-name <Azure-Digital-Twins-instance-name> --twin-id floor0
- az dt twin relationship list --dt-name <Azure-Digital-Twins-instance-name> --twin-id floor1
+ az dt twin relationship list --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id floor0
+ az dt twin relationship list --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id floor1
``` * To see all relationships arriving at each room (viewing the relationship from the "other" side): ```azurecli-interactive
- az dt twin relationship list --dt-name <Azure-Digital-Twins-instance-name> --twin-id room0 --incoming
- az dt twin relationship list --dt-name <Azure-Digital-Twins-instance-name> --twin-id room1 --incoming
+ az dt twin relationship list --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id room0 --incoming
+ az dt twin relationship list --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id room1 --incoming
``` * To look for these relationships individually, by ID: ```azurecli-interactive
- az dt twin relationship show --dt-name <Azure-Digital-Twins-instance-name> --twin-id floor0 --relationship-id relationship0
- az dt twin relationship show --dt-name <Azure-Digital-Twins-instance-name> --twin-id floor1 --relationship-id relationship1
+ az dt twin relationship show --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id floor0 --relationship-id relationship0
+ az dt twin relationship show --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --twin-id floor1 --relationship-id relationship1
``` The twins and relationships you have set up in this tutorial form the following conceptual graph:
A main feature of Azure Digital Twins is the ability to [query](concepts-query-l
[!INCLUDE [digital-twins-query-latency-note.md](../../includes/digital-twins-query-latency-note.md)]
-Run the following queries in the CLI to answer some questions about the sample environment.
+Run the following queries in the CLI to answer some questions about the sample environment. Each command has one placeholder for the instance's host name (you can also use the instance's friendly name with a slight decrease in performance).
1. What are all the entities from my environment represented in Azure Digital Twins? (query all) ```azurecli-interactive
- az dt twin query --dt-name <Azure-Digital-Twins-instance-name> --query-command "SELECT * FROM DIGITALTWINS"
+ az dt twin query --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --query-command "SELECT * FROM DIGITALTWINS"
``` This query allows you to take stock of your environment at a glance, and make sure everything is represented as you want it to be within Azure Digital Twins. The result of this query is an output containing each digital twin with its details. Here's an excerpt:
Run the following queries in the CLI to answer some questions about the sample e
1. What are all the rooms in my environment? (query by model) ```azurecli-interactive
- az dt twin query --dt-name <Azure-Digital-Twins-instance-name> --query-command "SELECT * FROM DIGITALTWINS T WHERE IS_OF_MODEL(T, 'dtmi:example:Room;2')"
+ az dt twin query --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --query-command "SELECT * FROM DIGITALTWINS T WHERE IS_OF_MODEL(T, 'dtmi:example:Room;2')"
``` You can restrict your query to twins of a certain type, to get more specific information about what's represented. The result of this shows room0 and room1, but doesn't show floor0 or floor1 (since they're floors, not rooms).
Run the following queries in the CLI to answer some questions about the sample e
1. What are all the rooms on floor0? (query by relationship) ```azurecli-interactive
- az dt twin query --dt-name <Azure-Digital-Twins-instance-name> --query-command "SELECT room FROM DIGITALTWINS floor JOIN room RELATED floor.contains where floor.\$dtId = 'floor0'"
+ az dt twin query --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --query-command "SELECT room FROM DIGITALTWINS floor JOIN room RELATED floor.contains where floor.\$dtId = 'floor0'"
``` You can query based on relationships in your graph, to get information about how twins are connected or to restrict your query to a certain area. This query also illustrates that a twin's ID (like floor0 in the query above) is queried using the metadata field `$dtId`. Only room0 is on floor0, so it's the only room in the result for this query.
Run the following queries in the CLI to answer some questions about the sample e
1. What are all the twins in my environment with a temperature above 75? (query by property) ```azurecli-interactive
- az dt twin query --dt-name <Azure-Digital-Twins-instance-name> --query-command "SELECT * FROM DigitalTwins T WHERE T.Temperature > 75"
+ az dt twin query --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --query-command "SELECT * FROM DigitalTwins T WHERE T.Temperature > 75"
``` You can query the graph based on properties to answer different kinds of questions, including finding outliers in your environment that might need attention. Other comparison operators (*<*,*>*, *=*, or *!=*) are also supported. room1 shows up in the results here, because it has a temperature of 80.
Run the following queries in the CLI to answer some questions about the sample e
1. What are all the rooms on floor0 with a temperature above 75? (compound query) ```azurecli-interactive
- az dt twin query --dt-name <Azure-Digital-Twins-instance-name> --query-command "SELECT room FROM DIGITALTWINS floor JOIN room RELATED floor.contains where floor.\$dtId = 'floor0' AND IS_OF_MODEL(room, 'dtmi:example:Room;2') AND room.Temperature > 75"
+ az dt twin query --dt-name <Azure-Digital-Twins-instance-hostname-or-name> --query-command "SELECT room FROM DIGITALTWINS floor JOIN room RELATED floor.contains where floor.\$dtId = 'floor0' AND IS_OF_MODEL(room, 'dtmi:example:Room;2') AND room.Temperature > 75"
``` You can also combine the earlier queries like you would in SQL, using combination operators such as `AND`, `OR`, `NOT`. This query uses `AND` to make the previous query about twin temperatures more specific. The result now only includes rooms with temperatures above 75 that are on floor0, which in this case is none of them. The result set is empty.
event-grid Delivery Properties https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/delivery-properties.md
Authorization: BEARER SlAV32hkKG...
``` > [!NOTE]
-> Defining authorization headers is a sensible option when your destination is a Webhook. It should not be used for [functions subscribed with a resource id](/rest/api/eventgrid/controlplane-version2021-06-01-preview/event-subscriptions/create-or-update#azurefunctioneventsubscriptiondestination), Service Bus, Event Hubs, and Hybrid Connections as those destinations support their own authentication schemes when used with Event Grid.
+> Defining authorization headers is a sensible option when your destination is a Webhook. It should not be used for [functions subscribed with a resource id](/rest/api/eventgrid/controlplane-version2021-10-15-preview/event-subscriptions/create-or-update#azurefunctioneventsubscriptiondestination), Service Bus, Event Hubs, and Hybrid Connections as those destinations support their own authentication schemes when used with Event Grid.
### Service Bus example Azure Service Bus supports the use of the following message properties when sending single messages.
event-grid Event Schema Media Services https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/event-schema-media-services.md
The data object has the following properties:
| `encoderPort` | string | Port of the encoder that this stream is coming from. | | `resultCode` | string | The reason the connection was rejected. The result codes are listed in the following table. |
-You can find the error result codes in [live Event error codes](/media-services/latest/live-event-error-codes-reference).
+You can find the error result codes in [live Event error codes](/azure/media-services/latest/live-event-error-codes-reference).
### LiveEventEncoderConnected
The data object has the following properties:
| `encoderPort` | string | Port of the encoder that this stream is coming from. | | `resultCode` | string | The reason for the encoder disconnecting. It could be a graceful disconnect or the result of an error. The result codes are listed in the following table. |
-You can find the error result codes in [live Event error codes](/media-services/latest/live-event-error-codes-reference).
+You can find the error result codes in [live Event error codes](/azure/media-services/latest/live-event-error-codes-reference).
The graceful disconnect result codes are:
An event has the following top-level data:
## Next steps
-[Register for job state change events](/media-services/latest/monitoring/job-state-events-cli-how-to)
+[Register for job state change events](/azure/media-services/latest/monitoring/job-state-events-cli-how-to)
## See also - [EventGrid .NET SDK that includes Media Service events](https://www.nuget.org/packages/Microsoft.Azure.EventGrid/) - [Definitions of Media Services events](https://github.com/Azure/azure-rest-api-specs/blob/master/specification/eventgrid/data-plane/Microsoft.Media/stable/2018-01-01/MediaServices.json)
-- [Live Event error codes](/media-services/latest/live-event-error-codes-reference)
+- [Live Event error codes](/azure/media-services/latest/live-event-error-codes-reference)
event-grid System Topics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/system-topics.md
In the past, a system topic was implicit and wasn't exposed for simplicity. Syst
## Lifecycle of system topics You can create a system topic in two ways:
-- Create an [event subscription on an Azure resource as an extension resource](/rest/api/eventgrid/controlplane-version2021-06-01-preview/event-subscriptions/create-or-update), which automatically creates a system topic with the name in the format: `<Azure resource name>-<GUID>`. The system topic created in this way is automatically deleted when the last event subscription for the topic is deleted.
+- Create an [event subscription on an Azure resource as an extension resource](/rest/api/eventgrid/controlplane-version2021-10-15-preview/event-subscriptions/create-or-update), which automatically creates a system topic with the name in the format: `<Azure resource name>-<GUID>`. The system topic created in this way is automatically deleted when the last event subscription for the topic is deleted.
- Create a system topic for an Azure resource, and then create an event subscription for that system topic. When you use this method, you can specify a name for the system topic. The system topic isn't deleted automatically when the last event subscription is deleted. You need to manually delete it. When you use the Azure portal, you're always using this method. When you create an event subscription using the [**Events** page of an Azure resource](blob-event-quickstart-portal.md#subscribe-to-the-blob-storage), the system topic is created first and then the subscription for the topic is created. You can explicitly create a system topic first by using the [**Event Grid System Topics** page](create-view-manage-system-topics.md#create-a-system-topic) and then create a subscription for that topic.
event-grid Whats New https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/event-grid/whats-new.md
This release corresponds to REST API version 2021-10-15-preview, which includes
This release corresponds to REST API version 2021-06-01-preview, which includes the following new features: - [Azure Active Directory authentication for topics and domains, and partner namespaces](authenticate-with-active-directory.md)-- [Private link support for partner namespaces](/rest/api/eventgrid/controlplane-version2021-06-01-preview/partner-namespaces/create-or-update#privateendpoint). Azure portal doesn't support it yet. -- [IP Filtering for partner namespaces](/rest/api/eventgrid/controlplane-version2021-06-01-preview/partner-namespaces/create-or-update#inboundiprule). Azure portal doesn't support it yet. -- [System Identity for partner topics](/rest/api/eventgrid/controlplane-version2021-06-01-preview/partner-topics/update#request-body). Azure portal doesn't support it yet.
+- [Private link support for partner namespaces](/rest/api/eventgrid/controlplane-version2021-10-15-preview/partner-namespaces/create-or-update#privateendpoint). Azure portal doesn't support it yet.
+- [IP Filtering for partner namespaces](/rest/api/eventgrid/controlplane-version2021-10-15-preview/partner-namespaces/create-or-update#inboundiprule). Azure portal doesn't support it yet.
+- [System Identity for partner topics](/rest/api/eventgrid/controlplane-version2021-10-15-preview/partner-topics/update#request-body). Azure portal doesn't support it yet.
- [User Identity for system topics, custom topics and domains](enable-identity-custom-topics-domains.md) ## 6.1.0-preview (2020-10)
This release corresponds to REST API version 2021-06-01-preview, which includes
- This release corresponds to the `2019-06-01` API version. - It adds support to the following new functionalities: * [Domains](event-domains.md)
- * Pagination and search filter for resources list operations. For an example, see [Topics - List By Subscription](/rest/api/eventgrid/controlplane-version2021-06-01-preview/partner-namespaces/list-by-subscription).
+ * Pagination and search filter for resources list operations. For an example, see [Topics - List By Subscription](/rest/api/eventgrid/controlplane-version2021-10-15-preview/partner-namespaces/list-by-subscription).
* [Service Bus queue as destination](handler-service-bus.md) * [Advanced filtering](event-filtering.md#advanced-filtering) ## 4.1.0-preview (2019-03) - This release corresponds to the 2019-02-01-preview API version. - It adds support to the following new functionalities:
- * Pagination and search filter for resources list operations. For an example, see [Topics - List By Subscription](/rest/api/eventgrid/controlplane-version2021-06-01-preview/partner-namespaces/list-by-subscription).
+ * Pagination and search filter for resources list operations. For an example, see [Topics - List By Subscription](/rest/api/eventgrid/controlplane-version2021-10-15-preview/partner-namespaces/list-by-subscription).
* [Manual create/delete of domain topics](how-to-event-domains.md) * [Service Bus Queue as destination](handler-service-bus.md)
governance Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/overview.md
Azure Policy has several permissions, known as operations, in two Resource Provi
Many Built-in roles grant permission to Azure Policy resources. The **Resource Policy Contributor** role includes most Azure Policy operations. **Owner** has full rights. Both **Contributor** and
-**Reader** have access to all _read_ Azure Policy operations. **Contributor** may trigger resource
+**Reader** have access to all _read_ Azure Policy operations.
+
+**Contributor** may trigger resource
remediation, but can't _create_ or _update_ definitions and assignments. **User Access Administrator** is necessary to grant the managed identity on **deployIfNotExists** or **modify** assignments necessary
-permissions. All policy objects will be readable to all roles over the scope.
+permissions.
+
+> [!NOTE]
+> All Policy objects, including definitions, initiatives, and assignments, are readable to all
+> roles over their scope. For example, a Policy assignment scoped to an Azure subscription is readable
+> by all role holders at the subscription scope and below.
If none of the Built-in roles have the permissions required, create a [custom role](../../role-based-access-control/custom-roles.md).
governance Cmmc L3 https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/policy/samples/cmmc-l3.md
Then, find and select the **CMMC Level 3** Regulatory Compliance built-in
initiative definition. > [!IMPORTANT]
-> This policy initiative was built to CMMC version 1.0 and will be updated in the future".
+> This policy initiative was built to CMMC version 1.0 and will be updated in the future.
> CMMC Level 2 under CMMC 2.0 is similar to CMMC Level 3 under CMMC 1.0, but has different control mappings.
->
+>
> Each control below is associated with one or more [Azure Policy](../overview.md) definitions. > These policies may help you [assess compliance](../how-to/get-compliance-data.md) with the > control; however, there often is not a one-to-one or complete match between a control and one or
governance Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/governance/resource-graph/overview.md
# What is Azure Resource Graph?
-Azure Resource Graph is a service in Azure that is designed to extend Azure Resource Management by
+Azure Resource Graph is an Azure service designed to extend Azure Resource Management by
providing efficient and performant resource exploration with the ability to query at scale across a given set of subscriptions so that you can effectively govern your environment. These queries provide the following abilities:
In this documentation, you'll go over each feature in detail.
[!INCLUDE [azure-lighthouse-supported-service](../../../includes/azure-lighthouse-supported-service.md)]
-## How does Resource Graph complement Azure Resource Manager
+## How Resource Graph complements Azure Resource Manager
-Resource Manager currently supports queries over basic resource fields, specifically:
+Azure Resource Manager currently supports queries over basic resource fields, specifically:
- Resource name - ID
Resource Manager currently supports queries over basic resource fields, specific
- Subscription - Location
-Resource Manager also provides
+Azure Resource Manager also provides
facilities for calling individual resource providers for detailed properties one resource at a time. With Azure Resource Graph, you can access these properties the resource providers return without
Now that you have a better understanding of what Azure Resource Graph is, let's
construct queries. It's important to understand that Azure Resource Graph's query language is based on the
-[Kusto query language](/azure/data-explorer/data-explorer-overview) used by Azure Data Explorer.
+[Kusto Query Language (KQL)](/azure/data-explorer/data-explorer-overview) used by Azure Data Explorer.
First, for details on operations and functions that can be used with Azure Resource Graph, see [Resource Graph query language](./concepts/query-language.md). To browse resources, see
hdinsight Apache Ambari Email https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/apache-ambari-email.md
Title: 'Tutorial: Configure Apache Ambari email notifications in Azure HDInsight
description: This article describes how to use SendGrid with Apache Ambari for email notifications. Previously updated : 03/10/2020 Last updated : 04/11/2022 #Customer intent: As a HDInsight user, I want to configure Apache Ambari to send email notifications.
In this tutorial, you learned how to configure Apache Ambari email notifications
* [Manage HDInsight clusters by using the Apache Ambari Web UI](./hdinsight-hadoop-manage-ambari.md)
-* [Create an alert notification](https://docs.cloudera.com/HDPDocuments/Ambari-latest/managing-and-monitoring-ambari/content/amb_create_an_alert_notification.html)
+* [Create an alert notification](https://docs.cloudera.com/HDPDocuments/Ambari-latest/managing-and-monitoring-ambari/content/amb_create_an_alert_notification.html)
hdinsight Cluster Availability Monitor Logs https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/cluster-availability-monitor-logs.md
Title: How to monitor cluster availability with Azure Monitor logs in HDInsight
description: Learn how to use Azure Monitor logs to monitor cluster health and availability. Previously updated : 08/12/2020 Last updated : 04/11/2022 # How to monitor cluster availability with Azure Monitor logs in HDInsight
hdinsight Cluster Management Best Practices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/cluster-management-best-practices.md
description: Learn best practices for managing HDInsight clusters.
Previously updated : 12/02/2019 Last updated : 04/11/2022 # HDInsight cluster management best practices
hdinsight Connect On Premises Network https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/connect-on-premises-network.md
description: Learn how to create an HDInsight cluster in an Azure Virtual Networ
Previously updated : 03/04/2020 Last updated : 04/11/2022 # Connect HDInsight to your on-premises network
To directly connect to HDInsight through the virtual network, use the following
* For more information on network security groups, see [Network security groups](../virtual-network/network-security-groups-overview.md).
-* For more information on user-defined routes, see [User-defined routes and IP forwarding](../virtual-network/virtual-networks-udr-overview.md).
+* For more information on user-defined routes, see [User-defined routes and IP forwarding](../virtual-network/virtual-networks-udr-overview.md).
hdinsight Control Network Traffic https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/hdinsight/control-network-traffic.md
Title: Control network traffic in Azure HDInsight
description: Learn techniques for controlling inbound and outbound traffic to Azure HDInsight clusters. Previously updated : 09/02/2020 Last updated : 04/11/2022 # Control network traffic in Azure HDInsight
healthcare-apis Dicom Services Conformance Statement https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/healthcare-apis/dicom/dicom-services-conformance-statement.md
Each dataset in the `FailedSOPSequence` will have the following elements (if the
| Tag | Name | Description | | :-- | :-- | :- | | (0008, 1150) | ReferencedSOPClassUID | The SOP class unique identifier of the instance that failed to store. |
-| (0008, 1150) | ReferencedSOPInstanceUID | The SOP instance unique identifier of the instance that failed to store. |
+| (0008, 1155) | ReferencedSOPInstanceUID | The SOP instance unique identifier of the instance that failed to store. |
| (0008, 1197) | FailureReason | The reason code why this instance failed to store. | Each dataset in the `ReferencedSOPSequence` will have the following elements:
Each dataset in the `ReferencedSOPSequence` will have the following elements:
| Tag | Name | Description | | :-- | :-- | :- | | (0008, 1150) | ReferencedSOPClassUID | The SOP class unique identifier of the instance that failed to store. |
-| (0008, 1150) | ReferencedSOPInstanceUID | The SOP instance unique identifier of the instance that failed to store. |
+| (0008, 1155) | ReferencedSOPInstanceUID | The SOP instance unique identifier of the instance that failed to store. |
| (0008, 1190) | RetrieveURL | The retrieve URL of this instance on the DICOM server. | Below is an example response with `Accept` header `application/dicom+json`:
iot-central Howto Build Iotc Device Bridge https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-build-iotc-device-bridge.md
Each key in the `measurements` object must match the name of a telemetry type in
You can include a `timestamp` field in the body to specify the UTC date and time of the message. This field must be in ISO 8601 format. For example, `2020-06-08T20:16:54.602Z`. If you don't include a timestamp, the current date and time is used.
-You can include a `modelId` field in the body. Use this field to assign the device to a device template during provisioning. This functionality is only supported by [V3 applications](howto-faq.yml#how-do-i-get-information-about-my-application-).
+You can include a `modelId` field in the body. Use this field to assign the device to a device template during provisioning.
The `deviceId` must be alphanumeric, lowercase, and may contain hyphens. If you don't include the `modelId` field, or if IoT Central doesn't recognize the model ID, then a message with an unrecognized `deviceId` creates a new _unassigned device_ in IoT Central. An operator can manually migrate the device to the correct device template. To learn more, see [Manage devices in your Azure IoT Central application > Migrating devices to a template](howto-manage-devices-individually.md).
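The rules above can be sketched as a small payload builder. This is a minimal illustration, not the device bridge's implementation: the helper name is hypothetical, and the `{"device": {"deviceId": ...}, "measurements": ...}` body shape is an assumption based on the fields this section describes.

```python
import json
import re
from datetime import datetime, timezone
from typing import Optional

# deviceId must be alphanumeric, lowercase, and may contain hyphens.
DEVICE_ID_PATTERN = re.compile(r"^[a-z0-9-]+$")

def build_bridge_body(device_id: str, measurements: dict,
                      model_id: Optional[str] = None,
                      timestamp: Optional[datetime] = None) -> str:
    """Build a device bridge message body (hypothetical helper)."""
    if not DEVICE_ID_PATTERN.match(device_id):
        raise ValueError("deviceId must be alphanumeric, lowercase, "
                         "and may contain hyphens")
    body = {"device": {"deviceId": device_id}, "measurements": measurements}
    if model_id is not None:
        # Optional: assigns the device to a template during provisioning.
        body["modelId"] = model_id
    if timestamp is not None:
        # Optional timestamp must be ISO 8601, e.g. 2020-06-08T20:16:54.602Z.
        iso = timestamp.astimezone(timezone.utc).isoformat(timespec="milliseconds")
        body["timestamp"] = iso.replace("+00:00", "Z")
    return json.dumps(body)

payload = build_bridge_body(
    "sensor-01",
    {"temperature": 21.5},
    timestamp=datetime(2020, 6, 8, 20, 16, 54, 602000, tzinfo=timezone.utc),
)
```

Omitting `timestamp` leaves the field out of the body, in which case the current date and time is used on receipt, as described above.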
-In [V2 applications](howto-faq.yml#how-do-i-get-information-about-my-application-), the new device appears an unassigned device on the **Devices** page. Select **Assign template** and choose a device template to start receiving incoming telemetry from the device.
- > [!NOTE] > Until the device is assigned to a template, all HTTP calls to the function return a 403 error status.
iot-central Howto Configure Rules Advanced https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-configure-rules-advanced.md
To complete the steps in this how-to guide, you need:
[!INCLUDE [iot-central-prerequisites-basic](../../../includes/iot-central-prerequisites-basic.md)]
-> [!NOTE]
-> If you're using a version 2 IoT Central application, see [Build workflows with the IoT Central connector in Azure Logic Apps](/previous-versions/azure/iot-central/core/howto-build-azure-logic-apps) on the previous versions documentation site and use the Azure IoT Central V2 connector
- ## Trigger a workflow from a rule Before you can trigger a workflow in Power Automate or Azure Logic Apps, you need a rule in your IoT Central application. To learn more, see [Configure rules and actions in Azure IoT Central](./howto-configure-rules.md).
iot-central Howto Edit Device Template https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-edit-device-template.md
You can create multiple versions of the device template. Over time, you'll have
1. Select the device template with the version you want to migrate the device to and select **Migrate**.
+> [!TIP]
+> You can use a job to migrate all the devices in a device group to a new device template at the same time.
+ ## Next steps If you're an operator or solution builder, a suggested next step is to learn [how to manage your devices](./howto-manage-devices-individually.md).
iot-central Howto Export Data https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-export-data.md
For example, you can:
## Prerequisites
-To use data export features, you must have a [V3 application](howto-faq.yml#how-do-i-get-information-about-my-application-), and you must have the [Data export](howto-manage-users-roles.md) permission.
-
-If you have a V2 application, see [Migrate your V2 IoT Central application to V3](howto-migrate.md).
+To use data export features, you must have the [Data export](howto-manage-users-roles.md) permission.
## Set up an export destination
iot-central Howto Manage Devices In Bulk https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-devices-in-bulk.md
The following example shows you how to create and run a job to set the light thr
1. Select the target device group that you want your job to apply to. If your application uses organizations, the selected organization determines the available device groups. You can see how many devices your job configuration applies to below your **Device group** selection.
-1. Choose **Cloud property**, **Property**, or **Command** as the **Job type**:
+1. Choose **Cloud property**, **Property**, **Command**, or **Change device template** as the **Job type**:
- To configure a **Property** job, select a property and set its new value. To configure a **Command** job, choose the command to run. A property job can set multiple properties.
+ To configure a **Property** job, select a property and set its new value. A property job can set multiple properties. To configure a **Command** job, choose the command to run. To configure a **Change device template** job, select the device template to assign to the devices in the device group.
:::image type="content" source="media/howto-manage-devices-in-bulk/configure-job.png" alt-text="Screenshot that shows selections for creating a property job called Set Light Threshold":::
iot-central Howto Manage Devices Individually https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-devices-individually.md
If you register devices by starting the import under **All devices**, then the d
1. Use the filter on the grid to determine if the value in the **Device Template** column is **Unassigned** for any of your devices.
-1. Select the devices you want to assign to a template:
+1. Select the devices you want to assign to a template.
1. Select **Migrate**:
iot-central Howto Manage Iot Central From Portal https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-manage-iot-central-from-portal.md
You can configure role assignments in the Azure portal or use the Azure CLI:
## Monitor application health
-> [!NOTE]
-> Metrics are only available for version 3 IoT Central applications. To learn how to check your application version, see [How do I get information about my application?](howto-faq.yml#how-do-i-get-information-about-my-application-).
- You can use the set of metrics provided by IoT Central to assess the health of devices connected to your IoT Central application and the health of your running data exports. Metrics are enabled by default for your IoT Central application and you access them from the [Azure portal](https://portal.azure.com/). The [Azure Monitor data platform exposes these metrics](../../azure-monitor/essentials/data-platform-metrics.md) and provides several ways for you to interact with them. For example, you can use charts in the Azure portal, a REST API, or queries in PowerShell or the Azure CLI.
iot-central Howto Migrate https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/howto-migrate.md
- Title: Migrate a V2 Azure IoT Central application to V3 | Microsoft Docs
-description: Learn how to migrate your V2 Azure IoT Central application to V3
-- Previously updated : 12/17/2021-----
-# Administrator
--
-# Migrate your V2 IoT Central application to V3
-
-Currently, when you create a new IoT Central application, it's a V3 application. If you previously created an application, then depending on when you created it, it may be V2. This article describes how to migrate a V2 to a V3 application to be sure you're using the latest IoT Central features.
-
-To learn more, see the [Retirement announcement](/answers/questions/529295/retirement-announcement-upgrade-to-iot-central-v3.html).
-
-To learn how to identify the version of an IoT Central application, see [How do I get information about my application?](howto-faq.yml#how-do-i-get-information-about-my-application-).
-
-The steps to migrate an application from V2 to V3 are:
-
-1. Create a new V3 application from the V2 application.
-1. Configure the V3 application.
-1. Delete the V2 application.
-
-## Create a new V3 application
-
-IoT Central doesn't let you migrate to an existing V3 application. To move existing devices from a V2 to V3 application, use the migration wizard to create your V3 application.
-
-The migration wizard:
--- Creates a new V3 application.-- Checks your device templates for compatibility with V3.-- Copies all your device templates to the new V3 application.-- Copies all users and role assignments to the new V3 application.-
-> [!NOTE]
-> To ensure device compatibility with V3, primitive types in the device template become object properties. You don't need to make any changes to your device code.
-
-You must be an administrator to migrate an application to V3.
-
-1. On the **Administration** menu, select **Migrate to a V3 application**:
-
- :::image type="content" source="media/howto-migrate/migrate-menu.png" alt-text="Screenshot that shows the application migration wizard":::
-
-1. Enter a new **Application name** and optionally change the autogenerated **URL**. The URL can't be the same as your current V2 application's URL. However, you can change the URL later after you delete your V2 application.
-
-1. Select **Create a new V3 app**.
-
- :::image type="content" source="media/howto-migrate/create-app.png" alt-text="Screenshot that shows the options in the application migration wizard":::
-
- Depending on the number and complexity of your device templates, this process may take several minutes to complete.
-
- > [!Warning]
- > The creation of your V3 application will fail if any device template has fields that start with a number or contain any of the following characters (`+`, `*`, `?`, `^`, `$`, `(`, `)`, `[`, `]`, `{`, `}`, `|`, `\`). The DTDL device template schema that V3 applications use doesn't allow these characters. Update your device template to resolve this issue before you migrate to V3.
-
-1. When your new V3 app is ready, select **Open your new app** to open it.
-
- :::image type="content" source="media/howto-migrate/open-app.png" alt-text="Screenshot that shows the application migration wizard after the application migration":::
-
-## Configure the V3 application
-
-After your new V3 application is created, make any configuration changes before you move your devices from the V2 application to the V3 application.
-
-> [!TIP]
-> Take a moment to [familiarize yourself with V3](overview-iot-central-tour.md#navigate-your-application) as it has some differences from V2.
-
-Here are some recommended configuration steps to consider:
--- [Configure dashboards](howto-manage-dashboards.md)-- [Configure data export](howto-export-data.md)-- [Configure rules and actions](quick-configure-rules.md)-- [Customize the application UI](howto-customize-ui.md)-
-When your V3 application is configured to meet your needs, you're ready to move your devices from your V2 application to your V3 application.
-
-## Move your devices to the V3 application
-
-When this step is complete, all your devices communicate only with your new V3 application.
-
-> [!IMPORTANT]
-> Before you move your devices to your V3 application, delete any devices that you've created in the V3 application.
-
-This step moves all your existing devices to your new V3 application. Your device data isn't copied.
-
-Select **Move all devices** to start moving your devices. This step could take several minutes to complete.
--
-After the move is complete, restart all your devices to ensure they connect to the new V3 application.
-
-> [!WARNING]
-> Don't delete your V3 application as your V2 application is now inoperable.
-
-## Delete your old V2 application
-
-After you've validated that everything works as expected in your new V3 application, delete your old V2 application. This step ensures you don't get billed for an application you no longer use.
-
-> [!Note]
-> To delete an application, you must have permissions to delete resources in the Azure subscription you chose when you created the application. To learn more, see [Use role-based access control to manage access to your Azure subscription resources](../../role-based-access-control/role-assignments-portal.md).
-
-1. In your V2 application, select the **Administration** tab in the menu
-2. Select **Delete** to permanently delete your IoT Central application. This option permanently deletes all data associated with that application.
-
-## Next steps
-
-Now that you've learned how to migrate your application, the suggested next step is to review the [Azure IoT Central UI](overview-iot-central-tour.md) to see what's changed in V3.
iot-central Overview Iot Central Admin https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/overview-iot-central-admin.md
An administrator can:
To learn more, see [Create and use a custom application template](howto-create-iot-central-application.md#create-and-use-a-custom-application-template) .
-## Migrate to a new version
-
-An administrator can migrate an application to a newer version. Currently, all newly created applications are V3 applications. Depending on when it was created, it may be V2. An administrator is responsible for migrating a V2 application to a V3 application.
-
-To learn more, see [Migrate your V2 IoT Central application to V3](howto-migrate.md).
- ## Monitor application health An administrator can use IoT Central metrics to assess the health of connected devices and the health of running data exports.
iot-central Overview Iot Central Operator https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/core/overview-iot-central-operator.md
Title: Azure IoT Central device management guide
description: Azure IoT Central is an IoT application platform that simplifies the creation of IoT solutions. This guide describes how to manage the IoT devices connected to your IoT Central application. Previously updated : 01/04/2022 Last updated : 04/07/2022
An IoT Central application lets you monitor and manage millions of devices throu
IoT Central lets you complete device management tasks such as:
+- Provision new devices.
- Monitor and manage the devices connected to the application. - Troubleshoot and remediate issues with devices.-- Provision new devices.
-## Search your devices
+You can use the following tools in your IoT Central application:
+
+- The **Devices** page lets you monitor and manage individual devices.
+- The **Device groups** and **Data explorer** pages let you monitor aggregate data from your devices.
+- The **Jobs** page lets you manage your devices in bulk.
+- Custom dashboards let you manage and monitor devices in a way that suits you.
+- The REST API and Azure CLI enable you to automate device management tasks.
-IoT Central lets you search devices by device name, ID, property value or cloud property value.
+## Search for devices
+
+IoT Central lets you search devices by device name, ID, property value, or cloud property value:
:::image type="content" source="media/overview-iot-central-operator/search-devices.png" alt-text="Screenshot that shows how to search devices":::
-## Monitor and manage devices
+## Add devices
+Use the **Devices** page to add individual devices, or [import devices](howto-manage-devices-in-bulk.md#import-devices) in bulk from a CSV file:
++
+## Group your devices
+
+On the **Device groups** page, you can use queries to define groups of devices. You can use device groups to:
+
+- Monitor aggregate data from devices on the **Device explorer** page.
+- Manage groups of devices in bulk by using jobs.
+- Control access to groups of devices if your application uses organizations.
+
+To learn more, see [Tutorial: Use device groups to analyze device telemetry](tutorial-use-device-groups.md).
-To monitor devices, use the custom device views defined by a solution builder. These views can show device telemetry and property values. An example is the **Overview** view shown in the previous screenshot.
+## Manage your devices
-For more detailed information, use device groups and the built-in analytics features. To learn more, see [How to use data explorer to analyze device data](howto-create-analytics.md).
+Use the **Devices** page to manage individual devices connected to your application:
-To manage individual devices, use device views to set device and cloud properties, and call device commands. Examples include the **Manage device** and **Commands** views in the previous screenshot.
-To manage devices in bulk, create and schedule jobs. Jobs can update properties and run commands on multiple devices. To learn more, see [Create and run a job in your Azure IoT Central application](howto-manage-devices-in-bulk.md).
+For an individual device, you can complete tasks such as [block or unblock it](howto-manage-devices-individually.md#device-status-values), [attach it to a gateway](tutorial-define-gateway-device-type.md), [approve it](howto-manage-devices-individually.md#device-status-values), [migrate it to a new device template](howto-edit-device-template.md#migrate-a-device-across-versions), [associate it with an organization](howto-create-organizations.md), and [generate a map to transform the incoming telemetry and properties](howto-map-data.md).
+
+You can also set writable properties and cloud properties that are defined in the device template, and call commands on the device.
To manage IoT Edge devices, you can use the IoT Central UI to [create and edit deployment manifests](concepts-iot-edge.md#iot-edge-deployment-manifests-and-iot-central-device-templates), and then deploy them to your IoT Edge devices. You can also run commands in IoT Edge modules from within IoT Central.
-If your IoT Central application uses *organizations*, an administrator controls which devices you have access to.
+Use the **Jobs** page to manage your devices in bulk. Jobs can update properties, run commands, or assign a new device template on multiple devices. To learn more, see [Manage devices in bulk in your Azure IoT Central application](howto-manage-devices-in-bulk.md).
-## Troubleshoot and remediate issues
+> [!TIP]
+> If your IoT Central application uses *organizations*, an administrator controls which devices you have access to.
-The [troubleshooting guide](troubleshoot-connection.md) helps you to diagnose and remediate common issues. You can use the **Devices** page to block devices that appear to be malfunctioning until the problem is resolved.
+## Monitor your devices
+
+To monitor individual devices, use the custom device views on the **Devices** page. A solution builder defines these custom views as part of the [device template](concepts-device-templates.md). These views can show device telemetry and property values. An example is the **Overview** view shown in the following screenshot:
+
-## Add and remove devices
+To monitor aggregate data from multiple devices, use device groups and the **Data explorer** page. To learn more, see [How to use data explorer to analyze device data](howto-create-analytics.md).
-You can add and remove devices in your IoT Central application either individually or in bulk. To learn more, see:
+## Customize
-- [Manage individual devices in your Azure IoT Central application](howto-manage-devices-individually.md).-- [Manage devices in bulk in your Azure IoT Central application](howto-manage-devices-in-bulk.md).
+You can further customize the device management and monitoring experience using the following tools:
-## Personalize
+- Create more views to display on the **Devices** page for individual devices by adding view definitions to your [device templates](concepts-device-templates.md).
+- Customize the text that describes your devices in the application. To learn more, see [Change application text](howto-customize-ui.md#change-application-text).
+- Create [custom device management dashboards](howto-manage-dashboards.md). A dashboard can include a [pinned query](howto-manage-dashboards.md#pin-analytics-to-dashboard) from the **Data explorer**.
-Create personal dashboards in an IoT Central application that contain links to the resources you use most often. To learn more, see [Manage dashboards](howto-manage-dashboards.md).
+## Automate
+
+To automate device management tasks, you can use:
+
+- Rules to trigger actions automatically when device data that you're monitoring reaches predefined thresholds. To learn more, see [Configure rules](howto-configure-rules.md).
+- [Job scheduling](howto-manage-devices-in-bulk.md#create-and-run-a-job) for regular device management tasks.
+- The Azure CLI to manage your devices from a scripting environment. To learn more, see [az iot central](/cli/azure/iot/central).
+- The IoT Central REST API to manage your devices programmatically. To learn more, see [How to use the IoT Central REST API to manage devices](howto-manage-devices-with-rest-api.md).
+
+## Troubleshoot and remediate device issues
+
+The [troubleshooting guide](troubleshoot-connection.md) helps you to diagnose and remediate common issues. You can use the **Devices** page to block devices that appear to be malfunctioning until the problem is resolved.
## Next steps
iot-central Tutorial Smart Meter App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/energy/tutorial-smart-meter-app.md
Title: Tutorial - Azure IoT smart meter monitoring | Microsoft Docs description: This tutorial shows you how to deploy and use the smart meter monitoring application template for IoT Central.--++ Last updated 12/23/2021
iot-central Tutorial Solar Panel App https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-central/energy/tutorial-solar-panel-app.md
Title: Tutorial - Azure IoT solar panel monitoring | Microsoft Docs description: This tutorial shows you how to deploy and use the solar panel monitoring application template for IoT Central.--++ Last updated 12/23/2021
iot-edge Version History https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/iot-edge/version-history.md
Azure IoT Edge is a product built from the open-source IoT Edge project hosted o
The IoT Edge documentation on this site is available for two different versions of the product, so that you can choose the content that applies to your IoT Edge environment. Currently, the two supported versions are:
-* **IoT Edge 1.2** contains content for new features and capabilities that are in the latest stable release. This version of the documentation also contains content for the IoT Edge for Linux on Windows (EFLOW) continuous release version, which is based on IoT Edge 1.2 and contains the latest features and capabilities.
+* **IoT Edge 1.2** contains content for new features and capabilities that are in the latest stable release. This version of the documentation also contains content for the IoT Edge for Linux on Windows (EFLOW) continuous release version, which is based on IoT Edge 1.2 and contains the latest features and capabilities. IoT Edge 1.2 is now bundled with the [Microsoft Defender for IoT micro-agent for Edge](/azure/defender-for-iot/device-builders/overview).
* **IoT Edge 1.1 (LTS)** is the first long-term support (LTS) version of IoT Edge. The documentation for this version covers all features and capabilities from all previous versions through 1.1. This version of the documentation also contains content for the IoT Edge for Linux on Windows long-term support version, which is based on IoT Edge 1.1 LTS. * This documentation version will be stable through the supported lifetime of version 1.1, and won't reflect new features released in later versions. IoT Edge 1.1 LTS will be supported until December 3, 2022 to match the [.NET Core 3.1 release lifecycle](https://dotnet.microsoft.com/platform/support/policy/dotnet-core).
This table provides recent version history for IoT Edge package releases, and hi
| Release notes and assets | Type | Date | Highlights | | | - | - | - |
-| [1.2](https://github.com/Azure/azure-iotedge/releases/tag/1.2.0) | Stable | April 2021 | [IoT Edge devices behind gateways](how-to-connect-downstream-iot-edge-device.md?view=iotedge-2020-11&preserve-view=true)<br>[IoT Edge MQTT broker (preview)](how-to-publish-subscribe.md?view=iotedge-2020-11&preserve-view=true)<br>New IoT Edge packages introduced, with new installation and configuration steps. For more information, see [Update from 1.0 or 1.1 to 1.2](how-to-update-iot-edge.md#special-case-update-from-10-or-11-to-12)
+| [1.2](https://github.com/Azure/azure-iotedge/releases/tag/1.2.0) | Stable | April 2021 | [IoT Edge devices behind gateways](how-to-connect-downstream-iot-edge-device.md?view=iotedge-2020-11&preserve-view=true)<br>[IoT Edge MQTT broker (preview)](how-to-publish-subscribe.md?view=iotedge-2020-11&preserve-view=true)<br>New IoT Edge packages introduced, with new installation and configuration steps. For more information, see [Update from 1.0 or 1.1 to 1.2](how-to-update-iot-edge.md#special-case-update-from-10-or-11-to-12).<br>Includes [Microsoft Defender for IoT micro-agent for Edge](/azure/defender-for-iot/device-builders/overview).
| [1.1](https://github.com/Azure/azure-iotedge/releases/tag/1.1.0) | Long-term support (LTS) | February 2021 | [Long-term support plan and supported systems updates](support.md) | | [1.0.10](https://github.com/Azure/azure-iotedge/releases/tag/1.0.10) | Stable | October 2020 | [UploadSupportBundle direct method](how-to-retrieve-iot-edge-logs.md#upload-support-bundle-diagnostics)<br>[Upload runtime metrics](how-to-access-built-in-metrics.md)<br>[Route priority and time-to-live](module-composition.md#priority-and-time-to-live)<br>[Module startup order](module-composition.md#configure-modules)<br>[X.509 manual provisioning](how-to-provision-single-device-linux-x509.md) | | [1.0.9](https://github.com/Azure/azure-iotedge/releases/tag/1.0.9) | Stable | March 2020 | X.509 auto-provisioning with DPS<br>[RestartModule direct method](how-to-edgeagent-direct-method.md#restart-module)<br>[support-bundle command](troubleshoot.md#gather-debug-information-with-support-bundle-command) |
load-balancer Quickstart Load Balancer Standard Public Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-balancer/quickstart-load-balancer-standard-public-cli.md
Create a network security group rule using [az network nsg rule create](/cli/azu
## Create a bastion host
-In this section, you'll create te resources for Azure Bastion. Azure Bastion is used to securely manage the virtual machines in the backend pool of the load balancer.
+In this section, you'll create the resources for Azure Bastion. Azure Bastion is used to securely manage the virtual machines in the backend pool of the load balancer.
### Create a public IP address
load-testing How To Move Between Regions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/load-testing/how-to-move-between-regions.md
+
+ Title: Move an Azure Load Testing resource to another region
+
+description: Learn how to move an Azure Load Testing resource to another region.
+++++ Last updated : 04/12/2022+++
+# Move an Azure Load Testing Preview resource to another region
+
+This article describes how to move your Azure Load Testing Preview resource to another Azure region. You might want to move your resource for a number of reasons. For example, to take advantage of a new Azure region, to meet internal policy and governance requirements, or in response to capacity planning requirements.
+
+Azure Load Testing resources are region-specific and can't be moved across regions automatically. You can use an Azure Resource Manager template (ARM template) to export the existing configuration of your Load Testing resource instead. Then, stage the resource in another region and create the tests in the new resource.
+
+## Prerequisites
+
+- Make sure that the target region supports Azure Load Testing.
+
+- Have access to the tests in the resource you're migrating.
+
+## Prepare
+
+To get started, you'll need to export and then modify an ARM template. You'll also need to download the artifacts for any existing tests in the resource.
+
+1. Export the ARM template that contains settings and information for your Azure Load Testing resource by following the steps mentioned [here](/azure/azure-resource-manager/templates/export-template-portal).
+
+1. Download the input artifacts for all the existing tests from the resource. Navigate to the **Tests** section in the resource and then select the test name. Download the input file for the test by selecting the **More** button (...) on the right side of the latest test run.
+
+ :::image type="content" source="media/how-to-move-an-azure-load-testing-resource/download-input-artifacts.png" alt-text="Screenshot that shows how to download input files for a test.":::
+
+> [!NOTE]
+> If you are using an Azure Key Vault to configure secrets for your load test, you can continue to use the same Key Vault.
+
+## Move
+
+Load and modify the template so you can create a new Azure Load Testing resource in the target region and then create tests in the new resource.
+
+### Move the resource
+
+1. In the Azure portal, select **Create a resource**.
+
+1. In the Marketplace, search for **template deployment**. Select **Template deployment (deploy using custom templates)**.
+
+1. Select **Create**.
+
+1. Select **Build your own template in the editor**.
+
+1. Select **Load file**, and then select the template.json file that you downloaded in the last section.
+
+1. In the uploaded template.json file, name the target Azure Load Testing resource by entering a new **defaultValue** for the resource name. This example sets the defaultValue of the resource name to `myLoadTestResource`.
+
+ ```json
+ {
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "loadtest_name": {
+ "defaultValue": "myLoadTestResource",
+ "type": "String"
+ }
+ },
+ ```
+
+1. Edit the **location** property to use your target region. This example sets the target region to `eastus`.
+
+ ```json
+ "resources": [
+ {
+ "type": "Microsoft.LoadTestService/loadtests",
+ "apiVersion": "2021-12-01-preview",
+ "name": "[parameters('loadtest_name')]",
+ "location": "eastus",
+ ```
+ To obtain region location codes, see [Azure Locations](https://azure.microsoft.com/global-infrastructure/data-residency/). The code for a region is the region name with no spaces. For example, East US = eastus.
+
+1. Click on **Save**.
+
+1. Enter the **Subscription** and **Resource group** for the target resource.
+
+1. Select **Review and create**, then select **Create**.
+
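+As an alternative to the portal steps above, you can deploy the edited template with Azure PowerShell. This is a sketch; the `loadtest_name` parameter comes from the exported template, and the resource group name is illustrative.
+
+```azurepowershell-interactive
+# Deploy the edited template.json to the target resource group.
+# -loadtest_name is a dynamic parameter generated from the template's parameters.
+New-AzResourceGroupDeployment -ResourceGroupName "myTargetResourceGroup" `
+    -TemplateFile "template.json" `
+    -loadtest_name "myLoadTestResource"
+```
+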
+### Create tests
+
+Once the resource is created in the target location, you can create new tests by following the steps mentioned [here](/azure/load-testing/quickstart-create-and-run-load-test#create_test).
+
+1. You can refer to the test configuration in the config.yaml file of the input artifacts downloaded earlier.
+
+1. Upload the Apache JMeter script and optional configuration files from the downloaded input artifacts.
+
+If you're invoking the previous Azure Load Testing resource in a CI/CD workflow, update the `loadTestResource` parameter in the [Azure Load Testing task](/azure/devops/pipelines/tasks/test/azure-load-testing) or [Azure Load Testing action](https://github.com/marketplace/actions/azure-load-testing) of your workflow.
+
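+For example, in a GitHub Actions workflow the update is a one-line change. This fragment is a sketch; the resource, resource group, and file names are illustrative:
+
+```yaml
+- name: Run load test
+  uses: azure/load-testing@v1
+  with:
+    loadTestConfigFile: 'config.yaml'
+    loadTestResource: 'myLoadTestResource'   # point at the new resource in the target region
+    resourceGroup: 'myTargetResourceGroup'
+```
+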
+> [!NOTE]
+> If you have configured any of your load test with secrets from Azure Key Vault, make sure to grant the new resource access to the Key Vault following the steps mentioned [here](/azure/load-testing/how-to-use-a-managed-identity?tabs=azure-portal#grant-access-to-your-azure-key-vault).
+
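+The secret references in your test configuration don't need to change, because they point at the Key Vault secret URI rather than at the load testing resource. A config.yaml fragment might look like this (the secret name and URI are illustrative):
+
+```yaml
+secrets:
+  - name: appToken
+    value: 'https://myvault.vault.azure.net/secrets/appToken/abc1234567890'
+```
+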
+## Clean up source resources
+
+After the move is complete, delete the Azure Load Testing resource from the source region. You pay for resources even when the resource isn't being used.
+
+1. In the Azure portal, search and select **Azure Load Testing**.
+
+1. Select your Azure Load Testing resource.
+
+1. On the resource overview page, select **Delete**, and then confirm.
+
+> [!NOTE]
+> Test results for the test runs in the previous resource will be lost once the resource is deleted.
+
+## Next steps
+
+- To learn how to run high-scale load tests, see [Set up a high-scale load test](./how-to-high-scale-load.md).
machine-learning How To Configure Cli https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-configure-cli.md
Previously updated : 03/31/2022 Last updated : 04/08/2022
machine-learning How To Identity Based Data Access https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/machine-learning/how-to-identity-based-data-access.md
blob_dset = Dataset.File.from_files('https://myblob.blob.core.windows.net/may/ke
When you submit a training job that consumes a dataset created with identity-based data access, the managed identity of the training compute is used for data access authentication. Your Azure Active Directory token isn't used. For this scenario, ensure that the managed identity of the compute is granted at least the Storage Blob Data Reader role from the storage service. For more information, see [Set up managed identity on compute clusters](how-to-create-attach-compute-cluster.md#managed-identity).
+## Access data for training jobs on compute clusters (preview)
+++
+When training on [Azure Machine Learning compute clusters](how-to-create-attach-compute-cluster.md#what-is-a-compute-cluster), you can authenticate to storage with your Azure Active Directory token.
+
+This authentication mode allows you to:
+* Set up fine-grained permissions, where different workspace users can have access to different storage accounts or folders within storage accounts.
+* Audit storage access because the storage logs show which identities were used to access data.
+
+> [!WARNING]
+> This functionality has the following limitations:
+> * Feature is only supported for experiments submitted via the [Azure Machine Learning CLI v2 (preview)](how-to-configure-cli.md)
+> * Only CommandJobs and PipelineJobs with CommandSteps and AutoMLSteps are supported
+> * User identity and compute managed identity cannot be used for authentication within the same job.
+
+The following steps outline how to set up identity-based data access for training jobs on compute clusters.
+
+1. Grant the user identity access to storage resources. For example, grant Storage Blob Data Reader access to the specific storage account you want to use, or grant ACL-based permission to specific folders or files in Azure Data Lake Storage Gen2.
+
+1. Create an Azure Machine Learning datastore without cached credentials for the storage account. If a datastore has cached credentials, such as a storage account key, those credentials are used instead of the user identity.
+
+1. Submit a training job with the property **identity** set to **type: user_identity**, as shown in the following job specification. During the training job, the authentication to storage happens via the identity of the user that submits the job.
+
+> [!NOTE]
+> If the **identity** property is left unspecified and the datastore does not have cached credentials, then the compute managed identity becomes the fallback option.
+
+```yaml
+command: |
+ echo "--census-csv: ${{inputs.census_csv}}"
+ python hello-census.py --census-csv ${{inputs.census_csv}}
+code: src
+inputs:
+ census_csv:
+ type: uri_file
+ path: azureml://datastores/mydata/paths/census.csv
+environment: azureml:AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest
+compute: azureml:cpu-cluster
+identity:
+ type: user_identity
+```
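+
+Step 2 above, a datastore without cached credentials, can be sketched in CLI v2 YAML as follows (the account and container names are illustrative, and the schema URL is an assumption based on the CLI v2 blob datastore schema). Because no credentials section is present, data access falls back to identity-based authentication:
+
+```yaml
+$schema: https://azuremlschemas.azureedge.net/latest/azureBlob.schema.json
+name: mydata
+type: azure_blob
+account_name: mystorageaccount
+container_name: censusdata
+```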
+ ## Next steps * [Create an Azure Machine Learning dataset](how-to-create-register-datasets.md)
marketplace Azure Vm Image Test https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/azure-vm-image-test.md
Previously updated : 02/01/2022 Last updated : 04/11/2022 # Test a virtual machine image
There are two ways to run validations on the deployed image.
### Use Certification Test Tool for Azure Certified
+> [!IMPORTANT]
+> To run the certification test tool, the Windows Remote Management service must be running and configured on Windows. This enables access to port 5986.
+ #### Download and run the certification test tool The Certification Test Tool for Azure Certified runs on a local Windows machine but tests an Azure-based Windows or Linux VM. It certifies that your user VM image can be used with Microsoft Azure and that the guidance and requirements around preparing your VHD have been met.
marketplace Isv Csp Reseller https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/isv-csp-reseller.md
The offer setup page lets you define private offer terms, notification contact,
2. Provide up to five emails as **Notification Contacts** to receive email updates on the status of your private offer. These emails are sent when your private offer moves to **Live**, **Ended**, or is **Withdrawn**.
-3. Configure the percentage-based margins or absolute **Pricing** for up to ten offers/plans in a private offer.
-
- - *Percentage-based* margin can be given at an offer level so it applies to all plans within the offer, or it can be given only for a specific plan. The margin the CSP partner receives will be a percentage off your plan's list price in the marketplace.
- - *Absolute price* can be used to specify a price point higher, lower, or equal to the publicly listed plan price; it can only be applied at a plan level and does not apply to Virtual Machine offer types. You can only customize the price based on the same pricing model, billing term, and dimensions of the public offer; you cannot change to a new pricing model or billing term or add dimensions.<br><br>
+3. Configure the percentage-based margins <!-- or absolute **Pricing** --> for up to ten offers/plans in a private offer. Margin can be given at an offer level so it applies to all plans within the offer, or it can be given only for a specific plan. The margin the CSP partner receives will be a percentage off your plan's list price in the marketplace.
+ <!-- *Absolute price* can be used to specify a price point higher, lower, or equal to the publicly listed plan price; it can only be applied at a plan level and does not apply to Virtual Machine offer types. You can only customize the price based on the same pricing model, billing term, and dimensions of the public offer; you cannot change to a new pricing model or billing term or add dimensions.<br><br> -->
1. Select **+ Add Offers/plans** to choose the offers/plans you want to provide a private offer for. 1. Choose to provide a custom price or margin at either an offer level (all current and future plans under that offer will have a margin associated to it) or at a plan level (only the plan you selected will have a private price associated with it). 1. Choose up to ten offers/plans and select **Add**.
- 1. Enter the margin percentage or configure the absolute price for each item in the pricing table.
+ 1. Enter the margin percentage <!-- or configure the absolute price --> for each item in the pricing table.
> [!NOTE] > Only offers/plans that are transactable in Microsoft AppSource or Azure Marketplace appear in the selection menu.
marketplace Test Saas Plan https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/marketplace/test-saas-plan.md
This article explains how to test a software as a service (SaaS) offer in previe
Here are some general guidelines to be aware of when you're testing your offer. -- If your SaaS offer supports metered billing using the commercial marketplace metering service, review and follow the testing best practices detailed in [Marketplace metered billing APIs](/partner-center-portal/saas-metered-billing.md).-- Review and follow the testing instructions in [Implementing a webhook on the SaaS service](/partner-center-portal/pc-saas-fulfillment-webhook#development-and-testing.md) to ensure your offer is successfully integrated with the APIs.
+- If your SaaS offer supports metered billing using the commercial marketplace metering service, review and follow the testing best practices detailed in [Marketplace metered billing APIs](/azure/marketplace/partner-center-portal/saas-metered-billing).
+- Review and follow the testing instructions in [Implementing a webhook on the SaaS service](/azure/marketplace/partner-center-portal/pc-saas-fulfillment-webhook#development-and-testing) to ensure your offer is successfully integrated with the APIs.
- If the Offer validation step resulted in warnings, a **View validation report** link appears on the **Offer overview** page. Be sure to review the report and address the issues before you select the **Go live** button. Otherwise, certification will most likely fail and delay your offer from going Live. - If you need to make changes after previewing and testing the offer, you can edit and resubmit to publish a new preview. For more information, see [Update an existing offer in the commercial marketplace](update-existing-offer.md).
migrate Migrate Replication Appliance https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/migrate/migrate-replication-appliance.md
The replication appliance needs access to these URLs.
https://management.chinacloudapi.cn | Used for replication management operations and coordination. *.services.visualstudio.com | Used for logging purposes (It is optional). time.windows.cn | Used to check time synchronization between system and global time.
-https://login.microsoftonline.cn <br/> https://secure.aadcdn.microsoftonline-p.cn <br/> https:\//login.live.com <br/> https://graph.chinacloudapi.cn <br/> https://login.chinacloudapi.cn <br/> https://www.live.com <br/> https://www.microsoft.com | Appliance setup with OVA needs access to these URLs. They are used for access control and identity management by Azure Active Directory.
+https:\//login.microsoftonline.cn <br/> https:\//secure.aadcdn.microsoftonline-p.cn <br/> https:\//login.live.com <br/> https://graph.chinacloudapi.cn <br/> https://login.chinacloudapi.cn <br/> https://www.live.com <br/> https://www.microsoft.com | Appliance setup with OVA needs access to these URLs. They are used for access control and identity management by Azure Active Directory.
https://dev.mysql.com/get/Downloads/MySQLInstaller/mysql-installer-community-5.7.20.0.msi | To complete MySQL download. In a few regions, the download might be redirected to the CDN URL. Ensure that the CDN URL is also allowed if needed. ## Port access
networking Troubleshoot Failed State https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/networking/troubleshoot-failed-state.md
+
+ Title: 'Troubleshoot Azure Microsoft.Network failed Provisioning State'
+description: Learn how to troubleshoot Azure Microsoft.Network failed Provisioning State.
+++++ Last updated : 04/08/2022++++
+# Troubleshoot Azure Microsoft.Network failed provisioning state
+
+This article helps you understand the meaning of the various provisioning states for Microsoft.Network resources and how to effectively troubleshoot situations when the state is **Failed**.
++
+## Provisioning states
+
+The provisioning state is the status of a user-initiated, control-plane operation on an Azure Resource Manager resource.
+
+| Provisioning state | Description |
+|||
+| Updating | Resource is being created or updated. |
+| Failed | Last operation on the resource was not successful. |
+| Succeeded | Last operation on the resource was successful. |
+| Deleting | Resource is being deleted. |
+| Migrating | Seen when migrating from Azure Service Manager to Azure Resource Manager. |
+
+These states are just metadata properties of the resource and are independent from the functionality of the resource itself.
+Being in a failed state does not necessarily mean that the resource is not functional; in most cases it can continue operating and servicing traffic without issues.
+
+However, in several scenarios further operations on the resource, or on other resources that depend on it, may fail while the resource is in failed state. The state needs to be reverted to succeeded before you execute other operations.
+
+For example, you cannot execute an operation on a VirtualNetworkGateway if it has a dependent VirtualNetworkGatewayConnection object in failed state, and vice versa.
+
+## Restoring succeeded state through a PUT operation
+
+The correct way to restore succeeded state is to execute another write (PUT) operation on the resource.
+
+In most cases, the issue that caused the previous operation to fail is no longer present, so the new write operation should succeed and restore the provisioning state.
+
+The easiest way to achieve this task is to use Azure PowerShell. Issue a resource-specific "Get" command that fetches the current configuration of the impacted resource as it is deployed. Then, execute a "Set" command (or equivalent) to commit to Azure a write operation containing all the resource properties as they are currently configured.
+
+> [!IMPORTANT]
+> 1. Executing a "Set" command on the resource without running a "Get" first will overwrite the resource with default settings, which might be different from those you currently have configured. Do not run a "Set" command unless resetting settings is intentional.
+> 2. Executing a "Get" and "Set" operation using third-party software, or any tool that uses an older API version, may also result in the loss of some settings, because those settings may not be present in the API version with which you executed the command.
+>
+## Azure PowerShell cmdlets to restore succeeded provisioning state
++
+### Preliminary operations
+
+1. Install the latest version of the Azure Resource Manager PowerShell cmdlets. For more information, see [Install and configure Azure PowerShell](/powershell/azure/install-az-ps).
+
+2. Open your PowerShell console with elevated privileges, and connect to your account. Use the following example to help you connect:
+
+ ```azurepowershell-interactive
+ Connect-AzAccount
+ ```
+3. If you have multiple Azure subscriptions, check the subscriptions for the account.
+
+ ```azurepowershell-interactive
+ Get-AzSubscription
+ ```
+4. Specify the subscription that you want to use.
+
+ ```azurepowershell-interactive
+ Select-AzSubscription -SubscriptionName "Replace_with_your_subscription_name"
+ ```
+5. Run the resource-specific commands listed below to reset the provisioning state to succeeded.
+
+> [!NOTE]
+>Every sample command in this article uses "your_resource_name" for the name of the Resource and "your_resource_group_name" for the name of the Resource Group. Make sure to replace these strings with the appropriate Resource and Resource Group names according to your deployment.
+
+### Microsoft.Network/applicationGateways
+
+```azurepowershell-interactive
+Get-AzApplicationGateway -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzApplicationGateway
+```
+
+### Microsoft.Network/applicationGatewayWebApplicationFirewallPolicies
+
+```azurepowershell-interactive
+Get-AzApplicationGatewayFirewallPolicy -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzApplicationGatewayFirewallPolicy
+```
+### Microsoft.Network/azureFirewalls
+
+```azurepowershell-interactive
+Get-AzFirewall -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzFirewall
+```
+### Microsoft.Network/bastionHosts
+
+```azurepowershell-interactive
+$bastion = Get-AzBastion -Name "your_resource_name" -ResourceGroupName "your_resource_group_name"
+Set-AzBastion -InputObject $bastion
+```
+
+### Microsoft.Network/connections
+
+```azurepowershell-interactive
+Get-AzVirtualNetworkGatewayConnection -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzVirtualNetworkGatewayConnection
+
+```
+
+### Microsoft.Network/expressRouteCircuits
+
+```azurepowershell-interactive
+Get-AzExpressRouteCircuit -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzExpressRouteCircuit
+```
+
+### Microsoft.Network/expressRouteGateways
+
+```azurepowershell-interactive
+Get-AzExpressRouteGateway -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzExpressRouteGateway
+```
+
+> [!NOTE]
+> **Microsoft.Network/expressRouteGateways** are those gateways deployed within a Virtual WAN. If you have a standalone gateway of ExpressRoute type in your Virtual Network you need to execute the commands related to [Microsoft.Network/virtualNetworkGateways](#microsoftnetworkvirtualnetworkgateways).
+
+### Microsoft.Network/expressRoutePorts
+
+```azurepowershell-interactive
+Get-AzExpressRoutePort -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzExpressRoutePort
+```
+
+### Microsoft.Network/firewallPolicies
+
+```azurepowershell-interactive
+Get-AzFirewallPolicy -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzFirewallPolicy
+```
+
+### Microsoft.Network/loadBalancers
+
+```azurepowershell-interactive
+Get-AzLoadBalancer -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzLoadBalancer
+```
+
+### Microsoft.Network/localNetworkGateways
+
+```azurepowershell-interactive
+Get-AzLocalNetworkGateway -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzLocalNetworkGateway
+```
+
+### Microsoft.Network/natGateways
+
+```azurepowershell-interactive
+Get-AzNatGateway -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzNatGateway
+```
+
+### Microsoft.Network/networkInterfaces
+
+```azurepowershell-interactive
+Get-AzNetworkInterface -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzNetworkInterface
+```
+
+### Microsoft.Network/networkSecurityGroups
+
+```azurepowershell-interactive
+Get-AzNetworkSecurityGroup -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzNetworkSecurityGroup
+```
+
+### Microsoft.Network/networkVirtualAppliances
+
+```azurepowershell-interactive
+Get-AzNetworkVirtualAppliance -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Update-AzNetworkVirtualAppliance
+```
+> [!NOTE]
+> Most Virtual WAN related resources such as networkVirtualAppliances leverage the "Update" cmdlet and not the "Set" for write operations.
+>
+### Microsoft.Network/privateDnsZones
+
+```azurepowershell-interactive
+Get-AzPrivateDnsZone -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzPrivateDnsZone
+```
+
+### Microsoft.Network/privateEndpoints
+
+```azurepowershell-interactive
+Get-AzPrivateEndpoint -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzPrivateEndpoint
+```
+
+### Microsoft.Network/privateLinkServices
+
+```azurepowershell-interactive
+Get-AzPrivateLinkService -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzPrivateLinkService
+```
+
+### Microsoft.Network/publicIpAddresses
+
+```azurepowershell-interactive
+Get-AzPublicIpAddress -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzPublicIpAddress
+```
+
+### Microsoft.Network/routeFilters
+
+```azurepowershell-interactive
+Get-AzRouteFilter -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzRouteFilter
+```
+
+### Microsoft.Network/routeTables
+
+```azurepowershell-interactive
+Get-AzRouteTable -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzRouteTable
+```
+
+### Microsoft.Network/virtualHubs
+
+```azurepowershell-interactive
+Get-AzVirtualHub -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Update-AzVirtualHub
+```
+> [!NOTE]
+> Most Virtual WAN related resources such as virtualHubs leverage the "Update" cmdlet and not the "Set" for write operations.
+>
+### Microsoft.Network/virtualNetworkGateways
+
+```azurepowershell-interactive
+Get-AzVirtualNetworkGateway -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzVirtualNetworkGateway
+```
+
+### Microsoft.Network/virtualNetworks
+
+```azurepowershell-interactive
+Get-AzVirtualNetwork -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Set-AzVirtualNetwork
+```
+
+### Microsoft.Network/virtualWans
+
+```azurepowershell-interactive
+Get-AzVirtualWan -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Update-AzVirtualWan
+```
+> [!NOTE]
+> Most Virtual WAN related resources, such as virtualWans, use the "Update" cmdlet rather than "Set" for write operations.
+
+### Microsoft.Network/vpnGateways
+
+```azurepowershell-interactive
+Get-AzVpnGateway -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Update-AzVpnGateway
+```
+> [!NOTE]
+> 1. **Microsoft.Network/vpnGateways** are gateways deployed within a Virtual WAN. If you have a standalone VPN gateway in your virtual network, run the commands for [Microsoft.Network/virtualNetworkGateways](#microsoftnetworkvirtualnetworkgateways) instead.
+> 2. Most Virtual WAN related resources, such as vpnGateways, use the "Update" cmdlet rather than "Set" for write operations.
+
+### Microsoft.Network/vpnSites
+
+```azurepowershell-interactive
+Get-AzVpnSite -Name "your_resource_name" -ResourceGroupName "your_resource_group_name" | Update-AzVpnSite
+```
+> [!NOTE]
+> Most Virtual WAN related resources, such as vpnSites, use the "Update" cmdlet rather than "Set" for write operations.
+++
+## Next steps
+
+If the command didn't fix the failed state, it returns an error code.
+Most error codes contain a detailed description of the problem and offer hints to solve it.
+
+Open a support ticket with [Microsoft support](https://portal.azure.com/?#blade/Microsoft_Azure_Support/HelpAndSupportBlade) if you're still experiencing issues.
+Make sure you give the support agent both the error code you received in the latest operation and the timestamp of when the operation was executed.
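+To find the error code and timestamp of a failed operation again, you can query the activity log. The following is a minimal sketch, assuming the Az.Monitor module is installed and you substitute your own resource group name:
+
+```azurepowershell-interactive
+# List failed operations in the resource group over the last day,
+# showing the timestamp, operation name, and status details for support.
+Get-AzLog -ResourceGroupName "your_resource_group_name" -StartTime (Get-Date).AddDays(-1) |
+    Where-Object { $_.Status.Value -eq "Failed" } |
+    Select-Object EventTimestamp, OperationName, Status, SubStatus
+```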
private-link Availability https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/private-link/availability.md
The following tables list the Private Link services and the regions where they'r
|Azure Web Apps | All public regions<br/> China North 2 & East 2 | Supported with PremiumV2, PremiumV3, or Function Premium plan | GA <br/> [Learn how to create a private endpoint for Azure Web Apps.](./tutorial-private-endpoint-webapp-portal.md) | |Azure Search | All public regions <br/> All Government regions | Supported with service in Private Mode | GA <br/> [Learn how to create a private endpoint for Azure Search.](../search/service-create-private-endpoint.md) | |Azure Relay | All public regions | | Preview <br/> [Learn how to create a private endpoint for Azure Relay.](../azure-relay/private-link-service.md) |
+|Azure Static Web Apps | All public regions | | Preview <br/> [Configure private endpoint in Azure Static Web Apps](../static-web-apps/private-endpoint.md) |
### Private Link service with a standard load balancer
purview How To Automatically Label Your Content https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-automatically-label-your-content.md
Title: How to automatically apply sensitivity labels to your data in Azure Purview description: Learn how to create sensitivity labels and automatically apply them to your data during a scan.--++
purview How To Search Catalog https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/purview/how-to-search-catalog.md
Previously updated : 01/25/2022 Last updated : 04/11/2022 # Search the Azure Purview Data Catalog After data is scanned and ingested into the Azure Purview data map, data consumers need to easily find the data needed for their analytics or governance workloads. Data discovery can be time consuming because you may not know where to find the data that you want. Even after finding the data, you may have doubts about whether you can trust the data and take a dependency on it.
-The goal of search in Azure Purview is to speed up the process of data discovery to quickly find the data that matters. This article outlines how to search the Azure Purview data catalog to quickly find the data you're looking for.
+The goal of search in Azure Purview is to speed up the process of quickly finding the data that matters. This article outlines how to search the Azure Purview data catalog to quickly find the data you're looking for.
-## Search the catalog for assets
+## Searching the catalog
The search bar can be quickly accessed from the top bar of the Azure Purview Studio UX. In the data catalog home page, the search bar is in the center of the screen. :::image type="content" source="./media/how-to-search-catalog/purview-search-bar.png" alt-text="Screenshot showing the location of the Azure Purview search bar" border="true":::
-Once you click on the search bar, you'll be presented with your search history and the assets recently accessed in the data catalog. This allows you to quickly pick up from previous data exploration that was already done.
+Once you click on the search bar, you'll be presented with your search history and the items recently accessed in the data catalog. This allows you to quickly pick up from previous data exploration that was already done.
:::image type="content" source="./media/how-to-search-catalog/search-no-keywords.png" alt-text="Screenshot showing the search bar before any keywords have been entered" border="true":::
-Enter in keywords that help identify your asset such as its name, data type, classifications, and glossary terms. As you enter in search keywords, Azure Purview dynamically suggests assets and searches that may fit your needs. To complete your search, click on "View search results" or press "Enter".
+Enter keywords that help narrow down your search, such as name, data type, classifications, and glossary terms. As you enter search keywords, Azure Purview dynamically suggests results and searches that may fit your needs. To complete your search, click on "View search results" or press "Enter".
:::image type="content" source="./media/how-to-search-catalog/search-keywords.png" alt-text="Screenshot showing the search bar as a user enters in keywords" border="true":::
-Once you enter in your search, Azure Purview returns a list of data assets a user is a data reader for to that matched to the keywords entered in.
+Once you enter your search, Azure Purview returns a list of the data assets and glossary terms that you're a data reader for and that matched the keywords you entered.
The Azure Purview relevance engine sorts through all the matches and ranks them based on what it believes their usefulness is to a user. For example, a data consumer is likely more interested in a table curated by a data steward that matches on multiple keywords than an unannotated folder. Many factors determine an asset's relevance score, and the Azure Purview search team is constantly tuning the relevance engine to ensure the top search results have value to you. If the top results don't include the assets you're looking for, you can use the facets on the left-hand side to filter down by business metadata such as glossary terms, classifications, and the containing collection. If you're interested in a particular data source type such as Azure Data Lake Storage Gen2 or Azure SQL Database, you can use a pill filter to narrow down your search. > [!NOTE]
-> Search will only return assets in collections you're a data reader or curator for. For more information, see [create and manage Collections](how-to-create-and-manage-collections.md).
+> Search will only return items in collections you're a data reader or curator for. For more information, see [create and manage Collections](how-to-create-and-manage-collections.md).
:::image type="content" source="./media/how-to-search-catalog/search-results.png" alt-text="Screenshot showing the results of a search" border="true":::
The following table contains the operators that can be used to compose a search
| NOT | Specifies that an asset can't contain the keyword to the right of the NOT clause. Must be in all caps | The query `hive NOT database` returns assets that contain 'hive', but not 'database'. | | () | Groups a set of keywords and operators together. When combining multiple operators, parentheses specify the order of operations. | The query `hive AND (database OR warehouse)` returns assets that contain 'hive' and either 'database' or 'warehouse', or both. | | "" | Specifies exact content in a phrase that the query must match to. | The query `"hive database"` returns assets that contain the phrase "hive database" in their properties |
-| field:keyword | Searches the keyword in a specific attribute of an asset. Field search is case insensitive and is limited to the following fields at this time: <ul><li>name</li><li>description</li><li>entityType</li><li>assetType</li><li>classification</li><li>term</li><li>contact</li></ul> | The query `description: German` returns all assets that contain the word "German" in the description.<br><br>The query `term:Customer` will return all assets with glossary terms that include "Customer". |
+| field:keyword | Searches the keyword in a specific attribute of an asset. Field search is case insensitive and is limited to the following fields at this time: <ul><li>name</li><li>description</li><li>entityType</li><li>assetType</li><li>classification</li><li>term</li><li>contact</li></ul> | The query `description: German` returns all assets that contain the word "German" in the description.<br><br>The query `term:Customer` will return all assets with glossary terms that include "Customer" and all glossary terms that match to "Customer". |
> [!TIP]
-> Searching "*" will return all the assets in the catalog.
+> Searching "*" will return all the assets and glossary terms in the catalog.
### Known limitations * Searching for classifications only matches on the formal classification name. For example, the keywords "World Cities" don't match classification "MICROSOFT.GOVERNMENT.CITY_NAME".
-* Grouping isn't supported within a field search. Customers should use operators to connect field searches. For example,`name:(alice AND bob)` is invalid search syntax, but `name:alice AND name:bob` is supported.
+* Grouping isn't supported within a field search. Customers should use operators to connect field searches. For example,`name:(alice AND bob)` is invalid search syntax, but `name:alice AND name:bob` is supported.
## Next steps
search Cognitive Search How To Debug Skillset https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/search/cognitive-search-how-to-debug-skillset.md
Previously updated : 12/31/2021 Last updated : 04/10/2022 # Debug an Azure Cognitive Search skillset in Azure portal
-Start a debug session to identify and resolve errors, validate changes, and push changes to a published skillset in your Azure Cognitive Search service.
+Start a portal-based debug session to identify and resolve errors, validate changes, and push changes to a published skillset in your Azure Cognitive Search service.
A debug session is a cached indexer and skillset execution, scoped to a single document, that you can use to edit and test your changes interactively. If you're unfamiliar with how a debug session works, see [Debug sessions in Azure Cognitive Search](cognitive-search-debug-session.md). To practice a debug workflow with a sample document, see [Tutorial: Debug sessions](cognitive-search-tutorial-debug-sessions.md).
If skills produce output but the search index is empty, check the field mappings
:::image type="content" source="media/cognitive-search-debug/output-field-mappings.png" alt-text="Screenshot of the Output Field Mappings node and details." border="true":::
+## Debug a custom skill locally
+
+Custom skills can be more challenging to debug because the code runs externally. This section describes how to debug your Custom Web API skill locally using a debug session, Visual Studio Code, and [ngrok](https://ngrok.com/docs). This technique works with custom skills that execute in [Azure Functions](../azure-functions/functions-overview.md) or any other web framework that runs locally (for example, [FastAPI](https://fastapi.tiangolo.com/)).
+
+### Run ngrok
+
+[**ngrok**](https://ngrok.com/docs) is a cross-platform application that can create a tunneling or forwarding URL, so that internet requests reach your local machine. Use ngrok to forward requests from an enrichment pipeline in your search service to your machine to allow local debugging.
+
+1. Install ngrok.
+
+1. Open a terminal and go to the folder with the ngrok executable.
+
+1. Run ngrok with the following command to create a new tunnel:
+
+ ```console
+ ngrok http 7071
+ ```
+
+ > [!NOTE]
+ > By default, Azure Functions are exposed on 7071. Other tools and configurations might require that you provide a different port.
+
+1. When ngrok starts, copy and save the public forwarding URL for the next step. The forwarding URL is randomly generated.
+
+ :::image type="content" source="media/cognitive-search-debug/ngrok.png" alt-text="Screenshot of ngrok terminal." border="false":::
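+With the tunnel up, also make sure the skill host itself is running locally so that forwarded requests have something to reach. For example, if the custom skill is an Azure Function, you can start it with Azure Functions Core Tools (a sketch; assumes the tools are installed and you're in the function project folder):
+
+    ```console
+    func start
+    ```
+
+By default this hosts the function on port 7071, matching the ngrok command above.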
+
+### Configure in Azure portal
+
+Within the debug session, modify your Custom Web API Skill URI to call the ngrok forwarding URL. Ensure that you append "/api/FunctionName" when using an Azure Function to execute the skillset code.
+
+You can edit the skill definition in the portal.
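+For example, a Custom Web API skill definition edited for local debugging might look like the following sketch. The forwarding URL, function name, and input/output fields are placeholders for your own values:
+
+```json
+{
+  "@odata.type": "#Microsoft.Skills.Custom.WebApiSkill",
+  "description": "Custom skill being debugged locally",
+  "uri": "https://<random-id>.ngrok.io/api/YourFunctionName",
+  "inputs": [
+    { "name": "text", "source": "/document/content" }
+  ],
+  "outputs": [
+    { "name": "result", "targetName": "result" }
+  ]
+}
+```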
+
+### Test
+
+At this point, new requests from your debug session are sent to your local Azure Function. You can set breakpoints in Visual Studio Code to debug your code or step through it.
+ ## Next steps Now that you understand the layout and capabilities of the Debug Sessions visual editor, try the tutorial for a hands-on experience.
sentinel Enable Entity Behavior Analytics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/sentinel/enable-entity-behavior-analytics.md
To enable or disable this feature (these prerequisites are not required to use t
- Your workspace must not have any Azure resource locks applied to it. [Learn more about Azure resource locking](../azure-resource-manager/management/lock-resources.md). > [!NOTE]
-> No special license is required to add UEBA functionality to Microsoft Sentinel, but **additional charges** may apply.
+> - No special license is required to add UEBA functionality to Microsoft Sentinel, and there's no additional cost for using it.
+> - However, since UEBA generates new data and stores it in new tables that UEBA creates in your Log Analytics workspace, **additional data storage charges** will apply.
## How to enable User and Entity Behavior Analytics
spatial-anchors Get Started Unity Android https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spatial-anchors/quickstarts/get-started-unity-android.md
Follow the instructions [here](../how-tos/setup-unity-project.md#download-asa-pa
[!INCLUDE [Android Unity Build Settings](../../../includes/spatial-anchors-unity-android-build-settings.md)]
+## Configure the account information
[!INCLUDE [Configure Unity Scene](../../../includes/spatial-anchors-unity-configure-scene.md)] ## Export the Android Studio project
spatial-anchors Get Started Unity Hololens https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spatial-anchors/quickstarts/get-started-unity-hololens.md
Follow the instructions [here](../how-tos/setup-unity-project.md#download-asa-pa
[!INCLUDE [Open Unity Project](../../../includes/spatial-anchors-open-unity-project.md)]
-Open **Build Settings** by selecting **File** > **Build Settings**.
--
-In the **Platform** section, select **Universal Windows Platform**. Change the **Target Device** to **HoloLens**.
-
-Select **Switch Platform** to change the platform to **Universal Windows Platform**. Unity might prompt you to install UWP support components if they're missing.
-
-![Unity Build Settings window](./media/get-started-unity-hololens/unity-build-settings.png)
-
-Close the **Build Settings** window.
+## Configure the account information
[!INCLUDE [Configure Unity Scene](../../../includes/spatial-anchors-unity-configure-scene.md)] ## Export the HoloLens Visual Studio project
spatial-anchors Get Started Unity Ios https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spatial-anchors/quickstarts/get-started-unity-ios.md
Follow the instructions [here](../how-tos/setup-unity-project.md#download-asa-pa
[!INCLUDE [iOS Unity Build Settings](../../../includes/spatial-anchors-unity-ios-build-settings.md)]
+## Configure the account information
[!INCLUDE [Configure Unity Scene](../../../includes/spatial-anchors-unity-configure-scene.md)] ## Export the Xcode project
spatial-anchors Tutorial Share Anchors Across Devices https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spatial-anchors/tutorials/tutorial-share-anchors-across-devices.md
In this tutorial, you'll learn how to:
[!INCLUDE [Create Spatial Anchors resource](../../../includes/spatial-anchors-get-started-create-resource.md)]
-## Download the sample project
+## Download the sample project + import SDK
+
+### Clone Samples Repo
[!INCLUDE [Clone Sample Repo](../../../includes/spatial-anchors-clone-sample-repository.md)]
+### Import ASA SDK
+
+Follow the instructions [here](../how-tos/setup-unity-project.md#download-asa-packages) to download and import the ASA SDK packages required for the HoloLens platform.
+ ## Deploy the Sharing Anchors service ## [Visual Studio](#tab/VS)
To deploy the sharing service through Visual Studio Code, follow the instruction
-## Deploy the sample app
+## Configure + deploy the sample app
[!INCLUDE [Run Share Anchors Sample](../../../includes/spatial-anchors-run-share-sample.md)]
spring-cloud Quickstart Sample App Introduction https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/spring-cloud/quickstart-sample-app-introduction.md
There are several common patterns in distributed systems that support core servi
## Database configuration
-In its default configuration, **Pet Clinic** uses an in-memory database (HSQLDB) which is populated at startup with data. A similar setup is provided for MySql if a persistent database configuration is needed. A dependency for Connector/J, the MySQL JDBC driver, is already included in the pom.xml files.
+In its default configuration, **Pet Clinic** uses an in-memory database (HSQLDB) which is populated at startup with data. A similar setup is provided for MySQL if a persistent database configuration is needed. A dependency for Connector/J, the MySQL JDBC driver, is already included in the pom.xml files.
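+For reference, the Connector/J dependency in each pom.xml typically looks like the following sketch (Spring Boot's dependency management supplies the version):
+
+```xml
+<dependency>
+    <groupId>mysql</groupId>
+    <artifactId>mysql-connector-java</artifactId>
+    <scope>runtime</scope>
+</dependency>
+```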
## Sample usage of PetClinic
static-web-apps Build Configuration https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/static-web-apps/build-configuration.md
The following table lists the available configuration settings.
| `app_build_command` | For Node.js applications, you can define a custom command to build the static content application.<br><br>For example, to configure a production build for an Angular application create an npm script named `build-prod` to run `ng build --prod` and enter `npm run build-prod` as the custom command. If left blank, the workflow tries to run the `npm run build` or `npm run build:azure` commands. | No | | `api_build_command` | For Node.js applications, you can define a custom command to build the Azure Functions API application. | No | | `skip_app_build` | Set the value to `true` to skip building the front-end app. | No |
+| `skip_api_build` | Set the value to `true` to skip building the API functions. | No |
| `cwd`<br />(Azure Pipelines only) | Absolute path to the working folder. Defaults to `$(System.DefaultWorkingDirectory)`. | No | | `build_timeout_in_minutes` | Set this value to customize the build timeout. Defaults to `15`. | No |
inputs:
```
+## Skip building the API
+If you want to skip building the API, you can bypass the automatic build and deploy an API that was built in a previous step.
> [!NOTE]
-> You can only skip the build for the front-end app. The API is always built if it exists.
+> Currently, the `skip_api_build` setting is only supported in GitHub Actions, not Azure Pipelines.
+
+Steps to skip building the API:
+
+- In the *staticwebapp.config.json* file, set `apiRuntime` to the correct language and version. Refer to [Configure Azure Static Web Apps](configuration.md#selecting-the-api-language-runtime-version) for the list of supported languages and versions.
+```json
+{
+ "platform": {
+ "apiRuntime": "node:16"
+ }
+}
+```
+- Set `skip_api_build` to `true`.
+
+# [GitHub Actions](#tab/github-actions)
+
+```yml
+...
+
+with:
+ azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN }}
+ repo_token: ${{ secrets.GITHUB_TOKEN }}
+ action: 'upload'
+ app_location: "src" # App source code path relative to repository root
+ api_location: "api" # Api source code path relative to repository root - optional
+ output_location: "public" # Built app content directory, relative to app_location - optional
+ skip_api_build: true
+```
+# [Azure Pipelines](#tab/azure-devops)
+This feature is unsupported in Azure Pipelines.
++ ## Extend build timeout
storage Blob Upload Function Trigger https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/blob-upload-function-trigger.md
+
+ Title: Upload and analyze a file with Azure Functions and Blob Storage
+description: Learn how to upload an image to Azure Blob Storage and analyze its content using Azure Functions and Cognitive Services
++++ Last updated : 3/11/2022++
+# Tutorial: Upload and analyze a file with Azure Functions and Blob Storage
+
+In this tutorial, you'll learn how to upload an image to Azure Blob Storage and process it using Azure Functions and Computer Vision. You'll also learn how to implement Azure Function triggers and bindings as part of this process. Together, these services will analyze an uploaded image that contains text, extract the text out of it, and then store the text in a database row for later analysis or other purposes.
+
+Azure Blob Storage is Microsoft's massively scalable object storage solution for the cloud. Blob Storage is designed for storing images and documents, streaming media files, managing backup and archive data, and much more. You can read more about Blob Storage on the [overview page](/azure/storage/blobs/storage-blobs-introduction).
+
+Azure Functions is a serverless compute solution that allows you to write and run small blocks of code as highly scalable, serverless, event-driven functions. You can read more about Azure Functions on the [overview page](/azure/azure-functions/functions-overview).
++
+In this tutorial, you will learn how to:
+
+> [!div class="checklist"]
+> * Upload images and files to Blob Storage
+> * Use an Azure Function event trigger to process data uploaded to Blob Storage
+> * Use Cognitive Services to analyze an image
+> * Write data to Table Storage using Azure Function output bindings
++
+## Prerequisites
+
+- An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
+- [Visual Studio 2022](https://visualstudio.microsoft.com) installed.
+
+
+## Create the storage account and container
+The first step is to create the storage account that will hold the uploaded blob data, which in this scenario will be images that contain text. A storage account offers several different services, but this tutorial utilizes Blob Storage and Table Storage.
+
+### [Azure portal](#tab/azure-portal)
+
+Sign in to the [Azure portal](https://ms.portal.azure.com/#create/Microsoft.StorageAccount).
+
+1) In the search bar at the top of the portal, search for *Storage* and select the result labeled **Storage accounts**.
+
+2) On the **Storage accounts** page, select **+ Create** in the top left.
+
+3) On the **Create a storage account** page, enter the following values:
+
+ - **Subscription**: Choose your desired subscription.
+ - **Resource Group**: Select **Create new** and enter a name of `msdocs-storage-function`, and then choose **OK**.
+ - **Storage account name**: Enter a value of `msdocsstoragefunction`. The Storage account name must be unique across Azure, so you may need to add numbers after the name, such as `msdocsstoragefunction123`.
+ - **Region**: Select the region that is closest to you.
+ - **Performance**: Choose **Standard**.
+ - **Redundancy**: Leave the default value selected.
+
+
+4) Select **Review + Create** at the bottom and Azure will validate the information you entered. Once the settings are validated, choose **Create** and Azure will begin provisioning the storage account, which might take a moment.
+
+### Create the container
+1) After the storage account is provisioned, select **Go to Resource**. The next step is to create a storage container inside of the account to hold uploaded images for analysis.
+
+2) On the navigation panel, choose **Containers**.
+
+3) On the **Containers** page, select **+ Container** at the top. In the slide out panel, enter a **Name** of *imageanalysis*, and make sure the **Public access level** is set to **Blob (anonymous read access for blobs only)**. Then select **Create**.
++
+You should see your new container appear in the list of containers.
+
+### Retrieve the connection string
+
+The last step is to retrieve our connection string for the storage account.
+
+1) On the left navigation panel, select **Access Keys**.
+
+2) On the **Access Keys page**, select **Show keys**. Copy the value of the **Connection String** under the **key1** section and paste this somewhere to use for later. You'll also want to make a note of the storage account name `msdocsstoragefunction` for later as well.
++
+These values will be necessary when we need to connect our Azure Function to this storage account.
+
+### [Azure CLI](#tab/azure-cli)
+
+Azure CLI commands can be run in the [Azure Cloud Shell](https://shell.azure.com) or on a workstation with the [Azure CLI installed](/cli/azure/install-azure-cli).
+
+To create the storage account and container, we can run the CLI commands seen below.
+
+```azurecli-interactive
+az group create --location eastus --name msdocs-storage-function
+
+az storage account create --name msdocsstoragefunction --resource-group msdocs-storage-function -l eastus --sku Standard_LRS
+
+az storage container create --name imageanalysis --account-name msdocsstoragefunction --public-access blob
+```
+
+You may need to wait a few moments for Azure to provision these resources.
+
+After the commands complete, we also need to retrieve the connection string for the storage account. The connection string will be used later to connect our Azure Function to the storage account.
+
+```azurecli-interactive
+az storage account show-connection-string -g msdocs-storage-function -n msdocsstoragefunction
+```
+
+Copy the value of the `connectionString` property and paste it somewhere to use for later. You'll also want to make a note of the storage account name `msdocsstoragefunction` for later as well.
+++
+## Create the Computer Vision service
+Next, create the Computer Vision service account that will process our uploaded files. Computer Vision is part of Azure Cognitive Services and offers a variety of features for extracting data out of images. You can learn more about Computer Vision on the [overview page](/services/cognitive-services/computer-vision/#overview).
+
+### [Azure portal](#tab/azure-portal)
+
+1) In the search bar at the top of the portal, search for *Computer* and select the result labeled **Computer vision**.
+
+2) On the **Computer vision** page, select **+ Create**.
+
+3) On the **Create Computer Vision** page, enter the following values:
+
+- **Subscription**: Choose your desired Subscription.
+- **Resource Group**: Use the `msdocs-storage-function` resource group you created earlier.
+- **Region**: Select the region that is closest to you.
+- **Name**: Enter in a name of `msdocscomputervision`.
+- **Pricing Tier**: Choose **Free** if it is available, otherwise choose **Standard S1**.
+- Check the **Responsible AI Notice** box if you agree to the terms
+
+
+4) Select **Review + Create** at the bottom. Azure will take a moment to validate the information you entered. Once the settings are validated, choose **Create** and Azure will begin provisioning the Computer Vision service, which might take a moment.
+
+5) When the operation has completed, select **Go to Resource**.
+
+### Retrieve the keys
+
+Next, we need to find the secret key and endpoint URL for the Computer Vision service to use in our Azure Function app.
+
+1) On the **Computer Vision** overview page, select **Keys and Endpoint**.
+
+2) On the **Keys and Endpoint** page, copy the **Key 1** and **Endpoint** values and paste them somewhere to use for later.
++
+### [Azure CLI](#tab/azure-cli)
+
+To create the Computer Vision service, we can run the CLI command below.
+
+```azurecli-interactive
+az cognitiveservices account create \
+ --name msdocs-process-image \
+ --resource-group msdocs-storage-function \
+ --kind ComputerVision \
+ --sku F0 \
+ --location eastus2 \
+ --yes
+```
+
+You may need to wait a few moments for Azure to provision these resources.
+
+Once the Computer Vision service is created, you can retrieve the secret keys and URL endpoint using the commands below.
+
+```azurecli-interactive
+az cognitiveservices account keys list \
+    --name msdocs-process-image \
+    --resource-group msdocs-storage-function
+
+az cognitiveservices account show \
+    --name msdocs-process-image \
+    --resource-group msdocs-storage-function --query "properties.endpoint"
+```
++
+
+
+## Download and configure the sample project
+The code for the Azure Function used in this tutorial can be found in [this GitHub repository](https://github.com/Azure-Samples/msdocs-storage-bind-function-service/tree/main/dotnet). You can also clone the project using the command below.
+
+```terminal
+git clone https://github.com/Azure-Samples/msdocs-storage-bind-function-service.git
+cd msdocs-storage-bind-function-service/dotnet
+```
+
+The sample project code accomplishes the following tasks:
+
+- Retrieves environment variables to connect to the storage account and Computer Vision service
+- Accepts the uploaded file as a blob parameter
+- Analyzes the blob using the Computer Vision service
+- Sends the analyzed image text to a new table row using output bindings
+
+Once you have downloaded and opened the project, there are a few essential concepts to understand in the main `Run` method shown below. The Azure function utilizes Trigger and Output bindings, which are applied using attributes on the `Run` method signature.
+
+The `Table` attribute uses two parameters. The first parameter specifies the name of the table to write the parsed image text value returned by the function. The second `Connection` parameter pulls a Table Storage connection string from the environment variables so that our Azure function has access to it.
+
+The `BlobTrigger` attribute is used to bind our function to the upload event in Blob Storage, and supplies that uploaded blob to the `Run` function. The blob trigger has two parameters of its own - one for the name of the blob container to monitor for uploads, and one for the connection string of our storage account again.
++
+```csharp
+// Azure Function name and output Binding to Table Storage
+[FunctionName("ProcessImageUpload")]
+[return: Table("ImageText", Connection = "StorageConnection")]
+// Trigger binding runs when an image is uploaded to the blob container below
+public async Task<ImageContent> Run([BlobTrigger("imageanalysis/{name}",
+ Connection = "StorageConnection")]Stream myBlob, string name, ILogger log)
+{
+ // Get connection configurations
+ string subscriptionKey = Environment.GetEnvironmentVariable("ComputerVisionKey");
+ string endpoint = Environment.GetEnvironmentVariable("ComputerVisionEndpoint");
+    string imgUrl = $"https://{Environment.GetEnvironmentVariable("StorageAccountName")}.blob.core.windows.net/imageanalysis/{name}";
+
+ ComputerVisionClient client = new ComputerVisionClient(
+ new ApiKeyServiceClientCredentials(subscriptionKey)) { Endpoint = endpoint };
+
+ // Get the analyzed image contents
+ var textContext = await AnalyzeImageContent(client, imgUrl);
+
+ return new ImageContent {
+ PartitionKey = "Images",
+ RowKey = Guid.NewGuid().ToString(), Text = textContext
+ };
+}
+
+public class ImageContent
+{
+ public string PartitionKey { get; set; }
+ public string RowKey { get; set; }
+ public string Text { get; set; }
+}
+```
+
+This code also retrieves essential configuration values from environment variables, such as the storage account connection string and Computer Vision key. We'll add these environment variables to our Azure Function environment after it's deployed.
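As a concrete illustration of how one of those values is used, the blob URL assembled in `Run` is just the `StorageAccountName` setting and the blob name spliced into the standard Blob Storage endpoint format. Here is a minimal shell sketch of the same concatenation; the account and file names below are placeholder values, not values from the sample:

```shell
# Mirrors the C# interpolation in Run: build the image URL from the
# StorageAccountName setting, the container name, and the blob name.
# Account and file names are placeholders.
StorageAccountName="mystorageaccount"   # would come from app settings
name="sample.jpg"                       # supplied by the blob trigger

imgUrl="https://${StorageAccountName}.blob.core.windows.net/imageanalysis/${name}"
echo "$imgUrl"
```

Because the function passes this URL to the Computer Vision service, the service must be able to reach the blob at that address.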
+
+The `ProcessImageUpload` function also relies on a second method to analyze the image, seen below. This code uses the endpoint URL and key of the Computer Vision account to make a request to that service to process the image. The request returns all of the text discovered in the image, which is then written to Table Storage by the output binding on the `Run` method.
+
+```csharp
+static async Task<string> ReadFileUrl(ComputerVisionClient client, string urlFile)
+{
+ // Analyze the file using Computer Vision Client
+ var textHeaders = await client.ReadAsync(urlFile);
+ string operationLocation = textHeaders.OperationLocation;
+ Thread.Sleep(2000);
+
+    // Complete code omitted for brevity; the omitted portion polls the
+    // operation and builds the `text` result. View it in the sample project.
+
+ return text.ToString();
+}
+```
+
+### Running locally
+
+If you'd like to run the project locally, you can populate the environment variables using the local.settings.json file. Inside of this file, fill in the placeholder values with the values you saved earlier when creating the Azure resources.
+
+Although the Azure Function code will run locally, it will still connect to the live services out on Azure, rather than using any local emulators.
+
+```json
+{
+ "IsEncrypted": false,
+ "Values": {
+ "AzureWebJobsStorage": "UseDevelopmentStorage=true",
+ "FUNCTIONS_WORKER_RUNTIME": "dotnet",
+ "StorageConnection": "your-storage-account-connection-string",
+ "StorageAccountName": "your-storage-account-name",
+ "ComputerVisionKey": "your-computer-vision-key",
+    "ComputerVisionEndpoint": "your-computer-vision-endpoint"
+ }
+}
+```
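The setting names in this file must exactly match the names the code reads with `Environment.GetEnvironmentVariable`. As a quick sanity check before starting the host, you could verify each expected key is present; this sketch recreates the sample file above and greps for each name:

```shell
# Recreate the sample local.settings.json (same contents as above), then
# verify each setting name the function code reads is present in the file.
cat > local.settings.json <<'EOF'
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "StorageConnection": "your-storage-account-connection-string",
    "StorageAccountName": "your-storage-account-name",
    "ComputerVisionKey": "your-computer-vision-key",
    "ComputerVisionEndpoint": "your-computer-vision-endpoint"
  }
}
EOF

for key in StorageConnection StorageAccountName ComputerVisionKey ComputerVisionEndpoint; do
  grep -q "\"$key\"" local.settings.json && echo "$key: present" || echo "$key: MISSING"
done
```

Since the check is case-sensitive, a typo such as `ComputerVisionEndPoint` in the file would show up here as `MISSING`.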
+
+## Deploy the code to Azure Functions
+
+You are now ready to deploy the application to Azure by using Visual Studio. You can also create the Azure Functions app in Azure as part of the deployment process.
+
+1) To begin, right-click the **ProcessImage** project node and select **Publish**.
+
+2) On the **Publish** dialog screen, select **Azure** and choose **Next**.
+
+
+3) Select **Azure Function App (Windows)** or **Azure Function App (Linux)** on the next screen, and then choose **Next** again.
+
+4) On the **Functions instance** step, make sure to choose the subscription you'd like to deploy to. Next, select the green **+** symbol on the right side of the dialog.
+
+5) A new dialog will open. Enter the following values for your new Function App.
+
+- **Name**: Enter *msdocsprocessimage* or something similar.
+- **Subscription Name**: Choose whatever subscription you'd like to use.
+- **Resource Group**: Choose the `msdocs-storage-function` resource group you created earlier.
+- **Plan Type**: Select **Consumption**.
+- **Location**: Choose the region closest to you.
+- **Azure Storage**: Select the storage account you created earlier.
+
+6) Once you have filled in all of those values, select **Create**. Visual Studio and Azure will begin provisioning the requested resources, which will take a few moments to complete.
+
+7) Once the process has finished, select **Finish** to close out the dialog workflow.
+
+8) The final step to deploy the Azure Function is to select **Publish** in the upper right of the screen. Publishing the function might also take a few moments to complete. Once it finishes, your application will be running on Azure.
+
+## Connect the services
+
+The Azure Function was deployed successfully, but it can't connect to the storage account and Computer Vision service yet. The correct keys and connection strings must first be added to the configuration settings of the Azure Functions app.
+
+1) At the top of the Azure portal, search for *function* and select **Function App** from the results.
+
+2) On the **Function App** screen, select the Function App you created in Visual Studio.
+
+3) On the **Function App** overview page, select **Configuration** in the left navigation. This opens a page where you can manage various types of configuration settings for the app. For now, you're interested in the **Application settings** section.
+
+4) The next step is to add settings for our storage account name and connection string, the Computer Vision secret key, and the Computer Vision endpoint.
+
+5) On the **Application settings** tab, select **+ New application setting**. In the flyout that appears, enter the following values:
+
+- **Name**: Enter a value of *ComputerVisionKey*.
+- **Value**: Paste in the Computer Vision key you saved from earlier.
+
+6) Click **OK** to add this setting to your app.
+
+7) Next, let's repeat this process for the endpoint of our Computer Vision service, using the following values:
+
+- **Name**: Enter a value of *ComputerVisionEndpoint*.
+- **Value**: Paste in the endpoint URL you saved from earlier.
+
+8) Repeat this step again for the storage account connection, using the following values:
+
+- **Name**: Enter a value of *StorageConnection*.
+- **Value**: Paste in the connection string you saved from earlier.
+
+9) Finally, repeat this process one more time for the storage account name, using the following values:
+
+- **Name**: Enter a value of *StorageAccountName*.
+- **Value**: Enter in the name of the storage account you created.
+
+10) After you have added these application settings, make sure to select **Save** at the top of the configuration page. When the save completes, you can also select **Refresh** to make sure the settings are picked up.
+
+All of the environment variables required to connect the Azure Function to the different services are now in place.
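One detail worth noting about these settings: `StorageConnection` is a standard Azure Storage connection string, which already embeds the account name; the sample reads `StorageAccountName` separately only because it needs the bare name to build the blob URL. A sketch of how the name is embedded, using a fabricated placeholder connection string:

```shell
# A storage connection string is a semicolon-separated key=value list that
# includes AccountName. The value below is a made-up example, not a real key.
conn="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=notarealkey==;EndpointSuffix=core.windows.net"

# Extract the AccountName field.
accountName=$(printf '%s' "$conn" | tr ';' '\n' | sed -n 's/^AccountName=//p')
echo "$accountName"   # -> mystorageaccount
```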
+
+## Upload an image to Blob Storage
+
+You are now ready to test the application. Upload a blob to the container, and then verify that the text in the image was saved to Table Storage.
+
+1) First, at the top of the Azure portal, search for *storage* and select **Storage accounts**. On the **Storage accounts** page, select the account you created earlier.
+
+2) Next, select **Containers** in the left navigation, and then navigate into the *imageanalysis* container you created earlier. From here you can upload a test image right inside the browser.
+
+3) You can find a few sample images included in the **images** folder at the root of the downloadable sample project, or you can use one of your own.
+
+4) At the top of the *imageanalysis* container page, select **Upload**. In the flyout that opens, select the folder icon on the right to open a file browser. Choose the image you'd like to upload, and then select **Upload**.
+
+5) The file should appear inside of your blob container. Next, you can verify that the upload triggered the Azure Function, and that the text in the image was analyzed and saved to Table Storage properly.
+
+6) Using the breadcrumbs at the top of the page, navigate up one level in your storage account. Locate and select **Storage browser** on the left nav, and then select **Tables**.
+
+7) An **ImageText** table should now be available. Select the table to preview the data rows inside it. You should see an entry for the processed image text of your upload. You can verify this using either the **Timestamp** or the content of the **Text** column.
+
+Congratulations! You used Azure Functions and Computer Vision to process an image that was uploaded to Blob Storage.
+
+## Clean up resources
+
+If you're not going to continue to use this application, you can delete the resources you created by removing the resource group.
+
+1) Select **Resource groups** from the main navigation.
+2) Select the `msdocs-storage-function` resource group from the list.
+3) Select the **Delete resource group** button at the top of the resource group overview page.
+4) Enter the resource group name *msdocs-storage-function* in the confirmation dialog.
+5) Select **Delete**.
+
+The process to delete the resource group may take a few minutes to complete.
storage Data Lake Storage Query Acceleration How To https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/data-lake-storage-query-acceleration-how-to.md
Previously updated : 01/06/2021 Last updated : 04/11/2022
Query acceleration enables applications and analytics frameworks to dramatically
-## Enable query acceleration
-
-To use query acceleration, you must register the query acceleration feature with your subscription. Once you've verified that the feature is registered, you must register the Azure Storage resource provider.
-
-### Step 1: Register the query acceleration feature
-
-To use query acceleration, you must first register the query acceleration feature with your subscription.
-
-#### [PowerShell](#tab/powershell)
-
-1. Open a Windows PowerShell command window.
-
-1. Sign in to your Azure subscription with the `Connect-AzAccount` command and follow the on-screen directions.
-
- ```powershell
- Connect-AzAccount
- ```
-
-2. If your identity is associated with more than one subscription, then set your active subscription.
-
- ```powershell
- $context = Get-AzSubscription -SubscriptionId <subscription-id>
- Set-AzContext $context
- ```
-
- Replace the `<subscription-id>` placeholder value with the ID of your subscription.
-
-3. Register the query acceleration feature by using the [Register-AzProviderFeature](/powershell/module/az.resources/register-azproviderfeature) command.
-
- ```powershell
- Register-AzProviderFeature -ProviderNamespace Microsoft.Storage -FeatureName BlobQuery
- ```
-
-#### [Azure CLI](#tab/azure-cli)
-
-1. Open the [Azure Cloud Shell](../../cloud-shell/overview.md), or if you've [installed](/cli/azure/install-azure-cli) the Azure CLI locally, open a command console application such as Windows PowerShell.
-
-2. If your identity is associated with more than one subscription, then set your active subscription to the subscription of the storage account.
-
- ```azurecli-interactive
- az account set --subscription <subscription-id>
- ```
-
- Replace the `<subscription-id>` placeholder value with the ID of your subscription.
-
-3. Register the query acceleration feature by using the [az feature register](/cli/azure/feature#az-feature-register) command.
-
- ```azurecli
- az feature register --namespace Microsoft.Storage --name BlobQuery
- ```
---
-### Step 2: Verify that the feature is registered
-
-#### [PowerShell](#tab/powershell)
-
-To verify that the registration is complete, use the [Get-AzProviderFeature](/powershell/module/az.resources/get-azproviderfeature) command.
-
-```powershell
-Get-AzProviderFeature -ProviderNamespace Microsoft.Storage -FeatureName BlobQuery
-```
-
-#### [Azure CLI](#tab/azure-cli)
-
-To verify that the registration is complete, use the [az feature](/cli/azure/feature#az-feature-show) command.
-
-```azurecli
-az feature show --namespace Microsoft.Storage --name BlobQuery
-```
---
-### Step 3: Register the Azure Storage resource provider
-
-After your registration is approved, you must re-register the Azure Storage resource provider.
-
-#### [PowerShell](#tab/powershell)
-
-To register the resource provider, use the [Register-AzResourceProvider](/powershell/module/az.resources/register-azresourceprovider) command.
-
-```powershell
-Register-AzResourceProvider -ProviderNamespace 'Microsoft.Storage'
-```
-
-#### [Azure CLI](#tab/azure-cli)
-
-To register the resource provider, use the [az provider register](/cli/azure/provider#az-provider-register) command.
-
-```azurecli
-az provider register --namespace 'Microsoft.Storage'
-```
--- ## Set up your environment ### Step 1: Install packages
storage Storage Blob Append https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-append.md
public static async void AppendToBlob
- [Understanding block blobs, append blobs, and page blobs](/rest/api/storageservices/understanding-block-blobs--append-blobs--and-page-blobs) - [OpenWrite](/dotnet/api/azure.storage.blobs.specialized.appendblobclient.openwrite) / [OpenWriteAsync](/dotnet/api/azure.storage.blobs.specialized.appendblobclient.openwriteasync)-- [Append Block](/api/storageservices/append-block) (REST API)
+- [Append Block](/rest/api/storageservices/append-block) (REST API)
storage Storage Blob Delete https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-blob-delete.md
public static void RestoreBlobsWithVersioning(BlobContainerClient container, Blo
- [Get started with Azure Blob Storage and .NET](storage-blob-dotnet-get-started.md) - [Delete Blob](/rest/api/storageservices/delete-blob) (REST API) - [Soft delete for blobs](soft-delete-blob-overview.md)-- [Undelete Blob](//rest/api/storageservices/undelete-blob) (REST API)
+- [Undelete Blob](/rest/api/storageservices/undelete-blob) (REST API)
storage Storage Monitor Troubleshoot Storage Application https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-monitor-troubleshoot-storage-application.md
- Title: Monitor and troubleshoot a cloud storage application in Azure | Microsoft Docs
-description: Use diagnostic tools, metrics, and alerting to troubleshoot and monitor a cloud application.
---- Previously updated : 07/20/2018----
-# Monitor and troubleshoot a cloud storage application
-
-This tutorial is part four and the final part of a series. You learn how to monitor and troubleshoot a cloud storage application.
-
-In part four of the series, you learn how to:
-
-> [!div class="checklist"]
-> - Turn on logging and metrics
-> - Enable alerts for authorization errors
-> - Run test traffic with incorrect SAS tokens
-> - Download and analyze logs
-
-[Azure storage analytics](../common/storage-analytics.md) provides logging and metric data for a storage account. This data provides insights into the health of your storage account. To collect data from Azure storage analytics, you can configure logging, metrics and alerts. This process involves turning on logging, configuring metrics, and enabling alerts.
-
-Logging and metrics from storage accounts are enabled from the **Diagnostics** tab in the Azure portal. Storage logging enables you to record details for both successful and failed requests in your storage account. These logs enable you to see details of read, write, and delete operations against your Azure tables, queues, and blobs. They also enable you to see the reasons for failed requests such as timeouts, throttling, and authorization errors.
-
-## Log in to the Azure portal
-
-Log in to the [Azure portal](https://portal.azure.com).
-
-## Turn on logging and metrics
-
-From the left menu, select **Resource Groups**, select **myResourceGroup**, and then select your storage account in the resource list.
-
-Under **Diagnostics settings (classic)** set **Status** to **On**. Ensure all of the options under **Blob properties** are enabled.
-
-When complete, click **Save**.
-
-![Screenshot that highlights the section that contains the configuration settings for turning on logging and metrics.](media/storage-monitor-troubleshoot-storage-application/enable-diagnostics.png)
-
-## Enable alerts
-
-Alerts provide a way to email administrators or trigger a webhook based on a metric breaching a threshold. In this example, you enable an alert for the `SASClientOtherError` metric.
-
-### Navigate to the storage account in the Azure portal
-
-Under the **Monitoring** section, select **Alerts (classic)**.
-
-Select **Add metric alert (classic)** and complete the **Add rule** form by filling in the required information. From the **Metric** dropdown, select `SASClientOtherError`. To allow your alert to trigger upon the first error, from the **Condition** dropdown select **Greater than or equal to**.
-
-![Diagnostics pane](media/storage-monitor-troubleshoot-storage-application/add-alert-rule.png)
-
-## Simulate an error
-
-To simulate a valid alert, you can attempt to request a non-existent blob from your storage account. The following command requires a storage container name. You can either use the name of an existing container or create a new one for the purposes of this example.
-
-Replace the placeholders with real values (make sure `<INCORRECT_BLOB_NAME>` is set to a value that does not exist) and run the command.
-
-```azurecli-interactive
-sasToken=$(az storage blob generate-sas \
- --account-name <STORAGE_ACCOUNT_NAME> \
- --account-key <STORAGE_ACCOUNT_KEY> \
- --container-name <CONTAINER_NAME> \
- --name <INCORRECT_BLOB_NAME> \
- --permissions r \
- --expiry `date --date="next day" +%Y-%m-%d`)
-
-curl https://<STORAGE_ACCOUNT_NAME>.blob.core.windows.net/<CONTAINER_NAME>/<INCORRECT_BLOB_NAME>?$sasToken
-```
-
-The following image is an example alert based on the simulated failure run with the preceding example.
-
- ![Example alert](media/storage-monitor-troubleshoot-storage-application/email-alert.png)
-
-## Download and view logs
-
-Storage logs store data in a set of blobs in a blob container named **$logs** in your storage account. This container does not show up if you list all the blob containers in your account but you can see its contents if you access it directly.
-
-In this scenario, you use [Microsoft Message Analyzer](/message-analyzer/microsoft-message-analyzer-operating-guide) to interact with your Azure storage account.
-
-### Download Microsoft Message Analyzer
-
-Download [Microsoft Message Analyzer](/message-analyzer/installing-and-upgrading-message-analyzer) and install the application.
-
-Launch the application and choose **File** > **Open** > **From Other File Sources**.
-
-In the **File Selector** dialog, select **+ Add Azure Connection**. Enter in your **storage account name** and **account key** and click **OK**.
-
-![Microsoft Message Analyzer - Add Azure Storage Connection Dialog](media/storage-monitor-troubleshoot-storage-application/figure3.png)
-
-Once you are connected, expand the containers in the storage tree view to view the log blobs. Select the latest log and click **OK**.
-
-![Screenshot that shows the Microsoft Message Analyzer and highlights the selected log file.](media/storage-monitor-troubleshoot-storage-application/figure4.png)
-
-On the **New Session** dialog, click **Start** to view your log.
-
-Once the log opens, you can view the storage events. As you can see from the following image, there was an `SASClientOtherError` triggered on the storage account. For additional information on storage logging, visit [Storage Analytics](../common/storage-analytics.md).
-
-![Microsoft Message Analyzer - Viewing events](media/storage-monitor-troubleshoot-storage-application/figure5.png)
-
-[Storage Explorer](https://azure.microsoft.com/features/storage-explorer/) is another tool that can be used to interact with your storage accounts, including the **$logs** container and the logs that are contained in it.
-
-## Next steps
-
-In part four and the final part of the series, you learned how to monitor and troubleshoot your storage account, such as how to:
-
-> [!div class="checklist"]
-> - Turn on logging and metrics
-> - Enable alerts for authorization errors
-> - Run test traffic with incorrect SAS tokens
-> - Download and analyze logs
-
-Follow this link to see pre-built storage samples.
-
-> [!div class="nextstepaction"]
-> [Azure storage script samples](storage-samples-blobs-cli.md)
storage Storage Secure Access Application https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-secure-access-application.md
- Title: Secure access to application data-
-description: Use SAS tokens, encryption and HTTPS to secure your application's data in the cloud.
------ Previously updated : 06/10/2020-----
-# Secure access to application data
-
-This tutorial is part three of a series. You learn how to secure access to the storage account.
-
-In part three of the series, you learn how to:
-
-> [!div class="checklist"]
-> - Use SAS tokens to access thumbnail images
-> - Turn on server-side encryption
-> - Enable HTTPS-only transport
-
-[Azure blob storage](../common/storage-introduction.md#blob-storage) provides a robust service to store files for applications. This tutorial extends [the previous topic][previous-tutorial] to show how to secure access to your storage account from a web application. When you're finished, the images are encrypted and the web app uses secure SAS tokens to access the thumbnail images.
-
-## Prerequisites
-
-To complete this tutorial, you must have completed the previous Storage tutorial: [Automate resizing uploaded images using Event Grid][previous-tutorial].
-
-## Set container public access
-
-In this part of the tutorial series, SAS tokens are used for accessing the thumbnails. In this step, you set the public access of the *thumbnails* container to `off`.
-
-# [PowerShell](#tab/azure-powershell)
-
-```powershell
-$blobStorageAccount="<blob_storage_account>"
-
-$blobStorageAccountKey=(Get-AzStorageAccountKey -ResourceGroupName myResourceGroup -AccountName $blobStorageAccount).Key1
-
-Set-AzStorageAccount -ResourceGroupName "MyResourceGroup" -AccountName $blobStorageAccount -KeyName $blobStorageAccountKey -AllowBlobPublicAccess $false
-```
-
-# [Azure CLI](#tab/azure-cli)
-
-```azurecli
-blobStorageAccount="<blob_storage_account>"
-
-blobStorageAccountKey=$(az storage account keys list -g myResourceGroup \
- --account-name $blobStorageAccount --query [0].value --output tsv)
-
-az storage container set-permission \
- --account-name $blobStorageAccount \
- --account-key $blobStorageAccountKey \
- --name thumbnails \
- --public-access off
-```
---
-## Configure SAS tokens for thumbnails
-
-In part one of this tutorial series, the web application was showing images from a public container. In this part of the series, you use shared access signatures (SAS) tokens to retrieve the thumbnail images. SAS tokens allow you to provide restricted access to a container or blob based on IP, protocol, time interval, or rights allowed. For more information about SAS, see [Grant limited access to Azure Storage resources using shared access signatures (SAS)](../common/storage-sas-overview.md).
-
-In this example, the source code repository uses the `sasTokens` branch, which has an updated code sample. Delete the existing GitHub deployment with the [az webapp deployment source delete](/cli/azure/webapp/deployment/source) command. Next, configure GitHub deployment to the web app with the [az webapp deployment source config](/cli/azure/webapp/deployment/source) command.
-
-In the following command, `<web-app>` is the name of your web app.
-
-```azurecli
-az webapp deployment source delete --name <web-app> --resource-group myResourceGroup
-
-az webapp deployment source config --name <web_app> \
- --resource-group myResourceGroup --branch sasTokens --manual-integration \
- --repo-url https://github.com/Azure-Samples/storage-blob-upload-from-webapp
-```
-
-```powershell
-az webapp deployment source delete --name <web-app> --resource-group myResourceGroup
-
-az webapp deployment source config --name <web_app> `
- --resource-group myResourceGroup --branch sasTokens --manual-integration `
- --repo-url https://github.com/Azure-Samples/storage-blob-upload-from-webapp
-```
-
-The `sasTokens` branch of the repository updates the `StorageHelper.cs` file. It replaces the `GetThumbNailUrls` task with the code example below. The updated task retrieves the thumbnail URLs by using a [BlobSasBuilder](/dotnet/api/azure.storage.sas.blobsasbuilder) to specify the start time, expiry time, and permissions for the SAS token. Once deployed, the web app retrieves the thumbnails with a URL that uses a SAS token. The updated task is shown in the following example:
-
-```csharp
-public static async Task<List<string>> GetThumbNailUrls(AzureStorageConfig _storageConfig)
-{
- List<string> thumbnailUrls = new List<string>();
-
- // Create a URI to the storage account
- Uri accountUri = new Uri("https://" + _storageConfig.AccountName + ".blob.core.windows.net/");
-
- // Create BlobServiceClient from the account URI
- BlobServiceClient blobServiceClient = new BlobServiceClient(accountUri);
-
- // Get reference to the container
- BlobContainerClient container = blobServiceClient.GetBlobContainerClient(_storageConfig.ThumbnailContainer);
-
- if (container.Exists())
- {
- // Set the expiration time and permissions for the container.
- // In this case, the start time is specified as a few
- // minutes in the past, to mitigate clock skew.
- // The shared access signature will be valid immediately.
- BlobSasBuilder sas = new BlobSasBuilder
- {
- Resource = "c",
- BlobContainerName = _storageConfig.ThumbnailContainer,
- StartsOn = DateTimeOffset.UtcNow.AddMinutes(-5),
- ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
- };
-
- sas.SetPermissions(BlobContainerSasPermissions.All);
-
- // Create StorageSharedKeyCredentials object by reading
- // the values from the configuration (appsettings.json)
- StorageSharedKeyCredential storageCredential =
- new StorageSharedKeyCredential(_storageConfig.AccountName, _storageConfig.AccountKey);
-
- // Create a SAS URI to the storage account
- UriBuilder sasUri = new UriBuilder(accountUri);
- sasUri.Query = sas.ToSasQueryParameters(storageCredential).ToString();
-
- foreach (BlobItem blob in container.GetBlobs())
- {
- // Create the URI using the SAS query token.
- string sasBlobUri = container.Uri + "/" +
- blob.Name + sasUri.Query;
-
- //Return the URI string for the container, including the SAS token.
- thumbnailUrls.Add(sasBlobUri);
- }
- }
- return await Task.FromResult(thumbnailUrls);
-}
-```
-
-The following classes, properties, and methods are used in the preceding task:
-
-| Class | Properties | Methods |
-|-|||
-|[StorageSharedKeyCredential](/dotnet/api/azure.storage.storagesharedkeycredential) | | |
-|[BlobServiceClient](/dotnet/api/azure.storage.blobs.blobserviceclient) | |[GetBlobContainerClient](/dotnet/api/azure.storage.blobs.blobserviceclient.getblobcontainerclient) |
-|[BlobContainerClient](/dotnet/api/azure.storage.blobs.blobcontainerclient) | [Uri](/dotnet/api/azure.storage.blobs.blobcontainerclient.uri) |[Exists](/dotnet/api/azure.storage.blobs.blobcontainerclient.exists) <br> [GetBlobs](/dotnet/api/azure.storage.blobs.blobcontainerclient.getblobs) |
-|[BlobSasBuilder](/dotnet/api/azure.storage.sas.blobsasbuilder) | | [SetPermissions](/dotnet/api/azure.storage.sas.blobsasbuilder.setpermissions) <br> [ToSasQueryParameters](/dotnet/api/azure.storage.sas.blobsasbuilder.tosasqueryparameters) |
-|[BlobItem](/dotnet/api/azure.storage.blobs.models.blobitem) | [Name](/dotnet/api/azure.storage.blobs.models.blobitem.name) | |
-|[UriBuilder](/dotnet/api/system.uribuilder) | [Query](/dotnet/api/system.uribuilder.query) | |
-|[List](/dotnet/api/system.collections.generic.list-1) | | [Add](/dotnet/api/system.collections.generic.list-1.add) |
-
-## Azure Storage encryption
-
-[Azure Storage encryption](../common/storage-service-encryption.md) helps you protect and safeguard your data by encrypting data at rest and by handling encryption and decryption. All data is encrypted using 256-bit [AES encryption](https://en.wikipedia.org/wiki/Advanced_Encryption_Standard), one of the strongest block ciphers available.
-
-You can choose to have Microsoft manage encryption keys, or you can bring your own keys with customer-managed keys stored in Azure Key Vault or Key Vault Managed Hardware Security Model (HSM) (preview). For more information, see [Customer-managed keys for Azure Storage encryption](../common/customer-managed-keys-overview.md).
-
-Azure Storage encryption automatically encrypts data in all performance tiers (Standard and Premium), all deployment models (Azure Resource Manager and Classic), and all of the Azure Storage services (Blob, Queue, Table, and File).
-
-## Enable HTTPS only
-
-In order to ensure that requests for data to and from a storage account are secure, you can limit requests to HTTPS only. Update the storage account required protocol by using the [az storage account update](/cli/azure/storage/account) command.
-
-```azurecli-interactive
-az storage account update --resource-group myresourcegroup --name <storage-account-name> --https-only true
-```
-
-Test the connection by making a request with `curl` over the `HTTP` protocol.
-
-```azurecli-interactive
-curl http://<storage-account-name>.blob.core.windows.net/<container>/<blob-name> -I
-```
-
-Now that secure transfer is required, you receive the following message:
-
-```
-HTTP/1.1 400 The account being accessed does not support http.
-```
-
-## Next steps
-
-In part three of the series, you learned how to secure access to the storage account, such as how to:
-
-> [!div class="checklist"]
-> - Use SAS tokens to access thumbnail images
-> - Turn on server-side encryption
-> - Enable HTTPS-only transport
-
-Advance to part four of the series to learn how to monitor and troubleshoot a cloud storage application.
-
-> [!div class="nextstepaction"]
-> [Monitor and troubleshoot application cloud application storage](storage-monitor-troubleshoot-storage-application.md)
-
-[previous-tutorial]: ../../event-grid/resize-images-on-storage-blob-upload-event.md?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
storage Storage Upload Process Images https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/storage/blobs/storage-upload-process-images.md
- Title: Upload image data in the cloud with Azure Storage | Microsoft Docs
-description: Use Azure Blob Storage with web apps to store app data to a storage account. This tutorial creates a web app that stores and displays images from Azure storage.
----- Previously updated : 06/24/2020-----
-# Tutorial: Upload image data in the cloud with Azure Storage
-
-This tutorial is part one of a series. In this tutorial, you'll learn how to deploy a web app. The web app uses the Azure Blob Storage client library to upload images to a storage account. When you're finished, you'll have a web app that stores and displays images from Azure storage.
-
-# [.NET v12 SDK](#tab/dotnet)
-
-![Image resizer app in .NET](media/storage-upload-process-images/figure2.png)
-
-# [JavaScript v12 SDK](#tab/javascript)
-
-![Image resizer app in JavaScript](media/storage-upload-process-images/upload-app-nodejs-thumb.png)
---
-In part one of the series, you learn how to:
-
-> [!div class="checklist"]
-
-> - Create a storage account
-> - Create a container and set permissions
-> - Retrieve an access key
-> - Deploy a web app to Azure
-> - Configure app settings
-> - Interact with the web app
-
-## Prerequisites
-
-To complete this tutorial, you need an Azure subscription. Create a [free account](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio) before you begin.
--
-To install and use the CLI locally, run Azure CLI version 2.0.4 or later. Run `az --version` to find the version. If you need to install or upgrade, see [Install the Azure CLI](/cli/azure/install-azure-cli).
-
-## Create a resource group
-
-The following example creates a resource group named `myResourceGroup`.
-
-# [PowerShell](#tab/azure-powershell)
-
-Create a resource group with the [New-AzResourceGroup](/powershell/module/az.resources/new-azresourcegroup) command. An Azure resource group is a logical container into which Azure resources are deployed and managed.
-
-```powershell
-New-AzResourceGroup -Name myResourceGroup -Location southeastasia
-```
-
-# [Azure CLI](#tab/azure-cli)
-
-Create a resource group with the [az group create](/cli/azure/group) command. An Azure resource group is a logical container into which Azure resources are deployed and managed.
-
-```azurecli
-az group create --name myResourceGroup --location southeastasia
-```
---
-## Create a storage account
-
-The sample uploads images to a blob container in an Azure storage account. A storage account provides a unique namespace to store and access your Azure storage data objects.
-
-> [!IMPORTANT]
-> In part 2 of the tutorial, you use Azure Event Grid with Blob storage. Make sure to create your storage account in an Azure region that supports Event Grid. For a list of supported regions, see [Azure products by region](https://azure.microsoft.com/global-infrastructure/services/?products=event-grid&regions=all).
-
-In the following command, replace the `<blob_storage_account>` placeholder with your own globally unique name for the Blob storage account.
-
-# [PowerShell](#tab/azure-powershell)
-
-Create a storage account in the resource group you created by using the [New-AzStorageAccount](/powershell/module/az.storage/new-azstorageaccount) command.
-
-```powershell
-$blobStorageAccount="<blob_storage_account>"
-
-New-AzStorageAccount -ResourceGroupName myResourceGroup -Name $blobStorageAccount -SkuName Standard_LRS -Location southeastasia -Kind StorageV2 -AccessTier Hot
-```
-
-# [Azure CLI](#tab/azure-cli)
-
-Create a storage account in the resource group you created by using the [az storage account create](/cli/azure/storage/account) command.
-
-```azurecli
-blobStorageAccount="<blob_storage_account>"
-
-az storage account create --name $blobStorageAccount --location southeastasia \
- --resource-group myResourceGroup --sku Standard_LRS --kind StorageV2 --access-tier hot
-```
---
-## Create Blob storage containers
-
-The app uses two containers in the Blob storage account. Containers are similar to folders and store blobs. The *images* container is where the app uploads full-resolution images. In a later part of the series, an Azure function app uploads resized image thumbnails to the *thumbnails* container.
-
-The *images* container's public access is set to `off`. The *thumbnails* container's public access is set to `container`. The `container` public access setting permits users who visit the web page to view the thumbnails.
-
-# [PowerShell](#tab/azure-powershell)
-
-Get the storage account key by using the [Get-AzStorageAccountKey](/powershell/module/az.storage/get-azstorageaccountkey) command. Then, use this key to create two containers with the [New-AzStorageContainer](/powershell/module/az.storage/new-azstoragecontainer) command.
-
-```powershell
-$blobStorageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName myResourceGroup -Name $blobStorageAccount).Value[0]
-$blobStorageContext = New-AzStorageContext -StorageAccountName $blobStorageAccount -StorageAccountKey $blobStorageAccountKey
-
-New-AzStorageContainer -Name images -Context $blobStorageContext
-New-AzStorageContainer -Name thumbnails -Permission Container -Context $blobStorageContext
-```
-
-# [Azure CLI](#tab/azure-cli)
-
-Get the storage account key by using the [az storage account keys list](/cli/azure/storage/account/keys) command. Then, use this key to create two containers with the [az storage container create](/cli/azure/storage/container) command.
-
-```azurecli
-blobStorageAccountKey=$(az storage account keys list -g myResourceGroup \
- -n $blobStorageAccount --query "[0].value" --output tsv)
-
-az storage container create --name images \
- --account-name $blobStorageAccount \
- --account-key $blobStorageAccountKey
-
-az storage container create --name thumbnails \
- --account-name $blobStorageAccount \
- --account-key $blobStorageAccountKey --public-access container
-```
---
-Make a note of your Blob storage account name and key. The sample app uses these settings to connect to the storage account to upload the images.
-
-## Create an App Service plan
-
-An [App Service plan](../../app-service/overview-hosting-plans.md) specifies the location, size, and features of the web server farm that hosts your app.
-
-The following example creates an App Service plan named `myAppServicePlan` in the **Free** pricing tier:
-
-# [PowerShell](#tab/azure-powershell)
-
-Create an App Service plan with the [New-AzAppServicePlan](/powershell/module/az.websites/new-azappserviceplan) command.
-
-```powershell
-New-AzAppServicePlan -ResourceGroupName myResourceGroup -Name myAppServicePlan -Tier "Free"
-```
-
-# [Azure CLI](#tab/azure-cli)
-
-Create an App Service plan with the [az appservice plan create](/cli/azure/appservice/plan) command.
-
-```azurecli
-az appservice plan create --name myAppServicePlan --resource-group myResourceGroup --sku Free
-```
---
-## Create a web app
-
-The web app provides a hosting space for the sample app code that's deployed from the GitHub sample repository.
-
-In the following command, replace `<web_app>` with a unique name. Valid characters are `a-z`, `0-9`, and `-`. If `<web_app>` isn't unique, you get the error message: *Website with given name `<web_app>` already exists.* The default URL of the web app is `https://<web_app>.azurewebsites.net`.
-
-# [PowerShell](#tab/azure-powershell)
-
-Create a [web app](../../app-service/overview.md) in the `myAppServicePlan` App Service plan with the [New-AzWebApp](/powershell/module/az.websites/new-azwebapp) command.
-
-```powershell
-$webapp="<web_app>"
-
-New-AzWebApp -ResourceGroupName myResourceGroup -Name $webapp -AppServicePlan myAppServicePlan
-```
-
-# [Azure CLI](#tab/azure-cli)
-
-Create a [web app](../../app-service/overview.md) in the `myAppServicePlan` App Service plan with the [az webapp create](/cli/azure/webapp) command.
-
-```azurecli
-webapp="<web_app>"
-
-az webapp create --name $webapp --resource-group myResourceGroup --plan myAppServicePlan
-```
---
-## Deploy the sample app from the GitHub repository
-
-# [.NET v12 SDK](#tab/dotnet)
-
-App Service supports several ways to deploy content to a web app. In this tutorial, you deploy the web app from a [public GitHub sample repository](https://github.com/Azure-Samples/storage-blob-upload-from-webapp). Configure GitHub deployment to the web app with the [az webapp deployment source config](/cli/azure/webapp/deployment/source) command.
-
-The sample project contains an [ASP.NET MVC](https://www.asp.net/mvc) app. The app accepts an image, saves it to a storage account, and displays images from a thumbnail container. The web app uses the [Azure.Storage](/dotnet/api/azure.storage), [Azure.Storage.Blobs](/dotnet/api/azure.storage.blobs), and [Azure.Storage.Blobs.Models](/dotnet/api/azure.storage.blobs.models) namespaces to interact with the Azure Storage service.
-
-```azurecli
-az webapp deployment source config --name $webapp --resource-group myResourceGroup \
- --branch master --manual-integration \
- --repo-url https://github.com/Azure-Samples/storage-blob-upload-from-webapp
-```
-
-```powershell
-az webapp deployment source config --name $webapp --resource-group myResourceGroup `
- --branch master --manual-integration `
- --repo-url https://github.com/Azure-Samples/storage-blob-upload-from-webapp
-```
-
-# [JavaScript v12 SDK](#tab/javascript)
-
-App Service supports several ways to deploy content to a web app. In this tutorial, you deploy the web app from a [public GitHub sample repository](https://github.com/Azure-Samples/azure-sdk-for-js-storage-blob-stream-nodejs). Configure GitHub deployment to the web app with the [az webapp deployment source config](/cli/azure/webapp/deployment/source) command.
-
-```azurecli
-az webapp deployment source config --name $webapp --resource-group myResourceGroup \
- --branch master --manual-integration \
- --repo-url https://github.com/Azure-Samples/azure-sdk-for-js-storage-blob-stream-nodejs
-```
-
-```powershell
-az webapp deployment source config --name $webapp --resource-group myResourceGroup `
- --branch master --manual-integration `
- --repo-url https://github.com/Azure-Samples/azure-sdk-for-js-storage-blob-stream-nodejs
-```
---
-## Configure web app settings
-
-# [.NET v12 SDK](#tab/dotnet)
-
-The sample web app uses the [Azure Storage APIs for .NET](/dotnet/api/overview/azure/storage) to upload images. Storage account credentials are set in the app settings for the web app. Add app settings to the deployed app with the [az webapp config appsettings set](/cli/azure/webapp/config/appsettings) or [Set-AzWebApp](/powershell/module/az.websites/set-azwebapp) command.
-
-```azurecli
-az webapp config appsettings set --name $webapp --resource-group myResourceGroup \
- --settings AzureStorageConfig__AccountName=$blobStorageAccount \
- AzureStorageConfig__ImageContainer=images \
- AzureStorageConfig__ThumbnailContainer=thumbnails \
- AzureStorageConfig__AccountKey=$blobStorageAccountKey
-```
-
-```powershell
-az webapp config appsettings set --name $webapp --resource-group myResourceGroup `
- --settings AzureStorageConfig__AccountName=$blobStorageAccount `
- AzureStorageConfig__ImageContainer=images `
- AzureStorageConfig__ThumbnailContainer=thumbnails `
- AzureStorageConfig__AccountKey=$blobStorageAccountKey
-```
-
-# [JavaScript v12 SDK](#tab/javascript)
-
-The sample web app uses the [Azure Storage client library for JavaScript](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/storage) to upload images. The storage account credentials are set in the app settings for the web app. Add app settings to the deployed app with the [az webapp config appsettings set](/cli/azure/webapp/config/appsettings) or [Set-AzWebApp](/powershell/module/az.websites/set-azwebapp) command.
-
-```azurecli
-az webapp config appsettings set --name $webapp --resource-group myResourceGroup \
- --settings AZURE_STORAGE_ACCOUNT_NAME=$blobStorageAccount \
- AZURE_STORAGE_ACCOUNT_ACCESS_KEY=$blobStorageAccountKey
-```
-
-```powershell
-az webapp config appsettings set --name $webapp --resource-group myResourceGroup `
- --settings AZURE_STORAGE_ACCOUNT_NAME=$blobStorageAccount `
- AZURE_STORAGE_ACCOUNT_ACCESS_KEY=$blobStorageAccountKey
-```
---
-After you deploy and configure the web app, you can test the image upload functionality in the app.
-
-## Upload an image
-
-To test the web app, browse to the URL of your published app. The default URL of the web app is `https://<web_app>.azurewebsites.net`.
-
-# [.NET v12 SDK](#tab/dotnet)
-
-Select the **Upload photos** region to specify and upload a file, or drag a file onto the region. If the upload succeeds, the image disappears from the region. The **Generated Thumbnails** section remains empty until you test it later in this tutorial.
-
-![Upload Photos in .NET](media/storage-upload-process-images/figure1.png)
-
-In the sample code, the `UploadFileToStorage` task in the *Storagehelper.cs* file is used to upload the images to the *images* container within the storage account using the [UploadAsync](/dotnet/api/azure.storage.blobs.blobclient.uploadasync) method. The following code sample contains the `UploadFileToStorage` task.
-
-```csharp
-public static async Task<bool> UploadFileToStorage(Stream fileStream, string fileName,
- AzureStorageConfig _storageConfig)
-{
- // Create a URI to the blob
- Uri blobUri = new Uri("https://" +
- _storageConfig.AccountName +
- ".blob.core.windows.net/" +
- _storageConfig.ImageContainer +
- "/" + fileName);
-
- // Create StorageSharedKeyCredentials object by reading
- // the values from the configuration (appsettings.json)
- StorageSharedKeyCredential storageCredentials =
- new StorageSharedKeyCredential(_storageConfig.AccountName, _storageConfig.AccountKey);
-
- // Create the blob client.
- BlobClient blobClient = new BlobClient(blobUri, storageCredentials);
-
- // Upload the file
- await blobClient.UploadAsync(fileStream);
-
- return await Task.FromResult(true);
-}
-```
-
-The following classes and methods are used in the preceding task:
-
-| Class | Method |
-|-|--|
-| [Uri](/dotnet/api/system.uri) | [Uri constructor](/dotnet/api/system.uri.-ctor) |
-| [StorageSharedKeyCredential](/dotnet/api/azure.storage.storagesharedkeycredential) | [StorageSharedKeyCredential(String, String) constructor](/dotnet/api/azure.storage.storagesharedkeycredential.-ctor) |
-| [BlobClient](/dotnet/api/azure.storage.blobs.blobclient) | [UploadAsync](/dotnet/api/azure.storage.blobs.blobclient.uploadasync) |
-
-# [JavaScript v12 SDK](#tab/javascript)
-
-Select **Choose File** to select a file, then select **Upload Image**. The **Generated Thumbnails** section remains empty until you test it later in this tutorial.
-
-![Upload Photos in Node.js](media/storage-upload-process-images/upload-app-nodejs.png)
-
-In the sample code, the `post` route is responsible for uploading the image into a blob container. The route uses the following modules to help process the upload:
-
-- [multer](https://github.com/expressjs/multer) implements the upload strategy for the route handler.
-- [into-stream](https://github.com/sindresorhus/into-stream) converts the buffer into a stream as required by [uploadStream](/javascript/api/%40azure/storage-blob/blockblobclient#uploadstream-readable--number--number--blockblobuploadstreamoptions-).
-
-As the file is sent to the route, the contents of the file stay in memory until the file is uploaded to the blob container.
-
-> [!IMPORTANT]
-> Loading large files into memory may have a negative effect on your web app's performance. If you expect users to post large files, you may want to consider staging files on the web server file system and then scheduling uploads into Blob storage. Once the files are in Blob storage, you can remove them from the server file system.
-
-```javascript
-if (process.env.NODE_ENV !== 'production') {
- require('dotenv').config();
-}
-
-const {
- BlobServiceClient,
- StorageSharedKeyCredential,
- newPipeline
-} = require('@azure/storage-blob');
-
-const express = require('express');
-const router = express.Router();
-const containerName1 = 'thumbnails';
-const multer = require('multer');
-const inMemoryStorage = multer.memoryStorage();
-const uploadStrategy = multer({ storage: inMemoryStorage }).single('image');
-const getStream = require('into-stream');
-const containerName2 = 'images';
-const ONE_MEGABYTE = 1024 * 1024;
-const uploadOptions = { bufferSize: 4 * ONE_MEGABYTE, maxBuffers: 20 };
-
-const sharedKeyCredential = new StorageSharedKeyCredential(
- process.env.AZURE_STORAGE_ACCOUNT_NAME,
- process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY);
-const pipeline = newPipeline(sharedKeyCredential);
-
-const blobServiceClient = new BlobServiceClient(
- `https://${process.env.AZURE_STORAGE_ACCOUNT_NAME}.blob.core.windows.net`,
- pipeline
-);
-
-const getBlobName = originalName => {
- // Use a random number to generate a unique file name,
- // removing "0." from the start of the string.
- const identifier = Math.random().toString().replace(/0\./, '');
- return `${identifier}-${originalName}`;
-};
-
-router.get('/', async (req, res, next) => {
-
- let viewData;
-
- try {
- const containerClient = blobServiceClient.getContainerClient(containerName1);
- const listBlobsResponse = await containerClient.listBlobFlatSegment();
-
- for await (const blob of listBlobsResponse.segment.blobItems) {
- console.log(`Blob: ${blob.name}`);
- }
-
- viewData = {
- Title: 'Home',
- viewName: 'index',
- accountName: process.env.AZURE_STORAGE_ACCOUNT_NAME,
- containerName: containerName1
- };
-
- if (listBlobsResponse.segment.blobItems.length) {
- viewData.thumbnails = listBlobsResponse.segment.blobItems;
- }
- } catch (err) {
- viewData = {
- Title: 'Error',
- viewName: 'error',
- message: 'There was an error contacting the blob storage container.',
- error: err
- };
- res.status(500);
- } finally {
- res.render(viewData.viewName, viewData);
- }
-});
-
-router.post('/', uploadStrategy, async (req, res) => {
- const blobName = getBlobName(req.file.originalname);
- const stream = getStream(req.file.buffer);
- const containerClient = blobServiceClient.getContainerClient(containerName2);
- const blockBlobClient = containerClient.getBlockBlobClient(blobName);
-
- try {
- await blockBlobClient.uploadStream(stream,
- uploadOptions.bufferSize, uploadOptions.maxBuffers,
- { blobHTTPHeaders: { blobContentType: "image/jpeg" } });
- res.render('success', { message: 'File uploaded to Azure Blob Storage.' });
- } catch (err) {
- res.render('error', { message: err.message });
- }
-});
-
-module.exports = router;
-```
---
-## Verify the image is shown in the storage account
-
-Sign in to the [Azure portal](https://portal.azure.com). From the left menu, select **Storage accounts**, then select the name of your storage account. Select **Containers**, then select the **images** container.
-
-Verify the image is shown in the container.
-
-![Azure portal listing of images container](media/storage-upload-process-images/figure13.png)
-
-## Test thumbnail viewing
-
-To test thumbnail viewing, you'll upload an image to the **thumbnails** container to check whether the app can read from it.
-
-Sign in to the [Azure portal](https://portal.azure.com). From the left menu, select **Storage accounts**, then select the name of your storage account. Select **Containers**, then select the **thumbnails** container. Select **Upload** to open the **Upload blob** pane.
-
-Choose a file with the file picker and select **Upload**.
-
-Navigate back to your app to verify that the image uploaded to the **thumbnails** container is visible.
-
-# [.NET v12 SDK](#tab/dotnet)
-
-![.NET image resizer app with new image displayed](media/storage-upload-process-images/figure2.png)
-
-# [JavaScript v12 SDK](#tab/javascript)
-
-![Node.js image resizer app with new image displayed](media/storage-upload-process-images/upload-app-nodejs-thumb.png)
---
-In part two of the series, you automate thumbnail image creation so you don't need this image. In the **thumbnails** container, select the image you uploaded, and select **Delete** to remove the image.
-
-You can enable Content Delivery Network (CDN) to cache content from your Azure storage account. For more information, see [Integrate an Azure storage account with Azure CDN](../../cdn/cdn-create-a-storage-account-with-cdn.md).
-
-## Next steps
-
-In part one of the series, you learned how to configure a web app to interact with storage.
-
-Go on to part two of the series to learn about using Event Grid to trigger an Azure function to resize an image.
-
-> [!div class="nextstepaction"]
-> [Use Event Grid to trigger an Azure Function to resize an uploaded image](../../event-grid/resize-images-on-storage-blob-upload-event.md?toc=%2fazure%2fstorage%2fblobs%2ftoc.json)
synapse-analytics Restore Sql Pool From Deleted Workspace https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/backuprestore/restore-sql-pool-from-deleted-workspace.md
Previously updated : 03/29/2022- Last updated : 04/11/2022+
In this article, you learn how to restore a dedicated SQL pool in Azure Synapse
## Restore the SQL pool from the dropped workspace

1. Open PowerShell.
2. Connect to your Azure account.
3. Set the context to the subscription that contains the workspace that was dropped.
4. Specify the approximate datetime the workspace was dropped.
5. Construct the resource ID for the database you wish to recover from the dropped workspace.
6. Restore the database from the dropped workspace.
7. Verify the status of the recovered database as 'online'.
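The numbered steps above can be sketched in PowerShell. This is a minimal, hedged sketch, not the article's verbatim script: the placeholder names are assumptions, and the `-FromDroppedSqlPool`/`-DeletionDate` parameter set of `Restore-AzSynapseSqlPool` should be verified against your installed Az.Synapse module version before use.

```powershell
$SubscriptionID="<YourSubscriptionID>"
$ResourceGroupName="<YourResourceGroupName>"
$WorkspaceName="<YourWorkspaceName>"
$DatabaseName="<YourDatabaseName>"
$TargetResourceGroupName="<YourTargetResourceGroupName>"
$TargetWorkspaceName="<YourTargetWorkspaceName>"
$TargetDatabaseName="<YourDatabaseName>"

Connect-AzAccount
Set-AzContext -SubscriptionId $SubscriptionID

# Approximate point in time the workspace was dropped
$PointInTime="<DroppedDateTime>"

# Construct the resource ID for the database to recover.
# Note the Microsoft.Sql provider segment in the ID.
$SourceDatabaseID = "/subscriptions/$SubscriptionID/resourceGroups/$ResourceGroupName/providers/Microsoft.Sql/servers/$WorkspaceName/databases/$DatabaseName"

# Restore the database from the dropped workspace to the target workspace
$RestoredDatabase = Restore-AzSynapseSqlPool -FromDroppedSqlPool -DeletionDate $PointInTime `
    -TargetSqlPoolName $TargetDatabaseName -ResourceGroupName $TargetResourceGroupName `
    -WorkspaceName $TargetWorkspaceName -ResourceId $SourceDatabaseID

# Verify the status of the recovered database
$RestoredDatabase.status
```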
synapse-analytics Restore Sql Pool https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/backuprestore/restore-sql-pool.md
Title: Restore an existing dedicated SQL pool description: How-to guide for restoring an existing dedicated SQL pool.--++ Previously updated : 10/29/2020-- Last updated : 04/11/2022++ # Restore an existing dedicated SQL pool
-In this article, you learn how to restore an existing dedicated SQL pool in Azure Synapse Analytics using Azure portal and Synapse Studio. This article applies to both restores and geo-restores.
+In this article, you learn how to restore an existing dedicated SQL pool in Azure Synapse Analytics using Azure portal, Synapse Studio, and PowerShell. This article applies to both restores and geo-restores.
## Restore an existing dedicated SQL pool through the Synapse Studio
In this article, you learn how to restore an existing dedicated SQL pool in Azur
5. Select **Review + Create**.
+## Restore an existing dedicated SQL pool through PowerShell
+
+1. Open PowerShell.
+
+2. Connect to your Azure account and list all the subscriptions associated with your account.
+
+3. Select the subscription that contains the SQL pool to be restored.
+
+4. List the restore points for the dedicated SQL pool.
+
+5. Pick the desired restore point using the RestorePointCreationDate.
+
+6. Restore the dedicated SQL pool to the desired restore point using [Restore-AzSynapseSqlPool](/powershell/module/az.synapse/restore-azsynapsesqlpool?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json) PowerShell cmdlet.
+
+ 1. To restore the dedicated SQL pool to a different workspace, make sure to specify the other workspace name. This workspace can also be in a different resource group and region.
+ 2. To restore to a different subscription, see the [below section](#restore-an-existing-dedicated-sql-pool-to-a-different-subscription-through-powershell).
+
+7. Verify that the restored dedicated SQL pool is online.
+
+```powershell
+
+$SubscriptionName="<YourSubscriptionName>"
+$ResourceGroupName="<YourResourceGroupName>"
+$WorkspaceName="<YourWorkspaceNameWithoutURLSuffixSeeNote>" # Without sql.azuresynapse.net
+#$TargetResourceGroupName="<YourTargetResourceGroupName>" # uncomment to restore to a different workspace.
+#$TargetWorkspaceName="<YourtargetWorkspaceNameWithoutURLSuffixSeeNote>"
+$SQLPoolName="<YourDatabaseName>"
+$NewSQLPoolName="<YourDatabaseName>"
+
+Connect-AzAccount
+Get-AzSubscription
+Select-AzSubscription -SubscriptionName $SubscriptionName
+
+# list all restore points
+Get-AzSynapseSqlPoolRestorePoint -ResourceGroupName $ResourceGroupName -WorkspaceName $WorkspaceName -Name $SQLPoolName
+# Pick desired restore point using RestorePointCreationDate "xx/xx/xxxx xx:xx:xx xx"
+$PointInTime="<RestorePointCreationDate>"
+
+# Get the specific SQL pool to restore
+$SQLPool = Get-AzSynapseSqlPool -ResourceGroupName $ResourceGroupName -WorkspaceName $WorkspaceName -Name $SQLPoolName
+# Transform Synapse SQL pool resource ID to SQL database ID because currently the restore command only accepts the SQL database ID format.
+$DatabaseID = $SQLPool.Id -replace "Microsoft.Synapse", "Microsoft.Sql" `
+ -replace "workspaces", "servers" `
+ -replace "sqlPools", "databases"
+
+# Restore database from a restore point
+$RestoredDatabase = Restore-AzSynapseSqlPool -FromRestorePoint -RestorePoint $PointInTime -ResourceGroupName $SQLPool.ResourceGroupName `
+    -WorkspaceName $SQLPool.WorkspaceName -TargetSqlPoolName $NewSQLPoolName -ResourceId $DatabaseID -PerformanceLevel DW100c
+
+# Use the following command to restore to a different workspace
+#$TargetResourceGroupName = $SQLPool.ResourceGroupName # for restoring to different workspace in same resourcegroup
+#$RestoredDatabase = Restore-AzSynapseSqlPool -FromRestorePoint -RestorePoint $PointInTime -ResourceGroupName $TargetResourceGroupName `
+#    -WorkspaceName $TargetWorkspaceName -TargetSqlPoolName $NewSQLPoolName -ResourceId $DatabaseID -PerformanceLevel DW100c
+
+# Verify the status of restored database
+$RestoredDatabase.status
+```
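The resource-ID rewrite in the script above is a plain string substitution. As an illustration (the subscription ID and resource names here are hypothetical), the same mapping in shell:

```shell
id="/subscriptions/0000/resourceGroups/rg1/providers/Microsoft.Synapse/workspaces/ws1/sqlPools/pool1"
# Rewrite the Synapse SQL pool ID into the SQL database ID format
sqlid=$(printf '%s' "$id" | sed -e 's|Microsoft.Synapse|Microsoft.Sql|' -e 's|workspaces|servers|' -e 's|sqlPools|databases|')
printf '%s\n' "$sqlid"
# → /subscriptions/0000/resourceGroups/rg1/providers/Microsoft.Sql/servers/ws1/databases/pool1
```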
+
+## Restore an existing dedicated SQL pool to a different subscription through PowerShell
+When performing a cross-subscription restore, a Synapse workspace dedicated SQL pool can only be restored to a standalone dedicated SQL pool (formerly SQL DW). The PowerShell below is similar to the above; however, there are three main differences:
+- After retrieving the SQL Pool object to be restored, the subscription context needs to be switched to the destination (or target) subscription name.
+- When performing the restore, use the Az.Sql modules instead of the Az.Synapse modules.
+- To restore the dedicated SQL pool to a Synapse workspace in the destination subscription, an additional restore step is needed.
+
+Steps:
+
+1. Open PowerShell.
+
+2. Update the Az.Sql module to 3.8.0 (or greater) if needed.
+
+3. Connect to your Azure account and list all the subscriptions associated with your account.
+
+4. Select the subscription that contains the SQL pool to be restored.
+
+5. List the restore points for the dedicated SQL pool.
+
+6. Pick the desired restore point using the RestorePointCreationDate.
+
+7. Select the destination subscription in which the SQL pool should be restored.
+
+8. Restore the dedicated SQL pool to the desired restore point using [Restore-AzSqlDatabase](/powershell/module/az.sql/restore-azsqldatabase?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json) PowerShell cmdlet.
+
+9. Verify that the restored dedicated SQL pool (formerly SQL DW) is online.
+
+10. If the desired destination is a Synapse Workspace, uncomment the code to perform the additional restore step.
+ 1. Create a restore point for the newly created data warehouse.
+ 2. Retrieve the last restore point created by using the "Select -Last 1" syntax.
+ 3. Perform the restore to the desired Synapse workspace.
+
+```powershell
+$SourceSubscriptionName="<YourSubscriptionName>"
+$SourceResourceGroupName="<YourResourceGroupName>"
+$SourceWorkspaceName="<YourServerNameWithoutURLSuffixSeeNote>" # Without sql.azuresynapse.net
+$SourceSQLPoolName="<YourDatabaseName>"
+$TargetSubscriptionName="<YourTargetSubscriptionName>"
+$TargetResourceGroupName="<YourTargetResourceGroupName>"
+$TargetServerName="<YourTargetServerNameWithoutURLSuffixSeeNote>" # Without sql.azuresynapse.net
+$TargetDatabaseName="<YourDatabaseName>"
+#$TargetWorkspaceName="<YourTargetWorkspaceName>" # uncomment if restore to a synapse workspace is required
+
+# Update Az.Sql module to the latest version (3.8.0 or above)
+# Update-Module -Name Az.Sql -RequiredVersion 3.8.0
+
+Connect-AzAccount
+Get-AzSubscription
+Select-AzSubscription -SubscriptionName $SourceSubscriptionName
+
+# list all restore points
+Get-AzSynapseSqlPoolRestorePoint -ResourceGroupName $SourceResourceGroupName -WorkspaceName $SourceWorkspaceName -Name $SourceSQLPoolName
+# Pick desired restore point using RestorePointCreationDate "xx/xx/xxxx xx:xx:xx xx"
+$PointInTime="<RestorePointCreationDate>"
+
+# Get the specific SQL pool to restore
+$SQLPool = Get-AzSynapseSqlPool -ResourceGroupName $SourceResourceGroupName -WorkspaceName $SourceWorkspaceName -Name $SourceSQLPoolName
+# Transform Synapse SQL pool resource ID to SQL database ID because currently the restore command only accepts the SQL database ID format.
+$DatabaseID = $SQLPool.Id -replace "Microsoft.Synapse", "Microsoft.Sql" `
+ -replace "workspaces", "servers" `
+ -replace "sqlPools", "databases"
+
+# Switch context to the destination subscription
+Select-AzSubscription -SubscriptionName $TargetSubscriptionName
+
+# Restore database from a desired restore point of the source database to the target server in the desired subscription
+$RestoredDatabase = Restore-AzSqlDatabase -FromPointInTimeBackup -PointInTime $PointInTime -ResourceGroupName $TargetResourceGroupName `
+    -ServerName $TargetServerName -TargetDatabaseName $TargetDatabaseName -ResourceId $DatabaseID
+
+# Verify the status of restored database
+$RestoredDatabase.status
+
+# uncomment below cmdlets to perform one more restore to push the SQL Pool to an existing workspace in the destination subscription
+# # Create restore point
+# New-AzSqlDatabaseRestorePoint -ResourceGroupName $RestoredDatabase.ResourceGroupName -ServerName $RestoredDatabase.ServerName `
+# -DatabaseName $RestoredDatabase.DatabaseName -RestorePointLabel "UD-001"
+# # Gets the last restore point of the sql dw (will use the RestorePointCreationDate property)
+# $RestorePoint = Get-AzSqlDatabaseRestorePoint -ResourceGroupName $RestoredDatabase.ResourceGroupName -ServerName $RestoredDatabase.ServerName `
+# -DatabaseName $RestoredDatabase.DatabaseName | Select -Last 1
+# # Restore to destination synapse workspace
+# $FinalRestore = Restore-AzSynapseSqlPool -FromRestorePoint -RestorePoint $RestorePoint.RestorePointCreationDate -ResourceGroupName $TargetResourceGroupName `
+#    -WorkspaceName $TargetWorkspaceName -TargetSqlPoolName $TargetDatabaseName -ResourceId $RestoredDatabase.ResourceID -PerformanceLevel DW100c
+
+```
## Next Steps

- [Create a restore point](sqlpool-create-restore-point.md)
synapse-analytics Sql Data Warehouse Restore Active Paused Dw https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/sql-data-warehouse/sql-data-warehouse-restore-active-paused-dw.md
Title: Restore an existing dedicated SQL pool (formerly SQL DW) description: How-to guide for restoring an existing dedicated SQL pool in Azure Synapse Analytics.--++ Previously updated : 11/13/2020-- Last updated : 04/11/2022++
To restore an existing dedicated SQL pool (formerly SQL DW) from a restore point
6. Restore the dedicated SQL pool (formerly SQL DW) to the desired restore point using [Restore-AzSqlDatabase](/powershell/module/az.sql/restore-azsqldatabase?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json) PowerShell cmdlet. 1. To restore the dedicated SQL pool (formerly SQL DW) to a different server, make sure to specify the other server name. This server can also be in a different resource group and region.
- 2. To restore to a different subscription, use the 'Move' button to move the server to another subscription.
+ 2. To restore to a different subscription, see the [below section](#restore-an-existing-dedicated-sql-pool-formerly-sql-dw-to-a-different-subscription-through-powershell).
7. Verify that the restored dedicated SQL pool (formerly SQL DW) is online.
$RestoredDatabase.status
![Automatic Restore Points](./media/sql-data-warehouse-restore-active-paused-dw/restoring-11.png)
+## Restore an existing dedicated SQL pool (formerly SQL DW) to a different subscription through PowerShell
+This guidance is similar to restoring an existing dedicated SQL pool; however, the instructions below show that the [Get-AzSqlDatabase](/powershell/module/az.sql/Get-AzSqlDatabase?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json) PowerShell cmdlet should be run in the source subscription, while the [Restore-AzSqlDatabase](/powershell/module/az.sql/restore-azsqldatabase?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json) PowerShell cmdlet should be run in the destination subscription. The user performing the restore must have the proper permissions in both the source and target subscriptions.
+
+1. Open PowerShell.
+
+2. Update the Az.Sql module to 3.8.0 (or greater) if needed.
+
+3. Connect to your Azure account and list all the subscriptions associated with your account.
+
+4. Select the subscription that contains the database to be restored.
+
+5. List the restore points for the dedicated SQL pool (formerly SQL DW).
+
+6. Pick the desired restore point using the RestorePointCreationDate.
+
+7. Select the destination subscription in which the database should be restored.
+
+8. Restore the dedicated SQL pool (formerly SQL DW) to the desired restore point using [Restore-AzSqlDatabase](/powershell/module/az.sql/restore-azsqldatabase?toc=/azure/synapse-analytics/sql-data-warehouse/toc.json&bc=/azure/synapse-analytics/sql-data-warehouse/breadcrumb/toc.json) PowerShell cmdlet.
+
+9. Verify that the restored dedicated SQL pool (formerly SQL DW) is online.
+
+```powershell
+$SourceSubscriptionName="<YourSubscriptionName>"
+$SourceResourceGroupName="<YourResourceGroupName>"
+$SourceServerName="<YourServerNameWithoutURLSuffixSeeNote>" # Without database.windows.net
+$SourceDatabaseName="<YourDatabaseName>"
+$TargetSubscriptionName="<YourTargetSubscriptionName>"
+$TargetResourceGroupName="<YourTargetResourceGroupName>"
+$TargetServerName="<YourTargetServerNameWithoutURLSuffixSeeNote>" # Without database.windows.net
+$TargetDatabaseName="<YourDatabaseName>"
+
+# Update Az.Sql module to the latest version (3.8.0 or above)
+# Update-Module -Name Az.Sql -RequiredVersion 3.8.0
+
+Connect-AzAccount
+Get-AzSubscription
+Select-AzSubscription -SubscriptionName $SourceSubscriptionName
+
+# Pick desired restore point using RestorePointCreationDate "xx/xx/xxxx xx:xx:xx xx"
+$PointInTime="<RestorePointCreationDate>"
+# Or list all restore points
+Get-AzSqlDatabaseRestorePoint -ResourceGroupName $SourceResourceGroupName -ServerName $SourceServerName -DatabaseName $SourceDatabaseName
+
+# Get the specific database to restore
+$Database = Get-AzSqlDatabase -ResourceGroupName $SourceResourceGroupName -ServerName $SourceServerName -DatabaseName $SourceDatabaseName
+
+# Switch context to the destination subscription
+Select-AzSubscription -SubscriptionName $TargetSubscriptionName
+
+# Restore database from a desired restore point of the source database to the target server in the desired subscription
+$RestoredDatabase = Restore-AzSqlDatabase -FromPointInTimeBackup -PointInTime $PointInTime -ResourceGroupName $TargetResourceGroupName -ServerName $TargetServerName -TargetDatabaseName $TargetDatabaseName -ResourceId $Database.ResourceID
+
+# Verify the status of restored database
+$RestoredDatabase.status
+```
+ ## Next Steps

+- [Restore a deleted dedicated SQL pool (formerly SQL DW)](sql-data-warehouse-restore-deleted-dw.md)
synapse-analytics Sql Data Warehouse Restore From Deleted Server https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/sql-data-warehouse/sql-data-warehouse-restore-from-deleted-server.md
In this article, you learn how to restore a dedicated SQL pool (formerly SQL DW)
## Restore the SQL pool from the deleted server

1. Open PowerShell.
2. Connect to your Azure account.
3. Set the context to the subscription that contains the server that was dropped.
4. Specify the approximate datetime the server was dropped.
5. Construct the resource ID for the database you wish to recover from the dropped server.
6. Restore the database from the dropped server.
7. Verify the status of the recovered database as 'online'.
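The steps above can be sketched in PowerShell roughly as follows. This is a minimal illustration, not the article's exact script: the resource names are placeholders, and the `Restore-AzSqlDatabase -FromDeletedDatabaseBackup` parameter set should be verified against the Az.Sql reference before use.

```powershell
# Placeholder values -- replace with your own
$SubscriptionId = "<YourSubscriptionId>"
$ResourceGroupName = "<ResourceGroupOfDroppedServer>"
$ServerName = "<DroppedServerName>"          # Without database.windows.net
$DatabaseName = "<DatabaseName>"
$TargetServerName = "<TargetServerName>"

Connect-AzAccount
Set-AzContext -SubscriptionId $SubscriptionId

# Approximate datetime the server was dropped
$DroppedDateTime = Get-Date -Date "<DateTimeServerWasDropped>"

# Construct the resource ID of the database on the dropped server
$SourceDatabaseId = "/subscriptions/$SubscriptionId/resourceGroups/$ResourceGroupName/providers/Microsoft.Sql/servers/$ServerName/databases/$DatabaseName"

# Restore from the dropped server, then verify the database is online
$RestoredDatabase = Restore-AzSqlDatabase -FromDeletedDatabaseBackup -DeletionDate $DroppedDateTime -ResourceGroupName $ResourceGroupName -ServerName $TargetServerName -TargetDatabaseName $DatabaseName -ResourceId $SourceDatabaseId
$RestoredDatabase.Status
```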
synapse-analytics Resources Self Help Sql On Demand https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/synapse-analytics/sql/resources-self-help-sql-on-demand.md
If you have SAS key that you should use to access files, make sure that you crea
If you are using an Azure AD login without an explicit credential, make sure that your Azure AD identity can access the files on storage. Your Azure AD identity needs to have the Blob Data Reader role or list/read ACL permissions to access the files - see [Query fails because file cannot be opened](#query-fails-because-file-cannot-be-opened).
-If you are accessing storage using [credentials](develop-storage-files-storage-access-control.md#credentials), make sure that your [Managed identity](develop-storage-files-storage-access-control.md?tabs=managed-identity) or [SPN](develop-storage-files-storage-access-control.md?tabs=service-principal) has Data Reader/Contributor role, or ALC permissions. If you have used [SAS token](develop-storage-files-storage-access-control.md?tabs=shared-access-signature) make sure that it has `rl` permission and that it didn't expired.
+If you are accessing storage using [credentials](develop-storage-files-storage-access-control.md#credentials), make sure that your [Managed identity](develop-storage-files-storage-access-control.md?tabs=managed-identity) or [SPN](develop-storage-files-storage-access-control.md?tabs=service-principal) has Data Reader/Contributor role, or ACL permissions. If you have used [SAS token](develop-storage-files-storage-access-control.md?tabs=shared-access-signature) make sure that it has `rl` permission and that it hasn't expired.
If you are using a SQL login and the `OPENROWSET` function [without a data source](develop-storage-files-overview.md#query-files-using-openrowset), make sure that you have a server-level credential that matches the storage URI and has permission to access the storage.

### Query fails because file cannot be opened
-If your query fails with the error 'File cannot be opened because it does not exist or it is used by another process' and you're sure both file exist and it's not used by another process it means serverless SQL pool can't access the file. This problem usually happens because your Azure Active Directory identity doesn't have rights to access the file or because a firewall is blocking access to the file. By default, serverless SQL pool is trying to access the file using your Azure Active Directory identity. To resolve this issue, you need to have proper rights to access the file. Easiest way is to grant yourself 'Storage Blob Data Contributor' role on the storage account you're trying to query.
+If your query fails with the error 'File cannot be opened because it does not exist or it is used by another process' and you're sure that the file exists and isn't used by another process, then serverless SQL pool can't access the file. This problem usually happens because your Azure Active Directory identity doesn't have rights to access the file or because a firewall is blocking access to the file. By default, serverless SQL pool tries to access the file using your Azure Active Directory identity. To resolve this issue, you need proper rights to access the file. The easiest way is to grant yourself the 'Storage Blob Data Contributor' role on the storage account you're trying to query.
- For more information, visit the [full guide on Azure Active Directory access control for storage](../../storage/blobs/assign-azure-role-data-access.md).
- Visit [Control storage account access for serverless SQL pool in Azure Synapse Analytics](develop-storage-files-storage-access-control.md).
If your query fails with the error 'File cannot be opened because it does not ex
Instead of granting Storage Blob Data Contributor, you can also grant more granular permissions on a subset of files.
-* All users that need access to some data in this container also needs to have the EXECUTE permission on all parent folders up to the root (the container).
+* All users that need access to some data in this container also need to have the EXECUTE permission on all parent folders up to the root (the container).
Learn more about [how to set ACLs in Azure Data Lake Storage Gen2](../../storage/blobs/data-lake-storage-explorer-acl.md).

> [!NOTE]
virtual-machines Backup And Disaster Recovery For Azure Iaas Disks https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/backup-and-disaster-recovery-for-azure-iaas-disks.md
Another option to create consistent backups is to shut down the VM and take snap
### Copy the snapshots to another region
-Creation of the snapshots alone might not be sufficient for disaster recovery. You must also copy the snapshots to another region. See [Cross-region snapshot copy](disks-incremental-snapshots.md#cross-region-snapshot-copy).
+Creation of the snapshots alone might not be sufficient for disaster recovery. You must also copy the snapshots to another region. See [Copy an incremental snapshot to a new region](disks-copy-incremental-snapshot-across-regions.md).
## Other options
virtual-machines Disks Copy Incremental Snapshot Across Regions https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/disks-copy-incremental-snapshot-across-regions.md
+
+ Title: Copy a snapshot to a new region
+description: Learn how to copy an incremental snapshot of a managed disk to a different region.
+Last updated : 04/11/2022
+ms.devlang: azurecli
++
+# Copy an incremental snapshot to a new region
+
+Incremental snapshots can be copied to any region. The process is managed by Azure, removing the maintenance overhead of staging a storage account in the target region. Azure ensures that only the changes since the last snapshot in the target region are copied, reducing both the data footprint and the recovery point objective. You can check the progress of the copy so you know when a target snapshot is ready to restore disks in the target region. You're charged only for the bandwidth cost of the data transfer across the region and the read transactions on the source snapshots.
+
+This article covers copying an incremental snapshot from one region to another. See [Create an incremental snapshot for managed disks](disks-incremental-snapshots.md) for conceptual details on incremental snapshots.
++
+## Restrictions
+
+- You can copy up to 100 incremental snapshots in parallel per subscription per region.
+- If you use the REST API, you must use version 2020-12-01 or newer of the Azure Compute REST API.
+
+## Get started
+
+# [Azure CLI](#tab/azure-cli)
+
+You can use the Azure CLI to copy an incremental snapshot. You will need the latest version of the Azure CLI. See the following articles to learn how to either [install](/cli/azure/install-azure-cli) or [update](/cli/azure/update-azure-cli) the Azure CLI.
+
+The following script will copy an incremental snapshot from one region to another:
+
+```azurecli
+subscriptionId=<yourSubscriptionID>
+resourceGroupName=<yourResourceGroupName>
+sourceSnapshotName=<yourSourceSnapshotName>
+targetSnapshotName=<yourTargetSnapshotName>
+targetRegion=<validRegion>
+
+sourceSnapshotId=$(az snapshot show -n $sourceSnapshotName -g $resourceGroupName --query [id] -o tsv)
+
+az snapshot create -g $resourceGroupName -n $targetSnapshotName -l $targetRegion --source $sourceSnapshotId --incremental --copy-start
+
+# Check the copy progress on the target snapshot
+az snapshot show -n $targetSnapshotName -g $resourceGroupName --query [completionPercent] -o tsv
+```
+
+# [Azure PowerShell](#tab/azure-powershell)
+
+You can use the Azure PowerShell module to copy an incremental snapshot. You will need the latest version of the Azure PowerShell module. The following command will either install it or update your existing installation to the latest version:
+
+```PowerShell
+Install-Module -Name Az -AllowClobber -Scope CurrentUser
+```
+
+Once that is installed, sign in to your PowerShell session with `Connect-AzAccount`.
+
+The following script will copy an incremental snapshot from one region to another.
+
+```azurepowershell
+$subscriptionId="yourSubscriptionIdHere"
+$resourceGroupName="yourResourceGroupNameHere"
+$sourceSnapshotName="yourSourceSnapshotNameHere"
+$targetSnapshotName="yourTargetSnapshotNameHere"
+$targetRegion="desiredRegion"
+
+Set-AzContext -Subscription $subscriptionId
+
+$sourceSnapshot=Get-AzSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $sourceSnapshotName
+
+$snapshotconfig = New-AzSnapshotConfig -Location $targetRegion -CreateOption CopyStart -Incremental -SourceResourceId $sourceSnapshot.Id
+
+New-AzSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $targetSnapshotName -Snapshot $snapshotconfig
+
+$targetSnapshot=Get-AzSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $targetSnapshotName
+
+$targetSnapshot.CompletionPercent
+```
+
+# [Portal](#tab/azure-portal)
+
+You can also copy an incremental snapshot across regions in the Azure portal. However, for now, you must use this specific link to access the portal: https://aka.ms/incrementalsnapshot
+
+1. Sign in to the [Azure portal](https://aka.ms/incrementalsnapshot) and navigate to the incremental snapshot you'd like to migrate.
+1. Select **Copy snapshot**.
+
+ :::image type="content" source="media/disks-incremental-snapshots/disks-copy-snapshot.png" alt-text="Screenshot of snapshot overview, copy snapshot highlighted." lightbox="media/disks-incremental-snapshots/disks-copy-snapshot.png":::
+
+1. For **Snapshot type** under **Instance details** select **Incremental**.
+1. Change the **Region** to the region you'd like to copy the snapshot to.
+
+ :::image type="content" source="media/disks-incremental-snapshots/disks-copy-snapshot-region-select.png" alt-text="Screenshot of copy snapshot experience, new region selected, incremental selected." lightbox="media/disks-incremental-snapshots/disks-copy-snapshot-region-select.png":::
+
+1. Select **Review + Create** and then **Create**.
+
+# [Resource Manager Template](#tab/azure-resource-manager)
+
+You can also use Azure Resource Manager templates to copy an incremental snapshot. You must use version **2020-12-01** or newer of the Azure Compute REST API. The following snippet is an example of how to copy an incremental snapshot across regions with Resource Manager templates:
+
+```json
+{
+ "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
+ "contentVersion": "1.0.0.0",
+ "parameters": {
+ "name": {
+ "defaultValue": "isnapshot1",
+ "type": "String"
+ },
+ "sourceSnapshotResourceId": {
+ "defaultValue": "<your_incremental_snapshot_resource_ID>",
+ "type": "String"
+ },
+ "skuName": {
+ "defaultValue": "Standard_LRS",
+ "type": "String"
+ },
+ "targetRegion": {
+ "defaultValue": "desired_region",
+ "type": "String"
+ }
+ },
+ "variables": {},
+ "resources": [
+ {
+ "type": "Microsoft.Compute/snapshots",
+ "sku": {
+ "name": "[parameters('skuName')]",
+ "tier": "Standard"
+ },
+ "name": "[parameters('name')]",
+ "apiVersion": "2020-12-01",
+ "location": "[parameters('targetRegion')]",
+ "scale": null,
+ "properties": {
+ "creationData": {
+ "createOption": "CopyStart",
+ "sourceResourceId": "[parameters('sourceSnapshotResourceId')]"
+ },
+ "incremental": true
+ },
+ "dependsOn": []
+ }
+ ]
+}
+
+```
++
+## Next steps
+
+If you'd like to see sample code demonstrating the differential capability of incremental snapshots, using .NET, see [Copy Azure Managed Disks backups to another region with differential capability of incremental snapshots](https://github.com/Azure-Samples/managed-disks-dotnet-backup-with-incremental-snapshots).
virtual-machines Disks Incremental Snapshots https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/disks-incremental-snapshots.md
description: Learn about incremental snapshots for managed disks, including how
Previously updated : 03/30/2022 Last updated : 04/11/2022
You can also use Azure Resource Manager templates to create an incremental snaps
```
-## Cross-region snapshot copy
-
-You can copy incremental snapshots to any region of your choice. Azure manages the copy process removing the maintenance overhead of managing the copy process by staging a storage account in the target region. Moreover, Azure ensures that only changes since the last snapshot in the target region are copied to the target region to reduce the data footprint, reducing the recovery point objective. You can check the progress of the copy so you can know when a target snapshot is ready to restore disks in the target region. Customers are charged only for the bandwidth cost of the data transfer across the region and the read transactions on the source snapshots.
--
-### Restrictions
--- You can copy 100 incremental snapshots in parallel at the same time per subscription per region.-- If you use the REST API, you must use version 2020-12-01 or newer of the Azure Compute REST API.-
-### Get started
-
-# [Azure CLI](#tab/azure-cli)
-
-You can use the Azure CLI to copy an incremental snapshot. You will need the latest version of the Azure CLI. See the following articles to learn how to either [install](/cli/azure/install-azure-cli) or [update](/cli/azure/update-azure-cli) the Azure CLI.
-
-The following script will copy an incremental snapshot from one region to another:
-
-```azurecli
-subscriptionId=<yourSubscriptionID>
-resourceGroupName=<yourResourceGroupName>
-name=<targetSnapshotName>
-sourceSnapshotResourceId=<sourceSnapshotResourceId>
-targetRegion=<validRegion>
-
-sourceSnapshotId=$(az snapshot show -n $sourceSnapshotName -g $resourceGroupName --query [id] -o tsv)
-
-az snapshot create -g $resourceGroupName -n $targetSnapshotName --source $sourceSnapshotId --incremental --copy-start
-
-az snapshot show -n $sourceSnapshotName -g $resourceGroupName --query [completionPercent] -o tsv
-```
-
-# [Azure PowerShell](#tab/azure-powershell)
-
-You can use the Azure PowerShell module to copy an incremental snapshot. You will need the latest version of the Azure PowerShell module. The following command will either install it or update your existing installation to latest:
-
-```PowerShell
-Install-Module -Name Az -AllowClobber -Scope CurrentUser
-```
-
-Once that is installed, login to your PowerShell session with `Connect-AzAccount`.
-
-The following script will copy an incremental snapshot from one region to another.
-
-```azurepowershell
-$subscriptionId="yourSubscriptionIdHere"
-$resourceGroupName="yourResourceGroupNameHere"
-$sourceSnapshotName="yourSourceSnapshotNameHere"
-$targetSnapshotName="yourTargetSnapshotNameHere"
-$targetRegion="desiredRegion"
-
-Set-AzContext -Subscription $subscriptionId
-
-$sourceSnapshot=Get-AzSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $sourceSnapshotName
-
-$snapshotconfig = New-AzSnapshotConfig -Location $targetRegion -CreateOption CopyStart -Incremental -SourceResourceId $sourceSnapshot.Id
-
-New-AzSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $targetSnapshotName -Snapshot $snapshotconfig
-
-$targetSnapshot=Get-AzSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $targetSnapshotName
-
-$targetSnapshot.CompletionPercent
-```
-
-# [Portal](#tab/azure-portal)
-
-You can also copy an incremental snapshot across regions in the Azure portal. However, you must use this specific link to access the portal, for now: https://aka.ms/incrementalsnapshot
-
-1. Sign in to the [Azure portal](https://aka.ms/incrementalsnapshot) and navigate to the incremental snapshot you'd like to migrate.
-1. Select **Copy snapshot**.
-
- :::image type="content" source="media/disks-incremental-snapshots/disks-copy-snapshot.png" alt-text="Screenshot of snapshot overview, copy snapshot highlighted." lightbox="media/disks-incremental-snapshots/disks-copy-snapshot.png":::
-
-1. For **Snapshot type** under **Instance details** select **Incremental**.
-1. Change the **Region** to the region you'd like to copy the snapshot to.
-
- :::image type="content" source="media/disks-incremental-snapshots/disks-copy-snapshot-region-select.png" alt-text="Screenshot of copy snapshot experience, new region selected, incremental selected." lightbox="media/disks-incremental-snapshots/disks-copy-snapshot-region-select.png":::
-
-1. Select **Review + Create** and then **Create**.
-
-# [Resource Manager Template](#tab/azure-resource-manager)
-
-You can also use Azure Resource Manager templates to copy an incremental snapshot. You must use version **2020-12-01** or newer of the Azure Compute REST API. The following snippet is an example of how to copy an incremental snapshot across regions with Resource Manager templates:
-
-```json
-{
- "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
- "contentVersion": "1.0.0.0",
- "parameters": {
- "name": {
- "defaultValue": "isnapshot1",
- "type": "String"
- },
- "sourceSnapshotResourceId": {
- "defaultValue": "<your_incremental_snapshot_resource_ID>",
- "type": "String"
- },
- "skuName": {
- "defaultValue": "Standard_LRS",
- "type": "String"
- },
- "targetRegion": {
- "defaultValue": "desired_region",
- "type": "String"
- }
- },
- "variables": {},
- "resources": [
- {
- "type": "Microsoft.Compute/snapshots",
- "sku": {
- "name": "[parameters('skuName')]",
- "tier": "Standard"
- },
- "name": "[parameters('name')]",
- "apiVersion": "2020-12-01",
- "location": "[parameters('targetRegion')]",
- "scale": null,
- "properties": {
- "creationData": {
- "createOption": "CopyStart",
- "sourceResourceId": "[parameters('sourceSnapshotResourceId')]"
- },
- "incremental": true
- },
- "dependsOn": []
- }
- ]
-}
-
-```
- ## Next steps
+See [Copy an incremental snapshot to a new region](disks-copy-incremental-snapshot-across-regions.md) to learn how to copy an incremental snapshot across regions.
+ If you'd like to see sample code demonstrating the differential capability of incremental snapshots, using .NET, see [Copy Azure Managed Disks backups to another region with differential capability of incremental snapshots](https://github.com/Azure-Samples/managed-disks-dotnet-backup-with-incremental-snapshots).
virtual-machines Azure Hybrid Benefit Byos Linux https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/azure-hybrid-benefit-byos-linux.md
>The below article is scoped to Azure Hybrid Benefit for BYOS VMs (AHB BYOS) which caters to conversion of custom on-prem image VMs and RHEL or SLES BYOS VMs. For conversion of RHEL PAYG or SLES PAYG VMs, refer to [Azure Hybrid Benefit for PAYG VMs here](./azure-hybrid-benefit-linux.md). >[!NOTE]
->Azure Hybrid Benefit for BYOS VMs is planned for Preview from **30 March 2022**. You can [sign up for the preview here.](https://aka.ms/ahb-linux-form) You will receive a mail from Microsoft once your subscriptions are enabled for Preview.
+>Azure Hybrid Benefit for BYOS VMs is in Preview now. You can [sign up for the preview here.](https://aka.ms/ahb-linux-form) You will receive a mail from Microsoft once your subscriptions are enabled for Preview.
Azure Hybrid Benefit for BYOS VMs is a licensing benefit that helps you to get software updates and integrated support for Red Hat Enterprise Linux (RHEL) and SUSE Linux Enterprise Server (SLES) virtual machines (VMs) directly from Azure infrastructure. This benefit is available to RHEL and SLES custom on-prem image VMs (VMs generated from on-prem images), and to RHEL and SLES Marketplace bring-your-own-subscription (BYOS) VMs.
virtual-machines N Series Driver Setup https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/linux/n-series-driver-setup.md
If the driver is installed, you will see output similar to the following. Note t
### X11 server
-If you need an X11 server for remote connections to an NV or NVv2 VM, [x11vnc](http://www.karlrunge.com/x11vnc/) is recommended because it allows hardware acceleration of graphics. The BusID of the M60 device must be manually added to the X11 configuration file (usually, `etc/X11/xorg.conf`). Add a `"Device"` section similar to the following:
+If you need an X11 server for remote connections to an NV or NVv2 VM, [x11vnc](https://wiki.archlinux.org/title/X11vnc) is recommended because it allows hardware acceleration of graphics. The BusID of the M60 device must be manually added to the X11 configuration file (usually, `etc/X11/xorg.conf`). Add a `"Device"` section similar to the following:
``` Section "Device"
virtual-machines Redhat In Place Upgrade https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-machines/workloads/redhat/redhat-in-place-upgrade.md
>[!Important] > Take a snapshot of the image before you start the upgrade as a precaution.
-## Upgrade process for RHEL 7 regular VMs
+## What is RHEL in-place upgrade?
+During an in-place upgrade, the earlier RHEL OS major version will be replaced with the new RHEL OS major version without removing the earlier version first. The installed applications and utilities, along with the configurations and preferences, are incorporated into the new version.
++
+## Upgrade from RHEL 7 VMs to RHEL 8 VMs
Instructions for an in-place upgrade from Red Hat Enterprise Linux 7 VMs to Red Hat Enterprise Linux 8 VMs on Azure are provided in the [Red Hat documentation for upgrading from RHEL 7 to RHEL 8.](https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/8/html-single/upgrading_from_rhel_7_to_rhel_8/index)
-## Upgrade process for RHEL 7 SAP VMs
+## Upgrade SAP environments from RHEL 7 VMs to RHEL 8 VMs
Instructions for an in-place upgrade from Red Hat Enterprise Linux 7 SAP VMs to Red Hat Enterprise Linux 8 SAP VMs on Azure are provided in the [Red Hat documentation for upgrading from RHEL 7 SAP to RHEL 8 SAP.](https://access.redhat.com/solutions/5154031)
virtual-network Nat Metrics https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/nat-gateway/nat-metrics.md
Title: Metrics and alerts for Azure Virtual Network NAT description: Understand Azure Monitor metrics and alerts available for Virtual Network NAT.- # Customer intent: As an IT administrator, I want to understand available Azure Monitor metrics and alerts for Virtual Network NAT. - Previously updated : 03/04/2020 Last updated : 04/12/2022
+# Azure Virtual Network NAT metrics and alerts
-# Azure Virtual Network NAT metrics
+This article provides an overview of all NAT gateway metrics and diagnostic capabilities, along with general guidance on how to use metrics and alerts to monitor, manage, and [troubleshoot](troubleshoot-nat.md) your NAT gateway resource.
-Azure Virtual Network NAT gateway resources provide multi-dimensional metrics. You can use these metrics to observe the operation and for [troubleshooting](troubleshoot-nat.md). Alerts can be configured for critical issues such as SNAT exhaustion.
+Azure Virtual Network NAT gateway provides the following diagnostic capabilities:
+- Multi-dimensional metrics and alerts through Azure Monitor. You can use these metrics to monitor and manage your NAT gateway and to assist you in troubleshooting issues.
+
+- Network Insights: Azure Monitor Insights provides visual tools to view, monitor, and help you diagnose issues with your NAT gateway resource, including a topological map of your Azure setup and metrics dashboards.
+ *Figure: Virtual Network NAT for outbound to Internet*
-## Metrics
+## Metrics overview
NAT gateway resources provide the following multi-dimensional metrics in Azure Monitor:
NAT gateway resources provide the following multi-dimensional metrics in Azure M
| Bytes | Bytes processed inbound and outbound | Sum | Direction (In; Out), Protocol (6 TCP; 17 UDP) |
| Packets | Packets processed inbound and outbound | Sum | Direction (In; Out), Protocol (6 TCP; 17 UDP) |
| Dropped packets | Packets dropped by the NAT gateway | Sum | / |
-| SNAT Connection Count | Number of SNAT connections / State transitions per interval of time | Sum | Connection State, Protocol (6 TCP; 17 UDP) |
-| Total SNAT connection count | Current active SNAT connections (~ SNAT ports currently in use by NAT gateway) | Sum | Protocol (6 TCP; 17 UDP) |
-| Datapath availability (Preview) | Availability of the data path of the NAT gateway. Used to determine whether the NAT gateway endpoints are available for outbound traffic flow. | Avg | Availability (0, 100) |
+| SNAT Connection Count | Number of new SNAT connections over a given interval of time | Sum | Connection State, Protocol (6 TCP; 17 UDP) |
+| Total SNAT connection count | Total number of active SNAT connections (~ SNAT ports currently in use by NAT gateway) | Sum | Protocol (6 TCP; 17 UDP) |
+| Data path availability (Preview) | Availability of the data path of the NAT gateway. Used to determine whether the NAT gateway endpoints are available for outbound traffic flow. | Avg | Availability (0, 100) |
+
+## Where to find my NAT gateway metrics
+
+NAT gateway metrics can be found in the following locations in the Azure portal.
+
+- **Metrics** page under **Monitoring** from a NAT gateway's resource page.
+
+- **Insights** page under **Monitoring** from a NAT gateway's resource page.
+
+ :::image type="content" source="./media/nat-metrics/nat-insights-metrics.png" alt-text="Screenshot of the insights and metrics options in NAT gateway overview.":::
+
+- Azure Monitor page under **Metrics**.
+
+ :::image type="content" source="./media/nat-metrics/azure-monitor.png" alt-text="Screenshot of the metrics section of Azure Monitor.":::
+
+To view any one of your metrics for a given NAT gateway resource:
+
+1. Select the NAT gateway resource you would like to monitor.
+
+2. In the **Metric** drop-down menu, select one of the provided metrics.
+
+3. In the **Aggregation** drop-down menu, select the recommended aggregation listed in the [metrics overview](#metrics-overview) table.
+
+ :::image type="content" source="./media/nat-metrics/nat-metrics-1.png" alt-text="Screenshot of the metrics setup configuration in NAT gateway resource.":::
+
+4. To adjust the time frame over which the chosen metric is presented on the metrics graph or to adjust how frequently the chosen metric is measured, select the **Time** window in the top right corner of the metrics page and make your adjustments.
+
+ :::image type="content" source="./media/nat-metrics/nat-metrics-2.png" alt-text="Screenshot of the metrics time setup configuration in NAT gateway resource.":::
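+As an alternative to the portal, metrics can also be queried programmatically with the Az.Monitor PowerShell module. The sketch below is illustrative only: it assumes the Az.Network and Az.Monitor modules are installed, and the metric name `SNATConnectionCount` is an assumption that should be confirmed against the metric definitions exposed by your NAT gateway resource.
+
+```powershell
+# Look up the NAT gateway to get its resource ID (placeholder names)
+$natGateway = Get-AzNatGateway -ResourceGroupName "<ResourceGroupName>" -Name "<NatGatewayName>"
+
+# Query the SNAT connection count metric, summed per one-minute interval
+# (metric name assumed -- verify with Get-AzMetricDefinition -ResourceId $natGateway.Id)
+Get-AzMetric -ResourceId $natGateway.Id `
+    -MetricName "SNATConnectionCount" `
+    -TimeGrain 00:01:00 `
+    -AggregationType Total
+```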
+
+## How to use NAT gateway metrics
+
+### Bytes
+
+The **Bytes** metric shows you the amount of data going outbound through NAT gateway and returning inbound in response to an outbound connection.
+
+Use this metric to:
+
+- Assess the amount of data being processed through NAT gateway to connect outbound or return inbound.
+
+To view the amount of data sent in one or both directions when connecting outbound through NAT gateway:
+
+1. Select the NAT gateway resource you would like to monitor.
+
+2. In the **Metric** drop-down menu, select the **Bytes** metric.
+
+3. In the **Aggregation** drop-down menu, select **Sum**.
+
+4. Select to **Add filter**.
+
+5. In the **Property** drop-down menu, select **Direction (Out | In)**.
+
+6. In the **Values** drop-down menu, select **Out**, **In**, or both.
+
+7. To see data processed inbound or outbound as their own individual lines in the metric graph, select **Apply splitting**.
+
+8. In the **Values** drop-down menu, select **Direction (Out | In)**.
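+The same directional filter and split can be requested outside the portal. A hedged Az.Monitor sketch follows; the metric name `ByteCount` and the dimension name `Direction` are assumptions taken from the metrics table above and should be verified against your resource's metric definitions.
+
+```powershell
+# Build a dimension filter on Direction (Out and In), then query the Bytes metric
+$filter = New-AzMetricFilter -Dimension "Direction" -Operator "eq" -Value "Out","In"
+
+Get-AzMetric -ResourceId "<NatGatewayResourceId>" `
+    -MetricName "ByteCount" `
+    -AggregationType Total `
+    -MetricFilter $filter
+```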
+
+### Packets
+
+The packets metric shows you the number of data packets transmitted through the NAT gateway.
+
+Use this metric to:
+
+- Confirm that traffic is being sent through your NAT gateway to go outbound to the internet or return inbound.
+
+- Assess the amount of traffic being directed through your NAT gateway resource outbound or inbound (when in response to an outbound directed flow).
+
+To view the number of packets sent in one or both directions when connecting outbound through NAT gateway, follow the same steps in the [Bytes](#bytes) section.
+
+### Dropped packets
+
+The dropped packets metric shows you the number of data packets dropped by NAT gateway when directing traffic outbound or inbound in response to an outbound connection.
+
+Use this metric to:
+
+- Assess whether or not you're nearing or possibly experiencing SNAT exhaustion with a given NAT gateway resource. Check to see if periods of dropped packets coincide with periods of failed SNAT connections with the [Total SNAT Connection Count](#total-snat-connection-count) metric.
+
+- Help assess if you're experiencing a pattern of failed outbound connections.
+
+Reasons why you may see dropped packets:
+
+- If you're seeing a high rate of dropped packets, it may be due to outbound connectivity failure. Connectivity failure may happen for various reasons. See the NAT gateway [troubleshooting guide](/azure/virtual-network/nat-gateway/troubleshoot-nat) to help you further diagnose.
+
+### SNAT connection count
+
+The SNAT connection count metric shows you the number of newly used SNAT ports within a specified time frame.
+
+Use this metric to:
+
+- Evaluate the number of successful and failed attempts to make outbound connections.
+
+- Help assess if you're experiencing a pattern of failed outbound connections.
+
+To view the number of attempted and failed connections:
+
+1. Select the NAT gateway resource you would like to monitor.
+
+2. In the **Metric** drop-down menu, select the **SNAT Connection Count** metric.
+
+3. In the **Aggregation** drop-down menu, select **Sum**.
+
+4. Select **Add filter**.
+
+5. In the **Property** drop-down menu, select **Connection State**.
+
+6. In the **Values** drop-down menu, select **Attempted**, **Failed**, or both.
+
+7. To see attempted and failed connections as their own individual lines in the metric graph, select **Apply splitting**.
+
+8. In the **Values** drop-down menu, select **Connection State**.
+
+ :::image type="content" source="./media/nat-metrics/nat-metrics-3.png" alt-text="Screenshot of the metrics configuration.":::
+
+Reasons why you may see failed connections:
+
+- If you're seeing a pattern of failed connections for your NAT gateway resource, there could be multiple possible reasons. See the NAT gateway [troubleshooting guide](/azure/virtual-network/nat-gateway/troubleshoot-nat) to help you further diagnose.
+
+### Total SNAT connection count
+
+The **Total SNAT connection count** metric shows you the total number of active SNAT connections over a period of time.
+
+You can use this metric to:
+
+- Monitor SNAT port utilization on a given NAT gateway resource.
+
+- Analyze over a given time interval to provide insight on whether or not NAT gateway connectivity should be scaled out further by adding more public IPs.
+
+- Assess whether or not you're nearing or possibly experiencing SNAT exhaustion with a given NAT gateway resource.
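For a rough sense of utilization, a sampled Total SNAT connection count can be compared against the gateway's port capacity. A minimal shell sketch (the 64,512 ports-per-public-IP figure is the documented NAT gateway allotment; the sampled count and IP count are illustrative assumptions):

```shell
# Hypothetical sample of the Total SNAT connection count metric.
ACTIVE_CONNECTIONS=40000
# Number of public IP addresses attached to the NAT gateway (assumed).
PUBLIC_IPS=1

# Each public IP supplies 64,512 SNAT ports to the NAT gateway.
CAPACITY=$((PUBLIC_IPS * 64512))
UTILIZATION=$((ACTIVE_CONNECTIONS * 100 / CAPACITY))
echo "utilization ${UTILIZATION}%"
```

If utilization trends toward 100%, adding public IP addresses to the NAT gateway raises capacity linearly.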
+
+### Data path availability (Preview)
+
+The data path availability metric measures the status of the NAT gateway resource over time. This metric indicates whether or not NAT gateway is available for directing outbound traffic to the internet. This metric is a reflection of the health of the Azure infrastructure.
+
+You can use this metric to:
+
+- Monitor the availability of your NAT gateway resource.
+
+- Investigate the platform where your NAT gateway is deployed and determine if it's healthy.
+
+- Isolate whether an event is related to your NAT gateway or to the underlying data plane.
+
+Reasons why you may see a drop in data path availability include:
+
+- An infrastructure outage has occurred.
+
+- There aren't healthy VMs available in your NAT gateway configured subnet. For more information, see the NAT gateway [troubleshooting guide](/azure/virtual-network/nat-gateway/troubleshoot-nat).
## Alerts
-Alerts for metrics can be configured in Azure Monitor for each of the preceding [metrics](#metrics).
+Alerts can be configured in Azure Monitor for each of the preceding metrics. These alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address potential issues with your NAT gateway resource.
+
+For more information about how metric alerts work, see [Azure Monitor Metric Alerts](/azure/azure-monitor/alerts/alerts-metric-overview). See guidance below on how to configure some common and recommended types of alerts for your NAT gateway.
+
+### Alerts for SNAT port usage
+
+Use the **Total SNAT connection count** metric to alert when you're nearing the limits of available SNAT ports.
+
+To create the alert, use the following steps:
+
+1. From the NAT gateway resource page, select **Alerts**.
+
+2. Select **Create alert rule**.
+
+3. From the signal list, select **Total SNAT Connection Count**.
+
+4. From the **Operator** drop-down menu, select **Greater than or equal to**.
+
+5. From the **Aggregation type** drop-down menu, select **Total**.
+
+6. In the **Threshold value** box, enter the connection count that the Total SNAT connection count must reach before an alert is fired. When deciding what threshold value to use, keep in mind how much you've scaled out your NAT gateway outbound connectivity with public IP addresses. For more information, see [Scale NAT gateway](/azure/virtual-network/nat-gateway/nat-gateway-resource#scale-nat-gateway).
+
+7. From the **Unit** drop-down menu, select **Count**.
+
+8. From the **Aggregation granularity (Period)** drop-down menu, select a time period over which you would like the SNAT connection count to be measured.
+
+9. Create an **Action** for your alert by providing a name, notification type, and type of action that is performed when the alert is triggered.
+
+10. Before deploying your action, **test the action group**.
+
+11. Select **Create** to create the alert rule.
+
+>[!NOTE]
+>SNAT exhaustion on your NAT gateway resource is uncommon. If you see SNAT exhaustion, your NAT gateway's idle timeout timer may be holding on to SNAT ports too long, or you may need to scale with additional public IPs. To troubleshoot these kinds of issues, refer to the NAT gateway [troubleshooting guide](/azure/virtual-network/nat-gateway/troubleshoot-nat).
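One way to reason about an alert threshold is from SNAT port capacity. A minimal shell sketch (the 64,512 ports-per-public-IP figure is the documented allotment; the IP count and the 80% alert target are illustrative assumptions, not recommendations from this article):

```shell
# Public IP addresses attached to the NAT gateway (assumed).
PUBLIC_IP_COUNT=2

# Each public IP supplies 64,512 SNAT ports.
TOTAL_PORTS=$((PUBLIC_IP_COUNT * 64512))

# Fire the alert when active connections reach 80% of port capacity.
THRESHOLD=$((TOTAL_PORTS * 80 / 100))
echo "$THRESHOLD"
```

Recomputing the threshold after attaching or removing public IPs keeps the alert aligned with actual capacity.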
+
+## Network Insights
+
+[Azure Monitor Network Insights](/azure/azure-monitor/insights/network-insights-overview) allows you to visualize your Azure infrastructure setup and to review all metrics for your NAT gateway resource from a pre-configured metrics dashboard. These visual tools help you diagnose and troubleshoot any issues with your NAT gateway resource.
+
+### View the topology of your Azure architectural setup
+
+To view a topological map of your setup in Azure:
+
+1. From your NAT gateway's resource page, select **Insights** from the **Monitoring** section.
+
+2. On the landing page for **Insights**, you'll see a topology map of your NAT gateway setup. This map will show you the relationship between the different components of your network (subnets, virtual machines, public IP addresses).
+
+3. Hover over any component in the topology map to view configuration information.
+
+ :::image type="content" source="./media/nat-metrics/nat-insights.png" alt-text="Screenshot of the Insights section of NAT gateway.":::
+
+### View all NAT gateway metrics in a dashboard
+
+The metrics dashboard can be used to better understand the performance and health of your NAT gateway resource. The metrics dashboard shows a view of all metrics for NAT gateway on a single page.
+
+- All NAT gateway metrics can be viewed in a dashboard by selecting **Show Metrics Pane**.
+
+ :::image type="content" source="./media/nat-metrics/nat-metrics-pane.png" alt-text="Screenshot of the show metrics pane.":::
+
+- A full-page view of all NAT gateway metrics is available by selecting **View Detailed Metrics**.
+
+ :::image type="content" source="./media/nat-metrics/detailed-metrics.png" alt-text="Screenshot of the view detailed metrics.":::
+
+For more information on what each metric is showing you and how to analyze these metrics, see [How to use NAT gateway metrics](#how-to-use-nat-gateway-metrics).
## Limitations
-Resource health isn't supported.
+- Resource health isn't supported.
## Next steps
* Learn about [NAT gateway resource](nat-gateway-resource.md)
* Learn about [Azure Monitor](../../azure-monitor/overview.md)
* Learn about [troubleshooting NAT gateway resources](troubleshoot-nat.md).
-* [Tell us what to build next for Virtual Network NAT in UserVoice](https://aka.ms/natuservoice).
virtual-network Quickstart Create Nat Gateway Bicep https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/nat-gateway/quickstart-create-nat-gateway-bicep.md
+
+ Title: 'Create a NAT gateway - Bicep'
+
+description: This quickstart shows how to create a NAT gateway using Bicep.
+
+documentationcenter: na
+
+# Customer intent: I want to create a NAT gateway using Bicep so that I can provide outbound connectivity for my virtual machines.
+++
+ na
+ Last updated : 04/08/2022++++
+# Quickstart: Create a NAT gateway - Bicep
+
+Get started with Virtual Network NAT using Bicep. This Bicep file deploys a virtual network, a NAT gateway resource, and an Ubuntu virtual machine. The Ubuntu virtual machine is deployed to a subnet that's associated with the NAT gateway resource.
++
+## Prerequisites
+
+If you don't have an Azure subscription, create a [free account](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) before you begin.
+
+## Review the Bicep file
+
+The Bicep file used in this quickstart is from [Azure Quickstart Templates](https://azure.microsoft.com/resources/templates/nat-gateway-1-vm/).
+
+This Bicep file is configured to create a:
+
+* Virtual network
+* NAT gateway resource
+* Ubuntu virtual machine
+
+The Ubuntu VM is deployed to a subnet that's associated with the NAT gateway resource.
++
+Nine Azure resources are defined in the Bicep file:
+
+* **[Microsoft.Network/networkSecurityGroups](/azure/templates/microsoft.network/networksecuritygroups)**: Creates a network security group.
+* **[Microsoft.Network/networkSecurityGroups/securityRules](/azure/templates/microsoft.network/networksecuritygroups/securityrules)**: Creates a security rule.
+* **[Microsoft.Network/publicIPAddresses](/azure/templates/microsoft.network/publicipaddresses)**: Creates a public IP address.
+* **[Microsoft.Network/publicIPPrefixes](/azure/templates/microsoft.network/publicipprefixes)**: Creates a public IP prefix.
+* **[Microsoft.Compute/virtualMachines](/azure/templates/Microsoft.Compute/virtualMachines)**: Creates a virtual machine.
+* **[Microsoft.Network/virtualNetworks](/azure/templates/microsoft.network/virtualnetworks)**: Creates a virtual network.
+* **[Microsoft.Network/natGateways](/azure/templates/microsoft.network/natgateways)**: Creates a NAT gateway resource.
+* **[Microsoft.Network/virtualNetworks/subnets](/azure/templates/microsoft.network/virtualnetworks/subnets)**: Creates a virtual network subnet.
+* **[Microsoft.Network/networkinterfaces](/azure/templates/microsoft.network/networkinterfaces)**: Creates a network interface.
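The association between the key resources can be sketched in Bicep roughly as follows. This is an illustrative fragment, not the quickstart template's actual contents: the resource names, API versions, and address ranges are assumptions.

```bicep
// Illustrative sketch: a NAT gateway with one public IP, associated
// with a virtual network subnet (names and values are placeholders).
resource publicIp 'Microsoft.Network/publicIPAddresses@2021-05-01' = {
  name: 'nat-gw-public-ip'
  location: resourceGroup().location
  sku: { name: 'Standard' }
  properties: { publicIPAllocationMethod: 'Static' }
}

resource natGateway 'Microsoft.Network/natGateways@2021-05-01' = {
  name: 'nat-gateway'
  location: resourceGroup().location
  sku: { name: 'Standard' }
  properties: {
    idleTimeoutInMinutes: 4
    publicIpAddresses: [ { id: publicIp.id } ]
  }
}

resource vnet 'Microsoft.Network/virtualNetworks@2021-05-01' = {
  name: 'vnet-1'
  location: resourceGroup().location
  properties: {
    addressSpace: { addressPrefixes: [ '10.0.0.0/16' ] }
    subnets: [
      {
        name: 'subnet-1'
        properties: {
          addressPrefix: '10.0.0.0/24'
          // The subnet-to-NAT-gateway association that gives the VM
          // its outbound connectivity.
          natGateway: { id: natGateway.id }
        }
      }
    ]
  }
}
```

The subnet's `natGateway` property is what routes the VM's outbound traffic through the NAT gateway.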
+
+## Deploy the Bicep file
+
+1. Save the Bicep file as **main.bicep** to your local computer.
+1. Deploy the Bicep file using either Azure CLI or Azure PowerShell.
+
+ # [CLI](#tab/CLI)
+
+ ```azurecli
+ az group create --name exampleRG --location eastus
+ az deployment group create --resource-group exampleRG --template-file main.bicep --parameters adminusername=<admin-name>
+ ```
+
+ # [PowerShell](#tab/PowerShell)
+
+ ```azurepowershell
+ New-AzResourceGroup -Name exampleRG -Location eastus
+ New-AzResourceGroupDeployment -ResourceGroupName exampleRG -TemplateFile ./main.bicep -adminusername "<admin-name>"
+ ```
+
+
+
+ > [!NOTE]
+ > Replace **\<admin-name\>** with the administrator username for the virtual machine. You'll also be prompted to enter **adminpassword**.
+
+ When the deployment finishes, you should see a message indicating the deployment succeeded.
+
+## Review deployed resources
+
+Use the Azure portal, Azure CLI, or Azure PowerShell to list the deployed resources in the resource group.
+
+# [CLI](#tab/CLI)
+
+```azurecli-interactive
+az resource list --resource-group exampleRG
+```
+
+# [PowerShell](#tab/PowerShell)
+
+```azurepowershell-interactive
+Get-AzResource -ResourceGroupName exampleRG
+```
+++
+## Clean up resources
+
+When no longer needed, use the Azure portal, Azure CLI, or Azure PowerShell to delete the resource group and its resources.
+
+# [CLI](#tab/CLI)
+
+```azurecli-interactive
+az group delete --name exampleRG
+```
+
+# [PowerShell](#tab/PowerShell)
+
+```azurepowershell-interactive
+Remove-AzResourceGroup -Name exampleRG
+```
+++
+## Next steps
+
+In this quickstart, you created a:
+
+* NAT gateway resource
+* Virtual network
+* Ubuntu virtual machine
+
+The virtual machine is deployed to a virtual network subnet associated with the NAT gateway.
+
+To learn more about Virtual Network NAT and Bicep, continue to the articles below.
+
+* Read an [Overview of Virtual Network NAT](nat-overview.md)
+* Read about the [NAT Gateway resource](nat-gateway-resource.md)
+* Learn more about [Bicep](../../azure-resource-manager/bicep/overview.md)
virtual-network Service Tags Overview https://github.com/MicrosoftDocs/azure-docs/commits/main/articles/virtual-network/service-tags-overview.md
By default, service tags reflect the ranges for the entire cloud. Some service t
| **EOPExternalPublishedIPs** | This tag represents the IP addresses used for Security & Compliance Center PowerShell. Refer to the [Connect to Security & Compliance Center PowerShell using the EXO V2 module for more details](/powershell/exchange/connect-to-scc-powershell). | Both | No | Yes | | **EventHub** | Azure Event Hubs. | Outbound | Yes | Yes | | **GatewayManager** | Management traffic for deployments dedicated to Azure VPN Gateway and Application Gateway. | Inbound | No | No |
-| **GenevaActions** | This is used for inbound security rules to allow the Geneva Actions service access to downstream service APIs. | Inbound | No | Yes |
| **GuestAndHybridManagement** | Azure Automation and Guest Configuration. | Outbound | No | Yes | | **HDInsight** | Azure HDInsight. | Inbound | Yes | No | | **Internet** | The IP address space that's outside the virtual network and reachable by the public internet.<br/><br/>The address range includes the [Azure-owned public IP address space](https://www.microsoft.com/download/details.aspx?id=41653). | Both | No | No |